I have a web application with 8 tabs in the menu. When I click on a tab, the current content fades out and the content specific to that tab fades in.
This is all working fine, but the application is quite slow because the fadeIn and fadeOut run on very large HTML content (maybe around ~2000 lines per tab), which in turn slows down the whole application.
What can I do to make it snappy and smooth?
Here are some points I'm unsure about:
Can I load the tab content via AJAX instead of loading all of them in one go?
If I load them via AJAX (say, Tab A) and the user then clicks Tab B, should I remove Tab A's content from the DOM and reload it when Tab A is clicked again, or should I keep it in the DOM?
What will happen to the click handlers in that case, since I would need to do dynamic event binding?
What else can be done to improve performance?
Thanks a lot!
Well, we had that kind of problem: around 10 tabs, each needing to load data. The doubts you have already point toward the solution. I'll describe what we did to accomplish the task with a smoother and cleaner result:
1. We bind the data on tab click using an AJAX call with client-side DOM manipulation, rather than creating elements server-side (runat="server"). A sketch of this follows the list.
2. If there is a grid with a large amount of data, bind the data per page of the pagination rather than loading the whole set at once.
3. On every tab click, that tab's data is loaded and the previous tab's data is cleared, or reset to its default state.
4. If a large set of data remains in the HTML, the DOM will also be heavy, which in turn slows your application down.
5. To accomplish this you have to write AJAX calls and more DOM manipulation, which in turn takes more development time, so you have to decide which direction you want to go in.
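As a rough illustration of points 1 and 3, here is a minimal jQuery sketch (the menu/container IDs and the /tabs/<name> endpoint are assumptions, not your actual markup). Delegated binding also answers your click-handler question: handlers attached to a stable ancestor keep working for content swapped in via AJAX.

    // Load tab content via AJAX on click, clearing the previous tab first.
    // '#tab-menu', '#tab-content' and '/tabs/<name>' are assumed names.
    $('#tab-menu').on('click', '.tab', function () {
        var name = $(this).data('tab');             // e.g. <a class="tab" data-tab="a">
        var $content = $('#tab-content');
        $content.empty();                           // drop the previous tab's DOM
        $.get('/tabs/' + name, function (html) {
            $content.html(html).hide().fadeIn(150); // fade a small fragment, not a huge page
        });
    });

    // Delegated binding: survives content replacement because the handler
    // lives on the stable ancestor, not on the swapped-in nodes.
    $('#tab-content').on('click', '.item', function () {
        console.log('clicked', $(this).text());
    });

Fading a freshly inserted fragment is much cheaper than fading thousands of lines of already-rendered HTML.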
From my standpoint, there are some further observations I can think of to enhance the performance of the application:
Check whether there are any heavy images that take a long time to load and slow down the page's overall loading.
Remove references to any unused JS or CSS files.
Use async and defer with external scripts (see the snippet at the end of this answer).
Try not to use too many server-side elements, because they are bound on the server and add overhead to the page load, which makes your page heavy.
In some cases compression techniques have also been helpful for me.
AJAX calls with client-side manipulation are best, rather than sending the whole page back to the server.
From my experience, these are the things I have learned and implemented when working on web applications to improve their performance.
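On the async/defer point, a small sketch (file names are placeholders); neither attribute blocks HTML parsing while the script downloads:

    <!-- async: runs as soon as it arrives, order not guaranteed -->
    <script src="analytics.js" async></script>
    <!-- defer: runs after parsing finishes, in document order -->
    <script src="app.js" defer></script>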
I am trying to build a page that consists of many UI tools and visualizations. The problem is that as the number of visualizations on a single page increases, the browser's performance degrades and everything becomes slower.
I was thinking that I could take UI nodes the user is not currently looking at, send them to the server for saving, and, when the user comes back to the same node, fetch it together with its event handlers and initialize it again quickly.
I tried jQuery's clone method, but when I stringify the object to save it, it loses the event handlers. Has anyone tried to do this before?
I am not sure I've got the question right, but I assume there is one scrollable page with lots of graphs and other visualizations, and the problem is that the page becomes too slow because of that (scrolling, interacting, memory, etc.). Have you tried showing the user only the visualizations that are in the browser viewport and removing the ones that are not visible (including their events)? If the user scrolls back, a visualization would get re-rendered once again (with its events re-initialized).
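In modern browsers one way to implement that is the standard IntersectionObserver API. A minimal sketch, where renderChart/destroyChart stand in for your own init and cleanup code and .viz is an assumed selector:

    // Render a visualization only while its container is near the viewport,
    // and tear it down (event handlers included) once it scrolls away.
    var observer = new IntersectionObserver(function (entries) {
        entries.forEach(function (entry) {
            if (entry.isIntersecting) {
                renderChart(entry.target);   // placeholder: init chart + handlers
            } else {
                destroyChart(entry.target);  // placeholder: remove nodes + handlers
            }
        });
    }, { rootMargin: '200px' }); // start rendering a little before it is visible

    document.querySelectorAll('.viz').forEach(function (el) {
        observer.observe(el);
    });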
Hope this helps; if it's not a possible solution, let me know and I will remove this answer.
Take https://www.instagram.com/instagram as an example. If you click on 'Followers', a list of all the followers is loaded, but only a few at a time, and you need to keep scrolling until all the data is loaded.
Is there a way in JavaScript to automatically load all the data, or maybe even to automate the scrolling for that particular div?
Thanks.
A few things to consider.
The method Instagram uses, showing some followers and loading more as you scroll, is called infinite scrolling.
Basically, all we need to know is that this method is implemented on the client, that is, in the browser, using JavaScript.
The first solution: if you can reverse-engineer the Instagram code (minified, I suppose) and find the right methods to call, you can force-fetch new items even if you didn't scroll.
On the other hand, another technique is to repeatedly simulate a scroll to the end; I'll let you refer to this answer on how to do that.
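A minimal sketch of the simulated-scroll approach (the .followers-list selector is an assumption; Instagram's real class names are obfuscated and change often):

    // Repeatedly scroll the container to the bottom so the infinite scroller
    // keeps fetching, and stop once the content height stops growing.
    var container = document.querySelector('.followers-list'); // assumed selector
    var lastHeight = 0;

    var timer = setInterval(function () {
        container.scrollTop = container.scrollHeight;  // jump to the bottom
        if (container.scrollHeight === lastHeight) {
            clearInterval(timer);                      // nothing new arrived: done
            console.log('All items loaded');
        }
        lastHeight = container.scrollHeight;
    }, 1000); // give each AJAX request time to complete between scrolls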
I'm beginning to convert a large web application, with a lot of complex JQuery DOM manipulation and an ASP.NET MVC backend, into something more manageable. I'm converting the server-side code into a REST API, and I want to use AngularJS to drive the UI.
I'm starting by converting one area of the app over, which consists of three screens. Making a selection in one screen presents you with a new set of choices, and you can move forward and back. I've managed to achieve this behavior very cleanly by using routes; each choice is a link with href='#/something...', and this causes a change in UI state by loading a different controller/template.
I'm now trying to animate this transition. The ng-animate attribute gets me most of the way there with the enter and leave options, but these actually fire at the same time! The result is visually very confusing. What I'd ideally want is a clean way to manage the following sequence of events:
User clicks on a button in the first screen
First screen animates out
At the same time, request is made to REST api for the next set of choices
After the previous two things are complete, the second screen animates in.
I can achieve this without animation by using the resolve parameter to the $routeProvider, but I don't know how to make the animation work properly! What is the correct way to manage this kind of state?
You can write your own code that will implement animation as described here:
http://code.angularjs.org/1.1.4/docs/api/ng.directive:ngAnimate
So perhaps it is possible to do it like this:
On "Leave" you start the animation and fire the request to the server.
At the same time, on "Enter" you do not start the animation right away; you wait for a signal from the data loader.
Server responds with data. You signal that this is ready.
"Enter" implementation starts the animation.
How do you monitor the flag? You could simply watch a global variable with setTimeout, but that is kind of ugly. I think it would be much better to use some sort of pub/sub mechanism that lets you subscribe to a "data ready" event. For example, I use postal.js and am quite satisfied with it.
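A minimal sketch of that coordination using postal.js (the channel/topic names, the endpoint and startEnterAnimation are made up for illustration):

    // Leave: start the exit animation and fire the request in parallel.
    // When the data arrives, publish a "ready" message.
    $http.get('/api/choices/next').then(function (response) { // assumed endpoint
        postal.publish({
            channel: 'screens',
            topic: 'data.ready',
            data: response.data
        });
    });

    // Enter: hold the animation until the "ready" message arrives.
    var sub = postal.subscribe({
        channel: 'screens',
        topic: 'data.ready',
        callback: function (data) {
            startEnterAnimation(data); // placeholder for the actual enter animation
            sub.unsubscribe();         // one-shot: clean up the subscription
        }
    });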
I have a product listing page which displays all the products that satisfy the search criteria, and these could be any number of products.
What I want is something like FB: display only the first 5-7 products and, as the user scrolls down, load more products dynamically.
I'd consider switching to jQuery or MooTools as JS libraries if you want to do this; both have native support for the infinite scroller concept, as it's commonly called. It's not that hard to implement yourself, though: mainly a matter of keeping track of what you loaded last and installing an onScroll event to detect when the bottom of the page is reached.
Here's a good tutorial using native JS to implement it, both server and client side. You'll need to replace the XHR invocations with the proper Prototype alternatives yourself (or not; it wouldn't really matter).
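In outline, the client side looks something like this (the endpoint and container id are assumptions):

    // Detect when the user nears the bottom of the page, then fetch a batch.
    var page = 1;         // keep track of what was loaded last
    var loading = false;  // guard against duplicate requests in flight

    window.onscroll = function () {
        var nearBottom = window.innerHeight + window.pageYOffset
                         >= document.body.offsetHeight - 200; // 200px threshold
        if (!nearBottom || loading) return;
        loading = true;

        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/products?page=' + (++page)); // assumed endpoint
        xhr.onload = function () {
            document.getElementById('product-list')    // assumed container
                    .insertAdjacentHTML('beforeend', xhr.responseText);
            loading = false;
        };
        xhr.send();
    };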
As many developers will be, I'm producing web-based applications that use AJAX to retrieve data and HTML.
I'm new to web development and JavaScript, but have a couple of decades' experience programming in other languages.
I'm using MooTools, which is a great framework, but I have been battling with the lack of destructors in JavaScript, or even onDestroy/unload events for DOM elements.
I've written a number of UI classes (mostly to learn), and a lot of them use setInterval timers to periodically get data from the web server and update elements on the page (mostly images from cameras).
Most issues occur when another page is requested from the menu and the content div is reloaded with new HTML and JavaScript (using Request.HTML). This simply replaces all the elements already in the div with the new ones and runs the new scripts. Any timers in the old scripts, or old objects created, will continue to run. This was leaving me with lots of orphaned classes, elements and timers.
I've been reading more on the MooTools site and have realized a number of mistakes I've been making, and have started to correct a lot of the issues. The biggest of these was not using Element.store and Element.retrieve, instead linking my classes directly to the elements.
I've already found that the contents of the div being reloaded need to be freed by calling destroy on all its child elements before calling Request.HTML, but that will not remove (clear) any timers that are running.
So I've done a JSFiddle here (deinitialize classes) to show what I've been trying. It appears to work fine, but what I want to know is:
Is it a good idea?
Are there any other issues I might have missed?
Can you see any problems with this type of implementation?
Or am I reinventing the wheel and have missed something?
Explanation
When the class is initialized, it stores itself with the element.
It also appends itself to an AssocClasses array (creating it if necessary), also stored with the element.
I've created a ClearElement function that is called whenever the contents of an element are about to be replaced by an AJAX call or other method. It gets all elements within the div and, if they have an AssocClasses array attached, calls deinitialize on each of the classes in the array; it then calls destroy on each of the div's direct children to free the elements and their storage.
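For reference, a minimal sketch of that pattern (the class internals are assumed; each deinitialize is expected to clear its own timers with clearInterval):

    // Called before a container's contents are replaced by Request.HTML.
    function clearElement(container) {
        // De-initialize every class instance associated with a descendant element.
        container.getElements('*').each(function (el) {
            var assoc = el.retrieve('AssocClasses');
            if (assoc) {
                assoc.each(function (instance) {
                    instance.deinitialize(); // expected to clear its own timers
                });
            }
        });
        // Destroy the direct children to free the elements and their storage.
        container.getChildren().destroy();
    }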
Any information, pointers, etc. would be most gratefully received.
I would rethink your timer storage and your use of evalScripts in your AJAX calls.
Keep these outside of your AJAX requests. When doing peer code reviews, I have rarely seen an instance where these were needed and couldn't be done in a better way.
Maybe have the link that is clicked trigger a callback function onComplete or onSuccess.
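For example, a sketch (the URL, container id and init function are placeholders), keeping setup logic in a callback rather than in scripts embedded in the returned HTML:

    // Run setup code in onSuccess instead of evaluating scripts shipped
    // inside the returned HTML, so nothing is left dangling after a reload.
    new Request.HTML({
        url: '/content/cameras',      // assumed endpoint
        update: $('content'),         // MooTools: replaces the div's children
        evalScripts: false,           // don't run scripts embedded in the HTML
        onSuccess: function () {
            initCameraWidgets($('content')); // placeholder for your init code
        }
    }).send();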
Without seeing your exact code, it is hard to advise further.