I am trying to build a page that consists of many UI tools and visualizations. The problem is that as these visualizations increase on a single page, browser performance degrades and everything becomes slower.
I was thinking that I could take the UI nodes the user is not currently looking at, send them to the server for storage, and when the user returns to the same node I could fetch the node and its event handlers and initialize it again quickly.
I tried jQuery's clone method, but when I stringify the object to save it, it loses the event handlers. Has anyone tried to do this before?
I am not sure if I get the question right, but I assume there is one scrollable page with lots of graphs and other visualizations, and the problem is that the page becomes too slow because of that (scrolling, interacting, memory, etc.). Have you tried showing the user only the visualizations that are in the browser viewport and removing those which are not visible (including their events)? If the user scrolls back, the visualization would get re-rendered again (with its events re-initialized for that specific visualization).
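A rough sketch of that idea with jQuery (renderVisualization and destroyVisualization are hypothetical hooks you would implement for your own widgets, and each .viz container is assumed to keep a fixed-height placeholder when emptied so offsets stay stable):

// Render a visualization only while its container is inside the viewport.
function updateVisibleVisualizations() {
  var viewTop = $(window).scrollTop();
  var viewBottom = viewTop + $(window).height();
  $('.viz').each(function () {
    var top = $(this).offset().top;
    var bottom = top + $(this).outerHeight();
    var visible = bottom > viewTop && top < viewBottom;
    if (visible && !$(this).data('rendered')) {
      renderVisualization(this);          // build the DOM + bind events
      $(this).data('rendered', true);
    } else if (!visible && $(this).data('rendered')) {
      destroyVisualization(this);         // unbind events + empty the node
      $(this).data('rendered', false);
    }
  });
}

$(window).on('scroll resize', updateVisibleVisualizations);
$(updateVisibleVisualizations); // run once on DOM ready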
Hope this helps. If it's not a possible solution, let me know and I will remove this answer.
Take https://www.instagram.com/instagram as an example. If you click on 'Followers', a list of all the followers is loaded, but only a few are loaded at a time, and you need to continually scroll until all data is loaded.
Is there a way in JavaScript to automatically load all the data, or maybe even automate the scrolling for that particular div?
Thanks.
A few things to consider.
The method Instagram is using to show some followers until you scroll to get more is called infinite scrolling.
Basically, all we have to know is that this method is implemented on the client, that is, in the browser, using JavaScript.
The first solution is: if you can reverse engineer the Instagram code (minified, I suppose) and find the right methods to call, you can force-fetch new items even if you didn't scroll.
On the other hand, another technique would be to repeatedly simulate a scroll to the end; I'll let you refer to this answer on how to do that.
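As a rough sketch of the scroll-simulation idea (the '.followers-list' selector is made up; inspect the page to find the real scrollable element, and note the site may throttle or cap what it returns):

// Keep scrolling a container to its bottom until its height stops growing.
var box = document.querySelector('.followers-list'); // hypothetical selector
var lastHeight = 0;
var timer = setInterval(function () {
  box.scrollTop = box.scrollHeight;       // jump to the bottom
  if (box.scrollHeight === lastHeight) {  // nothing new loaded since last tick
    clearInterval(timer);                 // assume we've reached the end
  }
  lastHeight = box.scrollHeight;
}, 1000); // 1000 ms is a guess at how long each batch takes to load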
I have a web application which has 8 tabs on the menu. When I click on one tab, the current content fades out and the content specific to that tab comes up.
This works fine, but the application is pretty slow because the fadeIn and fadeOut are being done on very large HTML content (maybe around 2000 lines per tab), which in turn makes the whole application slower.
What can I do to make it snappy and smooth?
Here are some points I'm unsure about:
Can I load the tab content via AJAX instead of loading all of them in one go?
If I load them via AJAX (for example Tab A) and the user then clicks on Tab B, should I remove Tab A's content from the DOM and reload it when Tab A is clicked again, or should I keep it in the DOM?
What will happen to the click handlers, since I will need to do dynamic event binding in that case?
What else can be done in order to enhance the performance?
Thanks a lot!
Well, we had that kind of problem, where we had around 10 tabs and needed to load data. The doubts you have already point toward the solution. I'll describe what we did to accomplish the task with a smooth result:
1. We bind the data on tab click using an Ajax call, rather than creating the elements as runat="server" controls; we bind it with client-side DOM manipulation (see the sketch after this list).
2. If there is a grid with a large amount of data, we bind the data per page of the pagination rather than loading the whole data set.
3. On every tab click the new data is loaded, and the data on the previous tab is cleared or reset to its default state.
4. If a large set of data remains in the HTML, the DOM will also be heavy, which in turn slows down your application.
5. To accomplish this you have to create Ajax calls and do more DOM manipulation, which in turn takes more time, so you have to decide which direction you want to go.
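A minimal sketch of points 1 and 3 (the /tabs/ endpoint and the element ids are made up; adjust them to your markup):

// Load tab content on demand and clear the previous tab's DOM.
$('#tab-menu').on('click', 'a[data-tab]', function (e) {
  e.preventDefault();
  var name = $(this).data('tab');
  var $content = $('#tab-content');
  $content.empty();                        // drop the previous tab's elements
  $.get('/tabs/' + name, function (html) { // hypothetical endpoint
    $content.html(html);                   // bind new content client-side
  });
});

// One delegated handler survives every content swap, so nothing to re-bind:
$('#tab-content').on('click', '.row-action', function () {
  // handle clicks on elements inside whichever tab is loaded
});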
From my standpoint, there are some observations I can think of to enhance the performance of the application:
Check if there are any heavy images which take time to load or slow your page down from completing its loading.
Remove references to any unused JS files or CSS.
Use async and defer with external scripts.
Try not to use many server-side elements, which add overhead to your page load because they are bound on the server side and make your page heavy.
In some cases compression techniques have also been helpful for me.
Ajax calls and client-side manipulation are best, rather than sending the whole page to the server side.
From my experience, these are the things I learned and implemented when working with web applications to improve their performance.
I'm developing a single-page application that uses a lot of widgets (mainly grids and tabs) from the jqWidgets library, all loaded upon page load. It's getting quite large, and I've started to notice that after using the site for a couple of minutes the UI becomes quite slow and sometimes non-responsive; when the page is refreshed, everything works smoothly again for a few minutes, then it's back to being laggy. I emphasize using because it doesn't start to lag after simply being open for any amount of time, but specifically after opening and closing a bunch of tabs on my page, each tab containing multiple grids loaded through Ajax with multiple event listeners tied to each. I'm still testing on localhost. My initial reaction was that the DOM has too many elements (each grid creates hundreds of divs, and I have a lot of them), so event listeners which are tied to IDs have to search through too many elements and become slow. If this is the case it won't be too hard to fix. Is my assumption likely to be the culprit, or do I have worse things to fear?
UPDATE: here are captures of the memory timeline and heap snapshot. On the memory timeline there was no interaction with the site; the two large increases are page refreshes, and the middle sawtooth section is just letting my site idle.
Without seeing any code examples it doesn't sound too bad.
If you have a LOT of jQuery selectors, try to make them as specific as possible, especially if you're selecting a lot of items a lot of the time.
For example, if you have a bunch of elements with class "abc", try to specify where to look beforehand. Are they only found within table cells? Are they only found within paragraph tags? The more specific you make your selector, the better, because if you specify the selector like this:
$('.abc')
then it will search the entire DOM for anything that matches .abc. However, if you specify it as $('p .abc'), then it will only search within paragraph tags for the class.
Other performance killers are wiring up events and then never removing them. If you have any code that removes elements that have event handlers attached to them then best practice is to remove the event handlers when the element is removed. Otherwise you will start piling up orphaned events.
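For example (a minimal sketch; '#old-panel' is a made-up id):

// Unbind handlers before discarding an element. Note that jQuery's .remove()
// already detaches handlers and data for elements it removes; the orphans
// typically come from removing nodes with native DOM APIs instead.
var $panel = $('#old-panel');
$panel.off();    // unbind every handler attached with .on()
$panel.remove(); // take the element out of the DOM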
If you are doing a large single page application look to a library like backbone (http://backbonejs.org/) or angular (http://angularjs.org/) to see if this can help you - they alleviate a lot of these issues that people who use plain jQuery will run in to.
Finally, this post (http://coding.smashingmagazine.com/2012/11/05/writing-fast-memory-efficient-javascript/) is seriously good at outlining how you can write fast, memory-efficient JavaScript and how to avoid the common performance pitfalls.
Hope this helps.
It does sound like you have a memory leak somewhere. Are you using recursion that's not properly controlled? Do you have loops that could be ended early, but you fail to break out of them when you find what you're looking for before the loop naturally ends? Are you using something like this:
document.getElementById(POS.CurrentTableName + '-Menus').getElementsByTagName('td');
where the nodelist returned is huge and you only end up using a tiny bit of it? Those calls are expensive.
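If so, a narrower query is usually cheaper (the 'menu-item' class below is illustrative; target whatever subset you actually use):

// Fetch only the cells you need instead of every <td> in the table.
var cells = document.querySelectorAll(
  '#' + POS.CurrentTableName + '-Menus td.menu-item'
);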
It could also be your choice of architecture. Hundreds of divs per grid doesn't sound manageable logically by a human brain. Do you address each div specifically by id, or are they just an artifact of the library you're using and cluttering up the DOM? Have you checked the DOM itself as you're using the app, to see if you're adding elements in the hinterland by mistake and cluttering it up with junk you don't use, causing the DOM to grow continuously as you use the app? Are you adding the event handlers to the same elements numerous times instead of just once?
For comparison, I too have a single-page app (a Google Chrome app, a multi-currency restaurant point of sale) with anywhere from 1,500 to 20,000 event handlers registered, making calls to a SQLite back end on a Node.js server. I used mostly pure JS, and all but 50 lines of the HTML are written in JS. I tie all the event handlers directly to the lowest-level element responsible for the event. Some elements have multiple handlers (click, change, keydown, blur, etc.).
The app operates at eye-blink speed and stays that fast no matter how long it's up. The DOM is fairly large, and I regularly destroy and recreate huge portions of it (a restaurant table is cleared and recreated for the next sitting), including adding up to 1,500 event handlers per table. Hitting the CLEAR button and having it refresh the screen with the new table is almost imperceptible, admittedly on a high-end processor. My development environment is Fedora 19 Linux.
Without being able to see your code, it's a little difficult to say exactly.
If the UI takes a little while before it starts getting laggy, then it sounds likely that you have a memory leak somewhere in your JavaScript. This happens quickly when using a lot of closures as well as nested function and variable references without cleaning them up when you're done with them.
Also, event binding on many elements can be a huge drain on browser resources. If possible, try to use event delegation to reduce the number of elements listening for events. For example:
$('table').on('click','td', myEventHandler);
Be careful to make sure that event bindings only occur once, so as to avoid actions being unintentionally fired many times.
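One way to guarantee that (a sketch using a made-up .grid event namespace) is to unbind before rebinding:

// Namespacing the handler lets you safely unbind and rebind it, so
// re-running this setup code never stacks duplicate handlers.
$('table')
  .off('click.grid')
  .on('click.grid', 'td', myEventHandler);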
Good luck!
Beatport's new interface has solved a major problem I was looking for the solution to.
Namely, it keeps a persistent "player" interface, and you can browse to different parts of the site (with the URL changing too) without reloading the page or interrupting the player.
I cannot for the life of me understand how they have done this. Can any of you figure it out?
Many thanks for any replies I get
Looks like they are just using AJAX to load new content but have taken care to make it work and look pretty seamless. You can get better insight into what events are attached to what elements via the Visual Events bookmarklet. Once you find the code that triggers the event, you can run the obfuscated javascript through JSBeautifier to examine it more closely.
Specifically, it looks like they're adding click handlers to all anchor tags, passing the event through if it was triggered with a middle click or modified with a keyboard key, and otherwise handing it to a dynamic loader which handles state and other specific conditions like multiple clicks. The seamlessness comes from the way they deal with URLs, making every page bookmarkable, and with the browser history, so the back and forward buttons work as you would expect on a "normal" site.
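The general pattern looks roughly like this (a sketch, not Beatport's actual code; loadPage is a hypothetical function that fetches a page fragment via AJAX and swaps it into the content area without touching the player):

// Intercept ordinary left clicks on anchors and hand them to the loader.
$(document).on('click', 'a', function (e) {
  // let middle clicks and modifier-key clicks behave like normal links
  if (e.which === 2 || e.metaKey || e.ctrlKey || e.shiftKey) return;
  e.preventDefault();
  var url = this.href;
  history.pushState({ url: url }, '', url); // update the address bar
  loadPage(url);                            // hypothetical content swapper
});

// back/forward buttons replay the same loader
window.addEventListener('popstate', function (e) {
  if (e.state && e.state.url) loadPage(e.state.url);
});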
Here's the task at hand: I need to implement a fully client-side tree that works nicely with permalinks and the back/forward buttons for navigation.
E.g. I open a page with such a tree control and expand some nodes; pressing back collapses the last expanded node, and pressing forward expands that node again. Finally, I copy the URL of the page and send it to my colleague; she clicks the URL and the tree opens its nodes to reveal the same structure that I see on my screen.
I'm looking for a JavaScript tree control that would fit the following list of requirements:
(Mandatory) Support for asynchronous node retrieval.
Possibility to hook into expand/collapse events to invoke custom logic that serializes the tree state into the URL anchor (see the sketch after this list).
API for programmatic expanding/collapsing of given nodes, so that I don't have to emulate clicks when deserializing the tree state upon page load.
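For the second requirement, the hook could look roughly like this (a sketch; getOpenNodeIds and openNodes are hypothetical stand-ins for whatever expand/collapse API the chosen widget actually exposes):

// Serialize the set of open nodes into the URL hash, and restore it on load.
function saveTreeState() {
  var openIds = getOpenNodeIds();           // e.g. ['root', 'node-7']
  location.hash = 'tree=' + openIds.join(',');
}

function restoreTreeState() {
  var match = /tree=([^&]*)/.exec(location.hash);
  if (match) openNodes(match[1].split(',')); // programmatic expand, no clicks
}

// call saveTreeState() from the widget's expand/collapse callbacks,
// and restore once on DOM ready:
$(restoreTreeState);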
I've already had some experience with jsTree and jQuery treeview.
The problem with jsTree is that it uses <a> tags to render nodes, which messes up URL anchors on click. After a couple of hours I managed to migrate it to <span>s, though my solution works only in Firefox. Not as good as I'd like.
Another thing happened when I tinkered with jQuery treeview. At first I was put off by its "not in active development" status, though on a second glance it appeared to be a simple yet powerful widget. The async demo looked excellent, so I tried to reproduce it on my PC with my own data. But then I faced a weird bug: when my JSON service returned lazy nodes (i.e. ones that had hasChildren set to true), the treeview immediately expanded them and rendered the "loading" gif, though without loading anything. I tried to debug this glitch, but I'm really not smart enough to understand how all those callbacks and aspects interact with each other. At least not within the time window I had.