I'm developing a single-page application that uses a lot of widgets (mainly grids and tabs) from the jqWidgets library, all loaded on page load. It's getting quite large, and I've started to notice that after actively using the site for a couple of minutes the UI becomes quite slow and sometimes unresponsive. I emphasize "using" because it doesn't start to lag after simply being open for any amount of time; specifically, it happens after opening and closing a bunch of tabs on my page, each tab containing multiple grids loaded through Ajax, with multiple event listeners tied to each. When the page is refreshed everything works smoothly again for a few minutes, then it's back to being laggy. I'm still testing on localhost. My initial reaction was that the DOM has too many elements (each grid creates hundreds of divs, and I have a lot of them!), so event listeners that are tied to IDs have to search through too many elements and become slow. If this is the case it won't be too hard to fix, but is my assumption likely to be the culprit, or do I have worse things to fear?
UPDATE: here are captures of the memory timeline and heap snapshot. On the memory timeline there was no interaction with the site; the two large increases are page refreshes, and the middle sawtooth section is just my site idling.
Without seeing any code examples it doesn't sound too bad.
If you have a LOT of jQuery selectors, try to make them as specific as possible, especially if you're selecting many items frequently.
For example, if you have a bunch of elements with class "abc", try to specify where to look first: are they only found within table cells? Only within paragraph tags? The more specific you make your selector, the better. If you write the selector like this:
$('.abc')
then jQuery will search the entire DOM for anything that matches .abc. However, if you specify it as $('p .abc'), it will only search within paragraph tags.
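To make that concrete, a minimal sketch (the #orders-grid container ID is an assumption for illustration, not something from the question):

// Unscoped: scans the whole document
var $cells = $('.abc');

// Scoped: only looks inside one known container
var $container = $('#orders-grid');
var $scoped = $container.find('td.abc');

Caching the container in a variable also means repeated lookups don't re-query the whole DOM each time.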
Another performance killer is wiring up events and then never removing them. If you have any code that removes elements with event handlers attached to them, best practice is to remove the event handlers when the element is removed; otherwise you will start piling up orphaned events.
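A minimal sketch of that cleanup with jQuery (#old-panel is a hypothetical element):

// Unbind jQuery handlers on the element and all of its descendants first
var $panel = $('#old-panel');
$panel.find('*').addBack().off();
// Then drop it from the DOM
$panel.remove();

Note that .off() only removes handlers bound through jQuery; anything attached with raw addEventListener needs a matching removeEventListener. jQuery's .remove() also cleans up jQuery-bound handlers itself, so the explicit .off() matters most when you use .detach() or mix binding styles.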
If you are building a large single-page application, look at a library like Backbone (http://backbonejs.org/) or Angular (http://angularjs.org/) to see if it can help you; they alleviate a lot of the issues that people using plain jQuery run into.
Finally, this post (http://coding.smashingmagazine.com/2012/11/05/writing-fast-memory-efficient-javascript/) is seriously good at outlining how you can write fast, memory-efficient JavaScript and avoid the common performance pitfalls.
Hope this helps.
It does sound like you have a memory leak somewhere. Are you using recursion that isn't properly bounded? Do you have loops that could end early, but that you fail to break out of once you've found what you're looking for? Are you using something like this:
document.getElementById(POS.CurrentTableName + '-Menus').getElementsByTagName('td');
where the NodeList returned is huge and you only end up using a tiny bit of it? Those calls are expensive.
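If so, a sketch of the early-exit idea, reusing the selector from above (the match condition and highlight() helper are hypothetical):

var cells = document.getElementById(POS.CurrentTableName + '-Menus')
                    .getElementsByTagName('td');
for (var i = 0; i < cells.length; i++) {
    if (cells[i].id === wantedId) {  // hypothetical match condition
        highlight(cells[i]);         // hypothetical helper
        break;                       // stop scanning as soon as it's found
    }
}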
It could also be your choice of architecture. Hundreds of divs per grid doesn't sound logically manageable by a human brain. Do you address each div specifically by ID, or are they just an artifact of the library you're using, cluttering up the DOM? Have you inspected the DOM while using the app to see if you're adding elements in the hinterland by mistake, filling the DOM with junk you never use and causing it to grow continuously as you use the app? Are you adding event handlers to the same elements numerous times instead of just once?
For comparison, I too have a single-page app (a Google Chrome App: a multi-currency restaurant point of sale) with anywhere from 1,500 to 20,000 event handlers registered, making calls to a SQLite back end on a Node.js server. I use mostly pure JS, and all but 50 lines of the HTML is written in JS. I tie all the event handlers directly to the lowest-level element responsible for the event. Some elements have multiple handlers (click, change, keydown, blur, etc.).
The app operates at eye-blink speed and stays that fast no matter how long it's up. The DOM is fairly large, and I regularly destroy and recreate huge portions of it (a restaurant table is cleared and recreated for the next sitting), including adding up to 1,500 event handlers per table. Hitting the CLEAR button and having the screen refresh with the new table is almost imperceptible, admittedly on a high-end processor. My development environment is Fedora 19 Linux.
Without being able to see your code, it's a little difficult to say exactly.
If the UI works for a little while before it starts getting laggy, then it sounds likely that you have a memory leak somewhere in your JavaScript. This happens easily when using a lot of closures and nested function and variable references without cleaning them up when you're done with them.
Also, event binding to many elements can be a huge drain on browser resources. If possible, try to use event delegation to lower the amount of elements listening to events. For example:
$('table').on('click', 'td', myEventHandler);
Also be careful to make sure that event bindings only occur once, to avoid handlers being unintentionally fired many times.
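One hedged way to guard against double binding is a namespaced event, so re-running the setup code replaces the handler instead of stacking another copy:

// '.grid' is an arbitrary namespace; calling .off() first makes the setup idempotent
$('table').off('click.grid').on('click.grid', 'td', myEventHandler);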
Good luck!
Related
Clicking my button is taking >2s, before my code is reached (the code in the actual button handler runs in ~10ms).
It's bound as follows:
$('#my-div').find('.my-button').on('click', function(){ ... })
No other mouse event handlers are bound to this element or its parents or children.
There are a lot of these buttons, about a thousand on the page. I'm guessing that has something to do with it, but I still don't understand why it's quite so slow. Is there anything I can do?
You have a thousand buttons on your page, when a few dozen should be more than enough. That many controls is also an accessibility problem for most users.
Your page is around 1.6 MB. The bigger a page gets, the slower the browser can render and manipulate it. Most programmers assume a page is slow because of JavaScript, but most of the time the real reason is that the browser has to render and manipulate all of the HTML in the page. This is especially true for older versions of Internet Explorer.
You should set up only one event handler instead of a thousand (each of which the browser has to register and track individually)...
like so:
$('#my-div').on('click', '.my-button', function(){ ... })
I have a fairly good-sized JavaScript codebase (with React/Redux but no jQuery) for a webapp I'm building, and I've noticed that when I repeatedly open and close a certain panel within the UI, the number of listeners according to Chrome's performance timeline keeps increasing.
The graph shows the listener count climbing steadily with each open/close cycle and never dropping back down.
I let Chrome's performance monitor run for a good minute or two with the page sitting idle (just after opening/closing the panel a bunch), hoping that perhaps the listeners would get garbage collected, but they were not. I also switched to other tabs during this process, hoping the listeners would get collected while the tab was backgrounded, but unfortunately they were not.
I therefore suspect that some listeners are getting registered that are never unregistered.
This leads me to two main questions:
1) Does my hypothesis that listeners are getting added and never unbound seem sensible, or is there more I could be doing to confirm this suspicion?
2) Assuming my suspicion is correct, how can I best go about tracking down the code where the event listener(s) is/are being added? I have already tried the following:
Looked at the code responsible for opening the panel in question, found where it adds listeners, and commented out those portions to see if there was any change in the performance graph. There was not.
Overrode EventTarget.prototype.addEventListener like so:
var f = EventTarget.prototype.addEventListener;
EventTarget.prototype.addEventListener = function(type, fn, capture) {
    f.call(this, type, fn, capture);  // invoke the original with the correct `this`
    console.trace("Added event listener on " + type);
};
Even after doing this and commenting out all code portions that trigger this console.trace (see #1), so that nothing is printed when the panel is opened/closed, I notice the same increase in listeners in the performance graph. Something else is causing the listeners to increase. I understand that there are other ways listeners can be added, but it's not clear to me how to intercept all of those possibilities, or how to get them logged in Chrome's debugger in a way that tells me which code is responsible for adding them.
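One hedged debugging option is to wrap removeEventListener as well and watch the net count (a console sketch, not something to ship; duplicate add calls are no-ops in the browser, so the count can drift high):

(function () {
    var add = EventTarget.prototype.addEventListener;
    var remove = EventTarget.prototype.removeEventListener;
    var count = 0;

    EventTarget.prototype.addEventListener = function (type, fn, opts) {
        count++;
        console.trace("add", type, "net:", count);
        return add.call(this, type, fn, opts);
    };

    EventTarget.prototype.removeEventListener = function (type, fn, opts) {
        count--;
        console.log("remove", type, "net:", count);
        return remove.call(this, type, fn, opts);
    };
})();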
Edit:
At the suggestion of cowbert in the comments, I took a look at this page:
https://developers.google.com/web/tools/chrome-devtools/console/events
I then made the following function:
function printListenerCount() {
    // getEventListeners() is only available from the DevTools console
    var eles = document.getElementsByTagName("*");
    var numListeners = 0;
    for (var i = 0; i < eles.length; i++) {
        var listeners = getEventListeners(eles[i]);
        for (var type in listeners) {
            numListeners += listeners[type].length;
        }
        console.log("ele", eles[i], "listeners", listeners);
    }
    console.log("numListeners", numListeners);
}
I execute this function (from the DevTools console, since getEventListeners() only exists there) after having opened/closed the panel a bunch of times, but unfortunately the numListeners figure doesn't change. If it did, I would be able to diff the results before and after opening/closing the panel to discover which element has the extra listeners registered to it. One caveat I've noticed: the scan only covers elements, so listeners attached to window or document themselves wouldn't be counted.
There is also a monitorEvents() API described on https://developers.google.com/web/tools/chrome-devtools/console/events, but that call requires you to specify the DOM element you wish to monitor. In this situation I'm not sure which DOM element has the extra listeners, so I'm not sure how monitorEvents() would really help me. I could attach it to every DOM element, similar to what I did in printListenerCount above, but I presume I'd run into the same problem: for whatever reason, it isn't accounting for the listener(s) in question.
Other notes:
This is a somewhat complicated reactjs (preact, technically) based application. Like most reactjs-based apps, components get mounted/unmounted (inserted into and removed from the DOM) on the fly. I'm finding that this makes tracking down "stray event handler registrations" like this a bit tricky.

So what I'm really hoping for is some general debugging advice about how to track down stray event handlers in large/complex projects such as this. As a C programmer, I would open gdb and set a breakpoint on everything that could possibly cause the "listeners" number in the performance graph to increase. I'm not sure if there's an analog of that in the JavaScript world, and even if there is, I'm just not sure how to do it. Any advice would be much appreciated!
Thanks for your comments, everyone. I ended up figuring this out.
From my OP:
Does my hypothesis that listeners are getting added and never unbound seem sensible, or is there more I could be doing to confirm this suspicion?
It turns out that the answer to this question is: the hypothesis is not sensible. The listeners simply hadn't had a chance to be garbage collected yet. That can take more time than you might think.
Here's how I figured it out:
I failed to realize that while recording a performance timeline, it's possible to force a garbage collection by clicking the trash can icon in the Performance tab (the same tab used to start the timeline recording). After clicking this icon following repeated closings/openings of the UI panel, the extra listeners completely went away; in the new graph, the listener count dips at each moment I clicked the trash icon.
Apparently, backgrounding the tab and waiting a couple of minutes, as I mentioned in the OP, is simply not enough time for garbage collection to occur on its own; it takes longer than that.
I wasn't aware of the ability to manually trigger garbage collection with the trash can icon when I wrote the OP. I strongly recommend using it before going on any wild goose chase hunting down what might at first look like a performance problem.
I have a Chrome extension that modifies the DOM based on keywords. The problem is, for websites like Twitter that have an infinite scroll, I need a way for my function to keep firing as the user scrolls through the page.
Is .livequery() the only way to do this or is there a better way?
Right now all of the logic is plain JavaScript/jQuery, but I'm open to using a framework like Angular if that's the best way to do it.
I have several functions that interact:
1) a hide() function that adds a class to divs containing words I want hidden
2) a walk() function that walks the DOM and identifies divs to call hide() on
3) a walkWithFilter() function that gets words to filter from localStorage and calls the walk() function
The last function, walkWithFilter(), is called in a window.onload event.
It seems like the onScroll event would be a natural match for this. The trick is that you'd need to keep track of what has already been processed to avoid reprocessing old content. If you assume the user is always exposing new content below the existing content, that could be as simple as keeping a pointer to the last processed item and restarting the walkWithFilter method from there, as in the sketch below. That doesn't seem like an entirely safe assumption to me, though.
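A minimal sketch of that pointer idea, assuming a single container that new content gets appended to (#timeline is a made-up selector; walk() is the question's own function):

var lastProcessed = null;  // last element we've already filtered

window.addEventListener('scroll', function () {
    var container = document.querySelector('#timeline');  // assumed container
    var node = lastProcessed ? lastProcessed.nextElementSibling
                             : container.firstElementChild;
    // Only walk nodes appended since the last pass
    for (; node; node = node.nextElementSibling) {
        walk(node);  // reuse the existing walk()/hide() logic
        lastProcessed = node;
    }
});

In practice you'd also want to throttle or debounce the scroll handler so it doesn't run on every single scroll event.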
If you want to be more robust in that regard, you could try a virtual DOM approach: you maintain a copy of the DOM as you last saw it, compare it to the DOM as it currently exists, and take a diff. I know there are a bunch of premade libraries for this kind of thing, but I haven't used any and can't recommend a specific one (the link just goes to the first example that showed up in Google). It also doesn't appear to be overly burdensome to roll your own, if you're so inclined.
I'm working on a large site, and what I've been doing is using one main .js file to store all the bound JS code I want to use on elements (onclick, onchange, etc.), all held within one onDomReady method.
Now I'm wondering: is it such a good idea to have every page run through all of these and search for each element to see if there's anything to bind?
Or should I use more specificity, such as a main page ID like #page1, #page2, etc.? Or should I put the bindings in each specific page's header (I'd rather not, as I prefer to keep it all in one place)?
Just trying to optimize things and get rid of unnecessary overhead! :)
If I understand correctly, you have one JS file with all your event handlers, and this file is included in many pages.
So, for example, if there are 100 event handlers in the file, each page may use only 10 of them.
If that's the case, then it's not efficient, because you have lots of document.getElementBy... calls that don't find their elements (they belong to a different page), or worse, that find elements matching the same selector on multiple pages which should not be bound to a handler meant for one specific page.
You are also adding script to pages that don't need it.
It's best to give each page only what it needs, whether in an external JS file or, if it's very little script, in the document head.
The JS you share across pages should be code you intend to reuse often.
EDIT:
In response to comment:
Regarding reducing HTTP requests: you mean the one file will be in cache for other pages to use? Fair enough, that counts as a benefit. There are trade-offs, though, such as increased memory usage from JavaScript the page doesn't need.
Using a more specific selector will reduce the risk of attaching an event handler to the wrong element on a page you didn't mean to target, but there is a safer option:
If you insist on sharing one event-handler file across pages, group the handlers by wrapping them in a function, one per page, and call that function from the page.
This way you don't execute a bunch of code you don't need, and you don't risk attaching the wrong event handlers to similar elements across pages.
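A hedged sketch of that grouping (the page IDs and handler names are made up for illustration):

var pageInit = {
    page1: function () {
        $('#page1 .save').on('click', onSave);       // onSave is hypothetical
    },
    page2: function () {
        $('#page2 .filter').on('change', onFilter);  // onFilter is hypothetical
    }
};

// Each page calls only its own group, e.g. keyed off the body ID:
$(function () {
    var init = pageInit[document.body.id];
    if (init) init();
});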
Like many developers, I'm producing web-based applications that use AJAX to retrieve data and HTML.
I'm new to web development and JavaScript but have a couple of decades of experience programming in other languages.
I'm using MooTools, which is a great framework, but I've been battling with the lack of destructors in JavaScript, or even onDestroy/unload events for DOM elements.
I've written a number of UI classes (mostly to learn), and a lot of them use setInterval timers to periodically fetch data from the web server and update elements on the page (mostly images from cameras).
Most issues occur when another page is requested from the menu and the content div is reloaded with new HTML and JavaScript (using Request.HTML). This simply replaces all the elements already in the div with new ones and runs the new scripts, but any timers or objects created by the old scripts continue to run. This was leaving me with lots of orphaned classes, elements and timers.
I've been reading more on the MooTools site, have realized a number of mistakes I've been making, and have started to correct a lot of the issues. The biggest was linking my classes directly to the elements instead of using Element.store and Element.retrieve.
I've already found that the contents of the div being reloaded need to be freed by calling destroy on all its child elements before calling Request.HTML, but that will not clear any timers that are running.
So I've put together a JSFiddle here (deinitialize classes) to show what I've been trying. It appears to work fine, but what I want to know is:
Is it a good idea?
Are there any other issues I might have missed?
Can you see any problems with this type of implementation?
Or am I reinventing the wheel and missed something?
Explanation
When the class is initialized, it stores itself with the element.
It also appends itself into an AssocClasses array (creating it if necessary), which is also stored with the element.
I've created a ClearElement function that is called whenever the contents of an element are about to be replaced by an AJAX call or other method. It gets all elements within the div and, if they have an AssocClasses array attached, calls deinitialize on each of the classes in the array; then it calls destroy on each of the element's direct children to free the elements and their storage. A rough sketch of the idea is below.
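A hedged reconstruction of that ClearElement idea (AssocClasses and deinitialize are the question's own names; this is a sketch of the described approach, not the actual fiddle code):

function ClearElement(el) {
    // Let every class associated with a descendant clean itself up first
    el.getElements('*').each(function (child) {
        var assoc = child.retrieve('AssocClasses');
        if (assoc) {
            assoc.each(function (klass) {
                klass.deinitialize();  // expected to stop timers, unbind events, etc.
            });
        }
    });
    // Then free the elements themselves (and their storage)
    el.getChildren().each(function (child) {
        child.destroy();
    });
}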
Any information, pointers, etc. would be most gratefully received.
Regarding: "Any timers in the old scripts or old objects created will continue to run. This was leaving me with lots of orphaned classes, elements and timers."
I would rethink your timer storage and your use of evalScripts in your AJAX calls.
Keep these outside of your AJAX requests; in peer code reviews I have rarely seen an instance where they were needed and couldn't be done a better way.
Instead, have the clicked link trigger a callback via the request's onComplete or onSuccess event.
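A hedged sketch of that shape (the URL, target element, and initPanel are assumptions for illustration):

new Request.HTML({
    url: 'panel.php',      // hypothetical endpoint
    update: $('content'),  // the div being reloaded
    evalScripts: false,    // don't run scripts embedded in the response
    onSuccess: function () {
        initPanel();       // hypothetical initializer run after the swap
    }
}).get();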
Without seeing your exact code, it's hard to advise further.