I have an html ul list of 10,000+ elements and want to add custom hover tooltip events and do some other processing to each one. To do this on document.ready takes 2-3 seconds and freezes the browser. How can I do this asynchronously so that the browser doesn't freeze?
I've been reading about setTimeout, jQuery queue and deferred, but maybe I'm too dense to understand it all. This guy had interesting stuff http://erickrdch.com/2012/05/asynchronous-loop-with-jquery-deferred.html
Here's my each() loop that adds the hover.
$('#biglist li').each(function(index) {
    $(this).hover(function(e) {
        // ...do stuff...
    });
});
Thanks for your help.
Why do you have it encased in a loop? Try just applying the hover behaviour to #biglist with .on(), and then delegating to each li:
// the 'hover' pseudo-event was removed in jQuery 1.9, so bind mouseenter/mouseleave explicitly
$('#biglist').on('mouseenter mouseleave', 'li', function() {
    // do crap
});
The .each() loop is likely wreaking havoc by doing processing on each item. As a side note, if you need different functions for mouseenter vs mouseleave, you need to use the object (map) format (using the .on() method is slightly different from the traditional .hover() method):
$('#biglist').on({
    mouseenter: function() {
        // do mouseenter crap
    },
    mouseleave: function() {
        // do mouseleave crap
    }
}, 'li');
Either way this should greatly reduce CPU consumption from processing.
You should bind it to #biglist using .on(), instead of binding to 10,000+ elements:
$('#biglist').on('mouseenter mouseleave', 'li', function(e) {
    // ...
});
Browser JavaScript does not support true concurrency (apart from Web Workers, a relatively new HTML5 feature), so you can't have one process run in the background while the page continues to work normally.
The best you can do AFAIK is to cut your processing code into chunks as small as possible, then execute them with setTimeout or setInterval, using a low value for delay (but preferably non-zero, so the rest of the page won't freeze).
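As a rough sketch of that chunking idea (the batch size and the work inside the loop are placeholders):
// Rough sketch: process the 10,000+ <li> elements in small batches so the browser
// can repaint and handle input between batches.
var items = $('#biglist li').get(); // plain array of DOM nodes
var CHUNK = 100;                    // arbitrary batch size

function processChunk() {
    var batch = items.splice(0, CHUNK);
    $.each(batch, function(i, el) {
        // ...do stuff with el (e.g. build tooltip data)...
    });
    if (items.length) {
        setTimeout(processChunk, 10); // small non-zero delay keeps the page responsive
    }
}

processChunk();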
Setting aside the fact that you don't need a different hover handler for each element, as others pointed out, if you need to do heavy processing of your received data (and such processing can't be done by the server) one way would be using a queue:
var queue = [];

setInterval(function() {
    var next = queue.shift();
    if (next) next();
}, 50);

// ...

$.each(lotsOfData, function(index, value) {
    queue.push(function() {
        // Code for processing the value
    });
});
Related
I run an e-commerce website and we have various third-party JavaScript libraries that add click handlers to links and forms and then insert a delay to make sure the tracking goes through. The delay is inserted by burning CPU: running a for or while loop until a certain date has passed, either 350 ms later (Marketo / Munchkin) or 500 ms later (Visual Website Optimizer). Combined, this is almost 1 second!
Sometimes the delay may be worthwhile so the tracking can be more reliable. However, we don't want this to happen when you click most links on our site, because it adds up to 1 second of delay for the user. With that big a delay, there go all the other performance optimizations we've done!
Unfortunately, we need a lot of the functionality of these scripts (like Visual Website Optimizer and Marketo) so we can't remove them.
Is it possible to remove the handlers they've added or prevent them from firing, given that I don't have references to them?
You can use unbind to remove events.
If you need to know the event name or type, you can find it with Chrome Dev Tools.
Another method is to capture the event and stop its propagation.
EDIT:
If the event was not bound with jQuery, you can use removeEventListener or set the element's on-event property (e.g. onclick) to null.
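For the capture-and-stop-propagation idea, here is a rough sketch (the selector is just an example, and whether this blocks a given script depends on where it attached its handlers):
// Rough sketch: a capturing listener on document runs before handlers attached to the
// links or their ancestors, so stopping propagation here keeps those third-party
// handlers from firing while the link's default navigation still happens.
document.addEventListener('click', function(e) {
    var link = e.target.closest ? e.target.closest('a') : null;
    if (link) {
        e.stopPropagation(); // handlers below document never see the click
    }
}, true); // true = capture phase

// If the tracking handlers were bound with the same jQuery instance, you can also
// remove all jQuery click handlers from the links without holding a reference to them:
$('a').off('click');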
You cannot remove an event listener without a reference to the handler function, but what you can do is clone the element:
// Cloning copies the element and its children but not the event listeners
// attached to them, so replacing each link with its clone strips the handlers.
var elements = document.getElementsByTagName('a');
for (var i = 0; i < elements.length; i++) {
    var cloned = elements[i].cloneNode(true);
    elements[i].parentNode.replaceChild(cloned, elements[i]);
}
Well, an alternative workaround might be to wrap setTimeout so that it runs with a smaller delay, or none at all:
window.oldSetTimeout = window.setTimeout;
window.setTimeout = function(func, delay) {
    // change the delay value here before passing it on
    return window.oldSetTimeout(function() {
        func();
    }, delay);
};
You can use pure JavaScript to remove the event listeners.
At the start of your program, just call a function to remove the event listener, something like this:
var domElement = document.getElementById('elementWithEventListener');
// note: 'listener' must be the same function reference that was originally passed to addEventListener
domElement.removeEventListener('click', listener);
And yes, you will have to find out what listeners are getting fired.
Check out Event listeners for more info.
I am using an infinite scroll plugin which uses ajax.
When the 'next page' is loaded via ajax, all other ajax-related scripts that are on the next page do not work. I have been told that I have to use 'delegated events' (i.e. change $(id).click() to $(document).on()) - the problem is that means editing multiple plugins and changing dozens of function calls.
Is there any way I can avoid changing everything to $(document).on() and do something cool with the infinite scroll instead?
I'd much rather modify the infinite scroll plugin rather than modifying other ajax related plugins to make them fit.
Unfortunately you have very few options here, and switching to delegated events is by far the best of them.
The problem is that your old code was assigning behaviour to "particular elements" when what it should really have been doing is creating page-wide responses to "certain types of actions".
I see 3 possibilities, and only one of them is guaranteed to work.
1. Run any scripts that are needed on new pages each time a new page is loaded. The downside here being that unless you are careful about also "tearing down" between content loads you will have behaviours repeating or colliding with each other (eg: double popups, broken animations).
2. Encapsulate the dynamic areas in <iframe>s. Depending on your architecture this may or may not be possible, and certainly won't be easy to integrate with some kind of infinite scrolling plugin which already expects a certain page structure.
3. Bite the bullet and fix the crappy code.
Loading scripts inside your ajax-loaded content is a bad way to start with anyway. What you need is event delegation, so the handlers apply to any dynamically added elements.
$("body").on("click", ".yourclass", function() {
//This function will run for every element with `yourclass` class you load via ajax
});
If you must keep using .click() then you must have a function you can call on the new content to re-hook the events every time you add more content to the page.
Edit: though it is worth noting that a change from .click to .on can often be handled by a properly structured find/replace.
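If you do go that route, a rough sketch of such a re-binding helper (the function and selector names here are made up):
// Hypothetical re-binding helper: call it on every chunk of markup the infinite
// scroll appends, so the old-style .click() bindings are applied to the new elements.
function initAjaxContent($container) {
    $container.find('.buy-button').click(function() {
        // ...the same handler that was originally attached on page load...
    });
    // ...re-run any other plugin initialisers, scoped to $container...
}

// e.g. in the infinite scroll plugin's "content appended" callback:
// initAjaxContent($(newElements));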
Event delegation is the correct solution. The issue is that the HTML elements on the "next page" were not part of the DOM when the page loaded. Therefore, if you did something like:
$(function() {
    $('#some-element-on-the-next-page').click(function() {
        foo();
    });
});
Your handler did not bind.
I wouldn't attach the events to $(document). I would attach them to the closest parent which is available when the DOM loads. For example, the body tag or the fixed width wrapper which is the first child of the body (assuming your layout uses this type of structure.)
Make sure that the element that you attach to is not emptied with .empty() or repopulated with .html() as that will break the binding. Attaching the delegated handlers lower down on the DOM tree will give you better performance since the events will not have to bubble all the way up to the document node to fire your methods.
You shouldn't need to rewrite all of your functions and plugins, just the bindings to the events that fire them.
I typically use the module pattern and de-couple my method definitions from the click handlers. All of my methods are defined in the outer closure. I'll have a "document ready" section where I bind user events like clicks.
For example:
var myModule = (function() {
    // "public" is reserved in strict mode, so name the exported object something else
    var pub = {};

    pub.foo = function() {
        // do something cool here
    };

    // document ready
    $(function() {
        $('#site-container').on('click', '.js-foo', function() {
            pub.foo();
        });
    });

    return pub;
})();
If you need to change the bindings in the future you will only need to change the call inside the document ready section.
I'm using (and contributing to) a jQuery table sorting plugin that includes an event for extra processing before the table is sorted. Originally the browser didn't do a repaint before sorting, so I added a setTimeout call to the plugin code that should force a repaint. So the code is now like this:
$table.trigger("beforetablesort", {column: th_index, direction: sort_dir});
setTimeout(function() {
// do the hard work
}, 10);
My beforetablesort callback is like this:
table.bind('beforetablesort', function(event, data) {
    $("table").css({opacity: 0.5});
});
The above all works fine. However, if I use addClass instead of inline styles, the changes from that class do not show until the table is fully sorted:
table.bind('beforetablesort', function(event, data) {
    $("table").addClass('disabled');
});
If I increase the timeout to over 500ms, the opacity does change. It seems like it takes a tiny bit longer for a class change to be visible as opposed to an inline style change. But by the time the browser is ready to repaint it's already doing the table sorting.
Is there a way to force the repaint earlier? Or wait until the repaint for the table sorting code to run? Increasing the timeout arbitrarily doesn't seem like a good solution as it forces all tables to take at least half a second to sort. (Full code of the plugin is here on Github if it helps.)
Maybe you have already considered using $.Deferred() (http://api.jquery.com/jQuery.Deferred/).
The code in the timeout should go in the callback handler that runs when the deferred object is resolved.
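A minimal sketch of that idea, assuming the plugin were changed to hand the handlers a deferred they can resolve (the 'ready' property here is hypothetical, not part of the plugin's current API):
// Plugin side: create a deferred, let "beforetablesort" handlers resolve it,
// and only start the expensive sort afterwards.
var ready = $.Deferred();
$table.trigger("beforetablesort", {
    column: th_index,
    direction: sort_dir,
    ready: ready // handlers call ready.resolve() once the UI has been updated
});
ready.done(function() {
    setTimeout(function() {
        // do the hard work
    }, 10);
});

// Page side:
table.bind('beforetablesort', function(event, data) {
    $("table").addClass('disabled');
    data.ready.resolve();
});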
The comment by Jan Dvorak above works.
Reading a computed style property forces the browser to perform any pending reflow; hopefully it ensures a repaint as well.
I just added the line $table.css("display"); and the browser repaints the table before the sorting starts.
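For reference, this is roughly how the plugin code ends up looking with that line added (a sketch, not the exact plugin source):
$table.trigger("beforetablesort", {column: th_index, direction: sort_dir});
// Reading a computed style forces the browser to flush pending style/layout work
// before the expensive sorting starts.
$table.css("display");
setTimeout(function() {
    // do the hard work
}, 10);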
So I'm using YUI to add some animations in my application triggered when a user clicks certain elements. However, I'm running into a common problem that is easily fixed with some poor coding, but I'm looking for a more elegant solution that's less error-prone.
Often, when the user clicks something, a DOM element is animated (using Y.Anim) and I subscribe to that animation's 'end' event to remove the element from the document after its animation has completed. Pretty standard stuff.
However, problems arise when the user decides to spam-click the element that triggers this event. If the element is going to be removed from the DOM when the animation ends, and the user triggers an event handler that fires off ANOTHER animation on the same element, this 2nd animation will eventually cause YUI to spit really nasty errors because the node it was animating on suddenly disappeared from the document. The quickest solution I've found for this is to just set some module/class-level boolean state like 'this.postAnimating' or something, and inside the event handler that triggers the animation, check if this is set to true, and if so, don't do anything. In the 'end' handler for the animation, set this state to false.
This solution is really, really not ideal for many reasons. Another possible solution is to detach the event handler for duration of the animation and re-attach it once the animation is complete. This is definitely a little better, but I still don't like having to do extra bookkeeping that I could easily forget to do if forgetting to do so leads to incomprehensible YUI errors.
What's an elegant and robust way to solve this problem without mucking up a multi-thousand-line Javascript file with bits and pieces of state?
Here's some example code describing the issue and my solution to it.
var popupShowing = false,
    someElement = Y.one('...');

var showPopup = function(e) {
    if (!popupShowing) {
        popupShowing = true;
        var a = new Y.Anim({
            node: someElement,
            duration: 0.2,
            ...
        });
        a.on('end', function() {
            someElement.remove(true);
            popupShowing = false;
        });
        a.run();
    }
};

someElement.on("click", showPopup);
So if the user clicks "someElement" many times, only one animation will fire. If I didn't use popupShowing as a guard, many animations on the same node would be fired if the user clicked quickly enough, but the subsequent ones would error out because someElement was removed when the first completed.
Have a look at the Transition API. It's more concise, and may very well do what you want out of the box.
someElement.transition({ opacity: 0, duration: 0.2 }, function () { this.remove(); });
// OR
someElement.on('click', function () { this.hide(true, { duration: 0.2 }); });
// OR
someElement.on('click', someElement.hide);
Personally, I haven't used Anim since Transition was added in 3.2.0. Transition uses CSS3 where supported (with hardware acceleration), and falls back to a JS timer for older browsers.
http://developer.yahoo.com/yui/3/examples/transition/transition-view.html
Edit: By popular demand, a YUI way:
myAnim.get('running')
tells you whether an animation is running. To use this you might have to restructure the way you call the event so the animation is in the right scope, for example:
YUI().use('anim', function(Y) {
    var someElement = Y.one('...');
    var a = new Y.Anim({
        node: someElement,
        duration: 0.2,
        ...
    });
    a.on('end', function() {
        someElement.remove(true);
    });
    someElement.on('click', function() {
        if (!a.get('running')) {
            a.run();
        }
    });
});
jsFiddle Example
Previously I had said: I personally like the way jQuery handles this. In jQuery, each element has a queue for animation functions. During animations an "in progress" sentinel is pushed to the front for the duration of the animation so anything that doesn't want to step on an animation peeks at the front of the queue for "in progress" and decides what to do from there, e.g. do nothing, get in line, preempt the current animation, etc.
I don't know enough about YUI to tell you how to implement this, but I find it to be a very elegant solution.
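For comparison, a jQuery-flavoured sketch of that check (not YUI; the element and durations are made up):
// Skip the click if an animation is already in flight. jQuery marks a running fx
// queue internally, and the :animated selector is the convenient way to test for it.
$('#someElement').on('click', function() {
    var $el = $(this);
    if ($el.is(':animated')) {
        return; // do nothing - or queue up, or preempt, depending on the desired behaviour
    }
    $el.fadeOut(200, function() {
        $el.remove();
    });
});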
Quick & dirty solution to this particular issue?
Attach your handlers using .once() instead
someElement.once("click", showPopUp)
Also suitable if you need the handler re-attached later; just call that line again when the animation is done. You could also store your state information on the node itself using setData/getData, but that is just a band-aid over the real problem of state tracking.
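A short sketch of that re-arming idea (it only makes sense when the node is not being destroyed in the 'end' handler):
// Arm the handler for a single click, then re-arm it when the animation finishes.
someElement.once('click', showPopup);

a.on('end', function() {
    // ...cleanup...
    someElement.once('click', showPopup); // the next click is accepted only now
});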
Also, +1 to Luke's suggestion to use Transition for DOM property animation, it's grand.
I've made a JavaScript menu with CSS and JavaScript. I've used some MooTools (1.11, the only version I can use).
The script runs on domready; it fetches a DOM element and adds functions (showmenu, hidemenu) to the mouseenter and mouseleave events. The DOM element is three levels of nested ul/lis.
Now I want to add a delay on the menu of 500 ms when one hovers over the menu for the first time, and again when the users leaves the menu (so that the user has half a second time to get back to the menu).
I don't know how to keep track of the events and cancel them. My knowledge of JavaScript is not good enough to know where to start. Can anyone give me an example of how I should create this? I'm not really looking for cut-and-paste code, more pointers on the workings of JavaScript, which native functions I could use, and what the best way is to set something like this up.
Thanks in advance.
p.s. Maybe I also want to have a delay (100 ms or so) when the user is already using the menu, for the items to show up. Will this be a lot more complex?
Perhaps this can give you an idea: http://www.jsfiddle.net/dimitar/stthk/ (I extracted it from another menu class I am working on and modded it to add the delay for you as an example).
Basically, there are several interesting bits:
options: {
    showDelay: 500,
    hideDelay: 500
},
This defines your delays on mouseover and mouseout.
And then the bind for mouseenter is deferred via .delay():
mouseenter: function() {
    $clear(_this.timer);
    _this.timer = (function() {
        this.retrieve("fold").setStyle("display", "block");
    }).delay(_this.options.showDelay, this);
},
mouseleave: function() {
    $clear(_this.timer);
    _this.timer = (function() {
        this.retrieve("fold").setStyle("display", "none");
    }).delay(_this.options.hideDelay, this);
}
_this.timer is a shared var that holds the deferred function - it gets cleared on either mouseout or mouseover. If no event that matters takes place within the allotted time period, it will change the display accordingly; otherwise, it will cancel the function.
This is for MooTools 1.2.5, btw (storage system + element delegation), but the principle remains the same for the bits that matter.
The stylish way of doing it would be to fade the menu in/out. You do that with Fx.Morph, where you morph the opacity CSS property and use the complete callback to actually remove the div - note that making this work in IE5-7 takes some extra care.
The more basic/sensible way is to use setTimeout().
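A minimal plain-JavaScript sketch of that setTimeout/clearTimeout idea (the element id and class names are made up; in MooTools 1.11 you would do the same inside your existing mouseenter/mouseleave handlers and cancel the pending timer the way the answer above does with $clear):
// Remember the pending timer and cancel it whenever the pointer comes back
// (or leaves again) before the delay has elapsed.
var menu = document.getElementById('menu'); // hypothetical element id
var hoverTimer = null;

menu.addEventListener('mouseenter', function() {
    clearTimeout(hoverTimer);           // cancel a pending hide
    hoverTimer = setTimeout(function() {
        menu.className = 'menu open';   // show the menu after 500 ms
    }, 500);
});

menu.addEventListener('mouseleave', function() {
    clearTimeout(hoverTimer);           // cancel a pending show
    hoverTimer = setTimeout(function() {
        menu.className = 'menu';        // hide 500 ms after the pointer leaves
    }, 500);
});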