I created a web page for viewing images. This page has some other code that gets included that I did not write. The page loads 40 small images on load, then the user scrolls down and additional pages of 40 images can be loaded via AJAX. Once I get to 15-20 pages, the page begins to slow significantly. Checking the app counters, CPU can go up to 100% and memory can go over 3GB. Then I inevitably get the modal warning that jQuery is taking too long to execute, asking if I want to stop the script. Now I realize that a page with up to 800 images is a big load, but the jQuery warning suggests to me that some code may also be iterating over this larger and larger group of DOM objects. It almost appears to get exponentially slower as I pass 15 pages or so, and once I get to 20 pages it becomes almost unusable.
First of all, is it even possible to run a page efficiently, even with minimal JS, when you have this many images? Secondly, is there a recommended way to "trace" JS and see what kinds of functions are getting executed, to help determine the most likely culprit? This is most important to me - is there a good way to do this in Firebug?
Thanks :)
EDIT - I found my answer. I had some older code which was being used to replace images that failed to load with a generic image. This code was using jQuery's .each() method and thus was iterating over the entire page, plus each new AJAX addition, every time more images loaded. I am going to set a CSS class on the images that need to be checked so that the AJAX-loaded images are unaffected.
Firebug, like all the other debugging tools, lets you profile your functions. You can see how long they take to run and how many times they have been called.
http://getfirebug.com/javascript
See: Profile JavaScript performance
Another useful tool to look into is the console.profile() function:
console.log('Starting Profile');
console.profile();
SuspectFunction();
console.profileEnd();
Through the console window in the debugger you can see the profile results.
The best tool I have used is https://developers.google.com/web-toolkit/speedtracer/ for Chrome
To answer your first question, 15 pages of images should not be a problem for a computer to handle. Google loads up to 46 pages of images without lagging at all, although it does stop you from loading more after that.
To answer your second question, there are many ways to trace JS code. Since you are doing performance-related debugging, I'd go with a timestamped console log:
console.log(" message " + new Date());
I'd put one at the beginning and end of each function you are interested in measuring, and read through the log to see how long each of those functions takes to execute. You can compare the timestamps to see what excess code is executing and how long it took.
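For example, a minimal sketch around one suspect function (loadNextPage is just a placeholder name, not from the question):

console.log('loadNextPage start: ' + new Date());
loadNextPage();   // placeholder for the function being measured
console.log('loadNextPage end: ' + new Date());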
Finally, in Firebug, go to the Console tab and click on Profile before you start scrolling down the page. Then scroll to your 15th page and click Profile again. It breaks down the functions called and the amount of time each took.
I prefer to use the timer function in Firebug or Chrome, it can be called like this:
function someFunction(){ /* ... */ }

console.time('someFunction timer');
someFunction();
console.timeEnd('someFunction timer');
This isn't as robust as the profiler functions, but it should give you an idea of how long functions are taking.
Also, if you are running at 100% CPU and 3GB of memory, you almost certainly have a memory leak. You may want to consider removing some of the initial images when more pages are loaded in. For example, after 5 pages have been shown, remove the first page when the user views the 6th page.
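A minimal sketch of that pruning idea, assuming each AJAX page of images is appended inside a hypothetical #imageContainer element with an .image-page wrapper per page:

function appendPage($newPage) {
    $('#imageContainer').append($newPage);          // #imageContainer is hypothetical

    var $pages = $('#imageContainer .image-page');
    if ($pages.length > 5) {
        // Drop the oldest page so the DOM (and memory) stays bounded
        $pages.first().remove();
    }
}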
I was able to fix the problem by going over my code again. I was loading new images via AJAX, but I had an older line of code that was checking all images, i.e. $('img'), to replace any images that failed to load with a generic image. This means that as I continually loaded new images, this selector had to iterate over the entire, growing DOM again and again. I altered that code and now the page is flying! Thanks everyone for the help.
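For reference, a minimal sketch of the scoped approach (the names are illustrative, not the actual code): instead of re-checking every image on the page, run the fallback only over the images that were just added.

// Old approach: iterates over every image on the page on every AJAX load
// $('img').each(replaceIfBroken);

// Scoped approach: only check the images returned by this AJAX call
function checkNewImages($newImages) {
    $newImages.each(replaceIfBroken);   // replaceIfBroken stands in for the existing fallback logic
}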
Related
As some background, I am working on building a web slider activity that fits into/extends another piece of software that outputs HTML. The key point being that I have very little to no control over how the software outputs its HTML elements, but I can add arbitrary JavaScript and HTML elements on top of what the software outputs.
What I am trying to do is delay the loading of some images that I want to add until after all the software's outputted elements are loaded. So far I have it working by waiting to load the images until after the window load event is fired. Like so:
//-----Simplified example-----
$(document).on('ready', function(){
    // Some stuff to set things up
    $(window).one('load', function(){
        $('#SomeDiv').append(
            '<img class="someClass" id="arbitraryImage1" src="folder/image1.jpg">'
        );
    });
});
Though it is working right now, I'm not sure what will happen if something goes wrong. The HTML outputted by the software has a bad habit of including things that it might not need, as well as occasionally not loading images on slower connections. The majority of the time, if it does fail the user will not notice, so it isn't an issue. However, I would like to know how $(window).one('load', function(){}) behaves when a problem arises, so that I can work around the error or at least make the activity fail gracefully. Any assistance or links to documentation would be appreciated; the generic-ness of 'window' and 'load' brings up a lot of irrelevant information when googling.
To summarize:
What is the behaviour of $(window).one('load', function(){}) when the window fails to load a resource (image, css, etc.)?
Good question. The documentation says:
The load event fires at the end of the document loading process. At this point, all of the objects in the document are in the DOM, and all the images, scripts, links and sub-frames have finished loading.
https://developer.mozilla.org/en-US/docs/Web/API/GlobalEventHandlers/onload
But in practice, you'll find that the load event fires after all assets queued to load when the document was originally parsed (stylesheets, external scripts, img tags) have loaded, or failed loading. As long as no requests are still pending, the fact that these assets have loaded successfully or not has no bearing on the load event being fired.
The documentation is in need of a clarification, in my opinion, as it doesn't address your question.
If the image fails to load, there will simply be no image shown. If it is unable to find SomeDiv, then nothing will be appended anywhere. Generally, if a JavaScript function fails for some reason, nothing bad will happen - the broken functionality just won't work, the load event will still fire, and the console will report the errors.
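If you do want to react to a failed image yourself, a minimal sketch (reusing the markup from the question; the placeholder path is hypothetical) is to bind an error handler to the image before giving it a src:

$(window).one('load', function(){
    var $img = $('<img>', { 'class': 'someClass', id: 'arbitraryImage1' });
    // Bind the error handler before setting src, so a failure is never missed
    $img.on('error', function(){
        this.src = 'folder/placeholder.jpg';   // hypothetical fallback image
    });
    $img.attr('src', 'folder/image1.jpg');
    $('#SomeDiv').append($img);
});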
bit of a heisenbug here...
Have a PHP/codeigniter app here. Pretty sure the controller/model etc. are sound and without bugs. Gotta be a client-side problem...
Very simple code like this in a page:
<div id="stuff">I'm empty now!</div>
<script type="text/javascript">
    $(document).ready(function(){
        var stuffID = <?=$id?>;
        $.post('/event/viewStuff/' + stuffID,
            function(response) {
                $('#stuff').html(response);
            }
        );
    });
</script>
After loading the above, the "stuff" div now has a grid of stuff, plus links to page through them 10 at a time, which look like this:
Next Page
But every time I click these links, the page freezes for 4-5 seconds, doing what I don't know - no network activity, nothing in the Chrome debugger. Then the "stuff" div reloads with the results. Also, the "Please wait" message is not shown. Weird thing is, when I yank the document.ready() function, reload the page, and just click on a bare / hard-coded paging link like the one above, it fires away as fast as expected.
Thanks so much for taking time to read.
NEW INFO:
xdebug profiling shows nothing unusual - about a 1-second function call back to the controller/view to return content, as expected.
The Chrome profiler shows this - a stupid, unexplained idle.
Any additional insight on the b.event.remove() jQuery function that's taking 6+ seconds? That seems to be the issue.
Fixed it - but don't understand the root cause:
As described above - there was a massive delay every time I clicked the "Next Page" link.
Javascript profiling proved that jquery 1.9.1 was hanging for ~6 seconds on b.event.remove(). No idea why. I thought to myself: "Hmm... maybe I've triggered some weirdly inefficient tear-down function. That #stuff div is rather large (lots of sub-forms etc)."
So I tried clearing it out first by adding this command to the link click code:
$('#stuff').html('');
No dice. Then I tried:
$('#stuff').empty();
Same deal - took even longer.
On a lark I then tried the non-jquery method:
document.getElementById('stuff').innerHTML = '';
It worked! The rest of the click-code executed immediately - no more delay!
Question is why??????
Thanks all!
Try to use 127.0.0.1 instead of localhost.
With MySQL over PDO there is something along those lines that costs performance, so just test it.
I submitted my HTML5 game to an HTML5 game platform for QA review and got the following feedback:
You should not perform time consuming tasks before window.onload but the actual is: DOM is blocked.
I am not sure what this review means. What I am doing now is:
$(window).ready(
....
);
So this is the entry point of all the code. What is the difference between $(window).ready, $(document).ready(), and window.onload? Do they follow a fixed order as to when they are triggered? And what does "DOM is blocked" mean?
DOM ready vs window.load
$(document).ready(function(){}); is triggered once the HTML is fully parsed and rendered, but before all assets (images, iframes and so on) are downloaded. $(window).load is executed right after all images are downloaded.
This means: $(document).ready comes always before $(window).load.
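A minimal sketch to see the order for yourself - just log from both handlers:

$(document).ready(function(){
    console.log('DOM ready: HTML parsed, images may still be loading');
});

$(window).on('load', function(){
    console.log('window load: images, iframes and other assets have finished loading');
});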
DOM is blocked
This is quite untechnical and not really the truth. It showcases that you can manipulate the DOM at "DOM ready", and actually this is the right event for doing so. What they mean is the following: as long as a script is executing, the browser's main thread is blocked, which means the browser cannot render/paint anything during this time and becomes unresponsive.
The idea behind this advice is the following:
If you execute a script on DOM ready, painting stops and is delayed until after the script has executed. The user might see some sort of UI, because at this point the DOM is already rendered and in most cases the CSS is too, but not the images. If you delay the script until after window.onload, the browser can also render images and iframes before you block the main thread, which means the user might see a fully rendered site/game sooner, although it isn't technically ready yet.
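A minimal sketch of what the reviewer is asking for (initGame() is a hypothetical entry point for the expensive setup, not from the question):

// Cheap DOM wiring can stay at DOM ready
$(document).ready(function(){
    // set up only what is needed for the first paint
});

// Defer the time-consuming work until everything has rendered
$(window).on('load', function(){
    initGame();   // hypothetical heavy entry point
});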
My 2 cents
Whether this is a good approach or not really depends on a lot of circumstances. If your JS does a lot of UI work, it is bad, because the user won't see the final UI sooner - they will see it later. If your JS is important for the page to work and the page has some bigger images, it is quite stupid to delay executing the script. Consider a mobile user: he might already see the UI to start the game, but your click/tap event isn't yet bound, because there is a big image at the end of your page which still needs to load.
If you have a performance problem, fix it; don't delay it to another point, as this will also block the main thread. But what you can do: if you have a lot of scripts, you can split those tasks into chunks and execute them at the point when they are really needed, not all initially. For example: you have a menu, and this menu has a hidden sub-menu which initially needs some special DOM manipulation. Don't do this at DOM ready or on window.load; do it right before it opens for the first time.
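A minimal sketch of that lazy-initialisation idea (the selectors and initSubMenu() are hypothetical, not from the answer):

var subMenuInitialised = false;

$('#menu').on('click', '.has-submenu', function(){
    if (!subMenuInitialised) {
        initSubMenu();              // hypothetical expensive DOM manipulation
        subMenuInitialised = true;
    }
    $(this).find('.submenu').toggle();
});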
Showing then hiding an animated indicator / spinner gif is a good way to show a user that their action has worked and that something is happening while they wait for their action to complete - for example, if the action requires loading some data from a server via AJAX.
My problem is, if the cause of the slowdown is a processor-intensive function, the gif freezes.
In most browsers, the GIF stops animating while the processor-hungry function executes. To a user, this looks like something has crashed or malfunctioned, when actually it's working.
JSBIN example
Note: the "This is slow" button will tie up the processor for a while - around 10 seconds for me, will vary depending on PC specs. You can change how much it does this with the "data-reps" attr in the HTML.
Expectation: On click, the animation runs. When the process is finished, the text changes (we'd normally hide the indicator too but the example is clearer if we leave it spinning).
Actual result: The animation starts running, then freezes until the process finishes. This gives the impression that something is broken (until it suddenly unexpectedly completes).
Is there any way to indicate that a process is running that doesn't freeze if JS is keeping the processor busy? If there's no way to have something animated, I'll resort to displaying then hiding a static text message saying Loading... or something similar, but something animated looks much more active.
If anyone is wondering why I'm using code that is processor-intensive rather than just avoiding the problem by optimising: It's a lot of necessarily complex rendering. The code is pretty efficient, but what it does is complex, so it's always going to be demanding on the processor. It only takes a few seconds, but that's long enough to frustrate a user, and there's plenty of research going back a long time to show that indicators are good for UX.
A second related problem with gif spinners for processor-heavy functions is that the spinner doesn't actually show until all the code in one synchronous set has run - meaning that it normally won't show the spinner until it's time to hide the spinner.
JSBIN example.
One easy fix I've found here (used in the other example above) is to wrap everything after showing the indicator in setTimeout( function(){ ... },50); with a very short interval, to make it asynchronous. This works (see first example above), but it's not very clean - I'm sure there's a better approach.
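To make that concrete, a minimal sketch of the workaround (showSpinner, doHeavyRendering and hideSpinner are hypothetical placeholders):

showSpinner();

// Yield to the browser for a moment so the spinner actually paints,
// then run the heavy work asynchronously
setTimeout(function(){
    doHeavyRendering();
    hideSpinner();
}, 50);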
I'm sure there must be some standard approach to indicators for processor-intensive loading that I'm unaware of - or maybe it's normal to just use Loading... text with setTimeout? My searches have turned up nothing. I've read 6 or 7 questions about similar-sounding problems but they all turn out to be unrelated.
Edit Some great suggestions in the comments, here are a few more specifics of my exact issue:
The complex process involves processing big JSON data files (as in, JS data manipulation operations in memory after loading the files), and rendering SVG (through Raphael.js) visualisations including a complex, detailed zoomable world map, based on the results of the data processing from the JSON. So, some of it requires DOM manipulation, some doesn't.
I unfortunately do need to support IE8 BUT if necessary I can give IE8 / IE9 users a minimal fallback like Loading... text and give everyone else something modern.
Modern browsers now run CSS animations independently of the UI thread if the animation is implemented using a transform, rather than by changing properties. An article on this can be found at http://www.phpied.com/css-animations-off-the-ui-thread/.
For example, some of the CSS spinners at http://projects.lukehaas.me/css-loaders/ are implemented with transforms and will not freeze when the UI thread is busy (e.g., the last spinner on that page).
I've had similar problems in the past. Ultimately they've been fixed by optimizing or doing work in smaller chunks in response to user actions. In your case, different zoom levels would trigger different rendering algorithms, and you would only process what the user can see (plus maybe a buffer margin).
I believe the only simple workaround for you that would be cross-browser is to use setTimeout to give the UI thread a chance to run. Batch up your work into sets of operations and chain them together using several setTimeout calls. This will slow down the total processing time, but the user will at least be given feedback. Obviously this suggestion requires that your processing can be easily sectioned off. If that is the case, you could also consider adding a progress bar for improved UX.
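A minimal sketch of that batching approach (processItem and the items array are hypothetical stand-ins for your own per-record work):

function processInChunks(items, chunkSize, onDone) {
    var index = 0;

    function next() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            processItem(items[index]);   // hypothetical per-item work
        }
        if (index < items.length) {
            // Yield to the UI thread so the spinner keeps animating
            setTimeout(next, 0);
        } else if (onDone) {
            onDone();
        }
    }

    next();
}

// e.g. processInChunks(jsonRecords, 200, hideSpinner);  // names are illustrative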
I have a page which appears to load fully, but I actually have to wait a further 6-10 seconds for things like buttons to become fully functional.
In IE you can still see the browser loading bar at full for this time after the page displays.
Does anyone know why this might be? I stripped out all the javascript and it still does it.
I get that in my pages sometimes when I don't compress my images. Large image files, or other large media, would be the first place I would check.
Something else I normally look for in speeding up the page is the time it takes to load external services (feed parsing, etc.), though since you took out all your JavaScript that shouldn't be a problem.
The problem with this turned out to be the way the page was rendering from the asp:Repeater.
Instead I created a DataGrid and it seemed to eliminate the problem. No idea why, as I'd expect it to be the other way around if anything.