I have a page (A) that is heavy with JavaScript. When I leave this page to go to page B, it takes a long time; when I go to page B from a different page, it is really fast. So it has something to do with page A and probably its JavaScript.
When I run the network profiler from the developer tools in IE 9, it shows a gap of ~15 seconds between the response and the DOMContentLoaded event.
Page A is heavy with javascript because it runs the Xopus Editor, a rich text XML editor.
Does anybody have any ideas on how I could analyse what happens during that gap, or what I could do to make page A unload faster?
This is a long shot as there are about eleventy-hundred things wrong with it, but it might be somewhere to start. Add this script tag to your page as the very last one:
<script>
function unloadJS() {
    var scripts = document.getElementsByTagName("SCRIPT");
    for (var index = 0; index < scripts.length - 1; index++) {
        var file = scripts[index].getAttribute("src");
        var start = +new Date();
        scripts[index].parentNode.replaceChild(document.createElement('script'), scripts[index]);
        var elapsed = +new Date() - start;
        alert(file + ": " + elapsed.toString());
    }
    return false;
}
</script>
This code attempts to force the unload of each JavaScript file that was loaded on the page, reporting how long it takes to drop each one, in milliseconds. Fire it however is convenient, e.g. on unload or with a button:
<button onclick="return unloadJS()">Go!</button>
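If you want it to run when the user actually leaves the page instead, a rough sketch (not tested in every browser) is to hook beforeunload:

<script>
  // Hypothetical wiring: run unloadJS when the user navigates away
  window.onbeforeunload = function () {
    unloadJS();
    // no return value, so no confirmation dialog is shown
  };
</script>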
This might not work/tell you what you need to know because IE could refuse to do garbage collection when the script is disconnected. This could be because IE really doesn't unload them when you do this, or just because IE - well, what oddi said :)
In any event, this isn't a solution; it doesn't matter when the JS gets unloaded, garbage collection still takes the same amount of time. This is just an attempt at a first diagnosis, like you asked for. Hope it works/helps...
Yes, IE sucks. But there are several tools to profile your page.
Fiddler or HttpWatch is a good tool for analysing your request timeline and seeing whether it takes a long time to download all your heavy JavaScript code. That is often the main cause of a slow, JS-heavy page. Since IE doesn't handle parallel JS downloads very well, hundreds of small JavaScript files cost extra time.
For that case, try minifying your JavaScript. It is the most direct way to improve your page-load performance.
If that doesn't help much, you may need YSlow for a more detailed performance analysis. Although it doesn't target IE, fixing the issues it finds under Chrome or Firefox can also improve performance under IE.
Add some logging in your console and narrow down the scope; maybe you can find the execution performance problem.
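For example, something along these lines (the editor calls are placeholders for whatever initialization your page actually does; IE 9's console may not have console.time, so plain Date arithmetic is used):

var t0 = +new Date();
initXopusEditor();            // hypothetical: whatever setup step you suspect
console.log("editor init: " + (+new Date() - t0) + " ms");

t0 = +new Date();
loadXmlDocument();            // hypothetical: another step to time
console.log("document load: " + (+new Date() - t0) + " ms");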
Maybe you're using a JavaScript library like PrototypeJS that hooks onto the page unload event and, when the page unloads, loops through an array removing all the event listeners for all the DOM elements on the page. If we know what libraries you are using, we could simulate a call to unload the page to force the library to execute its unload function. Afterwards, start a timer to see how long it takes to load another page.
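As a rough, untested sketch of that idea (it assumes the unload listeners were registered with addEventListener, which is not necessarily how every library attaches them):

function timeUnloadHandlers() {
    var evt = document.createEvent("HTMLEvents");   // works in IE9+
    evt.initEvent("unload", false, false);
    var start = +new Date();
    window.dispatchEvent(evt);                      // runs the unload listeners synchronously
    alert("unload handlers took " + (+new Date() - start) + " ms");
}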
Related
I have a Chrome extension that is causing users' laptops to slow to a crawl and really use a lot of CPU (or at least causes their fans to go nuts).
I'm wondering how I can profile this, or see what's going on.
Some theories to help guide:
extension needs (unfortunately) to do some polling. I won't go into why this is the case, but trust me that it does.
What this ends up looking like, however, is:
setTimeout(function() {
    // our inner scrolling loop
    scrollingTimerId = setInterval(function() {
        // did the user change the URL?
        if (!isModalUrl(window.location.href)) {
            clearInterval(scrollingTimerId);
        }
        // scroll!
        doScroll();
    }, SCROLL_DELAY);

    // our loop to check if we're done
    isDoneTimerId = setInterval(function() {
        ...
    }, OTHER_DELAY);
}, TIMEOUT_DELAY);
Perhaps there is some failure to cancel setInterval, or something else that's causing the usage to increase over time?
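One quick thing to rule out is a new interval being started while an old one is still alive. A defensive sketch, reusing the names from the snippet above:

// Sketch: clear any previous timer before starting a new one so repeated
// calls can't stack intervals, and stop scrolling once we've cleared.
function startScrolling() {
    if (scrollingTimerId) {
        clearInterval(scrollingTimerId);
        scrollingTimerId = null;
    }
    scrollingTimerId = setInterval(function () {
        if (!isModalUrl(window.location.href)) {
            clearInterval(scrollingTimerId);
            scrollingTimerId = null;
            return;                     // don't scroll after we've stopped
        }
        doScroll();
    }, SCROLL_DELAY);
}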
extension also sends messages to ALL tabs on certain events. Could this be an issue with multiple Chrome windows open?
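If that broadcast turns out to be expensive, one option is to narrow the query rather than messaging every tab; a sketch using the extensions API (the message payload is made up):

// Sketch: message only the active tab in each window instead of every tab
chrome.tabs.query({ active: true }, function (tabs) {
    tabs.forEach(function (tab) {
        chrome.tabs.sendMessage(tab.id, { type: "my-event" }); // hypothetical payload
    });
});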
Trying to hunt down what performance issues it could be, and also where to look. Perhaps there is a good tool I don't know about in the Chrome dev tools inspector?
I've been doing some testing with http://www.webpagetest.org/ today to see which scripts are slowing down my page loads. Long story short, I've discovered that third-party scripts are causing a noticeable slowdown in loading. I'm loading them all at the bottom of the page, using async and defer ( see https://www.igvita.com/2014/05/20/script-injected-async-scripts-considered-harmful/ ).
I believe the main reason for the slowdown is not just grabbing the files from the third party, but actually running the various scripts, especially side by side with mine.
I'd like to keep all the scripts, but I want them to be loaded behind the scenes, after all my scripts have loaded, and with no noticeable performance decrease in the browser. For example I don't want the browser to "stutter" or jump around if I start scrolling down while the third-party scripts are loading, or various other minor annoyances.
Has anyone tackled this before and come up with a good solution? So far I'm thinking the best option might be to load the third-party scripts using jQuery.getScript(), after all my scripts have finished (literally at the bottom of one of the .js includes). Still, that may load them all concurrently which could make the browser sluggish for a second or two.
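A sketch of that getScript idea, loading the files one after another rather than concurrently (the URLs are just examples):

// Sketch: chain jQuery.getScript so each third-party file starts
// only after the previous one has finished.
var thirdParty = [
    "//connect.facebook.net/en_US/all.js#xfbml=1",
    "//apis.google.com/js/plusone.js"
];
(function loadOne() {
    var url = thirdParty.shift();
    if (url) {
        jQuery.getScript(url, loadOne);
    }
})();

Note that $.getScript disables caching by default, which is one reason the answer below uses $.ajax with cache: true instead.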
Some more details on how I did the testing, for anyone interested:
grabbed the source code of a product page, threw it into a test PHP page so I could modify it at will
surrounded each script with an on/off flag such as
if ( isset( $_REQUEST["allowGoogleAnalytics"] ) ) { /* ...the script tag goes here... */ }
ran a test with all scripts turned off
in new tabs, ran more tests, turning scripts on one at a time
by the time my own scripts were all turned on, the pages were taking about 1.9 seconds to load (first view) and less than a second on repeat view. This is fine with me.
after turning on the third-party scripts, the pages were taking at least 3.1 seconds to load (first view), sometimes as much as 3.9
The third party scripts in question are:
facebook "like" button
google +1 button
pinterest
google trusted stores
None of these are particularly bad on their own, but all at once they combine and take too long, and make the browser too sluggish.
You can queue the scripts if the problem is simultaneous loading. This loading should also be started on document ready (I see you are already using jQuery, so the example uses it).
Example code (tested locally, works).
<script src="http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script>
var scripts2load = [
    "http://apis.google.com/js/plusone.js",
    "http://connect.facebook.net/en_US/all.js#xfbml=1"
];

function loadNext() {
    var url = scripts2load.pop();
    if (url) {
        $.ajax({
            url: url,
            cache: true,
            crossDomain: true,
            dataType: "script",
            success: loadNext
        });
    }
}

$(loadNext);
</script>
In the past I have had some success waiting until the page was fully loaded (which happens after DOM ready). Any script that you load before window.load adds more work on top of the parsing/rendering/resource loading the browser is already doing. Traditionally, we do everything on DOM ready, which can quickly give the browser a lot to deal with. Instead, split off any non-crucial functionality and let the browser deal with it after all the crucial stuff has been handled.
Try taking your non-crucial scripts (e.g. like buttons, anything not crucial to the page) and waiting to load them until window.load. Then apply a fade-in effect or something else to ease in the display. If window.load is too long to wait (i.e. you have a bunch of images on your page), then you can do something like this:
$(function() {
    var timer = setTimeout(loadThirdPartyScripts, 1200),
        $window = $(window);

    $window.on('load.third_party', loadThirdPartyScripts);

    function loadThirdPartyScripts() {
        clearTimeout(timer);
        $window.off('load.third_party');
        /* load your scripts here */
    }
});
This will load all of your non-crucial scripts after the window has loaded or after 1.2 seconds - whichever comes first (adjust the timeout as needed). If you have a lot of images - I suggest lazy loading ones below the fold - the whole point being to get window.load to fire as soon as possible.
Disclaimer: if you wait until window.load to load a dozen or two resources, you are still going to experience the same stutters that you are seeing now.
I have a fairly extensive javascript that I can load in my Chrome (latest stable) and in IE11.
The load icon spins but the script eventually loads on my machine in both browsers.
I have two other people trying to load the page that contains the JavaScript in IE11, and neither of them can get the page to load. The loader icon spins forever, and when they mouse over the refresh icon a flyout states "long running script".
How can I analyze my javascript to identify how and where the script is taking forever to load?
Chrome's Developer Tools (F12) can profile your code. This will give you a lot of information -- possibly a lot of noise -- but it will identify two things for sure: 1) functions where a lot of time is spent, and 2) functions that are called often.
This is the first place I'd start: turn on the profiler and reload the page.
If that doesn't give you a good place to start, look into the Chrome Timeline and console.timeStamp('Some Note'). After you have started recording a timing session, every time the code encounters console.timeStamp it will annotate the timeline, allowing you to estimate the elapsed time between points in your execution. See here: https://developers.google.com/chrome-developer-tools/docs/console#measuring_how_long_something_takes
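For example, something along these lines (the function names are placeholders for whatever your page actually does):

console.timeStamp("start parsing data");
parseData();                      // hypothetical slow step
console.timeStamp("parsing done, start rendering");
renderEverything();               // hypothetical slow step
console.timeStamp("rendering done");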
I'm making an HTML interface to upload images to a server, with drag & drop and multiple file selection. I want to display the pictures before sending them to the server. So I first tried to use FileReader, but I had some problems like in this post, so I changed my approach and decided to use blob: URLs as ebidel recommends in that post, with window.URL.createObjectURL() and window.URL.revokeObjectURL() to release the memory.
But now I've got another problem, which is similar to this one. I want a client to be able to upload 200 images at a time if he wants. But the browser crashed and the RAM used was very high! So I thought that maybe too many images were being displayed at the same time, and I set up a waiting queue of files using an array, in order to process only 10 files at a time. But the problem still occurs.
On Google Chrome, if I check chrome://blob-internals/, the files (which should already have been released by window.URL.revokeObjectURL()) are released after a delay of approximately 8 seconds. On Firefox I'm not sure, but it seems as if the files are not released at all (I check about:memory -> images for that).
Is it my code that is bad, or is it a problem outside my control? Is there a solution to force browsers to release the memory immediately? If it helps, this is the part of the JavaScript on which the problem occurs: (link expired because the code was not included in the question).
EDIT
This is a kind of self-answer, plus an answer to bennlich (too long for a comment).
I understood from user1835582's answer that I could indeed release the Blob/File, but that while the browser needs the images it keeps them somewhere in memory (which is logical). So it's displaying the images (many & heavy) that gave me crashes & slowdowns, not the revokeObjectURL method. Moreover, each browser manages memory in its own way, leading to different behaviours. Here is how I came to this conclusion.
First, let's verify that revokeObjectURL works well, with a simple example using the source code of https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications#Example.3A_Using_object_URLs_to_display_images.
Using Chrome you can verify that Blobs are properly revoked by checking chrome://blob-internals/, or by trying to open a displayed image in a new tab, which will be blank. Note: to fully release the Blob references, add document.getElementById("fileElem").value = "". When I posted my question some years ago it took about 8 seconds to release a blob; now it's almost immediate (probably due to improvements in Chrome and/or a better computer).
Then, time for a load test. I did it with a hundred JPEGs of ~2.5 MB each. After the images had been displayed, I scrolled the page. Chrome crashed and Firefox was slow (not tested in other browsers). However, when I commented out li.appendChild(img), all went well, even with a huge bunch of images. This shows that the problems don't come from the revokeObjectURL method, which in fact works properly, but from displaying lots of heavy images. You can also test it by creating a simple HTML page with hundreds of heavy images and scrolling it => same result (crash / slowdown).
Finally, to look deeper into image memory management, it's interesting to look at about:memory in Firefox. For example, I saw that when the window is active, Firefox uncompresses the images (images -> uncompressed-heap goes up), while raw (images -> raw) stays stable (relative to the quantity of images loaded). There is a good discussion about memory management here: http://jeff.ecchi.ca/blog/2010/09/19/free-my-memory.
With window.URL.revokeObjectURL() you only release the reference to the [Blob] or [File] object. You cannot force it to be removed from memory.
Note: browser implementations are not finalized, and they can leak in these areas. If you implement animation this way, you have to understand that it is at your own risk.
This isn't an answer, but I just want to say that, as far as I can tell, this is still an issue in the latest version of Chrome (35). I made a test page that exposes the problem:
http://ecobyte.com/tmp/chromecrash-1a.html
If you select a large number (say, 600) of high resolution photos on your computer and drop them into the box on that page, it will crash Chrome (tried on Windows 7 and Mac OS X 10.8.5).
If you look at the source you can see that the sequence of ops is:
createObjectURL
load the img (don't add to DOM!)
revokeObjectURL to free the ref
Lose the img ref
Repeat all steps for next dropped image
Seems like only a single image should be in memory/referenced at any given moment, but eventually this crashes Chrome.
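A condensed sketch of that sequence (essentially what the test page does, minus the drag-and-drop plumbing):

// Sketch: create an object URL, load it into an <img> that is never added
// to the DOM, and revoke the URL as soon as the image has loaded.
function loadOne(file, done) {
    var url = URL.createObjectURL(file);
    var img = new Image();
    img.onload = function () {
        URL.revokeObjectURL(url);   // free the reference
        img = null;                 // drop the img reference
        done();                     // move on to the next dropped file
    };
    img.src = url;
}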
I'm trying to load up a large (several megs) document in a textarea.
Ignoring the network load time (which is actually minimal when I reload it, as it's getting a 304), Firebug tells me that it's taking nearly 20 seconds for the DOMContentLoaded and load events to get around to firing.
If I change the textarea to a div, it drops the time to 5 seconds, even though it has to actually render the entire contents!
There are no javascript libraries loaded - unloading them was the first thing I tried. I do have a number of CSS files loaded.
Any ideas about what makes it so slow or, even better, how to speed things up? Load the content a chunk at a time? Kind of ugly but at least it gives the user something to look at rather than a locked browser and potential "this script is taking too long" warnings.
This is Firefox 3.6.15 on Ubuntu.
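For what it's worth, a minimal sketch of the chunked-loading idea from the question (element id, chunk size and variable names are made up, and whether it actually helps depends on how the browser handles each assignment to the textarea's value):

// Sketch: append the document to the textarea in slices, yielding to the
// browser between chunks so the UI isn't locked for the whole insert.
function fillTextarea(text, chunkSize) {
    var ta = document.getElementById("bigTextarea");   // hypothetical id
    var pos = 0;
    (function appendChunk() {
        ta.value += text.substr(pos, chunkSize);
        pos += chunkSize;
        if (pos < text.length) {
            setTimeout(appendChunk, 0);
        }
    })();
}
fillTextarea(bigDocumentString, 50000);                // bigDocumentString: the fetched content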
David, in Firefox 3.6 and earlier textareas with very long content are pretty slow because the editor code reformats the DOM inside the textarea: it creates one textnode and one <br> per line. This is a lot more work than just rendering a single textnode child of a <div>.
You should try Firefox 4, which edits the textnode (or rather a clone of it) directly; I suspect it will be much faster on your page.
As far as speeding this up for users... your only real option with old Firefox versions is not to have so much text in a textarea, unfortunately.