How does $(window).one('load') work, and how does it fail? - javascript

As some background, I am working on building a web slider activity that fits into/extends another piece of software that outputs HTML. The key point is that I have little to no control over how the software outputs its HTML elements, but I can add arbitrary JavaScript and HTML elements on top of what the software outputs.
What I am trying to do is delay the loading of some images that I want to add until after all the software's outputted elements are loaded. So far I have it working by waiting to load the images until after the window load event is fired. Like so:
//-----Simplified example-----
$(document).on('ready', function(){
    // Some stuff to set things up
    $(window).one('load', function(){
        $('#SomeDiv').append(
            '<img class="someClass" ' +
            'id="arbitraryImage1" ' +
            'src="folder/image1.jpg">'
        );
    });
});
Though it is working right now, I'm not sure what will happen if something goes wrong. The HTML outputted by the software has a bad habit of including things it might not need, as well as occasionally not loading images on slower connections. The majority of the time, if it does fail, the user will not notice, so it isn't an issue. However, I would like to know how $(window).one('load', function(){}) behaves when a problem arises, so that I can work around the error or at least make the activity fail gracefully. Any assistance or links to documentation would be appreciated; the generic-ness of 'window' and 'load' brings up a lot of irrelevant information when googling.
To summarize:
What is the behaviour of $(window).one('load', function(){}) when the window fails to load a resource (image, css, etc.)?

Good question. The documentation says:
The load event fires at the end of the document loading process. At
this point, all of the objects in the document are in the DOM, and all
the images, scripts, links and sub-frames have finished loading.
https://developer.mozilla.org/en-US/docs/Web/API/GlobalEventHandlers/onload
But in practice, you'll find that the load event fires after all assets queued to load when the document was originally parsed (stylesheets, external scripts, img tags) have loaded, or failed loading. As long as no requests are still pending, the fact that these assets have loaded successfully or not has no bearing on the load event being fired.
The documentation is in need of a clarification, in my opinion, as it doesn't address your question.
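As an aside, the one() part of $(window).one('load', fn) only controls how many times the handler runs: jQuery detaches the handler after its first invocation. A dependency-free sketch of that behavior (not jQuery's actual implementation):

```javascript
// Minimal sketch of "run at most once" semantics, the behavior that
// jQuery's .one() adds on top of .on().
function once(fn) {
  var called = false;
  return function () {
    if (called) return;        // subsequent calls are ignored
    called = true;
    return fn.apply(this, arguments);
  };
}

// Usage (browser): window.addEventListener('load', once(handler));
```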

If an image fails to load, the img element will simply render without an image. If SomeDiv cannot be found, nothing will be appended anywhere. Generally, if a JavaScript function fails for some reason, nothing bad will happen: the broken functionality won't work, but the load event will still fire, and the console will report the errors.
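If you do need to react to an individual image failing, rather than relying on the window load event, you can listen for the error event on the image itself. A sketch, where the fallback path is a hypothetical placeholder of your choosing:

```javascript
// Swap in a generic image when a given <img> fails to load.
// `fallbackSrc` is a hypothetical placeholder path.
function attachFallback(img, fallbackSrc) {
  img.addEventListener('error', function () {
    if (img.src !== fallbackSrc) {  // guard against a failing fallback looping
      img.src = fallbackSrc;
    }
  });
}

// Usage (browser):
// attachFallback(document.getElementById('arbitraryImage1'), 'folder/generic.jpg');
```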

Related

Loading image is shown too late for users with slow connection

The issue is with an existing ASP.NET MVC 4 project. The view itself is not that big, but there are also several service calls, and what happens is that people with slow internet connections report that for some period of time when they request the page it stays unresponsive, so they don't know if the content is loading or something went wrong.
So in general, what I need is a way to show a loading image as the very first thing from this page (or at least fast enough) so even if it takes some time for the browser to download the full content, at least the user will know that something is going on.
However, it seems that this is not as trivial as it sounds. I came up with two ideas, one of which has already proven not to work in all cases, and the second is also something that many people don't recommend.
What I've tried was to use pure JavaScript in the <head> tag like so:
<html>
<head>
<script>
document.write("<div style='margin-left: 50px; margin-top : 50px;'>LOADING...</div>");
</script>
</head>
Of course the styling is just to get the idea. This seemed to work until recently, when on a minor build of IE 11 the page broke, which after some investigation proved to be due to the use of document.write() inside the <head> tag. Even though this seems to work on most browsers and versions, the fact that it's a potential danger requires a change in the approach.
The second idea is pretty similar to this, again writing directly in the <head> tag, but this time instead of using document.write() just putting the HTML markup directly there. From what I've read this is also a recipe for disaster, so I don't even want to try it, but those are the two ideas that I could come up with. Everything else just doesn't work for slow internet connections, since the mainstream solutions rely on the fact that some part of the DOM is already loaded so you can use it to apply behaviour/style.
I hope I'm not the first one to hit this problem, so I would appreciate any suggestion on how to solve it. Since I am using ASP.NET MVC 4, there was an idea to use a dedicated controller and view which get called first, just for the sake of showing the loading image, before moving to the requested view. Even though this idea seems interesting, I didn't have the time to think it over, and it also sounds pretty complicated; anyway, all help will be appreciated.
When faced with the same issue, we did as you mentioned: Load a super-light-weight page first with a tiny loading icon & "Loading..." text, then called back to the server to pull down the heavy content in chunks using ajax.
If your content is really heavy, it's also worth making sure you have gzip compression turned on at your web server layer.
Don't block the rendering of the DOM. Load the page and call your scripts right before the closing body tag, and attach to an event like DOMContentLoaded (if you're using jQuery, that would be $(document).ready) or window.load. In other words, you should allow the page to load completely, even if it's empty, before running any of your JavaScript.
Then, don't worry about writing the "Loading" div dynamically. Just include it in the rendered HTML and hide it initially. The first thing your JavaScript will do, before issuing any AJAX calls, is to show this "Loading" div.
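The pattern described above can be sketched roughly like this, assuming the "Loading" div is already in the server-rendered HTML (hidden via CSS) and with fetchContent as a hypothetical stand-in for your real AJAX call:

```javascript
// Reveal the pre-rendered "Loading" indicator first, then fetch the
// heavy content and hide the indicator once it arrives.
// `fetchContent(cb)` is a hypothetical stand-in for the real AJAX call.
function startPage(loadingDiv, fetchContent, onContent) {
  loadingDiv.style.display = 'block';   // first thing: show the indicator
  fetchContent(function (html) {
    loadingDiv.style.display = 'none';  // content arrived: hide it
    onContent(html);
  });
}
```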

Does dynamically prefetching a resource using DOM injection work or make sense?

This is largely a theoretical question, for which I do have a practical purpose. I first am looking for some conceptual answers before diving into practice, as perhaps the idea itself does not make sense.
Imagine a slideshow that is entirely javascript-based. Users see a large image on their screen and click to move to the next large image. Upon clicking, the next image is loaded and inserted into the DOM, replacing the previous one.
I recently learned that prefetching directives can help in speeding up the loading of resources that are very likely to be used next. Note that I said resources, not pages. The slideshow is a single page.
In an image slideshow, it is very obvious that it is likely that the next image is needed, thus if image1 is on screen, I could dynamically add this to the DOM:
<link rel="prefetch" href="http://img.mysite.com/img2.jpg">
My questions regarding this idea:
Would it work at all? Do browsers accept this directive when it is dynamically inserted in the DOM at run-time? Would it trigger the prefetch?
Is there a possibility of timing conflicts where, even if prefetching does work, it does not finish before the user triggers the real load? Obviously this can happen, but will it have unwanted side effects?
Will specific events such as image onload still work correctly, or are they never triggered in the case of a successful prefetch (assuming it works at all)?
I did a lot of searching but I am unable to find answers on this specific situation of dynamically injected prefetch hints.
onload always gets triggered, even if the image is coming from cache. You do not have to worry about timing effects or race conditions; any such behavior would be a browser bug.
As mentioned in the comments, rel=prefetch is not the only way of achieving pre-fetching, but it does work even when dynamically inserted into the DOM. After all, you could also just fetch the image with an ordinary img element and hide it.
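Injecting the hint at run-time can be sketched like this; the function only builds and appends the link element, so whether the browser actually honors the hint still depends on its prefetch support:

```javascript
// Append a <link rel="prefetch"> hint for the next slide's image.
// The document object is passed in explicitly; in a page you would
// simply pass the global `document`.
function prefetchNext(doc, url) {
  var link = doc.createElement('link');
  link.rel = 'prefetch';
  link.href = url;
  doc.head.appendChild(link);
  return link;
}

// Usage (browser): prefetchNext(document, 'http://img.mysite.com/img2.jpg');
```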

Perform time consuming tasks before window.onload

I submitted my HTML5 game for a HTML5 game platform for QA review and got the following feedback:
You should not perform time consuming tasks before window.onload but the actual is: DOM is blocked.
I am not sure what this review means. What I am doing now is:
$(window).ready(
....
);
So this is the entry point of all the code. What is the difference between $(window).ready, $(document).ready(), and window.onload? Do they follow a fixed order as to when they are triggered? And what does "DOM is blocked" mean?
dom ready vs window.load
$(document).ready(function(){}); is triggered once the HTML is fully parsed and rendered, but before all assets (images, iframes and so on) are downloaded. $(window).load is executed right after all images are downloaded.
This means: $(document).ready comes always before $(window).load.
DOM is blocked
This is quite untechnical and not really the truth. You showcase that you can manipulate the DOM at "DOM ready", and actually this is the right event for doing so. What they mean is the following: as long as a script is executing, the browser's main thread is blocked, which means the browser cannot render/paint anything during this time and becomes unresponsive.
The idea behind this advice is the following:
If you are executing a script on DOM ready, painting stops and will be delayed after script execution. The user might see some sort of your UI, because at this time the DOM is already rendered and in most cases also the CSS is, but not the images. But if you delay it after window.onload the browser can also render images and iframes, before you block the main thread, which means the user might see a fully rendered site/game sooner, although it isn't technically ready yet.
My 2 cents
Whether this is a good approach or not really depends on a lot of circumstances. If your JS does a lot of UI work, it is bad, because the user won't see the final UI sooner but later. If your JS is important for the page to work and the page has some bigger images, it is quite unwise to delay executing the script. Consider a mobile user: they might already see the UI to start the game, but your click/tap event isn't yet bound, because there is a big image at the end of your page which still needs to load.
If you have a performance problem, fix it; don't just delay it to another point, as that will also block the main thread. But what you can do: if you have a lot of scripts, you can split those tasks into chunks and execute them at the point when they are really needed, not initially. For example: you have a menu, and this menu has a hidden sub menu which initially needs some special DOM manipulation. Don't do this at DOM ready or on window.load; do it right before it opens for the first time.
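The chunking idea mentioned above can be sketched like this; the default chunk size of 50 and the setTimeout-based scheduler are arbitrary assumptions:

```javascript
// Process a large array a slice at a time, yielding between slices so
// the browser can render and handle input instead of being blocked by
// one long-running loop.
function processInChunks(items, worker, chunkSize, schedule) {
  chunkSize = chunkSize || 50;
  schedule = schedule || function (fn) { setTimeout(fn, 0); };
  var i = 0;
  (function next() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) worker(items[i]);
    if (i < items.length) schedule(next);  // yield before the next slice
  })();
}
```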

Improving front-end performance using the onLoad event

I've been trying to improve our front-end webpage performance and trying go use webpagetest for some insights. Our current page load time is 8 secs.
Here is a waterfall view of our site - http://i.imgur.com/D4sPLfs.png
The blue line indicates the load event fired which decides the load time.
And here is a waterfall of another site which has a page load time of 4secs - http://i.imgur.com/NuO1Mao.png
The waterfalls of both the sites look similar apart from one glaring difference - The blue line (load event fired) of the second one is way earlier even though a lot of content is loaded after the onLoad firing event.
Am I right in thinking that if somehow I can make the onLoad firing earlier on my site, I will see improved perceived performance for the user? If yes, how do I go about it?
We're already using lazyload.js for lazy loading images
Most of our third part js files (Google Ads / Analytics) are being called near the bottom of the body element
Thanks a lot!
There is the alternative "domready" event, which is also what jQuery uses. It is triggered once the DOM tree has been built, and does not wait for external resources (mostly images) to load, as the onload event does.
With that loading time, it's pretty likely you are using a JS framework that already supports the domready event, and you should consider using it.
You will likely have to rewrite a good portion of your scripts, mostly if you are expecting image width/height to be correctly set.
If you have access to jQuery, you can use something like this:
$(document).ready(function() {
    loadHandler();
});
If jQuery is not an option, you should probably take a look at this extraction of jQuery's domready function, because there are a lot of quirks to look out for.

How to trace slow JS or JQuery code

I created a web page for viewing images. This page includes some other code that I did not write. The page loads 40 small images on load; then, as the user scrolls down, additional pages of 40 images are loaded via ajax. Once I get to 15-20 pages, I notice the page begins to slow significantly. I check app counters: CPU can go up to 100% and memory can go over 3GB. Then I will inevitably get the dialog saying jQuery is taking too long to execute, asking me if I want to stop executing the script. Now, I realize that a page with up to 800 images is a big load, but the jQuery warning suggests to me that some code may also be iterating over this larger and larger group of DOM objects. It appears to get almost exponentially slower as I pass 15 pages or so; once I get to 20 pages it becomes almost unusable.
First of all, is it even possible to run a page efficiently, even with minimal JS, when you have this many images? Secondly, is there a recommended way to "trace" JS and see what kinds of functions are getting executed, to help determine the most likely culprit? This is most important to me - is there a good way to do this in Firebug?
Thanks :)
EDIT - I found my answer. I had some older code which was being used to replace images that failed to load with a generic image. This code was using jQuery's .each operator and thus was iterating over the entire page, plus each new ajax addition, every time more images were loaded. I am going to set a class for the images that need to be checked in CSS so that the ajax-loaded images are unaffected.
Firebug, and all the other debugging tools let you profile your functions. You can see how long they take to run and how many times they have been called.
http://getfirebug.com/javascript
See: Profile JavaScript performance
Another useful tool to look into is the profile() function
console.log('Starting Profile');
console.profile();
SuspectFunction();
console.profileEnd();
Through the console window in the debugger, you can see the profile results.
The best tool I have used is https://developers.google.com/web-toolkit/speedtracer/ for Chrome
To answer your first question, 15 pages of images should not be a problem for a computer to handle. Google loads up to 46 pages of images without lagging at all, although it does stop you from loading more after that.
To answer your second question, there are many ways to trace JS code. Since you are doing performance-related debugging, I'd go with a timestamped console log:
console.log(" message " + new Date());
I'd put one at the beginning and end of each function whose performance you are interested in measuring, and read through the log to see how long each of those functions takes to execute. You would compare the timestamps to see what excess code is executing and how long it takes.
Finally, in Firebug, go to the console tab and click on Profile before you start scrolling down the page. Then scroll to your 15th page and click Profile again. It breaks down the functions called and the amount of time each took.
I prefer to use the timer function in Firebug or Chrome, it can be called like this:
function someFunction(){ ... }

console.time('someFunction timer');
someFunction();
console.timeEnd('someFunction timer');
This isn't as robust as the profiler functions, but it should give you an idea of how long functions are taking.
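If you want every call to a suspect function timed without editing its body, a small wrapper does the job; Date.now() is coarse, but available everywhere (performance.now() is finer-grained in browsers):

```javascript
// Wrap a function so each call logs its duration under `label`.
function timed(label, fn) {
  return function () {
    var start = Date.now();
    var result = fn.apply(this, arguments);
    console.log(label + ' took ' + (Date.now() - start) + 'ms');
    return result;
  };
}

// Usage: replace a suspect function with a timed version of itself,
// e.g. someFunction = timed('someFunction', someFunction);
```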
Also, if you are running at 100% CPU and 3GB of memory, you almost certainly have a memory leak. You may want to consider removing some of the initial images when more pages are loaded in. For example, after 5 pages are being shown, remove the first page when the user views the 6th page.
I was able to fix the problem by going over my code again. I was loading new images via ajax, but I had an older line of code that was checking all images, i.e. $('img'), to replace any images that failed to load with a generic image. This meant that as I continually loaded new images, the selector had to iterate over the entire growing DOM again and again. I altered that code and now the page is flying! Thanks everyone for the help.
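The fix described above (touching each image only once instead of re-scanning the whole document after every ajax page) can be sketched without jQuery like this, where worker is whatever per-image check you need:

```javascript
// Keep a cursor into the growing list so each item is processed exactly
// once, instead of re-iterating the whole collection after every
// ajax-loaded page.
function makeIncrementalProcessor(worker) {
  var cursor = 0;
  return function processNew(allItems) {
    for (; cursor < allItems.length; cursor++) {
      worker(allItems[cursor]);
    }
  };
}
```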
