How do I profile / reduce html page rendering time? - javascript

I'm working in chrome to improve page load times.
I'm trying to determine the cause of the delay between when the content is finished downloading and the onload event fires. You can see from the image that the content finishes downloading at about 160ms, but the load event doesn't fire until about 600ms.
My question: how can I identify and break down what is taking 450ms to happen? Is it possible to improve the load time here, or is this just an inevitable part of the rendering/painting process?
UPDATE #1
Solved the problem; the culprit was mainly jQuery, and the page now loads at around the 300ms mark. I decided to defer the loading of jQuery (and every other site script) until after the window.onload event fired. This closed the gap, and now all of the page scripts load after onload happens.
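For reference, a deferred load along those lines might look roughly like this (a sketch; the script path and the init callback are placeholders, not the actual site code):
window.addEventListener('load', function () {
    // Only start fetching jQuery (and the other non-critical scripts) after onload.
    var script = document.createElement('script');
    script.src = '/js/jquery.min.js';   // hypothetical path, not the actual site file
    script.onload = function () {
        // jQuery-dependent initialization can run here.
    };
    document.body.appendChild(script);
});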
Here's the timeline view that shows the script loading:

In the Chrome Dev Tools you have the Timeline tab: press record, refresh the page, and stop the recording, and you'll get all you need.
Also be sure to check the checkboxes you're interested in. Documentation here
I've tested some other pages and they have a similar gap to yours, and I think that's the painting and rendering time.
If you really care that much about those 450 ms, I suggest you read this article about the way Chrome renders the DOM; it's a pretty good one.
My personal opinion, though, is that this sounds like premature optimization. If you don't plan on rendering a few thousand or tens of thousands of elements, you should just let it be, or try to optimize some other parts, preferably the JavaScript.
I still suggest reading the article; it's pretty good.
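Not part of the answer above, but if you want numbers rather than the Timeline view, the Navigation Timing API can break the same gap down programmatically. A rough sketch:
window.addEventListener('load', function () {
    // Read the timings on the next tick so the load-event entries are populated.
    setTimeout(function () {
        var t = performance.timing;
        console.log('response finished: ' + (t.responseEnd - t.navigationStart) + ' ms');
        console.log('DOMContentLoaded:  ' + (t.domContentLoadedEventStart - t.navigationStart) + ' ms');
        console.log('load event fired:  ' + (t.loadEventStart - t.navigationStart) + ' ms');
    }, 0);
});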

Related

How to be notified when all my page elements have been rendered with html5

Is there a way to catch any event that is fired when all parts of a page have rendered?
I'm working in an iOS environment and there is a random delay in the rendering step across iPad 2, iPad 3, and iPad Air 2. The thing is that this delay causes inconsistencies in my code and, unfortunately, visual bugs.
To be very accurate, I'm not talking about loading (I've already tried the onload method and it doesn't work) but rendering: when all the images and/or DOM elements are fully displayed on the screen.
These posts don't help me.
How to detect if a page has fully rendered using jQuery?
Running javascript after page is fully rendered
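One common heuristic, not mentioned in this thread, is to wait for the load event plus two requestAnimationFrame callbacks, since the second callback can only run once the first frame has been painted. A minimal sketch:
window.addEventListener('load', function () {
    requestAnimationFrame(function () {
        requestAnimationFrame(function () {
            // The second callback can only run after the first frame was painted,
            // so by now the loaded page should be on screen.
            console.log('page rendered');
        });
    });
});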

Preventing re-paint in Google Chrome via CSS/JS/etc

Is it possible to cause Google Chrome to prevent painting... as in, to keep the page exactly the same, no animations or content changes.
The reason I ask is because I have created an extension for people who find it difficult to read webpages when things are animating/flashing/changing/etc.
It currently works by taking a screenshot and layering it over the page (position absolute, with a high value z-index).
But because captureVisibleTab cannot capture the whole page (issue 45209), the screenshot needs to be re-created every time the user scrolls the page.
However, the change in iOS 8 Safari to not pause painting while scrolling got me thinking there may be another way around this, by trying to emulate the pre-iOS 8 behaviour (something I preferred, as Reader View does not always work, or stop animated GIFs).
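For reference, the screenshot-overlay approach described in the question presumably looks roughly like this in the extension's background script (a sketch against the Manifest V2-era APIs; the styling and injection details are my assumptions):
chrome.tabs.captureVisibleTab(null, { format: 'png' }, function (dataUrl) {
    // Inject the screenshot into the current tab as an absolutely positioned
    // overlay with a very high z-index.
    chrome.tabs.executeScript({
        code: 'var o = document.createElement("img");' +
              'o.src = ' + JSON.stringify(dataUrl) + ';' +
              'o.style.cssText = "position:absolute;top:0;left:0;width:100%;z-index:2147483647;";' +
              'document.body.appendChild(o);'
    });
});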
You cannot stop the execution thread; it's the browser that decides that.
However, to save CPU cycles, what Chrome does is pause the JavaScript execution thread when the window is blurred. But since you are showing the captured image with a higher z-index, your window will still be active.
One possible way:
Disable scripts for that URL once the page is loaded.
You might miss dynamic content, but as you asked, there would be "no animations or content changes". Any DOM or style manipulation by JavaScript causes a repaint of the page, so disabling it might be one solution. However, I'm not sure how to stop CSS animations.
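The answer doesn't cover it, but one way to freeze CSS animations and transitions, offered here as an assumption rather than something from the thread, is to inject a stylesheet that pauses them everywhere:
var freeze = document.createElement('style');
freeze.textContent = '*, *::before, *::after {' +
    ' animation-play-state: paused !important;' +
    ' transition: none !important; }';
document.head.appendChild(freeze);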
I have also seen extensions that can capture a full webpage image or PDF. You could capture the full page and show that, regardless of whatever is changing in the background.

Finding out when a page is being viewed in EPUB FXL via Javascript

Is it possible to find out when a page of an EPUB Fixed Layout is being viewed with Javascript?
There is the DOMContentLoaded event, but adjacent pages also fire this event when they are preloaded in iBooks, causing animations or sounds to start before the page is visible...
No, it's not.
This is a "feature" of iBooks. It preloads the pages, I suppose to make page turns faster later on. Unfortunately, there is no way to detect that a page is preloading as opposed to being actually viewed.
There's only one way to deal with this: force user interaction on each page (tapping something) to start the animations or sounds. You may even need to structure your JS so that the JS itself is not loaded until the user interaction occurs. Videos won't start playing without user interaction anyway.
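A minimal sketch of that tap-to-start pattern (the element IDs and the startAnimations helper are placeholders, not part of the original answer):
document.getElementById('start-button').addEventListener('click', function handler() {
    this.removeEventListener('click', handler);    // only start once
    startAnimations();                              // hypothetical page-specific helper
    document.getElementById('page-audio').play();   // hypothetical audio element
});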

What's the best way to make external images and JS files not affect page load time?

On a lot of the pages I work with there are a lot of external (non-critical) images and JS files that get called, and they affect the load time. One of these is a tracking pixel for an ad company that can sometimes take a few seconds to load, and you can see it hanging in the browser, so it gives a poor user experience. Is there a way that I can load these and not have them count as part of the initial page load? I've seen similar things that launch a timer and load once the timer fires, but I'm worried that if the user leaves the page too quickly the tracking pixel won't have time to load.
Not really - the point of tracking using a gif is to track users regardless of whether they have javascript or not. Delaying the load of the gif would require javascript, so would defeat the point and potentially mess up your stats.
The best method is to put these 'unnecessary for page load' things at the end of the code, inside the closing body tag.
If you can load the tracking pixel further down on the webpage, preferably as close to the end BODY tag as possible, it will likely process all other content prior to that image first, making the page load appear to occur faster in the event the image isn't loading very fast.
This can be explained (if taken slightly out of context) by Yahoo YSlow's "Best Practices for Speeding up your Website" section on putting scripts at the bottom.

How do you protect web pages against slow ad tracking web beacons?

I work for a large website. Our marketing department asks us to add ever more web ad tracking pixels to our pages. I have no problem with tracking the effectiveness of ad campaigns, but the servers serving those pixels can be unreliable. I'm sure most of you have seen web pages that refuse to finish loading because a pixel from yieldmanager.com won't finish downloading.
If the pixel never finishes downloading, onLoad never fires, and, in our case, the page won't function without that.
We have the additional problem of Gomez. As you may know they have bots all over the world that measure site speed, and it's important for us to look good in their measurements, despite flaws in their methodology. Their bots execute onLoad handlers. So even if I use a script that runs onLoad to add the pixels to the page after everything else finishes, we can still get crappy Gomez scores if the pixel takes 80 seconds to load.
My solution was to add the pixels to the page via an onMouseMove handler, so only humans will trigger them. Do you have any better ideas?
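A rough sketch of that onMouseMove approach (the pixel URL is a placeholder):
var beaconsFired = false;
document.onmousemove = function () {
    if (beaconsFired) return;
    beaconsFired = true;
    // Only real users move the mouse, so measurement bots that just run
    // onLoad handlers never request the pixel.
    var pixel = new Image();
    pixel.src = 'https://tracking.example.com/pixel.gif';  // hypothetical tracker URL; setting src fires the request
};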
jQuery and other JavaScript frameworks can help handle the problem with a method such as the "document ready" function, which fires when the document is ready and doesn't need to wait for all the images.
I'll quote direct from the jQuery tutorial:
The first thing that most Javascript programmers end up doing is adding some code to their program, similar to this:
window.onload = function(){ alert("welcome"); }
Inside of which is the code that you want to run right when the page is loaded. Problematically, however, the Javascript code isn't run until all images are finished downloading (this includes banner ads). The reason for using window.onload in the first place is due to the fact that the HTML 'document' isn't finished loading yet, when you first try to run your code.
To circumvent both problems, jQuery has a simple statement that checks the document and waits until it's ready to be manipulated, known as the ready event:
$(document).ready(function(){
// Your code here
});
You could use this event to load those problematic images.
See this link for further info.
http://docs.jquery.com/Tutorials:How_jQuery_Works
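For example (a sketch; the pixel URL is a placeholder):
$(document).ready(function () {
    // Start the slow tracking pixel only once the DOM is usable.
    $('<img>', { src: 'https://ads.example.com/pixel.gif' }).appendTo('body');
});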
I also work for a large website and here is what we did:
1. Moved the ad calls to the bottom of the page, so the content would show up first (because JS is synchronous).
2. Loaded the ads using Ajax calls (to allow the page to be usable while the ads are loading) into a hidden element.
3. The ad is then moved to its correct position in the DOM and "unhidden." Depending upon the page, we either wait for the ad to finish loading before moving it, or we move it immediately (see the sketch below).
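A rough jQuery sketch of steps 2 and 3 (the URL and selectors are placeholders, not our actual markup):
// Load the ad markup asynchronously into a hidden holding element.
var $holder = $('<div>').hide().appendTo('body');
$holder.load('/ads/slot1.html', function () {
    // Once loaded, move the ad into its real slot and reveal it.
    $holder.children().appendTo('#ad-slot-1').show();
    $holder.remove();
});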
