How do you protect web pages against slow ad tracking web beacons? - javascript

I work for a large website. Our marketing department asks us to add ever more web ad tracking pixels to our pages. I have no problem with tracking the effectiveness of ad campaigns, but the servers serving those pixels can be unreliable. I'm sure most of you have seen web pages that refuse to finish loading because a pixel from yieldmanager.com won't finish downloading.
If the pixel never finishes downloading, onLoad never fires, and, in our case, the page won't function without that.
We have the additional problem of Gomez. As you may know they have bots all over the world that measure site speed, and it's important for us to look good in their measurements, despite flaws in their methodology. Their bots execute onLoad handlers. So even if I use a script that runs onLoad to add the pixels to the page after everything else finishes, we can still get crappy Gomez scores if the pixel takes 80 seconds to load.
My solution was to add the pixels to the page via an onMouseMove handler, so only humans will trigger them. Do you guys have any better ideas?
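Roughly what I have in mind is a sketch like this (the pixel URL is a placeholder, not our real tracker):
var pixelsFired = false;
document.onmousemove = function () {
    if (pixelsFired) return;
    pixelsFired = true;
    // requesting the image is enough to fire the beacon; placeholder URL
    var img = new Image();
    img.src = 'http://ads.example.com/pixel.gif';
};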

jQuery and other JavaScript frameworks can help with this problem via a method such as the "document ready" function, which fires when the document is ready and doesn't need to wait for all the images.
I'll quote direct from the jQuery tutorial:
The first thing that most Javascript programmers end up doing is adding some code to their program, similar to this:
window.onload = function(){ alert("welcome"); }
Inside of which is the code that you want to run right when the page is loaded. Problematically, however, the Javascript code isn't run until all images are finished downloading (this includes banner ads). The reason for using window.onload in the first place is due to the fact that the HTML 'document' isn't finished loading yet, when you first try to run your code.
To circumvent both problems, jQuery has a simple statement that checks the document and waits until it's ready to be manipulated, known as the ready event:
$(document).ready(function(){
// Your code here
});
You could use this event to load those problematic images.
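For example, a minimal sketch of injecting a tracking pixel from the ready handler (the URL is a placeholder):
$(document).ready(function(){
    // the image starts loading as soon as src is set; placeholder URL
    $('<img/>').attr('src', 'http://ads.example.com/pixel.gif').appendTo('body');
});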
See this link for further info.
http://docs.jquery.com/Tutorials:How_jQuery_Works

I also work for a large website and here is what we did:
1. Moved the ad calls to the bottom of the page, so the content would show up first (because JS is synchronous).
2. Loaded the ads using Ajax calls (to allow the page to be usable while the ads are loading) into a hidden element.
3. The element is moved to the correct place in the DOM and "unhidden."
4. Depending upon the page, we either wait for the ad to finish loading before moving it, or we move it immediately.
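A rough sketch of that flow, assuming jQuery; #ad-staging, #ad-slot, and /ad-call are illustrative names, not our real ones:
$(document).ready(function () {
    // 1. fetch the ad markup into a hidden staging element via Ajax
    $('#ad-staging').load('/ad-call', function () {
        // 2. move it to the correct place in the DOM and unhide it
        $('#ad-slot').append($('#ad-staging').children()).show();
    });
});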

Related

How do I profile / reduce html page rendering time?

I'm working in Chrome to improve page load times.
I'm trying to determine the cause of the delay between when the content finishes downloading and when the onload event fires. You can see from the image that the content finishes downloading at about 160ms, but the load event doesn't fire until about 600ms.
My question: how can I identify and break down what is taking 450ms to happen? Is it possible to improve the load time here, or is this just an inevitable part of the rendering/painting process?
UPDATE #1
Solved the problem; the culprit was mainly jQuery. The page is now loading at around the 300ms mark. I decided to defer the loading of jQuery (and every other site script) until after the window.onload event fired. This closed the gap, and now all of the page scripts load after onload happens.
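The deferral itself was nothing fancy; roughly this pattern (the script path is a placeholder):
window.onload = function () {
    // inject jQuery (and other site scripts) only after onload has fired
    var s = document.createElement('script');
    s.src = '/js/jquery.min.js'; // placeholder path
    document.body.appendChild(s);
};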
Here's the timeline view that shows the script loading:
In the Chrome Dev Tools you have the Timeline tab: press record, refresh the page, and stop the recording, and you'll get all you need.
Also be sure to check the checkboxes you're interested in. Documentation here
I've tested some other pages and they have a similar gap to yours, and I think that's the painting and rendering time.
If you really care that much about those 450 ms, I suggest you read this article about the way Chrome renders the DOM; it's a pretty good one.
My personal opinion, though, is that this sounds like premature optimization. If you don't plan on rendering a few thousand or tens of thousands of elements, you should just let it be, or try to optimize some other parts, preferably the JavaScript.
I still suggest reading the article; it's pretty good.

Details about window.stop()

I just came across the window.stop() method:
The stop() method is exactly equivalent to clicking the stop button in the browser. Because of the order in which scripts are loaded, the stop() method cannot stop the document in which it is contained from loading, but it will stop the loading of large images, new windows, and other objects whose loading is deferred.
https://developer.mozilla.org/en-US/docs/Web/API/Window.stop
I am wondering:
What do 'windows' and 'other objects whose loading is deferred' refer to? Can somebody give me a practical example?
If I append an image to the body and call window.stop() when 30% has been loaded, and then append the same image at a later time, will the latter image only have 70% left to load?
What do 'windows' and 'other objects whose loading is deferred' refer to? Can somebody give me a practical example?
New windows could be iframes or regular frames. Scripts themselves can be marked defer, which loads them after the document and after other inline scripts, so these could also be stopped with window.stop(). So practical examples of objects that might have their loading stopped are: iframes, embedded objects (like Adobe Flash), scripts marked defer, and images. The idea here is that if you're going to call window.stop() from a script in the page, it can only apply to resources that have not yet completed loading. By definition, at least part of the document and your script have already completed loading (or the script wouldn't be running to call window.stop()), so it is primarily going to apply to resources that are loaded later in the load process, such as the ones above.
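A contrived example of the timing, with a placeholder image URL:
// start loading a large image, then abort everything still in flight
var img = document.createElement('img');
img.src = 'http://example.com/huge-photo.jpg'; // placeholder URL
document.body.appendChild(img);
setTimeout(function () {
    window.stop(); // stops the image, plus any deferred scripts, iframes, etc.
}, 100);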
If I append an image to the body and call window.stop() when 30% has been loaded, and then append the same image at a later time, will the latter image only have 70% left to load?
The browser cache is very unlikely to store 30% of an image and then resume loading only the remaining 70% the next time you ask for that resource. Most caches just refuse to cache an object if it is not completely loaded. In any case, this would be entirely browser-dependent, as this type of cache behavior is outside the standards.
What do 'windows' and 'other objects whose loading is deferred' refer to? Can somebody give me a practical example?
I use this in practice to keep the window calm when reloading the page, as in this example:
window.stop();
// do some stuff here...
window.location.reload(true);
The window will automatically become active again by itself when the page is reloaded.
So it stops only the browser's loading, not the script.

Do <iframes> and images loaded via AJAX delay window.onload?

I'm trying to make the window.onload event fire sooner so that Google will think my page loads faster (this is a frustrating task, since how long it takes to get to window.onload is basically irrelevant from the user's perspective, but I digress).
However, I don't know what delays the onload event! Specifically:
If I load a Facebook likebox on my page in an <iframe>, does its loading delay the onload event? What about if the likebox iframe has to load a bunch of profile pics; does onload wait until they fully load?
Suppose that on document ready I do an async AJAX request for an HTML blob and inject it into the page. If this HTML blob contains a bunch of <img> tags, does the onload event wait for all of these to load?
In general, how does the browser know when to fire the onload event? What sorts of things block onload, and what sorts of things don't?
a) You can't control window.onload except by reducing the page "weight". It's up to the browser to decide when it's going to declare that event.
b) Google doesn't have a clue about the window.onload event, because it's not parsing JavaScript.
1) You can completely eliminate the Facebook payload by using the XFBML version of the Like button and asynchronous loading of the Facebook JavaScript SDK (http://developers.facebook.com/docs/reference/javascript/FB.init/). Do note that it will work only if JavaScript is enabled.
2) Everything that is going to dramatically increase the weight of your web page should be loaded asynchronously, preferably after the window.onload event has fired.
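The async SDK load looks roughly like this (the app ID is a placeholder, and it assumes a div with id "fb-root" exists in the page):
window.fbAsyncInit = function () {
    FB.init({ appId: 'YOUR_APP_ID', xfbml: true }); // placeholder app ID
};
(function () {
    // load the SDK asynchronously so it can't block the rest of the page
    var e = document.createElement('script');
    e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
    e.async = true;
    document.getElementById('fb-root').appendChild(e);
}());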
If you look at the waterfall in Firebug or the Chrome inspector, iframe and Ajax calls do affect the onload event. I ran into a similar problem with Facebook considerably slowing down a site. Yes, looking at page load time in Webmaster Tools, it shows the lag.
My solution was to dynamically append the Facebook iframe once the page had completely loaded; and for Ajax calls, I only trigger them on load.
This brought my page load time from 7 seconds with the embedded Facebook iframe down to 2.6 seconds with dynamic appending.
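Roughly what the dynamic append looked like (the plugin URL, dimensions, and container ID are placeholders):
window.onload = function () {
    var iframe = document.createElement('iframe');
    // placeholder plugin URL; the real one carries your page URL in the query string
    iframe.src = 'http://www.facebook.com/plugins/likebox.php';
    iframe.setAttribute('frameborder', '0');
    iframe.width = 292;
    iframe.height = 258;
    document.getElementById('fb-container').appendChild(iframe);
};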

Rendering ads in an iFrame then moving them to the main window

I have a set of ads that are written out by document.write because that is the only thing that the adserver will do.
I have seen other sites reload ads on the page if the user sits there for a while (something I may want to do in the future). So I was playing with loading the ads in an iFrame, then moving them out into the main window afterwards. This seemed to work quite well, until it served up a Google ad, which is itself in an iframe within the iframe. Is it possible to pull them out properly / move Google ads around the page at all?
This is what I have currently, and it works for everything but iframed ads within the iframe.
$(document).ready(function(){
    $('#iframe').load(function(){
        // pull the rendered ad markup out of the iframe...
        var middle_ad_contents = $('#iframe').contents().find('#middle_ad').html();
        // ...and drop it into the ad slot in the main window
        $('#ad_middle').html(middle_ad_contents);
    });
});
[edit]
Upon further investigation... it looks like reloading Google ads may be against the terms of service; perhaps I shouldn't do this?
[edit 2]
Reloading the whole page is not really an option (and kind of a dick move).
The point was perhaps to rotate the ads, but more to stop them from blocking the page load because ad server X, which is being served through ad server Y, which is being served through ad server Z, is slow or not responding. The iframe seemed like the best solution, because then I can delay the document.writes, which are two or three levels deep, until the end of the page without them wiping out the whole page, since document.write after page load === document.replaceTheWholeDOM. There is also perhaps the option of monetizing ajaxy/other iframed (shudder) content with this method.
The best way to go about this is probably using a document.write replacement. There are several to choose from, but here's one: https://github.com/eligrey/async-document-write
This will replace the global document.write function with one that can be used even after the page has loaded.
Not only may it be against the terms of service, but it also lowers the value of the ads to the advertiser and creates a clunky element in the UI.
Think about it from a UI standpoint: you're on the site, concentrating on something, then everything flashes. Your attention goes from what you were concentrating on to figuring out what just happened. Never mind, just a banner flip. Next... now, where was I?
For the advertiser, what if you notice the ad and are about to click on it and BOOM, it changes? Now what, can you go back? If not, you just lost revenue. Users spend seconds on many pages, so unless you've got an incredibly "sticky" website, how much exposure is the advertiser really going to get? Remember, Google rewards AdWords sites for clickthroughs, not based on volume shown, which can actually hurt your CTR.
If you're determined to make this happen, I think I would attack it by having the ad server post directly into the DOM as intended, then use a JavaScript-based timer to asynchronously ping the ad server and tell it to redraw the desired div. I would avoid iframes like the plague, because they're just not friendly in this age of simple DOM manipulation.
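Something along these lines, if you go that route (the endpoint, slot ID, and interval are all made up for illustration):
// every 60 seconds, ask the ad server for fresh markup and redraw the slot
setInterval(function () {
    $.get('/adserver/refresh?slot=middle', function (html) { // placeholder endpoint
        $('#middle_ad').html(html);
    });
}, 60000);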
Or, you could just take the MSNBC approach and reload the entire page every X minutes. It's a horrible UI pattern, but it would achieve your goal and likely bend (but not break) TOS.

What's the best way to make external images and JS files to not affect page load time?

On a lot of the pages I work with there are a lot of external (non-critical) images and JS files that get called that affect the load time. One of these is a tracking pixel for an ad company that sometimes takes a few seconds to load, and you can see it hanging in the browser, so it gives a poor user experience. Is there a way I can load these so they don't count as part of the initial page load? I've seen similar things that launch a timer and load once the timer fires, but I'm worried that if the user leaves the page too quickly the tracking pixel won't have time to load.
Not really - the point of tracking using a gif is to track users regardless of whether they have javascript or not. Delaying the load of the gif would require javascript, so would defeat the point and potentially mess up your stats.
The best method is to put these 'unnecessary for page load' things at the end of the code, inside the closing body tag.
If you can load the tracking pixel further down on the webpage, preferably as close to the end BODY tag as possible, it will likely process all other content prior to that image first, making the page load appear to occur faster in the event the image isn't loading very fast.
This can be explained (if taken slightly out of context) by the "put scripts at the bottom" section of Yahoo YSlow's "Best Practices for Speeding up your Website".
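In practice that just means the pixel is the last thing before the closing BODY tag, something like this (placeholder URL):
<body>
    ... the rest of the page ...
    <!-- non-critical tracking pixel, parsed after everything else -->
    <img src="http://ads.example.com/pixel.gif" width="1" height="1" alt="">
</body>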
