Evaluating Performance Effects of User Analytics - javascript

I am planning to include a user analytics tool - most likely Clicky - on my site.
What worries me is the performance effect it might have on the whole site. It is loaded just before the body tag ends:
<noscript><p><img alt="Clicky" width="1" height="1" src="//in.getclicky.com/100874070ns.gif" /></p></noscript>
<script type="text/javascript" src="clicky.js"></script>
</body>
</html>
I have already evaluated the JS loading time of the solution; the script seems to take about 2 seconds to load. As I understand it, this shouldn't harm actual page usage, but am I right?
1) Does the js start to log events only after it's loaded (let's say 2 seconds)?
2) Are there any other ways of estimating the script load time than the network section's load time in the Chrome inspector (or a similar tool)?
Thanks.

Yes - it can't log anything unless it attaches an event handler, and for that it needs to be loaded.
There are many online tools to profile web performance. In the (far) past I've used WebPagetest, which helped me a lot, but I don't really know what the best tool is nowadays.
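As another option, modern browsers expose the Resource Timing API, so the page can measure the script's load time itself. A rough sketch (matching the script by the 'clicky' substring is an assumption; adjust to your URL):
window.addEventListener('load', function () {
    // Each resource entry carries its own load duration in milliseconds.
    performance.getEntriesByType('resource').forEach(function (entry) {
        if (entry.name.indexOf('clicky') !== -1) {
            console.log(entry.name + ' took ' + Math.round(entry.duration) + ' ms');
        }
    });
});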
Suggestion - Try to use async or defer attributes on the <script> tag, and see if it makes a difference.
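For instance, a minimal sketch reusing the clicky.js tag from the question (pick one attribute, not both):
<!-- async: fetch in parallel, execute as soon as the script arrives -->
<script async type="text/javascript" src="clicky.js"></script>
<!-- defer: fetch in parallel, execute only after the document is parsed -->
<script defer type="text/javascript" src="clicky.js"></script>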

Related

How to redirect *after* dropping a facebook tracking pixel (or similar)

Trying to figure out how to redirect to a new page after dropping a Facebook retargeting pixel.
We already have everything else setup but want to add this functionality (the same as https://meteor.link/)
We currently use window.location.replace('https://thelinkgoeshere.com'); but I think the Facebook pixel loads asynchronously, so we can't just add it above the redirect.
Thanks for any help you can provide!
The current answers do not quite cut it!
It is true that the redirect will wait until fbevents.js has loaded. However, it does not wait for the fbq events to be sent, which is what you would want out of this solution.
Here are two examples of how Chrome handles your proposal, on an extremely slow connection and on 3G:
In the first example Facebook does not even have time to initiate the tracking request before redirecting. In the second example it does manage to initiate it, but Chrome cancels the tracking because of the pending redirect.
It would be easy if Facebook supplied a callback like Google Analytics does, but at the time of writing it does not offer callback functionality.
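For comparison, this is roughly what the analytics.js callback pattern looks like - the equivalent of what fbq lacks (a sketch only; the destination URL is the one from the question):
// Google Analytics redirects only after the hit has actually been sent:
ga('send', 'pageview', {
    hitCallback: function () {
        window.location.replace('https://thelinkgoeshere.com');
    }
});
// fbq('track', ...) accepts no such callback, hence the hack below.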
There are plenty of solutions on the web that simply use a fixed setTimeout to redirect after, say, a second. But this solution poses two kinds of problems:
It does not work when the connection is way slower than expected. In that case the redirect will still happen before the track, so you will lose data!
It forces users on fast connections to wait for your arbitrary timer to pass before they are redirected!
What you can do to achieve the desired effect is a complete hack, and I would love it if Facebook would finally implement the callback, so use this with care:
Note: I used two libraries for this (jQuery and imagesLoaded). You could probably do it without them, but I chose the easy route and do not mind the overhead that much.
<html>
<head>
    <!-- Header stuff -->
    <script>
        // Regular Facebook Pixel Code (does not need to be modified with async!)
    </script>
    <script src="https://code.jquery.com/jquery-3.3.1.min.js" integrity="sha256-FgpCb/KJQlLNfOu91ta32o/NMZxltwRo8QtmkMRdAu8=" crossorigin="anonymous"></script>
    <script src="https://unpkg.com/imagesloaded@4/imagesloaded.pkgd.min.js"></script>
</head>
<body>
    <!-- You should provide a fallback button here in case the automatic redirect fails! -->
    <img id="pixelLoad" src="https://www.facebook.com/tr/">
    <script>
        $(function () {
            // Redirect only once the tracking pixel has actually loaded.
            $("#pixelLoad").imagesLoaded().done(function (inst) {
                setTimeout(function () { window.location.href = 'someURL'; }, 100);
            });
        });
    </script>
</body>
</html>
This loads the exact tracking image from Facebook that is used in their fbq method, thus allowing you to compensate for a slow connection to the Facebook server. Sadly you have to use the image and cannot use a simple AJAX request, as Facebook's origin policy does not allow it.
Here is an example of how Chrome handles the redirect and the Facebook Pixel requests after the change:
It seems that this allows the browser to correctly queue the redirect, so it happens after the pixel tracking events. I have not tried this with legacy browsers, so I will repeat myself: exercise caution when using this code!
EDIT:
Edge and Firefox show similar behavior and this solution does seem to work with them as well.
If you look closely at the Facebook pixel code, you'll notice that somewhere in that code, the dynamic script tag being inserted by the pixel has its async property set to true, like so:
t.async=!0;
!0 evaluates to true.
To change this behavior, find where this property is being set in the pixel, and change it to false:
t.async=false;
Normally this would not be recommended as turning off async will block further execution until the script is loaded and executed, but in your case that's exactly what you need.
Once you set async to false, it is guaranteed that the FB pixel JS code will load and run before the next script that follows the pixel:
<script>
// FB pixel here with async false
</script>
<script>
window.location.replace('https://thelinkgoeshere.com');
</script>
You can modify the Facebook pixel script you put on the page:
!function(f,b,e,v,n,t,s)
{if(f.fbq)return;n=f.fbq=function(){n.callMethod?
n.callMethod.apply(n,arguments):n.queue.push(arguments)};
if(!f._fbq)f._fbq=n;n.push=n;n.loaded=!0;n.version='2.0';
n.queue=[];t=b.createElement(e);t.async=!0;
t.src=v;s=b.getElementsByTagName(e)[0];
s.parentNode.insertBefore(t,s)}(window, document,'script',
'https://connect.facebook.net/en_US/fbevents.js');
fbq('init', '111......1');
fbq('track', 'PageView');
//add your redirect here
window.location.replace('https://thelinkgoeshere.com');
As mentioned before, the JS code won't work because the event is not necessarily tracked before the page redirects. The best you can do, if you don't want to add a flaky timeout, is to rely on the fallback pixel img tag.
Here's my code:
<html>
<body>
Please wait...
<br>
If you don't get redirected, click here.
<script>
    var img = document.createElement('img');
    var redirectFunction = function () {
        window.location = "https://....";
    };
    // Redirect once the pixel request has finished (or failed).
    img.onload = redirectFunction;
    img.onerror = redirectFunction;
    document.body.appendChild(img);
    img.src = "https://www.facebook.com/tr?id=PIXEL_ID&ev=PageView&noscript=1";
</script>
</body>
</html>
Pro-tip: you might want to add <meta property="og... tags on this page so that the title and preview image show up when shared on Facebook.
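For example (a minimal sketch with placeholder values):
<meta property="og:title" content="Your page title for the share preview">
<meta property="og:image" content="https://example.com/preview.jpg">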
Have you considered using Google Tag Manager?
It gives you better control of how your scripts load. Additionally, you can create a script in Tag Manager to redirect once the FB one has fired.
This is exactly what I followed to set up mine:
https://youtu.be/DZm1Wz3zgzU

Loading gif is not animating?

On a client's webpage, the page uses js/css to style a tree-style view to the look the client wants. But on some of the pages with many objects it takes some time for the styles to load, so I suggested adding a loading screen.
Here is an example of the issue: http://zamozuan.com/content/16-search-auto-parts-by-vehicle-chevrolet
My client purchased a little animated icon they like that fits the theme of their site nicely.
I have a very simple addition to the script to add a loading icon, and then on $(document).ready, hiding the load element and showing the main element.
All works fine except the loading gif is not animating until AFTER the page has completely loaded.
It seems that the js loop is too intensive, so the gif does not animate.
Before anyone chops my head off: I did view the similar questions on here, but those solutions do not work for me. The majority of those issues are in IE, while mine is in Chrome (not IE), and most of the other options are workarounds based on clicking buttons to enable/disable; in my case this happens when the page initially loads, so that is not viable.
I am wondering, is there any workaround to fix this? Is it possible to pre-load the gif into an animated state and somehow prevent the JS from interfering? Or is there a way to make the js loop less intensive so that it does not completely freeze the browser?
As for the code, it is exactly the same as the link above, with only this added at the end:
$('#loading').hide();
$('.mainsearch').first().css('display', 'block');
The loading element just contains:
<img src="img/loading.gif" class="img-responsive">
<p class="centertext">Loading, please wait...</p>
But as mentioned, the gif does not animate; it just freezes.
For the javascript that is styling the tree, you can view it here:
http://zamozuan.com/js/tree.js
Any help would be greatly appreciated, as my client would not want to waste this loading icon they bought.
Thanks.
No, there is no workaround. It's not a problem with how the image is loaded; it simply won't be animated while the script is running.
JavaScript is single threaded. As long as a script is running, there are no visual updates in the browser. This includes GIF animations: while the script is running, animations won't move.
Without being able to work on the actual code and try it, I would suggest breaking up your looping into separate callbacks, maybe using setTimeout or requestAnimationFrame every few recursions to give the browser a chance to update/paint.
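A rough sketch of that chunking (untested; styleNode is a hypothetical stand-in for whatever per-node work tree.js does):
function styleTreeInChunks(nodes, chunkSize) {
    var i = 0;
    (function doChunk() {
        var end = Math.min(i + chunkSize, nodes.length);
        for (; i < end; i++) {
            styleNode(nodes[i]); // hypothetical: your existing per-node styling
        }
        if (i < nodes.length) {
            setTimeout(doChunk, 0); // yield so the browser can repaint the GIF
        } else {
            $('#loading').hide(); // done: swap the loader for the content
            $('.mainsearch').first().css('display', 'block');
        }
    })();
}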

Handling third-party javascripts quickly, so pages don't appear slow

I've been doing some testing with http://www.webpagetest.org/ today to see which scripts are slowing down my page loads. Long story short, I've discovered that third-party scripts are causing a noticeable slowdown in loading. I'm loading them all at the bottom of the page, using async and defer ( see https://www.igvita.com/2014/05/20/script-injected-async-scripts-considered-harmful/ ).
I believe the main reason for the slowdown is not just in grabbing the files from the third-party, but in actually running the various scripts, especially side-by-side with mine.
I'd like to keep all the scripts, but I want them to be loaded behind the scenes, after all my scripts have loaded, and with no noticeable performance decrease in the browser. For example I don't want the browser to "stutter" or jump around if I start scrolling down while the third-party scripts are loading, or various other minor annoyances.
Has anyone tackled this before and come up with a good solution? So far I'm thinking the best option might be to load the third-party scripts using jQuery.getScript(), after all my scripts have finished (literally at the bottom of one of the .js includes). Still, that may load them all concurrently which could make the browser sluggish for a second or two.
Some more details on how I did the testing, for anyone interested:
grabbed the source code of a product page, threw it into a test PHP page so I could modify it at will
surrounded each script with an on/off flag such as
if ( isset( $_REQUEST["allowGoogleAnalytics"] ) ) {
ran a test with all scripts turned off
in new tabs, ran more tests, turning scripts on one at a time
by the time my own scripts were all turned on, the pages were taking about 1.9 seconds to load (first view) and less than a second on repeat view. This is fine with me.
after turning on the third-party scripts, the pages were taking at least 3.1 seconds to load (first load) sometimes as much as 3.9
The third party scripts in question are:
facebook "like" button
google +1 button
pinterest
google trusted stores
None of these are particularly bad on their own, but all at once they combine and take too long, and make the browser too sluggish.
You can queue the scripts, if the problem is the simultaneous load. Also, this loading should be started on document ready (I see you're already using jQuery, so I use it in the example).
Example code (tested locally, works).
<script src="http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script>
var scripts2load = [
    "http://apis.google.com/js/plusone.js",
    "http://connect.facebook.net/en_US/all.js#xfbml=1"
];
// Load one script at a time: each successful load triggers the next.
function loadNext() {
    var url = scripts2load.pop();
    if (url) {
        $.ajax({
            url: url,
            cache: true,
            crossDomain: true,
            dataType: "script",
            success: loadNext
        });
    }
}
// Kick off the queue on document ready.
$(loadNext);
</script>
In the past I have had some success waiting until the page was fully loaded (which happens after DOM ready). Any scripts that you load before window.load cause the browser to do more work on top of the parsing/rendering/resource loading it's already doing. Traditionally, we do everything on DOM ready, which can quickly give the browser a lot to deal with. Instead, split off any of your non-crucial functionality and let the browser deal with it after all the crucial stuff has been handled.
Try taking your non-crucial scripts (e.g. like buttons, anything not crucial to the page) and wait to load them until window.load. Then apply a fade-in effect or something else to ease in the display. If window.load is too long to wait for (i.e. you have a bunch of images on your page), then you can do something like this:
$(function() {
    var timer = setTimeout(loadThirdPartyScripts, 1200),
        $window = $(window);
    $window.on('load.third_party', loadThirdPartyScripts);
    function loadThirdPartyScripts() {
        clearTimeout(timer);
        $window.off('load.third_party');
        /* load your scripts here */
    }
});
This will load all of your non-crucial scripts after the window has loaded or after 1.2 seconds - whichever comes first (adjust the timeout as needed). If you have a lot of images - I suggest lazy loading ones below the fold - the whole point being to get window.load to fire as soon as possible.
Disclaimer: if you wait until window.load to load a dozen or two resources, you are still going to experience the same stutters that you are now.

How to increase page loading speed with AddThis widget?

<script type="text/javascript"
src="http://s7.addthis.com/js/250/addthis_widget.js"></script>
I am using this code for Facebook, Twitter etc., but there is a script in it which makes the page loading speed extremely slow. Can you please help with a solution for this? The entire code is below:
<!-- AddThis Button BEGIN -->
<div class="addthis_toolbox addthis_default_style ">
<a class="addthis_button_preferred_1"></a>
<a class="addthis_button_preferred_2"></a>
<a class="addthis_button_preferred_3"></a>
<a class="addthis_button_preferred_4"></a>
<a class="addthis_button_compact"></a>
<a class="addthis_counter addthis_bubble_style"></a>
</div>
<script type="text/javascript">
var addthis_config = {
"data_track_addressbar": true
};
</script>
<script type="text/javascript"
src="//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-4dfeea6f5bf22ac6">
</script>
<!-- AddThis Button END -->
Besides moving everything to the bottom of the page, as Mudshark already said, you can also use the async AddThis version:
http://support.addthis.com/customer/portal/articles/381221-optimizing-addthis-performance#.USyDXiVuPYo
<script type="text/javascript" src="http://s7.addthis.com/js/250/addthis_widget.js#async=1"></script>
function initAddThis() {
    addthis.init();
}
// After the DOM has loaded...
document.addEventListener('DOMContentLoaded', initAddThis);
One solution would be to use a deferred JavaScript loading pattern for the AddThis library.
There are several nice JavaScript libraries that help with that problem. I personally use mostly Modernizr.load (or yepnope.js by itself).
You can read more on that issue and the improvement in the Improve Your Page Performance With Lazy Loading article.
As a side note, I was able to improve page load by about 35% on average in past projects by using the deferred JavaScript loading pattern. I hope that helps.
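A rough sketch of that pattern with yepnope.js (assuming yepnope itself is already on the page; the URL is the async AddThis variant from the answer above):
yepnope({
    load: '//s7.addthis.com/js/300/addthis_widget.js#async=1',
    complete: function () {
        // Initialize AddThis once its script has finished loading.
        if (window.addthis) { addthis.init(); }
    }
});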
One obvious thing to do would be to move the javascript to the bottom of your page, right before </body> so that everything else can load before it.
Put the async="async" attribute on your script tag:
<script type="text/javascript"
src="//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-4dfeea6f5bf22ac6" async="async">
</script>
There are a few things to note:
you really don't need to load AddThis immediately; you can load it relatively late in the page rendering process,
the addthis .js file is huge, currently around 118kb minimized and gzipped (sic!),
due to its size it will always take the browser a relatively long time to compile and process it, especially on mobile devices.
Using the async attribute on the script tag might help; however, browsers mostly consider network resources when they see the attribute. They don't take into account what impact the script might have on CPU usage, the page rendering tree etc. (in browsers' defence, there's no way for them to determine it). For example, scripts that take a long time to execute might block rendering of the first frame or other crucial early paints. Even if we ignore the network resources (connection slot, bandwidth etc.) required to fetch the addthis .js file, it may still turn out that the script has a severe impact on the page loading process.
Note that while the async attribute hints to the browser that the resource can be loaded asynchronously, it says nothing about the script's execution once it is finally retrieved. JS in browsers is mostly single threaded, and once the browser starts to process the .js file it can't back out; it has to let it finish running.
On my computer, evaluating the script in Chrome takes ~130-140ms and it blocks ParseHTML event for that long. On less powerful mobile devices it may easily jump to 500ms.
Because addthis is so big, it would be best to give browsers a little help and defer the .js file fetch until other, more important components of the page are displayed. You should use a dedicated .js deferring library for this task to make sure that it is processed after the DOMContentLoaded event and after other important resources are processed. I personally use LABjs for this, as it's small and does its job well.
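For illustration, a sketch deferring the fetch with LABjs until the window load event (assuming LABjs itself is already on the page):
window.addEventListener('load', function () {
    // Fetch and execute AddThis only after the page's own work is done.
    $LAB.script('//s7.addthis.com/js/300/addthis_widget.js#pubid=ra-4dfeea6f5bf22ac6');
});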
Note also that there is a defer attribute you can add to the script tag; however, the specification clearly states that a script with defer present has to be processed before the DOMContentLoaded event - so no wins there.

Heavy Javascript page gap of 15 seconds between response and page load

I have a page (A) which is a heavy JavaScript page. When I leave this page to go to page B it takes a long time; when I go to page B from a different page it is really fast. So it has something to do with page A, probably its JavaScript.
When I run the network profiler from the developer tools in IE 9 it shows a gap of ~15 seconds between the response and the DomContentLoaded(event).
Page A is heavy with javascript because it runs the Xopus Editor, a rich text XML editor.
Does anybody have any ideas on what I could do to either analyse what happens during the gap, or make page A unload faster?
This is a long shot as there are about eleventy-hundred things wrong with it, but it might be somewhere to start. Add this script tag to your page as the very last one:
<script>
function unloadJS() {
    var scripts = document.getElementsByTagName("SCRIPT");
    // length - 1 skips the last script tag on the page: this one.
    for (var index = 0; index < scripts.length - 1; index++) {
        var file = scripts[index].getAttribute("src");
        var start = +new Date();
        // Swap the loaded script for an empty one to force its unload.
        scripts[index].parentNode.replaceChild(document.createElement('script'), scripts[index]);
        var elapsed = +new Date() - start;
        alert(file + ": " + elapsed.toString());
    }
    return false;
}
</script>
This code attempts to force the unload of each of the JavaScript files that were loaded on the page, reporting the time it takes to drop each of them in milliseconds. Fire this as is convenient, e.g. on unload or with a button:
<button onclick="return unloadJS()">Go!</button>
This might not work/tell you what you need to know because IE could refuse to do garbage collection when the script is disconnected. This could be because IE really doesn't unload them when you do this, or just because IE - well, what oddi said :)
In any event, this isn't a solution; it doesn't matter when the JS gets unloaded, garbage collection still takes the same amount of time. This is just an attempt at a first diagnosis, like you asked for. Hope it works/helps...
Yes, IE sucks. But there are several tools to profile your page.
Fiddler or HttpWatch is a good tool for analysing your request timeline and seeing whether it takes a long time to download all your heavy JavaScript code. That is often the main cause of a slow, heavy-JS page. Since IE doesn't handle parallel js downloads very well, it costs more time for hundreds of small JavaScript files.
For this case, try minifying your JavaScript. It is the most direct way to improve your page load performance.
If that doesn't help much, you may need YSlow for a detailed performance analysis. Although it doesn't target IE, fixing some issues under Chrome or FF can affect performance under IE as well.
Add some logging to your console and narrow down the scope; maybe you can find the execution performance problem.
Maybe you're using a JavaScript library like PrototypeJS that hooks onto the page unload event and, when the page unloads, loops through an array removing all the event listeners for all the DOM elements on the page. If we knew which libraries you are using, we could simulate a call to unload the page to force the library to execute its unload function. Afterwards, start a timer to see how long it takes to load another page.
