Lazy-Loading or Delaying Low-Priority JavaScript

TLDR Version
Pretty simple question: what is the best method to lazy-load or delay low-priority JavaScript? This could be anything from Google Analytics, Optimizely, and Test & Target to custom code, all loaded with Adobe DTM.
Longer Version
We have analysed our traffic and found that slow loading times cause fewer sales, no surprise there. Unfortunately, a lot of the JavaScript is bloat added by a separate department, and it's only getting worse, with load times now coming in above 10 seconds.
The evil side of me wants to wrap the Adobe DTM script in a page load event and a setTimeout to delay the code until much later, but this has the side effect that page load events in the vendors' code will not execute properly.
The best solution I have so far is to add a page load event that loads Adobe DTM by appending a <script> element to the <body> tag. Essentially, we need the application's code to load first, and the marketing code should load in the background without impacting the website.
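For reference, a minimal sketch of that approach (the embed URL is a placeholder, not a real DTM path):
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = '//assets.adobedtm.com/your-embed-code.js'; // placeholder embed URL
  s.async = true;
  document.body.appendChild(s); // DTM now loads after the application's code
});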
Can anyone provide a better solution?

Check your page organization - make sure low-priority scripts are loaded at the end of the body, not in the head.
Add a defer attribute to the script tags of low-priority scripts (see the example tag after this list).
Be sure you have compression enabled for JavaScript files (web server configuration).
Leverage browser caching by setting far-future expiration dates for files that don't change often, and append a version or timestamp to the JavaScript file names so that updates are still picked up.
If possible, minify the JavaScript.
If possible, reduce the number of JavaScript files by combining them into a single file.
Strive to ensure only the JavaScript required for each page is requested.
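For the defer and versioning tips above, the tag would look something like this (the path and version value are placeholders):
<script src="/js/low-priority.js?v=20150601" defer></script>
The v query string changes whenever the file does, so far-future Expires headers stay safe to use.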

Please note that, per the documentation (https://marketing.adobe.com/resources/help/en_US/dtm/deployment.html), if the DTM embed code is not implemented in the prescribed manner, then technically it is no longer a supported implementation.
Rudi Shumpert
Sr. Architect & Evangelist
Adobe Marketing Cloud Core Services
DTM Blog Posts: http://blogs.adobe.com/digitalmarketing/tag/dynamic-tag-management/
Full DTM demo (no slides, just a live demo): https://outv.omniture.com/play/?v=B5ODcybDozpBfRAARKiVrQ7V9lMtrD1C
DTM Help & Documentation: https://marketing.adobe.com/resources/help/en_US/dtm/
Marketing Cloud ID Service: https://marketing.adobe.com/resources/help/en_US/mcvid/

Related

Load external JS without impacting TTI

I have multiple external JS files from the same provider that I insert into the website to embed Instagram (it creates a gallery for each insert). Unfortunately, this JS significantly hurts the Google PageSpeed score; I am talking about a reduction of 20-23 points on mobile and desktop.
Unfortunately, this product doesn't offer any other embed option; it is a normal script tag. I have tried async and defer, but to no avail: Google is clever and waits for all the scripts on the page to load when calculating Time to Interactive, which is therefore delayed. Obviously SEO is taking a hit as the website's performance suffers.
The question: is there a way I can load all these scripts without affecting DCL (DOMContentLoaded) and TTI? Any help is appreciated. It is sufficient for these scripts to load once the relevant DOM element is in view.
Since you can't split them up into several chunks, I would suggest you load these files dynamically on demand (when they're needed).
This answer shows how to load a script dynamically.
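Building on that answer, and since it's sufficient for the scripts to load once the element is in view, a minimal sketch using an IntersectionObserver (the selector and embed URL are placeholders):
var gallery = document.querySelector('.instagram-gallery'); // placeholder selector
var observer = new IntersectionObserver(function (entries, obs) {
  if (entries[0].isIntersecting) {
    var s = document.createElement('script');
    s.src = 'https://provider.example.com/embed.js'; // placeholder embed URL
    s.async = true;
    document.body.appendChild(s);
    obs.disconnect(); // inject only once
  }
});
observer.observe(gallery);
Scripts injected this way are not part of the initial parse, so they don't hold up DOMContentLoaded.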

J2EE web application Performance Tuning

I am trying to improve the performance of my web application.
It is a Java-based web application deployed on an Amazon cloud server with JBoss and Apache.
There is one page in the application that takes 13-14 seconds to open. It packs in so much functionality that 100+ HTTP requests are executed at page load time, and the JS & CSS files take too long to load.
So I moved all of the JavaScript code from my JSP page into new JS files and minified the JS & CSS files. Still, there is not much difference in the page load time.
There is Dojo data on this page as well, which takes time to load.
Is there any other approach I should try to tune this page?
Can something be done at the JBoss or Apache level?
Apply caching for images etc. More here
Use a CDN for any external libraries you use (like jQuery). More here
Use a loader like RequireJS for your JS and optimize your CSS and JS files. You can concatenate (merge multiple JS files into one) your code, which reduces the number of requests the browser makes when it sees a JS or CSS dependency. (As @Ken Franqueiro mentions in the comment section, Dojo already has its own build mechanism for this.) A minimal build profile is sketched after this list. More here
Optimize your images and use appropriate dimensions. Do not ship a full-size image if you just intend to display it in a 10x10 container. Use sprites if possible. More here
Show a message/loader to give the user some sense of progress. This will minimize user restlessness. More here
If some data takes too long to load, show the page first and load the data afterwards. This too gives the user a sense of progress.
If the response is very big, you can compress your response data. Be careful though: make sure the browsers your application supports can handle the compressed response by default, or add a custom mechanism to decompress it. More here
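For the RequireJS suggestion above, a minimal build profile for the r.js optimizer might look like this (module names and paths are hypothetical; if you are on Dojo, its own build system fills the same role):
// build.js - run with: node r.js -o build.js
({
  baseUrl: "js",            // directory containing the source modules
  name: "main",             // entry module whose dependencies get traced
  out: "dist/main.min.js",  // single concatenated, minified output file
  optimize: "uglify"        // minify the bundle during the build
})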
Use profiling tools like the Chrome Developer Tools or Firebug for Firefox.
Take a snapshot of your network traffic and check where the bottleneck is.
In Chrome you press F12 and then select the Network tab.

Javascript/jQuery optimization for large projects (social media)

I am developing a social media website on the scale of LinkedIn, and I am struggling with a few things related to JS. As the project goes on, the amount of JavaScript/jQuery required keeps increasing. I am guessing that by the end of the project, the JS required on each page will average 1.5 MB in file size, which is a bad idea. Functions like image upload, comments, likes, and updates appear on almost every page.
My question here is: how do I optimize this jQuery? Are there any tricks or concepts I can use on this project that will lower the file size while still providing these functions (there are many more than just the ones above)?
Here I have already implemented:
1. Compress/minify the JS with a PHP plugin
2. Use jQuery.getScript() to load a file only when it is required (see the sketch after this list)
3. Divide one large JS file into a few separate files so they finish loading faster
4. HTML optimization (Google PageSpeed)
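To expand on point 2, a rough sketch of on-demand loading with jQuery.getScript() (the file path, element id, and init function are all hypothetical):
// Load the image-upload module only on pages that actually contain the widget.
if (document.getElementById('image-upload')) {
  jQuery.getScript('/js/image-upload.js')
    .done(function () { initImageUpload(); })                      // hypothetical initialiser
    .fail(function () { console.error('image-upload.js failed'); });
}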
I've heard of people using YUI Compressor, which 'has a better compression ratio' than most other tools. Not sure what you're using for compression at the moment, but this could provide a slight improvement. http://yui.github.io/yuicompressor/
Use a CDN (Content Delivery Network) to deliver jQuery. The browser will most probably have the library cached already, so each page that loads won't actually need to download it again. This, if I'm not wrong, is one of the main reasons CDNs are used.
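A common companion to the CDN approach is a fallback line placed immediately after the CDN script tag, in case the CDN is unreachable (the local path is a placeholder):
// If the CDN copy failed to load, fall back to a locally hosted copy.
window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');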
Otherwise, you should rethink your dependency/development strategy, since you might not need jQuery on every page after all, or at least not all of it.

Is there an advantage to dynamically loading/unloading javascript and css stylesheets?

Background:
I'm putting together a site that will use ajax as a primary method of changing the content. That way the main frame and images will not have to be constantly reloaded with every page change. The main frame has its own site.css stylesheet.
Question 1:
Is it worthwhile to put all the stylesheet information into a single stylesheet? I think that this would make the website less modular. Every time a new page or piece of content is added/removed, the CSS would have to be updated (assuming the content requires different style information).
Question 1.1:
Same question, but for JavaScript.
Question 2:
If it is worthwhile (as I think it is) to have multiple stylesheets, is it beneficial to unload a stylesheet when it's not in use? For example, when I load the profile.php page I dynamically load profile.css. The user then changes to the settings.php page, so I unload profile.css and load settings.css. Is this constant loading/unloading going to tank performance, or will it save on page size?
Question 2.1
Same question as above but applied to javascript functions.
Once your JavaScript or CSS file is downloaded to the user's machine, it is cached by the browser, so you don't incur the additional cost of another HTTP request. Lazy loading the scripts and stylesheets can make sense, but there is no sense in unloading these web assets once they have already been sent to the client.
It is good to use some sort of mechanism to compile your scripts and stylesheets so that the initial HTTP requests are minimized to one per asset type. Having only one stylesheet and one JavaScript file would be an architectural nightmare in many cases, but that doesn't mean you can't still present them to the browser that way. I use .NET, so I'm not sure how you handle this in PHP, but I'm sure the functionality is out there.
Answer 1:
This is all about balance. You should keep different CSS and JS files, but combine them all into one before deploying (one CSS file and one JS file). Basically there's no need to develop with one big file because there are tools that can compile them for you.
The tricky part (depending on the complexity of your site) is figuring out which CSS and JS code gets used frequently enough to be called on every page. If you have code that only gets used on one page, you should probably dynamically load it...unless your whole site only has 5 pages. Like I said, balance.
Answers 2 and 2.1:
No, not worthwhile. As soon as you load the CSS and JS files, they get cached on the user's machine. So unloading them is pointless.
This smells somewhat of premature optimization to me. Let's look at SO's load times:
all.css - 46.8 Kb - load time 87 ms
Just to drive the point home some more, here are the load times for a bunch of other items on this particular SO page (on my four-year-old laptop over a fiber-optic connection), along with the relative sizes of (some of) those components: [timing and size screenshots not reproduced here]
I think the takeaway is that if you build the rest of your site well, you shouldn't have to worry about optimizing those last few milliseconds until they really count.
Each time you need another CSS or JS file, your browser requests it. The more requests you make, the more time your page takes to load.
Question 1
Yes, because as said before, once downloaded the CSS gets cached; that's why, in general, it's better not to define style information inline. You can handle this with a simple PHP script that declares the content type as CSS and then serves your individual CSS files as one.
header("content-type: text/css");
$file = file_get_contents('your.css');
$file .= file_get_contents(...)
Then just refer to it as normal in a link tag.
<link rel="stylesheet" href="yourCSS.php">
Question 1.1
Same answer as before; just use this instead:
header("content-type: application/javascript");
Question 2
No. Once loaded, it will remain accessible in the browser's memory cache. Google's GWT framework actually compiles all of your external content and transfers it as one large monolithic bundle, because you can only make 2 concurrent HTTP requests per domain. That's also why Yahoo's speed guide recommends placing your CSS at the top of the page and your JavaScript at the bottom of the body tag.
Question 2.1
No. But do try to keep your external files small. Always minify production code.
Answer 1
While there may be some benefit to loading your CSS rules on an as-needed basis, I would advise against it, simply because it works against having a unified layout for your site or web application. Your design should be standardized across all of your pages, and whatever "modules" you load via Ajax requests should use the same layout as the primary page. jQuery UI is a good example: whether you use a given widget or not, its style is downloaded nevertheless.
However, if you need to apply specific styles to the bits of HTML that you retrieve from your requests, you can simply put these rules inside a <style> tag appended to the <head> section, add the CSS file to your <head>, or even set the rules in the style attribute of your HTML elements.
Answer 1.1
If you load JavaScript code on an as-needed basis, that code should not be loaded twice, unless you really need to reload it for some obscure reason...
But in any case, neither CSS nor JavaScript should be "unloaded"; there is no point to that.
Answer 2 and 2.1
If you had asked this question a few years back, the answer would probably have been that "loading a single stylesheet and/or JavaScript file is better", but this is not really true anymore. With today's connection speeds and computer performance, browsers have become very efficient, and the trouble is not worth the performance gain (most of the time). Moreover, if these resources are static (the URLs don't change), they are cached and reused, however many there are. Users usually don't mind waiting if the wait is expected (see section 2.10), or if it's their first visit.

Speed up web site loading

I am looking for the best way to speed up the load time of my JS.
The problem is that I am working on a very large site that uses the jQuery framework, and because the site is also loading Facebook Connect, AddThis sharing, Google Analytics, and another tracking code, the jQuery is delayed by a few seconds; certain elements like the calendar just pop into view, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4 seconds, which is too much.
I am already minifying, and jQuery/jQuery UI are being loaded from Google. What's the best way to approach this?
Making fewer HTTP calls by combining images, scripts, and CSS, and using a Content Delivery Network for your static images and CSS, might help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the body closing tag. That way the elements that you want to access are loaded when the script runs, and you can put external scripts below the initialising code to make them load after.
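A rough sketch of that ordering (initCalendar stands in for your own initialiser):
// Inline, just before the closing body tag -- the DOM above is already
// parsed at this point, so no ready() wrapper is needed:
initCalendar();
// External trackers (Facebook Connect, AddThis, analytics) then go in
// script tags below this inline code, so they are fetched last.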
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of building your content asynchronously? One option might be to put placeholders in place of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is drop the markup into the placeholders.
This creates a better user experience: instead of the user waiting 10 seconds for the entire page, it starts loading incrementally after 2 seconds (and still fully loads after 10).
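One way to implement the placeholder idea (the fragment URL, container id, and initialiser are hypothetical):
// Render an empty #calendar-placeholder in the initial HTML, then fill it
// in once everything else has finished loading.
$(window).on('load', function () {
  $('#calendar-placeholder').load('/fragments/calendar.html', function () {
    initCalendar(); // wire up the freshly inserted markup
  });
});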
In addition to Yuval's answer, here are some options that may or may not bring you a speed gain:
The load time of external libraries is beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded. That way your page won't stall if Google Analytics or Facebook has another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI, and as much of your own JS as is reasonable into a single file, minifying and gzipping it, and letting the server serve the gzipped version where possible. Note that the speed gain depends largely on what your users have cached and where they cache it. If they already have jQuery from Google in their cache, this technique might make the page load slower.
The bottom line is that after some optimization you're down to experimenting. You must find out what your average user has in her cache, whether the page is accessed directly via deep links, and whether you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content gzip/deflate-compressed. Combining multiple JavaScript files into one also helps reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php
