Concurrent loading of .js files via https - javascript

We're looking at splitting our .js so it is served from two domains, with the intent that this would enable concurrent loading.
Question: Can we a) use subdomains for that purpose and b) will that concurrent loading also hold true over https?
For instance, we'd like to request two files as such:
https://www.example.com/firstfile.js
https://subdomain.example.com/secondfile.js
Doable? Alternatives?

As far as I am aware, it won't work. Scripts are set up to block parallel downloads. The reason for that is that parallel loading of scripts can cause race conditions in your JavaScript. Minification or on-demand loading are your best options.

I think you have to consider network latency (a kind of lost time that adds up with every round trip a call has to make). Latency is what kills the responsiveness of HTTP calls.
Personally, I follow the trend of reducing the number of HTTP calls.
I merge all my files into one (+ minify + gzip).

The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames. (source)
Sounds problematic.

An alternative presented in the book "High Performance Web Sites" or its follow-up "Even Faster Web Sites" (which I recommend you read) is to load the JavaScript files dynamically, using a JavaScript function that appends script elements as child nodes to the document.
You might want to do some research on the topic, but it is a good practice you might want to consider.
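For illustration, here is a minimal sketch of that dynamic-loading approach, using the two URLs from the question; the loadScript helper and its callback parameter are illustrative, not something from the answer:

// Minimal sketch of dynamic script loading.
function loadScript(url, onLoad) {
  var script = document.createElement('script'); // create a <script> element
  script.src = url;                              // file to download
  script.async = true;                           // don't block HTML parsing
  if (onLoad) script.onload = onLoad;            // optional callback once loaded
  document.head.appendChild(script);             // appending starts the download
}

// Usage: both files download in parallel, without blocking other resources.
loadScript('https://www.example.com/firstfile.js');
loadScript('https://subdomain.example.com/secondfile.js');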

a) Yes. Use document.domain to avoid Same Origin Policy issues.
b) I don't know, but I can't think of any reason why it shouldn't.
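As a minimal sketch of the document.domain suggestion (example.com stands in for your real domain), run this on pages served from both hostnames so their scripts can interact:

// Set on pages from both www.example.com and subdomain.example.com
// so the two origins are treated as the same for DOM access.
// Note: document.domain is a legacy mechanism and deprecated in modern browsers.
document.domain = 'example.com';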

Related

Is combining all JS files into a single file necessarily faster?

I always hear that in production you want to combine multiple .js files into one to make the page load faster.
But since the browser actually makes multiple requests concurrently, there's a chance that multiple files can be loaded faster than a single file, which has to be downloaded from beginning to end.
Is this reasoning correct?
It's a complex area.
The browser making multiple concurrent connections to the same server (which are usually quite limited in number) doesn't make the connection between the client and server faster. The pipes between them are only so big, and the server only has so much delivery capacity. So there's little if any reason to believe 4 parallel downloads, each of 10k, from the same server are likely to be faster than 1 download of 40k from that server. Add to that the fact that browsers limit the number of concurrent connections to the same server, and the expense of setting up those individual connections (which is non-trivial), and you're still better off with one large file for your own scripts.
For now. This is an area being actively developed by Google and others.
If you can load scripts from multiple servers (for instance, perhaps load common libraries from any of the several CDNs that make them accessible, and your own single combined script from your own server [or CDN]), it can make sense to separate those. It doesn't make the client's connection faster, but if the client's connection isn't the limiting factor, you can get a benefit. And of course, for a site that doesn't justify having its own CDN, loading common libraries from the free CDNs and just your own scripts from your own server lets you get the advantage of edge-casting and such on the scripts you load from the free CDNs.
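As a sketch of that split (the library version and file names here are placeholders, not from the answer), you might load a common library from a public CDN and your own combined script from your own server:

<!-- Common library from a shared CDN (version number is a placeholder) -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>
<!-- Your own concatenated, minified bundle from your own server or CDN -->
<script src="https://www.example.com/js/site.min.js"></script>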
For large JS files:
Not a good idea. If your JS files are large (say, more than 500 KB), combining them produces a single file that runs into megabytes and takes a very long time to load over one HTTP request.
For small JS files:
A good idea, but it's better to use a third-party tool that also compresses the final single file, so that the HTTP request takes less time. I would suggest PHP Minify (though you can find others that suit you), which lets you serve a group of JS or CSS files with a single HTTP request. Minify also handles gzipping, compression, and HTTP headers for client-side caching.
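As a rough sketch of how Minify is typically used (the /min/ path and file names follow its default setup but are placeholders here, so check the project's documentation):

<!-- One request returns file1.js and file2.js concatenated, minified and gzipped -->
<script src="/min/?f=js/file1.js,js/file2.js"></script>

<!-- The same idea works for CSS -->
<link rel="stylesheet" href="/min/?f=css/reset.css,css/screen.css">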
It depends on whether your server uses HTTP/2 or HTTP/1.1.
HTTP/2
HTTP/2 (H2) allows a server to quickly respond to multiple requests, letting the client send all of its requests without waiting for the first one to return and be parsed. This helps to mitigate the need for concatenation, but doesn't entirely remove it. See this post for an in-depth answer to when you should or shouldn't concatenate.
Another thing to keep in mind is that if your server gzips your assets, it can actually be better to concatenate some of them together since gzipping can perform better on larger files with lots of repeating text. By separating all your files out, you could actually hurt your overall performance. Finding the most optimal solution will require some trial and error (a lot of this is still new and so best practices are still being discovered).
HTTP/1.1
With HTTP/1.1, as the other answers have pointed out, for the majority of cases combining all your files into one is better. This reduces the number of HTTP requests, which can be slow with HTTP/1.1. There are ways to mitigate this by requesting assets from different subdomains to allow multiple concurrent requests.
I recommend reading High Performance Browser Networking for a complete understanding on strategies for HTTP/1.1.

Front-end performance and number of css/js files

I am trying to find an answer as to how to improve front-end performance for web applications. Say I have multiple CSS/JS files being referenced.
The browser would make an HTTP call for each of the CSS/JS files. My questions are:
Does it happen in parallel or happen one after the other ? Is it same for both CSS/JS ?
Is the behaviour (parallel or one after the other) browser-specific ?
Is the use of the async attribute on the script tag a standard or accepted way to achieve asynchronous download?
Are there any limitations to the number of HTTP calls that can be made for a single page? Is it browser-specific?
Does using AMD frameworks like RequireJS solve any of the performance issues OR is it to be used only in a single-page app development ?
Apart from that, references to any other general front-end performance improvement tips would be great.
Does it happen in parallel or happen one after the other ? Is it same for both CSS/JS ?
Is the behaviour (parallel or one after the other) browser-specific ?
Browsers download the content of a website in parallel using multiple connections. The number of those connections depends on the browser and its user settings. If memory serves, the average number of connections is 4.
Is the use of async attribute for script tag standard or accepted way for asynchronous download?
The async attribute is used to denote that the script is to be executed asynchronously; it has no effect on the precedence of the download.
Are there any limitations to the number of http calls that can be made for a single page ? Is it browser specific?
There is no limit, although obviously the more you have, the longer it will take for the page to download due to the connection limit.
Does using AMD frameworks like RequireJS solve any of the performance issues OR is it to be used only in a single-page app development ?
Those frameworks can be used on any website, with any structure. Their benefit comes from delaying the download of JS until it is actually required by the page. This means that other UI elements, such as images and video, can be downloaded first, which makes the page load appear quicker for the end user.
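For reference, a minimal sketch of the attribute in question (file names are placeholders):

<!-- Downloads without blocking HTML parsing; executes as soon as it arrives -->
<script async src="/js/analytics.js"></script>
<!-- Also downloads without blocking parsing, but executes only after the document has been parsed -->
<script defer src="/js/app.js"></script>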
If you link your CSS/JS files in index.html, the requests will be parallel, not serial.
I'm not sure about this, but I guess it's parallel for all browsers, unless you link one CSS file in index.html and then import the other using @import inside a CSS file.
For asynchronous download, you need to use require.js or a similar loader. The async attribute is about execution only, not about the request.
There is no limit on HTTP requests in a page.
Using require.js is a good option. With require.js you can use r.js, which will help you create a build that reduces your multiple CSS & JS files to a single file.
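A minimal sketch of what that looks like (module names, paths and file names below are placeholders): a page entry point loaded through RequireJS, plus an r.js build profile that concatenates the modules into one file.

// main.js — entry point, loaded via <script data-main="js/main" src="js/require.js">
require(['jquery', 'app/view'], function ($, view) {
  // Both dependencies are fetched asynchronously before this callback runs.
  view.render();
});

// build.js — r.js build profile, run with: node r.js -o build.js
// Optimizes js/main.js and its dependencies into a single minified file.
({
  baseUrl: 'js',
  name: 'main',
  out: 'dist/main-built.js'
})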
Does it happen in parallel or happen one after the other ? Is it same for both CSS/JS ?
Answer:
When a user visits a website for the first time, the browser caches the content (JS, CSS, etc.), and the number of parallel requests depends on the browser: by default, different browsers have different parallel request limits. It is the same for JS and CSS, and even for your Ajax requests.
Is the behaviour (parallel or one after the other) browser-specific ?
Answer:
Yes, it's browser-specific.
Is the use of async attribute for script tag standard or accepted way for asynchronous download?
Answer:
There is no standard way; whether you use it depends on your requirements, and the async attribute is not related to asynchronous downloading. The downloading of content depends on the browser's settings or defaults.
Are there any limitations to the number of http calls that can be made for a single page ? Is it browser specific?
Answer:
There is no limit on HTTP calls to a server; however, the browser will send them in its own way, according to its default or user settings.

Slow Loading - How to properly load page to prevent blocking of images (queuing)

The page is loading slower than expected. I checked the timeline with Firebug, and I see a lot of image blocking:
http://i.imgur.com/tenTNVH.png
I guess I am doing something wrong. (I know I have jQuery included twice here and will eliminate that mistake.) But more generally, is there any way to load images in parallel with JS?
The reason this is happening is not that images are blocked by JS, but that the browser has a limited number of parallel connections to the same server (commonly noted as about 6-7).
If you look at your timeline closely, you will see that limit: no more than 7 files are downloaded at the same time, and the next one starts as soon as one of the current files finishes downloading.
In the past there were nasty tricks to avoid that limitation, like placing your images on subdomains and having them load in parallel just as if from another server, but there are better ways to improve loading performance. The most effective in terms of effort/result are:
Use a JS concatenation/minification toolchain. Having all the JS in one or two files leaves your connection pool available for other downloads. In your case, you have 3 versions of jQuery and 2 of jQuery UI. Do you really need all of them? Having two files instead of 5 will reduce blocking significantly, especially taking into account that the files are unminified and big.
Use CDNs for third-party libraries. There are free public ones, like the Google CDN: <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>. This also has the advantage that the library is most probably already in the browser's cache.
Combine your images into sprites. This is good if you have many small images that are UI-related rather than content. There are a lot of techniques to achieve it.
Enable SPDY on your server if it is available. This can improve download speed not by removing blocking but by removing connection overhead.
Blocking generally occurs when there are more parallel requests to the same host than the browser allows (in Chrome it's 6 per hostname; it varies per browser).
To rectify this, you have several options:
For images: split up your media content across different hosts (subdomains from which you can serve the same content).
For JS and CSS: try to minify and concatenate files on the server beforehand, so that they require fewer requests to retrieve.
For icons etc., try combining them into sprites if possible.
There's a nice article about it here: http://gtmetrix.com/parallelize-downloads-across-hostnames.html
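As a rough illustration of the "different hosts" idea (the subdomains below are hypothetical and would need DNS/server entries pointing at the same content):

<!-- The same static content exposed under two hostnames, so the browser
     opens a separate connection pool for each (hypothetical subdomains). -->
<img src="https://static1.example.com/img/photo-1.jpg" alt="">
<img src="https://static2.example.com/img/photo-2.jpg" alt="">

Note that over HTTP/2 (covered in an earlier answer) this kind of sharding is generally unnecessary and can even hurt, since a single multiplexed connection handles many requests.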

Why doesn't Facebook combine its CSS/JS files?

I am curious as to why the Facebook developers have chosen to not combine their scripts and stylesheets into single files. Instead they are loaded on demand via their CDN.
Facebook is obviously a very complex application and I can understand how such modularity might make Facebook easier to maintain, but wouldn't the usual optimisation advice still apply (especially given its high level of usage)?
Or, does the fact that they are using a CDN avoid the usual performance impact of having lots of small scripts / styles?
In a word: BigPipe. They divide the page up into "pagelets"; each is processed separately on their servers and sent to the browser in parallel. Essentially almost everything (CSS, JS, images, content) is lazy-loaded, so it comes down as a bunch of small files.
They might be running into the case where the savings of being able to serve different combinations of JS files to the browser at different times (for different pages or different application configurations for different users) represents a larger savings than the reduced HTTP request overhead of combining all of the files into one.
If a browser is only ever executing a small percent of the total JS code base at any given time, then this would make sense. Because they have so many different users and different parts of different applications running in different configurations for those users, it is arguable that this is the case.
Second, those files only need to be downloaded once; the browser won't ask for them again until they have changed or the cache has expired, so only the first visit really benefits from the all-in-one style. And yes, having an advanced CDN with many edge locations around the world definitely helps.
Maybe they think it's more likely that you visit Facebook more often than you clear your browser cache.

JavaScript and CSS loading

I was wondering: if I have, let's say, 6 JavaScript includes on a page and 4-5 CSS includes as well, would the page load faster if I created one file (or perhaps two) and appended them all together, instead of having a bunch of them?
Yes. You will get better performance with fewer files.
There are a few reasons for this and I'm sure others will chime in as I won't list them all.
There is overhead in each request in addition to the size of the file: the request itself, the headers, cookies (if sent) and so on. Even in many caching scenarios the browser will send a request to see whether the file has been modified. Of course, proper headers/server configuration can help with this.
Browsers by default have a limited number of simultaneous connections they will open to a given domain. I believe IE has 2 and Firefox 4 (I could be mistaken on the numbers). Anyway, the point is: if you have 10 images, 5 JS files and 2 CSS files, that's 17 items that need to be downloaded, and only a few will be fetched at the same time; the rest are just queued.
I know these are vague and simplistic explanations, but I hope it gets you on the right track.
One of your goals is to reduce HTTP requests, so yes. A tool called YSlow can grade your application and help you see what you can do to give users a better experience.
http://developer.yahoo.com/yslow/
Even when the browser makes several requests, it tries to open as few TCP connections as possible (see the Keep-Alive HTTP header documentation). Page load speed can also be improved by setting up compression (DEFLATE or GZIP) on the server side.
Each include is a separate HTTP request the user's browser has to make, and with an HTTP request comes overhead (on both the server and the connection). Combining multiple CSS and JavaScript files will make things easier on you and your users.
This can be done with images as well, via a technique called CSS sprites.
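For illustration, a minimal CSS sprite sketch (the file name and pixel offsets are placeholders): several icons live in one image, and each element shows a different region of it.

/* icons.png is a single image containing all the icons stacked vertically
   (file name and offsets are placeholders). */
.icon {
  display: inline-block;
  width: 16px;
  height: 16px;
  background-image: url('/img/icons.png');
  background-repeat: no-repeat;
}
.icon-home   { background-position: 0 0; }      /* first 16x16 region  */
.icon-search { background-position: 0 -16px; }  /* second 16x16 region */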
Yes. You are making fewer HTTP requests that way.
The best possible solution would be to add all the code to one page so it can be fetched in one GET request by the browser. If you link to multiple files, the browser has to request each of these external files every time the page is loaded.
This may not cause a problem if pipelining is enabled in the browser and the site is not generating much traffic.
Google has streamlined its code into one file. I can't even imagine how many requests that has saved, and how much it has lightened the load on their servers, given that amount of traffic.
There's no longer any reason to feel torn between wanting to partition JS & CSS files for the sake of organisation on the one hand and wanting few files for the sake of efficiency on the other. There are tools that allow you to achieve both.
For instance, you can incorporate Blender into your build process to aggregate (and compress) CSS and JavaScript assets. Java developers should take a look at JAWR, which is state of the art.
I'm not really very well versed in the factors that affect server load, but I think the best thing to do would be to find a balance between having one big chunk and having your scripts organised into meaningful separate files. I don't think that having five or so different files should influence performance too much.
A more influential factor to look at would be the compression of the scripts; there are various online utilities that get rid of whitespace and use more efficient variable names. I think these will result in much more dramatic improvements than putting the files together.
As others have said, yes, the fewer files you can include, the better.
I highly recommend Blender for minifying and consolidating multiple CSS/JS files. If you're like me and often end up with 10-15 stylesheets (reset, screen, print, home, about, products, search, etc...), this tool is a great help.
