I was wondering: if I have, say, 6 JavaScript includes on a page and 4-5 CSS includes as well, does it actually make the page load more optimally if I create one file (or perhaps two) and append them all together, instead of having a bunch of them?
Yes. You will get better performance with fewer files.
There are a few reasons for this and I'm sure others will chime in as I won't list them all.
There is overhead in each request in addition to the size of the file, such as the request itself, the headers, cookies (if sent) and so on. Even in many caching scenarios the browser will send a request to check whether the file has been modified or not. Of course, proper headers/server configuration can help with this.
Browsers by default have a limited number of simultaneous connections they will open at a time to a given domain. I believe IE has 2 and Firefox does 4 (I could be mistaken on the numbers). Anyway, the point is, if you have 10 images, 5 JS files and 2 CSS files, that's 17 items that need to be downloaded, and only a few will be fetched at the same time; the rest are just queued.
I know these are vague and simplistic explanations, but I hope it gets you on the right track.
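To illustrate the revalidation point above, here's a minimal Node.js sketch (the file name is hypothetical) of a server answering the browser's "has it changed?" request with 304 Not Modified, so only headers cross the wire instead of the whole file:

    // Minimal sketch: answer revalidation requests with 304 Not Modified.
    const http = require("http");
    const fs = require("fs");

    http.createServer((req, res) => {
      const lastModified = fs.statSync("app.js").mtime.toUTCString(); // hypothetical file

      // The browser echoes our Last-Modified value back in If-Modified-Since.
      if (req.headers["if-modified-since"] === lastModified) {
        res.writeHead(304); // headers only, no body
        res.end();
        return;
      }

      res.writeHead(200, {
        "Content-Type": "application/javascript",
        "Last-Modified": lastModified,
      });
      res.end(fs.readFileSync("app.js"));
    }).listen(8080);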
One of your goals is to reduce HTTP requests, so yes. A tool called YSlow can grade your application to help you see what you can do to give users a better experience.
http://developer.yahoo.com/yslow/
Even when the browser makes several requests, it tries to open the least number of TCP connections (see the docs for the Keep-Alive HTTP header option). Page load speed can also be improved by setting up compression (DEFLATE or GZIP) on the server side.
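As a rough sketch of server-side compression (in Node.js; the file name is hypothetical, and a real setup would usually do this in the web server config rather than application code):

    // Minimal sketch: gzip the response when the client advertises support.
    const http = require("http");
    const fs = require("fs");
    const zlib = require("zlib");

    http.createServer((req, res) => {
      const body = fs.readFileSync("combined.js"); // hypothetical combined file
      const acceptsGzip = (req.headers["accept-encoding"] || "").includes("gzip");

      if (acceptsGzip) {
        res.writeHead(200, {
          "Content-Type": "application/javascript",
          "Content-Encoding": "gzip",
        });
        res.end(zlib.gzipSync(body)); // typically shrinks JS/CSS dramatically
      } else {
        res.writeHead(200, { "Content-Type": "application/javascript" });
        res.end(body);
      }
    }).listen(8080);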
Each include is a separate HTTP request the user's browser has to make, and with an HTTP request comes overhead (on both the server and the connection). Combining multiple CSS and JavaScript files will make things easier on you and your users.
This can be done with images as well, via a technique called CSS sprites.
Yes. You are making fewer HTTP requests that way.
The best possible solution would be to combine all the code into one file so it can be fetched in one GET request by the browser. If you are linking to multiple files, the browser has to request these external files every time the page is loaded.
This may not cause a problem if pipelining is enabled in the browser and the site is not generating much traffic.
Google have streamlined their code into one file. I can't even imagine how many requests that has saved, and how much it has lightened the load on their servers, with that amount of traffic.
There's no longer any reason to feel torn between wanting to partition js & css files for the sake of organisation on the one hand and to have few files for the sake of efficiency on the other. There are tools that allow you to achieve both.
For instance, you can incorporate Blender into your build process to aggregate (and compress) CSS and JavaScript assets. Java developers should take a look at JAWR, which is state of the art.
I'm not really very versed in the factors that affect server load, but I think the best thing to do would be to find a balance between having one big chunk and having your scripts organized into meaningful separate files. I don't think that having five or so different files should influence performance too much.
A more influential factor to look at would be the compression of the scripts: there are various online utilities that get rid of whitespace and use more efficient variable names, and I think these will result in much more dramatic improvements than putting the files together.
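As an illustration of what those utilities do (hand-minified here, not the output of any particular tool):

    // Before: readable source.
    function calculateTotal(itemPrice, itemQuantity) {
      var totalPrice = itemPrice * itemQuantity;
      return totalPrice;
    }

    // After: whitespace stripped, local names shortened.
    function calculateTotal(e,t){return e*t}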
As others have said, yes, the fewer files you can include, the better.
I highly recommend Blender for minifying and consolidating multiple CSS/JS files. If you're like me and often end up with 10-15 stylesheets (reset, screen, print, home, about, products, search, etc...) this tool is a great help.
I always hear in production, you want to combine multiple .js files into 1 to make it load faster.
But since browser actually makes multiple request concurrently, there's a chance that multiple files can be loaded faster than a single file, which has to be downloaded from beginning to end.
Is this reasoning correct?
It's a complex area.
The browser making multiple concurrent connections to the same server (which are usually quite limited in number) doesn't make the connection between the client and server faster. The pipes between them are only so big, and the server only has so much delivery capacity. So there's little if any reason to believe 4 parallel downloads, each of 10k, from the same server are likely to be faster than 1 download of 40k from that server. Add to that the fact that browsers limit the number of concurrent connections to the same server, and the expense of setting up those individual connections (which is non-trivial), and you're still better off with one large file for your own scripts.
For now. This is an area being actively developed by Google and others.
If you can load scripts from multiple servers (for instance, perhaps load common libraries from any of the several CDNs that make them accessible, and your own single combined script from your own server [or CDN]), it can make sense to separate those. It doesn't make the client's connection faster, but if the client's connection isn't the limiting factor, you can get a benefit. And of course, for a site that doesn't justify having its own CDN, loading common libraries from the free CDNs and just your own scripts from your own server lets you get the advantage of edge-casting and such on the scripts you load from the free CDNs.
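As a sketch of that split (the CDN URL follows the Google Hosted Libraries convention, but the fallback path and the pattern itself are illustrative):

    // Sketch: load a common library from a free CDN, with a fallback
    // to a copy on our own server if the CDN request fails.
    function loadScript(src, fallbackSrc) {
      var s = document.createElement("script");
      s.src = src;
      if (fallbackSrc) {
        s.onerror = function () {
          loadScript(fallbackSrc); // hypothetical local copy
        };
      }
      document.head.appendChild(s);
    }

    loadScript(
      "https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js",
      "/js/jquery.min.js"
    );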
For large JS files:
Not a good idea. If you have small JS files then it's a good idea to merge them; otherwise, if the JS runs to more than 500 KB or so, the single file will end up megabytes in size and take a huge amount of time to load over one HTTP request.
For small JS files:
A good idea, but it's better to use a third-party tool that will also compress your final single file so the HTTP request takes less time. I would suggest PHP Minify (though you can find another that suits you), which lets you create a single HTTP request for a group of JS or CSS files. Minify also handles gzipping, compression, and HTTP headers for client-side caching.
[The PHP Minify demo page shows before/after comparisons.]
It depends on whether your server is using HTTP/2 or HTTP/1.1.
HTTP/2
HTTP/2 (H2) allows a server to quickly respond to multiple requests, letting the client stream all its requests without waiting for the first one to return and be parsed. This helps to mitigate the need for concatenation, but doesn't entirely remove it. See this post for an in-depth answer on when you should or shouldn't concatenate.
Another thing to keep in mind is that if your server gzips your assets, it can actually be better to concatenate some of them together since gzipping can perform better on larger files with lots of repeating text. By separating all your files out, you could actually hurt your overall performance. Finding the most optimal solution will require some trial and error (a lot of this is still new and so best practices are still being discovered).
HTTP/1.1
With HTTP/1.1, as the other answers have pointed out, for the majority of cases combining all your files into one is better. This reduces the number of HTTP requests, which can be slow with HTTP/1.1. There are ways to mitigate this by requesting assets from different subdomains to allow multiple concurrent requests.
I recommend reading High Performance Browser Networking for a complete understanding on strategies for HTTP/1.1.
I am curious as to why the Facebook developers have chosen to not combine their scripts and stylesheets into single files. Instead they are loaded on demand via their CDN.
Facebook is obviously a very complex application and I can understand how such modularity might make Facebook easier to maintain, but wouldn't the usual optimisation advice still apply (especially given its high level of usage)?
Or, does the fact that they are using a CDN avoid the usual performance impact of having lots of small scripts / styles?
In a word: BigPipe. They divide the page up into "pagelets"; each is processed separately on their servers and sent to the browser in parallel. Essentially almost everything (CSS, JS, images, content) is lazy loaded, so it comes down in a bunch of small files.
They might be running into the case where the savings of being able to serve different combinations of JS files to the browser at different times (for different pages or different application configurations for different users) represents a larger savings than the reduced HTTP request overhead of combining all of the files into one.
If a browser is only ever executing a small percent of the total JS code base at any given time, then this would make sense. Because they have so many different users and different parts of different applications running in different configurations for those users, it is arguable that this is the case.
Second, those files only need to be downloaded once; the browser won't ask for them again until they have changed or the cache has expired, so only the first visit really benefits from the all-in-one style. And yes, having an advanced CDN with many edge locations around the world definitely helps.
Maybe they think it's more likely that you visit Facebook more often than you clear your browser cache.
Sites like Facebook use "lazy" loading of js.
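A minimal sketch of that idea (the element ID and file name are hypothetical): load a script only the first time the user actually needs it.

    // Lazy-load a script on first use instead of at page load.
    var commentsLoaded = false;

    document.getElementById("show-comments").addEventListener("click", function () {
      if (commentsLoaded) return;
      commentsLoaded = true;

      var s = document.createElement("script");
      s.src = "/js/comments.js"; // hypothetical on-demand module
      document.head.appendChild(s);
    });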
Take into consideration that I have one server with heavy traffic. I'm interested: which one is better?
When I do more HTTP requests at once: slower loading of the page (due to the limit of 2 requests at a time).
When I do one HTTP request with all the code: traffic (in GB) goes up and the Apache processes get to rest a bit more, but we'll still have slower loading of the page.
What's faster in the end?
Fewer requests! It's the reason why we combine JS files, combine CSS files, use image sprites, etc. You see, the problem with the web is not the speed of processing by the server or the browser; the biggest bottleneck is latency! You should look up Steve Souders' talks.
It really depends on the situation, device, audience, and internet connection.
Mobile devices, for example, need as few HTTP requests as possible, as they are on slower connections and every round trip takes longer. You should go as far as inlining (base64) images inside your CSS files.
Generally, I compress the main platform and JS libs + CSS into one file each, which are cached on a CDN. JavaScript or CSS functionality that is only on one page I'll either inline or include in its own file. JS functionality that isn't important right away I'll move to the bottom of the page. For all files, I set a far-future HTTP Expires header so it stays in the browser cache forever (or until I update it or it gets bumped out when the cache fills).
Additionally, to get around download limits you can have CNAMEs like images.yourcdn.com and scripts.yourcdn.com so that the user can download more files in parallel. Even if you don't use a CDN, you should host your static media on a separate hostname (it can point to the same box) so that the user isn't sending cookies when they don't need to. This sounds like overkill, but cookies can easily add an extra 4-8 KB to every request.
In a development environment, you should be working with uncompressed, individual files; there's no need to move every plugin into one script, for example, since that's hard to maintain when there are updates. You should have a script to merge the files before testing and deployment. This sounds like a lot of work, but it's something you do once and can reuse for all future projects.
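Such a merge script can be tiny; here's a sketch in Node.js (the file list is hypothetical):

    // build.js -- concatenate individual source files into one deployable file.
    const fs = require("fs");

    const sources = ["js/plugins.js", "js/accordion.js", "js/app.js"]; // hypothetical
    const combined = sources
      .map((file) => fs.readFileSync(file, "utf8"))
      .join(";\n"); // the semicolon guards against files missing a trailing one

    fs.writeFileSync("js/all.js", combined);
    console.log("Wrote js/all.js (" + combined.length + " bytes)");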
TL;DR: It depends, but generally a mixture of both is appropriate. 'Cept for mobile, less HTTP is better.
The problem is a bit more nuanced than that.
If you put your script tags anywhere but at the bottom of the page, you are going to slow down page rendering, since the browser isn't able to do much when it hits a script tag other than download it and execute it. So if the script tag is in the header, that happens before anything else, which leads to users sitting there staring at a white screen until everything downloads.
The "right" way is to put everything at the bottom. That way, the page renders as assets are downloaded, and the last step is to apply behavior.
But what happens if you have a ton of JavaScript? (In Facebook's case, about a meg.) What you get is a page that renders, then is completely unusable until the JS comes down.
At that point, you need to look at what you have and start splitting it between vital and non-vital JS. That way you can take a multi-stage approach, quickly bringing in the stuff that is necessary for the page to function at a bare minimum level, and then loading the less essential stuff afterwards, or even on demand.
Generally, you will know when you get there; at that point you need to look at more advanced techniques like script loaders. Before that, the answer is always "fewer HTTP requests".
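One simple way to sketch that multi-stage idea (the file name is hypothetical): keep the vital JS in the main bundle, and fetch the rest only after the page has loaded and rendered.

    // Stage two: pull in non-vital JS after the page has finished loading.
    window.addEventListener("load", function () {
      var s = document.createElement("script");
      s.src = "/js/non-essential.js"; // hypothetical: widgets, analytics, etc.
      document.body.appendChild(s);
    });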
Ignoring download times, what's the performance impact of making the browser interpret several separate small files as opposed to one big one? In particular, could it make a significant difference to page rendering speed in IE6 and 7?
Browsers typically limit themselves to a certain number of simultaneous requests. This number is dependent on how "server friendly" they are.
How many concurrent AJAX (XmlHttpRequest) requests are allowed in popular browsers?
So, depending on the number of artifacts the browser has to load, it may have to wait for others to complete first. Artifacts include everything the browser has to go back to the server for: images, javascript, css, flash, etc. Even the favicon if you have one.
That aside, rendering speed is normally going to boil down to how the pages are structured, i.e. how many calculations you depend on the browser to make (% width vs. fixed width).
It has to make more round-trip HTTP requests. It may or may not have significant consequences.
Apart from download times, if you have many JavaScript and CSS files, each one is an extra HTTP call from the client to the server. If page load time is one of your main criteria, you should definitely think about combining them.
Also read this doc:
http://developer.yahoo.com/performance/rules.html
I work for a gov't organization with a large scale enterprise intranet and when we had around 25+ JS files and 10+ CSS files loading on our intranet portal we did notice a dramatic lag in page load time in IE6 and 7. Newer browsers have faster routines for loading and executing JavaScript. I used YUI Compressor to minify everything including CSS.
If you include minification in along with combining files, then dead code often gets removed (depending on the minifier) and some code can be optimized (see YUI Compressor: What are micro optimizations? and Which javascript minification library produces better results?).
I've asked this question a bunch of times when I first started out with web development.
If you have under 10 JavaScript files and 10 CSS files (CSS being not so important, in my opinion), then I don't think there is much use in minifying and compressing. However, if you are dealing with a bunch of JavaScript files (more than 10), then YES, it's gonna make a difference.
What you may find is that even after compressing, minifying, and combining your scripts, you still experience slowness. That's when HTML caching plays a huge role in website optimization; at least, that's what I experienced in my web application. Try looking into Memcached and use it to cache your HTML output. This technique speeds up your web application a WHOLE LOT!!!
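As a rough sketch of that idea (using the memcached npm client; the key scheme, TTL, and renderPage function are hypothetical):

    // Sketch: serve rendered HTML out of Memcached when possible.
    const Memcached = require("memcached");
    const memcached = new Memcached("localhost:11211");

    function getPage(url, renderPage, callback) {
      memcached.get(url, (err, cachedHtml) => {
        if (cachedHtml) return callback(cachedHtml); // cache hit: skip rendering

        const html = renderPage(url); // hypothetical: the expensive part
        memcached.set(url, html, 600, () => {}); // cache for 10 minutes
        callback(html);
      });
    }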
I am assuming your question is related to web optimization and high performance websites.
Just my 2 cents.
I was wondering what would be best. I have different JS functions; for instance, I have the accordion plugin and a script for the contact page. But I only use each script on one page, e.g. the FAQ page uses the accordion JS but not the contact JS, obviously.
This along with many other examples (my JS dir is 460 KB in total, separated into different files).
So what's best: put all the scripts in one file and load it in my header template, or separate them into about 10 different files and load them only when I need them?
Regards
You want to place them all in one file. It cuts down on the number of trips to the server and reduces overhead.
Placing them at the end of the document is generally recommended as that way the rest of the page downloads beforehand.
Here's a link describing the best practices by Yahoo on where to include scripts and about minimizing trips to the server.
http://developer.yahoo.com/performance/rules.html
The "best" isn't usually a one-size-fits-all.
Merging the files together means fewer connections, and (assuming your cache settings are correct) it lets your first page view take the hit so that all other pages benefit.
Splitting them out gives you more granularity in terms of caching, but it comes at the cost of many connections (each connection has an overhead associated with it). Remember that many browsers only make 2 connections to any given hostname. This is an old restriction imposed by the HTTP spec.
Usually "best" is finding the way to chunk them into large enough groups that for any one page you aren't downloading too much extra but you are able to share a lot between pages. Favor fewer groups over worrying about downloading too much.
Whichever way you go, make sure you compact your scripts (remove whitespace, comments, etc.), serve them up GZipped or Deflated, and set your expire headers appropriately so that a user isn't downloading the same scripts over and over.
I would group it into two or three files based on what is used everywhere versus only somewhere.
Also, with that much code, you should look at minifying the code to reduce the download time. I've used the YUI Compressor before, does a good job and is easy to integrate into a build file.
Combine them into a single file - it will mean fewer HTTP requests.
However, it is very important that you are setting expiry headers on your CSS and JS files. You should always have these headers set, but it's especially bad if you're forcing the user to re-download the contents of 10 files each page load.
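For instance, a minimal Node.js sketch of serving a combined script with far-future expiry headers (the file name and duration are hypothetical):

    // Sketch: serve a combined script with far-future caching headers.
    const http = require("http");
    const fs = require("fs");

    http.createServer((req, res) => {
      const oneYearSeconds = 365 * 24 * 60 * 60;
      res.writeHead(200, {
        "Content-Type": "application/javascript",
        "Cache-Control": "public, max-age=" + oneYearSeconds,
        "Expires": new Date(Date.now() + oneYearSeconds * 1000).toUTCString(),
      });
      res.end(fs.readFileSync("all.js")); // hypothetical combined file
    }).listen(8080);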
If you really only use each function on a single page, you won't gain much by combining them into a single file. It'll take longer to load whatever page a visitor hits first, but subsequent pages will load faster.
If most scripts are only used on a few pages, then it might make sense to figure out which pages visitors are likely to hit first (main page, plus whatever's bookmark-worthy) and produce combined js files for those pages, so they load as quickly as possible. Then just load the less-used scripts on whatever page they're used.