I was wondering what would be best. I have several different JS files, for instance the accordion plugin and a script for the contact page, but I only use each script on one page, e.g. the FAQ page uses the accordion JS but obviously not the contact JS.
This applies to many other scripts as well (my js directory is 460 KB in total, separated into different files).
So what's best: put all the scripts in one file and load it in my header template, or separate them into about 10 different files and load each one only where I need it?
Regards
You want to place them all in one file. It cuts down on the number of trips to the server and reduces overhead.
Placing them at the end of the document is generally recommended as that way the rest of the page downloads beforehand.
Here's a link describing the best practices by Yahoo on where to include scripts and about minimizing trips to the server.
http://developer.yahoo.com/performance/rules.html
The "best" isn't usually a one-size-fits-all.
Merging the files together means fewer connections and (assuming your cache settings are correct) it will allow your first page view to take a hit and then all other pages would benefit.
Splitting them out gives you more granularity in terms of caching, but it comes at the cost of many connections (each connection has overhead associated with it). Remember that many browsers only make 2 connections to any given hostname. This is an old restriction imposed by the HTTP spec.
Usually "best" is finding the way to chunk them into large enough groups that for any one page you aren't downloading too much extra but you are able to share a lot between pages. Favor fewer groups over worrying about downloading too much.
Whichever way you go, make sure you compact your scripts (remove whitespace, comments, etc.), serve them up GZipped or Deflated, and set your expire headers appropriately so that a user isn't downloading the same scripts over and over.
I would group it into two or three files based on what is used everywhere versus only on some pages.
Also, with that much code, you should look at minifying it to reduce download time. I've used the YUI Compressor before; it does a good job and is easy to integrate into a build file.
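For intuition, here's a deliberately naive minification sketch. Real minifiers like the YUI Compressor parse the code properly and also shorten variable names; this regex approach would mangle string literals containing `//`, so treat it as an illustration only:

```javascript
// Naive minification sketch: strips comments and collapses whitespace.
// NOT safe for production (it doesn't understand string literals or
// regex literals); real tools parse the source instead.
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, '') // block comments
    .replace(/\/\/[^\n]*/g, '')       // line comments
    .replace(/\s+/g, ' ')             // collapse runs of whitespace
    .trim();
}

const src = `
  // toggle an accordion section
  function toggle(el) {
    /* flip visibility */
    el.hidden = !el.hidden;
  }
`;
console.log(naiveMinify(src));
// → "function toggle(el) { el.hidden = !el.hidden; }"
```

Even this crude pass shows where the bytes go: comments and whitespace are often a sizable fraction of hand-written source.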
Combine them into a single file - it will mean fewer HTTP requests.
However, it is very important that you are setting expiry headers on your CSS and JS files. You should always have these headers set, but it's especially bad if you're forcing the user to re-download the contents of 10 files each page load.
If you really only use each function on a single page, you won't gain much by combining them into a single file. It'll take longer to load whatever page a visitor hits first, but subsequent pages will load faster.
If most scripts are only used on a few pages, then it might make sense to figure out which pages visitors are likely to hit first (main page, plus whatever's bookmark-worthy) and produce combined js files for those pages, so they load as quickly as possible. Then just load the less-used scripts on whatever page they're used.
Related
What if I had a compilation step for my website that turned all external scripts and styles into a single HTML file with embedded <script> and <style> tags? Would this improve page load times due to not having to send extra GETs for the external files? If so, why isn't this done more often?
Impossible to say in general, because it is very situational.
If you're pulling resources from many different servers, these requests can slow your page loading down (especially with some bad DNS on the visiting side).
Requesting many different files may also slow down page load even if they're from the same origin/server.
Keep in mind not everyone has gigabit internet (or even megabit-level speeds). So putting everything directly into your HTML file (inlining or using data URIs) will definitely reduce network overhead at first (fewer requests, fewer headers, etc.).
In addition (and making the previous point even worse) this will also break many other features often used to reduce page loading times. For example, resources can't be cached - neither locally nor on some proxy - and are always transferred. This might be costly for both the visitor as well as the hosting party.
So often the best way to approach this is going the middle ground, if loading times are an issue to you:
If you're using third party scripts, e.g. jQuery, grab these from a public hosted CDN that's used by other pages as well. If you're lucky, your visitor's browser will have a cached copy and won't do the request.
Your own scripts should be condensed and potentially minified into a single script (using tools such as Browserify, webpack, etc.). This should not include frequently changing parts, as those would force you to transfer more data more often.
If you've got any scripts or resources that are really only part of your current visitor's experience (like logged-in status, colors picked in user preferences, etc.), it's okay to put these directly into the parent HTML file, if that file is customized anyway and delivering them as separate files wouldn't work or would cause more overhead. A perfect example of this is CSRF tokens. Don't do this if you're able to deliver a static HTML file that's filled/updated by JavaScript, though.
Yes, it will improve page load time, but this method still isn't used often, for these reasons:
Debugging becomes more difficult.
Updating the code later also becomes harder.
Separate CSS and JS files avoid these issues.
And for faster page loads, you can use a build system like Grunt, Gulp, or Brunch for better performance.
In an environment with at least 500 ms latency over 2G mobile connections (~0.1 Mbps), what's the fastest and most efficient way of sending about 10 KB of CSS and JS, spread across 5-10 files on the server, to the client?
I can think of three options:
combining all js to one file and all css to one file
linking all css and js files one by one
inline everything
I know Google uses inlining, but that's probably just to save server sockets. They even save RAM by running in stateless mode - they trust the clients to remember the sessions for them. Server power isn't an issue at all.
On the other hand, Facebook seems to autogenerate its CSS (the class names are base64 encoded), but splits it into over 10 different files sent to the user, and they don't even seem to optimize it that heavily; only some whitespace removal.
I'm already passing all the files through a function that compresses everything, so any one of these are feasible. I don't want to choose the first alternative because it's easier.
The first two take advantage of caching (the second a bit less than the first), but the first only requires three requests to the server (the HTML plus one CSS and one JS file), and the third only requires one GET request (ignoring the few images we might have on some of the pages).
Does Android / iOS cache js and css across restarts of the browser? If not, then inline sounds better.
The only goal is to minimize the average load time of the user. Each user will be spending about 100 page loads on the site per day, seeing about 40 css and js files per day. The css and js is basically static content. It's set to cache 30 days, and we change the url if the file changes using /path/to/file.ext?md5-hash-of-file. Also, everything is gzipped wherever possible.
EDIT:
I think I should clarify the two variants I'm considering for option two. Is it a good idea to use a single combined CSS file and a single combined JS file across the whole site? It would only take two requests and would avoid caching the same function several times because it appears in two or more different combined JS files, but a download of up to 1 MB doesn't sound that good.
Today it's basically one combined css per view, so every time you view the same page again the content is cached. However, some js and css is used on more than one page.
It really depends on the usage. For a page with only one-time visitors, I would recommend inlining everything. This makes for a faster initial load (a single request vs. multiple requests) and is easier to work with. This is the case for landing pages, help pages, wizards and similar one-use pages.
However, if you are expecting recurring visitors, I'd recommend using an external file. While the first load will be slower, you make it up with near-zero load time afterwards for these assets. This is the case for most websites.
Inline CSS and JavaScript will make your page heavy. It's a very good practice to merge all your stylesheets into one file and all your JavaScript files into another and include those in your page. This will make your page load much faster compared to inline styles.
The problem with #2, linking to each file, is that the biggest factor in load time for small elements is round-trip time, not file size. It takes several round trips to set up the connection to get each file. This also means that you should combine your CSS and JS files. In your high-latency environment, round trips will be especially painful. Here's Google's advice on round trips.
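The round-trip arithmetic can be sketched with the numbers from the question. This model assumes one round trip per request and ignores TCP handshakes and parallel connections, so the absolute figures are rough; the comparison is the point:

```javascript
// Back-of-the-envelope model: ~500 ms round-trip time, ~0.1 Mbps
// bandwidth, 10 KB of assets total. One round trip per request,
// no parallelism - a deliberate simplification.
const rttMs = 500;
const bandwidthKBps = 12.5; // 0.1 Mbps ≈ 12.5 KB/s
const totalKB = 10;

function loadTimeMs(requestCount) {
  const transferMs = (totalKB * 1000) / bandwidthKBps;
  return requestCount * rttMs + transferMs;
}

console.log('10 separate files:', loadTimeMs(10), 'ms'); // 5000 + 800
console.log('1 combined file:  ', loadTimeMs(1), 'ms');  //  500 + 800
// Latency dominates: combining saves ~4.5 s even though the bytes
// transferred are identical.
```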
As others have pointed out, #3, inlining, means that files cannot be cached. It can slow down load times because of the increased size of the html. However you avoid the roundtrip penalty.
In your environment, I would also recommend looking at the HTML5 application cache to optimize caching for css and js files. You would need to convert your app to use AJAX calls instead of loading html pages, but doing that also reduces the needed data transfer.
I am working on a project using HTML and JavaScript. I am using a number of JavaScript files in my project. I want to know if there are any side-effects caused by the number of JavaScript files in a web project? (In terms of efficiency or speed).
If you mean, is there a speed cost associated with having all of your JavaScript embedded in inline script tags within your HTML, then maybe, maybe not; it depends on how much JavaScript there is.
It's a trade-off:
If you put all of your JavaScript in the HTML files, that means you have to duplicate it in each HTML file, making each one larger. If it's a lot of script, that adds up to each HTML file being heavier, and it means when you change the content (rather than the code), you force the user to re-download all of that code.
If you put your JavaScript in a separate file, you have an additional HTTP request when your page is loaded, but only if the JavaScript file isn't already in cache — and all of your HTML files are that much smaller.
So in most cases, if you have any significant amount of code, you're better off using a separate JavaScript file and setting the caching headers correctly so that the browser doesn't need to send the separate HTTP request each time. For instance, configure your server to tell the browser that JavaScript file is good for a long time, so it doesn't even do an If-Modified-Since HTTP request. Then if you need to update your JavaScript, change the filename and refer to the new file in your HTML.
That's just one approach to cache control, which is a significant and non-trivial topic. But you get the idea. Making it separate gives you options like that, with a fairly low initial cost (the one extra HTTP request to load it into cache). But it can be overkill for a quite small amount of code.
In terms of execution, I believe there should be no considerable difference depending on the number of JavaScript files you have. However, a large number of JavaScript files requires a large number of requests when your page is loaded, which definitely may impact the time it takes to load your page.
There are at least two ways to handle this and which approach to use depends on the situation at hand.
Bundle all files into one:
Each request to the server adds to the time it takes to load your page. Because of this, it is often considered good practice to bundle and minify your JavaScript files into one single file, containing all your scripts, thus requiring only a single request to load all the JavaScript.
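The core of that bundling step can be sketched in a few lines (real build tools also handle minification and dependency order; the file names here are hypothetical):

```javascript
// Minimal bundling sketch: concatenate module sources into a single
// file, separated by a marker comment, with a semicolon guard so one
// file's missing semicolon can't break the next.
function bundle(files) {
  return Object.entries(files)
    .map(([name, source]) => `/* ${name} */\n${source.trim()}\n;`)
    .join('\n');
}

// Hypothetical per-page scripts, for illustration:
const out = bundle({
  'accordion.js': 'function initAccordion() {}',
  'contact.js': 'function initContactForm() {}',
});
console.log(out); // one file, one HTTP request
```

In a real build you'd read the sources from disk, run the result through a minifier, and write it out with a versioned file name.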
In some cases, however, such as when you have a lot of JavaScript, that single file can become quite large, and even though you load just one file, loading it may take some time, thus slowing down your page load. In that case, it might be worth considering option two.
Load modules as you need them:
In other cases, for example if you have a lot of JavaScript for your site but only a few modules of your code are used on each page, you might get better performance with conditional loading. In that case, you keep your code in separate modules, each in its own file, and load only the modules you need on each page.
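One way to organize conditional loading is a registry mapping each page to the modules it actually needs. The page names and module files below are hypothetical:

```javascript
// Sketch of conditional loading: each page declares which script
// modules it needs, so a page only requests its own dependencies
// instead of one site-wide bundle.
const modulesByPage = {
  faq: ['common.js', 'accordion.js'],
  contact: ['common.js', 'validation.js', 'contact-form.js'],
};

function scriptsFor(page) {
  const modules = modulesByPage[page] || ['common.js'];
  return modules.map((file) => `/js/${file}`);
}

console.log(scriptsFor('faq'));
// → ['/js/common.js', '/js/accordion.js']
// In the browser, each URL would then be added as a <script> element
// (or fetched via a dynamic import in modern code).
```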
This will depend entirely on the JavaScript files you include. Naturally, when including a JavaScript file, it has to be loaded by the browser, and how taxing that is depends on the size of the file. Generally speaking, it is advantageous to minimize the number of separate script requests by combining all code into a single file.
A larger number of JavaScript files can cause your page to load more slowly, but if everything caches correctly there shouldn't be any problem. A browser that requests a lot of files from cache is a bit slower before rendering, but we are talking about 0.001 seconds.
In Crowder's post, if you are caching correctly, then you have to change your filename and refer to it. But if you implement a nice caching mechanism in your web project, it's easier to refer to your JavaScript with a query string, like:
scripts/foo.js?version=1
We use the software version number, so with every new release we change the query string and all clients get the new script files.
Every time you release, it's best to bundle and minify all of your JavaScript, but it isn't that useful for small projects.
I am curious as to why the Facebook developers have chosen to not combine their scripts and stylesheets into single files. Instead they are loaded on demand via their CDN.
Facebook is obviously a very complex application and I can understand how such modularity might make Facebook easier to maintain, but wouldn't the usual optimisation advice still apply (especially given its high level of usage)?
Or, does the fact that they are using a CDN avoid the usual performance impact of having lots of small scripts / styles?
In a word: BigPipe. They divide the page up into 'pagelets', each processed separately on their servers and sent to the browser in parallel. Essentially almost everything (CSS, JS, images, content) is lazy loaded, so it comes down in a bunch of small files.
They might be running into the case where the savings of being able to serve different combinations of JS files to the browser at different times (for different pages or different application configurations for different users) represents a larger savings than the reduced HTTP request overhead of combining all of the files into one.
If a browser is only ever executing a small percent of the total JS code base at any given time, then this would make sense. Because they have so many different users and different parts of different applications running in different configurations for those users, it is arguable that this is the case.
Second, those files only need to be downloaded once; the browser won't ask for them again until they have changed or the cache has expired, so only the first visit really benefits from the all-in-one style. And yes, having an advanced CDN with many edge locations around the world definitely helps.
Maybe they think it's more likely that you visit Facebook more often than you clear your browser cache.
I was wondering: if I have, let's say, 6 JavaScript includes on a page and 4-5 CSS includes as well, does it actually make the page load faster if I create one file, or perhaps two, and append them all together instead of having a bunch of them?
Yes. You will get better performance with fewer files.
There are a few reasons for this and I'm sure others will chime in as I won't list them all.
There is overhead in each request beyond the size of the file, such as the request itself, the headers, cookies (if sent) and so on. Even in many caching scenarios, the browser will send a request to see if the file has been modified or not. Of course, proper headers/server configuration can help with this.
Browsers by default have a limited number of simultaneous connections that they will open at a time to a given domain. I believe IE has 2 and Firefox has 4 (I could be mistaken on the numbers). Anyway, the point is, if you have 10 images, 5 JS files and 2 CSS files, that's 17 items that need to be downloaded and only a few will be fetched at the same time; the rest are just queued.
I know these are vague and simplistic explanations, but I hope it gets you on the right track.
One of your goals is to reduce http requests, so yes. The tool called yslow can grade your application to help you see what you can do to get a better user experience.
http://developer.yahoo.com/yslow/
Even when the browser makes several requests, it tries to open the smallest number of TCP connections it can (see the docs for the Keep-Alive HTTP header). Page load speed can also be improved by setting up compression (DEFLATE or GZIP) on the server side.
Each include is a separate HTTP request the user's browser has to make, and with an HTTP request comes overhead (on both the server and the connection). Combining multiple CSS and JavaScript files will make things easier on you and your users.
This can be done with images as well, via a technique called CSS sprites.
Yes. You are making fewer HTTP requests that way.
The best possible solution would be to combine all the code into one file so it can be fetched in one GET request by the browser. If you are linking to multiple files, the browser has to request these external files every time the page is loaded.
This may not cause a problem if pipelining is enabled in the browser and the site is not generating much traffic.
Google has streamlined its code into one file. I can't even imagine how many requests that has saved, and how much it has lightened the load on their servers with that amount of traffic.
There's no longer any reason to feel torn between wanting to partition js & css files for the sake of organisation on the one hand and to have few files for the sake of efficiency on the other. There are tools that allow you to achieve both.
For instance, you can incorporate Blender into your build process to aggregate and compress CSS and JavaScript assets. Java developers should take a look at JAWR, which is state of the art.
I'm not really very versed in the factors which affect server load; however, I think the best thing to do would be to find a balance between having one big chunk and having your scripts organized into meaningful separate files. I don't think that having five or so different files should influence performance too much.
A more influential factor to look at would be the compression of the scripts. There are various online utilities which get rid of whitespace and use more efficient variable names; I think these will result in much more dramatic improvements than combining the files.
As others have said, yes, the fewer files you can include, the better.
I highly recommend Blender for minifying and consolidating multiple CSS/JS files. If you're like me and often end up with 10-15 stylesheets (reset, screen, print, home, about, products, search, etc...) this tool is great help.