What's better? More HTTP requests = less data transferred, or fewer HTTP requests = more data transferred? - javascript

Sites like Facebook use "lazy" loading of JS.
Take into consideration that I have one server with heavy traffic.
I'm interested in which approach is better:
When I make more HTTP requests at once, the page loads more slowly (due to the browser's limit of roughly 2 parallel requests).
When I make one HTTP request containing all the code, the traffic (GB) goes up and the Apache processes get a bit more rest, but we still get slower page loading.
Which is faster in the end?

Fewer requests! That's the reason we combine JS files, combine CSS files, use image sprites, etc. You see, the problem on the web isn't the speed of processing on the server or in the browser. The biggest bottleneck is latency! You should look up Steve Souders' talks.

It really depends on the situation, device, audience, and internet connection.
Mobile devices, for example, need as few HTTP requests as possible because they are on slower connections and every round trip takes longer. You should go as far as inlining (base64) images inside your CSS files.
Generally, I compress the main platform and JS libs + CSS into one file each, which are cached on a CDN. JavaScript or CSS functionality that lives on only one page I'll either inline or include in its own file. JS functionality that isn't important right away I'll move to the bottom of the page. For all files I set a far-future HTTP Expires header so they sit in the browser cache forever (or until I update them or they get bumped out when the cache fills); a sketch of this follows below.
Additionally, to get around parallel download limits you can have CNAMEs like images.yourcdn.com and scripts.yourcdn.com so that the user can download more files in parallel. Even if you don't use a CDN, you should host your static media on a separate hostname (it can point to the same box) so that the user isn't sending cookies when they aren't needed. This sounds like overkill, but cookies can easily add an extra 4-8 KB to every request.
In a development environment you should be working with uncompressed, individual files; there's no need to move every plugin into one script, for example - that's hard to maintain when there are updates. You should have a script that merges the files before testing and deployment. This sounds like a lot of work, but it's something you do once and can reuse for all future projects.
TL;DR: It depends, but generally a mixture of both is appropriate. Except on mobile, where fewer HTTP requests is always better.
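For example, that far-future Expires header can be set in a couple of lines if the static files happen to be served from Node/Express rather than Apache (just a sketch under that assumption; the directory, mount path and port are placeholders):

```js
// sketch: long-lived caching for static assets with Node/Express
// (directory, mount path and port are placeholders)
const express = require('express');
const app = express();

// serve /static/* with a one-year cache lifetime; this only works well if
// filenames change whenever their content changes (e.g. a hash in the name)
app.use('/static', express.static('public', {
  maxAge: '365d',
  immutable: true,
}));

app.listen(3000);
```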

The problem is a bit more nuanced than that.
If you put your script tags anywhere but at the bottom of the page, you are going to slow down page rendering, since the browser can't do much when it hits a script tag other than download it and execute it. So if the script tag is in the header, that happens before anything else, which leads to users sitting there staring at a white screen until everything downloads.
The "right" way is to put everything at the bottom. That way, the page renders as assets are downloaded, and the last step is to apply behavior.
But what happens if you have a ton of JavaScript? (In Facebook's case, about a megabyte.) What you get is a page that renders but is completely unusable until the JS comes down.
At that point, you need to look at what you have and start splitting it into vital and non-vital JS. That way you can take a multi-stage approach, quickly bringing in the stuff that is necessary for the page to function at a bare minimum level, and then loading the less essential stuff afterwards, or even on demand.
Generally, you will know when you get there; at that point you need to look at more advanced techniques like script loaders. Before that, the answer is always "fewer HTTP requests".
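As a rough illustration of that multi-stage idea (just a sketch; the script URLs are made up), the non-vital JS can be pulled in once the page is usable:

```js
// sketch: load non-essential scripts after the page is usable (URLs are placeholders)
function loadScript(src) {
  return new Promise((resolve, reject) => {
    const s = document.createElement('script');
    s.src = src;
    s.async = true;
    s.onload = resolve;
    s.onerror = reject;
    document.body.appendChild(s);
  });
}

// the vital code is already on the page; fetch the rest after the load event
window.addEventListener('load', () => {
  loadScript('/js/comments.js')
    .then(() => loadScript('/js/chat.js'))
    .catch(err => console.error('deferred script failed to load', err));
});
```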

Related

Structuring huge application assets

We are about to completely rebuild a client's website; it currently has over 1000 pages.
There will be a cull; however, my idea is to dynamically load assets based on what's on the page, but I wanted to get feedback.
Let's say I have 100 global components (carousel, buttons, videos, nav, etc.). Currently, over time, we've just put the JavaScript for all components into one bundle.js file, and the same with the CSS. However, if a page only uses 3 of those 100 components, it seems redundant to include everything.
So I guess my question is: is it wrong to dynamically request only the components used, at runtime, rather than loading all assets every time?
The big downside I can see is that almost every page will request new files, so caching will be harder, and more HTTP requests would have to be made.
But if someone has a better idea, please let me know.
Firstly, I suggest an evidence-based approach. Don't do anything without data to back up the decision.
My thoughts on an overall approach. I'm thinking about React as I write this, but nothing is React-specific.
Server-render your content. It will then display to your users without needing your JavaScript bundle.
Get a good CDN and/or something like Varnish and cache each route/page response. You'll get fast response times no matter how big the site.
Now, when the user visits a page they'll get it quickly, and then you asynchronously download the JavaScript file that will breathe life into the page.
Because the user is already reading your page, you can take your time loading the JS - up to a second or two. If you think most of your users will have decent internet (e.g. they're all in South Korea), then I'd go as big as a 2 MB JS bundle before bothering with chunking. Opinions will vary; it's up to you. If your users have bad internet (e.g. they're all in North Korea), then every KB counts and you should aim to make the smallest chunks needed for each page - both for speed and to respect the users' download quota.
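If you do end up chunking, a dynamic import() is the usual way to let the bundler split a rarely-used component into its own file (a sketch only; the module path and element IDs are made up):

```js
// sketch: load a rarely-used component only when the user asks for it
// (module path and element IDs are placeholders)
async function openVideoPlayer(container) {
  // bundlers such as webpack turn this dynamic import into a separate chunk,
  // downloaded on first use instead of shipping with the main bundle
  const { VideoPlayer } = await import('./components/VideoPlayer.js');
  new VideoPlayer(container).play();
}

document.querySelector('#play').addEventListener('click', () => {
  openVideoPlayer(document.querySelector('#player'));
});
```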

Is it faster to load if all webpage resources are compiled into a single HTML file?

What if I had a compilation step for my website that turned all external scripts and styles into a single HTML file with embedded <script> and <style> tags? Would this improve page load times due to not having to send extra GETs for the external files? If so, why isn't this done more often?
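For concreteness, such a compilation step could be as simple as the following (only a sketch, assuming Node with the cheerio package; file paths are placeholders and external URLs are left untouched):

```js
// sketch: a naive build step that inlines local scripts and styles into the HTML
// (assumes Node + cheerio; paths are placeholders, resolved relative to the build dir)
const fs = require('fs');
const cheerio = require('cheerio');   // npm install cheerio

const $ = cheerio.load(fs.readFileSync('index.html', 'utf8'));

// replace <script src="..."> with an embedded <script> for local files
$('script[src]').each((_, el) => {
  const src = $(el).attr('src');
  if (!src.startsWith('http')) {
    $(el).replaceWith(`<script>${fs.readFileSync(src, 'utf8')}</script>`);
  }
});

// replace <link rel="stylesheet"> with an embedded <style> for local files
$('link[rel="stylesheet"]').each((_, el) => {
  const href = $(el).attr('href');
  if (!href.startsWith('http')) {
    $(el).replaceWith(`<style>${fs.readFileSync(href, 'utf8')}</style>`);
  }
});

fs.writeFileSync('index.inlined.html', $.html());
```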
Impossible to say in general, because it is very situational.
If you're pulling resources from many different servers, these requests can slow your page load down (especially with slow DNS on the visitor's side).
Requesting many different files may also slow down page load even if they're all from the same origin/server.
Keep in mind that not everyone has gigabit internet (or even megabit-level speeds). Putting everything directly into your HTML file (inlining or using data URIs) will definitely reduce network overhead at first (fewer requests, fewer headers, etc.).
On the other hand (and this makes the slow-connection point even worse), it also breaks many other features commonly used to reduce page load times. For example, resources can't be cached - neither locally nor on some proxy - and are always transferred. This can be costly for both the visitor and the hosting party.
So the best approach is often a middle ground, if loading times are an issue for you:
If you're using third-party scripts, e.g. jQuery, grab them from a public CDN that other sites use as well. If you're lucky, your visitor's browser will already have a cached copy and won't make the request at all.
Your own scripts should be condensed and potentially minified into a single file (using tools such as Browserify, webpack, etc.; a sketch of such a setup follows below). This shouldn't include frequently changing parts, as those would force you to transfer even more data more often.
If you've got any scripts or resources that are really only part of your current visitor's experience (like logged-in status, colors picked in user preferences, etc.), it's okay to put those directly into the parent HTML file, if that file is customized anyway and delivering them as separate files wouldn't work or would cause more overhead. A perfect example of this is CSRF tokens. Don't do this, though, if you're able to deliver a static HTML file that's filled/updated by JavaScript.
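A minimal bundler setup along those lines - your own code in one hashed bundle, rarely changing third-party code split out so it stays cacheable - could look like this (a sketch assuming webpack; entry points and paths are placeholders):

```js
// webpack.config.js - sketch; entry points and output paths are placeholders
const path = require('path');

module.exports = {
  mode: 'production',               // enables minification out of the box
  entry: {
    app: './src/index.js',          // your own, frequently changing code
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[contenthash].js',   // name changes only when content changes
  },
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendor: {                   // third-party code in its own long-lived bundle
          test: /[\\/]node_modules[\\/]/,
          name: 'vendor',
          chunks: 'all',
        },
      },
    },
  },
};
```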
Yes, it will improve page load time, but this method still isn't used often, for these reasons:
Debugging becomes more difficult.
Updating things later also won't be as easy.
Separate .css and .js files avoid these issues.
And yes, for faster page loads you can use a build system like Grunt, Gulp, Brunch, etc. for better performance.
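As a small example of what such a build step can look like (a sketch using Gulp with the gulp-concat and gulp-uglify plugins; paths are placeholders):

```js
// gulpfile.js - sketch; source and output paths are placeholders
const gulp = require('gulp');
const concat = require('gulp-concat');   // npm install gulp-concat
const uglify = require('gulp-uglify');   // npm install gulp-uglify

function scripts() {
  return gulp.src('src/js/**/*.js')      // individual, readable source files
    .pipe(concat('bundle.js'))           // merged into one file = one HTTP request
    .pipe(uglify())                      // minified to cut transfer size
    .pipe(gulp.dest('dist/js'));
}

exports.scripts = scripts;
```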

Inline vs. included JS and CSS?

In an environment with at least 500 ms of latency over 2G mobile connections (~0.1 Mbps), what's the fastest and most efficient way of sending about 10 KB of CSS and JS, spread across roughly 5-10 files on the server, to the client?
I can think of three options:
combining all JS into one file and all CSS into one file
linking all CSS and JS files one by one
inlining everything
I know Google uses inlining, but that's probably just to save server sockets. They are even saving RAM by running in stateless mode - they trust the clients to remember the sessions for them. Server power isn't an issue at all.
On the other hand, Facebook seems to autogenerate its CSS (the class names are base64-encoded), but splits it into over 10 different files sent to the user, and they don't even seem to optimize it that heavily; only some whitespace removal.
I'm already passing all the files through a function that compresses everything, so any one of these is feasible. I don't want to choose the first alternative just because it's easier.
The first two take advantage of caching (the second a bit less than the first), but the first only requires three requests to the server, and the third only requires one GET request from the server (ignoring the few images we might have on some of the pages).
Do Android / iOS cache JS and CSS across restarts of the browser? If not, then inlining sounds better.
The only goal is to minimize the average load time for the user. Each user will do about 100 page loads on the site per day, touching about 40 CSS and JS files per day. The CSS and JS is basically static content. It's set to cache for 30 days, and we change the URL if a file changes, using /path/to/file.ext?md5-hash-of-file (a sketch of this follows below). Also, everything is gzipped wherever possible.
EDIT:
I think I should clarify the two options I found for number two. Is it a good idea to use a single file for CSS and a single file for JS across the whole site? That would only take two requests and remove any double (or septuple) caching that happens when a single function ends up in two or more different combined JS files, but a download of up to 1 MB doesn't sound that good.
Today it's basically one combined CSS file per view, so every time you view the same page again the content is cached. However, some JS and CSS is used on more than one page.
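For reference, the ?md5-hash-of-file trick mentioned above can be generated at build time with a few lines of Node (just a sketch; the paths are placeholders):

```js
// sketch: build-time helper that appends a content hash to asset URLs
// (public path and file path are placeholders)
const crypto = require('crypto');
const fs = require('fs');

function versionedUrl(publicPath, filePath) {
  const hash = crypto.createHash('md5')
    .update(fs.readFileSync(filePath))
    .digest('hex');
  return `${publicPath}?${hash}`;    // e.g. /path/to/file.css?d41d8cd98f00...
}

console.log(versionedUrl('/css/site.css', 'public/css/site.css'));
```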
It really depends on the usage. For a page with only one-time visitors, I would recommend inlining everything. This makes for a faster initial load (a single request vs. multiple requests) and is easier to work with. This is the case for landing pages, help pages, wizards, and similar one-use pages.
However, if you are expecting recurring visitors, I'd recommend using external files. While the first load will be slower, you make up for it with near-zero load times for those assets afterwards. This is the case for most websites.
Inline CSS and JavaScript will make your page very heavy. It's good practice to merge all your stylesheets into one file and all your JavaScript files into another, and include those in your page; this will make your page much faster compared to inline styles.
The problem with #2, linking to each file, is that the biggest factor in load time for small elements is round-trip time, not file size. It takes several round trips to set up the connection to fetch each file. This also means that you should combine your CSS and JS files. In your high-latency environment, round trips will be especially painful. See Google's advice on round trips.
As others have pointed out, #3, inlining, means that files cannot be cached. It can slow down load times because of the increased size of the HTML, but you avoid the round-trip penalty.
In your environment, I would also recommend looking at the HTML5 application cache to optimize caching for the CSS and JS files. You would need to convert your app to use AJAX calls instead of loading full HTML pages, but doing that also reduces the amount of data transferred.

How do I make my server do all the loading and JavaScript and then serve the page already rendered

I've got a webpage that calls Oracle, then does some processing, and then runs a lot of JavaScript.
The problem is that all of this makes it slow for the user. I have to use Internet Explorer 6, so the JavaScript takes a very long time to load, around 15 seconds.
How can I make my server do all of this, every minute for example, and save the page, so that if a user requests it, the server gives them a page where everything is already calculated?
I'm using a Tomcat server; my webpage is mainly JavaScript and HTML.
Edit:
By the way, I cannot rewrite my webpage; it would have to remain as it is.
I'm looking for something that would give the user a snapshot of the webpage that the server loaded.
YSlow recommendations would tell you that you should put all your CSS in the head of your page and all JavaScript at the bottom, just before the closing body tag. This will allow the page to fully load the DOM and render it.
You should also minify and compress your JavaScript to reduce download size.
To do that, you'd need to have your server build up the DOM, run the JavaScript in an environment that looks (enough) like a web browser, and then serialize the result as HTML.
There have been various attempts to do that, Jaxer is one of them (it was originally a product from Aptana, now an Apache project). Another related answer here on SO pointed to the jsdom project, which is a DOM implementation in JavaScript (video here).
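To make the jsdom route concrete, a very rough sketch (using the current jsdom API; the file paths, base URL and fixed timeout are placeholders, and a real setup would need a proper "done" signal) might look like this:

```js
// sketch: render a page server-side with jsdom, then save the resulting HTML
// (assumes Node + the jsdom package; paths, URL and timeout are placeholders)
const { JSDOM } = require('jsdom');   // npm install jsdom
const fs = require('fs');

const html = fs.readFileSync('page.html', 'utf8');
const dom = new JSDOM(html, {
  url: 'https://example.com/',   // base URL used to resolve relative script/style paths
  runScripts: 'dangerously',     // execute the page's own <script> tags
  resources: 'usable',           // allow it to fetch external scripts/styles
});

// crude: give the scripts a few seconds to run, then snapshot the DOM
setTimeout(() => {
  fs.writeFileSync('snapshot.html', dom.serialize());
}, 5000);
```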
Re
By the way I can not rewrite my webpage, it would have to remain as it is
That's very unlikely to be successful. There is bound to be some modification involved. At the very least, you're going to have to tell your server-side framework what parts it should process and what parts should be left to the client (e.g., user-interaction code).
Edit:
You might also look for "website thumbnail" services like shrinktheweb.com and similar. Their "pro" account allows full-size thumbnails (what I don't know is whether it's an image or HTML). But I'm not specifically suggesting them, just a line you might pursue. If you can find a project that does thumbnails, you may be able to adapt it to do what you want.
But again, take a look at Jaxer, you may find that it does what you need or very similar (and it's open-source, so you can modify it or extract the bits you want).
"How can i make my server do all of this every minute for example"
If you are asking how you can make your database server 'pre-run' a query, then look into materialized views.
If the Oracle query is responsible for (for example) 10 seconds of the delay there may be other things you can do to speed it up, but we'd need a lot more information on what the query does

JavaScript and CSS loading

I was wondering: if I have, let's say, 6 JavaScript includes on a page and 4-5 CSS includes as well on the same page, would it actually be more optimal for page loading if I created one file, or perhaps two, and appended them all together, instead of having a bunch of them?
Yes. You will get better performance with fewer files.
There are a few reasons for this and I'm sure others will chime in as I won't list them all.
There is overhead in the requests in addition to the size of the file, such as the request itself, the headers, cookies (if sent), and so on. Even in many caching scenarios, the browser will send a request to check whether the file has been modified. Of course, proper headers and server configuration can help with this.
Browsers by default have a limited number of simultaneous connections that they will open to a given domain. I believe IE allows 2 and Firefox 4 (I could be mistaken on the numbers). Anyway, the point is, if you have 10 images, 5 JS files, and 2 CSS files, that's 17 items that need to be downloaded, and only a few will be fetched at the same time; the rest are just queued.
I know these are vague and simplistic explanations, but I hope it gets you on the right track.
One of your goals is to reduce HTTP requests, so yes. The tool called YSlow can grade your application to help you see what you can do to get a better user experience.
http://developer.yahoo.com/yslow/
Even if the browser makes several requests, it tries to open the smallest number of TCP connections it can (see the Keep-Alive HTTP header docs). Page load speed can also be improved by setting up compression (DEFLATE or gzip) on the server side.
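If the server side happens to be Node/Express, that compression is a one-liner with the compression middleware (a sketch under that assumption; most servers such as Apache and nginx have an equivalent switch):

```js
// sketch: gzip/deflate responses in a Node/Express app (assumes the "compression" package)
const express = require('express');
const compression = require('compression');   // npm install compression

const app = express();
app.use(compression());              // negotiates gzip/deflate with each browser
app.use(express.static('public'));   // static files are compressed on the way out
app.listen(3000);
```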
Each include is a separate HTTP request the user's browser has to make, and with an HTTP request comes overhead (on both the server and the connection). Combining multiple CSS and JavaScript files will make things easier on you and your users.
This can be done with images as well, via a technique called CSS sprites.
Yes. You are making fewer HTTP requests that way.
The best possible solution would be to add all the code to one page so it can be fetched in one GET request by the browser. If you are linking to multiple files, the browser has to request those external files every time the page is loaded.
This may not cause a problem if pipelining is enabled in the browser and the site is not generating much traffic.
Google has streamlined its code into a single file. I can't even imagine how many requests that has saved, and how much it has lightened the load on their servers, given that amount of traffic.
There's no longer any reason to feel torn between partitioning your JS & CSS files for the sake of organisation on the one hand and having few files for the sake of efficiency on the other. There are tools that let you achieve both.
For instance, you can incorporate Blender into your build process to aggregate and compress CSS and JavaScript assets. Java developers should take a look at JAWR, which is state of the art.
I'm not really very versed in the factors that affect server load, but I think the best thing to do would be to find a balance between having one big chunk and having your scripts organized into meaningful separate files. I don't think that having five or so different files should influence performance too much.
A more influential factor to look at would be the compression of the scripts; there are various online utilities that get rid of whitespace and use more efficient variable names. I think these will result in much more dramatic improvements than combining the files.
As others have said, yes, the fewer files you can include, the better.
I highly recommend Blender for minifying and consolidating multiple CSS/JS files. If you're like me and often end up with 10-15 stylesheets (reset, screen, print, home, about, products, search, etc...), this tool is a great help.
