How to minimize traffic and load time for HTTP requests for external scripts? - javascript

So I was pretty interested in the idea of head.js; however, it doesn't play nicely with the way I've developed my backend.
Basically, what I'm trying to do is decide how I should serve my scripts (JS and CSS) to give my service the most performance with the least traffic. I feel like I can replicate a more integrated version of the head.js idea using more of my backend. I'm using Node.js, without any frameworks around it, to serve everything.
Moreover, I have several JavaScript and CSS files. Basically, there's one set for the entire site covering the header, footer, and reused methods and styles, and then one for each page.
The original idea was to have the page send a GET request to "/js?src=" and ask for one name like "index"; all the JavaScript needed on that page would then come back as a single response, one big concatenated script. The problem is that I'm using Closure Compiler, and it seems like I would run into a lot of issues with that approach for everything except the CSS.
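To make the first idea concrete, here's roughly the kind of handler I have in mind; the bundle manifest, file paths, and port are just placeholders, not my actual setup:
// Sketch of a "/js?src=index" endpoint: one response made of several
// concatenated files. Manifest entries and paths are placeholders.
const http = require('http');
const fs = require('fs');
const path = require('path');
// Which source files make up each page's bundle (shared code first).
const bundles = {
  index: ['shared/site.js', 'pages/index.js'],
};
http.createServer((req, res) => {
  const url = new URL(req.url, 'http://localhost');
  if (url.pathname === '/js') {
    const files = bundles[url.searchParams.get('src')];
    if (!files) {
      res.writeHead(404);
      return res.end();
    }
    // Join with newlines so the last statement of one file and the first
    // statement of the next never run together.
    const body = files
      .map((f) => fs.readFileSync(path.join(__dirname, f), 'utf8'))
      .join('\n');
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    return res.end(body);
  }
  res.writeHead(404);
  res.end();
}).listen(3000);
The manifest could just as well point at files Closure Compiler has already processed, so the concatenation step itself wouldn't have to fight the compiler.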
The second idea is to have a loop in the template emit a separate request for each script, utilizing more of the head.js idea but moving it to the backend.
The third idea (maybe I'm overthinking at this point) is to create a Closure-compiled version of the scripts for each page that includes the JavaScript for not only the page but also the header and footer, so the scripts never conflict. This creates redundant data in each of the files and poses the issue of not pipelining my assets.
The basic idea of my service is a website that serves social media content, images, and music in real time. So initial page load isn't extremely important; however, I want the server to be able to handle a large number of requests quickly. I'm more focused on the big picture of serving many users than on the individual user's experience. What's my best approach?

Related

Any way to create an application with the local web page as an interface?

A few days ago I decided to make my own "interface" to make it easier to organize (and work with) some of my personal files. You know, when a lot of related data, pictures, and links are right in front of you and you can change them in a couple of clicks, it's very convenient.
I started by studying HTML, CSS, and JS, because I thought that changes made to the local page would be saved somewhere on my PC, so I could just run Index.html and do whatever I want. But they weren't: refreshing the page erased all changes.
Using the browser's localStorage doesn't suit me, because if I change browsers, the data will be lost. I want it to just open from Index.html and work fine even if I change my browser or move the site folder to another computer.
Then I decided to learn more about server-side languages (such as PHP or Node.js), because they work directly with databases, so I was hoping to save changes through them. But these languages seemed to require actually running a server, with an IP and port to manage, and I just wanted to open a local page from one file, without any ports or connections through the console. So this approach scared me off quickly.
So is there an easy way to make a local page like this? Maybe I haven't studied one of the above approaches well enough and it already offers this possibility?
Or is the best I can hope for a simple application that uses the local page as an interface to interact with the data? I heard about this possibility in passing a long time ago. If so, please give me at least a hint as to which language to choose.
I don't understand much of what lies outside vanilla HTML, CSS, and JS, so a complete study of a complex language like Java or Python would be too difficult for me, and the goal isn't worth that much effort.
I hope I understand correctly what you are trying to do.
If your goal is to make an application to manage your files, I think the simplest solution will be, as you said, to look into Node.js and its File System API (fs), which lets you interact with your files through JavaScript code.
Your program will have two parts that interact:
the "front": your HTML page
the "back": a Node.js script
The downside is that you'll have to go deeper into your study of the language to learn how to create the interactions you want between your HTML file and your Node.js application.
However, there is no need to open your server to the web to make it work. The Node.js application can be set to listen only for requests from the computer that runs it (localhost).
I obviously can't get too much into detail without knowing precisely what you want to do, but you'll probably have to learn to make a local server with Node (search "nodejs http" or "nodejs express") and then make requests to it from the HTML page's scripts (search "ajax request").
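To give you a rough idea, here is a very small sketch of that "back" part: a Node.js server that listens only on localhost and persists whatever the page sends it to a JSON file. The file name, port, and routes are only examples, not something you have to copy.
// Minimal local "backend": serves and saves a JSON file for the HTML page.
// Only reachable from this computer because it binds to 127.0.0.1.
const http = require('http');
const fs = require('fs');
const DATA_FILE = './data.json'; // placeholder file for your saved state
http.createServer((req, res) => {
  if (req.method === 'GET' && req.url === '/data') {
    // Return the saved data, or an empty object the first time.
    const body = fs.existsSync(DATA_FILE) ? fs.readFileSync(DATA_FILE) : '{}';
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(body);
  } else if (req.method === 'POST' && req.url === '/data') {
    // Collect the request body sent by the page and write it to disk.
    let body = '';
    req.on('data', (chunk) => (body += chunk));
    req.on('end', () => {
      fs.writeFileSync(DATA_FILE, body);
      res.writeHead(204);
      res.end();
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000, '127.0.0.1');
The same server could also serve Index.html itself, so the page's requests (e.g. fetch('/data')) stay on the same origin and nothing has to be opened to the outside world.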
What you need to look into are (web-based) content management systems, like Strapi or the grand old dame, WordPress.

Structuring huge application assets

We are about to completely rebuild a client's website; it currently has over 1000 pages.
There will be a cull; however, my idea is to dynamically load assets based on what's on the page, and I wanted to get feedback.
Let's say I have 100 global components (carousel, buttons, videos, nav, etc.). Over time we've just put the JavaScript for all components into a single bundle.js file, and the same with the CSS. However, if a page only uses 3 of those 100 components, it seems redundant to include everything.
So I guess my question is: is it wrong to dynamically request only the components used, at runtime, rather than loading all assets every time?
The big downside I can see is that almost every page will request new files, so caching will be harder, and more HTTP requests would have to be made.
But if someone has a better idea, please let me know.
Firstly, I suggest an evidence-based approach. Don't do anything without data to back up the decision.
My thoughts on an overall approach. I'm thinking about React as I write this, but nothing is React-specific.
Server-render your content. It will then display to your users without needing your JavaScript bundle.
Get a good CDN and/or something like Varnish and cache each route/page response. You'll get fast response times no matter how big the site is.
Now, when the user visits a page they'll get it quickly, and then you asynchronously download the JavaScript file that breathes life into the page.
Because the user is already reading your page, you can take your time loading the JS, up to a second or two. If you think most of your users will have decent internet (e.g. they're all in South Korea), then I'd go as big as a 2 MB JS bundle before bothering with chunking. Opinions will vary; it's up to you. If your users have bad internet (e.g. they're all in North Korea), then every kB counts and you should aim to make the smallest chunks needed for each page, both for speed and to respect the users' download quota.
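If you do end up chunking, a common pattern is to split per component and load on demand via dynamic import(), which bundlers such as webpack turn into separate files. The data-component convention, paths, and init() export in this sketch are assumptions for illustration, not part of any particular framework:
// Load a component's code only when the current page actually contains it.
// Placeholders in the server-rendered HTML look like:
//   <div data-component="carousel"></div>
async function mountComponents(root) {
  const mounts = root.querySelectorAll('[data-component]');
  for (const el of mounts) {
    const name = el.dataset.component;
    // The bundler emits one chunk per module under ./components/.
    const { default: init } = await import(`./components/${name}.js`);
    init(el); // by convention each module default-exports an init(element)
  }
}
mountComponents(document);
Because each chunk keeps a stable name across pages, the carousel code a visitor downloaded on one page is still cached when the next page needs it, which softens the caching downside raised in the question.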

Is it faster to load if all webpage resources are compiled into a single HTML file?

What if I had a compilation step for my website that turned all external scripts and styles into a single HTML file with embedded <script> and <style> tags? Would this improve page load times due to not having to send extra GETs for the external files? If so, why isn't this done more often?
Impossible to say in general, because it is very situational.
If you're pulling resources from many different servers, those requests can slow your page load down (especially if DNS resolution is slow on the visitor's side).
Requesting many different files may also slow down page load even if they're from the same origin/server.
Keep in mind that not everyone has gigabit internet (or is even at the megabit level). Putting everything directly into your HTML file (inlining or using data URIs) will definitely reduce network overhead at first (fewer requests, fewer headers, etc.), but it also makes that one HTML file much larger, which hurts on slow connections.
In addition (making the previous point even worse), this also breaks many other features commonly used to reduce page load times. For example, the resources can't be cached, neither locally nor on a proxy, and are always transferred in full. This can be costly for both the visitor and the hosting party.
So, if loading times are an issue for you, the best approach is often a middle ground:
If you're using third-party scripts, e.g. jQuery, grab them from a publicly hosted CDN that other sites use as well. If you're lucky, your visitor's browser will already have a cached copy and won't make the request at all.
Your own scripts should be combined and ideally minified into a single file (with tools such as Browserify, webpack, etc.); see the sketch after this list. This bundle shouldn't include frequently changing parts, as those would force you to transfer even more data more often.
If you've got scripts or resources that are really only part of the current visitor's experience (like logged-in status, colors picked in user preferences, etc.), it's okay to put these directly into the parent HTML file, provided that file is customized anyway and delivering them as separate files wouldn't work or would cause more overhead. A perfect example of this would be CSRF tokens. Don't do this if you're able to deliver a static HTML file that's filled/updated by JavaScript, though.
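As a rough illustration of the bundling step in the second point, here is a minimal webpack configuration that rolls your own scripts into one minified file while leaving jQuery to the public CDN; the entry and output paths are assumptions:
// webpack.config.js - bundle and minify your own code, keep jQuery external.
const path = require('path');
module.exports = {
  mode: 'production',        // enables minification out of the box
  entry: './src/main.js',    // your own scripts only (placeholder path)
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',   // the single file the pages reference
  },
  // Don't pull jQuery into the bundle; the page loads it from a CDN and
  // `import $ from 'jquery'` resolves to the global `jQuery` object instead.
  externals: { jquery: 'jQuery' },
};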
Yes, it can improve page load time, but this method still isn't used often, for these reasons:
Debugging becomes more difficult.
Updating anything later also won't be as easy.
Separate .css and .js files avoid these issues.
And for faster page loads, you can use a build system like Grunt, Gulp, or Brunch for better performance.
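For illustration, a minimal Gulp setup for that kind of build step might look like the sketch below; it assumes the gulp, gulp-concat, and gulp-uglify packages, and the paths are placeholders:
// gulpfile.js - keep separate source files for development, ship one
// concatenated and minified bundle for production.
const { src, dest } = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');
function scripts() {
  return src('src/js/*.js')      // every individual source file
    .pipe(concat('bundle.js'))   // join them into one file
    .pipe(uglify())              // minify for a smaller transfer
    .pipe(dest('public/js'));    // the single asset the pages load
}
exports.default = scripts;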

What's better? More HTTP requests = less data transferred, or fewer HTTP requests = more data transferred?

Sites like Facebook use "lazy" loading of JS.
Take into consideration that I have one server with heavy traffic.
I'm interested: which one is better?
When I make more HTTP requests at once, the page loads more slowly (due to the browser's limit on parallel requests, around 2 at once).
When I make one HTTP request with all the code, traffic (in GB) goes up and the Apache processes get a bit more rest, but we'll still have slower page loading.
Which is faster in the end?
Fewer requests! It's the reason we combine JS files and CSS files, use image sprites, etc. You see, the problem on the web isn't the speed of processing by the server or the browser; the biggest bottleneck is latency! You should look up Steve Souders' talks.
It really depends on the situation, device, audience, and internet connection.
Mobile devices, for example, need as few HTTP requests as possible, since they're on slower connections and every round trip takes longer. You should go as far as inlining (base64) images inside the CSS files.
Generally, I compress the main platform and the JS libs + CSS into one file each, which are cached on a CDN. JavaScript or CSS functionality that's only used on one page I'll either inline or include in its own file. JS functionality that isn't important right away I'll move to the bottom of the page. For all files, I set a far-future HTTP Expires header so they stay in the browser cache forever (or until I update them or they get bumped out when the cache fills).
Additionally, to get around per-host download limits you can have CNAMEs like images.yourcdn.com and scripts.yourcdn.com so that the user can download more files in parallel. Even if you don't use a CDN, you should host your static media on a separate hostname (it can point to the same box) so that the browser isn't sending cookies when it doesn't need to. This sounds like overkill, but cookies can easily add an extra 4-8 KB to every request.
In a development environment, you should be working with uncompressed, individual files; there's no need to move every plugin into one script by hand, for example, as that's hard to maintain when there are updates. You should have a script that merges the files before testing and deployment. This sounds like a lot of work, but it's something you do once for one project and can reuse for all future projects.
TL;DR: It depends, but generally a mixture of both is appropriate. Except for mobile, where fewer HTTP requests are better.
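As a sketch of the far-future caching headers mentioned above, for a plain Node.js static file handler; the paths and one-year lifetime are illustrative, and real code would also need to sanitize the requested path and set proper content types:
// Serve static files with a one-year cache lifetime. Pair this with
// versioned file names (e.g. app.v42.js) so updated files still reach users.
const http = require('http');
const fs = require('fs');
const path = require('path');
const STATIC_DIR = path.join(__dirname, 'public');
const ONE_YEAR = 365 * 24 * 60 * 60; // seconds
http.createServer((req, res) => {
  const file = path.join(STATIC_DIR, req.url); // no sanitizing: sketch only
  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(404);
      return res.end();
    }
    res.writeHead(200, {
      // Browsers and proxies may keep and reuse this copy for a year.
      'Cache-Control': `public, max-age=${ONE_YEAR}, immutable`,
      Expires: new Date(Date.now() + ONE_YEAR * 1000).toUTCString(),
    });
    res.end(data);
  });
}).listen(8080);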
The problem is a bit more nuanced than that.
If you put your script tags anywhere but at the bottom of the page, you're going to slow down page rendering, since the browser can't do much when it hits a script tag other than download and execute it. So if the script tag is in the head, that happens before anything else, which leads to users sitting there staring at a white screen until everything downloads.
The "right" way is to put everything at the bottom. That way, the page renders as assets are downloaded, and the last step is to apply behavior.
But what happens if you have a ton of JavaScript (in Facebook's case, about a megabyte)? The page renders, but it's completely unusable until the JS comes down.
At that point, you need to look at what you have and start splitting it into vital and non-vital JS. That way you can take a multi-stage approach, quickly bringing in the stuff that's necessary for the page to function at a bare-minimum level and then loading the less essential stuff afterwards, or even on demand.
Generally, you'll know when you get there; at that point you need to look at more advanced techniques like script loaders. Before that, the answer is always "fewer HTTP requests".
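A bare-bones script loader of the kind referred to above might look like this sketch; the file names and the click-to-load trigger are purely illustrative:
// Append a <script> tag and resolve once it has loaded.
function loadScript(src) {
  return new Promise((resolve, reject) => {
    const s = document.createElement('script');
    s.src = src;
    s.async = true;
    s.onload = resolve;
    s.onerror = () => reject(new Error(`Failed to load ${src}`));
    document.body.appendChild(s);
  });
}
// Vital behaviour first...
loadScript('/js/core.js').then(() => {
  // ...then heavier, non-essential code only when it's actually needed.
  document.querySelector('#comments-toggle')?.addEventListener('click', () => {
    loadScript('/js/comments.js');
  });
});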

Dynamic Loading: Pass back url to script or pass back script itself?

I've been wondering if there is a right way to do this: when I'm dynamically loading a script using AJAX, I have the option of passing back a URL to the script on the server and then inserting a <script src="response.url"></script> tag, or just passing back the script itself.
I went with the approach of passing back the contents of the script itself, because I figured I'd only make one round trip instead of two, and also because I want to pass back some CSS and other assets as well. I noticed, however, that Facebook passes back URLs to CDN resources, so I'm wondering if my approach has consequences I'm overlooking.
When you're Facebook, you're looking at some rather unique traffic patterns. Sending back 20 KB of script versus sending 30 characters from the dynamic servers can translate into a lot more load on those servers. Additionally, they might not be able to serve large-ish content all that fast.
In contrast, the CDN servers are glorified static proxies, designed for speed and scale. So from Facebook's point of view, the additional round trip makes sense: it can still improve overall page speed, and it certainly improves their server traffic patterns.
Now back to you. This approach won't make sense if you're going to load the script from the same servers as the rest of your site. If you do have access to a CDN as well, then you have to do the math using various assumptions about your users (latency, location) and facts about your site (size of scripts, timing of script loads), and compare the effect of having your main servers serve those scripts versus taking the extra round trip and having your CDN servers hand them out.
One additional thought about round trips: if I were Facebook, I'd probably hand out those CDN URLs early on, before the page actually needs to load the scripts. Ideally, I'd piggyback on another request to sneak that little bit of extra data in. That would make the extra round-trip issue mostly moot.
Hm, well, I'm fairly sure there are cross-domain security issues with AJAX, meaning that if you were trying to dynamically load a script's content from an external CDN, you'd need to work around that.
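To illustrate the difference the two answers above are getting at (the URLs here are made up): injecting a <script src> tag works across domains without any special setup, while fetching the script's text from another origin only works if that origin allows it via CORS.
// 1. Server returned a URL (the Facebook-style approach): inject a script
//    tag. Classic script tags are not blocked by the same-origin policy.
function loadByUrl(url) {
  const s = document.createElement('script');
  s.src = url;
  document.head.appendChild(s);
}
// 2. Server returned the script text itself (one round trip): fetch and run
//    it. Reading the text from a different origin requires CORS headers.
async function loadByContent(endpoint) {
  const res = await fetch(endpoint);
  const code = await res.text();
  new Function(code)(); // execute in its own scope rather than a bare eval
}
loadByUrl('https://cdn.example.com/widget.js');
loadByContent('/api/widget-script');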
