Why link CSS and JavaScript files by online links? [duplicate]

This question already has answers here:
Why use a CDN (Content Delivery Network‎)? [duplicate]
(3 answers)
Closed 2 years ago.
I am a beginner building my own website and I am currently linking my CSS and JS files like so:
<link rel="stylesheet" href="style.css">
<script src="jscode.js"></script>
where style.css and jscode.js are in the same folder as the HTML file. However, published websites seem to exclusively link their CSS and JS files by online links like so:
<link rel="stylesheet" type="text/css" href="https://cdn.sstatic.net/Shared/stacks.css?v=0ee8a05683e7">
<script async="" src="https://cdn.sstatic.net/Js/full.en.js?v=ada34f05dd4d"></script>
Why would they do this instead of having the css and js files hosted in the same folder as the html file? Should I also upload my css and js files online and link them?

This is done for performance reasons.
As you may have noted, they link a CDN-hosted stylesheet: https://cdn.sstatic.net/Shared/stacks.css?v=0ee8a05683e7
Files that rarely change benefit from being hosted on CDNs, improving website performance. This is especially true for common libraries, e.g. Bootstrap or jQuery or Vue.
But if you craft your resources yourself, it is totally fine to put your CSS and JS files alongside your web pages.

Bundling external libraries into your application increases the bundle size when you build it, and with it the application's loading time. When you load a CSS or JavaScript library from a CDN instead, your website can fetch it quickly.

Using CDN links typically offers faster delivery of the content/resources to the user, and for websites with high enough traffic it can reduce the workload on the hosting server because of the way CDNs work; you might want to read up on CDNs. That said, it also depends on the site: in most cases the difference is not noticeable to the user.

You will notice the 'cdn' subdomain in the href attribute: (href="https://cdn.sstatic.net/Shared/stacks.css?v=0ee8a05683e7").
It refers to a 'content delivery network': servers distributed around the globe that host the CSS and JS files in question. When a user visits the site, the CDN serves cached files from a server that is close to them, which results in a quicker page load.
However, using a CDN isn't required.
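When a CDN is used for a library, a common precaution is to fall back to a local copy in case the CDN is ever unreachable. A minimal sketch (the local path js/jquery.min.js is an assumption, adjust it to your own folder layout):

```html
<!-- Try the CDN first; if window.jQuery is still undefined afterwards,
     the CDN request failed, so write a script tag pointing at a local copy.
     The local path below is hypothetical. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
  window.jQuery || document.write('<script src="js/jquery.min.js"><\/script>');
</script>
```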

Related

Site Performance (via Lighthouse): Less HTTP Requests or Less Render Blocking?

So I've been checking out Chrome's new Audits panel with Lighthouse, and slowly been improving my site's metrics. It currently sits at 95 (performance), 97 (accessibility) and 75 (best practices).
I started going through the recommendations, and clicked the "learn more" links on several sections. As a result, I'm now reading through the Google Developers website. However, the two articles that are specifically important for my question are Render-Blocking Resources and Render-Blocking CSS.
From the Render-Blocking Resources article, here is the most important excerpt (the CSS article as a whole essentially carries on with this in more detail):
For critical scripts, consider inlining them in your HTML. For non-critical scripts, consider marking them with the async or defer attributes. See Adding Interactivity with JavaScript to learn more.
For stylesheets, consider splitting up your styles into different files, organized by media query, and then adding a media attribute to each stylesheet link. When loading a page, the browser only blocks the first paint to retrieve the stylesheets that match the user's device. See Render-Blocking CSS to learn more.
For non-critical HTML imports, mark them with the async attribute. As a general rule, async should be used with HTML imports as much as possible.
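As a rough illustration of those recommendations (all file names here are hypothetical):

```html
<!-- Split stylesheets by media: the browser only blocks first paint on
     the ones matching the current device/medium. -->
<link rel="stylesheet" href="screen.css" media="screen">
<link rel="stylesheet" href="print.css" media="print">

<!-- async: run as soon as downloaded, in no guaranteed order.
     defer: run in document order, after parsing finishes. -->
<script src="analytics.js" async></script>
<script src="app.js" defer></script>
```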
Now, in order for my site to achieve its current scores, here is what is currently in place. (I should note that I do plan on taking advantage of them all, if possible. This is simply as the site currently stands.)
Current Implementations
The site is 100% designed to be mobile-first, and is compatible with device resolutions ranging from 240 pixels up to 3840 pixels in width. I am using adaptive design, with a few breakpoints.
I'm using HTTPS.
I'm using icon sprites.
I'm using non-minified CSS and JavaScript, split into several files, because the site theme is still under development. The CSS files contain media queries.
<a> elements use target="_blank" without rel="noopener".
<script> elements are not using async or defer attributes.
I'm not using HTTP/2, or service workers.
I have enabled GZIP compression via cPanel ("Optimize Website"), targeting all content. (I am prepared to limit this to just HTML, PHP, CSS, JavaScript, plain text and XML, if the average load times are faster [I plan on running a benchmark to test both scenarios by loading the site 10 times for each — I haven't had a chance to test this yet].)
The Problems
The main CSS file is fairly large (currently 75 kiB). On top of that, there is the jQuery file, already minified (95 kiB), plus several smaller CSS files specific to certain areas of the site. Users are also expected to view multiple pages within a single visit due to the nature of my site (a microprocessor information repository).
In order to comply with the outlined guidelines above, I'm contemplating splitting up the CSS files to not only be dependent on the section of the site as they currently are, but also by media query, linking to them with <link> elements using the media attribute, versus queries within the CSS files themselves.
For JavaScript, I'm thinking about grouping all async scripts together in one file, and all defer scripts together in another. I'm assuming there are no problems with grouping the jQuery API code with other self-written functions?
Okay. Those are my plans, but taking a step back for a moment and thinking about how this would be implemented, a couple of questions came to mind, and I was hoping you could help me decide. This kind of thought process is very new to me (also partly because I've never had a site of this size before), so any input would be great.
The Questions
Since all CSS files are downloaded, irrespective of whether they are used, I'm not sure whether one large main CSS file is the best way to go, or whether to have more HTTP requests and split it up by media type/query and section of the site. Regardless of the method, CSS will be minified.
<link rel="stylesheet" href="reset-min.css"> <!-- ~ 1.5 kiB; all media types -->
<link rel="stylesheet" href="layout-min.css"> <!-- ~ 50 kiB; internal @media -->
<link rel="stylesheet" href="color-min.css"> <!-- ~ 25 kiB; internal @media -->
<!--
3 HTTP requests; 76.5 kiB
Main CSS file split into "layout" and "color" for easy editing in the future.
-->
vs.
<link rel="stylesheet" href="reset-min.css"> <!-- ~ 1.5 kiB; all media types -->
<link rel="stylesheet" href="global-screen.css" media="screen"> <!-- ~ 7 kiB; site-wide elements -->
<link rel="stylesheet" href="color-screen.css" media="screen"> <!-- ~ 10 kiB -->
<link rel="stylesheet" href="database-screen.css" media="screen"> <!-- 20 kiB; not on all pages -->
<link rel="stylesheet" href="global-print.css" media="print"> <!-- ~ 6 kiB; site-wide elements -->
<link rel="stylesheet" href="color-print.css" media="print"> <!-- ~ 7 kiB -->
<link rel="stylesheet" href="database-print.css" media="print"> <!-- ~ 7 kiB; not on all pages -->
<!--
7 HTTP requests; 58.5 kiB
CSS files split into "global/database" and "color" for easy editing in the future.
-->
The first question also applies to JavaScript files. I have the main JS file for my functions, plus the jQuery API file. However, since the functions will be split up by async or defer attributes, this will result in more HTTP requests, but of smaller size. JS will also be minified.
<script src="jquery-min.js"></script> <!-- ~ 95 kiB; local -->
<script src="scripts-min.js"></script> <!-- ~ 21.3 kiB -->
<!--
2 HTTP requests; 116.3 kiB
-->
vs.
<script src="scripts-async-min.js" async></script> <!-- ~ 108.1 kiB; includes jQuery API -->
<script src="scripts-defer-min.js" defer></script> <!-- ~ 4.3 kiB -->
<!--
2 HTTP requests; 112.4 kiB
-->
My last question concerns the main culprits of slow page loading: the jQuery API file and web fonts. Since they take up HTTP requests anyway, would you recommend loading them directly from the jQuery CDN and Google, versus hosting the files locally? I bring this up as I'm assuming the CDNs would be faster to serve. However, I would be taking advantage of service workers where possible to reduce the need to re-download files.
Thank You!
Thank you for reading through this long post. I understand that I have probably gone into too much detail, but I didn't want to leave anything out that may be important.
Here are my views on the questions you raised:
1) The browser can open multiple parallel connections to a single server while loading files. However, the catch lies in the limit on the number of these connections. Loading a single 76.5 KiB file will usually be slower than loading three 25 KiB files in parallel. This won't always hold true, since the server handshake time might be greater than the time taken to actually fetch the resource, especially in your case, where the file size is around 70 KiB and average Internet speeds are ever-increasing. I guess the best solution will be highly dependent on the geographical areas you are targeting.
Here's a discussion on the parallel connection limit: Max parallel http connections in a browser?
2) The same applies to the JS files as well. However, jQuery is nearly universal these days. There's a very high chance the user has visited a website that required it before coming to yours. Hence, using a popular CDN means the jQuery file may not be downloaded at all and will instead be loaded from the browser cache. There goes one of your biggest culprits. Even if the file does not exist in the browser cache, using a CDN should be the preferred practice: CDNs are, by definition, geographically distributed systems, and there is a good chance the CDN can provide lower latency if the visitor is not in the same region as your server.
JavaScript code that does not need to execute immediately should be loaded with async or defer. Which of the two you use further depends on the browsers you are targeting; older browsers won't see any advantage from deferred loading.
Refer: Script Tag - async & defer
3) The same rationale applies to fonts as well. If you are using a popular font, chances are it already exists in the cache.
Using service workers to manage jQuery and fonts seems like overkill to me. You might want to use them for your other assets, though.
Also, I would like to add that you should try to make the web pages load progressively. Put only the required code inside the head section. JS code which is executed upon user interaction should be loaded after the DOM content, i.e., just above the closing body tag.
I would also suggest deferring the loading of CSS that styles the content rather than fixes the layout. This includes all the colours, fonts and other fancy stuff.
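One common way to defer such non-critical CSS is the preload trick, sketched below; color.css is a hypothetical file name:

```html
<!-- Fetch color.css without blocking first paint, then promote it to a
     real stylesheet once it has loaded. The <noscript> line covers
     visitors with JavaScript disabled. -->
<link rel="preload" href="color.css" as="style"
      onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="color.css"></noscript>
```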
If your goal is just a good score on the Chrome audit, the things I mentioned might not all hold. But for serving actual users, this is what I would personally do.
Question 1:
You can keep separate CSS files for easy editing. Use vswebessentials to bundle and minify for the production version. When hosting on the server, set a far-future Expires date or use a cache manifest. You would then need to rename the bundled file whenever you make changes. The site will work a lot faster.
A 75 kB CSS file is not large by today's standards when downloaded over a single connection, especially if you give it a far-future expiry date so the client never re-requests it.
Hopefully, you are not going to change the CSS daily. You can also break it down by media query, as you described.
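The renaming step is cache-busting: with a far-future Expires header the browser will never re-request the file, so every change needs a new URL. A sketch (file names and version strings are made up):

```html
<!-- Version in the file name... -->
<link rel="stylesheet" href="site-min.v42.css">
<!-- ...or in a query string, like the cdn.sstatic.net links shown earlier -->
<link rel="stylesheet" href="site-min.css?v=0ee8a05683e7">
```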
Question 2
Loading jQuery from a CDN is a good idea, but you need to make sure it is available before your JS executes. Do not use service workers for simple cases. If you want a comparison, audit www.google.co.in; your audit numbers may well be better than Google's. So do not complicate development for the sake of audit numbers. Concentrate on UX. That's what Google is doing.
Question 3
You can and should use a CDN for jQuery and fonts. Use a cache manifest and far-future expiry if you can, instead of chasing the Audit tab's numbers and service workers.
I wanted a high rating for my progressive web app (Angular, AWS serverless) for free. This is what I would advise for your questions:
1) Put the CSS required for above-the-fold content in the HTML page itself; for the rest of the CSS files, let them load asynchronously. Be careful about unused CSS; I spent time cleaning and removing unused classes.
2) Do you really use a lot of jQuery? For best results, you might want to write vanilla JavaScript instead of using jQuery, especially in the above-the-fold content. On my website https://beta.akberiqbal.com I wrote plain JavaScript for the above-the-fold items (navigation etc.), and below the fold I had my three Angular 5 JS files. The bundle.css file was small, so I made it part of the HTML as well.
3) If you have a service worker, a local fetch should be faster than a CDN.

How do I automatically install web assets like bootstrap, jquery and font-awesome without using CDN?

I want to know if installing jQuery/Bootstrap/Font Awesome can be done automatically, instead of installing via npm and then manually dragging the code to my css/js/fonts folders.
Is there no program that can update and automatically drag them to the correct folder?
I know people are saying that you can just manually drag the JavaScript file to the correct location, but Bootstrap, for example, consists of more than a single JavaScript file; it includes font and CSS files too.
If I were to include them in this manner:
\web
-\css
--\app
---\main.css
--\font-awesome
---\font-awesome.min.css
-\fonts
etc.
Then it wouldn't work, because Font Awesome expects its fonts to be in a sibling folder.
jQuery, Bootstrap and Font Awesome are not software applications that you install into a webpage. They are just CSS and JavaScript files, like any other JavaScript or CSS file you may have written from scratch for your webpage, except that they are well maintained, highly optimized and made for a particular purpose. (Bootstrap's primary purpose, for example, is to provide a framework for making webpages responsive.)
To include them in a webpage, all you have to do is tell the HTML file to use those files. This is done by linking them into the HTML using the <script> tag and its src attribute (and the <link> tag and its href attribute for CSS).
Now, in the src attribute you may provide a URL to a location on the web containing the file, or a relative local path to a location on your server or local machine. Yes, you can manually drag the files into your css/js folder and include them using that path. No, I'm not aware of any software to automate the process. But you only need to place the file in one location for an entire webpage and its sub-pages to access it, so it's not a very intensive process.
As for why CDNs host such files for public access, an insight is given here: How Cloudflare provides free service. And security? Well, they are pretty darn secure; it is literally their job to provide secure access to the files they host. And why people use CDNs in the first place is, in short, performance.
Update:
As for how to include files in your HTML, it goes like this (Bootstrap example):
<link rel="stylesheet" href="static/bootstrap/css/bootstrap.min.css">
<script type="text/javascript" src="static/bootstrap/js/bootstrap.min.js"></script>
You need to provide the path to the required CSS and JS files. In the case of Bootstrap these two are the only ones you need to include to get full functionality of the library.
I think it is not a good idea to use local files instead of CDNs unless you are working offline.
Here you can read about CDNs vs Local Files:
https://halfelf.org/2015/cdn-vs-local/
Multiple files on CDN vs. one file locally
https://www.sitepoint.com/7-reasons-to-use-a-cdn/
Although there is another link that argues just the opposite:
7 Reasons NOT to use a Content Delivery Network
Nevertheless, if you want to use the files locally, you can follow the instructions below:
Find the CDN link in your project.
Copy the URL from the src or href attribute and open it in your browser.
Save the file locally and reference the local file in your project.

Should I be concerned if my website is linked to an external Style Sheet [duplicate]

This question already has answers here:
Why should I use Google's CDN for jQuery?
(7 answers)
Closed 7 years ago.
At some point in my website I needed a timer, so I looked for a free jQuery countdown timer and found this one: Example
After integrating it into my page inside my IDE (VS2010), I noticed that some CSS and JS files are not stored locally in my project folder but are still linked to external sources, and that had me thinking: am I supposed to find a way to download these files locally and then use them, or should I use them the way they are? Should I be concerned that they may change or disappear at some point in the future? What are the best practices in this case?
Here is an example of the HTML code :
....
....
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<link rel="stylesheet" href="//netdna.bootstrapcdn.com/bootstrap/3.0.0/css/bootstrap.min.css">
<script type="text/javascript" src="inc/TimeCircles.js"></script>
....
....
Those are CDNs (http://it.wikipedia.org/wiki/Content_Delivery_Network), meaning that they're hosted by someone for all of us to use, so you're pretty much guaranteed they will stay there. The main advantage of using CDNs is that the user will probably have visited another site that uses the same resource, which means the resource is already cached on the user's computer, leading to a faster loading time for your site.
You should never rely on external sources for critical files unless you're using a dependable CDN. In this case you're using the most common CDN sources for Bootstrap and jQuery, so you're all set.
I assume that you've downloaded the timer files and are hosting those locally. Your reference to them confused me, so I've updated this answer.

Dynamic/Automatic Javascript and CSS Combining and Minification

I know I have seen an article this somewhere (specifically related to Azure too!) but I forgot to bookmark it (doh!) and after hours searching can't find it anywhere :(
I have an MVC application running in Azure with multiple layout pages and roughly 20+ JavaScript files (each quite lengthy, hence why they are separate!).
Each layout page includes a couple of script and CSS files; the rest are added using @section head { } (Razor syntax for adding sections to the layout page outside of the body).
I cannot remember if the article was exactly what I was after, but what I would like to do is combine AND minify the necessary JavaScript and CSS files at runtime, dependent on the layout and page.
For example if i had a layout file with:
<script src="script1.js"></script>
<script src="script2.js"></script>
<link href="css1.css" />
and a page with
<script src="pagespecificscript.js"></script>
<script src="usercontrolspecificscript.js"></script>
<link href="page.css" />
I would want 2 minified files to be sent to the user's browser, such as
<script src="201101010800abc-min.js"></script>
<link href="201101010800abc-min.css" />
Thanks in advance!
Check out RequestReduce. It's a project I have been working on that minifies/merges CSS and optimizes and sprites images. It does this on the fly with no code changes and very little config necessary. Next week I will be releasing JavaScript merge/minify. I have been blogging quite a bit about this lately (http://www.mattwrock.com/post/2011/09/10/Adopt-RequestReduce-and-see-immediate-Yslow-and-Google-Page-Speed-score-improvements-not-to-mention-a-faster-site!.aspx), so it is possible that this is the article you had run into. The easiest way to grab RequestReduce is via NuGet: Install-Package RequestReduce.
I'm a dev lead on a couple of Microsoft web sites where we have been deploying this with good success, so it is enterprise tested and quite scalable. However, it is not a Microsoft product but rather a personal OSS project I have been contributing to. It's also free.

What is the smallest ExtJS package?

Does anyone know the bare minimum files required for Ext JS 2.2? I know the Ext JS site has a feature to "build" a small version of Ext JS (ext.js) as a replacement for ext-all.js, but that's for minimizing the size of Ext JS on the client. I'm interested in minimizing what's on the server. Currently the SDK comes with the following subdirectories:
ext-2.2/
adapter
air
build
docs
examples
resources
source
I think it's pretty safe to remove examples, docs, and air. However, are there other things we can remove to make this smaller, or is there a resource (besides the large JavaScript source code corpus) that documents the minimum required files?
This link explains the include order
What is the proper include order for my JavaScript files?
This is the minimum include set
<link rel="stylesheet" type="text/css" href="../extjs/resources/css/ext-all.css">
<script type="text/javascript" src="../extjs/adapter/ext/ext-base.js"></script>
<script type="text/javascript" src="../extjs/ext-all.js"></script>
The ext-all.css depends on files in ../extjs/resources/css so you should include that entire directory structure also.
So you'd need the following files at a minimum
extjs/resources/**/*
extjs/adapter/ext/ext-base.js
extjs/ext-all.js
If you're not using Ext JS for any of the UI components then you don't need any of the stylesheets and supporting images, but in that case you'd have to question why you're using Ext JS, since that's its strong point.
Ext Core: http://extjs.com/products/extcore/
