Is there any difference between linking a stylesheet or script to our website from EXTERNAL servers and uploading the stylesheet to our own server and linking to it there? What are the benefits of uploading to our own server?
Usually the external files are served by a CDN, which means your users probably already have them cached (for example jQuery or another popular library).
If you store the files on your own server, it has to serve more data: your scripts and CSS go out with every request. That extra work can mean extra costs or slower responses.
The advantage of linking to external servers is less load on your own server. On top of that, popular libraries like jQuery are served from locations all over the world, so if your site is hosted in Europe and a user visits from the USA, they won't have to wait for the file to be delivered across the world; they will most likely get it from a location closer to them. Less traffic, less load on your server, and faster response times.
Yes, there are several differences:
If you include a stylesheet or script from an external source, it can add to the page load time.
An externally hosted stylesheet or script may fail to load at all because of a firewall (I have faced this situation).
If the external files change in the future, that change will also be reflected on your site.
There is a security concern: whoever controls the file can add code to track your data.
So I always recommend downloading these files and serving them from your own server instead of including stylesheets or scripts from an external source.
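As a simple illustration, self-hosting just means downloading the files once and referencing your own copies (the paths and file names below are only examples):
<link rel="stylesheet" href="/css/bootstrap.min.css">
<script src="/js/jquery.min.js"></script>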
Between the following setups, which one would deliver the fastest page loads for a front-end user? I am only interested in speed for front-end users, not in the maintenance requirements for back-end developers.
A website that only uses static .html files: no JavaScript, no PHP, no server-side programming language to render the HTML. Basically the origins of the internet, where each click on an internal link loads a static .html file. Each page is a pre-created physical .html file on the server.
A website with a physical pre-created .html file, but the main content (article) on each page is fetched via JavaScript from a NoSQL server (Google Cloud Firestore or FaunaDB). Each click on an internal link only replaces the main content of the page via a database call. The rest of the website (menu, logo, sidebar, footer) is all static and never needs to reload.
A website with a physical pre-created .html file, but the main content (article) on each page is fetched via JavaScript from a local JSON file: no database, just a regular .json file in the same directory as the .html file on the same server. Each click on an internal link only replaces the main content of the page using JavaScript (probably vanilla JavaScript using fetch, unless React is somehow faster, which I doubt). The rest of the website (menu, logo, sidebar, footer) is all static and never needs to reload.
Of course server performance and user location always play a role in speed tests, but for argument's sake let's assume it's the same user visiting the same web server. Additionally, in regard to NoSQL, let's say it's a fast and reliable third-party service such as Google Cloud Firestore.
Which one of these setups would be the fastest? Has anyone tested this? I heard some people argue that basic static .html files are always fastest, while others argue that a static html file where the content is loaded via JavaScript is faster when navigating internal links once the initial page load is done. Both arguments make sense.
Any major pros or cons for one of the mentioned setups, or past benchmarks?
The speed of the webpage has two big components:
A. How fast the server responds/the size of the response
B. How fast the browser can render whatever it fetched
So static files without JS will be the fastest: there is no delay on the server side, and the browser is very efficient at rendering static assets.
The third option is still fast, but slightly slower than the first one because there is some extra work for the browser (transforming the JSON into HTML via JS).
The second option will be the slowest, as it is the only option where the server is not responding instantly with a file, but needs to connect to a DB, fetch the results, transform them, and only then send them back.
All of this applies only if we are talking about exactly the same content, just delivered in different forms.
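For option 3, the JSON-to-HTML step mentioned above is roughly this (a minimal sketch; article.json and the #content element are hypothetical names):
<script>
// Load the article data and swap only the main content area
fetch('article.json')
  .then(response => response.json())
  .then(article => {
    document.querySelector('#content').innerHTML =
      '<h1>' + article.title + '</h1>' + article.body;
  })
  .catch(err => console.error('Could not load article:', err));
</script>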
The question is slightly flawed, but to answer it:
Static content is fastest: the browser will render the content and cache it.
Getting content from a database adds overhead for the call and the retrieval. The main page will be downloaded once and cached on the client side, but the calls for content cannot be cached, because the browser has to make the call to find out what the content is. The upside is that each call returns only the content that needs to be displayed, and DB lookups are pretty quick from the big cloud service providers.
Option 3 is probably slower than option 2, because the whole JSON file has to be downloaded for the JavaScript to pick out one article's content from all the content.
I would suggest option 2 is best from a maintainability-vs-speed point of view, as it only sends the required data across the network and the rest is cached.
If you like option 3, have a look at using the browser Cache API (https://web.dev/cache-api-quick-guide/) to cache your JSON file; this way the user only needs to download a new version when you change the content.
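A rough sketch of that idea using the Cache API from page JavaScript (the cache name and JSON path are made up; bump the cache name when the content changes so users pick up the new version):
<script>
// Try the cache first; fall back to the network and store the result
async function loadArticles() {
  const cache = await caches.open('content-v1');
  let response = await cache.match('/articles.json');
  if (!response) {
    response = await fetch('/articles.json');
    await cache.put('/articles.json', response.clone());
  }
  return response.json();
}
</script>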
I understand the best way would be not to have the external JS at all, but alas, it's not possible.
Situation
The owner of a site wants (no ifs, ands, or buts) to get paid by a company that offers gambling ads. This company states that in order for them to offer said ads, the owner of the site must add a JS snippet to the site. Said snippet is a few lines, but essentially it creates a <script> tag and loads a minified external JS file located on the advertising company's server. They do different kinds of ads (pop-ups, etc.) and some other things that require the code.
There's no discussing not going through with this. I wanted to know if there is any layer of security I could add in order to protect site visitors. I know they would still be at risk, but there's not much else I can do.
Things to do
Copy the external JS file and serve it from the site owner's server (or is that a horrible idea? The thing is, at least this way they can't keep changing it to their heart's content, since it's on the site owner's server).
Not loading the JS file on any page that has login forms.
Only loading the JS file on pages where the ads will be shown.
Not loading the JS file if the user is signed in.
Modifying the JS file so that it runs in its own scope: (function(){})().
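The wrapping in that last point would look roughly like this (the vendor code itself is just a placeholder comment):
<script>
(function () {
  // ...the ad company's code pasted here...
  // Variables declared here stay out of the global scope, but note this only
  // prevents accidental globals; it does not sandbox deliberately malicious code.
})();
</script>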
Anything else I could possibly do? Or am I simply fooling myself in thinking I can offer some feeble protection?
There are a few ways that may help you secure a page that loads external scripts.
First, create a Content Security Policy. This tells the browser where it may load different types of content from, so if the third party starts loading content from new sources without telling you first, those requests will be blocked.
Secondly, the script-src directive. It lets you whitelist a hash of the script's contents, and if the script changes, the browser won't run it.
There is a much better write-up on these and more on Troy Hunt's blog, specifically this page: https://www.troyhunt.com/locking-down-your-website-scripts-with-csp-hashes-nonces-and-report-uri/
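For illustration, a policy along those lines could be sent as a response header like this (the ad domain, hash value, and report endpoint are placeholders, not real values):
Content-Security-Policy: script-src 'self' https://ads.example.com 'sha256-AbC123placeholder='; report-uri https://example.com/csp-reports
It is also common to start with the Content-Security-Policy-Report-Only header, so you can see what would be blocked before you enforce the policy.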
Things to do:
Use a CDN that supports versioned URLs (almost every modern CDN does), so you don't need to host these JS files yourself and you don't need to worry about the file changing underneath you.
On login pages, only run your own JS (keep the third-party ad script off them).
For ads, use iframe elements, so the ad JS can't access information outside the frame (see the sketch after the jQuery example below).
Use Subresource Integrity (SRI) on script tags
Example with jQuery
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js" integrity="sha256-FgpCb/KJQlLNfOu91ta32o/NMZxltwRo8QtmkMRdAu8=" crossorigin="anonymous"></script>
As Karl Graham mentioned, use Content Security Policy (CSP) in an HTTP Header, so content can't leak.
Make sure to use SSL certificates (HTTPS), and encrypt the content of your AJAX/Fetch requests, so that even if an external script listens for fetch events it won't be able to read the data.
I'm almost certain that if you follow these rules, your external script won't be able to get your form content.
I have had this question for a while and am surprised that I have yet to come across a good/complete answer to it.
The question is essentially this:
When it comes to loading JS files, in what situations should you load them from the web if available versus serving them up yourself? Which case typically allows for the lowest latency?
E.g.
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
vs.
<script src="js/jquery-1-11-3.min.js"></script>
Complete answer: Both.
Loading it off of the web will benefit you in a couple of ways:
1) There is a limit to the maximum number of open HTTP requests a browser can have. However, this limit is per domain, so reaching out to Google's servers will not prevent you from loading your own CSS/images.
2) It's very likely that the user will already have that file cached, so the browser will either use its cached copy or get an HTTP 304 Not Modified response, and not have to download the file again.
Now, with that said, sometimes the server will be down, or network issues will otherwise stop you from loading that file. When that happens, you need a workaround, which we can do like so:
<script>
  if (typeof jQuery == 'undefined') {
    document.write(unescape("%3Cscript src='/js/jquery-2.0.0.min.js' type='text/javascript'%3E%3C/script%3E"));
  }
</script>
Put this tag AFTER the script tag that loads jQuery from the CDN. If the CDN load failed, jQuery will be undefined and the block will load it from the local source; if the CDN load worked, the block is skipped.
Pros of including from your own server:
Google's servers can technically be down, and then your website won't load correctly.
Some people don't trust Google and have it blocked, or block its scripts, etc. Those users won't load the file from Google directly either.
Connections to Google can have higher latency. If your audience is in your own country and you have a good provider, your connection can be faster than Google's.
Cons:
higher webserver traffic
more connections
higher CPU impact
You have to decide for yourself which one is better for you. For smaller sites I'd go with the locally stored file.
I'm using a method of creating a .js file on server #1 which contains document.write calls that write HTML code, then a simple JS include inside the HTML on server #2 to load that HTML (there are multiple server #2's). This basically replaces an iframe method, with the advantage that each server #2 owner controls their own CSS.
The method works perfectly as is. My question has to deal with caching. Each time the page is loaded on server #2 I want the .js reloaded, as it will change frequently on server #1. This appears to be the case on each browser I tested, but can I rely on this as being the default case, or is it dependent on browser settings? Despite all I've read on caching I can't figure out what triggers the load for a case like this.
You can control browser caching using HTTP headers on the server side, such as Cache-Control and Expires. More here - http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
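For example, if you want the browser to revalidate the .js file on every page load, response headers along these lines are one common combination (shown only as an illustration):
Cache-Control: no-cache, must-revalidate
Expires: 0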
In a case like this, the caching is governed by the cache policy of the .js file, not the HTML file.
The browser doesn't cache the rendered page (well, it does for the back button, but that's not what we're talking about); it caches the source files. So even if the HTML page is configured to be cached for a long time, the JavaScript-injected content will only be cached for as long as the .js file itself is configured to be.
To configure caching policy you need to set specific headers on the server side. Sometimes you can do this in a CGI script. Sometimes you can do this in the server configuration files.
Google "http caching" and read up on how to configure a page to be cached or not cached (also google "json disable caching" or "ajax disable caching" because this issue crops up a lot with ajax).
I have a website that has grown somewhat large and is built on a super-restrictive platform (SBI). There you have to follow their file structure, put everything in the appropriate folder, and then upload each and every file through their interface manually. I have a cool HTML5 template and some JavaScript with a lot of little files and images, so it was just way easier to upload all this stuff to my OTHER DOMAIN hosted by HostGator using FileZilla and then point the CSS and JS references on my SBI site to their location on my HostGator domain.
Are there any potential issues with this method?
The reason I am asking is that yesterday I came across Google's article on serving resources from a consistent URL: https://developers.google.com/speed/docs/best-practices/payload#duplicate_resources However, I might be misunderstanding what it means. When I test my actual URL at Google's PageSpeed Insights here https://developers.google.com/speed/pagespeed/insights it advises me to serve resources from a consistent URL, but in the details it doesn't complain about my CSS and JS files; it only complains about Facebook, like this:
Suggestions for this page:
The following resources have identical contents, but are served from different URLs. Serve these resources from a consistent URL to save 1 request(s) and 24.3 KiB.
http://static.ak.facebook.com/.../xd_arbiter.php?...
https://s-static.ak.facebook.com/.../xd_arbiter.php?...
I appreciate you reading this. Thanks in advance!
Serving static content from a different domain is common practice; I don't see any issues there. It's as safe and reliable as the server you are using to serve it.
The Facebook warning could mean you are loading the same FB API script twice, or it may just be some black magic done by the FB devs.
You should not have any problems hosting your files on a different site. Your users may experience a slightly slower page load because their machine has to do more DNS lookups; on the other hand, browsers only open a limited number of simultaneous connections per host (historically around 2), so splitting across hosts can increase your simultaneous downloads. That warning about Facebook is because the same script is being downloaded twice from two different places, which is not ideal, but I'm not familiar with the Facebook API so I'm not sure if that can be helped.