I had an idea on how to vastly speed up my website and ease the load on my server for cached objects, but I'm wondering if this is an extremely stupid idea before I spend all day thinking about it! Hear me out -
The localStorage store holds at least 2.5 MB in Chrome (more in some other browsers), which is good enough for most small sites. Which is why I am wondering - for small, static websites, would it be a good idea to save the entire site in one big localStorage entry and load the site from the visitor's own computer on each visit? To better explain my idea, here's some pseudo-code:
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<!--other head information-->
</head>
<body>
<script>
var cached = localStorage.getItem('cached_site');
if (cached !== null) {
    // Stop the network load of the rest of the page,
    // then swap in the cached copy.
    window.stop();
    document.documentElement.innerHTML = cached;
} else {
    // Set the localStorage value to the source code of the site,
    // once the page has finished loading.
    // **This will be fleshed out to data/base64 all image resources as well**
    window.addEventListener('load', function () {
        localStorage.setItem('cached_site', document.documentElement.innerHTML);
    });
    // load the site as normal
}
</script>
</body>
</html>
This saves the page's markup (the whole documentElement, not just the body) under a localStorage key called cached_site. There are still some rough edges in the logic (like the "stop loading the site" part, which window.stop() only partly covers), but you get the idea.
Is this worth it? Potentially, requests for static content could be cut entirely, because everything is loaded from the visitor's own machine. To update the content, a second value, say a version number, could be saved and checked in the else block.
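A minimal sketch of that version check (SITE_VERSION is a hypothetical constant you would bump on each deploy):
var SITE_VERSION = '5'; // bump on each deploy
if (localStorage.getItem('cached_version') !== SITE_VERSION) {
    // Cached copy is stale (or absent): drop it and let the page refetch.
    localStorage.removeItem('cached_site');
    localStorage.setItem('cached_version', SITE_VERSION);
}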
My webhost limits the number of requests I can get per day, which is why I thought of this. It's not a huge deal, but it's an interesting project nonetheless.
This idea will have issues with dynamic web sites... how will you treat dynamically generated web pages?
Another idea would be to save each page as a static HTML file on the server and then serve that static file, regenerating it only when needed.
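A minimal sketch of that regeneration step (Node-style JavaScript; the file names and the render function are illustrative, not a prescribed setup):
const fs = require('fs');

// Hypothetical template function -- stands in for whatever produces your markup.
function renderPage(data) {
  return '<!DOCTYPE html><html><body>' + data + '</body></html>';
}

// Rebuild the static file from its data source; the web server then serves
// htmlFile as a plain static file until the next regeneration.
function regenerate(dataFile, htmlFile) {
  const data = fs.readFileSync(dataFile, 'utf8');
  fs.writeFileSync(htmlFile, renderPage(data));
}

regenerate('about.txt', 'about.html');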
You should also cache the static parts of your website, i.e. images, scripts, CSS, and store these in the cache of your visitors' browsers.
In Apache, you could use mod_expires, and then set up an .htaccess file like this:
### turn on the Expires engine
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)(\.gz)?$">
ExpiresDefault "access plus 1 month"
FileETag None
</FilesMatch>
This will basically cache all of the static parts of your website in your visitors' browser cache, so they will stop refetching the same files over and over.
You should also try to combine your CSS/JavaScript files, ideally ending up with 1-2 JavaScript files and 1 CSS file, thus limiting the number of requests per client.
I would also suggest using Yahoo's YSlow and Google's PageSpeed Tools to profile your site. They both encode best practices on how to speed up your page.
Here are some suggestions from Yahoo + Google, at a glance:
Minimize HTTP Requests (use combined files for JS/CSS, CSS sprites, image maps and inline images, where possible)
Add an Expires or a Cache-Control Header (like what has been explained above)
Properly configure ETags, or remove them
Gzip Components (e.g. with mod_deflate in Apache; see the sketch after this list)
Make JavaScript and CSS External
Minify JavaScript and CSS (i.e. remove whitespace and newlines) - in PHP, you can use the minify script to do this
Optimize images
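On the gzip point above, a hedged Apache sketch assuming mod_deflate is available (the MIME type list is illustrative):
### compress text assets on the fly with mod_deflate
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>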
Background:
I have a small RaspberryPi-like server running Armbian (20.11.6) (specifically, an Odroid XU4).
I use lighttpd to serve pages (including Home Assistant and some statistics and graphs with chartjs).
(the example file here is Chart.bundle.min.js.gz)
Issue:
There is a growing number of JavaScript files, which are becoming larger than the HTML and the data itself (some numbers for power/gas consumption etc.). I am used to using mod_compress, mod_deflate etc. on servers (to compress files on the fly), but this would kill the Odroid (or at least unnecessarily load the CPU and the pitiful SD card used for caching).
Idea:
Now, the idea is to just compress the JavaScript (and other static files, like CSS) and serve them as static gzip files, which any modern browser can/should handle.
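(A file like Chart.bundle.min.js.gz can be produced with e.g. gzip -9 -k Chart.bundle.min.js; the -k flag keeps the uncompressed original next to the .gz.)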
Solution 0:
I just compressed the files and hoped the browser would understand them...
(The link was in the script tag's src attribute, so if the browser figured out that .gz means gzip... it might have worked.)
It did not ;)
Solution 1:
I enabled mod_compress (as suggested on multiple pages) and tried to serve the static js.gz file.
https://www.drupal.org/project/javascript_aggregator/issues/601540
https://www.cyberciti.biz/tips/lighttpd-mod_compress-gzip-compression-tutorial.html
Without success (the browser takes it as binary gzip, not as application/javascript).
(Some pages suggested enabling mod_deflate instead, but it does not seem to exist in my lighttpd build.)
Solution 2:
(With mod_compress still on) I did the above and started fiddling with Content-Type and Content-Encoding in the HTML (in the script tag). This did not work at all: the Content-Type can be influenced somewhat from HTML, but it seems the Content-Encoding cannot.
https://www.geeksforgeeks.org/http-headers-content-type/
(I do not install PHP, which could do it, to save memory, SD card lifetime etc.)
Solution 3:
I added a "Content-Encoding" => "gzip" line to setenv.add-response-header in the default 10-simple-vhost.conf configuration file. This looked like a dirty, crazy move, but I wanted to check whether the browser would accept my js.gz file... It did not.
Furthermore, nothing loaded at all.
Question:
What would be an easy way to do this (without PHP)?
Maybe something like .htaccess in Apache?
EDIT 1:
It seems that nginx can do it out-of-the-box:
Serve static gzip files using node.js
http://nginx.org/en/docs/http/ngx_http_gzip_static_module.html
I am also digging into the headers story in lighttpd:
https://community.splunk.com/t5/Security/How-to-Disable-http-response-gzip-encoding/m-p/64396
EDIT 2:
Yes... after some thinking, I realized that this file could be cached for a long time anyway, so maybe I should not care so much :)
Your solution (below) to set the response header is a workable one for your situation.
However, I would recommend using lighttpd mod_deflate with deflate.cache-dir (lighttpd 1.4.56 and later).
When configured properly, lighttpd will serve gzipped Content-Encoding to clients which support the compression, and lighttpd will serve plain content to clients which do not support the compression. lighttpd will compress each file as it is served and will save the compressed file in deflate.cache-dir so that lighttpd does not have to re-compress the file the next time the file is requested. lighttpd will detect if the original file has changed and will re-compress it into the cache the next time the file is requested.
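A hedged config sketch of that setup (the cache path and MIME list are assumptions; check the lighttpd wiki for your version):
server.modules += ( "mod_deflate" )
# cache compressed output so each file is compressed only once
deflate.cache-dir = "/var/cache/lighttpd/compress/"
deflate.mimetypes = ( "text/html", "text/css", "application/javascript" )
deflate.allowed-encodings = ( "gzip" )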
It seems I spent so long writing the question that I almost reached the solution myself.
I created a module file, 12-static_gzip.conf, with the following content:
$HTTP["url"] =~ "\.gz$" {
setenv.add-response-header = (
"Content-Encoding" => "gzip"
)
}
I had not found any similar trick described for lighttpd, so I applied the kind of solution I would use for Apache. The expected behavior was that lighttpd would simply add the Content-Encoding header for the .gz files, without PHP or any additional modules... and it works!!!
The mod_compress module (and anything else of that kind) is disabled, and no other changes are needed.
Clearly, HTTP content negotiation is more complex than this (the header is sent even to clients that never advertised gzip support), so I am not sure this will work for all browsers, but it surely works very nicely in Chrome.
I am also planning to create some ESP32 web servers, where storage and memory are even more critical, so I will try to apply a similar solution there.
Nevertheless, the questions still hold:
Is there a better/cleaner solution?
Are there caveats to expect? Browser compatibility etc.?
I am trying to improve the performance of my website. In the Chrome DevTools, I see that the request for bg2.jpg is being delayed in starting its download.
I figure this is happening because I am using JavaScript to generate the URL and set it as a background image in CSS, and Chrome is deprioritizing the script tag containing this code.
let bgImgName = "bg" + Math.floor(Math.random() * 5); // picks bg0 through bg4
...
document.documentElement.style.setProperty("--bgUrl", `url(img/${bgImgName}.jpg)`);
My thought is to preload the image with <link rel="preload" href="img/bg2.jpg" as="image"> in the HTML. My problem is that I have to hardcode the URL for this to work (my server only runs Apache and does not have a true server-side language). My host (GoDaddy, Linux shared hosting) does give me access to the .htaccess file, and there might be a way to use Server Side Includes to inject the random number, but I have not found a way to do this.
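(For reference, the closest dynamic equivalent I can see is injecting the hint from an inline script placed early in the head, reusing bgImgName from above - a sketch, not something I have measured:)
<script>
  // Pick the image early and hand the browser a preload hint for it.
  let bgImgName = "bg" + Math.floor(Math.random() * 5);
  const link = document.createElement("link");
  link.rel = "preload";
  link.as = "image";
  link.href = `img/${bgImgName}.jpg`;
  document.head.appendChild(link);
</script>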
Is there a way to do it this way, or a different way to solve this problem?
Update: It looks like I cannot use Server Side Includes. I forgot that I gzip the HTML files before uploading them to the server, so that I get the performance boost of serving compressed static files straight from disk.
Is there a way I can add a random number in the .htaccess file that is passed to the browser?
Since you have 5 random images, an easy way is to use CSS and make them backgrounds of an element with zero size; the zero background-size keeps them from ever painting while still triggering the downloads. This forces the browser to load the images before it reaches your JS. Of course, the CSS needs to be loaded before your JS, and as soon as possible.
So your code can be something like this:
html {
background-image:
url("img/bg1.jpg"),
url("img/bg2.jpg"),
url("img/bg3.jpg"),
url("img/bg4.jpg"),
url("img/bg5.jpg");
background-size:0 0;
}
I have a view layout (main.cshtml) from which I reference an external JavaScript file. I have renderings (other .cshtml files) which are included as placeholders in this layout (main.cshtml). For example, two pages:
1) http://localhost/home/ has two renderings for the Body placeholder
2) http://localhost/about/ has two renderings for the Body placeholder
Both the home and about pages use the same main.cshtml. I don't want to load externalJS.js every time I navigate from home to about or vice versa; i.e. externalJS.js should load once for the entire application. Can I achieve this?
<!DOCTYPE html>
<html>
<head>
<title>Main</title>
</head>
<body>
<div data-role="page" class="pageWrapper">
<header data-role="header" class="header">
#Html.Sitecore().Placeholder("Header")
</header>
<div class="wrapper" data-role="main">
#Html.Sitecore().Placeholder("Body")
</div>
<div data-role="footer" role="contentinfo" class="ui-footer ui-bar-inherit">
#Html.Sitecore().Placeholder("Footer")
</div>
</div>
<script src="../../js/externalJS.js"></script>
</body>
</html>
If you are worried about the bandwidth usage and load on your servers from users downloading externalJS.js each time they visit one of your pages, your worries may already be solved by browser caching. Basically, the web browser saves a copy of HTML, CSS, JS, image files, etc. locally, and reloads those when it needs them rather than going back out to the network.
On the other hand, if it's the initial processing of externalJS.js you want to avoid, then something like Ajax (Asynchronous JavaScript and XML) is what you want. The idea behind Ajax is that you write JavaScript to handle fetching new content from the network. So rather than the user clicking an anchor and having their browser download an entirely new page, you set up something for them to click, JavaScript sends a request to the server, the server returns some XML (or HTML, or JSON), and JavaScript inserts the new data into the existing page without the browser reloading anything or changing pages. Note that you may want to use JavaScript to add the change to the browser's history, since that won't happen by default. See here for an Ajax tutorial.
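A minimal sketch of that flow, assuming a hypothetical /about/fragment endpoint that returns just the body markup (the .wrapper selector is the Body wrapper from your layout):
// Fetch a page fragment and swap it into the current page without a reload.
fetch('/about/fragment')
  .then(function (res) { return res.text(); })
  .then(function (html) {
    document.querySelector('.wrapper').innerHTML = html;
    // Keep the address bar and back button in sync by hand.
    history.pushState({}, '', '/about/');
  });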
The technique you may want to consider with Sitecore is called bundling; it does exactly what you want, and even more. It is a feature of ASP.NET, which Sitecore is built on.
Most current major browsers limit the number of simultaneous connections per hostname to six. That means that while six requests are being processed, additional requests for assets on that host are queued by the browser. Bundles can unite several styles / scripts into one "file" to address this issue.
Pay attention to the v parameter with the long hash stamp: that is a version stamp, and it remains the same for the same combination of scripts / styles in your bundle. As long as it remains the same, the file is cached by the browser and is normally requested just once, regardless of which page requests it.
<link href="/Styles/css?v=Za3K5TEVXIEtbh2FEp3kzkdlpWT7iVdGKSiEj6SVOaI1" rel="stylesheet"/>
There is also a technique called minification that comes along with bundles: you can not only "glue" several scripts into one combined file with a specific version stamp, but also minify (compress) that file so it takes less bandwidth to transfer - quite handy on high-traffic websites.
Here are useful links that will explain how to implement bundling and minification with Sitecore:
http://sitecorefootsteps.blogspot.co.uk/2014/01/implementing-bundling-and-minification.html
https://himynameistim.wordpress.com/2014/12/05/bundling-and-minification-with-sitecore/
http://jockstothecore.com/bundling-with-sitecore-mvc/
http://techblog.lbigroup.be/2013/08/21/adding-bundling-and-minification-to-a-sitecore-project/
I'm currently trying to preload images for a webpage I'm creating, as those images are quite big.
Currently I know (thanks to another post here) how to preload the images themselves (via JavaScript preloading, then displaying them in a canvas).
BUT whenever I switch pages, the preloaded images need to be preloaded again, so they are not being cached.
So my question is: Is there any possibility to cache these images?
(or is it even best to put them into a session variable?)
The images themselves are quite big and can be up to 1.5 MB each (there are 20 images in the part that already exists, which takes about 4 seconds to preload).
Some info, if necessary:
I'm using an Apache server, with PHP as the primary language and JavaScript as support.
Edit:
As I forgot to mention it: the web server I will finally host the site on is an external one (hosting provider), so I won't be able to edit the web server settings themselves there.
If the images don't change, try something like this in .htaccess:
#Set caching on image files for 11 months
<filesMatch "\.(ico|gif|jpg|png)$">
ExpiresActive On
ExpiresDefault "access plus 11 months"
Header append Cache-Control "public"
</filesMatch>
If you think this is not the right approach, e.g. because the images may change, you can just eager-load the images right when the page hits (warning: definitely a hack):
(function () {
    // myEagerLoadedImageUrls: an array of image URL strings you define elsewhere.
    var hiddenCache = document.createElement("div");
    hiddenCache.style.display = "none";
    document.body.appendChild(hiddenCache); // run this after <body> exists
    // or a for loop if ECMA 3
    myEagerLoadedImageUrls.forEach(function (urlStr) {
        var hiddenImg = document.createElement("img");
        hiddenImg.src = urlStr;
        hiddenCache.appendChild(hiddenImg);
    });
})();
The browser already caches images in its memory and/or disk cache, as long as the headers coming from the server aren't telling it to avoid caching. The browser cache endures across page loads. So, if your images were loaded once on the first page, they should already be in the browser cache for the second page; when requested there, they should load locally and not have to be fetched over the internet.
If you're looking for client-side code that can be used to preload images, there are many examples:
How do you cache an image in Javascript
Image preloader javascript that supports events
Is there a way to load images to user's cache asynchronously?
FYI, in newer browsers it is possible to use a combination of Local Storage and data URIs to implement your own image caching, but I'd be surprised if there was any real-world situation where that was required; and if you have a lot of images, you may hit Local Storage limits sooner than browser cache size limits.
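For completeness, a hedged sketch of that Local Storage approach (the function and key names are made up for illustration):
// Store an image as a data: URI under the given localStorage key.
function cacheImage(url, key) {
  fetch(url)
    .then(function (res) { return res.blob(); })
    .then(function (blob) {
      var reader = new FileReader();
      reader.onload = function () {
        try {
          localStorage.setItem(key, reader.result);
        } catch (e) {
          // Quota exceeded: fall back to normal network loading.
        }
      };
      reader.readAsDataURL(blob);
    });
}
// Later: use the cached copy if present, the network otherwise.
// img.src = localStorage.getItem('bigImage1') || 'img/big1.jpg';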
Can anyone help? I have been designing a site using JavaScript, but the rest of the HTML content is static, i.e. images etc.
When I load my page in Firefox I have to clear the cache.
I remember a long time ago there was something you could add to the HTML to force a reload.
My question is: is this a good thing? I presume the browser caches for a reason, i.e. to cache images etc., but this causes my pages not to refresh.
And how do I do it?
Really appreciate any feedback.
If you want only the JS to be loaded afresh every time, and leave everything else to load from cache, you can add a version number to the JS include line like so:
<script src="scripts.js?v=5643" type="text/javascript"></script>
Change the version number (?v=num) part each time you change the js file. This forces the browser to get the js file from the server.
Note: Your actual file name will be the same - scripts.js
To disable caching for all files, if you're using Apache, put this in your httpd.conf:
<Directory "/home/website/cgi-bin/">
Header set Cache-Control "max-age=0, no-store"
</Directory>
You can also put a meta tag on your html like so:
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="cache-control" content="no-cache" />
More info on this here
For web pages you can control how the page is cached via the HTTP headers. You should look at Expires if you have a particular date for the cache to expire, or Cache-Control for dynamic expiration based on when the page was requested. Here's a pretty good tutorial that covers how caching works and covers setup on the major web servers.
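For example (the values here are illustrative), a response could include either of:
Expires: Thu, 01 Jan 2026 00:00:00 GMT
Cache-Control: max-age=3600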
Try pressing Ctrl + F5 when you load your page in Firefox; this should clear your browser's cache of the page and reload it cleanly for you.