We are currently using Webpack with HtmlWebpackPlugin to generate the JavaScript builds for our web page.
const HtmlPlugin = require('html-webpack-plugin');

// ...inside the webpack config's plugins array:
new HtmlPlugin({
  template: 'www/index-template.html', // source path, relative to the project root
  filename: 'index.html',              // output path, relative to output.path
  hash: true,                          // append the compilation hash to the bundle URLs
  cache: true                          // only emit a new file if something changed
}),
This causes a hash to be added to the query string of the bundled javascript file.
<script type="text/javascript" src="/build/vendor.min.js?4aacccd01b71c61e598c"></script>
<script type="text/javascript" src="/build/client.min.js?4aacccd01b71c61e598c"></script>
When using any standard desktop or mobile browser, new builds are cache busted properly and the new version of the site is loaded without any effort from the user. However, we also have a chrome web app implementation where we call:
chrome.exe --app=http://localhost:65000 --disable-extensions
In this application, the hash on the end of the JavaScript bundle URL doesn't bust the cache for some reason. We have to manually right-click somewhere on the page and click Reload (or press F5) to pick up a new build.
I suspect that the app may be caching the index.html file itself, which would mean it never receives the updated hash on the bundle URLs. I'm not sure how to solve that issue, though, if that is the case.
I have also noticed that if our localhost server is down, the page still loads as if the server were running. This indicates to me some kind of offline cache. I checked the manifest.json parameters and can't find anything to force a reload.
I have also tried these Chrome command-line switches, none of which helped: --disk-cache-size=0, --aggressive-cache-discard, --disable-offline-auto-reload.
Another caveat is that we need to retain the users' localStorage data and cookies. In a standard browser window (any browser, for that matter) cache busting works just fine, but not when the page runs inside the Chrome web app.
Are you talking about a "Progressive Web App" with service workers? If so, then the HTML file can (and should) be cached on first download, and you need some sort of aggressive update process on the client to ensure new files are loaded properly.
Perhaps an API call that checks some sort of dirty flag on the server could work: if it comes back true, the client reloads the template files. Or something more complex, where the client gets an array of dirty files from the server so it knows which ones to reload instead of reloading everything. Just some ideas.
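As a rough sketch of the simpler idea (the /api/version endpoint and the appVersion key are made up for illustration, not part of your setup), the client could poll a version endpoint and force a page reload when the server reports something newer; because only the page reloads, localStorage and cookies are left intact:

// Hypothetical update check: ask the server for the current build version and
// force a full page reload if it differs from the version we last saw.
const VERSION_KEY = 'appVersion';

async function checkForUpdate() {
  const res = await fetch('/api/version', { cache: 'no-store' }); // hypothetical endpoint
  const { version } = await res.json();
  const known = localStorage.getItem(VERSION_KEY);
  localStorage.setItem(VERSION_KEY, version);
  if (known && known !== version) {
    window.location.reload(); // pick up the new index.html and bundles
  }
}

setInterval(checkForUpdate, 60 * 1000); // check once a minute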
Since your page works without the server running at localhost, I suspect that your app is offline-first. That is done through service workers (as pointed out by @Chad H), which are officially supported by Chrome and still experimental in some other browsers, so expect different behavior elsewhere. To bust the cache:
In Production
For a permanent solution, you need to find and modify the service worker (SW) code. Deletion of old caches happens only in the SW's activate event.
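As a rough illustration of what that usually looks like (the cache name 'static-v2' is invented; yours will be whatever the SW actually uses), the activate handler is where stale caches get purged:

// sw.js (sketch only; the cache name is hypothetical)
const CURRENT_CACHE = 'static-v2';

self.addEventListener('activate', function (event) {
  event.waitUntil(
    caches.keys().then(function (keys) {
      // Delete every cache that is not the current one so stale assets disappear.
      return Promise.all(
        keys
          .filter(function (key) { return key !== CURRENT_CACHE; })
          .map(function (key) { return caches.delete(key); })
      );
    }).then(function () { return self.clients.claim(); })
  );
});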
You can also read more about service workers and ask a new question with the updated SW code. Also, check out this resolved issue, which dealt with a problem similar to yours.
For dev setup
You can use the "Disable cache" option under the Network tab in Chrome DevTools (it works only while DevTools is open) or use a more robust Chrome extension called Cache Killer.
Related
Last year we put online a website using Gatsby that made use of a service worker (SW) for caching files. A month ago, we put online a complete rework of this website, using a completely different tech stack (ie: not using Gatsby and SW anymore). The new version runs on Webflow now.
My problem is that the previous service worker is still installed in our customers' browsers. Any previous visitor sees the old version of the website because the old SW cached everything.
I have no way of pushing another JS script to unregister the SW, because all the JS files are cached by that same SW! This old version of the website seems to run completely offline now, and I don't see a way of removing this old SW and the associated cache (without asking my customers to remove the SW manually themselves, of course).
Does anyone have any ideas? :D
EDIT with recent observations:
The solution is certainly to provide a new sw.js file.
The easy way would have been to have Webflow (where the new website is hosted) serve a new sw.js file. Unfortunately, it does not seem to be possible (see here).
But I could 301-redirect sw.js to a different location. Unfortunately (again), browsers don't accept redirects when fetching a service worker script.
So I am still stuck there.
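For what it's worth, if a file can ever be served again from the old sw.js URL (same origin, same path, no redirect), a "kill switch" worker along these lines should remove the old one; this is only a sketch and I have not tested it against a Webflow setup:

// Replacement sw.js ("kill switch"): takes over from the old worker,
// deletes every cache, unregisters itself, and reloads open clients.
self.addEventListener('install', function () {
  self.skipWaiting();
});

self.addEventListener('activate', function (event) {
  event.waitUntil(
    caches.keys()
      .then(function (keys) {
        return Promise.all(keys.map(function (key) { return caches.delete(key); }));
      })
      .then(function () { return self.registration.unregister(); })
      .then(function () { return self.clients.matchAll({ type: 'window' }); })
      .then(function (clients) {
        clients.forEach(function (client) { client.navigate(client.url); });
      })
  );
});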
I use a number of CDN links in my web application for JavaScript and CSS, e.g.:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.3/jquery.min.js"
integrity="sha384-I6F5OKECLVtK/BL+8iSLDEHowSAfUo76ZL9+kGAgTRdiByINKJaqTPH/QVNS1VDb"
crossorigin="anonymous"></script>
Usually everything works fine, but sometimes I get this message in Firebug console:
None of the "sha256" hashes in the integrity attribute match the content of the subresource.
If that happens, my JavaScript doesn't load and my application is broken. A simple refresh resolves it. Rather than getting rid of the CDN links and hosting the files myself, I would like to fix this. Is this a common problem?
One possible explanation for this is if your system time is sufficiently off. I was running Debian in a VirtualBox instance. I hibernated the host machine a few times without touching the VM again. That's when I noticed certain Web pages weren't loading properly in Firefox within the VM. Once I got here it occurred to me to check the system time. Sure enough it was off by nearly 2 hours. ntp was not installed so I installed that package: sudo aptitude install ntp. I verified that the date/time was updated with date, and then tested Firefox again. The problematic Web pages (including this one) worked.
Make sure your network connection is working, and if your browser is set up to use a network proxy, make sure the proxy is working too.
I was seeing this message while loading HTML locally (e.g., File -> Open File) in a browser where the integrity check failed because I did not have the network proxy (via SSH tunnel) working at the time. As soon as I restored my network connectivity, the page would load and these messages would go away (assuming, of course, that the integrity attribute values are correct).
I'm learning Angular and have cloned the repository here. I've installed the dependencies through npm and have the web server running. I can load the page at localhost:4000.
If I make a change to index.html (a simple text change), I can see the result when refreshing my browser. But if I make a change to an HTML page that's loaded as an Angular directive template, the changes don't appear in my browser (Chrome, Firefox). I tried F5, Ctrl+F5, Shift+F5, etc. Even restarting the web server doesn't do anything.
Is there something I need to set up in the angular code so that refreshes work properly?
https://github.com/codeschool/WatchUsBuild-ReadingListAppWithAngularJS
Note: this is an Angular 1.x project.
Should I blame caching?
It's cached in your browser. Simply keep your dev tools open and, under the Network tab, check "Disable cache".
Note: this works only while the dev tools are open, not otherwise.
I can also recommend live-server, which detects changes and reloads the browser automatically.
One more thing: Angular itself uses a template cache by default, so that can also cause the problem; in that case you need to rebuild your app when templates change (see the sketch below).
Read about the template cache.
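If the template cache turns out to be the culprit, something along these lines clears it on every route change so edited partials are fetched again; this is a dev-only sketch and assumes AngularJS 1.x with ngRoute and a module named 'app':

// Dev-only: wipe Angular's $templateCache whenever the route changes,
// so modified directive/partial templates are re-fetched from the server.
angular.module('app').run(['$rootScope', '$templateCache', function ($rootScope, $templateCache) {
  $rootScope.$on('$routeChangeStart', function () {
    $templateCache.removeAll();
  });
}]);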
Yes, that happens with Angular because browsers usually cache the pages: when you change the HTML and then refresh, the browser loads the cached page instead. It doesn't happen every time, but it does most of the time. So try clearing the browser's cache and then loading the page; it should work correctly.
Angular 2 and Ember have a mechanism of watchers that look for changes you make to files; whenever a change is detected, they recompile the files and load a fresh copy for you. In Angular 1 I don't think there is such a mechanism, and I have run into this problem myself a lot. This is the workaround I have come across so far; hopefully someone else has a better solution.
This is not a programming question per se. I am using a free web host called getfreehosting. I am using their online file manager to transfer files. From time to time, the changes I make on source code do NOT reflect immediately after I upload them. I.e. when I run my application on Chrome, then go to view page source, I realize the JavaScript running is still the old version! In most cases this doesn't happen but when it does it is extremely frustrating. I've tried clearing the browser's cache. I even tried editing the file directly on their servers. Sometimes it solves the problem but other times it doesn't.
Is this a common issue encountered when transferring files to a web host? Or perhaps this is one of the downsides of using a free web host?
Thanks.
You can try clearing your browser's cache, or the ol' CTRL+F5 refresh trick. Otherwise, the hosting provider may be using a caching layer to help ease resource usage.
It is the responsibility of the server to indicate to the browser what the cacheable lifetime of the script files is when they are served (1 hr, 1 day, 1 month, etc.). This is a server-side setting.
Caching is very important for both server-side efficiency and client-side performance so you don't want to defeat it completely.
You can either shorten the server-side setting for the cache lifetime, or you can use a version number in your script filenames (like jQuery does): when you revise a script file, give it a new filename such as "myscript-v12.js" and update the corresponding HTML files to refer to the new filename. Then, as soon as the browser gets the new HTML file, it is guaranteed to get the new JS file, because the new filename could never have been in the browser cache.
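To make that concrete, here is a sketch of the server-side half, assuming a Node/Express server purely for illustration (any web server has equivalent settings): the HTML is always revalidated, while the versioned JS files can be cached for a long time because every release ships under a new filename:

// Illustration only (Express assumed). HTML must be revalidated on every
// request; versioned JS files (e.g. myscript-v12.js) are cached for a year.
const express = require('express');
const app = express();

app.use('/js', express.static('public/js', { maxAge: '365d', immutable: true }));

app.get('/', function (req, res) {
  res.set('Cache-Control', 'no-cache'); // browser must check with the server each time
  res.sendFile(__dirname + '/public/index.html');
});

app.listen(8080);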
If this is just an issue for you personally while developing and revising your site, then just clear your browser cache after you upload new files and then when your browser loads that page, it won't have any version in the cache and will be forced to get the new version from the server.
There is a cache in modern browsers.
Try clearing the cache before you browse your web site.
I version all of my client side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to ensure they get the updated files. But every time I push an update, without fail, at least one person runs into a problem where they are running the old version and get some kind of error. I used to think that this was just them having already been on the page during the release and just needing to reload the browser, but this happened to me today when I was definitely not previously on the page. I browsed to the live site and was running the old code. I needed to do a browser refresh on that page to get the new file.
What can cause this?
PS I was using Chrome on Win7, but I have seen clients report this before on all different browsers.
If your main web page can also be cached, then the old version of that page can be requesting the old version of the JS file. JS file versioning works best if the page that actually refers to the JS file cannot be cached, or has a very short caching time.
I agree with jfriend00 that the web page itself is being cached and is therefore requesting the old JavaScript version.
To prevent this, you can have the JavaScript file loaded by an AJAX request, either asking the server for the current (latest) version number to download, or requesting the JavaScript itself and inserting it, e.g. in the head of the page.
Edit: see for example here
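As a sketch of that idea (the /api/script-version endpoint and response shape are invented for illustration, and a plain fetch is used instead of a POST, but the principle is the same): ask the server which version is current, then inject a script tag for that exact version, so the cached HTML never has to hard-code it:

// Ask the server for the current script version, then load that exact file.
fetch('/api/script-version', { cache: 'no-store' })
  .then(function (res) { return res.json(); })
  .then(function (info) {
    var script = document.createElement('script');
    script.src = '/js/myfile.js?v=' + encodeURIComponent(info.version);
    document.head.appendChild(script);
  });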
I make a quick AJAX request to the server for the version it expects them to have, then force them to refresh the page if the client's script is old.
It seems that a proxy or some load balancer is serving old content instead of the new version. Also check the IIS/web server settings to see how these files are cached and expired.
You can check what is going on on the wire with tools like Fiddler.