Service worker cached all files, no way to update

Last year we launched a website built with Gatsby that used a service worker (SW) for caching files. A month ago, we put online a complete rework of this website using a completely different tech stack (i.e. no longer using Gatsby or a SW). The new version runs on Webflow now.
My problem is that the previous service worker is still installed in our customers' browsers. Any previous visitor sees the old version of the website because the old SW cached everything.
I have no way of pushing another JS script to unregister the SW, because all the JS files are served from the cache by this same SW! The old version of the website now effectively runs completely offline, and I don't see a way of removing this old SW and the associated cache (without asking my customers to remove the SW manually themselves, of course).
Does anyone have any idea :D
EDIT with recent observations:
The solution is certainly to provide a new sw.js file.
The easy way would have been to have Webflow (where the new website is hosted) serve a new sw.js file. Unfortunately, it does not seem to be possible (see here).
But I could 301-redirect the sw.js URL to a different location. Unfortunately (again), browsers don't follow redirects when fetching a service worker script.
So I am still stuck there.
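For reference, if a file could ever be served again from the original sw.js URL, the standard fix is a "kill switch" worker that wipes the old caches and unregisters itself. This is only a sketch, untested against this particular setup; the logic is wrapped in a helper function purely so it can be exercised with a stub worker global, and in a real sw.js the last line hands it the worker's own scope (`self`):

```javascript
// A "kill switch" sw.js: on activation it deletes every cache the old
// worker created and then unregisters itself.
function installKillSwitch(workerGlobal) {
  workerGlobal.addEventListener('install', () => {
    // Activate immediately instead of waiting for old tabs to close.
    workerGlobal.skipWaiting();
  });
  workerGlobal.addEventListener('activate', (event) => {
    event.waitUntil(
      workerGlobal.caches
        .keys()
        .then((keys) => Promise.all(keys.map((key) => workerGlobal.caches.delete(key))))
        // With caches gone and the registration removed, the next page
        // load goes straight to the network.
        .then(() => workerGlobal.registration.unregister())
    );
  });
}

if (typeof self !== 'undefined') installKillSwitch(self);
```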

Related

What should I do to update service workers and cached PWA files

In my current system, whenever I make a JavaScript bundle I add a hash to the file name for production builds. I have no CDN caching on my index.html, so whenever a new build comes out, a new index.html is created pointing to the new hashed JavaScript file names. This works well for invalidating the cache of the bundles whenever a new deployment happens (at the cost of my very small index.html not being cached).
I was looking into making my app work as a PWA and I was wondering how to invalidate the cache for existing users when I deploy new builds of my app. Here are the problems:
1. The serviceworker.js needs to have a fixed file name (no hash in the filename), meaning my CDN will cache that file for a long period, and I don't want to manually invalidate the CDN cache on every deploy. I guess I could disable caching on serviceworker.js like I do for index.html, since that file is downloaded only once anyway.
2. My worker gets registered in the user's browser, and I am unsure if I need to add some extra code to check whether a new version of the worker is available. How does the browser decide when to try to update a registered serviceworker? Can I unregister my service worker to install a new version in my PWA? Won't that break the PWA?
3. My index.html is cached by the service worker, meaning people using the PWA will never get new versions of my main JS bundles (because the cached index.html will point to the old bundles via the hashes in the filenames). I guess I could just remove the service worker cache on index.html, but then the app won't work offline?
4. If I want to use my app offline, I need my service worker to be able to cache my hashed JS files. I guess I need to do some magic on the filenames in self.addEventListener('fetch', ...) to use the cached version if available and periodically check for new versions when internet is available, getting the hashed JS bundle file names from an uncached static JSON file or something that lists the latest filenames. Seems like a hackish solution though.
I can't seem to find good guides on how to handle these problems, feels like a lot of work for something the browser should be able to do for me based on some options (like retry every X amount of time). Is there some magic HTTP header I am not aware of?
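On the CDN question, a common pattern is to serve serviceworker.js (and index.html) with a Cache-Control header that forces revalidation against the origin, while the content-hashed bundles keep long cache lifetimes. A minimal sketch of that header policy, with illustrative paths and values:

```javascript
// Decide the Cache-Control header per path: files whose names never change
// must always revalidate; content-hashed bundles may be cached essentially
// forever, because any change produces a new filename.
function cacheControlFor(urlPath) {
  if (urlPath === '/serviceworker.js' || urlPath === '/index.html') {
    // Make sure neither the CDN nor the browser serves these stale, so the
    // service worker update check and the bundle references stay current.
    return 'no-cache, max-age=0, must-revalidate';
  }
  return 'public, max-age=31536000, immutable';
}

// e.g. in a Node http handler (sketch):
//   res.setHeader('Cache-Control', cacheControlFor(req.url));
```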
I had the same problem and solved it today. Your index.html should be cached by the service worker so the app works offline. When a user opens your PWA, all files are loaded from the cache; after loading completes, the browser re-fetches the service worker from the network, and if there is a new service worker, all new files are downloaded, the new worker is installed, and it waits to become active. You need to refresh your PWA in order to activate the new service worker and actually use the updated app.
I used this article to write my new service worker (replacing the default service worker in create-react-app). Detecting the installation of a new service worker is also explained in the article, so you can inform the user that an update is available and refresh the PWA.
I hope this can help you.
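The update flow described above can be sketched from the page side: watch the registration for a newly installed worker and notify the user so they can refresh. This is only an illustration, not the article's exact code; the detection logic is factored into a function so it can be exercised with a stub registration:

```javascript
// Invoke onUpdate when a new worker reaches the 'installed' state while an
// old worker is still controlling the page, i.e. an update is waiting.
function watchForUpdate(registration, hasExistingController, onUpdate) {
  registration.addEventListener('updatefound', () => {
    const newWorker = registration.installing;
    newWorker.addEventListener('statechange', () => {
      if (newWorker.state === 'installed' && hasExistingController) {
        onUpdate();
      }
    });
  });
}

// Browser usage (sketch):
if (typeof navigator !== 'undefined' && 'serviceWorker' in navigator) {
  navigator.serviceWorker.register('/service-worker.js').then((reg) => {
    watchForUpdate(reg, Boolean(navigator.serviceWorker.controller), () => {
      if (confirm('A new version is available. Reload now?')) {
        window.location.reload();
      }
    });
  });
}
```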

Cache busting in an Offline-First Web Application

We are currently using Webpack with the HtmlWebpackPlugin to generate our javascript builds for our webpage.
new HtmlPlugin({
  template: 'www/index-template.html', // source path - relative to project root
  filename: 'index.html', // output path - relative to outpath above
  hash: true,
  cache: true // only emit new bundle if changed
}),
This causes a hash to be added to the query string of the bundled javascript file.
<script type="text/javascript" src="/build/vendor.min.js?4aacccd01b71c61e598c"></script>
<script type="text/javascript" src="/build/client.min.js?4aacccd01b71c61e598c"></script>
When using any standard desktop or mobile browser, new builds are cache busted properly and the new version of the site is loaded without any effort from the user. However, we also have a chrome web app implementation where we call:
chrome.exe --app=http://localhost:65000 --disable-extensions
In this application, the hash appended to the JavaScript build's URL doesn't bust the cache. We have to manually right-click somewhere on the page and then click reload (or press F5); for some reason the cache isn't busted inside the web application.
I was thinking that possibly it is caching the index.html file maybe? That may cause the app to never receive the updated hash on the build. I'm not sure how to solve that issue though if that is the case.
I have also noticed that if our localhost server is down, the page still loads as if the server were running. This indicates to me some kind of offline cache. I checked the manifest.json parameters and can't find anything to force a reload.
I have also tried these chrome command line switches which did not help either: --disk-cache-size=0, --aggressive-cache-discard, --disable-offline-auto-reload.
Another caveat is that we need to retain the localStorage data and the cookies. In a standard browser window, or any browser for that matter, it works just fine, but not when it runs inside a Chrome web app.
Are you talking about a "Progressive Web App" with service workers? If so, then the HTML file can (and should) be cached on first download. You need some sort of aggressive update process on the client to ensure new files are loaded properly.
Perhaps having an api call that checks some sort of dirty flag on the server could work, and if it comes back true, it should reload the template files. Or something more complex where it gets an array of dirty files from the server so it knows which ones to reload instead of loading everything. Just some ideas.
Since your page works without the server running at localhost, I suspect that your app is offline-first. This is done exactly through service workers (as pointed out by @Chad H), which are officially supported by Chrome and experimental in other browsers, so expect different behavior in other browsers. To bust the cache:
In Production
For a permanent solution, you need to find and modify the service worker (SW) code. Deletion of old caches happens only in the activate event of the SW.
You can also read more about Service worker and ask a question with the updated SW code. Also, check out this resolved issue that faced a problem similar to yours.
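The activate-event cleanup mentioned above usually looks something like the following sketch. CACHE_NAME stands in for whatever versioned cache name the build stamps into the worker (the name is illustrative), and the pruning rule is factored into a pure helper so it is easy to test:

```javascript
const CACHE_NAME = 'app-cache-v2'; // illustrative versioned cache name

// Every cache that is not the current deployment's cache is stale.
function staleCaches(allNames, currentName) {
  return allNames.filter((name) => name !== currentName);
}

if (typeof self !== 'undefined') {
  self.addEventListener('activate', (event) => {
    // Delete caches left behind by previous deployments before this
    // worker starts serving fetches.
    event.waitUntil(
      caches.keys().then((names) =>
        Promise.all(staleCaches(names, CACHE_NAME).map((name) => caches.delete(name)))
      )
    );
  });
}
```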
For dev setup
You can use the Disable Cache option under the Network tab in Chrome DevTools (works only while DevTools is open) or a more robust Chrome extension called Cache Killer.

source code changes do NOT reflect immediately upon uploading them on my web host

This is not a programming question per se. I am using a free web host called getfreehosting and their online file manager to transfer files. From time to time, the changes I make to the source code are NOT reflected immediately after I upload them, i.e. when I run my application in Chrome and go to view the page source, I realize the JavaScript running is still the old version! In most cases this doesn't happen, but when it does it is extremely frustrating. I've tried clearing the browser's cache. I even tried editing the file directly on their servers. Sometimes this solves the problem, but other times it doesn't.
Is this a common issue encountered when transferring files to a web host? Or perhaps this is one of the downsides of using a free web host?
Thanks.
You can try clearing your browser's cache, or the ol' CTRL+F5 refresh trick. Otherwise, the hosting provider may be using a caching layer to help ease resource usage.
It is the responsibility of the server to indicate to the browser what the cacheable lifetime of the script files are when they are served to the browser (1 hr, 1 day, 1 month, etc...). This is a server side setting.
Caching is very important for both server-side efficiency and client-side performance so you don't want to defeat it completely.
You can either shorten the server-side setting for the cache lifetime, or you can use a version number in your script filenames (like jQuery does), so that when you revise your script files you give them a new filename like "myscript-v12.js" and update the corresponding HTML files to refer to the new filename. Then, as soon as the browser gets the new HTML file, it is guaranteed to get the new JS file, because the new filename could never have been in the browser cache.
If this is just an issue for you personally while developing and revising your site, then just clear your browser cache after you upload new files and then when your browser loads that page, it won't have any version in the cache and will be forced to get the new version from the server.
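The filename-versioning scheme described above can be sketched as a tiny build-time helper (the helper name and URL layout are illustrative, not tied to any particular tool):

```javascript
// Bake a version (or content hash) into the script filename and emit the
// matching script tag for the HTML. A new version yields a new filename,
// so a stale copy can never be served from the browser cache.
function versionedScriptTag(baseName, version) {
  return `<script src="/js/${baseName}-v${version}.js"></script>`;
}
```

At deploy time the HTML is regenerated with the new tag, so only the small HTML file needs a short cache lifetime.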
Modern browsers have a cache system. Try clearing the cache before you browse your website.

Versioning Javascript Files to Prevent Unnecessary Cache Clearing

I version all of my client side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to ensure they get the updated files. But every time I push an update, without fail, at least one person runs into a problem where they are running the old version and get some kind of error. I used to think that this was just them having already been on the page during the release and just needing to reload the browser, but this happened to me today when I was definitely not previously on the page. I browsed to the live site and was running the old code. I needed to do a browser refresh on that page to get the new file.
What can cause this?
PS I was using Chrome on Win7, but I have seen clients report this before on all different browsers.
If your main web page can also be cached, then the old version of that page may request the old version of the JS file. JS file versioning works best if the page that actually refers to the JS file cannot be cached, or has a very short caching time.
I agree with jfriend00 that the web page itself is being cached and thus requesting the old JavaScript version.
To prevent this, you can have the JavaScript file loaded by an AJAX (POST) request, either asking the server for the latest version number to download, or requesting the JavaScript itself and inserting it, e.g. into the head of the page.
Edit: see for example here
I make a quick AJAX request to the server for the version it expects them to have, then force them to refresh the page if the client's script is old.
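The version-check approach above can be sketched as follows. CLIENT_VERSION and the '/api/version' endpoint are illustrative names, not an existing API; the comparison is factored out so it can be tested:

```javascript
// Version the build stamps into the client bundle (illustrative value).
const CLIENT_VERSION = '3903948';

// Reload whenever the server reports a different build than the one running.
function needsReload(clientVersion, serverVersion) {
  return clientVersion !== serverVersion;
}

// Browser usage (sketch):
//   fetch('/api/version')
//     .then((res) => res.text())
//     .then((serverVersion) => {
//       if (needsReload(CLIENT_VERSION, serverVersion)) {
//         window.location.reload();
//       }
//     });
```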
It seems that a proxy or load balancer is serving old content instead of the new. Also check the IIS/web server settings for how these files are cached/expired.
You can check what is going on on the wire with tools like Fiddler.

Can I host Facebook's all.js locally?

I've been noticing that sometimes my Facebook app runs slowly, and when I checked, it was because the all.js file was not being loaded from the Facebook server, so I copied the file onto my server and tested it.
Everything seems to work fine, and it actually runs faster. My question is: do you know if there are bugs or errors in doing this?
The problem here is that now you're shifting a dependency, and by extension the maintenance of that dependency to your local application. If it's hosted on Facebook's servers, they can update it to fix bugs or add features.
If it's taking a long time to load, you should bring it up on their support forums.
Your page has to load the all.js file in any case.
Facebook's servers should be faster than the server which hosts your website, so theoretically loading the JS file from Facebook should be faster.
A better approach would be to cache the file for some time. This will make the page loads after the initial one much, much faster.
As people have mentioned, the all.js file is updated constantly with bug-fixes etc. So, it is always better to get the newest version of the file instead of manually updating it on your server after some time.
You can run into problems when Facebook updates the API. You would need to update the file regularly and frequently (every 5 min?).
