How to handle deploys with Webpack code splitting?

Here's an unexpected issue I've run into with Webpack code splitting in the wild. Imagine this scenario:
The user loads a React app with Webpack code splitting and a few bundle chunks are loaded.
A deploy happens, and the contents of any future chunks the user might receive from the server are updated (note: the previous chunks get deleted from the server during a deploy).
The user clicks a link and loads a new route, which triggers more bundle chunks to load. Except these new chunks are incompatible with the ones the user's browser has already loaded, and the app breaks with a runtime error.
How can this scenario be prevented?
One possible solution would be to maintain multiple versioned sets of chunks but I'm wondering if there's a simpler solution being used by large-scale apps.
If preload-webpack-plugin is used, all chunks can be prefetched but they will only stay cached for a short time (5 minutes in Chrome).

As Max Stoiber writes on spectrum.chat:
ServiceWorkers come in really handy when doing code splitting!
We use the excellent offline-plugin by @NekR to cache all the current bundles locally, so no matter if the server updates the files or not the ServiceWorker will always serve the files from the local cache. Every hour it will check the server for updates and, if an update is available, download all the fresh bundles from the remote server and cache them locally. The next time the user restarts the app the new version of the app is used! 💯
https://github.com/NekR/offline-plugin
This solution means your app downloads all the chunks up front, which defeats the purpose of code splitting in terms of bandwidth, but at least you still retain the benefit of only parsing the chunks you need to load the app, which for me is significant on slow devices. Also, browser refreshes/caching now involves the Service Worker lifecycle (see "Waiting" at https://developers.google.com/web/fundamentals/primers/service-workers/lifecycle).
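For reference, wiring the plugin into a build looks roughly like this (a sketch based on the plugin's README; paths and options are assumptions, check the repo for details):

// webpack.config.js
const OfflinePlugin = require('offline-plugin');

module.exports = {
  // ... entry, output, etc.
  plugins: [
    // caches all emitted assets locally, so a deploy that replaces
    // files on the server can't break an already-open session
    new OfflinePlugin(),
  ],
};

// in the app's entry point:
require('offline-plugin/runtime').install();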

This problem is extremely well stated.
I will add, though, that "deletion" might not be the right name for what's happening, depending on the setup.
My initial read was that this was a caching problem: old chunk files were being picked up instead of the new ones. That's close to what was happening. In my case I had the following:
index.js
import { lazy } from 'react';

const Page1 = lazy(() => import('./page/Page1'));
const Page2 = lazy(() => import('./page/Page2'));

const main = () => {
  // pathname (not href) so the keys below can actually match
  const Page = {
    '/page1': Page1,
    '/page2': Page2,
  }[window.location.pathname];
  return Page; /* some render/router implementation renders this */
};
V1 Deployed at (https://my-domain/distribution_folder/*)
User would load V1 index.js
V2 Deployed at (https://my-domain/distribution_folder/*)
User (who hadn't refreshed) would dynamically load a chunked route using their cached V1 index.js file.
Request would be sent to (https://my-domain/distribution_folder/{page_name}.{chunk_hash}.js)
A chunk error would occur because that unique chunk would no longer be there.
It's interesting because the provider being used was migrating traffic to the new version, so I thought that would be the end of it. What I wasn't realizing was that any user could still be using a previously deployed version. How would they know? They're already using the application; the browser has already downloaded it (index.js).
The solution really depends on where you're dynamically importing these chunks. In the case above, since they're page routes, we can do a hard refresh when the user requests a different page and we can't find its chunk (see the sketch below). This assumes, however, that your Cache-Control headers are set up correctly. For example:
index.js -> Cache-Control: no-store
page/{page_name}.{chunk_hash}.js -> Cache-Control: public,max-age=31536000,immutable
We can make these chunks immutable because sometimes they don't change between releases, and if they don't change, why not use the cached version? However, index.js cannot be stored in the cache, because it is the "router" that dynamically loads the content and it will always change.
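To illustrate the hard-refresh idea, here is a minimal sketch (mine, not from the original answer; the helper name and sessionStorage key are made up) that reloads the page once when a lazily imported chunk fails to load:

import { lazy } from 'react';

// Wrap React.lazy so a missing chunk (e.g. after a deploy) triggers
// one full reload, which fetches the fresh index.js "router".
const lazyWithReload = (importFn) =>
  lazy(() =>
    importFn()
      .then((module) => {
        sessionStorage.removeItem('chunk-reloaded'); // reset on success
        return module;
      })
      .catch((error) => {
        // reload at most once to avoid an infinite reload loop
        if (!sessionStorage.getItem('chunk-reloaded')) {
          sessionStorage.setItem('chunk-reloaded', '1');
          window.location.reload();
        }
        throw error;
      })
  );

const Page1 = lazyWithReload(() => import('./page/Page1'));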
Pros
No more chunk load errors
We don't need to load everything on first page load
Less complexity by not having a service worker
Cons
This approach forces a refresh for users

If the chunk filenames are hashed, wouldn't the old route link to the old hashed chunk (which presumably would still be available) and load everything fine?

https://webpack.js.org/guides/caching/#output-filenames
A simple way to ensure the browser picks up changed files is by using output.filename substitutions. The [hash] substitution can be used to include a build-specific hash in the filename, however it's even better to use the [chunkhash] substitution which includes a chunk-specific hash in the filename.
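For example, a minimal webpack configuration using that substitution might look like this (a sketch; the entry and output paths are assumptions):

// webpack.config.js
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    // [chunkhash] changes only when a chunk's content changes, so
    // unchanged chunks keep their filenames and stay cached
    filename: '[name].[chunkhash].js',
    path: path.resolve(__dirname, 'dist'),
  },
};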

Related

How to refresh in SPA (react, vue)?

If the source code of the SPA project is changed, you must deploy it to the server again.
After the service is deployed, browsers will continue to load the cached JS unless the page is refreshed. How do I fix this?
Say all of your SPA code is in two files: vendor.js and app.js. To miss the cache when they are updated, typically what is done is to compute a hash of each file's contents and put it in the file name: vendor.<truncated md5 hash>.js and app.<truncated md5 hash>.js. Each time you build the project (assuming you changed at least one line), you get a new hash, therefore a new filename, and the browser misses the cache.
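As a concrete illustration (my own sketch, with assumed file paths), this is essentially what bundlers do when they hash filenames:

// hash-rename.js -- compute a truncated md5 of the bundle contents
// and bake it into the filename (run after each build)
const crypto = require('crypto');
const fs = require('fs');

const contents = fs.readFileSync('dist/app.js');
const hash = crypto.createHash('md5').update(contents).digest('hex').slice(0, 10);
fs.renameSync('dist/app.js', `dist/app.${hash}.js`);
console.log(`app.js -> app.${hash}.js`);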

Reload from server for all files

I know it's possible to force reload from server using location.reload(true). However, let's say I used that to refresh index.html. If index.html loads a bunch of javascript files, those are still coming from the cache for me. Is there any way to ignore the cache for the duration of a request?
My use case is that I'm doing AB testing on my app, and want to provide a way for users to go back to the old version if something isn't working. But some of the URLs are the same, even though the files between versions are different. It would be nice to be able to handle this in JS rather than having to change every URL on the new version.
There are actually at least 535 different ways to reload a page via JavaScript, FYI ;).
Have you tried putting document in front: document.location.reload(true)?
Try also this other option:
window.location.href = window.location.href;
or
history.go(0);
Sure, both are soft reloads, but they seem to work in certain situations.
If nothing works, you have to append random data to the URL (like a timestamp) to force the download from the server, bypassing the cache.
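A sketch of that last option (the nocache parameter name is arbitrary):

// Reload with a unique query string so the browser can't serve
// the document from its cache
const url = new URL(window.location.href);
url.searchParams.set('nocache', Date.now().toString());
window.location.href = url.toString();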
If you want to stop the browser from taking JS files from the cache, the server needs to serve not just files like script.js but rather script.12345.js. When you update the file on the server, you change its hash, to say script.54321.js, and the browser understands that the file is different and must be downloaded again. You can use Webpack to automate this: in output, instead of { filename: 'bundle.js' } you write { filename: 'bundle.[hash].js' }.

When does Chrome clear the disk cache?

I have a website which serves listener.js on its main page. I want to update this JavaScript file with some extra code, but browsers (especially Chrome) have a memory cache and a disk cache, as well as the HTTP cache of course. I tested this behavior: on a plain F5, the file loaded from the memory cache. Then I killed Chrome and opened the website again, and the JavaScript file loaded from the disk cache. So I have two questions:
When does Chrome clear the disk cache?
How can I tell my visitors' browsers not to use any cache and to get the new JavaScript file from my server?
Update:
Can I do this with a no-cache HTTP header?
Forcing the browser to drop a temporarily cached file is known as cache busting. Caching itself is useful because the browser doesn't have to download the same files again.
If it is causing issues, developers can force browsers to download new files. This can be done by renaming the file, but there is a better way:
src="js/listener.js" => src="js/listener.js?v=2"
Update:
Or use a hash like this => ?v=c298c7f8233d, which is better than ?v=2 (comment by Tech Guy)
(Credits: 30-seconds)
Chrome doesn't automatically clear the disk cache unless this option is checked:
Privacy settings > Content settings > Keep local data only until you quit browser
in which case it deletes the cache when the browser is closed.
You usually bust the cache by hashing your filenames on each build, which is the most common cache-busting technique. That means every release produces new file names, so the old cached files no longer matter. Most build tools like Webpack have cache-busting features that you can turn on.
You don't want to stop the user from caching at all, because caching is immensely useful and prevents repeated downloads. You just want the browser to download fresh files when you ship a new release.
This solution worked for me.
// Append a random number so each request is treated as new
const script = document.createElement('script');
script.src = 'js/listener.js?' + Math.round(Math.random() * 10000);
document.head.appendChild(script);
A random number is generated on every load, so the request is treated as new and won't be served from the cache.

Forcing browser to download a file rather than caching

I have a JavaScript file which internally calls a function to load an XML file.
$(document).ready(function () {
  var urlVal = "web/help.xml";
  // ... code that loads the XML file from urlVal ...
});
The JavaScript file is versioned so that the browser always loads it instead of caching it:
"./js/help_min.js?ver=${verNumber}"
I am facing an issue where the browser downloads the latest JS file but uses a cached copy of the help.xml that the JS file loads.
Is there a way to make the browser always load the latest help.xml rather than caching it?
The proper approach would be to fix the backend to send headers telling the browser not to cache the data (see e.g. How to control web page caching, across all browsers?). But if you cannot do that, make the request unique each time, i.e.
"./js/help_min.js?ver=${verNumber}&random=${something_random}"
where the value of something_random can be e.g. the current timestamp (with millis). That way the request will never match a cache entry, forcing a fetch from the server each time.
PS: you also seem to have a design flaw, as by that logic the same ${verNumber} should return the same data, in which case caching would be more than welcome to reduce traffic and speed up loading.
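Since the stale resource here is the XML fetched from inside the script, another option (my own sketch, assuming the XML is requested with jQuery) is to bust the cache on that request itself:

$(document).ready(function () {
  // cache: false tells jQuery to append a "_=<timestamp>" parameter,
  // so this request never matches a browser cache entry
  $.ajax({
    url: "web/help.xml",
    dataType: "xml",
    cache: false,
    success: function (xml) {
      // use the freshly fetched XML here
    }
  });
});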

swPrecache and CDN as proxy?

I am using swPrecache to load the static assets of my PWA and support offline mode. It's working great. My setup is something like this:
https://www.myexampledomain.com/myapp/ loads a static index.html, which in turn uses swPrecache to load static assets like JS, images, CSS, etc. Mind you, these are all loaded from the same domain, e.g. www.myexampledomain.com/myapp/js/file1.js.
But my swPrecache list has a decent number of files and takes some time to download on a slower internet connection. FYI, I am already delaying the service worker registration to something like the "load" event.
So here is what I am trying now; I need someone to validate whether this is possible:
https://www.myexampledomain.com/myapp/ loads the static HTML files as before.
Have swPrecache intercept the static requests that go to the app domain (e.g. https://www.myexampledomain.com/myapp/js/file1.js) and instead fetch them from a CDN endpoint (e.g. https://some.cloudfront.com/myapp/js/file1.js).
Once downloaded, swPrecache continues to work as usual.
So essentially I am hoping to have swPrecache proxy the static asset requests to a CDN to make the initial load faster.
Any comments/pointers on this will help.
You can use the stripPrefixMulti option in sw-precache to change the URLs that are written to your service worker file. It's fairly brute-force, though, so it helps if there's a common prefix that is shared by all the assets that will be served from the CDN.
For example, if everything that will be served off of the CDN is stored in a local assets/ directory, and their paths on the CDN will start with https://my-cdn.com/assets/, you can use
{
stripPrefixMulti: {'assets/': 'https://my-cdn.com/assets/'},
// ... other sw-precache options...
}
You'll want to make sure that whenever a local file changes as part of your build process the copy of the file on the CDN also changes immediately, or you'll run the risk of the versioning info generated for the local files being out of sync with what's on the CDN.
Yes, you can use
{
stripPrefixMulti: {'assets/': 'https://my-cdn.com/assets/'},
// ... other sw-precache options...
}
but I'm sure you will face a new CORS problem. I'm working on it too.
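For what it's worth, that CORS issue usually means the CDN must respond with a header allowing the app's origin, in the spirit of the header examples earlier (origin values assumed from the question):
https://some.cloudfront.com/myapp/js/file1.js -> Access-Control-Allow-Origin: https://www.myexampledomain.com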
