If the source code or a function in the SPA project is changed, it has to be deployed to the server again.
But after the new version is deployed, the browser keeps loading the cached JS files unless it is force-refreshed. How do I fix this?
Say all of your SPA code is in two files: vendor.js and app.js. To bypass the cache when they are updated, the typical approach is to compute a hash of each file's contents and put it in the file name: vendor.<truncated md5 hash>.js and app.<truncated md5 hash>.js. Each time you build the project (assuming you changed at least one line), the file gets a new hash and therefore a new filename, so it misses the cache.
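As a rough sketch of how a build step might do that by hand (Node, using the built-in crypto module; the dist/ paths and the 8-character truncation are just assumptions for the example):

const fs = require('fs');
const crypto = require('crypto');

// Compute a truncated md5 of the file contents and rename the build output,
// e.g. dist/app.js -> dist/app.3f2a9c1b.js
function hashedName(filePath) {
  const contents = fs.readFileSync(filePath);
  const hash = crypto.createHash('md5').update(contents).digest('hex').slice(0, 8);
  return filePath.replace(/\.js$/, '.' + hash + '.js');
}

fs.renameSync('dist/app.js', hashedName('dist/app.js'));
fs.renameSync('dist/vendor.js', hashedName('dist/vendor.js'));

In practice a bundler usually handles this renaming for you and updates the HTML that references the files.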
In one of the applications that I am working on for my company, I came across some weird behaviour, or maybe it's just my misunderstanding, and I hope I can get some clarification.
The application is served by Apache and the root is /company/client. Every page that I visit, for example https://11.11.11.11/index.phtml, actually points to the file /company/client/index.phtml on the server, and so on. One of the modules of the application contains a move_uploaded_file PHP call whose target directory is /images/example/; when the page runs, the app tries to write to the absolute server root /images/example/ instead of /company/client/images/example/.
Also, a new window opened by window.open has an img tag with src='/images/example/', and this points to the server root instead of /company/client/images/example/. Is this expected?
Am I missing anything, or is it something to do with Apache configuration?
Additional info:
The application is served as a virtual host in the conf file, with DocumentRoot "/company/client/".
The page that executes window.open and the PHP function is used as an iframe inside /company/client/index.phtml.
Sorry for my mistake.
Thanks for the help from everyone, especially Chris G. The problem was that the code was using a GET variable incorrectly, so the image name wasn't being passed. I also got confused because someone had made a mistake in the code by moving the image relative to the root folder, which is incorrect. That made me think PHP was treating the path the same way the client side does, which was a mistake. I'm guessing I can conclude that the web server document root only applies to the client side, i.e. to URLs, JS and HTML?
I know it's possible to force a reload from the server using location.reload(true). However, let's say I used that to refresh index.html. If index.html loads a bunch of JavaScript files, those still come from the cache for me. Is there any way to ignore the cache for the duration of a request?
My use case is that I'm doing AB testing on my app, and want to provide a way for users to go back to the old version if something isn't working. But some of the URLs are the same, even though the files between versions are different. It would be nice to be able to handle this in JS rather than having to change every URL on the new version.
There are actually at least 535 different ways to reload a page via JavaScript, FYI ;).
Have you tried putting document in front? document.location.reload(true);
Also try this other option:
window.location.href = window.location.href;
or
history.go(0);
Sure, both are soft reloads, but they seem to work in certain situations.
If nothing works, you have to append random data to the URL (like a timestamp) to force the download from the server, bypassing the cache.
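As a rough sketch of that last approach (the t query parameter and the re-injection trick are just one way to do it, not a standard API):

// Replace every script tag with a copy whose URL has a cache-busting
// timestamp appended, so the browser treats it as a brand new resource.
document.querySelectorAll('script[src]').forEach((oldScript) => {
  const freshScript = document.createElement('script');
  const separator = oldScript.src.includes('?') ? '&' : '?';
  freshScript.src = oldScript.src + separator + 't=' + Date.now();
  oldScript.replaceWith(freshScript);
});

Note that re-running a script this way can have side effects if it isn't written to be idempotent, so this is more of a workaround than a real fix.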
If you want to stop the browser from taking JS files out of its cache, you need to fetch from the server not plain files like script.js but rather something like script.12345.js. When you update the file on the server, its hash changes to, say, script.54321.js, and the browser understands that the file is different and must be downloaded again. You can use Webpack to automate this: in the output options, instead of {filename: 'bundle.js'} you write {filename: 'bundle.[hash].js'}.
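A minimal sketch of such a Webpack config might look like the following (the entry and output paths are placeholders; [contenthash] is used here instead of the plain [hash] mentioned above, since it only changes when the bundle's own content changes):

// webpack.config.js
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    // The hash changes whenever the bundled content changes,
    // so updated builds get new filenames and miss the cache.
    filename: 'bundle.[contenthash].js',
  },
};

You would typically pair this with something like html-webpack-plugin so the generated HTML automatically references the new hashed filename.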
Here's an unexpected issue I've run into with Webpack code splitting in the wild: Imagine this scenario:
The user loads a React app with Webpack code splitting and a few bundle chunks are loaded
A deploy happens and the contents of any future chunks that the user might receive from the server are updated (note: the previous chunks get deleted on the server during a deploy)
The user clicks on a link and loads a new route which triggers more bundle chunks to load. Except these new chunks are incompatible with the ones the user's browser has already loaded and the app breaks because of a runtime error
How can this scenario be prevented?
One possible solution would be to maintain multiple versioned sets of chunks but I'm wondering if there's a simpler solution being used by large-scale apps.
If preload-webpack-plugin is used, all chunks can be prefetched but they will only stay cached for a short time (5 minutes in Chrome).
As Max Stoiber writes on spectrum.chat:
ServiceWorkers come in really handy when doing code splitting!
We use the excellent offline-plugin by @NekR to cache all the current bundles locally, so no matter whether the server updates the files or not, the ServiceWorker will always serve the files from the local cache. Every hour it will check the server for updates and, if an update is available, download all the fresh bundles from the remote server and cache them locally. The next time the user restarts the app, the new version of the app is used! 💯
https://github.com/NekR/offline-plugin
This solution means your app downloads all the chunks up front, which defeats the purpose of code splitting in terms of bandwidth, but at least you still retain the benefit of only parsing the chunks you need to load the app, which for me is significant on slow devices. Also, browser refreshes/caching now involves the Service Worker lifecycle (see "Waiting" at https://developers.google.com/web/fundamentals/primers/service-workers/lifecycle).
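For reference, a minimal sketch of wiring offline-plugin into a Webpack config could look roughly like this (options omitted, defaults assumed; check the plugin's README for the real configuration):

// webpack.config.js
const OfflinePlugin = require('offline-plugin');

module.exports = {
  // ...entry, output, loaders...
  plugins: [
    // The plugin recommends being kept last so it can pick up all emitted assets.
    new OfflinePlugin(),
  ],
};

If I remember its README correctly, you also install the runtime in your app entry with require('offline-plugin/runtime').install().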
This problem is extremely well stated.
I will add, though, that "deletion" might not be the right name for what's happening, depending on the setup.
My initial response to this problem was that it was a caching problem: that old chunk files were being picked up instead of the new ones. That's close to what was happening. In my case, at least, I had the following:
index.js
import { lazy } from 'react';

const Page1 = lazy(() => import('./page/Page1'));
const Page2 = lazy(() => import('./page/Page2'));

const main = () => {
  // Look up the lazy component for the current path and render it.
  // (window.location.pathname, not the full href, is what matches these keys.)
  ({
    '/page1': Page1,
    '/page2': Page2,
  })[window.location.pathname](); /* Some Render Router Implementation */
};
V1 Deployed at (https://my-domain/distribution_folder/*)
User would load V1 index.js
V2 Deployed at (https://my-domain/distribution_folder/*)
User (who hadn't refreshed) would dynamically load a chunked route using their cached V1 index.js file.
Request would be sent to (https://my-domain/distribution_folder/{page_name}.{chunk_hash}.js)
A chunk error would occur because that unique chunk would no longer be there.
It's interesting because the provider being used was migrating traffic to the new version, so I thought that would be the end of it. But what I wasn't realizing was that any user could still be using a previously deployed version. How would they know? They're already using the application; the browser has already downloaded it (index.js).
The solution really depends on where you're dynamically importing these chunks. In the case above, since they're page routes, we can do a hard refresh when the user requests a different page and we can't find a chunk (there's a rough sketch of this below). This assumes, however, that your Cache-Control headers are set up correctly. For example:
index.js -> Cache-Control: no-store
page/{page_name}.{chunk_hash}.js -> Cache-Control: public,max-age=31536000,immutable
We can make these chunks immutable because sometimes they don't change between releases, and if they don't change, why not use the cached version? However, index.js cannot be stored in the cache, because it is the "router" that dynamically loads the content and it will always change.
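As a rough sketch of that hard refresh (the helper name and the reload strategy are my own assumptions, not a standard API), the dynamic import can be wrapped so a failed chunk load falls back to reloading the page, which fetches the fresh index.js and its new chunk names:

import { lazy } from 'react';

// Wrap a dynamic import so a failed chunk load forces a full reload.
// The reload fetches the new index.js (Cache-Control: no-store),
// which points at the new chunk hashes.
const lazyWithReload = (importPage) =>
  lazy(() =>
    importPage().catch((error) => {
      console.warn('Chunk failed to load, forcing a refresh', error);
      window.location.reload();
      // Render nothing while the reload is in flight.
      return { default: () => null };
    })
  );

const Page1 = lazyWithReload(() => import('./page/Page1'));

To avoid a reload loop when a chunk is genuinely broken, you would probably also want to remember (for example in sessionStorage) that a reload was already attempted.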
Pros
No more chunk load errors
We don't need to load everything on first page load
Less complexity by not having a service worker
Cons
This approach forces a refresh for users
If the chunk filenames are hashed, wouldn't the old route link to the old hashed chunk (which presumably would still be available) and load everything fine?
https://webpack.js.org/guides/caching/#output-filenames
A simple way to ensure the browser picks up changed files is by using output.filename substitutions. The [hash] substitution can be used to include a build-specific hash in the filename, however it's even better to use the [chunkhash] substitution which includes a chunk-specific hash in the filename.
I am trying to achieve the below in an ASP.NET MVC3 web application which uses Razor.
1) In my Index.cshtml file, I have the below reference.
<script src="/MySite/Scripts/Main.js"></script>
2) I load my home page for the first time and an HTTP request is made to fetch this file, which returns 200.
3) Then, I made some changes to the Main.js and saved it.
4) Now I just reload the home page (please note that I am not refreshing the page) by going to the address bar, typing the home page URL and pressing enter. At this point, I want the browser to fetch the updated Main.js file by making an HTTP request again.
How can I achieve this? I don't want to use the System.Web.Optimization bundling approach. I know that we can achieve this by changing the URL (appending a version or some random number) every time the file changes.
But the challenge here is that the URL is hardcoded in my Index.cshtml file. Every time there is a change in the Main.js file, how can I change that hardcoded URL in Index.cshtml?
Thanks,
Sathya.
What I was trying to achieve was to invalidate the browser cache as soon as my application's JavaScript file (which is already cached in the browser) gets modified at its physical location. I understood that this is simply not achievable, as no browser currently provides that support. To get around this, below are the only two ways:
1) Use MVC bundling
2) Every time the file is modified, change the URL by appending a version or any random number to it through the query string. This method is explained in the following URL - force browsers to get latest js and css files in asp.net application
But the disadvantage of the 2nd method is that if there are any external applications referring to your application's JavaScript file, the browser cache will still not be invalidated without refreshing the external application in the browser.
Just add a timestamp as a querystring parameter:
@{ var timestamp = System.DateTime.Now.ToString("yyyyMMddHHmmssfff"); }
<script src="/MySite/Scripts/Main.js?TimeStamp=@timestamp"></script>
Note: only update the TimeStamp parameter value when the file is actually updated/modified; with DateTime.Now as above, the value changes on every request, so the file is never served from the cache.
It's not possible without either using bundling (which handles versioning internally) or manually appending a version. You can create a single-file bundle as well if you want.
I am using swPrecache to load the static assets of my PWA and support offline mode. It's working great. My setup is something like this:
https://www.myexampledomain.com/myapp/ loads a static index.html, which in turn uses swPrecache to load static assets like JS, images, CSS, etc. Mind you, these are all loaded from the same domain, e.g. www.myexampledomain.com/myapp/js/file1.js.
But my swPrecache list has a decent number of files and takes some time to download on a slower internet connection. FYI, I am already delaying the service worker registration until something like the "load" event.
So here is what I am trying now. I need someone to validate if this is possible:
https://www.myexampledomain.com/myapp/ loads the static HTML files as before.
Have swPrecache intercept the static requests that go to the app domain (e.g. https://www.myexampledomain.com/myapp/js/file1.js) and instead fetch them from a CDN endpoint (e.g. https://some.cloudfront.com/myapp/js/file1.js)?
Once downloaded, swPrecache continues to work as usual.
So essentially I am hoping to have swPrecache proxy the static asset requests to a CDN, to make the initial load faster.
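For illustration, this is roughly the kind of interception I have in mind, written as a plain fetch handler rather than anything swPrecache actually generates (the CDN origin and the file-extension check are made up for the example):

// Inside the service worker: send same-origin static asset requests to a CDN instead.
const CDN_ORIGIN = 'https://some.cloudfront.com'; // hypothetical CDN endpoint

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  const isStaticAsset =
    url.origin === self.location.origin &&
    /\.(js|css|png|jpg|svg)$/.test(url.pathname);

  if (isStaticAsset) {
    // e.g. /myapp/js/file1.js -> https://some.cloudfront.com/myapp/js/file1.js
    event.respondWith(fetch(CDN_ORIGIN + url.pathname));
  }
});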
Any comments/pointers on this will help.
You can use the stripPrefixMulti option in sw-precache to change the URLs that are written to your service worker file. It's fairly brute-force, though, so it helps if there's a common prefix that is shared by all the assets that will be served from the CDN.
For example, if everything that will be served off of the CDN is stored in a local assets/ directory, and their paths on the CDN will start with https://my-cdn.com/assets/, you can use
{
stripPrefixMulti: {'assets/': 'https://my-cdn.com/assets/'},
// ... other sw-precache options...
}
You'll want to make sure that whenever a local file changes as part of your build process, the copy of the file on the CDN also changes immediately, or you'll run the risk of the versioning info generated for the local files being out of sync with what's on the CDN.
Yes, you can use the stripPrefixMulti option as shown above, but I'm sure you will face a new CORS problem. I'm working on it too.