I version all of my client-side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to get the updated files. But every time I push an update, without fail, at least one person runs into a problem where they are running the old version and get some kind of error. I used to think this happened because they were already on the page during the release and just needed to reload, but it happened to me today when I definitely had not been on the page beforehand. I browsed to the live site and was running the old code; I needed to refresh the page to get the new file.
What can cause this?
PS: I was using Chrome on Win7, but I have seen clients report this on all sorts of browsers.
If your main web page can itself be cached, then a cached copy of that page may still be requesting the old version of the JS file. JS-file versioning only works if the page that actually refers to the JS file is not cached, or has a very short caching time.
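For example, here is a minimal sketch of that split, assuming a Node/Express server (an assumption, not part of the original answer; the equivalent headers can be configured in IIS, nginx, or any other server): the HTML page is sent with no-cache so the browser always revalidates it, while the versioned JS files can be cached for a long time.

    // Minimal Express sketch (assumed stack): the HTML page is revalidated on
    // every visit, so it always carries the latest ?v= query strings, while
    // the JS files themselves can be cached aggressively.
    const path = require('path');
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
      // Make the browser revalidate the page itself on every visit.
      res.set('Cache-Control', 'no-cache, must-revalidate');
      res.sendFile(path.join(__dirname, 'index.html'));
    });

    // Versioned assets are safe to cache for a long time: their URLs
    // change (via ?v=...) whenever their content changes.
    app.use('/js', express.static('js', { maxAge: '365d' }));

    app.listen(3000);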
I agree with jfriend00 that the web page itself is being cached and is thus requesting the old JavaScript version.
To prevent this, you can load the JavaScript file via an AJAX (POST) request: either ask the server for the latest version number and download that version, or request the JavaScript itself and insert it, e.g. into the head of the page.
Edit: see for example here
I make a quick AJAX request to the server for the version it expects them to have, then force them to refresh the page if the client's script is old.
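A minimal sketch of that check (the /version endpoint and the embedded constant are assumptions for illustration, not part of the original answer):

    // Version baked into the client at release time (hypothetical constant).
    const CLIENT_VERSION = '3903948';

    // Ask the server which version it expects; reload the page if we're stale.
    fetch('/version') // assumed endpoint returning the current version string
      .then((res) => res.text())
      .then((serverVersion) => {
        if (serverVersion.trim() !== CLIENT_VERSION) {
          window.location.reload(); // re-request the page and its script URLs
        }
      })
      .catch(() => {
        // Ignore network errors and keep running the current version.
      });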
It seems that a proxy or a load balancer is serving old content instead of new. Also check your IIS/web server settings for how these files are cached and expired.
You can check what is going on over the wire with a tool like Fiddler.
I've implemented this script on my Squarespace website, which uses the Wexley template, to make images in a gallery act as links (Wexley does not support clickthrough URLs natively).
It works fine, but if I add any thumbnails to the gallery it will not work until the browser cache is cleared.
I am wondering if there is a way to fix this? Perhaps through:
1) Setting an expiry on the cache? I am not in developer mode, so this would have to go in via a header injection.
2) Versioning? I tried hosting the JavaScript as a file elsewhere on my site. This worked (it pulled the script from the other location), but I still get the same issue, even when I upload a new script file and point to it after updating the page!
You can force the client to download the file again. To accomplish this you need to make the client's browser think it doesn't have the script in its cache. You can do this by changing the file name.
Imagine you have this folder structure:
index.html
index.js
If your index.html references the script like src="index.js", you can force clients to re-download it just by appending a query string to the import: src="index.js?0"
Now clients' browsers will check whether this file is in the cache, and since it isn't, they will fetch it from the server.
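The same idea can also be done dynamically, as a sketch (Date.now() here is illustrative: it bypasses the cache on every load, whereas a fixed build number would still allow caching between releases):

    // Inject the script with a cache-busting query string.
    const script = document.createElement('script');
    script.src = 'index.js?v=' + Date.now(); // new value each load, never served from cache
    document.head.appendChild(script);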
Checking the resource loading on my page, I realized that the script was not being cached, so it was something else getting cached that was interfering.
Because I am not in dev mode, I implemented a fix that relies on appending the date of the update to the URL and then setting up 301 redirects.
The URL and the redirects (2 total) have to be updated whenever any content is added.
If anyone sees issues with this (relating to SEO or anything else), I would appreciate your feedback.
On Cloudflare, I want to disable caching so that I can immediately see the changes I've pushed live to my website.
Things I've tried:
Turned development mode on.
Created a caching bypass in page rules.
Purged an individual webpage.
Purged the entire website.
Set the cache to clear every 2 hours.
None of the above worked.
Tech I'm using:
Angular2
SystemJS
TypeScript, which is compiled to JavaScript on build.
Firebase for hosting and database.
Cloudflare for SSL etc.
The only way people see my website changes is if they hard refresh.
The main problem is that I've got a JavaScript file called app.js that holds all the JavaScript for my Angular app, and the browser doesn't even seem to try to re-fetch the resource.
I've changed the reference from app.js to app.js?1490959855777, and it still doesn't fetch the file again.
I basically want users to see my new JS file without having to hard refresh.
Based on the discussion above, it looks like the caching is happening in the browser, since a hard refresh gets the new file contents.
I think what happened is that CF told the browser to hold onto that file for a very long time, and the browser is honoring that request.
Because you can't ask your users to do a hard refresh, you'll need to rename the static files that are being cached so aggressively.
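As a sketch of that renaming (a hypothetical Node build step; the file names and paths are assumptions), you can fingerprint the file with a content hash and update the HTML reference, so the URL changes whenever the code changes:

    // build.js -- hypothetical post-build step: fingerprint app.js by content
    // hash and point index.html at the new name, so aggressive edge/browser
    // caches can never serve a stale copy under the new URL.
    const crypto = require('crypto');
    const fs = require('fs');

    const source = fs.readFileSync('app.js');
    const hash = crypto.createHash('md5').update(source).digest('hex').slice(0, 8);
    const hashedName = 'app.' + hash + '.js';

    fs.copyFileSync('app.js', hashedName);

    // Rewrite the script reference, whether it currently points at app.js
    // or at an earlier hashed name like app.d41d8cd9.js.
    const html = fs.readFileSync('index.html', 'utf8');
    fs.writeFileSync('index.html', html.replace(/app(\.[0-9a-f]{8})?\.js/, hashedName));

    console.log('Emitted ' + hashedName + ' and updated index.html');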
I am working on web page development using the NetBeans IDE and use Firefox for debugging/testing. Whenever I make changes to the JavaScript, they are not reflected on the web page; the page source shows the obsolete code.
Every time I make changes, I make sure to restart my nginx server before opening the browser. PHP works fine this way, but the JavaScript is not in sync with my changes to the code.
Please suggest a solution to this problem.
The problem is that your browser is caching your files. You can clear the browser cache or set the browser to stop caching files.
Another way to avoid browser caching is to append something (a timestamp or an ID) after a '?' in your script reference in the HTML file:
<script src='script.js?0001'></script>
Any time you want the browser to request the file again, just change this value.
To avoid stale cached files, it is better to handle this programmatically by adding proper headers such as Cache-Control and max-age. However, these headers are handled differently by different browsers, like IE, Firefox, etc.
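For instance (a sketch assuming a Node/Express server, which is not stated in the original answer; IIS and Apache have equivalent configuration directives), a short max-age lets updates propagate within an hour without disabling caching entirely:

    // Minimal Express sketch (assumed stack): give scripts a short cache
    // lifetime so browsers re-check them within an hour.
    const express = require('express');
    const app = express();

    app.use('/js', express.static('js', {
      setHeaders: (res) => {
        res.set('Cache-Control', 'public, max-age=3600'); // stale after 1 hour
      },
    }));

    app.listen(8080);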
The best way is to trick the browser by adding a random query parameter so that it believes this is a different request:
<script src='myScript.js?dummyParam=12001'></script>
Here, 12001 should be regenerated after every change, using a timestamp or some other random value.
This is not a programming question per se. I am using a free web host called getfreehosting, and I use their online file manager to transfer files. From time to time, the changes I make to the source code do NOT show up immediately after I upload them, i.e. when I run my application in Chrome and view the page source, I see that the JavaScript running is still the old version! In most cases this doesn't happen, but when it does it is extremely frustrating. I've tried clearing the browser's cache. I even tried editing the file directly on their servers. Sometimes that solves the problem, but other times it doesn't.
Is this a common issue encountered when transferring files to a web host? Or perhaps this is one of the downsides of using a free web host?
Thanks.
You can try clearing your browser's cache, or the ol' CTRL+F5 refresh trick. Otherwise, the hosting provider may be using a caching layer to help ease resource usage.
It is the responsibility of the server to indicate to the browser, when the script files are served, how long they are cacheable (1 hr, 1 day, 1 month, etc.). This is a server-side setting.
Caching is very important for both server-side efficiency and client-side performance so you don't want to defeat it completely.
You can either shorten the server-side setting for the cache lifetime, or you can use a version number in your script file names (like jQuery does) so that when you revise your script files, you give them a new filename like "myscript-v12.js" and update the corresponding HTML files to refer to the new filename. Then, as soon as the browser gets the new HTML file, it is guaranteed to get the new JS file, because the new filename could never have been in the browser cache.
If this is just an issue for you personally while developing and revising your site, then just clear your browser cache after you upload new files; when your browser then loads that page, it won't have any version in the cache and will be forced to get the new version from the server.
Modern browsers have a cache system.
Try clearing the cache before you browse your website.
I've been noticing that sometimes my Facebook app runs slow, and when I checked, it was because the all.js file was loading slowly from the Facebook server, so I copied the file onto my server and tested it.
Everything seems to work fine, and it actually runs faster. My question is: do you know of any bugs or errors that could come from doing this?
The problem here is that you're now shifting a dependency, and by extension the maintenance of that dependency, to your local application. If it's hosted on Facebook's servers, they can update it to fix bugs or add features.
If it's taking a long time to load, you should bring it up on their support forums.
Your page has to load the all.js file in any case. Facebook's servers should be faster than the server that hosts your website, so theoretically, loading the JS file from Facebook should be faster.
A better approach would be to cache the file for some time. This will make the page loads after the initial one much, much faster.
As people have mentioned, the all.js file is updated constantly with bug-fixes etc. So, it is always better to get the newest version of the file instead of manually updating it on your server after some time.
You can run into problems when Facebook updates the API. You would need to update the file regularly and frequently (every 5 min?).