forcing RequireJS text! to reload - javascript

Using the text! plugin, is there a way of forcing RequireJS to reload a file rather than returning the cached data?

RequireJS only caches the file for the lifetime of the page; a reload will fetch it again.
If you see different behaviour, it is because either:
your server is caching the response, or
your browser is caching the request (which you can of course disable in your browser).
If you want browsers to fetch a fresh copy every time, send a no-cache header for these resources from your server.
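As an illustration only (assuming a Node/Express server, which the question does not mention), a no-cache header for a template directory could look like this:

// Sketch: serve text! templates with a no-cache header so RequireJS always
// receives a fresh copy after a page reload. Paths and port are hypothetical.
var express = require('express');
var app = express();

app.use('/templates', express.static('templates', {
  setHeaders: function (res) {
    res.setHeader('Cache-Control', 'no-cache, no-store, must-revalidate');
  }
}));

app.listen(3000);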

I think you could also use the HTML5 application cache feature by providing a cache manifest: http://www.html5rocks.com/en/tutorials/appcache/beginner/
Then you could use the RequireJS "domReady" module to get the proper load event:
http://requirejs.org/docs/api.html#pageload
and listen for the updateready event (code taken from the first link):
window.applicationCache.addEventListener('updateready', function(e) {
  if (window.applicationCache.status == window.applicationCache.UPDATEREADY) {
    // Browser downloaded a new app cache.
    if (confirm('A new version of this site is available. Load it?')) {
      window.location.reload();
    }
  } else {
    // Manifest didn't change. Nothing new on the server.
  }
}, false);
At this point, whenever you update urlArgs you will get the new JS files, and with the cache manifest you will get the new HTML files.
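For reference, a minimal sketch of the urlArgs configuration mentioned above (the bust value is only an illustration; use a real build or version number in production so you don't defeat caching entirely):

// RequireJS appends urlArgs as a query string to every module and text! request,
// so the browser treats each deployment as a new URL.
require.config({
  urlArgs: 'bust=' + (new Date()).getTime()
});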

Related

How to circumvent browser caching? [duplicate]

Is there a way I can put some code on my page so when someone visits a site, it clears the browser cache, so they can view the changes?
Languages used: ASP.NET, VB.NET, and of course HTML, CSS, and jQuery.
If this is about .css and .js changes, then one way is "cache busting" by appending something like "_versionNo" to the file name for each release. For example:
script_1.0.css // This is the URL for release 1.0
script_1.1.css // This is the URL for release 1.1
script_1.2.css // etc.
or after the file name:
script.css?v=1.0 // This is the URL for release 1.0
script.css?v=1.1 // This is the URL for release 1.1
script.css?v=1.2 // etc.
You can check this link to see how it could work.
Look into the cache-control and expires META tags.
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="Mon, 22 Jul 2002 11:12:01 GMT">
Another common practice is to append a constantly-changing string to the end of the requested files. For instance:
<script type="text/javascript" src="main.js?v=12392823"></script>
Update 2012
This is an old question but I think it needs a more up to date answer because now there is a way to have more control of website caching.
In Offline Web Applications (which is really any HTML5 website) applicationCache.swapCache() can be used to update the cached version of your website without the need for manually reloading the page.
This is a code example from the Beginner's Guide to Using the Application Cache on HTML5 Rocks explaining how to update users to the newest version of your site:
// Check if a new cache is available on page load.
window.addEventListener('load', function(e) {
  window.applicationCache.addEventListener('updateready', function(e) {
    if (window.applicationCache.status == window.applicationCache.UPDATEREADY) {
      // Browser downloaded a new app cache.
      // Swap it in and reload the page to get the new hotness.
      window.applicationCache.swapCache();
      if (confirm('A new version of this site is available. Load it?')) {
        window.location.reload();
      }
    } else {
      // Manifest didn't change. Nothing new on the server.
    }
  }, false);
}, false);
See also Using the application cache on Mozilla Developer Network for more info.
Update 2016
Things change quickly on the Web.
This question was asked in 2009 and in 2012 I posted an update about a new way to handle the problem described in the question. Another 4 years passed and now it seems that it is already deprecated. Thanks to cgaldiolo for pointing it out in the comments.
Currently, as of July 2016, the HTML Standard, Section 7.9, Offline Web applications includes a deprecation warning:
This feature is in the process of being removed from the Web platform.
(This is a long process that takes many years.) Using any of the
offline Web application features at this time is highly discouraged.
Use service workers instead.
So does Using the application cache on Mozilla Developer Network that I referenced in 2012:
Deprecated This feature has been removed from the Web standards.
Though some browsers may still support it, it is in the process of
being dropped. Do not use it in old or new projects. Pages or Web apps
using it may break at any time.
See also Bug 1204581 - Add a deprecation notice for AppCache if service worker fetch interception is enabled.
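Since the standard now points to service workers, here is a minimal precache-and-serve sketch for orientation (file names and cache name are illustrative, not from the original answer):

// sw.js -- cache a few assets at install time and serve them cache-first.
var CACHE_NAME = 'site-cache-v1';
var URLS = ['/', '/styles.css', '/main.js'];

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.addAll(URLS);
    })
  );
});

self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});

// In the page: if ('serviceWorker' in navigator) navigator.serviceWorker.register('/sw.js');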
Not as such. One method is to send the appropriate headers when delivering content to force the browser to reload:
Making sure a web page is not cached, across all browsers.
If you search for "cache header" or something similar here on SO, you'll find ASP.NET-specific examples.
Another, less clean, but sometimes the only way if you can't control the headers on the server side, is to add a random GET parameter to the resource being requested:
myimage.gif?random=1923849839
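If the resource is loaded from JavaScript, the parameter can be generated at runtime; a small sketch (the element id is hypothetical):

// Append a throwaway query parameter so the browser treats the resource as a new URL.
document.getElementById('clientPhoto').src = 'myimage.gif?random=' + Date.now();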
I had a similar problem and this is how I solved it:
In the index.html file I added a manifest attribute:
<html manifest="cache.manifest">
In the <head> section I included a script that updates the cache:
<script type="text/javascript" src="update_cache.js"></script>
In the <body> tag I added an onload function:
<body onload="checkForUpdate()">
In cache.manifest I listed all the files I want to cache. What matters is that in my case (Apache) it works just by updating the "version" comment each time. Naming files with "?ver=001" or similar at the end is also an option, but it isn't needed: changing just # version 1.01 triggers the cache update event.
CACHE MANIFEST
# version 1.01
style.css
imgs/logo.png
#all other files
It's important to include points 1, 2, and 3 only in index.html. Otherwise
GET http://foo.bar/resource.ext net::ERR_FAILED
occurs, because every "child" file tries to cache the page while the page is already cached.
In the update_cache.js file I put this code:
function checkForUpdate()
{
    if (window.applicationCache != undefined && window.applicationCache != null)
    {
        window.applicationCache.addEventListener('updateready', updateApplication);
    }
}

function updateApplication(event)
{
    if (window.applicationCache.status != 4) return; // 4 == UPDATEREADY
    window.applicationCache.removeEventListener('updateready', updateApplication);
    window.applicationCache.swapCache();
    window.location.reload();
}
Now you just change your files and update the version comment in the manifest; visiting the index.html page will then update the cache.
The parts of this solution aren't mine; I found them around the internet and put them together so that it works.
For static resources, the right approach to caching is to use query parameters carrying the value of each deployment or file version. This has the effect of clearing the cache after each deployment.
/Content/css/Site.css?version={FileVersionNumber}
Here is an ASP.NET MVC example.
<link href="@Url.Content("~/Content/Css/Reset.css")?version=@this.GetType().Assembly.GetName().Version" rel="stylesheet" type="text/css" />
Don't forget to update the assembly version.
I had a case where I would take photos of clients online and needed to update the div if a photo changed. The browser was still showing the old photo, so I used the hack of appending a random GET variable, which would be unique every time. Here it is, in case it helps anybody:
<img src="/photos/userid_73.jpg?random=<?php echo rand() ?>" ...
EDIT
As pointed out by others, the following is a much more efficient solution, since it reloads the image only when it has changed, identifying the change by the file's modification time:
<img src="/photos/userid_73.jpg?modified=<?php echo filemtime("/photos/userid_73.jpg") ?>"
A lot of answers are missing the point - most developers are well aware that turning off the cache is inefficient. However, there are many common circumstances where efficiency is unimportant and default cache behavior is badly broken.
These include nested, iterative script testing (the big one!) and workarounds for broken third-party software. None of the solutions given here are adequate to address such common scenarios. Most web browsers cache far too aggressively and provide no sensible means to avoid these problems.
Updating the URL to the following works for me:
/custom.js?id=1
By adding a unique number after ?id= and incrementing it for new changes, users do not have to press Ctrl+F5 to refresh the cache. Alternatively, you can append a hash, or a string version of the current time or epoch, after ?id=
Something like ?id=1520606295
<meta http-equiv="pragma" content="no-cache" />
Also see https://stackoverflow.com/questions/126772/how-to-force-a-web-browser-not-to-cache-images
Here is the MSDN page on setting caching in ASP.NET.
Response.Cache.SetExpires(DateTime.Now.AddSeconds(60))
Response.Cache.SetCacheability(HttpCacheability.Public)
Response.Cache.SetValidUntilExpires(False)
Response.Cache.VaryByParams("Category") = True
If Response.Cache.VaryByParams("Category") Then
'...
End If
Not sure whether this really helps you, but that's how caching should work in any browser. When the browser requests a file, it should always send a request to the server unless there is an "offline" mode. The server reads some parameters such as the modified date or ETags.
The server returns a 304 Not Modified response, and the browser then has to use its cache. If the ETag doesn't validate on the server side, or the modified date is older than the current modified date, the server returns the new content with a new modified date or ETag, or both.
If no caching data is sent to the browser, I guess the behaviour is undetermined: the browser may or may not cache files that don't say how they should be cached. If you set caching parameters in the response, your files will be cached correctly, and the server can then choose to return a 304 Not Modified or the new content.
This is how it should be done. Using random params or version numbers in URLs is more of a hack than anything else.
http://www.checkupdown.com/status/E304.html
http://en.wikipedia.org/wiki/HTTP_ETag
http://www.xpertdeveloper.com/2011/03/last-modified-header-vs-expire-header-vs-etag/
After reading further, I saw that there is also an expiry date. If you have a problem, it might be that you have an expiry date set up. In other words, when the browser caches your file with an expiry date, it doesn't have to request it again before that date: it will never ask the server for the file and will never receive a 304 Not Modified. It will simply use the cache until the expiry date is reached or the cache is cleared.
So that is my guess: you have some sort of expiry date, and you should use Last-Modified, ETags, or a mix of it all, and make sure there is no expiry date.
If people tend to refresh a lot and the file doesn't change much, then it might be wise to set a long expiry date.
My 2 cents!
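To illustrate the validation flow described above, here is a rough Node.js sketch (not from the original answer; the file name and port are made up) that returns 304 while the client's ETag still matches:

// Minimal ETag validation with Node's built-in http module.
var http = require('http');
var fs = require('fs');
var crypto = require('crypto');

http.createServer(function (req, res) {
  var body = fs.readFileSync('./script.js');
  var etag = crypto.createHash('md5').update(body).digest('hex');

  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304); // client copy is still valid, send no body
    res.end();
    return;
  }
  res.writeHead(200, { 'ETag': etag, 'Content-Type': 'application/javascript' });
  res.end(body);
}).listen(3000);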
I implemented this simple solution that works for me (not yet in a production environment):
function verificarNovaVersio() {
    var sVersio = localStorage['gcf_versio' + location.pathname] || 'v00.0.0000';
    $.ajax({
        url: "./versio.txt",
        dataType: 'text',
        cache: false,
        contentType: false,
        processData: false,
        type: 'post'
    }).done(function(sVersioFitxer) {
        // "App version: ..., cached version: ..."
        console.log('Versió App: ' + sVersioFitxer + ', Versió Caché: ' + sVersio);
        if (sVersio < (sVersioFitxer || 'v00.0.0000')) {
            localStorage['gcf_versio' + location.pathname] = sVersioFitxer;
            location.reload(true);
        }
    });
}
I have a little file located where the HTML files are:
"versio.txt":
v00.5.0014
This function is called on all of my pages, so on load it checks whether the localStorage version value is lower than the current version and does a
location.reload(true);
...to force a reload from the server instead of from the cache.
(Obviously, instead of localStorage you can use cookies or other persistent client-side storage.)
I opted for this solution for its simplicity, because maintaining a single file "versio.txt" will force the full site to reload.
The query string approach is hard to implement and is also cached (if you change from v1.1 back to a previous version, it will load from cache, which means the cache is not flushed and all previous versions stay cached).
I'm a bit of a newbie and I'd appreciate your professional check and review to make sure my method is a good approach.
Hope it helps.
In addition to setting Cache-Control: no-cache, you should also set the Expires header to -1 if you would like the local copy to be refreshed each time (some versions of IE seem to require this).
See HTTP Cache - check with the server, always sending If-Modified-Since
There is one trick that can be used. The trick is to append a parameter/string to the file name in the script tag and change it when your file changes.
<script src="myfile.js?version=1.0.0"></script>
The browser interprets the whole string as the file path, even though what comes after the "?" are parameters. So what happens now is that the next time you update your file, you just change the number in the script tag on your website (for example <script src="myfile.js?version=1.0.1"></script>) and each user's browser will see that the file has changed and grab a new copy.
Force browsers to clear the cache or reload the correct data? I have tried most of the solutions described on Stack Overflow; some work, but after a little while the browser caches again eventually and displays the previously loaded script or file. Is there another way that would clear the cache (CSS, JS, etc.) and actually work on all browsers?
I found so far that specific resources can be reloaded individually if you change the date and time of your files on the server. "Clearing the cache" is not as easy as it should be. Instead of clearing the cache in my browsers, I realized that "touching" the cached server files will actually change the date and time of the source file on the server (tested on Edge, Chrome, and Firefox), and most browsers will then automatically download the most current, fresh copy of what's on your server (code, graphics, and any multimedia too). I suggest you just copy the most current scripts to the server and "do the touch thing" before your program runs, so it changes the date of all your problem files to the most current date and time; the browser then downloads a fresh copy:
<?php
touch('/www/sample/file1.css');
touch('/www/sample/file2.js');
?>
then ... the rest of your program...
It took me some time to resolve this issue (many browsers act differently to different commands, but they all check the time of files and compare it to your downloaded copy in the browser; if the date and time differ, they refresh). If you can't go the supposedly right way, there is always another usable and better solution. Best regards and happy camping. By the way, touch() or its alternatives exist in many programming languages, including JavaScript, bash, sh, and PHP, and you can include or call them from HTML.
For webpack users:
I added the time along with chunkhash in my webpack config. This solved my problem of invalidating the cache on each deployment. We also need to take care that index.html / asset.manifest is not cached, either in your CDN or in the browser. The chunk name configuration in the webpack config will look like this:
filename: `[chunkhash]-${Date.now()}.js`
or, if you are using contenthash, then
filename: `[contenthash]-${Date.now()}.js`
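Put together, a minimal sketch of the relevant webpack config might look like this (entry and output paths are illustrative; [contenthash] requires webpack 4+):

// webpack.config.js -- contenthash plus a build timestamp in the output file name,
// so every deployment produces new URLs and invalidates the browser cache.
module.exports = {
  entry: './src/index.js',
  output: {
    filename: `[contenthash]-${Date.now()}.js`,
    path: __dirname + '/dist'
  }
};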
This is the simple solution I used in one of my applications, using PHP.
All JS and CSS files are placed in a folder named after the version, for example "1.0.01":
root\1.0.01\JS
root\1.0.01\CSS
I created a helper and defined the version number there:
<?php
function system_version()
{
return '1.0.07';
}
And I linked the JS and CSS files like below:
<script src="<?= base_url(); ?>/<?= system_version();?>/js/generators.js" type="text/javascript"></script>
<link rel="stylesheet" type="text/css" href="<?= base_url(); ?>/<?= system_version(); ?>/css/view-checklist.css" />
Whenever I make changes to any JS or CSS file, I change the system version in the helper, rename the folder, and deploy it.
I had the same problem. All I did was change the names of the files linked from my index.html file and then update those names inside index.html. Not best practice, but if it works, it works. The browser sees them as new files, so they get re-downloaded onto the user's device.
Example:
I want to update a CSS file named styles.css, so I rename it to styless.css.
Then I go into index.html, update the <link> tag, and change styles.css to styless.css.
In case anyone is interested, I've found a solution to get browsers to refresh .css and .js files in the context of .NET MVC (.NET Framework 4.8) with the use of bundles.
I wanted browsers to refresh cached files only after a new assembly is deployed.
Building on Paulius Zaliaduonis' response, my solution is as follows:
Store your application base URL in the web.config app settings (HttpContext is not yet available at runtime during RegisterBundles...), then make this parameter change according to the configuration (debug, staging, release...) via XML transforms.
In BundleConfig.RegisterBundles, get the assembly version by means of reflection, and...
...change the default tag format of both styles and scripts so that the bundling system generates link and script tags with a query string parameter appended.
Here is the code
public static void RegisterBundles(BundleCollection bundles)
{
    string baseUrl = System.Configuration.ConfigurationManager.AppSettings["by.app.base.url"].ToString();
    string assemblyVersion = Assembly.GetExecutingAssembly().GetName().Version.ToString();
    Styles.DefaultTagFormat = $"<link href='{baseUrl}{{0}}?v={assemblyVersion}' rel='stylesheet'/>";
    Scripts.DefaultTagFormat = $"<script src='{baseUrl}{{0}}?v={assemblyVersion}'></script>";
}
You'll get tags like
<script src="https://example.org/myscriptfilepath/script.js?v={myassemblyversion}"></script>
You just need to remember to build a new version before deploying.
Ciao
Do you want to clear the cache, or just make sure your current (changed?) page is not cached?
If the latter, it should be as simple as
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">

Rewrite URL offline when using a service worker

I am making a web application with offline capabilities using a service worker generated by a Node.js module called sw-precache. Everything works fine and I have access to the HTML files and images offline.
But since no server-side language is available, is there a way to rewrite URLs client-side the way an .htaccess file would, for example showing a "404 Page not found" page when no file matches the URL? I know that redirects are possible using JavaScript or meta tags, but what about rewriting the URL?
By default, sw-precache will only respond to fetch events when the URL being requested is a URL for a resource that it has cached. If someone navigates to a URL for a non-existent web page, sw-precache won't respond to the fetch event.
That does mean that you have a chance to run your own code in an additional fetch event handler that could implement custom behavior, like returning a 404.html page when a user navigates to a non-existent page while offline. You need to jump through a couple of hoops, but here's how to do it:
// In custom-offline-import.js:
self.addEventListener('fetch', event => {
  if (event.request.mode === 'navigate') {
    event.respondWith(
      fetch(event.request)
        // {ignoreSearch: true} is needed, since sw-precache appends a search
        // parameter with versioning information.
        .catch(() => caches.match('404.html', {ignoreSearch: true}))
    );
  }
});
// In your sw-precache config:
{
  // Make sure 404.html is picked up in one of the glob patterns:
  staticFileGlobs: ['404.html'],
  // See https://github.com/GoogleChrome/sw-precache#importscripts-arraystring
  importScripts: ['custom-offline-import.js'],
}
This shouldn't interfere with anything that sw-precache is doing, as it will just be used as a fallback.

Detecting applicationCache viability of remote resource

I am trying to determine whether a cache (as obtained via applicationCache and an HTML5 cache manifest) is available for a resource located on a different domain (local file system vs. WWW).
The cache-checking resource (a gateway mechanism, if you will) is located on the local filesystem and is loaded via a webview. This is a requirement that I cannot work around.
Until recently, this local gateway file would check whether the device was online and redirect to the remote resource using window.location; if the device was not online, it would display a graphic (also packaged locally) that essentially said, "You must connect to the internet to use this feature."
However, I just recently implemented offline support on that remote resource. It works. If we have the gateway file just redirect to the remote resource whilst offline, it'll load the remote resource from cache.
There's a possibility that a user may try to use the device while offline before they have ever accessed the resource online, so logic needs to be placed in the gateway code to test whether the cache exists. I am running into cross-domain issues, which I expected, but I am not sure how to go about fixing them.
Here is the code I have tried:
if (window.navigator.onLine === false) {
  // See if we're able to reach the content (if it's cached, we'll get HTML, else it'll fail)
  cacheCheck = $.ajax(contentLoc, {async: false});
  // The request failed, so we have no cache at all. Let's just show the offline graphic.
  if (cacheCheck.status === 0) { // no cache available :(
    $("#launchpage").addClass("offline");
  } else { // we have a cache :)
    redirect();
  }
} else {
  redirect();
}
When I was writing it, I was under the naive hope that the $.ajax() would fetch the cached version (if it existed) and I could just test the returned object to see if the status code returned wasn't an error status code.
However, this does not work, as the returned object is "Error: NETWORK_ERR: XMLHttpRequest Exception 101".
Is there any other method that I can use to determine whether or not it is safe to redirect? It's a requirement that I display a local image if a redirect would fail (due to there being no cache).
I have figured out a workaround. I inject an iframe pointing to the remote resource and check whether that iframe loads HTML (specifically, whether the html tag contains a manifest attribute); if so, I assume the cache exists and perform the redirect.
If I don't get what I'm expecting, the error graphic is displayed. In my implementation, though, the graphic is always displayed first when offline, since I have to wait for the iframe to load asynchronously.
Here is example code to make it work:
if (window.navigator.onLine === false) {
  // We're offline, so show the offline graphic, in case the test below fails (i.e., no cache available)
  $("#launchpage").addClass("offline");
  // Create an iframe pointing to the canvas. If it loads, we have a cache
  $('body').append('<iframe id="offlineTest" src="' + contentLoc + '" style="display:none;" />');
  $('#offlineTest').bind('load', function() {
    // See if the resulting html tag has a manifest attribute.
    manifest = $("#offlineTest").contents().find('html').attr('manifest');
    if (manifest !== undefined) { // cache is available
      redirect();
    }
  });
} else {
  redirect();
}

Offline cache - html5

Ok I have a problem with my cache, and just can't figure it out D:
Every time I try to reload the page (to get the cache), I get this error: Application Cache Error event: Failed to commit new cache to storage.
Does anyone know why this doesn't work?
Links (In case you want to see yourself):
Index.php
Cache file
.htaccess: AddType text/cache-manifest cache
Some files in your cache manifest don't exist (404).
According to the spec, if not all files can be retrieved, the new cache will not be used/committed.
It seems you are using Chrome. Your application cache might be broken. Try clearing it by entering the following in the address bar:
chrome://appcache-internals/

html5 appcache adding/removing specific files

Say I have a simple appcache manifest that looks like:
CACHE:
# v1
# images
images/one.jpg
images/two.jpg
images/three.jpg
I then use some server side method to update the manifest to:
CACHE:
# v1
# images
images/one.jpg
images/two.jpg
images/three.jpg
images/four.jpg
And then call a function client side to update the appcache:
function updateCache() {
  var appCache = window.applicationCache;
  appCache.update();
  if (appCache.status == window.applicationCache.UPDATEREADY) {
    appCache.swapCache();
  }
}
I would like to 'add' my new image to the existing cache without downloading everything again (which is what's currently happening). Is this possible or am I missing something fundamental?
It will download everything again; that's the way it's designed to work. However, if you have set far-future expiry headers on your images, the chances are the browser will fetch them from the local browser cache rather than requesting them from the server again.
Note that doing this during development can lead to some strange behaviour, but you should definitely do it for production sites.
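For orientation, far-future expiry headers for the image directory could be set like this (a sketch assuming a Node/Express static server, which the original answer does not specify; Apache or nginx directives would do the same job):

// Serve images with a one-year max-age so an appcache update can re-list them
// without the browser re-downloading unchanged files.
var express = require('express');
var app = express();

app.use('/images', express.static('images', { maxAge: '365d' }));

app.listen(8080);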
