I've just noticed that Safari on iOS keeps serving stale $http.get results from its cache for REST calls to my server.
What's troubling is that Safari reports a status of 200 (not 304), even though the result is stale.
I've confirmed that the issue comes from Safari, since it's easy to check the real result by making the REST call to the server directly.
What I do to force Safari to refresh its cache is add a random parameter:
$http.get('myUrl?rnd=' + new Date().getTime())
Is there a better practice? Probably changing the response headers on the server directly?
My server returns these response headers:
HTTP/1.1 200 OK
Server: Cowboy
Date: Tue, 11 Nov 2014 23:52:59 GMT
Connection: keep-alive
Content-Type: application/json; charset=utf-8
Content-Encoding: gzip
Vary: Accept-Encoding
Content-Length: 495
Via: 1.1 vegur
Your response doesn't have any cache-control headers. According to this answer, browsers are free to do whatever they want when there are no cache-control headers. In your case, Safari on iOS has decided to cache the content even though that isn't what you want.
You could keep using your workaround, or you could add cache-control headers to the response to tell Safari not to cache it.
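For instance, if the API happened to sit behind Node/Express (purely an illustrative stack; it's the headers that matter, whatever server you run), a small middleware could mark the API responses as uncacheable:
var express = require('express');
var app = express();

// Tell browsers (including iOS Safari) never to reuse a cached copy
// of an API response without revalidating it first.
app.use('/api', function (req, res, next) {
    res.set({
        'Cache-Control': 'no-cache, no-store, must-revalidate',
        'Pragma': 'no-cache', // for old HTTP/1.0 intermediaries
        'Expires': '0'
    });
    next();
});
With headers like those in place, the rnd= workaround becomes unnecessary.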
Note that the RFCs might say that responses should not be cached when there are no cache-control headers (I haven't checked), but browsers often have non-standard behavior that you have to work around.
As an aside - early on in my computer networking job I thought that it was OK to not support browsers and webservers that didn't follow the RFCs. I was wrong.
I have a WebSocketServer running on a server box, with a website that connects to it and sends information back and forth.
I have noticed that over WiFi it works perfectly in all the browsers I have tested, but over mobile data it fails in Firefox. I intercepted and edited the headers and managed to reproduce the problem: Firefox sends a combined header, Connection: keep-alive, Upgrade, in the request, whereas Chrome sends just Connection: Upgrade. My theory is that when the request passes through the mobile data provider's proxy, the proxy, as well as adding its own identifying headers, re-parses all of the other headers and does not understand the combined header. This is supported by the fact that at the server end the request from Firefox is received, but the Connection header is truncated to Connection: keep-alive. If I manually remove keep-alive from the Connection header using the interception program, the problem is solved.
I don't need the keep-alive part of the request (if anything, I would prefer it not to be enabled), so I'm asking if there is a way to stop Firefox sending it without resorting to about:config etc. (e.g. in JS or HTML), as I would like this to work for the general end user.
Many thanks,
Richard
I had a similar problem, since resolved.
In my case, the problem was that my hosting provider had a proxy which was not dealing correctly with the Connection and/or Upgrade headers. Indeed, these headers are hop-by-hop and as such:
Hop-by-hop headers
are meaningful only for a single transport-level connection and must not be retransmitted by proxies or cached. Such headers are: Connection, Keep-Alive, Proxy-Authenticate, Proxy-Authorization, TE, Trailer, Transfer-Encoding and Upgrade. Note that only hop-by-hop headers may be set using the Connection general header.
Source: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers
In short, these headers are not retransmitted but are interpreted before being passed to your server. When they are sent by Firefox, this interpretation phase becomes critical, since the value of the Connection header is more "complicated" than the one sent by other browsers, i.e.
Firefox sends Connection: keep-alive, Upgrade
Chrome/Edge/... sends Connection: Upgrade
Solution: I simply told my hosting provider that only Connection: keep-alive arrives at my server when the client sends Upgrade: <my_protocol> AND Connection: keep-alive, Upgrade (and they were able to correct the issue within 72 hours).
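As a side note for anyone who owns the server or proxy code: Connection is a comma-separated token list, so it must be parsed as one rather than compared as a whole string. A minimal Node.js sketch of a tolerant check (illustrative; the 'upgrade' event is the standard http module API, the handshake itself is elided):
var http = require('http');
var server = http.createServer();

server.on('upgrade', function (req, socket) {
    // Both "keep-alive, Upgrade" (Firefox) and "Upgrade" (Chrome)
    // must be accepted, so split the header into individual tokens.
    var tokens = (req.headers.connection || '')
        .split(',')
        .map(function (t) { return t.trim().toLowerCase(); });

    if (tokens.indexOf('upgrade') === -1) {
        socket.destroy();
        return;
    }
    // ... proceed with the WebSocket handshake ...
});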
I'm using Google-hosted jQuery in my webapp (//ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js). As part of bug diagnostics I have a window.onerror handler which catches any errors I'm not catching locally and lets the server know about them.
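(For context, a stripped-down version of that kind of handler; the /log endpoint is made up:)
window.onerror = function (message, source, lineno) {
    // Beacon uncaught errors back to the server for diagnostics.
    var img = new Image();
    img.src = '/log?m=' + encodeURIComponent(message) +
              '&s=' + encodeURIComponent(source || '') +
              '&l=' + lineno;
};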
So far so good, but... sometimes I get errors like these:
"Script error.","Error loading script","Unexpected token <"
My assumption is that the Google CDN is blocked in these cases (for whatever reason). I do have a local fallback for jQuery that I'm fairly sure is working well, but I would like to find out what's actually being returned, so that I can test my assumptions and maybe get some of these users on a whitelist for the Google CDN (if it's a company firewall blocking it).
But so far I haven't been able to figure out how to retrieve the returned content: I can't retrieve the innerText of a SCRIPT tag if it loads a file, I can't do an ajax request because of the cross-domain policy, etc.
Does anyone have any ideas about how this would be possible?
It simply isn't possible to get the content of any file referenced by a <script> tag. This is with good reason: doing so would allow you to circumvent XHR's Same Origin Policy.
Consider:
<script src="https://www.example.com/private/api/getAuthToken" id="s"></script>
If you could access the text of the response, you'd be able to do this:
var stolenAuthToken = $('#s').text();
That's obviously bad. Therefore, you're never allowed to read the content of something brought in by <script> tags.
Your particular situation is complicated by a relatively recently introduced change where errors in cross-origin scripts do not report any useful information to your page's onerror handler. (Essentially, this was done to patch an information disclosure security hole that allows a malicious site to infer whether you're logged in to some well-known sites, among other things.)
This means that you get no useful information about errors from CDN-hosted script, so another change was made to allow the use of CORS for a CDN (or other non-same-origin) server to opt in to allowing full error details to pass to an onerror handler.
We (Facebook) need a mechanism for disabling the window.onerror muting behavior implemented in #363897. Our static script resources are served on a CDN under a different domain from the main site. Because these domains differ we're falling afoul of the x-domain logic that prevents us from gathering useful information about browser errors.
This "feature" has been widely enough adopted in in the wild (in Firefox and Webkit browsers) that the majority of uncaught exceptions we see in production now have no actionable information in them.
The crossorigin attribute (originally intended for <img>) allows you to specify that a resource should be loaded with CORS rules. It has been implemented by Mozilla, WebKit, and Chrome.
<script src="http://example.com/xdomainrequest" crossorigin="anonymous"></script>
Unfortunately for you, in my testing, I found that the Google CDN does not send CORS headers.
GET http://ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js HTTP/1.1
Host: ajax.googleapis.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:17.0) Gecko/20100101 Firefox/17.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
Referer: http://fiddle.jshell.net/josh3736/jm2JU/show/
Origin: http://fiddle.jshell.net
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.1 200 OK
Vary: Accept-Encoding
Content-Type: text/javascript; charset=UTF-8
Last-Modified: Tue, 13 Nov 2012 19:53:02 GMT
Date: Wed, 02 Jan 2013 22:54:25 GMT
Expires: Thu, 02 Jan 2014 22:54:25 GMT
X-Content-Type-Options: nosniff
Server: sffe
Content-Length: 93637
X-XSS-Protection: 1; mode=block
Cache-Control: public, max-age=31536000
Age: 169036
...
Note the presence of the Origin header in the request (indicating a CORS request), and the absence of an Access-Control-Allow-Origin header in the response. Thus, even if you put the crossorigin attribute, the CORS check will fail, and your scripts will receive scrubbed error details.
There is a three-year-old issue to enable CORS on the Google CDN server. I wouldn't hold my breath.
tldr: If you want meaningful error messages, you must host all JavaScript yourself, on the same origin.
I have a web page which references a large number of JS and image files. When the page is loaded a second time, 304 requests go to the server. I would like to get an HTTP 200 (from cache) status for cached objects rather than 304.
I am using ASP.NET 4 and IIS 7.
Setting the Expires header is not working; the browser still sends requests that come back as 304. I want HTTP status 200 (from cache).
Please let me know if there is any technique for this.
You've said that setting the expiry doesn't help, but it does if you set the right headers.
Here's a good example: Google's library hosting. Here are the (relevant) headers Google sends back if you ask for a fully-qualified version of a library (e.g., jQuery 1.7.2, not a partial version like jQuery 1 or jQuery 1.7):
Date: Thu, 07 Jun 2012 14:43:04 GMT
Cache-Control: public, max-age=31536000
Expires: Fri, 07 Jun 2013 14:43:04 GMT
(Date isn't really relevant, I just include it so you know when the headers above were generated.) There, as you can see, Cache-Control has been set explicitly with a max-age of a year, and the Expires header also specifies an expiration date a year from now. Subsequent access (within a year) to the same fully-qualified library doesn't even result in an If-Modified-Since request to the server, because the browser knows the cached copy is fresh.
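On your IIS 7 / ASP.NET 4 setup, one way to get equivalent headers onto static content is the clientCache element in web.config (a sketch; pick whatever max-age suits your release cycle):
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Sends Cache-Control: max-age=31536000 (one year) with static files -->
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="365.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>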
This description of caching in HTTP 1.1 in RFC 2616 should help.
I have a Flash application that sends a getURL request for an image file every 60 seconds.
This works fine in all browsers except IE9 with the Internet Options setting "automatically check for newer versions of stored pages" selected.
I set up Charles proxy (http://xk72.com) to watch the requests being sent by my Flash app and confirmed that the request is being suppressed by IE9 when the setting is on Auto, but it works fine if I change the setting to check every time I visit the webpage. This, however, is not an option! I need this to work in all browsers regardless of how the options are set.
Even if I do a page refresh (F5), the ASP page does not reload. The only way to get it to reload is close the browser and restart it.
I have tried adding content headers to disable caching but it does not appear to work.
For example, here are my response headers:
HTTP/1.1 200 OK
Date: Sun, 02 Oct 2011 23:58:31 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Expires: Tue, 09 Nov 2010 14:59:39 GMT
Cache-control: no-cache
max-age: 0
Content-Length: 9691
Content-Type: text/html
Set-Cookie: ASPSESSIONIDACQBSACA=ECJPCLHADMFBDLCBHLJFPBPH; path=/
Cache-control: private
I have read the Microsoft blog (http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx) which states that if I add the content headers, the browser should respect my request, but it obviously does not.
I don't think this is a Flash issue since the html page that holds the Flash object will not even reload.
You can append a random number to the end of the url.
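For example (a sketch in ActionScript 2 style, since the question uses getURL; the parameter name is arbitrary):
// A unique timestamp makes every request URL distinct,
// so IE9 can never satisfy it from its cache.
getURL("myImage.jpg?nocache=" + new Date().getTime());
Because the URL differs on every call, the auto setting never considers the cached copy a match.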
I get some json data with ajax when the page DOM is loaded using jQuery like this:
$(document).ready(function(){
    getData();
});
...where getData() is a simple jQuery ajax call, something like this:
function getData(){
    $.ajax({cache: true, dataType: 'json', url: '/foo/bar'});
}
The Expires header for this request is set to some time in the future, so the next time I load the page, the ajax call should use the cached data. Firefox 3, however, does not.
But, if I instead ask for the data like this:
$(document).ready(function(){
    setTimeout("getData()", 1);
});
Firefox does respect the Expires header, and uses the cache. Any ideas why this would be?
This page mentions that browsers may treat ajax calls that occur when a page loads differently from ajax calls that occur in response to a user UI event.
Edit: I forgot to include the http headers in my original post. I think the headers are fine, because the caching works as long as the request isn't made in an ajax call when the page loads. If I visit the url that the ajax call uses in my browser URL bar, caching works, and as I explain above, caching works if I add a little delay to the ajax call.
Request headers
Host: 10.0.45.64:5004
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.9) Gecko/20100824 Firefox/3.6.9
Accept: application/json, text/javascript, */*
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
X-Requested-With: XMLHttpRequest
Cookie:
Response headers
I set the Expires header to 1 week in the future so that users only need to refresh once a week.
Date: Wed, 04 May 2011 15:32:04 GMT
Last-Modified: Wed, 04 May 2011 15:32:03 GMT
Expires: Wed, 11 May 2011 15:32:03 GMT
Content-Type: text/javascript
Cache-Control: Public
Connection: close
Define an error handler in the $.ajax() call and inspect the response headers (using jqXHR.getAllResponseHeaders(), where jqXHR is the jQuery ajax object), the status code, and responseText.length. You may find that the request actually succeeds, but jQuery treats it as unsuccessful. I recently had a similar issue with cached files and $.ajax(): it turns out that files loaded while the browser is offline, or from a local file, sometimes return a status code of 0. Because that status doesn't fall in the range of success codes (200-299), jQuery considers the request to have failed. See this for what I did to fix the issue. Basically, in your error handler you can check responseText.length: if it is non-empty, consider the request successful and parse the JSON with JSON.parse(). BUT!!! you then have to make sure on the server side that invalid responses are empty.
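A sketch of that approach, grafted onto the getData() from the question:
function getData(){
    $.ajax({
        cache: true,
        dataType: 'json',
        url: '/foo/bar',
        error: function (jqXHR, textStatus, errorThrown) {
            // See what actually came back before deciding it failed.
            console.log(jqXHR.status, textStatus, errorThrown);
            console.log(jqXHR.getAllResponseHeaders());

            // Status 0 with a non-empty body can still be a usable
            // cached/offline response, so treat it as a success.
            if (jqXHR.responseText && jqXHR.responseText.length > 0) {
                var data = JSON.parse(jqXHR.responseText);
                // ... use data as if the request had succeeded ...
            }
        }
    });
}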