I am using XHR to get some HTML. The HTML response contains an img-element, and this image is for some reason cached.
The server serves the image along with the following headers:
cache-control: no-cache, must-revalidate
content-security-policy: default-src 'self'; ...
content-type: image/png
date: Mon, 08 Oct 2018 03:41:00 GMT
expires: Sat, 26 Jul 1997 05:00:00 GMT
server: nginx (Ubuntu)
status: 200
strict-transport-security: max-age=30879000; includeSubDomains; preload
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
x-xss-protection: 1; mode=block
And XHR is used like this:
var xmlHttp = new XMLHttpRequest();
xmlHttp.onreadystatechange = function () {
    if (xmlHttp.readyState === 4 && xmlHttp.status === 200) {
        addToPage(xmlHttp.responseText);
    }
};
xmlHttp.open("GET", url, true);
xmlHttp.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
xmlHttp.send();
Example XHR response:
<h1>Title</h1>
<p>Stuff</p>
<img src="/captcha-generator">
It works as expected when not using XHR. For some reason the image gets cached when it's linked to from within the XHR response.
Why is the image cached, and how do I force the browser to fetch the new image?
I can of course append a unique parameter to break the cache (e.g. /captcha-generator?r={random-string}), but I would like to avoid that.
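For reference, the cache-busting workaround I would like to avoid would look something like this (just a sketch; it rewrites the img src from the XHR response before adding it to the page):

// Cache-busting workaround (the one I'd like to avoid): make the image
// URL unique on every request so the browser cannot reuse a cached copy.
var html = xmlHttp.responseText.replace(
    'src="/captcha-generator"',
    'src="/captcha-generator?r=' + Date.now() + '"'
);
addToPage(html);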
If this only happens in development mode: in Google Chrome, go to the top right corner, click "Customize and control Google Chrome", then open a New Incognito window and develop there (Incognito starts with a fresh cache).
Related
Chrome and Microsoft IE are aborting a POST response but it's working fine in Firefox. When I run a test through Fiddler I notice the POST headers in Chrome/IE do not contain Cache-Control: max-age=0 and Origin: ... but are otherwise the same.
In Chrome, when I press the submit button the POST is sent to the server, processed, and the client aborts the response. After a few reposts the client finally accepts the response; however, the server has already processed each request, so it results in duplicate info. This never happens in Firefox; it always accepts the first response.
It seems to only happen if the request is large (i.e. contains a lot more fields for the server to process), leading me to think this has something to do with the time it takes the server to process the request (in Firefox the request shows as taking about 9 seconds).
Is there something in Firefox that would cause it to wait longer for this response? Or, vice versa, something in IE/Chrome that could be making it end prematurely?
It may not be relevant, but this is a Perl Mason site. These are the response headers on the page which has the form being submitted:
HTTP/1.1 200 OK
Date: Tue, 07 Aug 2018 19:08:57 GMT
Server: Apache
Set-Cookie: ...; path=/
Set-Cookie: TUSKMasonCookie=...; path=/
Expires: Mon, 1 Jan 1990 05:00:00 GMT
Pragma: no-cache
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0, max-age=0
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
It turns out JavaScript on the page was responsible for the reposting: a setTimeout() that recursively called its own function kept being scheduled even after the form data had been posted from within that function.
In Chrome/IE/Edge the form would be POSTed and the function would still set another timeout calling itself. The subsequent call would POST again and abort the connection waiting on the original.
Firefox, however, would not repost, although it too would continue to set the timeout and re-call the function.
The fix was to add a flag tracking when the POST was submitted and, once set, to stop the timeout cycle:
function countdown(secs, starttime) {
    var d = new Date();
    if (!starttime) {
        starttime = d.getTime() / 1000;
    }
    var nowtime = d.getTime() / 1000;
    var timeleft = (starttime + secs) - nowtime;
    timeleft = Math.max(0, timeleft);
    var isposted = false; // <-- Added as fix
    if (timeleft == 0) {
        isposted = true; // <-- Added as fix
        alert("Time is up. Click OK.");
        var frm = document.getElementById("frm");
        frm.submit();
    }
    if (!isposted) { // <-- Added as fix
        // note: the string passed to setTimeout() is evaluated like eval()
        timeout = setTimeout(["countdown(", secs, ", ", starttime, ")"].join(""), 1000);
    }
}
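An alternative (my own sketch, not part of the original fix) is to simply stop rescheduling once the form has been posted, using a function callback instead of the eval'd string:

var timeout; // id of the pending timer, if any

function countdown(secs, starttime) {
    var now = new Date().getTime() / 1000;
    if (!starttime) starttime = now;
    var timeleft = Math.max(0, (starttime + secs) - now);
    if (timeleft === 0) {
        alert("Time is up. Click OK.");
        document.getElementById("frm").submit();
        return; // do not schedule another tick once the form is posted
    }
    // schedule the next tick with a closure instead of an eval'd string
    timeout = setTimeout(function () {
        countdown(secs, starttime);
    }, 1000);
}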
I'm trying to get the response headers of a website and read the cookies I can find inside. There is no problem when I start the request from a terminal with "node file.js": it works.
But I'm launching the POST request from a local HTML page, so I used Browserify to be able to send the request from the client side (or at least that is what I understood).
Then I'm using a CORS proxy to get around "No Access-Control-Allow-Origin header" problems. When I start the request from my local HTML page it works and I can read the body, but I'm not able to find the response headers of the website and get the cookie that I need.
Here is what my request looks like:
function connectionInfoConcert(account, psw, next) {
    console.log("-------------CREATE HEADER-------------\n");
    var options = {
        method: 'POST',
        url: 'https://tranquil-thicket-71867.herokuapp.com/https://www.infoconcert.com/mon-infoconcert/connexion.html',
        headers: {
            'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
            'Accept-Language': 'fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7',
            'Cache-Control': 'max-age=0',
            'Connection': 'keep-alive',
            'Content-Length': '184',
            'Content-Type': 'application/x-www-form-urlencoded',
            'origin': 'https://www.infoconcert.com',
            'referer': 'https://www.infoconcert.com/mon-infoconcert/connexion.html'
        },
        form: {
            origin: '',
            username: account,
            password: psw
        }
    };
    console.log("-------------END HEADER-------------\n-----------START REQUEST-----------------------");
    request(options, function (error, response, body) {
        if (!error) {
            console.log(response);
            console.log(body);
            console.log(response.headers['set-cookie'][0]); // obviously 'set-cookie' doesn't exist
        } else {
            console.log(error);
            console.log(response);
            console.log("-------------ERROR------------");
            return console.log("Something went wrong");
        }
    });
}
https://www.infoconcert.com/mon-infoconcert/connexion.html is the website where I try to get the cookie.
https://tranquil-thicket-71867.herokuapp.com is the CORS proxy that I'm using.
If some of you know how to get the correct headers, that would be really nice!
Edit 1:
The problem: when I make the request directly to "https://www.infoconcert.com/mon-infoconcert/connexion.html" I can find the cookie I'm looking for in response.headers["set-cookie"].
But because I'm making the request from an HTML page that doesn't work, so I make the request to "https://tranquil-thicket-71867.herokuapp.com/https://www.infoconcert.com/mon-infoconcert/connexion.html" instead, and there I can't find my cookie in the response.
Here is the header of the response I got from "https://tranquil-thicket-71867.herokuapp.com/https://www.infoconcert.com/mon-infoconcert/connexion.html":
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: server,vary,cache-control,content-type,content-encoding,p3p,date,expires,pragma,connection,x-cache-info,x-final-url,access-control-allow-origin
Cache-Control: private, no-cache, no-store, proxy-revalidate, no-transform
Connection: keep-alive
Content-Encoding: gzip
Content-Type: text/html; charset=UTF-8
Date: Thu, 03 May 2018 17:23:49 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
P3p: policyref="/w3c/p3p.xml", CP="NOI DSP COR NID CUR ADM DEV OUR BUS"
Pragma: no-cache
Server: Apache
Transfer-Encoding: chunked
Vary: Accept-Encoding,User-Agent
Via: 1.1 vegur
X-Cache-Info: not cacheable; response specified "Cache-Control: private"
X-Cors-Redirect-1: 302 https://www.infoconcert.com/mon-infoconcert/index.html
X-Cors-Redirect-2: 302 https://www.infoconcert.com/mon-infoconcert/connexion.html
X-Final-Url: https://www.infoconcert.com/mon-infoconcert/connexion.html
X-Request-Url: https://www.infoconcert.com/mon-infoconcert/connexion.html
So I think it's my proxy that doesn't forward the cookie it gets.
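One thing worth knowing (my observation, not from the proxy's documentation): the Access-Control-Expose-Headers line above lists exactly which response headers the browser will let a script read, and set-cookie is not among them; browsers refuse to expose Set-Cookie to XHR in any case. A quick sketch to see what is actually readable, using the same request() callback shape as above:

request(options, function (error, response, body) {
    if (error) return console.log(error);
    // Dump every header the proxy let through. Anything missing from
    // Access-Control-Expose-Headers (such as set-cookie) will not show
    // up here when the code runs in a browser.
    Object.keys(response.headers).forEach(function (name) {
        console.log(name + ': ' + response.headers[name]);
    });
});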
If you need more details, just ask.
Thank you very much for your help.
I have Android 4.0.4 and a PhoneGap 3.6.3 app that makes a synchronous XMLHttpRequest:
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://myserver/myapp/api/ProgramOptions', false);
xhr.setRequestHeader("Accept","application/json");
xhr.send(null);
window.alert(xhr.responseType);
window.alert(xhr.readyState);
window.alert(xhr.statusText);
window.alert(xhr.status);
window.alert(xhr.responseText);
window.alert(xhr.getAllResponseHeaders());
When loading the web app in the browser, I get
""
4
"OK"
200
"{"success":true,data:[/*snip*/]}"
"Pragma:no-cache
Date:Tue, 17 Feb 2015 09:40:45 GMT
Content-Encoding:gzip
WWW-Authenticate:Negotiate oYG2MIGz/*snip*/
Server:Microsoft-IIS/7.5
X-AspNet-Version:4.0.30319
X-Powered-By:ASP.NET
Persistent-Auth:false
Vary:Accept-Encoding
Content-Type:application/json; charset=utf-8
Access-Control-Allow-Origin:*
Cache-Control:no-cache
Content-Length:10586
Expires:-1"
When loading the PhoneGap app, I get
""
4
""
0
""
""
Does anyone know why that is? I already searched for status 0, but there are a helluva lot of reasons, and none I found (except cross-origin requests) comes with an empty responseText. But then, the config.xml contains <access origin="*" />, and, as you can see, the server also sends an Access-Control-Allow-Origin header, so there should be no issue with CORS!?
I am told that MsXML2 follows redirects. However I get a "HTTP 0" error from the script when accessing a URL that has moved.
The reason I need to make this work is that this is a Windows (Sidebar) Gadget used by 300,000 users, and I am moving the website and want all calls from old versions to still go through.
This is the code, simplified:
function MyHttpCall() {
    var httpReq = new ActiveXObject("Msxml2.XMLHTTP.6.0");
    httpReq.onreadystatechange = function() {
        if (httpReq.readyState < 4) return;
        if (httpReq.status != 200) alert("HTTP " + httpReq.status);
        alert("Houston we have contact");
    };
    httpReq.open("GET", myURL, true);
    httpReq.setRequestHeader("Cache-Control", "no-store, no-cache, must-revalidate");
    httpReq.setRequestHeader("Cache-Control", "post-check=0, pre-check=0");
    httpReq.setRequestHeader("Pragma", "no-cache");
    httpReq.setRequestHeader("If-Modified-Since", "Tue, 01 Jan 2008 00:00:00 GMT");
    httpReq.send();
}
I assume this has to do with httpReq.status != 200, but I thought readystatechange fires continuously as the state changes: one event for the HTTP 301, and another one for the HTTP 200.
According to a Microsoft article, cross-domain redirects are not allowed in MSXML. That could very possibly be the case here.
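For what it's worth, redirects are followed internally by the component, so readystatechange only ever reports the final response; you never see the intermediate 301. A defensive sketch (my assumption here is that a refused cross-domain redirect surfaces as status 0):

httpReq.onreadystatechange = function() {
    if (httpReq.readyState < 4) return;
    if (httpReq.status === 200) {
        alert("Houston we have contact");
    } else if (httpReq.status === 0) {
        // the request never completed, e.g. a blocked cross-domain redirect
        alert("HTTP 0 - no response was received");
    } else {
        alert("HTTP " + httpReq.status);
    }
};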
This is driving me nutters.
jQuery 1.4.2, Windows XP SP3
Here is my test:
1. Load Firefox 3.5+ and go to http://plungjan.name/test/testcors.html: it works.
2. Save the file to the hard disk and run it from there: from my office the external call works, but the internal one does not.
What is also interesting is that I cannot run both in one go.
Background:
I do a GET to an internal web service that uses CORS.
Please do NOT post any answers about FF not handling cross-domain requests; it does since v3.5, as detailed here and here.
It works in IE8 and FF3.6.6 from one server to the other, and now almost from the file system (file:///) to the service.
Only from the file system, and only when FF 3.6.6 needs to negotiate (the user is already logged in, authorised, and sends the credentials!), do I not get the data after negotiation. The jQuery XHR returns status 0 and no data/responseText or whatever.
It seems to me jQuery reacts to and saves the XHR from the 401 rather than from the 200 OK that comes later.
Here is the result I get at the end of the communication when I alert the XHR object:
Status:success
Data:[]
XHR:
some native functions,
readyState:4
status:0
responseXML:null
responseText:
withCredentials:true
If I make a call to the same server but without needing credentials, the data is returned just fine cross-domain.
So the communication is as follows:
GET /restapplicationusingcors/authenticationneeded-internal/someid
Accept: application/json
Accept-Language: en
.
.
Origin: null
Cookie: LtpaToken=...
The return is:
HTTP/1.1 401 Unauthorized
Server: Apache
Pragma: No-cache
Cache-Control: no-cache
Expires: Thu, 01 Jan 1970 01:00:00 CET
WWW-Authenticate: Negotiate
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html
Then FF sends
GET /restapplicationusingcors/authenticationneeded-internal/someid HTTP/1.1
Host: myhost.myintranet.bla
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6
Accept: application/json
Accept-Language: en
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Origin: null
Cookie: LtpaToken=....
Authorization: Negotiate ....
and is rewarded with the file I need, but cannot get at in FF:
HTTP/1.1 200 OK
Date: Tue, 20 Jul 2010 12:08:39 GMT
Pragma: No-cache
Cache-Control: no-cache, max-age=600, s-maxage=3600
Expires: Thu, 01 Jan 1970 01:00:00 CET
X-Powered-By: ...
Content-Disposition: inline;filename=nnnnnn.json
Content-Language: en
Access-Control-Allow-Origin: ...
Keep-Alive: timeout=6, max=70
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/json;charset=UTF-8
THE DATA SENT FROM THE SERVER IS NOT IN THE XHR OBJECT
Here is my code
function getJSON(url, func, lang) {
    var accept = 'application/json'; // was an implicit global
    lang = lang ? lang : "*";
    // gruesome hack to handle IE, which APPENDS the mime header to */* !!!
    // NOW HANDLED by first setting Accept to "" !!!
    // if ($.browser.msie && url.indexOf('serveAsMime')==-1) {
    //     url += '?serveAsMime='+accept;
    // }
    if (currentRequest != null) currentRequest.abort();
    var requestObjectJSON = {
        url: url,
        // dataType: "json",
        type: 'get', // jQuery's option is 'type'; 'method' is ignored in 1.4
        beforeSend: function(xhr) {
            xhr.setRequestHeader('Accept', ""); // IE hack
            xhr.setRequestHeader('Accept', accept);
            xhr.setRequestHeader('Accept-Language', lang);
            if (url.indexOf('-internal') != -1) {
                try {
                    xhr.withCredentials = true;
                    alert('set credentials');
                } catch (e) {
                    alert('cannot set xhr with credentials');
                }
            }
        },
        success: function(data, status, xhr) {
            var responseText = xhr.responseText;
            var responseJSON = xhr.responseJSON;
            var t = "";
            try {
                for (var o in xhr) t += '\n' + o + ':' + xhr[o];
            } catch (e) {
                if (e.message.indexOf('.channel') == -1) alert(e.message);
            }
            alert('Status:' + status + '\nData:[' + data + ']\nXHR:' + t);
            func(responseText);
        }
    };
    currentRequest = $.ajax(requestObjectJSON);
}
This is a stab in the dark since I don't fully understand your problem, but I think you might be having a problem with file: URLs, which are not treated as having any origin. I'm not sure it's even possible to authorize CORS from a file URL.
So you need to set an ajax prefilter in your model/collection in order to use CORS. Otherwise it doesn't send the cookie.
$.ajaxPrefilter( function( options, originalOptions, jqXHR ) {
options.xhrFields = {
withCredentials: true
};
});
I put this in my Model/Collection initialize function.
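If you would rather not install a global prefilter, the same option can be passed per request (a sketch using the standard jQuery API; url here is whatever endpoint you are calling):

// Per-request alternative: only this call sends credentials cross-origin.
$.ajax({
    url: url,
    xhrFields: {
        withCredentials: true
    }
});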
These are the conditions that must be met to make CORS work with secured services:
1. The service response should contain the header Access-Control-Allow-Credentials: true (see Requests with credentials and Cannot use wildcard in Access-Control-Allow-Origin when credentials flag is true).
2. The service response header Access-Control-Allow-Origin should not be *. The idea is to return the value the client passed in the Origin header (see examples in this post).
3. According to the specification, the OPTIONS method should return HTTP code 200, thus it cannot be secured (see The CORS).
4. For PUT/POST methods that need to pass certain request headers to the service (like Content-Type or Accept), these headers need to be listed in Access-Control-Allow-Headers (see jQuery AJAX fails to work when headers are specified).
5. JavaScript should set this XMLHttpRequest property: xhr.withCredentials = true; (as answered by Kirby).
Altogether, the configuration for Apache (a client-side counterpart is sketched after the config):
# Static content:
SetEnvIf Request_URI ".*" no-jk
# RESTful service:
SetEnvIf Request_URI "^/backend/" !no-jk
SetEnvIf Request_Method "OPTIONS" no-jk
# Fallback value:
SetEnv http_origin "*"
SetEnvIf Origin "^https?://(localhost|.*\.mycompany\.org)(:[0-9]+)?$" http_origin=$0
Header set Access-Control-Allow-Credentials "true"
Header set Access-Control-Allow-Origin "%{http_origin}e"
Header set Access-Control-Allow-Methods "GET,POST,PUT,DELETE"
Header set Access-Control-Allow-Headers "Content-Type, Accept"
JkMount /* loadbalancer
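A minimal client-side sketch to exercise this configuration (my example; the /backend/items path and the JSON payload are made up, chosen to match the JkMount and header rules above):

var xhr = new XMLHttpRequest();
// withCredentials matches condition 5; Content-Type must be listed in
// Access-Control-Allow-Headers (condition 4); POST is among the methods
// allowed by Access-Control-Allow-Methods.
xhr.open('POST', 'https://localhost/backend/items', true);
xhr.withCredentials = true;
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
        console.log(xhr.status + ' ' + xhr.responseText);
    }
};
xhr.send(JSON.stringify({ name: 'test' }));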
CORS with file://
If you have problems allowing origins from the file:// protocol: according to The Web Origin Concept, it should be done the same way as for any other origin. I could not find information about browser support, but I think every browser that supports CORS supports this one too.
The Web Origin Concept tells us the following about the file URI scheme:
4. If uri-scheme is "file", the implementation MAY return an
implementation-defined value.
NOTE: Historically, user agents have granted content from the
file scheme a tremendous amount of privilege. However,
granting all local files such wide privileges can lead to
privilege escalation attacks. Some user agents have had
success granting local files directory-based privileges, but
this approach has not been widely adopted. Other user agents
use globally unique identifiers for each file URI, which is
the most secure option.
According to Wikipedia, the host for the file URI scheme is localhost. It can be omitted in the address bar, but I don't think it can be omitted in the allow-origin headers. So if your browser implementation allows origins with a file URI scheme, then you should add file://localhost to your allowed origins, and everything should work properly after that.
This was how it should work, now meet reality:
I tested with the current Firefox 29.0.1, and it did not work. However, this implementation transforms the file:// protocol into a null origin, so with Firefox, null works. I tried a wider domain list, but I did not manage to allow multiple domains; it seems Firefox does not currently support a list of multiple domains.
I tested with Chrome 35.0.1916; it works the same way Firefox did.
I tested with MSIE 11.0.9600. For requests from the file protocol it always shows an "allow blocked content" button, even without allowing the null origin. For other domains it works the same way as the previous browsers.
HTTP basic auth:
The credentials part I tried out with PHP and HTTP basic auth.
http://test.loc
Displays :-) when logged in and :-( when unauthorized.
<?php
function authorized()
{
    if (empty($_SERVER['PHP_AUTH_USER']) || empty($_SERVER['PHP_AUTH_PW']))
        return false;
    return ($_SERVER['PHP_AUTH_USER'] == 'username' && $_SERVER['PHP_AUTH_PW'] == 'password');
}

function unauthorized()
{
    header('HTTP/1.1 401 Unauthorized');
    header('WWW-Authenticate: Basic realm="Restricted Area"');
    echo ':-(';
}

if (!isset($_GET['logout']) && authorized()) {
    echo ':-)';
} else {
    unauthorized();
}
So with this code, login and logout are driven by changing the location (the ?logout parameter logs you out).
Cross domain CORS with HTTP basic auth
http://todo.loc
Gets the content of http://test.loc with cross domain XHR and displays it.
cross domain ajax<br />
<script>
var xhr = new XMLHttpRequest();
xhr.open('GET', "http://test.loc", true);
xhr.withCredentials = true;
xhr.onreadystatechange = function (){
if (xhr.readyState==4) {
document.body.innerHTML += xhr.responseText;
}
};
xhr.send();
</script>
Requires these headers from http://test.loc:
Access-Control-Allow-Origin: http://todo.loc
Access-Control-Allow-Credentials: true
Cross scheme CORS with HTTP basic auth
file:///path/x.html
Gets the content of http://test.loc with cross scheme XHR and displays it.
cross scheme ajax<br />
<script>
var xhr = new XMLHttpRequest();
xhr.open('GET', "http://test.loc", true);
xhr.withCredentials = true;
xhr.onreadystatechange = function (){
if (xhr.readyState==4) {
document.body.innerHTML += xhr.responseText;
}
};
xhr.send();
</script>
Requires these headers from http://test.loc:
Access-Control-Allow-Origin: null
Access-Control-Allow-Credentials: true
Conclusion:
I tested cross-scheme CORS with credentials called from file://, and it works pretty well in Firefox, Chrome and MSIE.