I have this code:
window.onload = function() {
    document.cookie = 'foo=bar; expires=Sun, 01 Jan 2012 00:00:00 +0100; path=/';
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/showcookie.php", true);
    xhr.setRequestHeader("Cookie", "foo=quux");
    xhr.setRequestHeader("Foo", "Bar");
    xhr.setRequestHeader("Foo", "Baz");
    xhr.withCredentials = true;
    var pre = document.getElementById('output');
    xhr.onreadystatechange = function() {
        if (4 == xhr.readyState) {
            pre.innerHTML += xhr.responseText + "\n";
        }
    };
    xhr.send(null);
};
and this /showcookie.php
<?php
print_r($_COOKIE);
?>
and it always shows
Array
(
[Host] => localhost
[User-Agent] =>
[Accept] =>
[Accept-Language] => pl,en-us;q=0.7,en;q=0.3
[Accept-Encoding] => gzip,deflate
[Accept-Charset] => ISO-8859-2,utf-8;q=0.7,*;q=0.7
[Keep-Alive] => 115
[Connection] => keep-alive
[foo] => Baz
[Referer] =>
[Cookie] => foo=bar
)
Array
(
[foo] => bar
)
I'm using Firefox 3.6.13, Opera 11.00 and Chromium 9.0 on Ubuntu.
Does anybody have the same problem, or is it just impossible to modify the Cookie header?
The Cookie header is one of several which cannot be modified in an XMLHttpRequest. From the specification:
Terminate [execution of the setRequestHeader method] if header is a case-insensitive match for one of the following headers:
Accept-Charset
Accept-Encoding
Connection
Content-Length
Cookie
Cookie2
Content-Transfer-Encoding
Date
Expect
Host
Keep-Alive
Referer
TE
Trailer
Transfer-Encoding
Upgrade
User-Agent
Via
… or if the start of header is a case-insensitive match for Proxy- or Sec- (including when header is just Proxy- or Sec-).
The above headers are controlled by the user agent to let it control those aspects of transport. This guarantees data integrity to some extent. Header names starting with Sec- are not allowed to be set to allow new headers to be minted that are guaranteed not to come from XMLHttpRequest.
I think this might be a hard constraint on the XHR functionality.
Setting the client-side document.cookie caused the Cookie header to be sent in requests as expected. If you want to pass a cookie value in an AJAX request, this might be the way to go.
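For example, a minimal sketch of that approach (the cookie name and value here are just placeholders):
document.cookie = 'foo=quux; path=/';      // set the value client-side instead of via setRequestHeader
var xhr = new XMLHttpRequest();
xhr.open('GET', '/showcookie.php', true);
xhr.withCredentials = true;                // only needed for cross-origin requests
xhr.send(null);                            // the browser attaches "Cookie: foo=quux" on its own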
A workaround is to send a custom header to the php script with the cookie string you want to set:
// in the js...
xhr.open("GET", "showcookie.php",true);
//xhr.setRequestHeader("Cookie", "foo=quux");
xhr.setRequestHeader("X-Set-Cookie", "foo2=quux");
xhr.withCredentials = true;
Then in your showcookie.php you can grab the custom header value and send a Set-Cookie response header:
$cookie = $_SERVER['HTTP_X_SET_COOKIE'];
// NOTE: really should sanitise the cookie input.
header('Set-Cookie: ' . $cookie);
print_r($_COOKIE);
Note that you won't see the cookie in a request header until the response has been parsed by the browser. Also please make sure you sanitise the contents of the X-Set-Cookie header - this is a proof of concept only :)
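On the JavaScript side a usage sketch might look like this (checking document.cookie afterwards is just my own illustration of the effect):
var xhr = new XMLHttpRequest();
xhr.open("GET", "showcookie.php", true);
xhr.setRequestHeader("X-Set-Cookie", "foo2=quux");
xhr.onreadystatechange = function() {
    if (xhr.readyState == 4) {
        // By now the browser has parsed the Set-Cookie response header,
        // so foo2 should appear here and be sent with subsequent requests.
        console.log(document.cookie);
    }
};
xhr.send(null);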
Related
I hope this is not a duplicate...
I am trying to POST a user email & password to a PHP file and it seems that the PHP file isn't getting those values.
The js code:
function ReceiveLoginData() {
    let text = this.responseText;
    console.log(text);
    let json_data = JSON.parse(
        text.substring(1, text.length - 1).replaceAll("\\u0022", "\"")
    );
    // there is a lot more code... but its irrelevant.
}
function SubmitLogin() {
    var email_addr = document.getElementsByClassName("login-email")[0].value;
    var passwd = document.getElementsByClassName("login-passwd")[0].value;
    var req = new XMLHttpRequest();
    req.onload = ReceiveLoginData;
    // req.onreadystatechange = ReceiveLoginData; // does not work...
    req.open("POST", "/users/auth/login.php"); // ...,true); or ...,false); fail too...
    req.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    let data_to_send = "uemail=" +
        window.encodeURIComponent(email_addr) +
        "&upasswd=" +
        window.encodeURIComponent(passwd);
    // data_to_send = "uemail="+email_addr ... works neither
    req.send(data_to_send);
}
PHP (actually its location is localhost:4000/users/auth/login.php)
<?php
$uemail = $_POST["uemail"];
$upasswd = $_POST["upasswd"];
$login_err = true;
// set it to false otherwise
function SendData(string $str)
{
    echo json_encode($str, JSON_HEX_QUOT | JSON_HEX_APOS);
}
function main_fn()
{
    $uemail = strtolower($uemail);
    if (strlen($uemail) == 0) {
        SendData("[\"noemail\"]");
    }
    // and much more but again irrelevant...
}
main_fn();
?>
I learnt that using window.encodeURIComponent(...) is safer from here: https://stackoverflow.com/a/17382629/18243229
but neither way works.
What I got to know after literally 5 hours of debugging and getting fed up (I blame my noviceness):
The PHP file is being executed: the ReceiveLoginData function prints ["noemail"] whenever the submit button is pressed.
The Network tab in Chrome's dev tools shows that a connection is established with the PHP file.
Some information which might just be useful:
Response Headers (source):
HTTP/1.1 200 OK
Host: localhost:4000
Date: Sun, 18 Sep 2022 16:59:49 GMT
Connection: close
X-Powered-By: PHP/8.1.10
Content-type: text/html; charset=UTF-8
Request Headers (source):
POST /users/auth/login.php HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Content-Length: 31
Content-type: application/x-www-form-urlencoded
Host: localhost:4000
Origin: http://localhost:4000
Referer: http://localhost:4000/users/auth/auth.html?
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/105.0.0.0 Safari/537.36
sec-ch-ua: "Google Chrome";v="105", "Not)A;Brand";v="8", "Chromium";v="105"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Linux"
Payload: (source | URL encoded)
uemail=email%40gmail.com&upasswd=1234
uemail: email%40gmail.com
upasswd: 1234
Response:
"[\u0022noemail\u0022]"
What else I did...
I didn't waste those 5 hours on this project...
I tried to remake a smaller project with the same mechanism and the same JS code calling a PHP file and voilà, the PHP file got the values posted to it...
Everything "seems" correct according to my knowledge but why does PHP not get the $_POST values?
Also, I'm currently focusing on Google Chrome and am on Linux (I guess that makes no difference...)
From the code you have posted I can spot one problem.
The $uemail = $_POST["uemail"]; assignment is in the global scope, and the code inside the main_fn function is trying to use that variable, but it is not available there because in PHP a function does not see global variables unless you pass them in (or declare them global). So it seems to me you need to pass them as arguments to get them into the function's scope.
Changing the function definition
from: function main_fn()
to: function main_fn($uemail, $upasswd)
and calling it
with: main_fn($uemail, $upasswd);
instead of: main_fn();
should do the trick
Hope this helps :-)
I want to get server information via fetch, but if a session is started on the PHP page, nothing is returned.
Simplified example of my code:
JS:
fetch(url)
PHP:
session_start();
echo json_encode( array('a' => 1, 'b' => 2, 'c' => 3, 'd' => 4, 'e' => 5) );
As soon as I remove the session part, the JSON is returned.
When I use XMLHttpRequest everything is working as expected:
var xhttp = new XMLHttpRequest();
xhttp.open("POST", url, true);
xhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
Maybe I have to change the fetch parameters, but I already tried different ones and nothing changed.
It seems you just don't see the response in the browser console.
Try
fetch(url).then(function(res) {
    return res.json();
}).then(function(data) {
    console.log(data);
});
and you will see it.
I found this is because of the header:
Cache-Control: no-store, no-cache, must-revalidate
Calling session_cache_limiter('') before session_start() prevents PHP from automatically sending any caching headers. Calling it with any value other than 'nocache' sends headers that allow caching.
This can also be controlled through the php.ini directive session.cache_limiter.
I'm implementing a simple HTTP server in Qt, with the purpose of streaming real-time data to an XMLHttpRequest object (AJAX/JavaScript).
The problem is that the design pattern requires partial transmission of data via the socket connection, changing the readyState in the XHR from '1' (Request) to '2' (Headers received), and then to '3' (Data received) - keeping the request pending. This is also known as "long-polling", or "Comet" and should be possible in most browsers.
However, it stays in request state until the connection is closed, and then readyState '2' and '4' are received. This is normal for HTTP GET, but not desired for this application.
JavaScript:
var request = new XMLHttpRequest();
request.onreadystatechange = function() {
console.log('readyState: ' + this.readyState + ' ' + this.status)
}
request.open("get", "localhost:8080/", true);
request.send();
Qt:
connect(socket, &QTcpSocket::readyRead, [=]()
{
    QByteArray data = m_socket->read(1000);
    socket->write("HTTP/1.1 200 OK\r\n");
    socket->write("Content-Type: text/octet-stream\r\n");
    socket->write("Access-Control-Allow-Origin: *\r\n");
    socket->write("Cache-Control: no-cache, no-store, max-age=0, must-revalidate\r\n");
    socket->flush();
});
So the big question is: How can I make the network system underneath the QtTcpSocket flush pending data after writing the headers (and later, the data), without the need to disconnect first?
A side note: I originally implemented this using WebSockets, but the browser I have to use does not support this.
EDIT:
The HTTP headers must be terminated with an extra "\r\n". Now it works:
connect(socket, &QTcpSocket::readyRead, [=]()
{
    QByteArray data = m_socket->read(1000);
    socket->write("HTTP/1.1 200 OK\r\n");
    socket->write("Content-Type: text/octet-stream\r\n");
    socket->write("Access-Control-Allow-Origin: *\r\n");
    socket->write("Cache-Control: no-cache, no-store, max-age=0, must-revalidate\r\n");
    socket->write("\r\n");
    socket->flush();
});
Got it working now after a full day of trying different HTTP header configurations. It seems like user 'peppe' was on to something, and the only thing I needed was to add "\r\n" after the headers! (See edit).
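With the blank line terminating the headers, the browser starts firing readyState 3 repeatedly as chunks arrive. A sketch of how the client side might consume the partial responseText (the cursor bookkeeping is my own addition, not from the original code):
var request = new XMLHttpRequest();
var seen = 0; // how much of responseText has already been processed
request.onreadystatechange = function() {
    if (this.readyState == 3 || this.readyState == 4) {
        var chunk = this.responseText.substring(seen); // only the newly received part
        seen = this.responseText.length;
        if (chunk.length) {
            console.log('chunk:', chunk);
        }
    }
};
request.open("GET", "http://localhost:8080/", true);
request.send();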
It seems that I am unable to change most request headers from JavaScript when making an AJAX call using XMLHttpRequest. Note that request.setRequestHeader has to be called after request.open() in Gecko browsers (see http://ajaxpatterns.org/Talk:XMLHttpRequest_Call). When I set the Referer, it doesn't get set (I looked at the request headers sent using Firebug and Tamper Data). When I set User-Agent, it messed up the AJAX call completely. Setting Accept and Content-Type does work, however. Are we prevented from setting Referer and User-Agent in Firefox 3?
var request = new XMLHttpRequest();
var path="http://www.yahoo.com";
request.onreadystatechange=state_change;
request.open("GET", path, true);
request.setRequestHeader("Referer", "http://www.google.com");
//request.setRequestHeader("User-Agent", "Mozilla/5.0");
request.setRequestHeader("Accept","text/plain");
request.setRequestHeader("Content-Type","text/plain");
request.send(null);
function state_change()
{
    if (request.readyState == 4)
    {   // 4 = "loaded"
        if (request.status == 200)
        {   // 200 = OK
            // ...our code here...
            alert('ok');
        }
        else
        {
            alert("Problem retrieving XML data");
        }
    }
}
W3C spec on setRequestHeader.
The brief points:
If the request header had already been set, then the new value MUST be concatenated to the existing value using a U+002C COMMA followed by a U+0020 SPACE for separation.
UAs MAY give the User-Agent header an initial value, but MUST allow authors to append values to it.
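The appending rule is easy to verify with a header that is not on the forbidden list (a quick sketch; the endpoint and header name are placeholders of my own):
var xhr = new XMLHttpRequest();
xhr.open("GET", "/echo-headers.php", true);
xhr.setRequestHeader("X-Foo", "Bar");
xhr.setRequestHeader("X-Foo", "Baz");
// the request goes out with a single header: X-Foo: Bar, Baz
xhr.send(null);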
However - After searching through the framework XHR in jQuery they don't allow you to change the User-Agent or Referer headers. The closest thing:
// Set header so the called script knows that it's an XMLHttpRequest
xhr.setRequestHeader("X-Requested-With", "XMLHttpRequest");
I'm leaning towards the opinion that what you want to do is being denied by a security policy in FF - if you want to pass some custom Referer type header you could always do:
xhr.setRequestHeader('X-Alt-Referer', 'http://www.google.com');
gnarf's answer is right. I wanted to add more information.
Mozilla Bug Reference : https://bugzilla.mozilla.org/show_bug.cgi?id=627942
Terminate these steps if header is a case-insensitive match for one of the following headers:
Accept-Charset
Accept-Encoding
Access-Control-Request-Headers
Access-Control-Request-Method
Connection
Content-Length
Cookie
Cookie2
Date
DNT
Expect
Host
Keep-Alive
Origin
Referer
TE
Trailer
Transfer-Encoding
Upgrade
User-Agent
Via
Source : https://dvcs.w3.org/hg/xhr/raw-file/tip/Overview.html#dom-xmlhttprequest-setrequestheader
For people looking this up now:
It seems that setting the User-Agent header has been allowed since Firefox 43. See https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_header_name for the current list of forbidden headers.
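A quick way to see the difference in the console (my own sketch, not from the linked documentation): per the spec, setting a forbidden header is simply ignored rather than throwing, while a custom header goes through.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/", true);
xhr.setRequestHeader("Referer", "http://www.google.com");       // forbidden: ignored (some browsers log a warning)
xhr.setRequestHeader("X-Alt-Referer", "http://www.google.com"); // allowed: actually sent
xhr.send(null); // compare the two in the Network tab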
This is driving me nutters.
jQuery 1.4.2, windows XP sp3
Here is my test.
Load Firefox 3.5+
http://plungjan.name/test/testcors.html
works
Save the file to the hard disk and run from there
From my office the external call works and the internal does not
What is also interesting is that I cannot run both in one go.
Background:
I do a GET to an internal web service that uses CORS.
Please do NOT post any answers about FF not handling cross-domain requests, since it has since v3.5, as detailed here and here.
It works in IE8 and FF3.6.6 from one server to the other and now almost from file system (file:///) to service.
Only from file system and only when FF 3.6.6 needs to negotiate (the user is already logged in, authorised and sends the credentials!) do I not get the data after negotiation. jQuery xhr returns status 0 and no data/responseText or whatever
It seems to me jQuery reacts to and saves the xhr from the 401 rather than from the 200 OK that comes later.
Here is the result I get at the end of the communication when I alert the XHR object:
Status:success
Data:[]
XHR:
some native functions,
readyState:4
status:0
responseXML:null
responseText:
withCredentials:true
If I make a call to the same server but without needing credentials, the data is returned just fine cross-domain.
So the communication is as follows:
GET /restapplicationusingcors/authenticationneeded-internal/someid
Accept: application/json
Accept-Language: en
.
.
Origin: null
Cookie: LtpaToken=...
the return is
HTTP/1.1 401 Unauthorized
Server: Apache
Pragma: No-cache
Cache-Control: no-cache
Expires: Thu, 01 Jan 1970 01:00:00 CET
WWW-Authenticate: Negotiate
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html
Then FF sends
GET /restapplicationusingcors/authenticationneeded-internal/someid HTTP/1.1
Host: myhost.myintranet.bla
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6
Accept: application/json
Accept-Language: en
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Origin: null
Cookie: LtpaToken=....
Authorization: Negotiate ....
and is rewarded with the file I need, but cannot get at in FF:
HTTP/1.1 200 OK
Date: Tue, 20 Jul 2010 12:08:39 GMT
Pragma: No-cache
Cache-Control: no-cache, max-age=600, s-maxage=3600
Expires: Thu, 01 Jan 1970 01:00:00 CET
X-Powered-By: ...
Content-Disposition: inline;filename=nnnnnn.json
Content-Language: en
Access-Control-Allow-Origin: ...
Keep-Alive: timeout=6, max=70
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/json;charset=UTF-8
THE DATA SENT FROM THE SERVER IS NOT IN THE XHR OBJECT
Here is my code
function getJSON(url, func, lang) {
    accept = 'application/json';
    lang = lang ? lang : "*";
    // gruesome hack to handle that APPENDS the mime header to */* !!!
    // NOW HANDLED by first setting Accept to "" !!!
    // if ($.browser.msie && url.indexOf('serveAsMime')==-1) {
    //     url += '?serveAsMime='+accept;
    // }
    if (currentRequest != null) currentRequest.abort();
    var requestObjectJSON = {
        url: url,
        // dataType: "json",
        method: 'get',
        beforeSend: function(xhr) {
            xhr.setRequestHeader('Accept', ""); // IE hack
            xhr.setRequestHeader('Accept', accept);
            xhr.setRequestHeader('Accept-Language', lang);
            if (url.indexOf('-internal') != -1) {
                try {
                    xhr.withCredentials = true;
                    alert('set credentials')
                }
                catch (e) {
                    alert('cannot set xhr with credentials')
                }
            }
        },
        success: function(data, status, xhr) {
            var responseText = xhr.responseText;
            var responseJSON = xhr.responseJSON;
            var t = "";
            try {
                for (var o in xhr) t += '\n' + o + ':' + xhr[o];
            }
            catch (e) {
                if (e.message.indexOf('.channel') == -1) alert(e.message);
            }
            alert('Status:' + status + '\nData:[' + data + ']\nXHR:' + t);
            func(responseText);
        },
    }
    currentRequest = $.ajax(requestObjectJSON);
}
This is a stab in the dark since I don't fully understand your problem, but I think you might be having a problem with file: URLs, which are not treated as having any origin. I'm not sure it's even possible to authorize CORS from a file URL.
So you need to set an ajax prefilter in your model/collection in order to use CORS. Otherwise it doesn't send the cookie.
$.ajaxPrefilter(function(options, originalOptions, jqXHR) {
    options.xhrFields = {
        withCredentials: true
    };
});
I put this in my Model/Collection initialize function.
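For example, a sketch assuming a Backbone-style model (the model name and URL are placeholders of my own):
var MyModel = Backbone.Model.extend({
    urlRoot: 'https://api.example.com/items',
    initialize: function() {
        // make every jQuery request issued for this model send cookies cross-origin
        $.ajaxPrefilter(function(options, originalOptions, jqXHR) {
            options.xhrFields = {
                withCredentials: true
            };
        });
    }
});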
These are the conditions to be met to make CORS work with secured services:
Service response should contain header Access-Control-Allow-Credentials: true (see Requests with credentials and Cannot use wildcard in Access-Control-Allow-Origin when credentials flag is true).
Service response header Access-Control-Allow-Origin should not be *. The idea is to return the value passed by client in header Origin (see examples in this post).
According to specification, OPTIONS method should return HTTP code 200, thus it cannot be secured (see The CORS).
For methods PUT/POST that need to pass certain request headers to service (like Content-Type or Accept), these headers need to be listed in Access-Control-Allow-Headers (see jQuery AJAX fails to work when headers are specified)
JavaScript should set this XMLHttpRequest property: xhr.withCredentials = true; (as answered by Kirby)
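A minimal client-side sketch that ties these conditions together (the endpoint is a placeholder, not from the original setup):
var xhr = new XMLHttpRequest();
// the server must answer with Access-Control-Allow-Credentials: true and must
// echo this page's origin in Access-Control-Allow-Origin (a literal * will not do)
xhr.open('POST', 'https://backend.example.org/api/resource', true);
xhr.withCredentials = true; // send cookies / HTTP auth with the request
xhr.setRequestHeader('Content-Type', 'application/json'); // must be listed in Access-Control-Allow-Headers
xhr.onreadystatechange = function() {
    if (xhr.readyState == 4 && xhr.status == 200) {
        console.log(xhr.responseText);
    }
};
xhr.send(JSON.stringify({ foo: 'bar' }));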
Altogether, the configuration for Apache:
# Static content:
SetEnvIf Request_URI ".*" no-jk
# RESTful service:
SetEnvIf Request_URI "^/backend/" !no-jk
SetEnvIf Request_Method "OPTIONS" no-jk
# Fallback value:
SetEnv http_origin "*"
SetEnvIf Origin "^https?://(localhost|.*\.myconpany\.org)(:[0-9]+)?$" http_origin=$0
Header set Access-Control-Allow-Credentials "true"
Header set Access-Control-Allow-Origin "%{http_origin}e"
Header set Access-Control-Allow-Methods "GET,POST,PUT,DELETE"
Header set Access-Control-Allow-Headers "Content-Type, Accept"
JkMount /* loadbalancer
CORS with file://
If you have problems allowing origins from the file:// protocol: according to The Web Origin Concept it should be done the same way as for any other origin. I could not find information about browser support, but I think every browser which supports CORS supports this as well.
The Web Origin Concept tells us the following about the file URI scheme:
4. If uri-scheme is "file", the implementation MAY return an implementation-defined value.
NOTE: Historically, user agents have granted content from the file scheme a tremendous amount of privilege. However, granting all local files such wide privileges can lead to privilege escalation attacks. Some user agents have had success granting local files directory-based privileges, but this approach has not been widely adopted. Other user agents use globally unique identifiers for each file URI, which is the most secure option.
According to Wikipedia the domain for the file URI scheme is localhost. It can be omitted in the address bar, but I don't think it can be omitted in the allow-origin headers. So if your browser implementation allows origins with a file URI scheme, then you should add file://localhost to your allowed origins, and everything should work properly after that.
This was how it should work, now meet reality:
I tested with the current Firefox 29.0.1, and it did not work. However, the file:// protocol is transformed into a null origin by this implementation, so in Firefox null works. I tried with a wider domain list, but I did not manage to allow multiple domains. It seems like Firefox does not currently support a list of multiple domains.
I tested with Chrome 35.0.1916; it works the same way as Firefox did.
I tested with MSIE 11.0.9600. For requests from the file protocol it always shows an "allow blocked content" button, even without allowing the null origin. For other domains it works the same way as the previous browsers.
HTTP basic auth:
The credentials part I tried out with PHP and HTTP basic auth.
http://test.loc
Displays :-) when logged in and :-( when unauthorized.
<?php
function authorized()
{
    if (empty($_SERVER['PHP_AUTH_USER']) || empty($_SERVER['PHP_AUTH_PW']))
        return false;
    return ($_SERVER['PHP_AUTH_USER'] == 'username' && $_SERVER['PHP_AUTH_PW'] == 'password');
}
function unauthorized()
{
    header('HTTP/1.1 401 Unauthorized');
    header('WWW-Authenticate: Basic realm="Restricted Area"');
    echo ':-(';
}
if (!isset($_GET['logout']) && authorized()) {
    echo ':-)';
} else {
    unauthorized();
}
So this code changes its output depending on login and logout.
Cross domain CORS with HTTP basic auth
http://todo.loc
Gets the content of http://test.loc with a cross-domain XHR and displays it.
cross domain ajax<br />
<script>
    var xhr = new XMLHttpRequest();
    xhr.open('GET', "http://test.loc", true);
    xhr.withCredentials = true;
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            document.body.innerHTML += xhr.responseText;
        }
    };
    xhr.send();
</script>
Requires these headers from http://test.loc:
Access-Control-Allow-Origin: http://todo.loc
Access-Control-Allow-Credentials: true
Cross scheme CORS with HTTP basic auth
file:///path/x.html
Gets the content of http://test.loc with a cross-scheme XHR and displays it.
cross scheme ajax<br />
<script>
    var xhr = new XMLHttpRequest();
    xhr.open('GET', "http://test.loc", true);
    xhr.withCredentials = true;
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            document.body.innerHTML += xhr.responseText;
        }
    };
    xhr.send();
</script>
Requires these headers from http://test.loc:
Access-Control-Allow-Origin: null
Access-Control-Allow-Credentials: true
Conclusion:
I tested cross-scheme CORS with credentials called from file:// and it works pretty well in Firefox, Chrome and MSIE.