Safari is Rejecting Cookies When Calling An API On Another Server - javascript

I have a React application for my business that has multiple clients. My architecture is set-up like this:
Server A - Main Application Server, only serves React app (url = https://example.net)
Server B - Business Client #1, serves their slightly customized React app (url = https://example2.net)
Server C - Business Client #2, serves their slightly customized React app (url = https://example3.net)
Microservice Server - Running a Java microservice (url = https://api.example.net - notice it's a subdomain of Server A)
Server A, B, and C are all talking to the Microservice Server for API calls. The API calls are using the native browser fetch() call with these options
var options = {
  method: "{method}",
  mode: "cors",
  credentials: "include",
  headers: { "Content-Type": "application/json" }
}
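For reference, the options above are used roughly like this; the login payload shape here is a placeholder, not something from the question:

```javascript
// The same options object, with a concrete method and a placeholder login payload.
var options = {
  method: "POST",
  mode: "cors",
  credentials: "include", // ask the browser to attach and store cross-site cookies
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ username: "user", password: "pass" }) // placeholder payload
};

// In the browser:
// fetch("https://api.example.net/login", options)
//   .then(function (res) { return res.json(); });
```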
The server response includes this header:
Access-Control-Allow-Credentials: true
and uses the microservice's built-in
enableCorsForAllOrigins()
Now, this all works fine in Firefox, Chrome, IE, and Edge. However, it doesn't work in Safari (macOS or iPhone) for Server B or Server C. It works fine in Safari on Server A.
The API calls are in this sequence...
POST https://api.example.net/login
GET https://api.example.net/users
On Safari, the first call is successful, but the second call fails with a 401 error. The 401 error is thrown from the microservice trying to get the Session attribute for the "currentUserId", which is null. In other words, Safari is not sending the cookie to the server. I've figured out that Safari doesn't like that the API calls are going to a different URL base than the client's server. (In other words, Server B making API calls to Server A).
To further verify this, if a user starts on Server B, then navigates to Server A and makes an API call (a bad login for example), and then goes back to Server B, then it works for them on Safari. It seems like Safari just needs to exchange a cookie on the Server A URL to get it working on every other server.
Now, my issue...how do I fix this? Users on Mac Safari and iPhone Safari on Server B and Server C are not able to use the application because their session cookies are not sent to the server. Has anyone seen this before? Does anyone have solutions? My first attempt was to create an iframe on Server B that loaded Server A's page, but that didn't work.
Many thanks in advance!

So after reading the links provided by @str above, and reading the links from those links, I came to the conclusion that getting this working with cookies and sessions wasn't going to happen in Safari. So I scrapped the entire setup and went with JWT instead.
So, my solution now is:
POST https://api.example.net/login -> Response includes server-generated JWT token with userId in it.
JavaScript code now saves this JWT into sessionStorage (not a cookie; otherwise you'll run into the same issue as with sessions).
window.sessionStorage.setItem("jwtToken", data.jwtToken);
Set up the API calls to always pass the JWT token as a header.
"Authorization": "Bearer " + window.sessionStorage.getItem("jwtToken")
Then, using a server-side library, I can extract the JWT token and get the userId from it.
It works on Safari and is the preferred approach for web authorization nowadays, so I'd say go with this solution. The downside is that storing the JWT in sessionStorage leaves it readable by any script on the page, so an XSS vulnerability would expose the token.
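The pieces above can be combined into a small helper. This is a sketch under the assumptions stated in the answer: the login response body carries a `jwtToken` field, and the API accepts a standard Bearer token (the names come from this answer, not from any library):

```javascript
// Build the headers for an authenticated API call.
// `storage` is anything with a getItem() method -- window.sessionStorage in the browser.
function authHeaders(storage) {
  var headers = { "Content-Type": "application/json" };
  var token = storage.getItem("jwtToken");
  if (token) {
    headers["Authorization"] = "Bearer " + token;
  }
  return headers;
}

// Sketch of the login + authenticated-call sequence (browser only):
// fetch("https://api.example.net/login", { method: "POST" })
//   .then(function (res) { return res.json(); })
//   .then(function (data) {
//     window.sessionStorage.setItem("jwtToken", data.jwtToken);
//     return fetch("https://api.example.net/users", {
//       headers: authHeaders(window.sessionStorage)
//     });
//   });
```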


Twitch API's request data always empty on specific route and only on browser

Summary
I am building a website (React app) integrated with the Twitch Helix API.
I use the implicit grant flow to authenticate my website.
When I request the https://api.twitch.tv/helix/videos route with a game_id query param, it always returns an empty array.
(Screenshot: the Edge browser's network tab and the console output I printed.)
All the other routes work fine, for example:
https://api.twitch.tv/helix/games/top
or the same route with an id query param:
https://api.twitch.tv/helix/videos
But when I use an API tester such as Thunder Client for VS Code, https://api.twitch.tv/helix/videos works fine.
I also tested with a C# .NET 6 console app on my Windows 11 machine.
I found that the https://api.twitch.tv/helix/videos route only works server side; it does not work from the browser.
Only the https://api.twitch.tv/helix/videos route with a game_id query param returns an empty array, while everything else (the same API with a different route, or the same route with other query params) works fine, so this does not seem to be a CORS error.
Tried Solution
I tried the suggestion from "CORS - Wrong 'Access-Control-Allow-Origin' header" on the Twitch developer forums: adding the request header 'accept': 'application/vnd.twitchtv.v5+json', and regenerating the client ID in my Twitch developer console.
But it still didn't work.
My website is hosted on Vercel.
Does anyone know about this issue, or am I missing something?
Try setting the Accept-Language request header to an empty string:
headers.set('accept-language', '')
It worked for me.
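Putting the workaround together, a sketch of the Helix call with the blanked language header; the client ID and access token values are placeholders, not from the question:

```javascript
// Build the headers Helix expects for an implicit-grant app,
// with the Accept-Language header blanked out as the workaround suggests.
function helixHeaders(clientId, accessToken) {
  return {
    "Client-Id": clientId,
    "Authorization": "Bearer " + accessToken,
    "Accept-Language": "" // the workaround: an empty language header
  };
}

// In the browser (gameId, myClientId, myToken are placeholders):
// fetch("https://api.twitch.tv/helix/videos?game_id=" + gameId, {
//   headers: helixHeaders(myClientId, myToken)
// }).then(function (res) { return res.json(); });
```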

Microsoft Outlook Add-in can't extract/pass data to a localhost server via Ajax

I'm trying to build an Outlook add-in for encrypting/decrypting emails. The Add-in itself uses a web app, which in turn uses Javascript.
I basically want to pass the unencrypted/encrypted email to a local server running python using GET/POST requests, perform some encryption/decryption on it in python, and then pass it back to Outlook Add-in.
The problem is that the add-in just won't pass any data to the local server, nor get anything in return. I have tried both Django and Flask, with the same results. I also configured the local servers to use locally generated certificates and keys, with no improvement. I even tried routing the HTTP connection through ngrok, but nothing came of it.
(Screenshots: the server's response to a GET request over HTTP, and over HTTPS using Werkzeug.)
The code given below never shows "Here5" in item-status, so clearly the success callback isn't being run.
$.ajax({
  url: "https://127.0.0.1:8000/getPost/hello/",
  type: 'GET',
  dataType: 'json', // added data type
  success: function (res) {
    $('#item-status').text("Here5");
    $('#item-message').text(res);
  }
});
I've also added the required URLs in the App Domain part of the manifest file:
<AppDomain>https://127.0.0.1:8000/getPost/hello/</AppDomain>
The Add-in performs fine when running GET requests from other sites on the internet, like:
https://api.github.com/users
However, I just can't seem to run it on my local server. How do I solve this?
Edit: After some debugging, I've learned that the XMLHttpRequest status is 0 and responseText is empty. I think it might have something to do with CORS. Am I correct in that assumption? If so, how do I disable cross-origin restrictions for localhost in this particular case?
Thank You
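A status of 0 with an empty responseText means the browser aborted the request before any HTTP response arrived, which is typically a blocked CORS preflight or an untrusted local HTTPS certificate; both are plausible here. A small helper (hypothetical, just for diagnosis) can make the jQuery error callback more informative:

```javascript
// Classify a failed XMLHttpRequest. Status 0 means the browser never
// received an HTTP response: CORS rejection, TLS failure, or a network error.
function describeAjaxFailure(status, responseText) {
  if (status === 0 && !responseText) {
    return "blocked before HTTP: likely CORS or an untrusted certificate";
  }
  if (status >= 400) {
    return "server returned HTTP " + status;
  }
  return "unexpected failure (status " + status + ")";
}

// Used with the jQuery call from the question:
// $.ajax({ url: "https://127.0.0.1:8000/getPost/hello/", type: "GET", dataType: "json" })
//   .fail(function (xhr) {
//     $("#item-status").text(describeAjaxFailure(xhr.status, xhr.responseText));
//   });
```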

How do you send a cookie to a cross origin request using react and maybe electron?

I need to make a request to this other website and I need to send a cookie alongside it. I have no control over this other website so I cannot modify its cors policy or allow credentials in any meaningful way.
So I decided to convert my website to a React app that runs inside an Electron browser. I thought that since Electron has full control over the computer, I would be able to send a normal request without having to deal with CORS, but it seems that CORS and all sorts of other protections are still enabled.
To clarify my current setup, I have:
A react website localhost:3000
A electron app showing the react website.
There is no other communication happening between the two.
All requests are made solely by the react website.
The react website needs to communicate with foreignwebsite.com and send a cookie to it.
Normally, in any server-side type of application, including Electron's main process, this would be no problem, but React code running in the browser cannot make such requests directly.
One option would be a middle server that talks to the foreign API and hands my front end the data it needs. But I don't want to host a server, and if I create one on the client machine, I think the app would be too heavy.
I could use the Electron main process as a server and make Node.js requests from it, but communication between React and Electron seems weirdly complicated; it feels like it would turn into a mess of events.
I found an answer online that tries to deactivate CORS in the Electron browser. It partially worked, or at least it seems to: there is no error, but the cookie simply isn't transferred. Maybe it just tricks the browser without actually working behind the scenes.
const filter = {
  urls: ['http://foreignwebsite.com/*']
}

mainWindow.webContents.session.webRequest.onBeforeSendHeaders(
  filter,
  (details, callback) => {
    details.requestHeaders.Origin = `http://foreignwebsite.com/`
    details.requestHeaders.Referer = `http://foreignwebsite.com/`
    callback({ requestHeaders: details.requestHeaders })
  }
)

mainWindow.webContents.session.webRequest.onHeadersReceived(
  filter,
  (details, callback) => {
    details.responseHeaders['access-control-allow-origin'] = ['http://localhost:3000']
    details.responseHeaders['access-control-allow-credentials'] = ['true'] // header values must be arrays
    callback({ responseHeaders: details.responseHeaders })
  }
)
So the code above sets Access-Control-Allow-Origin and Access-Control-Allow-Credentials on the responses, and rewrites the Origin and Referer of the requests.
And when I go make requests on the react website I do it like this:
document.cookie = 'OmegaCookie=' + cookieVal + '; path=/; SameSite=None; Secure';
let response = await fetch('http://foreignwebsite.com', {credentials: 'include'});
But the cookie is not sent. Do you know what could be going wrong or how to handle it better in this situation?
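One thing to check: `document.cookie` on localhost:3000 sets the cookie for localhost, not for foreignwebsite.com, so the browser has nothing to attach to the cross-origin request. A sketch of injecting the cookie from the main process instead, via the same `onBeforeSendHeaders` hook shown above (the cookie name and value are the placeholders from the question):

```javascript
// Build a Cookie request-header value from name/value pairs.
function buildCookieHeader(cookies) {
  return Object.keys(cookies)
    .map(function (name) { return name + "=" + cookies[name]; })
    .join("; ");
}

// In the Electron main process, alongside the Origin/Referer rewrite above:
// mainWindow.webContents.session.webRequest.onBeforeSendHeaders(
//   filter,
//   (details, callback) => {
//     details.requestHeaders.Cookie = buildCookieHeader({ OmegaCookie: cookieVal });
//     callback({ requestHeaders: details.requestHeaders });
//   }
// );
```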

Cross domain Ajax JSON POST support against RESTful WCF service using transportCredentialOnly security

I've posted before on this subject, but after a year of getting on with other things, I've managed to get into a pickle once again. I'll try and give a brief overview of the scenario and the current attempts to make things work:
IIS web server hosting HTML, JS etc. on host: iis.mycompany.com (referred to as foo)
WCF RESTful web services hosted via a Windows Service on host: wcf.mycompany.com (referred to as bar)
The Javascript served from foo works by making RESTful ajax calls (GET or POST depending on the action) to the WCF services on bar, obviously these are cross domain calls as they aren't on the same host.
The Javascript uses the jQuery (1.7.2) framework to manipulate the DOM and perform ajax calls to bar, the expected content type for POSTS is JSON, and the response from GETS is expected to be JSON too (application/json).
Bar has its WCF services configured with TransportCredentialOnly as the security mode, and the transport client credential type is NTLM, so that only authenticated users can contact the services.
CORS Support has been added to bar's WCF services using an extension to WCF:
http://blogs.msdn.com/b/carlosfigueira/archive/2012/05/15/implementing-cors-support-in-wcf.aspx
We have added additional headers, and modified some that the post already contained, based on numerous internet articles:
property.Headers.Add("Access-Control-Allow-Headers", "Accept, Content-Type");
property.Headers.Add("Access-Control-Allow-Methods", "POST, GET, OPTIONS");
property.Headers.Add("Access-Control-Max-Age", "172800");
property.Headers.Add("Access-Control-Allow-Origin", "http://iis.mycompany.com");
property.Headers.Add("Access-Control-Allow-Credentials", "true");
property.Headers.Add("Content-type", "application/json");
Sites giving information on enabling CORS suggest that the Access-Control-Allow-Origin response header should be set to "*" however, this is not possible in our case as we make jQuery ajax calls using the following setup:
$.ajaxSetup({
cache: "false",
crossDomain: true,
xhrFields: {
withCredentials: true
}
});
As it turns out you cannot use "*" for the accepted origin when you are using "withCredentials" in the ajax call:
https://developer.mozilla.org/en/http_access_control
"Important note: when responding to a credentialed request, server
must specify a domain, and cannot use wild carding."
Currently in our development lab, this doesn't matter as we can hard code the requests to the IIS (foo) server URL.
The main problem now is POST requests (GET works with the above configuration). When the browser attempts the POST, it first sends an OPTIONS preflight to the server requesting the allowed options for the subsequent POST. This is where we would like to see the headers we've configured in the CORS-support WCF extension being passed back, but we aren't getting that far: the response comes back as "401 Unauthorized". I believe this is due to the transport security binding requesting NTLM, but I'm not sure.
Also, I'm not very experienced with this, but I haven't seen much information about POSTing with the application/json content type, as opposed to text/plain, when performing cross-domain requests.
I know that people will probably suggest JSONP as the one true solution, I'm not against different approaches, indeed I encourage anyone to suggest best practices as it would help others reading this question later. However, please attempt to answer the question before suggestion alternatives to it.
Many thanks in advance for anyone who contributes.
peteski
:)
UPDATE:
It appears that Chrome (20.x.x) doesn't suffer from the problem of failing to negotiate NTLM for the OPTIONS response, but Firefox (13.0.1) does.
We've also noticed that someone has already posted a bug up on the Firefox forum, which we've added information to:
http://bugzilla.mozilla.org/show_bug.cgi?id=751552
Please vote for this bug to be fixed on the bugzilla site!
Using the following code, we can watch the network trace to see Firefox failing and Chrome working fine:
var url = "http://myWebServiceServer/InstantMessagingService/chat/message/send";
var data = '{ "remoteUserUri" : "sip:foo.bar@mydomain.com", "message" : "This is my message" }';
var request = new XMLHttpRequest();
request.open("POST", url, true);
request.withCredentials = true;
request.setRequestHeader("Content-Type", "application/json");
request.send(data);
console.log(request);
On a separate note, IE8 doesn't support XMLHttpRequest for cross-domain calls, favouring its own magical XDomainRequest object, so we've got some work to do changing the client-side code to handle IE8 versus the rest of the world. (Thanks, IE8.)
/me crosses fingers that Mozilla fix the Firefox bug.
UPDATE 2:
After some digging, it appears that IE8's XDomainRequest cannot be used to make cross-domain requests where NTLM must be negotiated. This basically means the security on our WCF binding can't be used, thanks to limitations in a web browser.
http://blogs.msdn.com/b/ieinternals/archive/2010/05/13/xdomainrequest-restrictions-limitations-and-workarounds.aspx
"No authentication or cookies will be sent with the request"
So I guess we've taken this as far as it will go for now. It looks like we're going to have to create our own custom token authentication and pass it to the WCF service in a cookie or, in IE8's case, POST it with the JSON. The WCF service will then have to decrypt that data and use it instead of the ServiceSecurityContext.Current.WindowsIdentity we previously had access to with NTLM auth.
I know you said you would rather have the problem itself addressed, but you may consider using a "reverse proxy."
I don't know what technologies you are using, but we use Apache web server and have a Java RESTful API running on a different server that required authentication. For a while, we messed with JSONP and CORS, but were not satisfied.
In the end, we set up an Apache reverse proxy and it worked miracles. The web browser believes it is communicating with its own domain and acts appropriately. The RESTful API doesn't know it is being used via a proxy. Everything just works, and Apache does all the magic.
Hopefully, all web servers have a feature like Apache's reverse proxy.
Here is some documentation on the feature: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
All we had to do was ensure the mod_proxy module was installed, then add the following lines to our Apache config file:
ProxyPass /restapi http://restfulserver.com/restapi
ProxyPassReverse /restapi http://restfulserver.com/restapi
Then restart the web server and voila!
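Once the proxy is in place, the browser-side call becomes same-origin: the page requests /restapi on its own host and Apache forwards it, so CORS never comes into play. A sketch (the /restapi path is the one from the config above; the helper name is hypothetical):

```javascript
// With the reverse proxy mapping /restapi -> http://restfulserver.com/restapi,
// keep the URL relative so it always hits the page's own origin.
function proxiedUrl(path) {
  return "/restapi/" + path.replace(/^\/+/, "");
}

// From the page's JavaScript, e.g. with jQuery:
// $.ajax({ url: proxiedUrl("chat/message/send"), type: "POST",
//          contentType: "application/json", data: payload });
```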

Keeping an open connection in Internet Explorer (7,8,9)

I am in a project where I need to connect to a client's server and retrieve data for some graphs.
The client wants this to be something in real time (they have a script that is listening for connections and feeds data constantly).
My big problem is that I don't think this can be done in IE as easy as it would be done in Firefox or Chrome. Firefox and Chrome both support web sockets so I think I could make something that would do exactly what the client wants.
In IE, the only way I can see to do this is a setTimeout with an ajax call.
Is there any other alternative for IE?
For now we're using jQuery's .ajax() and setting a callback for the onreadystatechange event that handles the server response when readyState == 3. But this doesn't work as expected, because the connection closes after the first response.
Also, I am using IIS with the Application Request Routing module as a proxy, together with a URL rewriting rule, because we're calling cross-domain services. But this module buffers at least 1 KB, which corrupts the server response. The services were created by another company that won't implement the CORS specification on their server.
How can I work around this and receive the response without buffering?
Thank you
Instead of WebSockets you can use long polling and a server side promise.
On the server you can suspend the http request so it works async.
Example jQuery:
poll();

function poll() {
  $.ajax({
    url: "http://my-own-domain.com/data/",
    success: function (data) {
      // do what you want with data
      poll();
    },
    error: function (data) {
      poll();
    }
  });
}
Server side example:
WSRequest req = WS.url("http://the-other-domain.org/data/");
Promise<HttpResponse> respAsync = req.getAsync();
HttpResponse resp = await(respAsync);
renderJSON(resp.getString());
For the server side, take a look at Play! Framework Async to get an idea of how it can be done.
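One refinement worth adding to the jQuery loop above: back off after consecutive errors, so a dead server isn't hammered once per failed request. A sketch (the delay values are arbitrary choices, not from the answer):

```javascript
// Exponential backoff for the error path of a long-poll loop:
// 1s, 2s, 4s, ... capped at 30s; a successful response resets the counter.
function nextDelayMs(consecutiveErrors) {
  var base = 1000;   // first retry after 1 second
  var cap = 30000;   // never wait longer than 30 seconds
  return Math.min(cap, base * Math.pow(2, consecutiveErrors));
}

// var errors = 0;
// function poll() {
//   $.ajax({
//     url: "http://my-own-domain.com/data/",
//     success: function (data) { errors = 0; poll(); },
//     error: function () { setTimeout(poll, nextDelayMs(errors++)); }
//   });
// }
```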
