Our intranet website has to communicate with a client .NET app. We're using an HttpListener (on http://localhost:[port]) in the client app and an iframe in the page that points at this URL. It's working like a charm when the page is HTTP.
Problem:
When the site is HTTPS, a 'Mixed content' JavaScript error is displayed in newer browsers and the request doesn't arrive at the client.
I believe this error would also occur when using an Ajax request instead of an iframe.
I also tried binding a self-signed certificate to the listener and listening on https://localhost:[port] (which works for IE), but since Firefox has its own certificate store, it's really tough to install the certificate there automatically (IE uses the Windows certificate store, which is easy to install into).
So, does anyone know of a way to make a request to http://localhost:[port] from an HTTPS site that works in both Firefox and IE?
Thanks!
Change the iframe to:
<script>
  var request = new XMLHttpRequest();
  request.open("GET", "http://localhost:[port]/?action=doStuff");
  request.send();
</script>
You will also need to make some minor modifications to your app.
It needs to implement the OPTIONS method and return CORS headers. This sounds a lot harder than it is: it just needs to return a reply with the Access-Control-Allow-Origin header set to *.
The response of the GET request must also have this header.
If you know all the domains that try to communicate with your app on localhost you can change the * to a whitelist or even just a single value.
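For illustration, here is a minimal sketch of the listener side (the port, prefix, and response body below are placeholders, not your actual app's code):

using System.Net;
using System.Text;

class LocalListener
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/");   // replace with your port
        listener.Start();

        while (true)
        {
            HttpListenerContext context = listener.GetContext();
            HttpListenerResponse response = context.Response;

            // Allow the HTTPS intranet page to call us; replace "*" with a whitelist
            // of origins if you know them.
            response.AddHeader("Access-Control-Allow-Origin", "*");

            if (context.Request.HttpMethod == "OPTIONS")
            {
                // Preflight request: advertise what is allowed and return an empty reply.
                response.AddHeader("Access-Control-Allow-Methods", "GET, OPTIONS");
                response.AddHeader("Access-Control-Allow-Headers", "Content-Type");
                response.Close();
                continue;
            }

            // Normal GET: do the work and answer, with the CORS header already set above.
            byte[] body = Encoding.UTF8.GetBytes("doStuff done");
            response.OutputStream.Write(body, 0, body.Length);
            response.Close();
        }
    }
}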
When loading an image from a secure server via JS with the following code:
var preloadImage = new Image();
preloadImage.src = 'http://some/resource.png';
The request gets automatically upgraded to HTTPS. Presumably it's a well-intentioned feature to stop mixed content. However, the server I'm pointing to can only do HTTP. I've been browsing the methods and properties available on Image to no avail. Ideally, yes, the image would be served over HTTPS, but it's just a temp server spun up on AWS, so if we can avoid that for now, it'd be much easier.
Does anyone have a workaround to stop JS automatically upgrading the request?
It is not triggered by the JS code; it's up to the server to check whether the client supports HTTPS and force it.
My suggestion would be to have an intermediate server/proxy: the page talks to the proxy over HTTPS, and the proxy forwards the request to the remote server over HTTP.
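A rough sketch of that idea, assuming an ASP.NET Core proxy (the /preview route, content type, and HttpClient usage here are illustrative, not your actual setup; any reverse proxy works the same way):

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddHttpClient();
var app = builder.Build();

// Hypothetical proxy route: the browser asks https://your-domain/preview/{host/path},
// and the server fetches http://{host/path} on its behalf, so the page itself stays
// free of mixed content.
app.MapGet("/preview/{*target}", async (string target, IHttpClientFactory httpClientFactory) =>
{
    var client = httpClientFactory.CreateClient();
    var bytes = await client.GetByteArrayAsync($"http://{target}");
    return Results.File(bytes, "image/png");   // assumed content type
});

app.Run();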
It turns out a recent update to most browsers has stopped mixed content altogether. Before, it used to throw a warning; now it's just not possible. We got around it with some nginx reverse proxy magic where we pass the IP in the URL, like https://normal-domain/preview/IPADDRESS, and it's working now.
I need to open a local HTML file in the browser. The JavaScript works fine, but Ajax stops working and XMLHttpRequest gives a cross-origin error. Is there a way to run Ajax from a local directory? It is necessary for me that it runs from a local file only.
Thanks
For anyone who wants to know, I had to run a server in the app to serve the js files. Seems like it's not possible to do it without running a server.
If anyone knows of another way, do tell.
The simplest way to allow this in Firefox is to navigate to about:config, look for the privacy.file_unique_origin setting, and toggle it off.
Essentially, Firefox used to treat local files from the same directory as being from the same source, thus CORS was happily satisfied. That behavior changed after CVE-2019-11730 was discovered.
It worked fine for me on 84.0.1 on Arch. Just be sure to revert the setting when you're not debugging locally.
If you are using VS Code, the Live Server extension might help you. It resolved a cross-origin issue I was having when editing a webpage.
https://marketplace.visualstudio.com/items?itemName=ritwickdey.LiveServer
If you are using Chrome, try this extension.
CORS allows web applications on one domain to make cross-domain AJAX requests to another domain. It's dead simple to enable, only requiring a single response header to be sent by the server.
What this extension does is add a response header rule: Access-Control-Allow-Origin: *
You can also do that manually by sending that response header yourself.
For simple CORS requests, the server only needs to add the following header to its response: Access-Control-Allow-Origin: *
Read this for more info.
If you are able to change the server code, you can try adding the string "null" to the allowed origins. Chrome sends the Origin header as "null" when the page is running from a local file.
Here's an example in ASP.NET Core for setting up the "null" origin:
services.AddCors(options =>
{
    options.AddPolicy("InsecurePolicy",
        builder => builder
            .WithOrigins("null")
            .AllowAnyMethod()
            .AllowAnyHeader()
            .AllowCredentials());
});
Note that this is not recommended, but might be good enough if you just need it during development.
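One thing the snippet above doesn't show: the policy also has to be applied in the request pipeline. A minimal sketch, assuming the usual Configure method:

// In Configure, before MVC/endpoint routing, apply the policy defined above.
app.UseCors("InsecurePolicy");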
I'm writing a Chrome packaged app for diagnosing web services. I want to be able to send a GET request to a URL and look at the headers and data in the response.
My problem is that if a user visits a site that has the HSTS header set before using my app, my app will then be unable to send GET requests to the http:// URLs for that domain, because Chrome will automatically convert the http:// URLs to https:// ones before the request is sent out.
Is there anything at all I can do to prevent this? I've looked into the webrequest API and webview tag but I'm finding nothing that lets me ignore HSTS.
Is it possible to use https://developer.chrome.com/apps/sockets_tcp for this (I would need to be able to support http, https and gzipped data)?
Is there anything at all I can do to prevent this?
Probably not. If you already tested <webview> and it shares the HSTS list with the browser, then the network layer will transparently rewrite this for you.
Is it possible to use chrome.sockets.tcp for this?
Technically, yes; HSTS shouldn't matter for that. Practically, you would need to implement something like wget+SSL+gzip from the ground up (in JS, NaCl or a Native Host - but in the latter case you don't really need built-in sockets).
I am building a web application using ASP.NET Web API and SignalR. I have a pretty good understanding of HTTP but this problem is beyond me.
Currently, I am setting a cookie on all AJAX requests to this API. Subsequent AJAX requests to the API send the cookie without issue.
However, I would like this cookie to also be used in the SignalR requests that establish a connection, but according to Chrome, the cookie is not being sent in these requests. I have the cookie set as HTTPOnly. Currently, everything is running locally; all requests go to localhost:port. The domain on the cookie is being set to localhost as well.
My SignalR connections look like this:
var connection = $.connection("/updates");
/* set handlers here */
connection.start(function () {
    console.log("connection started!");
});
It seems as if Chrome thinks this is a CORS request and is withholding the cookies. The requests are on the same domain however, so this does not make much sense.
Turns out I don't understand cookies well enough. Browsers seem to have trouble handling TLDs like localhost as the cookie domain. Therefore, I left the domain undefined, so that the cookie will default to the domain of the request.
However, the path parameter needed to be set to / in order to make sure the cookie is sent in all requests.
Once I made these changes, everything worked as expected, and my cookies were plainly visible in SignalR.
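For reference, a minimal sketch of issuing such a cookie from an ASP.NET Web API action (the controller and cookie names here are invented; the relevant parts are the unset Domain and the "/" Path):

using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public class SessionController : ApiController
{
    public HttpResponseMessage Post()
    {
        var response = Request.CreateResponse(HttpStatusCode.OK);

        var cookie = new CookieHeaderValue("auth-token", "opaque-token-value")
        {
            // Domain is intentionally left unset so it defaults to the request's host
            // (browsers are picky about "localhost" as an explicit cookie domain).
            Path = "/",        // sent on every path, including the SignalR endpoints
            HttpOnly = true
        };

        response.Headers.AddCookies(new[] { cookie });
        return response;
    }
}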
I have a page loading up in Mobile Safari which communicates with another server via CORS.
In desktop browsers (tested Chrome and Safari), I am able to log in, get a session cookie, and have that session cookie be sent back for subsequent requests so that I may be authenticated with all API calls.
However, when I login via Mobile Safari, the cookie does not get sent back on subsequent requests.
I'm using Charles Proxy to spy on what's going on, and it tells me:
POST https://myremoteserver.com/sessions.json passes up my login info
It succeeds and response is received with a valid Set-Cookie header.
GET https://myremoteserver.com/checkout.json is requested, without a Cookie request header.
Server responds as if I am not logged in.
I'm using this snippet with Zepto.js to ensure that withCredentials: true is properly set on the XHR object (pardon the CoffeeScript):
# Add withCredentials:true to the xhr object to send the remote server our cookies.
xhrFactory = $.ajaxSettings.xhr
$.ajaxSettings.xhr = ->
  xhr = xhrFactory.apply(this, arguments)
  xhr.withCredentials = yes
  xhr
And that snippet works great in desktop browsers, and before I added it I was not able to preserve the session cookies in those desktop browsers.
Is there some quirk in MobileSafari that prevents this from working like desktop browsers? Why does it not work in the same way?
Edit!
Here is my CORS header setup in my Rails 2.3 app; fairly standard stuff, I believe:
def add_cors_headers
  if valid_cors_domain
    headers['Access-Control-Allow-Origin'] = request.headers['HTTP_ORIGIN']
    headers['Access-Control-Expose-Headers'] = 'ETag'
    headers['Access-Control-Allow-Methods'] = 'GET, POST, PATCH, PUT, DELETE, OPTIONS, HEAD'
    headers['Access-Control-Allow-Headers'] = '*,x-requested-with,Content-Type,If-Modified-Since,If-None-Match'
    headers['Access-Control-Allow-Credentials'] = 'true'
    headers['Access-Control-Max-Age'] = '86400'
  end
end
Also, today desktop Safari on Mountain Lion stopped sending the cookie, behaving just like Mobile Safari. I'm not entirely sure whether my assessment yesterday was inaccurate, or perhaps Apple is just trolling me...
Also, could this be affected by using https:// at the remote URL?
I don't know if this solution will work or is acceptable to you, but I had the same problem with Mobile Safari and a JSONP app. It seemed that Safari was not set to accept third-party cookies. I went to Settings > Safari > Accept Cookies, set it to 'Always', and the problem evaporated. Good luck.
Can I set cookies in a response from a jsonp request?
I believe you are experiencing what I have been seeing in my app. My issue was caused because iOS Safari comes with the option "Prevent Cross-Site Tracking" enabled by default, which causes the browser to block ALL third-party cookies - even cookies issued by your back-end server from a different domain when CORS is configured correctly.
The only solution to this problem I found was to use a proxy in production, like I did in dev. I accomplished this in Azure with Azure Functions, making all requests go through a proxy. At that point iOS Safari did not block my cookies, and everything was set as expected.
I wrote about it in my blog https://medium.com/#omikolaj1/complete-guide-to-deploying-angular-and-asp-net-33a0976d0ec1
You didn't mention whether the remote server is under a different domain or just a different subdomain. I assume it is under a different domain.
As @schellsan pointed out, you can't set/write cookies for a different domain even if the CORS policy allows it, due to the third-party cookie restriction in Safari. It's the latest Safari restriction. I guess Firefox is about to do the same.
Workarounds I'm currently evaluating:
Use a redirect on the remote server so that when the client is redirected (i.e., the remote URL is in the browser's address bar) you can set the cookie (see the sketch after this list)
Use a custom header
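To make the first workaround concrete, here is a hypothetical sketch in ASP.NET Core (the endpoint, cookie name, and token are invented): the remote server sets the cookie during a top-level navigation to its own domain, then redirects the user back.

var app = WebApplication.Create(args);

// Hypothetical handoff endpoint on the remote server: the browser is sent here in a
// top-level navigation, the cookie is set in a first-party context, then the user is
// redirected back to the calling site.
app.MapGet("/session/handoff", (HttpContext ctx, string returnUrl) =>
{
    ctx.Response.Cookies.Append("session", "opaque-token", new CookieOptions
    {
        HttpOnly = true,
        Secure = true,
        SameSite = SameSiteMode.None   // the later cross-site XHRs still need this
    });

    // In real code, validate returnUrl against a whitelist to avoid an open redirect.
    return Results.Redirect(returnUrl);
});

app.Run();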
I was running into the same problem.
My setup was:
AngularJS (Ionic) App on Server A with domain a.com
NodeJS with Passport JS as Backend on Server B with domain b.com
The login with the cookie went well in every browser except Mobile Safari on iOS. Also, changing the mobile cookie (Do Not Track) settings in iOS did not have any impact on the issue.
The solution was to set a CNAME DNS record:
backend.a.com CNAME b.com
Then open an address that sets the cookie via an iframe, and the cookie will be stored.