In Firefox, how do I do the equivalent of Chrome's --disable-web-security flag? This has been asked a lot, but never with a true answer. Most replies are links to add-ons (some of which don't work in the latest Firefox or don't work at all) or "you just need to enable support on the server".
This is temporary to test. I know the security implications.
I can't turn on CORS on the server and I especially would never be able to allow localhost or similar.
A flag, or setting, or something would be a lot better than a plugin. I also tried: http://www-jo.se/f.pfleger/forcecors, but something must be wrong, since my requests come back completely empty while the same requests in Chrome come back fine.
Again, this is only for testing before pushing to prod, which would then be on an allowed domain.
Almost everywhere you look, people refer to about:config and the security.fileuri.strict_origin_policy pref. Sometimes also network.http.referer.XOriginPolicy.
For me, none of these seem to have any effect.
This comment implies there is no built-in way in Firefox to do this (as of 2/8/14).
From this answer I learned about the CORS Everywhere Firefox extension, and it works for me. It acts like a MITM proxy, intercepting and rewriting headers to disable CORS.
You can find the extension at addons.mozilla.org or here.
Check out my addon that works with the latest Firefox version, with a clean UI and support for JS regex: https://addons.mozilla.org/en-US/firefox/addon/cross-domain-cors
Update: I just added a Chrome extension for this: https://chrome.google.com/webstore/detail/cross-domain-cors/mjhpgnbimicffchbodmgfnemoghjakai
The Chrome setting you refer to is to disable the same origin policy.
This was covered in this thread also:
Disable firefox same origin policy
about:config -> security.fileuri.strict_origin_policy -> false
I have not been able to find a Firefox option equivalent of --disable-web-security or an addon that does that for me. I really needed it for some testing scenarios where modifying the web server was not possible.
What did help was to use Fiddler to auto-modify web responses so that they have the correct headers and CORS is no longer an issue.
The steps are:
Open fiddler.
If you are on HTTPS, go to Tools -> Options -> HTTPS and tick the Capture HTTPS CONNECTs and Decrypt HTTPS traffic options.
Go to Rules -> Customize Rules. Modify the OnBeforeResponse function so that it looks like the following, then save:
static function OnBeforeResponse(oSession: Session) {
    // ... existing rule logic ...

    // Overwrite whatever the server sent so every response allows any origin.
    oSession.oResponse.headers.Remove("Access-Control-Allow-Origin");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");

    // ... existing rule logic ...
}
This makes every web response include the Access-Control-Allow-Origin: * header.
On its own this still won't work, because the OPTIONS preflight passes through to the server and the request gets blocked before the rule above has a chance to modify the headers.
So to fix this, in the fiddler main window, on the right hand side there's an AutoResponder tab.
Add a new rule and response:
METHOD:OPTIONS https://yoursite.com/ with auto response: *CORSPreflightAllow
and tick the boxes: "Enable Rules" and "Unmatched requests passthrough".
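Alternatively, if you would rather keep everything in CustomRules.js, a hedged FiddlerScript sketch that answers the preflight directly in OnBeforeRequest might look like this (the exact set of Access-Control-* headers is an assumption; adjust it to whatever your requests actually send):

static function OnBeforeRequest(oSession: Session) {
    // ... existing rule logic ...

    // Answer CORS preflights locally so they never reach (or get blocked by) the server.
    if (oSession.HTTPMethodIs("OPTIONS")) {
        oSession.utilCreateResponseAndBypassServer();
        oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
        oSession.oResponse.headers.Add("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS");
        oSession.oResponse.headers.Add("Access-Control-Allow-Headers", "*");
    }
}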
Best Firefox Addon to disable CORS as of September 2016: https://github.com/fredericlb/Force-CORS/releases
You can even configure it per referrer (website).
As of June 2022, Mozilla Firefox does allow you to natively change the CORS configuration. No extra addons are required. As per the Mozilla docs, you can change the CORS setting by changing the value of the key content.cors.disable.
To do so, first go to your browser and type about:config in the address bar.
Click "Accept the Risk and Continue"; since you are on this Stack Overflow page, we assume you are aware of the risks you are undertaking.
You will see a page with your user preferences. On this page, just search for the key content.cors.disable.
You do not have to type in true or false values; just hit the toggle button at the far right of the row and the value will change.
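If you prefer not to click through about:config, the same pref can also be set from a user.js file in your Firefox profile directory. This is just a sketch of the generic user.js mechanism applied to the key named above:

// user.js in the Firefox profile directory (hedged sketch)
user_pref("content.cors.disable", true); // equivalent of toggling the pref in about:config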
While the question mentions Chrome and Firefox, there is other software without cross-domain security. I mention it for people who are unaware that such software exists.
For example, PhantomJS is a browser-automation engine, and it supports disabling cross-domain security:
phantomjs.exe --web-security=no script.js
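As a hedged sketch (the URLs below are placeholders), a script.js like the following would be able to make a cross-origin XHR when PhantomJS is started with --web-security=no:

// Run with: phantomjs --web-security=no script.js
var page = require('webpage').create();
page.open('http://example.com/', function () {
    var status = page.evaluate(function () {
        // Cross-origin request; synchronous only to keep the sketch short.
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'http://other-origin.example/api/data', false);
        xhr.send();
        return xhr.status;
    });
    console.log('cross-origin request returned HTTP ' + status);
    phantom.exit();
});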
See this other comment of mine: Userscript to bypass same-origin policy for accessing nested iframes
For anyone finding this question while using Nightwatch.js (1.3.4), there's an acceptInsecureCerts: true setting in the config file:
firefox: {
    desiredCapabilities: {
        browserName: 'firefox',
        alwaysMatch: {
            // Enable this if you encounter unexpected SSL certificate errors in Firefox
            acceptInsecureCerts: true,
            'moz:firefoxOptions': {
                args: [
                    // '-headless',
                    // '-verbose'
                ],
            }
        }
    }
},
Related
I am working on a project which is written in JavaScript. I can see that an XMLHttpRequest object is created for making requests.
It works fine for "http" requests but fails for "https". Since I am debugging it in the development environment, I just want to know how to ignore a self-signed certificate in XmlHttpRequest objects.
While searching I have found this,
httpreq = new ActiveXObject("Msxml2.ServerXMLHTTP.3.0");
httpreq.setOption(2, 13056);
But that answer does not work for modern browsers like Microsoft Edge, Chrome...
I have also found this, and it clearly says that setOption() can be used for ignoring SSL certificates.
One difference I can see in my code is that I am creating the httpreq using:
httpreq = new XMLHttpRequest(); // This is for Chrome and Firefox
So is there any way I can ignore self-signed certificates in XmlHttpRequest?
The Bad News:
There is no way to accomplish this with XmlHttpRequest directly. This is logical, as SSL needs to stay secure on the web; any ability to disable it from script would present a major security risk.
You are also right that setOption is not a standard/modern method under XmlHttpRequest.
The Good(ish) News:
What you can do is accomplish this via a trusted/properly configured proxy, with the example below as a Node/Express server configured to allow the insecure SSL connection:
$router.get("/", (oRequest, oResponse) => {
$nonStrictSSLRequest = require('request').defaults({strictSSL: false});
$nonStrictSSLRequest(
{ url: "https://192.168.1.2:8080/api/apiVer" },
function (err, oAPIResponse, sBody) {
oResponse.status(200).json(JSON.parse(sBody));
}
);
});
I had to do this to support RainMachine's (stupid HTTPS only) self signed certificates.
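For completeness, a hedged sketch of how the $router above might be wired into an app; the '/proxy' mount path and port 3000 are assumptions, not part of the original answer:

const app = express();      // express was required in the router snippet above
app.use('/proxy', $router); // the browser now calls http://localhost:3000/proxy instead of the HTTPS device
app.listen(3000, function () {
    console.log('insecure-SSL proxy listening on http://localhost:3000/proxy');
});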
If you are developing on a Chromium-based browser, you can use the flag --ignore-certificate-errors. Always also pass --user-data-dir to avoid affecting your main, secure profile. Use this only as a last resort and only during local development, as it completely ignores all SSL errors. --ignore-certificate-errors is an undocumented flag and can be removed from Chromium at any time.
Chromium Linux:
chromium-browser --ignore-certificate-errors --user-data-dir=~/chromium_dev_session
Chrome Windows:
"C:\Program Files\Google\Chrome\Application\chrome.exe" --ignore-certificate-errors --user-data-dir="c:/chrome_dev_session"
On Chrome, I'm having no troubles making a cross domain request, however on Firefox (Ubuntu 14.04), I get an error that consists only of a colon on the line that calls for the xmlhttprequest.
xmlhttp.open("GET", "http://x.x.x.x:xxxx/folder/file.xml", false);
The error message is just ":".
Try Disabling AdBlock
I was having a similar issue where all of my XMLHttpRequests were going through except for a few very specific ones where even minor URL changes fixed the problem. And the only thing I was getting was a colon : in the console. In the end I realized that AdBlockPlus was blocking at least one these requests from going through (the request had 'Banners' in the URL).
So I'm not sure if this would have solved your problem, but I've encountered it multiple times.
Using Firebug, the issue turned out to be:
Blocked loading mixed active content
I had the same error on a CORS POST request.
I'm using https://cors-anywhere.herokuapp.com/ to bypass the Same-origin policy.
The problem was NoScript blocking the external domain, so in case you're using an external API in your request, this might solve it.
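For reference, the usage pattern with cors-anywhere is simply to prefix the target URL with the proxy. A minimal hedged sketch (https://httpbin.org/post is just a placeholder endpoint):

var xhr = new XMLHttpRequest();
// The proxy forwards the request and adds the CORS headers to the response.
xhr.open("POST", "https://cors-anywhere.herokuapp.com/https://httpbin.org/post");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.onload = function () { console.log(xhr.status, xhr.responseText); };
xhr.send(JSON.stringify({ hello: "world" }));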
Same as this answer; however, for me the problem is present neither with AdBlock nor with uBlock.
It's present with uMatrix.
You can find that it blocks the request at this URL:
chrome://umatrix/content/logger-ui.html
You can enable it manually by clicking on that field and choosing to enable XHR in the uMatrix popup.
The blocking persists even if you have the code packaged in an add-on that's correctly installed (the blue bar contains the internal UUID of a Firefox add-on); therefore, if you've ever wondered why some add-ons don't work while uMatrix (or the like) is enabled, this might be one of the reasons.
We are implementing an AngularJS based application which uses a rest web service hosted in a different domain. The following script is used for CORS and it works perfectly on Chrome and FireFox. It has an issue in IE9 and Safari when authenticating. It seems the issue is with the withCredentials attribute in those browsers. Is there any other way that IE and Safari supports CORS?
<script type="text/javascript">
    XMLHttpRequest.prototype.realSend = XMLHttpRequest.prototype.send;
    XMLHttpRequest.prototype.send = function(vData) {
        this.withCredentials = true;
        this.realSend.apply(this, arguments);
    };
</script>
Based on the different scenarios we tried, we came up with the following summary. Hope it is useful.
According to the browser specifications, IE10+ and Safari 4+ support XHR and the withCredentials attribute, which can be used for CORS without any issue, as in the above script.
For other IE versions (9 and below) we need to use XDR.
The URL rewrite mod was not successful on the server side when it comes to HTTPS.
The solution was very simple. By default, IE and Safari disable third-party cookies. Since we use two different domains, once a user enters the credentials to log in, those browsers were unable to save that cookie. Because of that, all subsequent requests were unauthorized.
Steps for allowing 3rd party cookies
IE 10 - Internet Options > Privacy > Advanced > Third Party Cookies > Accept
Safari - Preferences > Privacy > Block Cookies > Never
While working through this problem we found that AngularJS 1.1.1+ makes this easy: you set the withCredentials value in the config module ($httpProvider.defaults.withCredentials = true;).
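In full, that config block looks roughly like this (the module name 'myApp' is a placeholder):

angular.module('myApp', [])
    .config(['$httpProvider', function ($httpProvider) {
        // Send cookies/credentials with every cross-origin $http request.
        $httpProvider.defaults.withCredentials = true;
    }]);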
I want to make an XMLHttpRequest to a secure uri (https://site.com/ajaxservice/) from javascript running inside a nonsecure page (http://site.com/page.htm). I've tried all kinds of nutty stuff like iframes and dynamic script elements, so far no go. I know I am violating 'same origin policy' but there must be some way to make this work.
I will take any kind of wacky solution short of having the SSL protocol written in javascript.
That won't work by default due to the same origin policy, as you mentioned. Modern browsers are implementing CORS (Cross-Origin Resource Sharing) which you could use to get around this problem. However this will only work in Internet Explorer 8+, Firefox 3.5+, Safari 4+, and Chrome, and requires some server-side work. You may want to check out the following article for further reading on this topic:
Cross-domain Ajax with Cross-Origin Resource Sharing by Nicholas C. Zakas
You can also use JSONP as Dan Beam suggested in another answer. It requires some extra JavaScript work, and you may need to "pad" your web service response, but it's another option which works in all current browsers.
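For reference, the server-side work is mostly about sending the right response header. A minimal hedged sketch, assuming (which the question does not state) that the service could be fronted by a Node/Express handler:

const express = require('express');
const app = express();

app.get('/ajaxservice/', (req, res) => {
    // Allow the non-secure page's origin to read this response.
    res.set('Access-Control-Allow-Origin', 'http://site.com');
    res.json({ ok: true });
});

app.listen(8443); // placeholder port; TLS termination is assumed to happen elsewhere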
You can't circumvent the cross-domain origin policy with XHR (well, only in Firefox 3.5 with the user's permission, which is not a good solution). Technically, moving from port 80 (http) to 443 (https) breaks that policy (it must be the same domain and port). This is the example the specification itself cites here - http://www.w3.org/Security/wiki/Same_Origin_Policy#General_Principles.
Have you looked into JSONP (http://en.wikipedia.org/wiki/JSON#JSONP) or CSSHttpRequests (http://nb.io/hacks/csshttprequest)?
JSONP is a way to add a <script> tag to a page with a pre-defined global callback across domains (since you can point the <script>'s src anywhere on the web). Example:
<script>
    function globalCallback (a) { /* do stuff with a */ }
</script>
And then you insert a <script> tag pointing at your other domain, like so:
    var jsonp = document.createElement('script');
    jsonp.setAttribute('src', 'http://path.to/my/script');
    document.body.appendChild(jsonp);
And in the source of the external script, you must call the globalCallback function with the data you want to pass to it, like this:
globalCallback({"big":{"phat":"object"}});
And you'll get the data you want after that script executes!
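If it helps, here is a hedged sketch of what the script at http://path.to/my/script could return, served by a tiny Node server (the framework choice and port are assumptions):

const http = require('http');

http.createServer((req, res) => {
    // The body is JavaScript that calls the page's global callback with the data.
    res.setHeader('Content-Type', 'application/javascript');
    res.end('globalCallback({"big":{"phat":"object"}});');
}).listen(8080);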
CSSHttpRequests is a bit more of a hack, so I've never had the need to use it, though feel free to give it a try if you don't like JSONP, :).
You said you would take anything short of having the SSL protocol written in JavaScript... but I assume you meant if you had to write it yourself.
The opensource Forge project provides a JavaScript TLS implementation, along with some Flash to handle cross-domain requests:
http://github.com/digitalbazaar/forge/blob/master/README
Check out the blog posts at the end of the README to get a more in-depth explanation of how it works.
This is bizarre, I was wondering if anyone could shed some light on why this happened.
Basically, I've been pulling my hair out trying to test JSONP out so I can implement a JSON web service that other sites can use. I'm doing development on localhost--specifically, Visual Studio 2008 and Visual Studio 2008's built-in web server.
So as a JSONP test run w/ jQuery, I implemented the following:
$().ready(function() {
    debugger;
    try {
        $.getJSON("<%= new Uri(Request.Url, "/").ToString() %>XssTest?callback=?", function(data) {
            alert(data.abc);
        });
    } catch (err) {
        alert(err);
    }
});
And on the server ..
<%= Request["callback"] %>({abc : 'def'})
So what ends up happening is I set a breakpoint on the server, and I hit the breakpoint both on the "debugger;" statement in the client-side script and on the server. The JSONP URL is indeed being invoked after the page loads. That's working great.
The problem I was having was that the callback would never execute. I tested this in both IE8 as well as Firefox 3.5. Neither one would invoke the callback. The catch(err) was never reached, either. Nothing happened at all!
I'd been stuck on this for a week, and even tested with a manually keyed HTTP request in Telnet on the specified port to be sure that the server is returning the format...
callbackfn({abc : 'def'})
.. and it is.
Then it dawned on me: what if I change the hostname from localhost to localhost with a globalizer ('.'), i.e. http://localhost.:41559/ instead of http://localhost:41559/ (yes, adding a dot to any hostname is legal; it is to DNS what global:: is to C# namespaces)? And then it worked! Internet Explorer and Firefox 3.5 finally showed me an alert message when I just added a dot.
So this makes me wonder, what is going on here? Why would late script tag generation work with an Internet hostname and not with plain localhost? Or is that the right question?
Clearly this is implemented for security reasons, but what are they trying to secure?? And, by getting it to work with a dot, did I just expose a security hole in this security feature?
By the way, my hosts file, while altered for other hosts, has nothing special going on with localhost; the default 127.0.0.1 / ::1 are still in place with no overrides below.
FOLLOW-UP: I got past this for local development purposes by adding:
127.0.0.1 local.mysite.com
.. to my hosts file, then adding the following code to my global.asax:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    if (Request.Headers["Host"].Split(':')[0] == "localhost")
    {
        Response.Redirect(
            Request.Url.Scheme
            + "://"
            + "local.mysite.com"
            + ":" + Request.Url.Port.ToString()
            + Request.Url.PathAndQuery
            , true);
    }
}
I'm going to throw an answer out there; after some thought I've reached my own conclusions.
It could be that this is a security feature that's implemented to try to thwart an Internet web site from invoking JSONP services running on the client machine.
A web site could just go through a list of ports and keep invoking localhost on different ports and paths. 'Localhost' is one of the few DNS hostnames whose meaning changes depending on when and where it's queried, making the potential targets vulnerable. And yes, the fact that appending a dot (.) to 'localhost' ('localhost.') produces a working workaround does expose a security vulnerability, but it offers a [tentative] workaround for development purposes.
A better approach is to map the loopback IP to a new hostname entry in the hosts file so that it works locally, isn't prone to be "fixed" by a browser update, and doesn't work anywhere else but on the development workstation.
I'm experiencing a similar problem. Most of the solutions I've tried work with IE (7), but I'm having difficulty getting Firefox (3.5.2) to play ball.
I've installed HttpFox in order to see how my server's responses are being interpreted on the client, and I'm getting NS_ERROR_DOM_BAD_URI. My situation is a little different to yours though, as I'm trying to invoke a JSONP call back to the same site the hosting page came from, and then this call is responding with a 302 redirect to another site. (I'm using the redirect as a convenient way to get cookies from both domains returned to the browser.)
I'm using jQuery, and I originally tried doing a standard AJAX call via $.ajax(). I figured that as the initial request was to the same site as the hosting page, Firefox would just follow the 302 response to another domain. But no, it appeared to fall foul of XSS defenses. (Note that contrary to what Returning redirect as response to XHR request implies, jQuery does follow the 302 redirect for a standard dataType="json" call: a redirect to the same domain works fine; a redirect to another domain generates NS_ERROR_DOM_BAD_URI in the browser.) As an aside, I don't see why same-domain 302 redirects to other domains can't just be followed - after all, it's the hosting page's domain that is issuing the redirect, so why can't it be trusted? If you're worried about scripting injection attacks, then the JSONP route is open for abuse anyway...
jQuery's $.getJSON() with a ?callback=? suffix also fails in Firefox with the same error. As does using $.getScript() to roll my own JSONP <script> tag.
What does appear to work is having a pre-existing <script id="jsonp" type="text/javascript"></script> in the HTML and then using $("#jsonp").attr("src", url + "?callback=myCallback") to invoke the JSONP call. If I do that, then the cross-domain 302 redirect is followed and I get my JSON response passed to myCallback (which I've defined at the same time as the <script/> tag).
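In code, that working variant looks roughly like this (url and myCallback are the names used above; everything else is a sketch):

// Already present in the page's HTML:
// <script id="jsonp" type="text/javascript"></script>

window.myCallback = function (data) {
    console.log('JSONP response', data);
};

// Setting src on the pre-existing tag triggers the cross-domain JSONP call,
// and the 302 redirect is followed.
$("#jsonp").attr("src", url + "?callback=myCallback");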
And, yes, I'm developing all this using Cassini with localhost:port URLs. Cassini won't respond to non-localhost URLs, so I can't easily try local.mysite.com to see if that has any effect on the solutions I've tried above. However, sticking a dot at the end of localhost appears to have fixed all my problems!
Now I can go back to a standard $.ajax({ ... dataType:"jsonp" ... }) call with localhost.:port instead of localhost:port and all is well. I find it interesting that modifying the src attribute of a script tag that pre-exists in the page's HTML does allow ordinary localhost URLs to be invoked - I guess, following your thought process, this could be another security vulnerability.