On Firefox, CORS request gives error ":" (colon) - javascript

On Chrome I have no trouble making a cross-domain request, but on Firefox (Ubuntu 14.04) I get an error consisting only of a colon, reported on the line that opens the XMLHttpRequest.
xmlhttp.open("GET", "http://x.x.x.x:xxxx/folder/file.xml", false);
The error message is just ":".

Try Disabling AdBlock
I was having a similar issue where all of my XMLHttpRequests were going through except for a few very specific ones, where even minor URL changes fixed the problem. The only thing I was getting was a colon (:) in the console. In the end I realized that AdBlock Plus was blocking at least one of these requests from going through (the request had 'Banners' in the URL).
So I'm not sure if this would have solved your problem, but I've encountered it multiple times.

Using Firebug, the issue turned out to be:
Blocked loading mixed active content
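If that's the cause, the usual fix is to load the resource over https so it matches the (https) page; a minimal sketch, reusing the placeholder URL from the question:
// The page is served over https, so active content like an XHR must be https too.
xmlhttp.open("GET", "https://x.x.x.x:xxxx/folder/file.xml", false);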

I had the same error on a CORS POST request.
I'm using https://cors-anywhere.herokuapp.com/ to bypass the Same-origin policy.
The problem was NoScript blocking the external domain, so in case you're using an external API in your request, this might solve it.
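For reference, cors-anywhere is used by prefixing the proxy in front of the real endpoint so the proxy adds the CORS headers for you; a minimal sketch, assuming jQuery (the target URL and payload are just examples):
var proxy = 'https://cors-anywhere.herokuapp.com/';
var target = 'https://api.example.com/data'; // example endpoint

// The proxy forwards the request and adds Access-Control-Allow-Origin to the response.
$.post(proxy + target, { some: 'payload' })
    .done(function (data) { console.log(data); })
    .fail(function (jqXHR) { console.error('request failed with status', jqXHR.status); });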

Same as this answer, except that for me the problem wasn't caused by AdBlock or uBlock.
It's present with uMatrix.
You can find that it blocks the request at this URL:
chrome://umatrix/content/logger-ui.html
You can enable it manually by clicking on that field and choosing to enable XHR in the uMatrix popup.
The blocking persists even if the code is packaged in an add-on that's correctly installed (the blue bar contains the internal UUID of a Firefox add-on), so if you've ever wondered why some add-ons don't work while uMatrix (or a similar blocker) is enabled, this might be one of the reasons.

Related

Problem with $.json return status 0 in Cordova when run from browser [duplicate]

For some reason, while using AJAX (with my Dashcode-developed application) the browser just stops uploading and returns status codes of 0. Why does this happen?
Another case:
It is possible to get a status code of 0 if you have sent an AJAX call and a browser refresh was triggered before the AJAX response arrived. The AJAX call will be cancelled and you will get this status.
In my experience, you'll see a status of 0 when (a short sketch for telling these cases apart follows the list):
doing cross-site scripting (where access is denied)
requesting a URL that is unreachable (typo, DNS issues, etc)
the request is otherwise intercepted (check your ad blocker)
as above, if the request is interrupted (browser navigates away from the page)
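A hedged sketch for inspecting these cases in a jQuery error handler; the endpoint is hypothetical:
$.ajax({ url: '/api/data', dataType: 'json' })
    .done(function (data) {
        console.log('success', data);
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        if (jqXHR.status === 0) {
            // No HTTP response at all: blocked (CORS, ad blocker), aborted by
            // navigation/refresh, or the host was unreachable.
            console.warn('status 0: request blocked, aborted, or unreachable', textStatus);
        } else {
            console.error('HTTP error', jqXHR.status, errorThrown);
        }
    });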
Same problem here when using <button onclick="">submit</button>; solved by using <input type="button" onclick=""> instead (a <button> inside a form defaults to type="submit", so it can trigger a real form submission that cancels the pending AJAX request).
Status code 0 means the requested URL is not reachable. Changing http://something/something to https://something/something worked for me. IE throws a "permission denied" error when the status code is 0; other browsers don't.
It is important to note that AJAX calls can fail even within a session that is defined by a cookie on a domain with the www. prefix: if you then call your PHP script without the www. prefix in the URL, the call will fail, and vice versa.
Because this shows up when you google "ajax status 0", I wanted to leave a tip that cost me hours of wasted time. I was using AJAX to call a PHP service (Phil's REST_Controller for CodeIgniter, not sure if that has anything to do with it) and kept getting status 0 and readyState 0, which was driving me nuts. While debugging I noticed that if I echoed and returned the message instead of exiting, I'd get a success. Finally I turned debugging off and tried again, and it worked. It seems the xDebug PHP debugger was somehow modifying the response. If you're using a PHP debugger, try turning it off to see if that helps.
I found another case where jQuery gives you status code 0: if for some reason XMLHttpRequest is not defined, you'll get this error.
Obviously this won't normally happen on the web, but a bug in a nightly firefox build caused this to crop up in an add-on I was writing. :)
This article helped me. I was submitting a form via AJAX and had forgotten to use return false (after my AJAX request), which led to a classic form submission, but strangely it was not completed.
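For context, a minimal sketch of that fix, assuming jQuery; the form id and endpoint are hypothetical. The key is preventing the native submit so the page doesn't navigate away and cancel the pending request:
$('#myForm').on('submit', function (e) {
    e.preventDefault(); // same effect as ending the handler with "return false"

    $.post('/submit.php', $(this).serialize())
        .done(function (resp) { console.log('saved', resp); })
        .fail(function (jqXHR) { console.error('failed with status', jqXHR.status); });
});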
"Accidental" form submission was exactly the problem I was having. I just removed the FORM tags altogether and that seems to fix the problem. Thank you, everybody!
I had the same problem, and it was related to the browser's XSS (cross-site scripting) blocking. I managed to make it work by using a server.
Take a look at: http://www.daniweb.com/web-development/javascript-dhtml-ajax/threads/282972/why-am-i-getting-xmlhttprequest.status0
We had a similar problem (status code 0 on a jQuery AJAX call) and it took us a whole day to diagnose it. Since no one has mentioned this cause yet, I thought I'd share.
In our case the problem was an HTTP server crash. A bug in PHP was crashing Apache, so from the client end it looked like this:
mirek@toccata:~$ telnet our.server.com 80
Trying 180.153.xxx.xxx...
Connected to our.server.com.
Escape character is '^]'.
GET /test.php HTTP/1.0
Host: our.server.com
Connection closed by foreign host.
mirek@toccata:~$
where test.php contained the crashing code.
No data was returned from the server (not even headers), so the AJAX call was aborted with status 0.
In my case, it was caused by running my Django server under http://127.0.0.1:8000/ but sending the AJAX call to http://localhost:8000/. Even though you would expect them to resolve to the same place, the browser treats them as different origins, so make sure you're not sending your requests to the wrong hostname.
In our case, the page link was changed from https to http. Even though the users were logged in, they were prevented from loading content with AJAX.
In my case, setting url: '' in the AJAX settings would result in a status code of 0 in IE8. It seems IE just doesn't tolerate such a setting.
For me, the problem was caused by the hosting company (Godaddy) treating POST operations which had substantial response data (anything more than tens of kilobytes) as some sort of security threat. If more than 6 of these occurred in one minute, the host refused to execute the PHP code that responded to the POST request during the next minute. I'm not entirely sure what the host did instead, but I did see, with tcpdump, a TCP reset packet coming as the response to a POST request from the browser. This caused the http status code returned in a jqXHR object to be 0.
Changing the operations from POST to GET fixed the problem. It's not clear why GoDaddy imposes this limit, but changing the code was easier than changing the host.
I think I know what may cause this error.
In Google Chrome there is a built-in feature to prevent DDoS attacks by Google Chrome extensions.
When AJAX requests continuously return 500+ status errors, Chrome starts to throttle the requests.
Hence it is possible to receive status 0 on subsequent requests.
In an attempt to win the prize for the dumbest reason for the problem described:
Forgetting to call
xmlhttp.send(); //yes, you need this pivotal line!
Yes, I was still getting status returns of zero from the 'open' call.
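For completeness, a minimal plain-XHR sketch showing where send() belongs (the URL is an example):
var xmlhttp = new XMLHttpRequest();
xmlhttp.open("GET", "http://example.com/folder/file.xml", true);
xmlhttp.onreadystatechange = function () {
    if (xmlhttp.readyState === 4) {
        // status stays 0 if the request never actually went out or never reached a server
        console.log('status:', xmlhttp.status);
    }
};
xmlhttp.send(); // without this pivotal line, nothing is ever sent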
In my case, I was getting this, but only on Safari Mobile. The problem is that I was using the full URL (http://example.com/whatever.php) instead of the relative one (whatever.php). This doesn't make any sense, though; it can't be an XSS issue because my site is hosted at http://example.com. I guess Safari looks at the http part and automatically flags it as an insecure request without inspecting the rest of the URL.
In my troubleshooting, I found that an AJAX xmlhttpRequest.status == 0 can mean the call never reached the server and failed on the client side. If the response had come from the server, the status would be one of the 1xx/2xx/3xx/4xx/5xx HTTP response codes. So the troubleshooting should focus on CLIENT-side issues: the network connection could be down, or it could be one of the causes described by @Langdon above.
In my case, I was making a Firefox Add-on and forgot to add the permission for the url/domain I was trying to ajax, hope this saves someone a lot of time.
Watch the browser console while making the request. If you see "The Same Origin Policy disallows reading the remote resource at http://... (Reason: CORS header 'Access-Control-Allow-Origin' missing)", then you need to add an Access-Control-Allow-Origin header to the response. For example, in Java you can set it with response.setHeader("Access-Control-Allow-Origin", "*"), where response is an HttpServletResponse.
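As an alternative illustration only (not part of the original answer), the same header on a Node/Express backend would look roughly like this:
// Hypothetical Express app; mirrors response.setHeader(...) from the Java example above.
const express = require('express');
const app = express();

app.use(function (req, res, next) {
    res.setHeader('Access-Control-Allow-Origin', '*'); // wide open; fine for testing only
    next();
});

app.get('/data', function (req, res) {
    res.json({ ok: true });
});

app.listen(3000);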

Disable cross domain web security in Firefox v.68 [duplicate]

In Firefox, how do I do the equivalent of --disable-web-security in Chrome? This has been asked a lot, but there has never been a true answer. Most replies are links to add-ons (some of which don't work in the latest Firefox or don't work at all) or "you just need to enable support on the server".
This is temporary to test. I know the security implications.
I can't turn on CORS on the server and I especially would never be able to allow localhost or similar.
A flag, or setting, or something similar would be a lot better than a plugin. I also tried http://www-jo.se/f.pfleger/forcecors, but something must be wrong, since my requests come back completely empty while the same requests in Chrome come back fine.
Again, this is only for testing before pushing to prod, which would then be on an allowed domain.
Almost everywhere you look, people refer to about:config and security.fileuri.strict_origin_policy, and sometimes also network.http.referer.XOriginPolicy.
For me, none of these seem to have any effect.
This comment implies there is no built-in way in Firefox to do this (as of 2/8/14).
From this answer I learned about the CORS Everywhere Firefox extension, and it works for me. It acts like a MITM proxy, intercepting headers to disable CORS.
You can find the extension at addons.mozilla.org or here.
Check out my add-on, which works with the latest Firefox version, has a nice UI, and supports JS regex: https://addons.mozilla.org/en-US/firefox/addon/cross-domain-cors
Update: I just added a Chrome extension for this: https://chrome.google.com/webstore/detail/cross-domain-cors/mjhpgnbimicffchbodmgfnemoghjakai
The Chrome setting you refer to is to disable the same origin policy.
This was covered in this thread also:
Disable firefox same origin policy
about:config -> security.fileuri.strict_origin_policy -> false
I have not been able to find a Firefox option equivalent of --disable-web-security or an addon that does that for me. I really needed it for some testing scenarios where modifying the web server was not possible.
What did help was to use Fiddler to auto-modify web responses so that they have the correct headers and CORS is no longer an issue.
The steps are:
Open fiddler.
If on HTTPS, go to the menu Tools -> Options -> HTTPS and tick the capture and decrypt HTTPS options.
Go to the menu Rules -> Customize Rules. Modify the OnBeforeResponse function so that it looks like the following, then save:
static function OnBeforeResponse(oSession: Session) {
    //....
    oSession.oResponse.headers.Remove("Access-Control-Allow-Origin");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
    //...
}
This will make every web response have the Access-Control-Allow-Origin: * header.
This still won't work on its own, as the OPTIONS preflight will pass through and cause the request to be blocked before our rule above gets the chance to modify the headers.
So to fix this, in the fiddler main window, on the right hand side there's an AutoResponder tab.
Add a new rule and response:
METHOD:OPTIONS https://yoursite.com/ with auto response: *CORSPreflightAllow
and tick the boxes: "Enable Rules" and "Unmatched requests passthrough".
Best Firefox Addon to disable CORS as of September 2016: https://github.com/fredericlb/Force-CORS/releases
You can even configure it per referrer (website).
As of June 2022, Mozilla Firefox allows you to natively change the CORS configuration. No extra add-ons are required. As per the Mozilla docs, you can change the CORS setting by changing the value of the key content.cors.disable.
To do so, go to your browser and type about:config in the address bar.
Click on "Accept the Risk and Continue"; since you are on this Stack Overflow page, we assume you are aware of the risks you are undertaking.
You will see a page with your user variables. On this page, just search for the key content.cors.disable.
You do not have to type in true or false values; just hit the toggle button at the far right of the screen and it will change the value.
While the question mentions Chrome and Firefox, there is other software without cross-domain security. I mention it for people who don't know that such software exists.
For example, PhantomJS is an engine for browser automation, and it supports disabling cross-domain security:
phantomjs.exe --web-security=no script.js
See this other comment of mine: Userscript to bypass same-origin policy for accessing nested iframes
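For illustration, script.js could be as small as the following (a hedged sketch; the URLs are examples):
// Runs inside PhantomJS; with --web-security=no the cross-domain XHR below is allowed.
var page = require('webpage').create();

page.open('http://example.com/', function () {
    var result = page.evaluate(function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'http://api.other-domain.com/data.json', false); // synchronous for brevity
        xhr.send();
        return xhr.status + ' ' + xhr.responseText;
    });
    console.log(result);
    phantom.exit();
});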
For anyone finding this question while using Nightwatch.js (1.3.4), there's an acceptInsecureCerts: true setting in the config file:
firefox: {
  desiredCapabilities: {
    browserName: 'firefox',
    alwaysMatch: {
      // Enable this if you encounter unexpected SSL certificate errors in Firefox
      acceptInsecureCerts: true,
      'moz:firefoxOptions': {
        args: [
          // '-headless',
          // '-verbose'
        ],
      }
    }
  }
},

Externally load Json with jquery.getJSON

I don't know if this is a duplicate post or not, sorry if it is. I'm using jquery.getJSON to load a JSON file on my server, which works just fine. However, if I try to load a JSON file from a different server, it doesn't work. I know I don't have any code here (because there's not much point), but I just want to know if I'm using it wrong or if it isn't supposed to load external files. I'm using the iOS Safari browser, if that affects anything.
EDIT: I've looked at the console (I don't know what the error really means; it's just red with an X by the URL it's trying to get the JSON from) and it looks like it's not actually receiving the data. Plus, do remember I'm on iOS, not desktop, so I couldn't look at the console in the "Develop" tab :P
EDIT 2: Great! I think I got it working! http://skitty.xyz/getJSON/
You're most likely encountering a path issue; the purpose of $.getJSON is to acquire data via http GET request so yes, it is intended to work remotely. To diagnose your issue, make certain you can access the json file in your browser first: http://domain.com/my_data.json. If that works, use that as the URL you pass into $.getJSON:
$.getJSON( 'http://domain.com/my_data.json', function(data) {
    // do something with your data
});
http://api.jquery.com/jquery.getjson/
jquery.getJSON uses AJAX, which is all about external resources. Here are a couple of things to check for if it's not working on an external resource:
1: Is the path you specified correct? The usage is jquery.getJSON(path, callback). The path should be something you can just drop in your browser and see. If an incorrect path is your problem, you'll see a 404 in the console.
2: Is the resource http and your site https? Non-secure resources on secure pages will be blocked by browser security features. You'd see an error to this effect in the console.
3: Is CORS (Cross-origin resource sharing) enabled for your site on the external resource? Servers will sometimes use a whitelist of IPs and domains to determine what origins are allowed to make requests of it. You'd also see an error to this effect in the console.
There are probably some other things to look for, but this is where I'd start.
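One practical way to surface those errors is to attach a fail handler (a sketch, assuming jQuery 1.5+; the URL is an example):
$.getJSON('http://other-domain.example.com/my_data.json')
    .done(function (data) {
        console.log('got data', data);
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // 404 -> wrong path; status 0 -> blocked (mixed content, CORS, ad blocker, etc.)
        console.error('getJSON failed:', jqXHR.status, textStatus, errorThrown);
    });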
Also, by all means, use the debugging features of Safari to LQQK at the actual HTTP data-streams that are passing back-and-forth in response to what you're doing. (You might need to click on a preference to see the "Develop" menu, which will take you to "Show Web Inspector" and its Network tab.)
This approach will instantly answer many questions that a JavaScript-centered approach will not so-readily tell you. (And of course, you can look at the JavaScript console too ... and at the same time.) "The actual data streams, please." Safari will tell you "exactly what bytes" your app actually sent to the server, and "exactly what bytes" the server sent in return. "Priceless!™"
Are you saying you are using a jQuery AJAX request to load some JSON data from a server?
Check that the "not working" server has the same endpoint as your server.
Check that the URL you want to get data from is correct.
Check whether the console logged any errors.
Also, quoting from http://api.jquery.com/jquery.getjson/:
"Additional Notes:
Due to browser security restrictions, most "Ajax" requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, port, or protocol.
Script and JSONP requests are not subject to the same origin policy restrictions."
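That last note is why JSONP is the usual workaround; a minimal sketch, assuming the remote service actually supports a callback parameter (the URL is an example):
// The "callback=?" placeholder makes jQuery issue a JSONP <script> request
// instead of an XHR, which is not subject to the same-origin policy.
$.getJSON('http://api.example.com/items?callback=?', function (data) {
    console.log('JSONP response', data);
});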

How can I see if the same-origin-policy was applied?

I am testing a website in Firefox 8.
I am using a jQuery.post call on a different domain.
In Firebug the result is just empty.
Can I see if this is due to the same origin policy? The error console is empty.
No, I don't think you can.
But you can try to rule out other problems: check that your Ajax code works by making a normal (non-cross-domain) Ajax request to a static file, and also make an Ajax request to a static file on the target domain to rule out the server being down.
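A quick way to run those two checks (a sketch; the file names and domain are examples):
// 1) Same-origin request: verifies the Ajax code itself works.
$.get('/static/test.txt')
    .done(function () { console.log('same-origin request OK'); })
    .fail(function (xhr) { console.log('same-origin request failed', xhr.status); });

// 2) Cross-domain request to a static file: rules out the target server being down.
$.get('http://target-domain.example.com/static/test.txt')
    .done(function () { console.log('cross-domain request OK'); })
    .fail(function (xhr) { console.log('cross-domain request failed (likely same-origin policy)', xhr.status); });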

JSONP callback doesn't execute when running at localhost

This is bizarre, I was wondering if anyone could shed some light on why this happened.
Basically, I've been pulling my hair out trying to test JSONP out so I can implement a JSON web service that other sites can use. I'm doing development on localhost--specifically, Visual Studio 2008 and Visual Studio 2008's built-in web server.
So as a JSONP test run w/ jQuery, I implemented the following:
$().ready(function() {
    debugger;
    try {
        $.getJSON("<%= new Uri(Request.Url, "/").ToString() %>XssTest?callback=?", function(data) {
            alert(data.abc);
        });
    } catch (err) {
        alert(err);
    }
});
And on the server ..
<%= Request["callback"] %>({abc : 'def'})
So what ends up happening is I set a breakpoint on the server, and I hit both the first "debugger;" statement in the client-side script and the breakpoint on the server. The JSONP URL is indeed being invoked after the page loads. That's working great.
The problem I was having was that the callback would never execute. I tested this in both IE8 as well as Firefox 3.5. Neither one would invoke the callback. The catch(err) was never reached, either. Nothing happened at all!
I'd been stuck on this for a week, and even tested with a manually keyed HTTP request in Telnet on the specified port to be sure that the server is returning the format...
callbackfn({abc : 'def'})
.. and it is.
Then it dawned on me: what if I change the hostname from localhost to localhost with a globalizer ('.'), i.e. http://localhost.:41559/ instead of http://localhost:41559/ (yes, adding a dot to any hostname is legal; it is to DNS what global:: is to C# namespaces)? And then it worked! Internet Explorer and Firefox 3.5 finally showed me an alert message when I just added a dot.
So this makes me wonder: what is going on here? Why would late script-tag generation work with an Internet hostname and not with plain localhost? Or is that the right question?
Clearly this is implemented for security reasons, but what are they trying to secure?? And, by getting it to work with a dot, did I just expose a security hole in this security feature?
By the way, my hosts file, while altered for other hosts, has nothing special going on with localhost; the default 127.0.0.1 / ::1 are still in place with no overrides below.
FOLLOW-UP: I got past this for local development purposes by adding:
127.0.0.1 local.mysite.com
.. to my hosts file, then adding the following code to my global.asax:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    if (Request.Headers["Host"].Split(':')[0] == "localhost")
    {
        Response.Redirect(
            Request.Url.Scheme
            + "://"
            + "local.mysite.com"
            + ":" + Request.Url.Port.ToString()
            + Request.Url.PathAndQuery
            , true);
    }
}
I'm going to throw an answer out there; after some thought I've reached my own conclusions.
It could be that this is a security feature that's implemented to try to thwart an Internet web site from invoking JSONP services running on the client machine.
A web site could just go through a list of ports and keep invoking localhost on different ports and paths. 'localhost' is one of the few DNS hostnames whose meaning is dynamic, depending on when and where it's queried, which makes the potential targets vulnerable. And yes, the fact that appending a dot (.) to 'localhost' ('localhost.') produces a working workaround does expose a security vulnerability, but it also offers a [tentative] workaround for development purposes.
A better approach is to map the loopback IP to a new hostname entry in the hosts file so that it works locally, isn't prone to be "fixed" by a browser update, and doesn't work anywhere else but on the development workstation.
I'm experiencing a similar problem. Most of the solutions I've tried work with IE (7), but I'm having difficulty getting Firefox (3.5.2) to play ball.
I've installed HttpFox in order to see how my server's responses are being interpreted on the client, and I'm getting NS_ERROR_DOM_BAD_URI. My situation is a little different to yours though, as I'm trying to invoke a JSONP call back to the same site the hosting page came from, and then this call is responding with a 302 redirect to another site. (I'm using the redirect as a convenient way to get cookies from both domains returned to the browser.)
I'm using jQuery, and I originally tried doing a standard AJAX call via $.ajax(). I figured that as the initial request was to the same site as the hosting page, Firefox would just follow the 302 response to another domain. But no, it appeared to fall foul of XSS defenses. (Note that contrary to what Returning redirect as response to XHR request implies, jQuery does follow the 302 redirect for a standard dataType="json" call: a redirect to the same domain works fine; a redirect to another domain generates NS_ERROR_DOM_BAD_URI in the browser.) As an aside, I don't see why same-domain 302 redirects to other domains can't just be followed - after all, it's the hosting page's domain that is issuing the redirect, so why can't it be trusted? If you're worried about scripting injection attacks, then the JSONP route is open for abuse anyway...
jQuery's $.getJSON() with a ?callback=? suffix also fails in Firefox with the same error. As does using $.getScript() to roll my own JSONP <script> tag.
What does appear to work is having a pre-existing <script id="jsonp" type="text/javascript"></script> in the HTML and then using $("#jsonp").attr("src", url + "?callback=myCallback") to invoke the JSONP call. If I do that, then the cross-domain 302 redirect is followed and I get my JSON response passed to myCallback (which I've defined at the same time as the <script/> tag).
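In other words, something along these lines (a sketch of the approach just described; the URL variable is a placeholder):
// Assumes the page already contains: <script id="jsonp" type="text/javascript"></script>
var url = 'http://other-domain.example.com/service'; // example endpoint

// The callback must exist globally before the response script runs.
window.myCallback = function (data) {
    console.log('JSONP data', data);
};

// Setting src on the existing tag triggers the cross-domain request.
$("#jsonp").attr("src", url + "?callback=myCallback");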
And, yes, I'm developing all this using Cassini with localhost:port URLs. Cassini won't respond to non-localhost URLs, so I can't easily try local.mysite.com to see if that has any effect on the solutions I've tried above. However, sticking a dot at the end of localhost appears to have fixed all my problems!
Now I can go back to a standard $.ajax({ ... dataType:"jsonp" ... }) call with localhost.:port instead of localhost:port and all is well. I find it interesting that modifying the src attribute of a script tag that pre-exists in the page's HTML does allow ordinary localhost URLs to be invoked; I guess, following your thought process, this could be another security vulnerability.
