I only have to support new browsers.
I have to rely on an external service to provide JSONP data; I do not own that service, and it does not allow CORS.
I feel very uneasy having to trust JSONP requests from the external server, since they can run arbitrary code on my end, which would allow them to track my users, and even steal their information.
I was wondering if there was any way to create a JSONP request that is also secure?
(Related: How to reliably secure public JSONP requests? but not with the new browser relaxation)
NOTE: I asked/answered it Q&A style, but I'm very open to other ideas.
Yes!
It is possible. One way to do it would be to use WebWorkers. Code running in WebWorkers has no access to the DOM or other JavaScript code your page is running.
You can create a WebWorker and execute the JSONP request with it, then terminate it when you're done.
The process is something like this:
Create a WebWorker from a blob with the URL to request
Use importScripts to load the JSONP request with a local callback
When that callback executes, post a message back to the page's script, which in turn executes the actual callback function with the data.
That way, an attacker would have no information about the DOM.
Here is a sample implementation:
// Creates a secure JSONP request using a web worker.
// url      - the URL to send the request to
// data     - the parameters to send via the query string
// callback - a function to execute when done
function jsonp(url, data, callback) {
    // support the two-parameter form: jsonp(url, callback)
    if (typeof callback === "undefined") {
        callback = data;
        data = {};
    }
    var getParams = ""; // serialize the GET parameters
    for (var i in data) {
        getParams += "&" + encodeURIComponent(i) + "=" + encodeURIComponent(data[i]);
    }
    // Create a worker from a blob; the worker posts a message back when the JSONP response arrives
    var blob = new Blob([
        "var cb=function(val){postMessage(val)};" +
        "importScripts('" + url + "?callback=cb" + getParams + "');"
    ], { type: "text/javascript" });
    var blobURL = window.URL.createObjectURL(blob);
    var worker = new Worker(blobURL); // the request fires as soon as the worker starts
    // When the worker posts the data back, execute the callback and stop the worker
    worker.onmessage = function (e) {
        callback(e.data);
        worker.terminate();
    };
    setTimeout(function () {
        worker.terminate(); // terminate after 10 seconds in any case
    }, 10000);
}
Here is sample usage that works in JSFiddle:
jsonp("http://jsfiddle.net/echo/jsonp", {
    "hello": "world"
}, function (response) {
    alert(response.hello);
});
This implementation does not deal with some other issues, but by preventing all access to the DOM and to the page's current JavaScript, it creates a safe WebWorker environment for the JSONP response to run in.
This should work on IE10+, Chrome, Firefox and Safari as well as mobile browsers.
Related
With this XHR code the request is sent only once; with setTimeout it could be sent repeatedly, but how can that be done in real time? On some websites we can see real-time Bitcoin prices, for example. I read about EventSource but can't understand how it's used.
var xhr = new XMLHttpRequest();
xhr.onload = function () {
    if (this.status == 200) {
        document.write(this.responseText);
    }
};
xhr.open("GET", "https://api.coindesk.com/v1/bpi/currentprice/USD.json", true);
xhr.send();
With XHR, you can simulate a push with a pull via either setTimeout or setInterval, but if you want a more instant response, you'll have to set up a WebSocket server.
Not every browser supports WebSocket, so you'll have to fall back to timeout/interval polling.
Similar to an XHR object, a WebSocket object defines onopen, onmessage and onerror callbacks.
To get the data via EventSource, you would need a server that implements SSE.
Server-sent Events (SSE) is a specification that allows servers to send events directly to clients that subscribe to those events, similar to WebSockets and related server to client push technologies.
An implementation of EventSource looks something like this (take a look at this link):
var evtSource = new EventSource('sse.php');
var eventList = document.querySelector('ul');
evtSource.onmessage = function (e) {
    var newElement = document.createElement("li");
    newElement.textContent = "message: " + e.data;
    eventList.appendChild(newElement);
};
If the event generator script is hosted on a different origin, a new EventSource object should be created with both the URL and an options dictionary.
const evtSource = new EventSource("//api.example.com/ssedemo.php", { withCredentials: true });
In this case you have a REST API, not an SSE endpoint, so the best you can do to get near-real-time information is to use a setTimeout function to repeat the Ajax call every n milliseconds.
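That polling loop can be sketched in a few lines. This is a minimal standalone sketch with hypothetical names: `fetchPrice` stands in for the real XHR/fetch call against the REST API, and `onPrice` is whatever updates the page.

```javascript
// Poll a "price" repeatedly; the timer is re-armed only after each
// response arrives, so requests never overlap (unlike setInterval).
function startPolling(fetchPrice, onPrice, intervalMs, maxPolls) {
    var count = 0;
    function poll() {
        fetchPrice(function (price) {
            onPrice(price);
            count += 1;
            if (count < maxPolls) {
                setTimeout(poll, intervalMs); // schedule the next pull
            }
        });
    }
    poll();
}
```

Something like `startPolling(fetchPrice, updateUi, 5000, Infinity)` would then refresh the price every five seconds. Re-arming the timer inside the callback, rather than using setInterval, avoids a pile-up of in-flight requests when the server is slow.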
I have some content in local storage. I want to send it in an HTTP header every time a request is made to the server, by invoking something like xhr.setRequestHeader('custom-header', 'value'). Instead of calling a function that does this before every request, I want it to happen automatically.
This can be done easily by overwriting the send method:
// save the real `send`
var realSend = XMLHttpRequest.prototype.send;
// replace `send` with a wrapper
XMLHttpRequest.prototype.send = function () {
    this.setRequestHeader("X-Foobar", "my header content");
    // run the real `send`
    realSend.apply(this, arguments);
};
This turns XMLHttpRequest.prototype.send into a function that does some arbitrary operation (here, setting the X-Foobar request header on the XMLHttpRequest instance) and then executes the actual Ajax request with the real send method.
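The same save-and-wrap pattern works for any prototype method. Here is a minimal standalone sketch using a hypothetical Store class, so the mechanics can be seen without a browser or an XMLHttpRequest:

```javascript
// Hypothetical Store class, used only to demonstrate the pattern.
function Store() { this.items = []; }
Store.prototype.add = function (item) { this.items.push(item); };

// save the real method
var realAdd = Store.prototype.add;
// replace it with a wrapper that runs extra code first
Store.prototype.add = function () {
    this.lastAdded = arguments[0];  // the injected behavior
    realAdd.apply(this, arguments); // then run the real method
};
```

Every existing and future `Store` instance now runs the injected line before the original `add`, exactly as the wrapped `send` above runs `setRequestHeader` before the real request.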
Local storage was actually designed not to be sent to the server automatically. This was an improvement on cookies, which incur a large overhead (if they hold much data) because they are sent with every page request. That slows things down and is particularly bad on mobile phones. So you will have to continue with the method you are already using, or take one of the alternative suggestions offered in other replies.
I was looking into the concept of JSONP callback function. I read some articles regarding that and wanted to get a good grasp of the concept of JSONP.
So, I uploaded a JSON file to the server.
And here is the JS code I wrote to retrieve the data. The call is made from localhost to abhishekprakash.com.
var xhr;
var dataList;
xhr = new XMLHttpRequest();
xhr.open('GET', 'http://abhishekprakash.com/script/example.json?callback=func_callbk', true);
xhr.send();
func_callback = function (data) {
    alert(data.data.people[0].id);
};
xhr.onreadystatechange = function () {
    if (xhr.readyState == 4) {
        console.log(dataList);
    }
};
And this is the response that I get in the console:
The callback function is called but it does not contain the JSON data.
What am I missing?
Any help is appreciated.
Thanks
That example service returns JSON, not JSONP.
The point of JSONP is that, due to Same Origin Policy security restrictions, JavaScript from domain A cannot read the response of a GET request to resources on domain B; in other words, a script cannot retrieve data cross-domain.
JSONP solves this by making domain B explicitly cooperate in the cross-domain data sharing. The script from domain A specifies the name of a callback function and embeds the URL of domain B in the document as if it were including a regular external Javascript file. Domain B then outputs data like this:
callbackFuncName({ data : foo, ... });
That means domain B explicitly outputs a Javascript snippet which calls the specified callback function with the data.
So, unless domain B explicitly cooperates in this, you cannot simply get a JSONP response from it.
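The cooperating server's job is small: read the callback name from the query string and wrap its JSON payload in a call to it. A sketch of just the wrapping step, with a hypothetical helper name:

```javascript
// Build a JSONP response body: the payload serialized as JSON,
// wrapped in a call to the client-supplied callback name.
function jsonpBody(callbackName, payload) {
    return callbackName + "(" + JSON.stringify(payload) + ");";
}
```

For example, `jsonpBody("func_callbk", { hello: "world" })` produces `func_callbk({"hello":"world"});`, which is what the server would have to return for the question's `?callback=func_callbk` request to work.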
The XHR is constrained by cross-domain rules; to use JSONP you need to add a script element:
function func_callbk() {
    console.log(arguments);
}

var s = document.createElement('script');
s.type = 'text/javascript';
s.src = 'http://abhishekprakash.com/script/example.json?callback=func_callbk';

var h = document.getElementsByTagName('script')[0];
h.parentNode.insertBefore(s, h);
As pointed out by Ian in the comments, the proper response of your server should be something like this:
func_callbk('hello world')
Update
If you wish to make this work without JSONP (e.g. if the response should always be JSON), you need to look into CORS as explained in this answer.
I want to retrieve an HTML page as a document inside a Firefox/Greasemonkey userscript.
Edit: This is not a cross-domain request.
Here's my example code:
var r = new XMLHttpRequest();
r.open("GET", document.location.href, true);
r.responseType = "document";
r.send(null);
This looks just like the example at https://developer.mozilla.org/en/HTML_in_XMLHttpRequest,
but r.send(null) causes a TypeError. Causes, not throws! Wrapping the line in a try...catch doesn't change anything; it seems a callback or an event handler raises the exception:
TypeError: document.location is null
The traceback refers to a Firefox-internal event.js file, but not to my script.
Removing the line setting the responseType gets rid of the exception, adding callbacks does not.
However, the response is valid and responseXML provides a DOM tree.
I'm using FF 13.0.1.
Am I missing something or is this a bug?
Solution: This had something to do with an extension created by Mozilla's Addon Builder, not Firefox.
The script is running on google.com and you are trying to fetch google.de, right? That's a cross-domain request. (Also, the question code is not a valid synchronous or asynchronous use of XMLHttpRequest.)
To do cross-domain (or not) Ajax in a Greasemonkey script (or Chrome), use GM_xmlhttpRequest().
Note that GM_xmlhttpRequest() does not currently let you specify responseType, but you don't need to do that in this case anyway. If you want a nice parsed document, use DOMParser.
Putting it all together:
GM_xmlhttpRequest({
    method: 'GET',
    //url: 'https://www.google.de/',
    url: location.href, // self get, checking for updates
    onload: function (respDetails) {
        processResponse(respDetails);
    }
});
function processResponse(respDetails) {
    // DO ALL RESPONSE PROCESSING HERE...
    var parser = new DOMParser();
    var doc = parser.parseFromString(respDetails.responseText, "text/html");

    //--- Example showing that the doc is fully parsed/functional...
    console.log(doc.querySelectorAll("p"));
}
PS: Since this is not cross-domain after all, the original code, corrected would be:
var r = new XMLHttpRequest();
r.onload = function () {
    // DO ALL RESPONSE PROCESSING HERE...
    console.log(this.response.querySelectorAll("div"));
};
r.open("GET", location.href, true);
r.responseType = "document";
r.send(null);
for an asynchronous request.
Unfortunately, you cannot do Ajax from one domain to another:
http://en.wikipedia.org/wiki/Same_origin_policy
You can read into CORS:
http://en.wikipedia.org/wiki/Cross-origin_resource_sharing
or JSONP as possible solutions:
http://en.wikipedia.org/wiki/JSONP
However, browsers are designed so that people can't just randomly make Ajax requests across domains, as that would be a security issue.
If you absolutely need to grab content from a different domain, look into creating your own server-side API (e.g. using cURL) that serves the content from your own domain, and then use Ajax against that. Otherwise, you'll have to see if Google will grant CORS access or has some sort of built-in JSONP endpoint.
I'm trying to implement an iGoogle-like dashboard interface using widgets that get their content from other sites via JSONP calls.
The problem is that if the first widget to call $.ajax takes 8 seconds to get its content back, the callbacks of the other widgets only seem to run after the first widget's callback has executed. For the user experience, it would be better if each widget could be displayed as soon as it gets its content back from the remote site, instead of waiting for the earlier ones to complete.
Is there a way I can do that?
EDIT :
I use jquery 1.4.1.
I tested on Chrome and the behaviour seems to be different than on Firefox.
Here is a script I made up to try to see what happens:
function showTime(add) { console.log(getTime() + ': ' + add); }
function getNow() { return new Date().getTime(); }
initialTime = getNow();
function getTime() { return getNow() - initialTime; }
function display(data) { showTime('received a response'); }
showTime("Launched a request");
jQuery.getJSON("http://localhost:51223/WaitXSeconds/3?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://localhost:51223/WaitXSeconds/4?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://localhost:63372/WaitXSeconds/9?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://services.digg.com/stories/top?appkey=http%3A%2F%2Fmashup.com&type=javascript&callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://www.geonames.org/postalCodeLookupJSON?postalcode=10504&country=US&callback=?", display);
The first three calls are just fake calls that wait the specified number of seconds.
Note that I use two different servers implementing this method.
Here is the result in the console on Firefox 3.6.2 :
0: Launched a request
3: Launched a request
6: Launched a request
11: Launched a request
14: Launched a request
3027: received a response
7096: received a response
9034: received a response
9037: received a response
9039: received a response
.. and here is the result in Chrome 4.1.249.1036 (41514) :
1: Launched a request
2: Launched a request
3: Launched a request
4: Launched a request
5: Launched a request
165: received a response
642: received a response
3145: received a response
7587: received a response
9157: received a response
It seems that in Firefox, the two requests to the two public APIs get called at the end, after all the other calls succeed.
Chrome, on the other hand, manages to execute the callback as soon as it receives the response.
On both browsers, when the requests go to the same server, they are not done in parallel; they are scheduled one after the other. But I guess this is reasonable behaviour.
Can anybody explain Firefox's behaviour or has any hack to go around this?
In Firefox, if one of several concurrent JSONP requests hasn't finished, the subsequent JSONP requests aren't executed, even if their responses have already arrived and been written into their <script> tags. This is because the <script> tags used by JSONP execute synchronously, in order, in Firefox. So if one <script> hasn't finished, the following <script> tags aren't executed, even though they are already populated with response data.
The solution is to wrap concurrent JSONP requests in an iframe. There is a project called jquery-jsonp that solves this issue.
Here is a simplified version of iFramed JSONP:
var jsc = (new Date()).getTime();

function sendJsonpRequest(url, data, callback) {
    var iframe = document.createElement("iframe");
    var $iframe = jQuery(iframe);
    $iframe.css("display", "none");
    jQuery("head").append($iframe);

    var iframeWindow = iframe.contentWindow;
    var iframeDocument = iframeWindow.document;
    iframeDocument.open();
    iframeDocument.write("<html><head></head><body></body></html>");
    iframeDocument.close();

    var jsonp = "jsonp" + jsc++;
    url = url + "?callback=" + jsonp; // don't shadow the url parameter
    var params = jQuery.param(data);
    if (params) {
        url += "&" + params;
    }

    // Handle JSONP-style loading
    iframeWindow[jsonp] = function (data) {
        if (callback) {
            callback(data);
        }
        // Garbage collect
        iframeWindow[jsonp] = undefined;
        try { delete iframeWindow[jsonp]; } catch (e) {}
        if (head) {
            head.removeChild(script);
        }
        $iframe.remove();
    };

    var head = iframeDocument.getElementsByTagName("head")[0];
    var script = iframeDocument.createElement("script");
    script.src = url;
    head.appendChild(script);
}
According to the jQuery.ajax() page:
The first letter in Ajax stands for "asynchronous," meaning that the operation occurs in parallel and the order of completion is not guaranteed.
I don't know why the latter-called widgets are returning later, but I don't think it's to do with the jQuery call, unless, as Peter suggested, you've explicitly set async to false.
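The "order of completion is not guaranteed" point can be seen with plain timers standing in for network latency. This is a standalone sketch with hypothetical delays, not the widget code itself:

```javascript
// Two "requests" started in launch order; the slower one was started
// first, so completion order differs from launch order.
function fakeRequest(name, delayMs, onDone) {
    setTimeout(function () { onDone(name); }, delayMs);
}

var completions = [];
fakeRequest("slow", 30, function (n) { completions.push(n); });
fakeRequest("fast", 5, function (n) { completions.push(n); });
```

After both timers fire, `completions` is `["fast", "slow"]`: the second request's callback ran first. That is what truly asynchronous requests should do, and what the Firefox JSONP behaviour in the question fails to do.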
By default $.ajax is asynchronous.
async (Boolean) Default: true
Make sure you don't have it set to false. Debug the XHR requests using Firebug to see whether the requests are sent correctly and why the DOM is not getting updated.
You could have a look at this tutorial to see how to use these tools and how to discover what's wrong with your GUI.