I'm trying to implement an iGoogle-like dashboard interface using widgets that fetch their content from other sites via JSONP calls.
The problem is that if the first widget that calls the "$.ajax" takes 8 seconds to get the content back, it seems that the callbacks of the other widgets will only be called after the callback of the first widget gets executed. For the user experience, it would be better if the widgets could be displayed as soon as they get the content back from the remote sites, and not wait for those that were scheduled before to complete.
Is there a way I can do that?
EDIT:
I use jQuery 1.4.1.
I tested on Chrome and the behaviour seems to be different from Firefox's.
Here is a script I put together to see what happens:
function getNow() { return new Date().getTime(); }
var initialTime = getNow();
function getTime() { return getNow() - initialTime; }
function showTime(add) { console.log(getTime() + ': ' + add); }
function display(data) { showTime('received a response'); }
showTime("Launched a request");
jQuery.getJSON("http://localhost:51223/WaitXSeconds/3?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://localhost:51223/WaitXSeconds/4?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://localhost:63372/WaitXSeconds/9?callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://services.digg.com/stories/top?appkey=http%3A%2F%2Fmashup.com&type=javascript&callback=?", display);
showTime("Launched a request");
jQuery.getJSON("http://www.geonames.org/postalCodeLookupJSON?postalcode=10504&country=US&callback=?", display);
The first three calls are just fake calls that wait the specified number of seconds.
Note that I use two different servers implementing this method.
Here is the result in the console on Firefox 3.6.2:
0: Launched a request
3: Launched a request
6: Launched a request
11: Launched a request
14: Launched a request
3027: received a response
7096: received a response
9034: received a response
9037: received a response
9039: received a response
... and here is the result in Chrome 4.1.249.1036 (41514):
1: Launched a request
2: Launched a request
3: Launched a request
4: Launched a request
5: Launched a request
165: received a response
642: received a response
3145: received a response
7587: received a response
9157: received a response
It seems that in Firefox, the two requests to the two public APIs get called at the end, after all the other calls succeed.
Chrome, on the other hand, manages to execute the callback as soon as it receives the response.
In both browsers, when the requests go to the same server, they are not executed in parallel; they are scheduled one after the other. But I guess this is reasonable behaviour.
Can anybody explain Firefox's behaviour or has any hack to go around this?
In Firefox, if one concurrent JSONP request hasn't finished, none of the subsequent JSONP requests are executed, even if their responses have already arrived and been written into their <script> tags. This is because the <script> tags used by JSONP are executed synchronously in Firefox: if one <script> hasn't finished, the following <script> tags aren't executed, even though they are already populated with response data.
The solution is to wrap each concurrent JSONP request in an iframe. There is a project called jquery-jsonp that solves this issue.
Here is a simplified version of iFramed JSONP:
var jsc = (new Date()).getTime();

function sendJsonpRequest(url, data, callback) {
    // Create a hidden iframe so the JSONP <script> executes in its own window
    var iframe = document.createElement("iframe");
    var $iframe = jQuery(iframe);
    $iframe.css("display", "none");
    jQuery("head").append($iframe);

    var iframeWindow = iframe.contentWindow;
    var iframeDocument = iframeWindow.document;
    iframeDocument.open();
    iframeDocument.write("<html><head></head><body></body></html>");
    iframeDocument.close();

    // Unique callback name per request
    var jsonp = "jsonp" + jsc++;
    url += "?callback=" + jsonp;
    var params = jQuery.param(data);
    if (params) {
        url += "&" + params;
    }

    var head = iframeDocument.getElementsByTagName("head")[0];
    var script = iframeDocument.createElement("script");

    // Handle JSONP-style loading
    iframeWindow[jsonp] = function(data) {
        if (callback) {
            callback(data);
        }
        // Garbage collect
        iframeWindow[jsonp] = undefined;
        try { delete iframeWindow[jsonp]; } catch(e) {}
        if (head) {
            head.removeChild(script);
        }
        $iframe.remove();
    };

    script.src = url;
    head.appendChild(script);
}
According to the jQuery.ajax() page:
The first letter in Ajax stands for "asynchronous," meaning that the operation occurs in parallel and the order of completion is not guaranteed.
I don't know why the later-called widgets are returning later, but I don't think it has to do with the jQuery call, unless, as Peter suggested, you've explicitly set async to false.
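The point that completion order is not guaranteed can be seen with plain JavaScript alone; here is a small sketch with simulated latencies (no jQuery, no network, all names are mine):

```javascript
// Simulate three "requests" with different latencies. Completion order
// follows latency, not launch order, because nothing blocks in between.
function fakeRequest(name, delayMs, done) {
  setTimeout(function () { done(name); }, delayMs);
}

var completed = [];
fakeRequest("slow", 30, function (n) { completed.push(n); });
fakeRequest("medium", 20, function (n) { completed.push(n); });
fakeRequest("fast", 10, function (n) { completed.push(n); });

// After all three have fired, the order reflects latency only
setTimeout(function () {
  console.log(completed.join(", ")); // fast, medium, slow
}, 60);
```

A JSONP request that blocks other callbacks, as in the Firefox behaviour above, breaks exactly this expectation.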
By default $.ajax is asynchronous:
async (Boolean) Default: true
Make sure you don't have it set to false. Debug the XHR requests with Firebug to see whether the requests are sent correctly and why the DOM is not being updated.
You could have a look at this Tutorial to see how to use these tools and how to discover what's wrong with your GUI.
Related
I want to make an AJAX call with vanilla JS.
In jQuery, I have this working AJAX call:
$.ajax({
    url: "/faq/ajax",
    dataType: 'json',
    type: "POST",
    data: { search: 'banana' },
    success: function (r) {
        console.log(r['name']);
    }
});
Vanilla JS:
var search = document.getElementById('searchbarfaq').value;
var r = new XMLHttpRequest();
r.open("POST", "/faq/ajax", true);
r.onreadystatechange = function () {
    if (r.readyState != 4 || r.status != 200) return;
    console.log("Success: " + JSON.parse(r.responseText));
    var a = JSON.parse(r.responseText);
    console.log(a.name); // also tried a['name']...
};
r.send("search=banana");
The vanilla js call just logs this to the console:
"Success: [object Object]"
Array [ ]
Can someone tell me what I am doing wrong?
You haven't told the server how the data in the request is encoded. Add this after calling open() and before send():
r.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
Presumably whatever server-side handler you are using to process the data isn't parsing it correctly, isn't finding the data it needs, and so returns a blank array as the result.
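For reference, a body sent as application/x-www-form-urlencoded must itself be URL-encoded; a minimal sketch of building such a body (the helper name is mine, not a standard API):

```javascript
// Hypothetical helper: build an application/x-www-form-urlencoded body
// from a plain object, encoding both keys and values.
function formEncode(data) {
  var pairs = [];
  for (var key in data) {
    if (Object.prototype.hasOwnProperty.call(data, key)) {
      pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(data[key]));
    }
  }
  return pairs.join("&");
}

console.log(formEncode({ search: "banana" }));         // search=banana
console.log(formEncode({ search: "two words&more" })); // search=two%20words%26more
```

The result is what you would pass to r.send() after setting the Content-Type header.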
Beyond printing out r.responseText to the console, you can also inspect the HTTP response from dev tools built into the browser itself.
On Firefox, for instance:
Tools -> Web Developer -> Network
(this should open a panel listing all the HTTP requests and responses)
Go through the process you use to execute your AJAX call
Look at the corresponding HTTP request by clicking on the item in the list in the panel shown in step 1 (a panel on the right should appear with more details about the request and subsequent response)
Digging around in these tools can give you a lot of insight into the HTTP requests your code is making and the values it's getting back in the responses.
A similar process can be performed for all the major browsers out there.
You can use this simple and lightweight Ajax module with the following syntax:
import {ajax} from '/path/to/ajax.min.js';
ajax('https://api_url.com')
.data('key-1','Value-1')
.data('key-2','Value-2')
.send()
.then((data) => { console.log ('success', data) })
.catch((status) => { console.log ('failed', status)} );
I have a Node.js application where I make some AJAX requests using jQuery. In the developer tools, the response of the last AJAX request is empty if I redirect afterwards; otherwise the response is there. Is there any logic to why it wouldn't show the response in case of a redirection?
I don't understand: the redirection is made in the AJAX callback, and since the redirection works based on values from the response, the response must exist, yet Chrome dev tools won't show it. What am I doing wrong?
here is my callback
.done(function (response) {
    if (response.errorCode == "00") {
        //window.location = "/"; // no response shown in dev tools if I uncomment this
        console.log("Yeah i got some response " + response);
    }
})
Make sure "Preserve log upon navigation" is enabled in the Chrome dev tools settings.
Let
window.location = '/whatever/address';
be the last thing you call, or do it later using
setTimeout(function () {
    window.location = '/whatever/address';
}, 1);
Beware that all variable values will be lost on a new page load / navigation.
After the redirection everything gets reloaded, which is why you can't access the response.
To solve this, first use your response object and then redirect the page.
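The same idea as a sketch: do everything you need with the response first, and navigate only as the final step (the function names and URL here are illustrative, not from the original code):

```javascript
// Hypothetical handler: consume the response fully, then navigate.
// `navigate` stands in for assigning window.location.
function handleResponse(response, navigate) {
  var message = null;
  if (response.errorCode == "00") {
    // Use the response first...
    message = "Yeah i got some response, errorCode=" + response.errorCode;
    // ...and redirect last, once nothing else needs the response.
    navigate("/");
  }
  return message;
}
```

In the browser, `navigate` would simply be `function (url) { window.location = url; }`.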
Noob question on using callbacks as a control-flow pattern with Node and the http class. Based on my understanding of the event loop, all code is blocking, I/O is non-blocking and uses callbacks. Here's a simple HTTP server and a pseudo REST function:
// Require
var http = require("http");

// Class
function REST() {}

// Methods
REST.prototype.resolve = function(request, response, callback) {
    // Pseudo REST function
    function callREST(request, callback) {
        if (request.url == '/test/slow') {
            setTimeout(function() { callback('time is 30 seconds'); }, 30000);
        } else if (request.url == '/test/foo') {
            callback('bar');
        }
    }
    // Call pseudo REST
    callREST(request, callback);
};

// Class
function HTTPServer() {}

// Methods
HTTPServer.prototype.start = function() {
    http.createServer(function (request, response) {
        // Listeners
        request.resume();
        request.on("end", function () {
            // Execute only if not a favicon request
            var faviconCheck = request.url.indexOf("favicon");
            if (faviconCheck < 0) {
                // Print
                console.log('incoming validated HTTP request: ' + request.url);
                // Instantiate and execute on a new REST object
                var rest = new REST();
                rest.resolve(request, response, function(responseMsg) {
                    var contentType = {'Content-Type': 'text/plain'};
                    response.writeHead(200, contentType); // Write response header
                    response.end(responseMsg); // Send response and end
                    console.log(request.url + ' response sent and ended');
                });
            } else {
                response.end();
            }
        });
    }).listen(8080);

    // Print to console
    console.log('HTTPServer running on 8080. PID is ' + process.pid);
};

// Process
// Create http server instance
var httpServer = new HTTPServer();
// Start
httpServer.start();
If I open up a browser and hit the server with "/test/slow" in one tab and then "/test/foo" in another, I get the following behavior: "foo" responds with "bar" immediately, and then 30 seconds later "slow" responds with "time is 30 seconds". This is what I was expecting.
But if I open up 3 tabs in a browser and hit the server with "/test/slow" successively in each tab, "slow" is being processed and responds serially/synchronously so that the 3 responses appear at 30 second intervals. I was expecting the responses right after each other if they were being processed asynchronously.
What am I doing wrong?
Thank you for your thoughts.
This is actually not the server's fault. Your browser is opening a single connection and re-using it between the requests, but one request can't begin until the previous finishes. You can see this a couple of ways:
Look in the network tab of the Chrome dev tools - the entry for the longest one will show the request in the blocking state until the first two finish.
Try opening the slow page in different browsers (or one each in normal and incognito windows) - this prevents sharing connections.
Thus, this will only happen when the same browser window makes multiple requests to the same server. Also note that XHR (AJAX) requests open separate connections, so they can be performed in parallel. In the real world, this won't be a problem.
Slightly obscure title but here goes ....
I have a Backbone UI that makes a massive number of calls to an API on page load. It uses Backbone Fetch Cache to cache the GET requests. On Chrome, on a cache miss, when many GET requests are made to the same URL at the same time, Chrome makes the duplicate XHRs wait until the first has finished, after which the subsequent ones hit the cache.
In Firefox, all XHRs are issued immediately, even when they are GET requests to the same API endpoint. Refactoring this out of the code would be a pain, so the question is:
Question:
Is there an existing method to patch either the sync() part of Backbone or jQuery so that the Chrome behavior is used across all browsers, i.e. so that Firefox waits on the first of the duplicate GET requests before processing the others?
You can modify Backbone.ajax to keep a list of requests and wait for the first to complete before issuing the subsequent ones. For example:
// cached requests
Backbone.xhrs = {};

Backbone.ajax = function(opts) {
    // cache GET requests, not the others
    if (opts.type !== 'GET')
        return Backbone.$.ajax.apply(Backbone.$, arguments);

    var xhr;
    // issue the request if a cached version does not exist
    if (!Backbone.xhrs[opts.url]) {
        xhr = Backbone.xhrs[opts.url] = Backbone.$.ajax.call(Backbone.$, opts);
    } else {
        // wait for the in-flight request, then re-issue (it should hit the cache)
        xhr = Backbone.xhrs[opts.url].then(function() {
            return Backbone.$.ajax.call(Backbone.$, opts);
        });
    }
    return xhr;
};
And a demo http://jsfiddle.net/nikoshr/vexNP/
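The deduplication idea is independent of Backbone; here is a minimal sketch with plain promises and an injected request function (all names are mine, not part of Backbone or jQuery):

```javascript
// Hypothetical sketch: share the first in-flight request per URL, and make
// later callers wait for it before issuing their own request (which, with a
// fetch cache in place, would then hit the cache).
function makeDeduper(doRequest) {
  var inflight = {};
  return function (url) {
    if (!inflight[url]) {
      // first request for this URL: issue it and remember the promise
      inflight[url] = doRequest(url);
      return inflight[url];
    }
    // duplicate: wait for the first, then re-issue
    return inflight[url].then(function () {
      return doRequest(url);
    });
  };
}
```

The injected `doRequest` keeps the sketch testable; in the Backbone version above, that role is played by `Backbone.$.ajax`.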
I only have to support new browsers.
I have to rely on an external service to provide JSONP data, I do not own that service and it does not allow CORS.
I feel very uneasy having to trust JSONP requests from the external server, since they can run arbitrary code on my end, which would allow them to track my users and even steal their information.
I was wondering if there was any way to create a JSONP request that is also secure?
(Related: How to reliably secure public JSONP requests? but not with the new browser relaxation)
NOTE: I asked/answered it Q&A style, but I'm very open to other ideas.
Yes!
It is possible. One way to do it is to use WebWorkers. Code running in a WebWorker has no access to the DOM or to the other JavaScript code your page is running.
You can create a WebWorker, execute the JSONP request in it, and terminate it when you're done.
The process is something like this:
Create a WebWorker from a blob with the URL to request
Use importScripts to load the JSONP response with a local callback
When that callback executes, post a message back to the page, which in turn executes the actual callback function with the data.
That way, an attacker would have no information about the DOM.
Here is a sample implementation:
// Creates a secure JSONP request using web workers.
// url - the url to send the request to
// data - the url parameters to send via querystring
// callback - a function to execute when done
function jsonp(url, data, callback) {
    // support two-parameter calls: jsonp(url, callback)
    if (typeof callback === "undefined") {
        callback = data;
        data = {};
    }

    // serialize the GET parameters
    var getParams = "";
    for (var i in data) {
        getParams += "&" + i + "=" + encodeURIComponent(data[i]);
    }

    // The worker runs importScripts on the JSONP URL and posts the result back.
    // The request starts as soon as the worker is created.
    var blob = new Blob([
        "var cb=function(val){postMessage(val)};" +
        "importScripts('" + url + "?callback=cb" + getParams + "');"
    ], { type: "text/javascript" });
    var blobURL = window.URL.createObjectURL(blob);
    var worker = new Worker(blobURL);

    // When the worker posts a message, execute the callback and stop the worker
    worker.onmessage = function (e) {
        callback(e.data);
        worker.terminate();
    };

    // Safety net: terminate after 10 seconds in any case
    setTimeout(function() {
        worker.terminate();
    }, 10000);
}
Here is sample usage that works in JSFiddle:
jsonp("http://jsfiddle.net/echo/jsonp", {
    "hello": "world"
}, function (response) {
    alert(response.hello);
});
This implementation does not deal with some other issues, but by preventing all access to the DOM and to the current JavaScript on the page, one can create a safe WebWorker environment.
This should work on IE10+, Chrome, Firefox and Safari as well as mobile browsers.