NOTE
This problem somehow happens only with a specific server-side API, so it addresses the wrong problem. I won't delete it since it has answers and comments.
I'm trying to execute a few Ajax requests, do some stuff after each is done, and some other stuff after all of them are done. For that I'm using the code below:
let
myarr = [],
myfunc = arg => myarr.push(arg);
$.when(
$.post(myparams).done(myfunc),
$.post(otherparams).done(myfunc),
$.post(yetanother).done(myfunc)
// it comes out with only one arg
).then(e => console.log(myarr));
But by the time the then block executes, usually only the done handler of the first operation has run. How could I fix that?
I'm sorry if it's a duplicate, but honestly I didn't even know what to search for :/
Comment
I also tried to create my own deferreds, where I would execute the Ajax calls and resolve the deferreds inside the done block, but that yielded the same results.
Using only done or only then gives the same result.
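For reference, a minimal sketch of what that "own deferreds" attempt might have looked like (the parameter names are the ones from the snippet above):
let myarr = [],
    myfunc = arg => myarr.push(arg),
    wrap = params => {
        let d = $.Deferred();
        $.post(params).done(data => {
            myfunc(data);
            d.resolve(data); // resolve our own deferred from inside done
        });
        return d.promise();
    };
$.when(wrap(myparams), wrap(otherparams), wrap(yetanother))
    .then(() => console.log(myarr));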
Per jQuery's documentation on $.when():
Each argument [of .then()] is an array with the following structure: [ data, statusText, jqXHR ]
Meaning you could do something like this...
$.when(
$.post(myparams),
$.post(otherparams),
$.post(yetanother)
).then((res1, res2, res3) => { //Arg for each result
myfunc(res1[0]); //Call myfunc for result 1's data
myfunc(res2[0]); //Call myfunc for result 2's data
myfunc(res3[0]); //Call myfunc for result 3's data
});
Though perhaps a cleaner version might be something like this instead...
let
myarr = [],
myfunc = arg => myarr.push(arg);
$.when(
$.get('https://jsonplaceholder.typicode.com/todos/1'),
$.get('https://jsonplaceholder.typicode.com/todos/2'),
$.get('https://jsonplaceholder.typicode.com/todos/3')
).then((...results) => { //Get all results as an array
results.map(r=>r[0]).forEach(myfunc); //Call myfunc for each result's data
console.log(myarr);
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
Related
I found the following example in the ngResource documentation:
var cards = CreditCard.query(function() {
// GET: /user/123/card
// server returns: [ {id:456, number:'1234', name:'Smith'} ];
var card = cards[0];
// each item is an instance of CreditCard
expect(card instanceof CreditCard).toEqual(true);
card.name = "J. Smith";
// non GET methods are mapped onto the instances
card.$save();
// POST: /user/123/card/456 {id:456, number:'1234', name:'J. Smith'}
// server returns: {id:456, number:'1234', name: 'J. Smith'};
// our custom method is mapped as well.
card.$charge({amount:9.99});
// POST: /user/123/card/456?amount=9.99&charge=true {id:456, number:'1234', name:'J. Smith'}
});
As I understand it, the second parameter of query() is a callback that is evaluated when the resource query succeeds. But at the same time, this callback uses the variable cards, which is assigned from the result of query().
I can't understand whether this is normal for JavaScript, since every async operation executes on a single thread.
Or did the creators of AngularJS take special measures to have the function parameter executed after its result is returned?
How would I write my own function
function myfunction(argument, runbefore, runafter) {
runbefore();
POSTPONE runafter();
return Math.sin(argument);
}
which would execute its 2nd parameter before itself and its 3rd parameter after itself?
If I understand right, you are asking how it is possible for the callback function to be called after the return statement. One way this is possible is through built-in functions that call another function at a later time. Take this code for example:
function doItLater(arg1, callbackFn) {
setTimeout(callbackFn, 1000); // schedule callbackFn to run about 1 second later
return arg1;
}
This will return the same argument that it was passed, and the callback function will be called later (about 1 second after the function has already returned). There are other ways a callback function can be delayed. For example, with an XMLHttpRequest, a callback function can be called after an HTTP response has been received. You can also connect to user events, so that a function will be called when the user does something specific.
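Applied to the question's own pseudocode, one possible way (an assumption, using setTimeout) to "postpone" runafter is to schedule it asynchronously:
function myfunction(argument, runbefore, runafter) {
    runbefore();              // runs synchronously, before the return
    setTimeout(runafter, 0);  // queued to run after myfunction has returned
    return Math.sin(argument);
}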
If you want a little clarification on how things like setTimeout work in a single-threaded environment, I would suggest reading this article by John Resig.
Sorry if this question has been answered before, but I couldn't find it.
I have an array of objects, and for each object I want to make an async (Ajax) call, and when all the async calls are finished, I want to call another function.
eg.
var list = [Object, Object, Object, Object];
var final= [];
$(list).each(function(){
//ajax call
getSomething(data, function(data){
final.push(data);
});
});
After all Ajax calls are finished I want to call load(final).
Can this be done with callbacks, and without libraries like when.js etc.?
Thanks.
Call the function when the last item has arrived:
final.push(data);
if (final.length == list.length) load(final);
Since it looks like you have jQuery available, you can use the promises built into jQuery:
var list = [Object, Object, Object, Object];
var promises = [];
$.each(list, function(){
//ajax call
promises.push($.get(data));
});
$.when.apply($, promises).done(function() {
// all ajax calls done
// data from each ajax call is in arguments[0], arguments[1], etc...
load(arguments);
});
One other nice advantage of this mechanism vs. all the others shown so far is that it will keep the results in the order that you requested them, even if they don't come back in that order.
You can also provide a handler to .fail() in addition to .done() (or specify both with a .then(f1, f2)) if you want to catch the case where any ajax call fails.
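For example, one way the failure handler could be wired up (the handler bodies here are only illustrative placeholders):
$.when.apply($, promises).then(
    function () {
        // all ajax calls succeeded
        load(arguments);
    },
    function (jqXHR, textStatus, errorThrown) {
        // runs as soon as any one of the ajax calls fails
        console.error("A request failed:", textStatus);
    }
);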
This is a way to solve the problem with a simple counter:
var counter = $(list).length;
$(list).each(function(){
$.get('URL', function(data){
/* do whatever you need for each list item */
if(--counter === 0){
/* Do whatever you wanted to do when all requests completed */
}
});
});
Fundamentally, you keep track of how many calls you've made and how many responses you've gotten, and when you've got all the responses, you call load(final). In your case, quite conveniently you have two arrays and are pushing the results of the calls based on the first array into the second, so you can compare their lengths. (Of course, you'll want to handle the error condition as well.)
Your quoted code is a bit suspect (I think you wanted $.each(list, ..., not $(list).each(...)), but you probably meant something like this:
var list = [Object, Object, Object, Object];
var final= [];
$.each(list, function(i, data){ // $.each passes (index, value) to its callback
//ajax call
getSomething(data, function(result){ // <= I used `result` rather than `data`; using the same symbol in intermixed code like this is asking for trouble
final.push(result);
if (final.length === list.length) {
// All done
load(final);
}
});
});
I am having a problem, or perhaps a lack of understanding, with the execution order of jQuery's $.get() function. I want to retrieve some information from a database server to use in the $.ready() function. As you all know, when the get returns, it passes the data to a return handler that does something with the data. In my case I want to assign some values to variables declared inside the ready handler function. But the problem is, the return handler of $.get() does not execute until after ready has exited. I was wondering (a) whether I'm doing this right or there is a better way, or (b) whether there is a way around this (that is, forcing the get return handler to execute immediately, or some other fix I'm not aware of). I have a feeling this is some closure thing that I'm not getting about JavaScript.
As per request, I'll post an example of what I mean:
$(function() {
var userID;
$.get(uri, function(returnData) {
var parsedData = JSON.parse(returnData);
userID = parsedData.userID;
});
});
So as you can see, I'm declaring a variable in ready. Then using a get call to the database to retrieve the data needed. Then I parse the JSON that is returned and assign the userID to the variable declared before. I've tested it with a couple alerts. An alert after the get shows userID as undefined but then an alert in get's return handler shows it to be assigned.
$.get() is asynchronous. You have to use a callback to fill your variable and do the computation after the request is complete. Something like:
$(document).ready(function(){
$.get( "yourUrl", function( data, textStatus, jqXHR ) {
var myData = data; // data contains the response content
// perform your processing here...
registerHandlers( myData ); // you can only pass "data" of course...
});
});
// your function to register the handlers as you said you need to.
function registerHandlers( data ) {
// registering handlers...
}
$.get is an Ajax request. The A in AJAX stands for asynchronous, so the script won't wait for this request to finish, but will instead proceed with the rest of your code.
You can either use a complete callback, or you can use $.ajax and set async to false to perform a synchronous request.
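A minimal sketch of the synchronous variant mentioned above (the URL and response shape are assumed; note that synchronous requests block the browser and are deprecated):
var userID;
$.ajax({
    url: "yourUrl",           // assumed URL
    async: false,             // block until the response arrives
    success: function (data) {
        userID = JSON.parse(data).userID;
    }
});
// userID is usable here, because the request completed before this line ran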
The $.get() function executes an asynchronous HTTP request, so the callback function will be executed whenever the request returns something. You should handle this callback outside of $.ready().
Maybe if you explain exactly what you want to do, it would be easier to help!
Are you looking for something like:
$(document).ready(function(){
var variable1, variable2;
$.get('mydata.url', function(data){
variable1 = data.mydata1;
variable2 = data.mydata2;
});
});
If you declare the variables first, then you can set their values within the get call. You can add a function call at the end of the get handler to call a separate function using these values. Without some kind of example, it's hard to go into any more detail.
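Something like this, for instance (useMyData is a hypothetical function defined elsewhere):
$.get('mydata.url', function(data){
    variable1 = data.mydata1;
    variable2 = data.mydata2;
    useMyData(variable1, variable2); // runs only once the data has arrived
});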
Without seeing the full code, my guess is that you should declare your variable outside $.ready; initialize it in ready for the initial page load; then update it from the get callback handler.
for example
var x = ""; // declaration
$(document).ready(function() { x = "initial value"; });
$.get(...).success(function() { x = "updated from ajax"; });
I have two functions, one of which includes multiple JSON calls, which are POSTs by nature.
I want these to be synchronous. That is, one should run only upon the completion of the previous post (and once all posts are done and successful, I want the second function to fire).
The code structure is somewhat like this:
$.getSomeData = function() {
$.postJSON("iwantdata.htm",{data:data},function(data)){
});
$.postJSON("iwantmoredata.htm",{data:data},function(data)){
});
});
$.useSomeData = function() {
});
useSomeData must only run after the JSON calls have completed.
Can anyone please help me? Thanks in advance.
So basically you want something like this:
function chainPost(url1, url2, initialInput, func) {
$.post(url1, {data: initialInput})
.done(function (initialOutput) {
$.post(url2, {data: initialOutput})
.done(function (secondOutput) {
func(initialOutput, secondOutput);
});
});
}
chainPost("iwantdata.htm", "iwantmoredata.htm", 0, function (first, second) {
alert(first);
alert(second);
});
You can just nest them, starting the 2nd one in the completion function of the first and so on:
$.getSomeData = function() {
$.postJSON("iwantdata.htm",{data:data},function(data) {
$.postJSON("iwantmoredata.htm",{data:data},function(data)){
// use the data here
});
});
};
When dealing with asynchronous functions, you cannot write code such as:
$.getSomeData();
$.useSomeData();
By definition, the first is asynchronous, so it will not have completed yet when the second function is called, and JavaScript does not have the ability to stop execution until an asynchronous operation is done.
You could pass your use function to the get function; it would then get called when the data was available. As an addition to the above example, it would look like this:
$.getSomeData = function(fn) {
$.postJSON("iwantdata.htm",{data:data},function(data) {
$.postJSON("iwantmoredata.htm",{data:data},function(data)){
fn(data);
});
});
};
Then, you'd have a getSomeData(useFn) function that would take an argument of the function to call when all the data was ready.
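Usage would then look something like this (a sketch, assuming $.useSomeData takes the data as its argument):
$.getSomeData(function(data) {
    // both requests have completed at this point
    $.useSomeData(data);
});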
Deferred objects [docs] are perfect for this. Unfortunately, your code example contains syntax errors and it is not clear how the calls are nested. So, I'm not sure whether you want to run both Ajax calls one after another or in parallel, but either way is possible.
Here are two examples. Have a look at the documentation for more information and play around with it.
Note: $.postJSON is not a built-in jQuery method; I assume here that it returns the return value of the $.ajax (or $.post) call.
Parallel Ajax calls:
$.getSomeData = function() {
var a = $.postJSON("iwantdata.htm", {data:data});
var b = $.postJSON("iwantmoredata.htm", {data:data});
// return a new promise object which gets resolved when both calls are
// successful
return $.when(a, b);
};
// when both calls are successful, call `$.useSomeData`
// it will have access to the responses of both Ajax calls
$.getSomeData().done($.useSomeData);
See: $.when
Chained Ajax calls:
... where the response of the first call is the input for the second one. This is only an example; of course you can pass any data you want.
$.getSomeData = function() {
return $.postJSON("iwantdata.htm", {data:data}).pipe(function(response) {
// execute the second Ajax call upon successful completion
// of the first one
return $.postJSON("iwantmoredata.htm", {data:response});
});
};
// if both Ajax calls are successful, call `$.useSomeData`
// it will have access to the response of the second Ajax call
$.getSomeData().done($.useSomeData);
See: deferred.pipe()
If you have a more complex logic, you can also create, resolve or reject your own deferred objects. Have a look at the examples in the documentation.
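As a rough sketch of that idea (the success check on response.ok is an assumption, not part of your API):
$.getSomeData = function() {
    var dfd = $.Deferred();
    $.postJSON("iwantdata.htm", {data: data})
        .done(function(response) {
            if (response && response.ok) {   // assumed success criterion
                dfd.resolve(response);
            } else {
                dfd.reject(response);
            }
        })
        .fail(dfd.reject);
    return dfd.promise();
};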
In my app I have the following:
client.on('test', function(req, fn) {
var returnArr = [];
redis.hkeys(req, function (err, replies) {
replies.forEach(function(reply, i) {
if (reply.indexOf('list.') > -1) {
redis.hgetall(reply.substring(5), function(err, r) {
returnArr.push({name:r['name'],index:i});
console.log(returnArr);
});
}
});
console.log(returnArr);
});
console.log(returnArr);
});
For some reason, the second and third logs contain a blank array even though the array is declared once at the beginning of the event handler. Any ideas?
EDIT: Sorry, I changed the variable name when I posted it here without thinking. This happens when it's named anything.
Those redis calls are asynchronous. That's why you provide them with callbacks. For that reason, the code won't work even if you fix the variable name.
To elaborate: the code in the callback to "hkeys" will be invoked when the data is available. The call will return immediately, however, so your array will have nothing in it at that point.
You cannot wrap asynchronous calls in a function and expect to return a value. It simply won't work.
Instead, the general pattern is to do exactly what the redis API does (and virtually everything else in the node.js world does; that's kind of the whole point, in fact): give your own function a callback argument to be invoked when appropriate. In your case, it'll be inside the "hgetall" callback that's the last one to be invoked. It should figure out that your results array has as many values in it as there are keys, and so it's time to call the callback passed in to your function.
(I should note that it's not clear what you're trying to do, given that the overall function appears to be a callback to something.)
Another approach would be to use some sort of "promise" pattern, though that's really just a restructuring of the same idea.
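For illustration only, here is a sketch of that restructuring, wrapping the same callback-style call in a promise (this assumes a Promise implementation is available):
function hgetallAsync(key) {
    return new Promise(function (resolve, reject) {
        redis.hgetall(key, function (err, r) {
            if (err) reject(err);   // propagate the redis error
            else resolve(r);        // fulfil with the hash contents
        });
    });
}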
edit — the general pattern for an API with a callback would be something like this:
function yourAPI( param1, param2, callback ) {
    // ...
    some.asynchronousFunction( whatever, function( result ) {
        callback( result );
    });
}
Now in your case you're making multiple asynchronous service requests, and you'd need to figure out when it's time to invoke the callback. I think you'd probably want to iterate through the "replies" from the call to get the keys and extract the list of ones you want to fetch:
redis.hkeys(req, function (err, replies) {
    var keys = [];
    replies.forEach(function(reply) {
        if (reply.indexOf('list.') > -1) {
            keys.push( reply.substring(5) );
        }
    });
    var returnArr = [];
    keys.forEach( function( key, i ) {
        redis.hgetall(key, function(err, r) {
            returnArr.push({name: r['name'], index: i});
            if (returnArr.length === keys.length) {
                // all values ready
                callback( returnArr );
            }
        });
    });
});
You cannot call your variable return
It is one of a few reserved words that you cannot use in your code as variables.
As Neal suggests, don't use JavaScript reserved words for your variables; here is the list:
https://developer.mozilla.org/en/JavaScript/Reference/Reserved_Words
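For example, trying to use the reserved word as an identifier fails immediately:
var return = [];   // throws a SyntaxError: "return" is a reserved word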
@Pointy answered this tersely already, but let me explain it a bit more clearly: those nested functions are not being run in the order you think they are.
Node.js is non-blocking and uses JavaScript's implicit event loop to execute them when they are ready. Here's your code with line numbers:
/*01*/ client.on('test', function(req, fn) {
/*02*/ var returnArr = [];
/*03*/ redis.hkeys(req, function (err, replies) {
/*04*/ replies.forEach(function(reply, i) {
/*05*/ if (reply.indexOf('list.') > -1) {
/*06*/ redis.hgetall(reply.substring(5), function(err, r) {
/*07*/ returnArr.push({name:r['name'],index:i});
/*08*/ console.log(returnArr);
/*09*/ });
/*10*/ }
/*11*/ });
/*12*/ console.log(returnArr);
/*13*/ });
/*14*/ console.log(returnArr);
/*15*/ });
/*16*/ //Any other code you have after this.
So, what's the order of execution of this thing?
Line 1: Register the event handler for the 'test' event.
Line 16: Start running any other code to be run during this pass through the event loop
Line 2: A 'test' event has been received at some point by the event loop and is now being handled, so returnArr is initialized
Line 3: A non-blocking IO request is performed, and a callback function is registered to execute when the proper event is queued into the event loop.
Lines 14-15: The last console.log is executed and this function finishes running, which ends the handling of the current event.
Line 4: The request event returns and the callback is executed. The forEach method is a blocking (synchronous) call, so its callback is executed for every reply before execution continues.
Line 5: The if statement is executed and either ends (goes to line 10) or enters the block (goes to line 6)
Line 6: A non-blocking IO request is performed, adding a new event to the event loop and a new callback to be run when the event comes back.
Line 9: Finishes the registration of the callback.
Line 10: Finishes the if statement
Line 11: Finishes the `forEach` callbacks.
Line 12: Executes the second console.log request, which still has nothing in the returnArr
Line 7: One of the events returns and fires the event handler. The returnArr is given the new data.
Line 8: The first console.log is executed. Depending on which event this is, the length of the array will be different. Also the order of the array elements DOES NOT have to match the order of the replies listed in the replies array.
Essentially, you can look at the more deeply nested functions as executing after the entirety of the less deeply nested functions (because that's what's happening), regardless of whether the outer function contains statements after the nested non-blocking callback or not.
If this is confusing to you, you can write your callback code in a Continuation Passing Style so it's obvious that everything in the outer function is executed before the inner function, or you can use this nice async library to make your code look more imperative.
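For instance, here is a sketch of the same code using the async library's map helper (the require path 'async' and the surrounding structure mirror the code above):
var async = require('async');

client.on('test', function(req, fn) {
    redis.hkeys(req, function(err, replies) {
        var keys = replies
            .filter(function(reply) { return reply.indexOf('list.') > -1; })
            .map(function(reply) { return reply.substring(5); });

        // async.map runs redis.hgetall for every key and collects the
        // results, invoking the final callback once all of them are done.
        async.map(keys, function(key, done) {
            redis.hgetall(key, done);
        }, function(err, results) {
            console.log(results); // every hash is available here
        });
    });
});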
This, I think, answers your real question, rather than the one you've entered.