Queueing a function call request - javascript

I have a function like this:
function queuingFunction(name) {
    var deferred = $q.defer();
    $timeout(function () {
        console.log("my name is ", name);
        deferred.resolve(true);
    }, 3000);
    return deferred.promise;
}
I am calling the function like this:
communicatorBaseService.queuingFunction("Jack");
communicatorBaseService.queuingFunction("max");
communicatorBaseService.queuingFunction("Ray");
In the console, all three results are displayed after 3 seconds.
What I need is: at the 3rd second "Jack" shows in the console, after another 3 seconds "max" is shown, and after another 3 seconds "Ray" is shown.
If I call queuingFunction while another call is still executing, it should get added to an execution queue.
What I was thinking of doing was:
Receive the request to execute the function.
Add the params to a queue.
Call queuingFunction with the queue.
Run queuingFunction with queue[0].
On completion of queuingFunction, delete queue[0].
Check if anything is left at position queue[0]; if so, re-run the function with the current queue.
I'm pretty sure this is not the best way (a rough sketch of the idea is below); what would be a good way to do this? I am using Angular, which is why $q is in the code. I don't want to use jQuery.
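For illustration only, a rough, untested sketch of the explicit-queue steps above, reusing $timeout as the per-item async work:
var queue = [];      // pending names
var running = false; // is an item currently being processed?

function queuingFunction(name) {
    queue.push(name);
    processQueue();
}

function processQueue() {
    if (running || queue.length === 0) return;
    running = true;
    var name = queue[0];
    $timeout(function () {
        console.log("my name is ", name);
        queue.shift();   // remove the finished item
        running = false;
        processQueue();  // run the next queued item, if any
    }, 3000);
}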

It sounds from your comments like you want to be able to just make a function call and have its operation automatically sequenced after the ones that came before. If so, you will need some sort of queue object that can accumulate the operations that are underway and pending.
Promises already are a sort of queue, so if you start with a promise and add a new promise onto the end of the prior operation each time you make your function call, it can sequence them. Here's one way to do that:
// initialize queue with an already-resolved promise
var queue = $q.when();

function queuingFunction(name) {
    // chain this operation onto whatever operation was previously queued
    queue = queue.then(function () {
        var deferred = $q.defer();
        $timeout(function () {
            console.log("my name is ", name);
            deferred.resolve(true);
        }, 3000);
        return deferred.promise;
    });
}
Then, you can just call:
queuingFunction("Jack");
queuingFunction("max");
queuingFunction("Ray");
And, the three operations will be sequential.
Here's a working demo using ES6 standard promises: http://jsfiddle.net/jfriend00/adq3L6zt/
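For reference, a minimal sketch of the same pattern with ES6 promises (the linked fiddle may differ in details):
let queue = Promise.resolve(); // start with an already-resolved promise

function queuingFunction(name) {
    queue = queue.then(function () {
        return new Promise(function (resolve) {
            setTimeout(function () {
                console.log("my name is ", name);
                resolve(true);
            }, 3000);
        });
    });
    return queue;
}

queuingFunction("Jack");
queuingFunction("max");
queuingFunction("Ray"); // the three logs appear roughly 3 seconds apart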

Using @jfriend00's approach, I did the following with Angular's $q, in case anyone is interested:
var queue = $q(function (resolve, reject) { resolve(); });

service.queuingFunction = function (name) {
    // chain this asynchronous operation onto the end of the queue
    queue = queue.then(function () {
        var deferred = $q.defer();
        $timeout(function () {
            console.log("my name is ", name);
            deferred.resolve(true);
        }, 3000);
        return deferred.promise;
    });
};
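As a possible simplification (my own assumption, not part of the answer above): $timeout itself returns a promise, and $q.when() yields an already-resolved promise, so the same service method could be sketched like this:
var queue = $q.when(); // already-resolved starting point

service.queuingFunction = function (name) {
    queue = queue.then(function () {
        // $timeout returns a promise that resolves with the callback's return value
        return $timeout(function () {
            console.log("my name is ", name);
            return true;
        }, 3000);
    });
    return queue;
};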

Related

Force jQuery Deferred to wait until Ajax complete in "then" handler

I have a situation where I believe I need to create a Deferred object with a "then" handler, but wait until the "then" handler has completed its own promise before moving on.
The use case is a record object, and the function below is its save method. The record object has an attribute called saveQueue, which is set to $.Deferred() when the record is instantiated. The resolve call on saveQueue was supposed to make sure the Deferred there always executes every new handler attached to it as soon as it can. The idea is that you can call save several times on the record in short succession, but the calls will run one after another and not overlap.
I am using a Deferred to enqueue Ajax calls, so that one does not run until the previous call has finished. However, from the same method, I want to return a Deferred that can be resolved/rejected by the jQuery Ajax object, like so:
record.saveQueue = $.Deferred();

self.save = function (record) {
    var deferredAction = $.Deferred();
    deferredAction.then(function () {
        return $.post("/example_save_endpoint");
    });
    record.saveQueue.always(function () {
        deferredAction.resolve();
    }).resolve();
    return deferredAction;
};
However, when I use this code, the deferredAction promise always ends up resolved, presumably because the .then handler is returning a "pending" (and thus non-rejecting) promise. Is there any way to force the Deferred to wait until the Ajax promise has settled before resolving/rejecting? Or is there another, better way to thread this needle?
Your idea might work, but:
the queue must not be resolved with .resolve() every time the method is called; instead it should be initialised only once, with a resolved promise.
to actually queue on record.saveQueue, it needs to be changed (overwritten) on every method call so that it represents the end of the latest request.
And we don't need any deferreds for that, since we can work with the promises that $.post returns.
So use this:
var emptyQueue = $.when(undefined); // an already-fulfilled promise as the start
// equivalent: = $.Deferred().resolve().promise();

function startQueue() {
    return emptyQueue; // yes, this deliberately returns a constant; the beginning
                       // of the queue always looks the same (and is never mutated)
}

// every time you create a record, do
record.saveQueue = startQueue();

// and use that in your methods:
this.save = function (record) {
    var queuedRequestResult = record.saveQueue.then(function () {
        return $.post("/example_save_endpoint");
        //     ^^^^^^ promises chain :-)
    });
    // Magic happens here:
    record.saveQueue = queuedRequestResult     // we swap the previous queue promise for a new
                                               // one that resolves only after the request
        .then(startQueue, startQueue);         // and make sure it then starts with a fresh
                                               // queue, especially when the request failed
    // .then(null, startQueue) is similar, except it unnecessarily remembers the last result
    return queuedRequestResult;
};
I would probably choose not to do it this way, but a deferred/promise can indeed be used as a queuing device.
You need a slight(?) variation of what you already tried.
self.queue = $.when(); // A resolved promise, used to form a queue of functions in a .then() chain.

self.save = function (data) {
    var dfrd = $.Deferred(); // A Deferred dedicated to this particular save.
    self.queue = self.queue.then(function () {
        return $.post("/example_save_endpoint", data) // Make the AJAX call, and return a jqXHR so the downstream queue waits for it to resolve/reject.
            .then(dfrd.resolve, dfrd.reject) // Resolve/reject the Deferred for the caller's benefit.
            .then(null, function () {
                // Convert failure to success so the queue is not killed by an AJAX failure.
                return $.when(); // Return a resolved promise, for the queue's benefit.
            });
    });
    return dfrd.promise(); // Allow the caller to do something when the AJAX eventually responds.
};
For an explanation, see the comments in the code.
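For illustration, a hedged usage sketch (recordService and the payloads are hypothetical): two saves issued back to back still hit the server one at a time:
// Hypothetical usage: the second POST is not sent until the first one settles.
recordService.save({ id: 1, title: "first" }).done(function (resp) {
    console.log("first save responded", resp);
});
recordService.save({ id: 1, title: "second" }).done(function (resp) {
    console.log("second save responded", resp);
});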

Creating jQuery Deferred objects using setTimeout

I am having a hard time understanding the value of creating your own Deferred object.
Say you have the following jQuery:
function doStuffFirst() {
    var dfd = $.Deferred();
    // do some stuff here
    setTimeout(function () {
        dfd.resolve();
    }, 1);
    return dfd.promise();
}

function doStuffAfter() {
    // do some other stuff
}
$.when(doStuffFirst()).done(doStuffAfter);
I don't actually know that doStuffFirst() has finished; it's just waiting some time before firing doStuffAfter().
Why is that any better than just doing:
function doStuffFirst() {
    // do some stuff here
}

setTimeout(function () {
    // do some other stuff
}, 1);
You do know that it is finished; but it is useless as it stands, since you're not executing your task asynchronously. The task executes, then the deferred gets created and fired almost immediately. However, if you change to this:
function doStuffFirst() {
    var dfd = $.Deferred();
    setTimeout(function () {
        // do some stuff HERE
        dfd.resolve();
    }, 1);
    return dfd.promise();
}
then it becomes useful, since it returns immediately but resolves some time later (whenever the task is done). With just one async task, using a deferred is not much different from using a plain callback (just more complex, arguably prettier, and the dependency goes the other way: callbacks go in, while promises come out of the routine that schedules the task). However, if you have more complex requirements, such as two async tasks that you want to run simultaneously but wait until both are done, promises are much superior.
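For example, here is a hedged sketch of that "two parallel tasks" case (taskA/taskB are made-up placeholders):
// Two hypothetical async tasks that each return a promise.
function taskA() {
    var dfd = $.Deferred();
    setTimeout(function () { dfd.resolve("A done"); }, 500);
    return dfd.promise();
}

function taskB() {
    var dfd = $.Deferred();
    setTimeout(function () { dfd.resolve("B done"); }, 900);
    return dfd.promise();
}

// Runs only after BOTH tasks have resolved.
$.when(taskA(), taskB()).done(function (resultA, resultB) {
    console.log(resultA, resultB); // "A done" "B done"
});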
Using setTimeout simply executes the code after the given time, whereas a promise executes code once the promised task has completed. Promises are used when dealing with things whose completion time is unknown, such as Ajax requests. With setTimeout, you know exactly when to execute some code.

AJAX asynch callback working correctly but how do I wait for the returned values

My main ExtJS function calls three different async AJAX functions. The good news is that each AJAX function successfully returns a boolean via its callback, indicating whether the AJAX GET call succeeded or not.
The problem: I need to perform additional business logic in my main calling function once the boolean results have been returned from all three AJAX functions.
But as the calls are async, the code will not wait for them to complete and will fly straight through.
Any ideas on how I can get my code to GRACEFULLY wait until each status is no longer 'undefined'?
I am not near my code right now, but here is a rough approximation.
function main() {
    var status1, status2, status3, result1, result2, result3;
    thisIsAsynchFunction1(args, function (result1) {
        status1 = result1;
    });
    thisIsAsynchFunction2(args, function (result1) {
        status2 = result1;
    });
    thisIsAsynchFunction1(args, function (result1) {
        status3 = result1;
    });
    // Perform logic based on callback from functions above
}
Thank You.
You cannot make Javascript wait. It simply doesn't work that way. What you can do is to trigger your final action when you discover that all three async results are done. Without redesigning your code to use promises (which would be the modern way to do this), you could do something like this:
function main(callbackFn) {
    var status1, status2, status3, doneCnt = 0;
    thisIsAsynchFunction1(args, function (result1) {
        status1 = result1;
        ++doneCnt;
        checkDone();
    });
    thisIsAsynchFunction2(args, function (result1) {
        status2 = result1;
        ++doneCnt;
        checkDone();
    });
    thisIsAsynchFunction1(args, function (result1) {
        status3 = result1;
        ++doneCnt;
        checkDone();
    });

    // Perform logic based on the callbacks from the functions above
    function checkDone() {
        // if all three results are done, then call the callback
        if (doneCnt === 3) {
            callbackFn(status1, status2, status3);
        }
    }
}

// call main and pass it a callback function
main(function (s1, s2, s3) {
    // all three async results are available here
});

// Be careful: code here will execute BEFORE the async results are done.
// Any code that must wait for the async results must go in the callback above.
A newer approach would be to have each async operation return a promise, and when each async operation completes it would resolve its promise with its result. Then you could use a promise library function (often called .when()) to trigger a callback when all three promises are done. Many modern JS libraries already return promises from Ajax calls (such as jQuery), so the promise is already there.
I don't know ExtJS myself, but I looked in its documentation and didn't see promise support yet, and I found one thread discussing the fact that promises were not going to be in version 5. So perhaps you just go with the counter design above, or maybe ExtJS has some of its own support for monitoring when multiple async operations are all done.
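For illustration, a hedged sketch of that promise-based approach, using jQuery's $.when purely as an example promise library (the async function names and callback signatures are assumed from the question, which reuses thisIsAsynchFunction1 for the third call; a distinct third name is assumed here):
// Wrap each callback-style async function in a promise.
function asPromise(asyncFn, args) {
    var dfd = $.Deferred();
    asyncFn(args, function (result) {
        dfd.resolve(result);
    });
    return dfd.promise();
}

$.when(
    asPromise(thisIsAsynchFunction1, args),
    asPromise(thisIsAsynchFunction2, args),
    asPromise(thisIsAsynchFunction3, args)
).done(function (s1, s2, s3) {
    // all three async results are available here
    console.log(s1, s2, s3);
});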

jQuery's when.apply and wrapped async functions

I have code that requires waiting on a variable number of server responses, along with loading several JavaScript modules via RequireJS.
I want to leverage jQuery's when.apply. Unfortunately, the .then() portion of my code always executes before ANY of my callbacks do.
Here is a summary of my code:
// load module func
function loadRequireJsModule(arg, callback) {
    require([arg], callback);
}

// get layer info from server func
function getLayerInfoFromServer(arg, callback) {
    var layerInfoListener = new DataLayer([arg]);
    layerInfoListener.addRetrievedCallback(callback);
    layerInfoListener.retrieveLayerInfo();
}

tasks.push($.Deferred(function () {
    loadRequireJsModule("./" + layerJSON.type, function (module) {
        layer.controller = module;
    });
}));

for (i = 0; i < layerJSON.views.length; i++) {
    layer.views[i] = {};

    // get renderer class from require.js
    tasks.push($.Deferred(function () {
        loadRequireJsModule("./impl/" + layerJSON.views[i].renderer, function (index) {
            return function (module) {
                layer.views[index].renderer = new module();
            }
        }(i));
    }));

    // POST request for layerInfo
    tasks.push($.Deferred(function () {
        getLayerInfoFromServer({
            request: "configure",
            layer: layerJSON.layer,
            configuration: layerJSON.views[i]
        }, function (index) {
            return function (dataLayer, layerInfo) {
                layer.views[index].dataService = new TileService(layerInfo);
            }
        }(i));
    }));
}

$.when.apply($, tasks).then(function () {
    ...
});
I'm worried that because my async functions are wrapped, jQuery simply waits until those wrapper functions return, which is immediate. I cannot pass jQuery the naked async functions; they have to be wrapped. What do I have to do for my functions to be treated as deferred objects?
Edit - now that you've disclosed the actual code:
You are not using $.Deferred() correctly.
A typical usage in your context would look like this:
var d;
var tasks = [];
for (var i = 0; i < len; i++) {
    // create new deferred object
    d = $.Deferred();
    // put deferred into tasks array
    tasks.push(d);
    loadRequireJsModule("./impl/" + layerJSON.views[i].renderer, function (index, deferred) {
        return function (module) {
            layer.views[index].renderer = new module();
            // now that this operation is done, resolve our deferred
            deferred.resolve();
        }
    }(i, d));
}

$.when.apply($, tasks).done(function () {
    // code executes here when all the deferreds
    // have had `.resolve()` called on them
});
Then, when all the deferred objects that were put into the tasks array have been resolved (by you calling .resolve()), $.when() will fire.
You do not generally pass a callback to $.Deferred() like you were doing (that's used for something else and only used in special circumstances where you want to modify the deferred object in certain ways).
You MUST resolve or reject each deferred yourself by calling one of the methods on the deferred object that resolves or rejects it (there are four different methods that can be used).
Also, note that this structure runs ALL the async tasks in parallel (e.g. it fires them all at the beginning and then notifies you at the end when all of them have finished). You would need a different structure if you want them to run sequentially where you don't start the second one until the first has finished and so on.
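For reference, a hedged sketch of that sequential variant (jQuery 1.8+ .then chaining assumed; names come from the question's code), where each module load starts only after the previous one has resolved:
var sequence = $.Deferred().resolve().promise(); // start with a resolved promise
for (var i = 0; i < layerJSON.views.length; i++) {
    sequence = sequence.then(function (index) {
        return function () {
            var d = $.Deferred();
            loadRequireJsModule("./impl/" + layerJSON.views[index].renderer, function (module) {
                layer.views[index].renderer = new module();
                d.resolve(); // this step is done, let the next one start
            });
            return d.promise();
        };
    }(i));
}
sequence.done(function () {
    // all renderer modules have loaded, one after another
});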
If you want async behavior, then $.when() expects its arguments to be jQuery deferred or promise objects. So, for the structure of your code to work, the return from myAsyncWrapper() must be a jQuery deferred or promise object. That way, $.when() can monitor the promise objects for completion and when they are all complete, call your .then() handler.
Then, your async operations must either resolve or reject every one of the deferred objects that you passed to $.when() because it will only call its own .done() when ALL the deferred objects you passed to it have been resolved. Or, it will call its own .fail() if any of the deferred objects you passed to it is rejected.
If you are sure all the arguments you are passing to $.when() are deferreds or promises and your .then() handler is still getting called immediately, then your deferreds must already be resolved somehow because that's the likely explanation for why your .then() handler is called immediately.
If none of the arguments passed to $.when() are deferred or promise objects, then it will resolve itself immediately and thus call your .then() handler immediately which sounds like the behavior you are experiencing.
If your async operations are ajax calls, then jQuery.ajax() already returns a promise object that will be resolved when the ajax call completes. You can directly add the return result from jQuery.ajax() to your tasks array and pass that to $.when() and it will support your asynchronous behavior.
Using jQuery.ajax(), here's the concept for using $.when() with multiple concurrent ajax calls:
var promises = [];
for (var i = 0; i < myNum; i++) {
    promises.push($.ajax(...));
}

$.when.apply($, promises).done(function () {
    // put code here that runs when all the ajax calls have completed
    // the results of all the ajax calls are in
    // arguments[0], arguments[1], etc...
}).progress(function () {
    // .progress() is purely optional; it registers a handler that runs whenever
    // one of the underlying deferreds issues a progress notification
    // (plain $.ajax() calls do not notify progress on their own)
}).fail(function () {
    // this will get called immediately if any of the async operations failed
    // the arguments are the same as in the .done() function, except that some may be empty
    // if they didn't complete before one of them failed
});
Obviously, you don't have to use all three of .done(), .progress() and .fail(), or you can specify all of them in .then() - I just included all three with comments for educational purposes.
If you want help with some other type of async operation that doesn't already create its own promise object, then I'd be happy to help if you show and describe the actual operation. Deferreds are confusing to get a handle on initially, but like many things, they are actually quite simple and extremely useful once it finally sinks in.
There is an alternative I found for processing a loop of parallel promises without using an array: instead, use the pattern promise = $.when(promise, anotherPromise).
E.g. for your example, something like this:
// Start with an empty resolved promise - undefined does the same thing!
var promise;

for (i = 0; i < layerJSON.views.length; i++) {
    layer.views[i] = {};

    // get renderer class from require.js
    promise = $.when(promise, loadRequireJsModule("./impl/" + layerJSON.views[i].renderer, function (index) {
        return function (module) {
            layer.views[index].renderer = new module();
        };
    }(i)));
}

// Now do something after all the promises are resolved
promise.then(function () {
    // Do the next part
});
This example assumes you change loadRequireJsModule (and likewise getLayerInfoFromServer) to return a promise.
Notes:
The downside of this technique is that the return values of the individual promises end up nested within each other, in reverse order, so they are not very useful. It is great if you just need to know when all the tasks have completed.
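To make the nesting concrete, a small hedged illustration (using already-resolved deferreds as stand-ins for real async work):
// Illustration of how results nest with the promise = $.when(promise, next) pattern.
var promise; // undefined acts like an already-resolved value to $.when
promise = $.when(promise, $.Deferred().resolve("first").promise());
promise = $.when(promise, $.Deferred().resolve("second").promise());

promise.done(function (previous, latest) {
    console.log(previous); // [undefined, "first"]  (the earlier results, nested)
    console.log(latest);   // "second"              (only the latest is flat)
});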

Late binding for jQuery.when

I am working with a framework that uses jQuery Deferred Objects.
In my code, one async operation must execute successfully before another async operation can be executed.
function doOperation() {
    var def2mock = $.Deferred();
    var def1 = doSomeAsyncOperation().done(function () {
        var def2 = doAnotherAsyncOperation();
    });
    return $.when(def1, def2mock);
}
So I put the call for the second operation in the "done" handler of the first operation's deferred, as in the example.
Now I wrap this sequence in the function doOperation. The result of doOperation should be the join of the two deferreds of both async calls, meaning doOperation succeeds if all async operations succeed and fails if any of them fail, which is exactly what $.when does.
The problem is that in order to create the join with $.when, I need both deferreds present at the time $.when is called. And because the deferred of the second async operation is not available at that moment, I had to find a way to create the join first, then add the deferred of the second async operation to the join later.
To do this, I thought I could define a new mock $.Deferred named def2mock in doOperation. And when the real def2 is ready, I could somehow link the real deferred to the mock one, i.e. make the real one synchronize its state with the mock one. The only way I found was to do
def2.done(def2mock.resolve)
// and
def2.fail(def2mock.reject)
but I don't feel this manual linking is the right way to do it.
So please tell me if you have a better suggestion on how to do it the right way.
And when the real def2 is ready, I can somehow link the real deferred to the mock one, i.e. make the real one synchronize its state with mock one.
This is actually what .then does: It returns a new promise which gets resolved/rejected when the promise returned from the passed callback is resolved/rejected.
In order to make this work with $.when, you have to keep a reference to the original promise:
function doOperation() {
    var def1 = doSomeAsyncOperation();
    var def2 = def1.then(function () {
        return doAnotherAsyncOperation();
    });
    return $.when(def1, def2);
}
DEMO
def2 will be resolved when the promise returned by the callback is resolved, i.e. the one that doAnotherAsyncOperation returns.
But I don't think using $.when is a good choice here, from a conceptual point of view. You are actually executing two asynchronous functions sequentially, but $.when is meant for running asynchronous functions in parallel.
I would just chain the function calls via .then and collect the responses:
function doOperation() {
    return doSomeAsyncOperation().then(function (resp1) {
        return doAnotherAsyncOperation().then(function (resp2) {
            return [resp1, resp2];
        });
    });
}
DEMO
I'd say it's clearer here that doAnotherAsyncOperation is executed after doSomeAsyncOperation and the result of both calls is available to the caller of doOperation.
You might have to do something similar for the fail case.
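For example, a hedged sketch of one way to handle the fail case, tagging which step broke (assumes jQuery 1.8+ .then semantics, where a handler can return a rejected deferred to keep the chain rejected):
function doOperation() {
    return doSomeAsyncOperation().then(function (resp1) {
        return doAnotherAsyncOperation().then(function (resp2) {
            return [resp1, resp2];
        }, function (err) {
            return $.Deferred().reject({ step: "second", error: err });
        });
    }, function (err) {
        return $.Deferred().reject({ step: "first", error: err });
    });
}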
I don't have any experience with Deferred (nor do I fully understand what it does), but if you are just looking to chain operations, would something like this work?
function doOperation(operations) {
    var wrapper = {};
    wrapper.index = 0;
    wrapper.operations = operations;
    wrapper.onthen = $.proxy(function () {
        if (this.index < this.operations.length) {
            this.operations[this.index]().then(this.onthen);
        }
        this.index++;
    }, wrapper);
    wrapper.onthen();
}
Here's a demonstration: http://jsfiddle.net/vz6D7/2/
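For illustration, a hedged usage sketch (op1/op2 and the endpoints are placeholders; each operation must return a promise):
function op1() { return $.post("/some-endpoint"); }
function op2() { return $.post("/another-endpoint"); }

doOperation([op1, op2]); // op2 only starts after op1's promise resolves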
