I am trying to implement some flow control for async ajax calls. My project basically receives an array of values, makes an ajax call for each of them, processes the results, then runs a function when everything is complete.
I have a scoping issue: when the flow control function is initiated a second time with a different array of work, its job reference value is overwritten. I have rewritten just the part I am having trouble with, eliminating all the other functions, and posted it on jsfiddle.
http://jsfiddle.net/qanx9/
When the code is initiated, the jobRunner function console logs the key of the globalObj entry where the work is found. On the third run of the function, job2 is lost and everything now refers to job1. I have tried putting everything inside this function in an anonymous function, but that made no difference.
This is my first attempt at flow control, and my project is actually in Node.js. There are libraries that do this for me, but I am trying to get a good understanding of how this works.
// global used to track async flow control and store completed work
globalObj = {
    job1: {
        work: [0,1,2,3,4,5,6,7,8,10],
        results: []
    },
    job2: {
        work: [11,12,13,14,15,16,17,18,19,20],
        results: []
    },
    async: {
        limit: 5,
        running: 0
    }
};

// flow control
function jobRunner(job) {
    console.log(job);
    g = globalObj.async;
    j = globalObj[job];
    while (g.running < g.limit && j.work.length > 0) {
        var task = j.work.shift();
        fakeAsyncAjax(task, function(result) {
            j.results.push(result);
            g.running--;
            if (j.work.length > 0) {
                jobRunner(job);
            } else if (g.running == 0) {
                jobDone(job);
            }
        });
        g.running++;
    }
}

function jobDone(job) {
    console.log(job + ' complete..');
}

// instead of doing real ajax calls, here I've done a basic simulation with a random delay, using setTimeout to make it async.
function fakeAsyncAjax(ref, completeFunc) {
    setTimeout(function() {
        completeFunc(ref);
    }, Math.floor((Math.random() * 500) + 1));
}

// initiate jobs
jobRunner('job1');
jobRunner('job2');
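For what it's worth, the symptom described above is exactly what happens when g and j are assigned without var: they become implicit globals shared by every call to jobRunner, so the second call overwrites the first call's job reference. A minimal sketch of the likely fix (my own, not from the original post) is to declare them locally, so each job's callbacks close over their own references:

// flow control, with g and j declared locally so each invocation
// (and the callbacks it creates) keeps its own job reference
function jobRunner(job) {
    var g = globalObj.async;   // shared concurrency counters, still global state
    var j = globalObj[job];    // this job's own work queue and results
    while (g.running < g.limit && j.work.length > 0) {
        var task = j.work.shift();
        fakeAsyncAjax(task, function(result) {
            j.results.push(result);
            g.running--;
            if (j.work.length > 0) {
                jobRunner(job);
            } else if (g.running == 0) {
                jobDone(job);
            }
        });
        g.running++;
    }
}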
I'm attempting to run an automated job with CasperJS which brings in a JSON file and checks if certain elements on a given page declared in each JSON object exist. I am attempting to write this recursively, but I am running into an issue where the beginning of the asynchronous recursive flow (i.e. my only synchronous function) is returning synchronously (as expected) and causing the script to fail.
According to this CasperJS GitHub issue, Promises are not currently supported in PhantomJS. I have also tried the .waitFor() CasperJS method, but this causes an infinite loop if I try to bubble up a return value of true from the final asynchronous method (there is also the issue of waitFor being controlled by a timeout value rather than waiting on a response, but the timeout value can be changed). Here is a code example attempting to use waitFor, simplified slightly:
casper.start(urlStart, function() {
    recursiveRun(jsonInput.length);
});

var recursiveRun = function(count) {
    if (count) {
        casper.waitFor(function check() {
            return controller(); // Infinite loop here
        }, function then() {
            jsonInput.shift();
            recursiveRun(count - 1);
        });
    }
};

var controller = function() {
    casper.thenOpen(jsonInput[0].url, function() {
        return renavigation();
    });
};

var renavigation = function() {
    casper.waitForSelector("#nav", function() {
        this.thenOpen(inventoryUrl, function() {
            return inventoryEvaluation();
        });
    });
};

var inventoryEvaluation = function() {
    casper.waitForSelector("#status", function() {
        this.evaluate(function() {
            // Unimportant in-browser function
        });
        return true;
    });
};

casper.run();
Why is this not working? What is the proper way to recursively and asynchronously perform these actions, if at all?
The CasperJS method .eachThen() seems to perform this functionality as intended.
From the example above, the synchronous function which recursively calls itself can be entirely replaced by having CasperJS iterate over the jsonInput variable in the callback of the .start() method.
casper.start(urlStart, function() {
    this.eachThen(jsonInput, function() {
        controller();
        jsonInput.shift();
    });
});
I have written a basic job runner in JavaScript (utilizing some jQuery too, but that's another story for another day) and I came across this odd issue:
The method I run to wait for all jobs to complete:
$.getAllProducts = function(callback) {
    $.getProductDetails('|ALL|', function(allProductsResult) { //intentionally
        var objAllProducts = JSON.parse(JSON.parse(allProductsResult));
        var objProductsBuiltUp = {};
        var productLength = objAllProducts.length;
        $.totalJobs(productLength);
        var processed = 0;
        $.completedJobs(processed);
        $.each(objAllProducts, function(i, v) {
            $.getProductDetails(objAllProducts[i].ProductCode, function(result) {
                $.mergeShallow(objProductsBuiltUp, JSON.parse(JSON.parse(result)));
                processed++;
                $.completedJobs(processed);
            });
        });
        $.wait(0, false, function(isDone) { //allow at least 50ms wait time, otherwise this confuses javascript into thinking there are no callbacks
            if (isDone) {
                callback(objProductsBuiltUp.ProductComponents);
            }
        });
    });
}
The handlers for the jobs:
$.checkProgress = function() {
    return $.jobs === $.completed;
}

$.totalJobs = function(total) {
    $.jobs = total;
}

$.completedJobs = function(completed) {
    $.completed = completed;
}

$.wait = function(timeout, debug, callback) {
    setTimeout(function() {
        if (debug) {
            console.log($.completed + " / " + $.jobs + " = " + ($.completed / $.jobs * 100) + "%");
        }
        if ($.checkProgress() == false) {
            $.wait(timeout, debug);
        }
        callback($.checkProgress()); // <-- complaining one
    }, timeout);
}
This is the key code for my little job runner. Other methods call $.totalJobs() to set the number of jobs that need to be performed (normally based on the number of different calls that need to be made to an API in my scenario), and $.completedJobs(), which is called when the payloads are returned in the API handler's callbacks.
The issue is, when I set my "waiter" to 50ms, I don't get any errors whatsoever, and the method performs as expected.
When I set it to low values like 5ms, 1ms, or 0ms, it tells me:
"xxxxx.helpers.js:48 Uncaught TypeError: callback is not a function"
Does anyone have a theory as to why this would occur? It is, after all, only a glorified setTimeout.
(P.S. As for why I use jQuery global methods and variables to store info: it makes using Meteor easier on myself, knowing it's all loaded in one place; that's the platform I am developing on at the moment.)
EDIT: It seemed better to add the whole method where the callback is run.
It looks like you're not passing a callback here:
$.wait = function(timeout, debug, callback) {
    setTimeout(function() {
        // ...
        if ($.checkProgress() == false) {
            $.wait(timeout, debug); // === $.wait(timeout, debug, undefined);
        }
        callback($.checkProgress()); // <-- complaining one
    }, timeout);
}
So if $.checkProgress() is false, you're calling $.wait recursively, only now callback is undefined...
At first glance, I think what you wanted to write there was:
$.wait(timeout, debug, callback); // pass callback argument to inner call
But then obviously, you wouldn't want to invoke the callback multiple times:
$.wait = function(timeout, debug, callback) {
    setTimeout(function() {
        // ...
        if ($.checkProgress() == false) {
            $.wait(timeout, debug, callback);
        } else {
            callback($.checkProgress());
        }
    }, timeout);
}
The reason the line you marked as "the complaining one" is in fact complaining is the recursive call: $.checkProgress() evaluates to false, $.wait is invoked again (this time with callback undefined), and that continues until $.checkProgress() === false evaluates to false. Then callback (which is undefined) gets invoked in the inner call.
This issue started appearing when the interval was reduced. That makes sense, because you only recursively call $.wait if the jobs haven't completed yet. The higher the timeout/interval, the greater the chance the jobs were completed the first time around.
By reducing the interval, you arrived at a point where $.wait got invoked before the jobs had finished, and you entered the $.checkProgress() === false branch, calling $.wait without passing the callback (essentially losing the reference to it).
By the time the jobs had completed, you were trying to invoke callback, which was undefined.
In
if ($.checkProgress() == false) {
    $.wait(timeout, debug);
}
you're not passing through the callback parameter, so in the "recursive" call it will be undefined and you're getting the exception you posted. Make it
if ($.checkProgress() == false) {
    $.wait(timeout, debug, callback);
    //                     ^^^^^^^^
}
Although I usually enjoy the callback-soup that is Node.js, I found that a certain part of my code needs to be run in a blocking manner because of an SQLite issue. (Yes, I know I could try to address the SQLite part; it actually makes more sense to ensure blocking.)
I like using the async module, and though I have a feeling that module has a feature which can be used here, I can't seem to find it. Or, maybe there is a better module out there. Anyway, without further ado:
func = function(callback) {
    let i = 0;
    arr.forEach(val => {
        if (val.trim().length > 0) {
            console.log(`Starting for user ${val}.`);
            let mylc = new lcapp(val);

            ////// TODO this needs to be made synchronous. /////
            async.series({
                getMyDetails: callback => getMyDetails(mylc, callback)
            }, (err, results) => handleResults(err, results, mylc, callback));
            /////////////
        }
    });
};
I would like the section of code surrounded by //// to block until the handleResults function returns. I understand it will require reprogramming the callback in handleResults, or maybe I need to write a parent function around func, but I'd like to see if Stack Overflow people have some good ideas.
You could turn it into a function that recursively calls itself when the handleResults callback is hit.
You can do this by following the example below.
fun()

function fun() {
    console.time("fun")
    var arr = [1, 2, 3, 4, 5]
    var i = arr.length - 1;

    doStuff(doStuffCallback)

    function doStuffCallback() {
        if (i > 0) {
            i--
            doStuff(doStuffCallback)
        } else {
            console.timeEnd("fun")
        }
    }

    function doStuff(callback) {
        setTimeout(function() {
            logIt()
            callback()
        }, 25)
    }

    function logIt() {
        console.log(arr[i])
    }
}

// Output:
// 5
// 4
// 3
// 2
// 1
// fun: about 160ms
PS: I'm assuming you only need to be synchronous within this method and the loop therein. Other code might still be running elsewhere in your application while this runs.
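Since the question mentions the async module: one feature that fits here is async.eachSeries, which starts each iteration only after the previous iteration has signalled completion. Below is a rough sketch along those lines (it reuses arr, lcapp, getMyDetails and handleResults from the question, and assumes handleResults can be handed a completion callback to call when it finishes, which is a change to its current signature):

// a sketch, not the original code: process users strictly one at a time
const async = require('async');

func = function(callback) {
    // keep only non-blank values, mirroring the original trim/length check
    const users = arr.filter(val => val.trim().length > 0);

    async.eachSeries(users, (val, done) => {
        console.log(`Starting for user ${val}.`);
        const mylc = new lcapp(val);

        async.series({
            getMyDetails: cb => getMyDetails(mylc, cb)
        }, (err, results) => {
            // assumption: handleResults accepts a completion callback as its
            // last argument; calling done(err) there is what serialises the loop
            handleResults(err, results, mylc, () => done(err));
        });
    }, err => callback(err)); // runs once every user has been handled (or on the first error)
};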
Yes, I know I could try and address the SQLite part, it actually makes more sense to ensure blocking.
No it doesn't, because you can't. You need to resolve whatever issue you have with it being async because there is no way to turn asynchronous code into synchronous code.
Hi, I am using jQuery and Ember to delete certain elements. I want to use Deferred objects so that the next statements only execute once the deletions have finished.
Here killMany is a function; once it is called, it executes the array.forEach(tryKill); statement. The array contains 100 elements, and inside the forEach a callback is invoked for each element to delete it from the server.
I want my myFinalblock callback to be called only after the deletion of all the elements has completely finished.
Please guide me.
killMany: function(c) {
    var t = this
        , wait = []
        , dfd = new $.Deferred();

    function keep(tile) {
        tile.setProperties({ isSelected: false, isHidden: false });
    }

    function destroy(tile) {
        if (t.get('reports')) {
            t.get('reports').removeObject(tile.entity);
        }
        tile.remove.bind(tile);
    }

    function tryKill(tile) {
        tile.set('isHidden', true);
        tile.get('entity').kill()
            .then(destroy.bind(null, tile),
                  keep.bind(null, tile));
    }

    function myFinalblock() {
        this.set('selectedTiles', []);
    }

    this.set('promptDestroyMany', false);
    if (c.response) {
        var array = this.get('selectedTiles');
        array.forEach(tryKill);
        myFinalblock();
    }
},
You seem to be missing the point of promises a bit. They do not "stop" your code. They allow you to cleanly route your code's asynchronous functionality. Therefore you need a way to wait until all of the tryKill calls are done before calling myFinalblock. To do this, you first need to modify your tryKill function to return its promise:
function tryKill(tile) {
    tile.set('isHidden', true);
    return tile.get('entity')
        .kill()
        .then(destroy.bind(null, tile),
              keep.bind(null, tile));
}
Then you can do this:
var tiles = this.get('selectedTiles');
$.when.apply($, tiles.map(tryKill))
    .then(myFinalblock)
    .done();
On a side note, I suggest looking into a proper promise library and not using jQuery's built-in deferreds, as they are known to have a number of problems.
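To illustrate that side note, here is a rough sketch (mine, not part of the original answer) of the same aggregation using native Promises instead of $.when, assuming kill() returns a thenable that Promise.all can assimilate:

// assumes tryKill returns its promise, as modified above
var tiles = this.get('selectedTiles');

Promise.all(tiles.map(tryKill))
    .then(myFinalblock.bind(this))   // keep `this` pointing at the component for this.set(...)
    .catch(function(err) {
        console.error('killMany failed', err);
    });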
If I have a function that's passed this function:
function(work) {
    work(10);
    work(20);
    work(30);
}
(There can be any number of work calls with any number in them.)
work performs some asynchronous activity; say, for this example, it is just a timeout. I have full control over what work does on the completion of this operation (and, in fact, over its definition in general).
What's the best way of determining when all the calls to work are done?
My current method increments a counter when work is called and decrements it when it completes, and fires the "all work done" event when the counter is 0 (this is checked after every decrement). However, I worry that this could be a race condition of some sort. If that is not the case, do show me why, and that would be a great answer.
There are a ton of ways you can write this program, but your simple technique of using a counter will work just fine.
The important thing to remember, and the reason this will work, is that JavaScript executes in a single thread. This is true of all browsers and Node.js, AFAIK.
Based on the thoughtful comments below, the solution works because the JS event loop will execute the functions in an order like:
function(work)
    work(10)
        counter++
        Start async function
    work(20)
        counter++
        Start async function
    work(30)
        counter++
        Start async function
-- back out to event loop --
Async function completes
    counter--
-- back out to event loop --
Async function completes
    counter--
-- back out to event loop --
Async function completes
    counter--
    Counter is 0, so you fire your work done message
-- back out to event loop --
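For concreteness, here is a minimal sketch of the counter technique that answer describes (the names runJobs, doAsyncThing and the "all work done" callback are illustrative, and it assumes the wrapped operation is genuinely asynchronous):

function runJobs(jobSpec, allWorkDone) {
    var pending = 0;

    function work(value) {
        pending++;
        // doAsyncThing stands in for the real async operation (ajax, setTimeout, ...)
        doAsyncThing(value, function() {
            pending--;
            if (pending === 0) {
                allWorkDone();   // fires only after the last completion
            }
        });
    }

    jobSpec(work);   // all work() calls run synchronously before any completion can fire
}

// usage with the function from the question:
runJobs(function(work) {
    work(10);
    work(20);
    work(30);
}, function() {
    console.log('all work done');
});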
There's no race condition. There is the added requirement that every request made performs a decrement when it's finished (always! including on http failure, which is easy to forget). But that can be handled in a more encapsulated way by wrapping your calls.
Untested, but this is the gist (I've implemented an object instead of a counter, so theoretically you can extend this to have more granular queries about specific requests):
var ajaxWrapper = (function() {
    var id = 0, calls = {};
    return {
        makeRequest: function() {
            $.post.apply($, arguments); // for example
            calls[id] = true;
            return id++;
        },
        finishRequest: function(id) {
            delete calls[id];
        },
        isAllDone: function() {
            var prop;
            for (prop in calls) {
                if (calls.hasOwnProperty(prop)) { return false; }
            }
            return true;
        }
    };
})();
Usage:
Instead of $.post("url", ... function(){ /*success*/ } ... ); we'll do
var requestId;
requestId = ajaxWrapper.makeRequest("url", ...
    function() { /*success*/ ajaxWrapper.finishRequest(requestId); } ... );
If you wanted to be even more sophisticated you could add the calls to finishRequest yourself inside the wrapper, so usage would be almost entirely transparent:
ajaxWrapper.makeRequest("url", ... function(){ /*success*/ } ... );
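A rough sketch of that more transparent variant (mine; it assumes the caller's success callback is the last function argument passed in) would replace makeRequest in the wrapper above with something like:

makeRequest: function() {
    var args = Array.prototype.slice.call(arguments);
    var requestId = id++;
    calls[requestId] = true;

    function markDone() {
        delete calls[requestId];
    }

    // assumption: the caller's success callback is the last function argument
    for (var i = args.length - 1; i >= 0; i--) {
        if (typeof args[i] === 'function') {
            var original = args[i];
            args[i] = function() {
                markDone();                               // bookkeeping happens automatically
                return original.apply(this, arguments);   // then the caller's handler runs
            };
            break;
        }
    }

    // .fail covers the "always decrement, even on http failure" point made above
    $.post.apply($, args).fail(markDone);
    return requestId;
},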
I have an after utility function.
var after = function _after(count, f) {
    var c = 0, results = [];
    return function _callback() {
        switch (arguments.length) {
            case 0: results.push(null); break;
            case 1: results.push(arguments[0]); break;
            default: results.push(Array.prototype.slice.call(arguments)); break;
        }
        if (++c === count) {
            f.apply(this, results);
        }
    };
};
The code below would just work, because JavaScript is single-threaded.
function doWork(work) {
    work(10);
    work(20);
    work(30);
}

WorkHandler(doWork);

function WorkHandler(cb) {
    var counter = 0,
        finish;

    cb(function _work(item) {
        counter++;
        // somethingAsync calls `finish` when it's finished
        somethingAsync(item, function _cb() {
            finish();
        });
    });

    finish = after(counter, function() {
        console.log('work finished');
    });
};
I guess I should explain.
We pass the function that does work to the workhandler.
The work handler calls it and passes in work.
The function that does work calls work multiple times, incrementing the counter.
Since the function that does work is not asynchronous (very important), we can define the finish function after it has finished.
The asynchronous work that is being done cannot finish (and call the undefined finish function) before the current synchronous block of work (the execution of the entire work handler) has finished.
This means that after the entire work handler has finished (and the variable finish is set), the asynchronous work jobs will start to end and call finish. Only once all of them have called finish will the callback passed to after fire.