Array of Callback Functions - javascript

What I'm doing atm:
function1(function () {
    function2(function () {
        function3(function () {
            function4();
        });
    });
});
Is there an easier way to do it?
miracleFunction([function1, function2, function3, function4]);
miracleFunction = function (array) {
    ???
};

Using the async package on npm, you can use an array like that, e.g.:
var async = require('async');
async.series([function1, function2, function3, function4]);
In addition to simply running several asynchronous functions in series, it also has helpers for running asynchronous operations in parallel, mapping an array with an asynchronous function, and various other useful combinators.
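For example, here is a minimal sketch of running two tasks in parallel with async.parallel; the task bodies are just placeholders standing in for real asynchronous work:
var async = require('async');

async.parallel([
    function (callback) { callback(null, 'first result'); },  // stand-in for real async work
    function (callback) { callback(null, 'second result'); }
], function (err, results) {
    // results is ['first result', 'second result'] once both tasks have called back
    console.log(err, results);
});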

Take a look at promises, which also allow you to handle errors very nicely.
Q is especially nice and supports exactly your use case. Direct link here: https://github.com/kriskowal/q#sequences
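A rough sketch of what such a sequence might look like with Q, assuming each of your functions returns a promise (they would need to be adapted if they currently take callbacks):
Q.fcall(function1)
    .then(function2)
    .then(function3)
    .then(function4)
    .fail(function (err) {
        // one place to handle an error thrown anywhere in the sequence
        console.error(err);
    });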

Instead of nesting callbacks ever deeper, break them up into easily understandable functions:
function GetUserData(id, callback) {
    // db queries, etc.
    connection.query('...get user info...', function (err, results) {
        connection.query('...get user related whatnot...', function (err, results) {
            callback();
        });
    });
}

connection.query('...load page data...', function (err, results) {
    GetUserData(results.userId, function () { // pass the id obtained from the page data
        res.render('page.ejs', ... );
    });
});
You could even break the most frequently used functions off into a module so you don't have too much clutter in your code. The async package looks nice, but personally I like to see the flow. It always comes down to the programmer's preference.
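For example, a hypothetical user-data.js module; the filename, the exported name, and the db-connection helper are all made up for illustration:
// user-data.js (hypothetical module; the db connection helper is assumed)
var connection = require('./db-connection');

module.exports.getUserData = function (id, callback) {
    connection.query('...get user info...', function (err, results) {
        callback(err, results);
    });
};

// in the route file:
var userData = require('./user-data');
userData.getUserData(id, function (err, results) {
    res.render('page.ejs', { user: results }); // illustrative render call
});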

Related

How to render result of an array of asynchronous forEach 'find' functions?

This is my simple task: find images by an array of ids and render the image values into a template.
router.get('/gallery', function(req, res) {
    var images = [];
    imagesIds.forEach(function(eachImageId) {
        Images.findById(eachImageId).exec(function(findImageErr, foundImage) {
            if (foundImage) {
                images.push(foundImage);
            }
        });
    });
    res.render('gallery', {
        images: images
    });
});
The problem is that the 'res.render' function does not wait for the 'findById' calls to finish, so the 'images' array always ends up empty ('[]').
I tried to use a generator but did not know how to achieve this.
If someone can explain it without a library (like q), that would be better, because I want to understand deeply how generators deal with this problem.
Generators allow you to write synchronous-looking functions, because they can stop their execution and resume it later.
I guess you have already read some articles like this one and know how to define generator functions and use them.
Your asynchronous code can be represented as a simple iterator with the magic yield keyword. The generator function will run and stop at each yield until you resume it using the next() method.
function* loadImages(imagesIds) {
    var images = [], image;
    for (let imageId of imagesIds) {
        image = yield loadSingleImage(imageId);
        if (image) { // skip images that were not found
            images.push(image);
        }
    }
    return images;
}
Because there is a loop, the function will step through it on each next() call until all imagesIds have been visited. Finally the return statement executes and you get images.
Now we need to describe the image loading itself. Our generator function needs to know when the current image has loaded so it can start loading the next one. All modern JavaScript runtimes (Node.js and the latest browsers) support the native Promise object, so we will define a function which returns a promise that is eventually resolved with the image if it was found, or with null otherwise (so missing images can simply be skipped).
function loadSingleImage(imageId) {
    return new Promise((resolve) => {
        Images.findById(imageId).exec((findImageErr, foundImage) => {
            // resolve with null when the image is missing so the generator can skip it
            resolve(foundImage || null);
        });
    });
}
Well, we have two functions: one that loads a single image and one that puts the results together. Now we need some dispatcher to pass control from one function to the other. Since you don't want to use libraries, we have to implement a small helper ourselves.
It is a smaller version of the usual spawn function, which is simpler and easier to understand because we don't need to handle errors here; missing images are simply skipped.
function spawn(generator) {
    function continuer(value) {
        var result = generator.next(value);
        if (!result.done) {
            return Promise.resolve(result.value).then(continuer);
        } else {
            // wrap the final value so spawn always returns a promise
            return Promise.resolve(result.value);
        }
    }
    return continuer();
}
This function recursively drives our generator inside continuer while result.done is not true. Once it is, the generation has finished successfully and we can return the final value.
Finally, putting it all together, you get the following code for the gallery loading.
router.get('/gallery', function(req, res) {
    var imageGenerator = loadImages(imagesIds);
    spawn(imageGenerator).then(function(images) {
        res.render('gallery', {
            images: images
        });
    });
});
Now you have a little bit of pseudo-synchronous code in the loadImages function, and I hope it helps you understand how generators work.
Also note that all images will be loaded sequentially, because we wait for the asynchronous result of each loadSingleImage call and push it into the array before moving on to the next imageId. That can cause performance issues if you use this approach in production.
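If that ever becomes a problem, one hedged alternative is to fire all the lookups at once with Promise.all instead of driving them through the generator, for example:
// A sketch only: loads every image in parallel and filters out the missing ones
// (relies on loadSingleImage resolving missing images as null, as defined above).
function loadImagesInParallel(imagesIds) {
    return Promise.all(imagesIds.map(loadSingleImage))
        .then(function (images) {
            return images.filter(Boolean);
        });
}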
Related links:
Mozilla Hacks – ES6 In Depth: Generators
2ality – ES6 generators in depth
Jake Archibald – ES7 async functions
It can be done without a 3rd party as you asked, but it would be cumbersome...
Anyway, the bottom line is to do it inside the callback function "function(findImageErr, foundImage){..}".
1) Without a 3rd party, you need to render only after all images were accounted for:
var images = [];
var results = 0;
imagesIds.forEach(function(eachImageId) {
    Images.findById(eachImageId).exec(function(findImageErr, foundImage) {
        results++;
        if (foundImage)
            images.push(foundImage);
        if (results == imagesIds.length)
            res.render('gallery', { images: images });
    });
});
2) I strongly recommend a 3rd party which would do the same.
I'm currently using async, but I might migrate to promises in the future.
async.map(
    imageIds,
    function (eachImageId, next) {
        Images.findById(eachImageId).exec(function (findImageErr, foundImage) {
            next(null, foundImage);
            // don't report errors to async, because it will abort
        });
    },
    function (err, images) {
        images = _.compact(images); // remove null images, I'm using lodash
        res.render('gallery', { images: images });
    }
);
Edited: following your readability remark, note that if you create a small wrapper function around 'findById(...).exec(...)' that ignores errors and just reports missing images as null (call it findIgnoreError(imageId, callback)), then you could write:
async.map(
    imageIds,
    findIgnoreError,
    function (err, images) {
        images = _.compact(images); // remove null images, I'm using lodash
        res.render('gallery', { images: images });
    }
);
In other words, it becomes a bit more readable once the reader starts to think in functions... It says: go over those imageIds in parallel, run findIgnoreError on each imageId, and the final section says what to do with the accumulated results.
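For completeness, a minimal sketch of such a wrapper (findIgnoreError is just the name suggested above, not an existing API):
function findIgnoreError(imageId, callback) {
    Images.findById(imageId).exec(function (findImageErr, foundImage) {
        // swallow the error so async.map never aborts; missing images come back as null
        callback(null, foundImage || null);
    });
}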
Instead of querying mongo (or any DB) N times, I would just fire a single query using $in:
Images.find({ _id: { $in: imagesIds } }, function (err, images) {
    if (err) return next(err);
    res.render('gallery', { images: images });
});
This would also reduce the number of I/O operations, plus you won't have to write additional code around res.render.

Best way of structuring Javascript If statements to be synchronous in a function

I'm asking this question so I can learn the 'best practice' way of doing something in javascript. Say I have this code here:
var someFunc = function () {
    if (something) {
        // do something
    }
    if (somethingElse) {
        // do somethingElse
    }
};
The question is what would be the best way to ensure that the 'something' is always run BEFORE the 'somethingElse'. Since javascript is asynchronous, I understand that I would need some sort of callback system to ensure this. However, is there an easier way to refactor this? What if there are many if statements? What are the best libraries to do something like this cleanly? Thanks in advance.
Not all lines of code run asynchronously in Javascript. It depends on your code. For example:
var someFunc = function () {
    if (something) {
        console.log('something');
    }
    if (somethingElse) {
        console.log('something else');
    }
};
Will always write the following output:
something
something else
However, if instead of printing the values you are calling a function that will be run later (like an Ajax request or a setTimeout callback), there is no guarantee that your code runs in that exact order. This behaviour depends on the function you are calling. For example, the jQuery $.get() function is asynchronous (which means it will call your function at a later time that is not in your control), like this:
var someFunc = function () {
    if (something) {
        $.get('some-file.txt').done(function (result) {
            console.log(result);
        });
    }
    if (somethingElse) {
        $.get('some-other-file.txt').done(function (result) {
            console.log(result);
        });
    }
};
The resulting output can be the contents of 'some-file.txt' and 'some-other-file.txt' in any order.
As a rule of thumb, whenever you are passing a function to another function (a callback), you may be using the asynchronous features of Javascript.
Nested callbacks
One way to solve this issue is to call the second asynchronous call in the first function:
var someFunc = function () {
    if (something) {
        $.get('some-file.txt').done(function (result1) {
            console.log(result1);
            if (somethingElse) {
                $.get('some-other-file.txt').done(function (result2) {
                    console.log(result2);
                });
            }
        });
    }
};
But as you might have guessed this code will be hard to read.
Promises to the rescue
With Promises you can have a code that is easier to read.
Let's write the above ugly code with promises:
var someFunc = function () {
    if (something) {
        $.get('some-file.txt').then(function (result1) {
            console.log(result1);
            if (somethingElse) {
                return $.get('some-other-file.txt');
            }
        }).then(function (result2) {
            console.log(result2);
        });
    }
};
In general, promises make the code more readable and avoid too many nested callbacks. You can chain promises and it will read like synchronous code but it actually runs asynchronously.
See these questions to get more information:
How do I chain three asynchronous calls using jQuery promises?
Chain of Jquery Promises
What's the catch with promises?
They are not supported in old browsers (but you can add them with a 3rd-party library like the ES6 Promise Polyfill).
Before promises were officially standardized, every library had its own implementation, and they are slightly incompatible (jQuery, Angular, Ember).
They are a new concept to learn so the learning curve will be a little steep for newcomers.
Javascript is not asynchronous.
Provided both the if conditions are satisfied, what is inside the first if will get executed first and then the contents of the second if will be executed.

Join thread in JavaScript

Probably asked before, but after some serious searching I'm still not able to find a proper solution. Please consider something like this:
function compute() {
    asyncCall(args, function (err, result) {
    });
    /* 'join thread here' */
}
Even though asyncCall is asynchronous I'd like to use the result and return it from the function compute synchronously. asyncCall is a library call and I can't modify it in any way.
How to wait properly for the asynchronous result without setTimeout and watching a conditional variable? This is possible but suboptimal.
Not sure how you can really use something that doesn't exist yet, but it's easy enough to return a slot where the result will be:
function compute() {
    var rez = [];
    asyncCall(args, function (err, result) {
        rez[0] = result;
        if (rez.onchange) { rez.onchange(result); }
    });
    /* 'join thread here' */
    return rez;
}
Now you can refer to the [0] property of the returned array, and once the callback comes in, that slot will hold the result. It will also fire an event handler you can attach to the returned array, which fires when the data updates inside the callback.
I would use something more formal like a promise or a secondary callback, but that's me...
EDIT: how to integrate a callback upstream:
// sync (old and busted):
function render() {
    var myView = compute();
    mainDiv.innerHTML = myView;
}

// async using my re-modified compute():
function render() {
    var that = compute();
    that.onchange = function (e) { mainDiv.innerHTML = e; };
}
see how making it wait only added a single wrapper in the render function?
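For reference, here is a hedged sketch of the "more formal" promise approach mentioned above, assuming a Promise implementation is available in your environment:
function compute() {
    return new Promise(function (resolve, reject) {
        asyncCall(args, function (err, result) {
            if (err) { reject(err); } else { resolve(result); }
        });
    });
}

// usage:
function render() {
    compute().then(function (myView) {
        mainDiv.innerHTML = myView;
    });
}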
There's no await syntax in browsers that is widely available. Your options are generally limited to Callback patterns or Promises.
NodeJS follows a callback pattern for most async methods.
function someAsyncMethod(options, callback) {
    // callback = function(error, data)
    // when there is an error, it is the first parameter; otherwise use null
    doSomethingAsync(function (response) {
        callback(null, response);
    });
}
....
someAsyncMethod({...}, function (err, data) {
    if (err) return alert("OMG! FAilZ!");
    // use data
});
Another common implementation is promises, such as jQuery's .ajax() method...
var px = $.ajax({...});
px.done(function (data, status, xhr) {
    // runs when data returns
});
px.fail(function (xhr, status, err) {
    // runs when an error occurs
});
Promises are similar to events...
Of the two methods above, the callback syntax tends to be easier to implement and follow, but it can lead to deeply nested callback trees, though you can use utility patterns and libraries like async to overcome this.
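As a rough illustration of that last point, here is a small sketch of flattening the nesting with the async library (someAsyncMethod is the example function defined above; the option objects are placeholders):
var async = require('async');

async.waterfall([
    function (next) { someAsyncMethod({ step: 1 }, next); },
    function (data, next) { someAsyncMethod({ step: 2, previous: data }, next); }
], function (err, finalData) {
    if (err) return console.error(err);
    console.log(finalData);
});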

Any established convenient callback writing styles for javascript?

Callbacks are more and more a requirement in coding, especially with Node.js's non-blocking style of working. But writing a lot of coroutine callbacks quickly becomes difficult to read back.
For example, imagine something like this Pyramid Of Doom:
// This asynchronous coding style is really annoying. Anyone invented a better way yet?
// Count, remove, re-count (verify) and log.
col.count(queryFilter, function (err, countFiltered) {
    col.count(queryCached, function (err, countCached) {
        col.remove(query, function (err) {
            col.count(queryAll, function (err, countTotal) {
                util.log(util.format('MongoDB cleanup: %d filtered and %d cached records removed. %d last-minute records left.', countFiltered, countCached, countTotal));
            });
        });
    });
});
is something we see often and it can easily become more complex.
When every function is at least a couple of lines long, it starts to make sense to separate the functions:
// Imagine something more complex
function mary(data, pictures) {
    // Do something drastic
}

// I want to do mary(), but I need to write how before actually starting.
function nana(callback, cbFinal) {
    // Get stuff from database or something
    callback(nini, cbFinal, data);
}

function nene(callback, cbFinal, data) {
    // Do stuff with data
    callback(cbFinal, data);
}

function nini(callback, data) {
    // Look up pictures of Jeff Atwood
    callback(data, pictures);
}

// I start here, so this story doesn't read like a book even if it's quite straightforward.
nana(nene, mary);
But there is a lot of passing vars around happening all the time. With other functions written in between, this becomes hard to read. The functions themselves might be too insignificant on their own to justify giving them their own file.
Use an async flow control library like async. It provides a clean way to structure code that requires multiple async calls while maintaining whatever dependency is present between them (if any).
In your example, you'd do something like this:
async.series([
    function (callback) { col.count(queryFilter, callback); },
    function (callback) { col.count(queryCached, callback); },
    function (callback) { col.remove(query, callback); },
    function (callback) { col.count(queryAll, callback); }
], function (err, results) {
    if (!err) {
        util.log(util.format('MongoDB cleanup: %d filtered and %d cached records removed. %d last-minute records left.',
            results[0], results[1], results[3]));
    }
});
This would execute each of the functions in series; once the first one calls its callback the second one is invoked, and so on. But you can also use parallel or waterfall or whatever matches the flow you're looking for. I find it much cleaner than using promises.
A different approach to callbacks is promises.
Example: jQuery Ajax. This one might look pretty familiar.
$.ajax({
    url: '/foo',
    success: function () {
        alert('bar');
    }
});
But $.ajax also returns a promise.
var request = $.ajax({
    url: '/foo'
});

request.done(function () {
    alert('bar');
});
A benefit is that you simulate synchronous behavior, because you can use the returned promise instead of providing a callback to $.ajax.success, and then a callback to that callback, and so on. Another advantage is that you can chain/aggregate promises and have one error handler for a whole promise aggregate if you like.
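As a hedged example of aggregating two jQuery promises with a single error handler (the URLs are placeholders):
var requestA = $.ajax({ url: '/foo' });
var requestB = $.ajax({ url: '/bar' });

$.when(requestA, requestB)
    .done(function (resultA, resultB) {
        // each result is the [data, statusText, jqXHR] triple jQuery passes for that request
        alert(resultA[0] + ' ' + resultB[0]);
    })
    .fail(function () {
        // runs as soon as either request fails
        alert('one of the requests failed');
    });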
I found this article to be pretty useful.
It describes the pros and cons of callbacks, promises and other techniques.
A popular implementation (used by e.g. AngularJS iirc) is Q.
Combined answers and articles. Please edit this answer and add libraries/examples/doc URLs in a straightforward fashion for everyone's benefit.
Documentation on Promises
Asynchronous Control Flow with Promises
jQuery deferreds
Asynchronous Libraries
async.js
async.waterfall([
    function () { /* ... */ },
    function () { /* ... */ }
], callback);
node fibers
step
Step(
    function func1() {
        // ...
        return value;
    },
    function func2(err, value) {
        // ...
        return value;
    },
    function funcFinal(err, value) {
        if (err) throw err;
        // ...
    }
);
Q
Q.fcall(func1)
    .then(func2)
    .then(func3)
    .then(funcSuccess, funcError);
API reference
More examples
More documentation

Javascript - waiting for a number of asynchronous callbacks to return?

What's the best way/library for handling multiple asynchronous callbacks? Right now, I have something like this:
_.each(stuff, function (thing) {
    async(thing, callback);
});
I need to execute some code after the callback has been fired for each element in stuff.
What's the cleanest way to do this? I'm open to using libraries.
Since you're already using Underscore you might look at _.after. It does exactly what you're asking for. From the docs:
after _.after(count, function)
Creates a version of the function that will only be run after first being called count times. Useful for grouping asynchronous responses, where you want to be sure that all the async calls have finished, before proceeding.
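A minimal sketch using the names from the question, assuming Underscore is loaded as _:
var allDone = _.after(stuff.length, function () {
    // runs exactly once, after the callback below has fired for every element of stuff
    console.log('all callbacks finished');
});

_.each(stuff, function (thing) {
    async(thing, allDone);
});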
There is a great library called Async.js that helps solve problems like this with many async and flow-control helpers. It provides several forEach functions that can help you run callbacks for every item in an array/object.
Check out:
https://github.com/caolan/async#forEach
// will print 1,2,3,4,5,6,7,all done
var arr = [1, 2, 3, 4, 5, 6, 7];

function doSomething(item, done) {
    setTimeout(function () {
        console.log(item);
        done(); // call this when you're done with whatever you're doing
    }, 50);
}

async.forEach(arr, doSomething, function (err) {
    console.log("all done");
});
I recommend https://github.com/caolan/async for this. You can use async.parallel to do this.
function stuffDoer(thing) {
    return function (callback) {
        // Do stuff here with thing
        callback(null, thing);
    };
}

var work = _.map(stuff, stuffDoer);

async.parallel(work, function (error, results) {
    // error will be defined if anything passed an error to the callback
    // results will be an array (in the same order as work) of whatever values,
    // if any, the worker functions passed to their callbacks
});
async.parallel() / async.series() should suit your requirement. You can provide a final callback that gets executed when all the async calls succeed.
async.parallel([
    function () { ... },
    function () { ... }
], callback);

async.series([
    function () { ... },
    function () { ... }
], callback);
Have a counter, say async_count. Increase it by one every time you start a request (inside your loop), and have the callback decrease it by one and check whether zero has been reached - if so, all the callbacks have returned.
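A small sketch of that counter approach, reusing the names from the question (doneWithEverything is a placeholder for whatever you need to run afterwards):
var asyncCount = 0;

_.each(stuff, function (thing) {
    asyncCount++;                 // one increment per request started
    async(thing, function () {
        asyncCount--;             // one decrement per callback
        if (asyncCount === 0) {
            doneWithEverything(); // all callbacks have returned
        }
    });
});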
EDIT: Although, if I were the one writing this, I would chain the requests rather than running them in parallel - in other words, I would keep a queue of requests and have each callback check the queue for the next request to make.
See my response to a similar question:
Coordinating parallel execution in node.js
My fork() function maintains the counter internally and automatically.
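One possible shape of such a helper, sketched here only to illustrate the counting idea (this is not the author's actual fork() implementation):
function fork(asyncCalls, done) {
    var remaining = asyncCalls.length;
    var results = new Array(remaining);
    if (remaining === 0) return done(results);
    asyncCalls.forEach(function (call, i) {
        call(function (result) {
            results[i] = result;     // keep results in the order the calls were given
            if (--remaining === 0) {
                done(results);       // every callback has reported back
            }
        });
    });
}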
