Which is a better way of writing callbacks? - javascript

Just by looking at what I've written, I can see that one is much smaller, so in terms of code golf Option 2 is the better bet, but as far as which is cleaner, I prefer Option 1. I would really love the community's input on this.
Option 1
something_async({
    success: function(data) {
        console.log(data);
    },
    error: function(error) {
        console.log(error);
    }
});
Option 2
something_async(function(error, data) {
    if (error) {
        console.log(error);
    } else {
        console.log(data);
    }
});

They are not exactly the same: Option 2 will still log the data, whereas Option 1 will only log data on success. (Edit: at least it was that way before you changed the code.)
That said, Option 1 is more readable. Programming is not (and should not be) a competition to see who can write the fewest lines that do the most things. The goal should always be to create maintainable, extendable (if necessary) code, in my humble opinion.

Many people find option #1 easier to read and to maintain: two different callback functions for two different purposes. It is also the pattern used by promise libraries, where two callbacks are passed. Of course, the question of multiple arguments vs. an options object is independent from that (while the object is useful in jQuery.ajax, it doesn't make sense for promise.then).
However, option #2 is the Node.js convention (see also the NodeGuide) and is used in many libraries influenced by it, for example the well-known async.js. The convention is debatable, though; the top Google results I found are WekeRoad: NodeJS Callback Conventions and Stack Overflow: What is the suggested callback style for Node.js libraries?.
The reason for the single callback function with an error argument is that it always reminds the developer to handle errors, which is especially important in serverside applications. Many beginners writing clientside ajax calls simply forget about error handling, then wonder why the success callback doesn't get invoked. On the other hand, promises with then-chaining are based on the optionality of error callbacks, propagating errors to the next level, where of course they still need to be caught.
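To make that difference concrete, here is a minimal sketch of both styles side by side (somethingAsyncPromise and transform are hypothetical stand-ins, not functions from the question):

// Error-first style: the err argument is always there, reminding you to check it.
something_async(function (err, data) {
    if (err) { return console.error(err); }
    console.log(data);
});

// Promise style: each step's error handler is optional; a rejection
// skips the success handlers and propagates until a catch picks it up.
somethingAsyncPromise()
    .then(function (data) { return transform(data); }) // runs only on success
    .then(function (result) { console.log(result); })
    .catch(function (err) { console.error(err); });    // one place to handle failures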

In all honesty, I prefer to take them one step further, into Promises/Futures/Deferreds/etc...
Or (/and) go into a "custom event" queue, using a moderator (or an observer/pub-sub, if there is a good reason for one particular object to be the source of data).
This isn't a 100%-of-the-time thing. Sometimes, you just need a single callback. However, if you have multiple views which need to react to a change (in model data, or to visualize user interaction), then a single callback with a bunch of hard-coded results isn't appropriate.
moderator.listen("my-model:timeline_update", myView.update);
moderator.listen("ui:data_request", myModel.request);
button.onclick = function () { moderator.notify("ui:data_request", button.value); }
Things are now much less dependent upon one big callback and you can mix and match and reuse code.
If you want to hide the moderator, you can make it a part of your objects:
var A = function () {
        var sys = null,
            notify = function (msg, data) {
                if (sys && sys.notify) { sys.notify(msg, data); }
            },
            listen = function (msg, callback) {
                if (sys && sys.listen) { sys.listen(msg, callback); }
            },
            attach = function (messenger) { sys = messenger; };
        return {
            attach : attach,
            notify : notify,
            listen : listen
            /* ... */
        };
    },
    B = function () { /* ... */ },
    shell = Moderator(),
    a = A(),
    b = B();

a.attach(shell);
b.attach(shell);

a.listen("do something", a.method.bind(a));
b.notify("do something", b.property);
If this looks a little familiar, it's similar behaviour to, say, Backbone.js (except that they extend() the behaviour onto objects, and others will bind, where my example has simplified wrappers to show what's going on).
Promises would be the other big-win for usability, maintainable and easy to read code (as long as people know what a "promise" is -- basically it passes around an object which has the callback subscriptions).
// using jQuery's "Deferred"
var ImageLoader = function () {
    var cache = {},
        public_function = function (url) {
            if (cache[url]) { return cache[url].promise(); }
            var img = new Image(),
                loading = $.Deferred(),
                promise = loading.promise();
            img.onload = function () { loading.resolve(img); };
            img.onerror = function () { loading.reject("error"); };
            img.src = url;
            cache[url] = loading;
            return promise;
        };
    return public_function;
};
// returns promises
var loadImage = ImageLoader(),
    myImg = loadImage("//site.com/img.jpg");

myImg.done( lightbox.showImg );
myImg.done( function (img) { console.log(img.width); } );
Or
var blog_comments = [ /* ... */ ],
    comments = BlogComments();

blog_comments.forEach(function (comment) {
    var el = makeComment(comment.author, comment.text),
        img = loadImage(comment.img);

    img.done(el.showAvatar);
    comments.add(el);
});
All of the cruft there is to show how powerful promises can be.
Look at the .forEach call there.
I'm using Image loading instead of AJAX, because it might seem a little more obvious in this case:
I can load hundreds of blog comments; if the same user makes multiple posts, the image is cached, and if not, I don't have to wait for images to load or write nested callbacks. Images load in any order, but still appear in the right spots.
This is 100% applicable to AJAX calls, as well.
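A sketch of the same caching idea applied to AJAX (the /comments/ endpoint and renderComment are made up for illustration; jQuery's $.ajax already returns a promise, so no manual Deferred is needed):

var CommentLoader = function () {
    var cache = {};
    return function (id) {
        if (!cache[id]) {
            // the jqXHR returned by $.ajax is itself a promise
            // (note: this naive version also caches failed requests)
            cache[id] = $.ajax({ url: "/comments/" + id, dataType: "json" });
        }
        return cache[id];
    };
};

var loadComment = CommentLoader();
loadComment(42).done(renderComment);                           // fires the request
loadComment(42).done(function (c) { console.log(c.author); }); // served from cache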

Promises have proven to be the way to go for async code, and libraries like Bluebird embrace node-style callbacks (the (err, value) signature). So it seems beneficial to use node-style callbacks.
But the examples in the question can easily be converted into either format with the functions below. (untested)
function mapToNodeStyleCallback(callback) {
    return {
        success: function (data) {
            return callback(null, data)
        },
        error: function (error) {
            return callback(error)
        }
    }
}

function alterNodeStyleCallback(propertyFuncs) {
    return function () {
        var args = Array.prototype.slice.call(arguments)
        var err = args.shift()
        // note: the property is "error", matching the success/error convention above
        if (err) return propertyFuncs.error.apply(null, [err])
        return propertyFuncs.success.apply(null, args)
    }
}
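Usage could then look like this (equally untested; node_style_async stands in for any API that takes an (err, value) callback):

// Give an options-object API a node-style callback:
something_async(mapToNodeStyleCallback(function (err, data) {
    if (err) return console.error(err)
    console.log(data)
}))

// Give a node-style API an options object:
node_style_async(alterNodeStyleCallback({
    success: function (data) { console.log(data) },
    error: function (err) { console.error(err) }
}))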

Related

node, async programming, callback hell

I'm trying to understand callbacks and async programming, but I'm having a bit of trouble.
Here's some pseudocode:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrape(url) {
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}

console.log(lines.length);
It obviously returns 0, as the scrape function is executed asynchronously. I understand that much, but I've tried many convoluted approaches and can't figure out how to write it properly. Any help/explanation would be greatly appreciated. I've read (and I'm still reading) a lot of tutorials and examples, but I think the only way for me to get it is to write some code myself. If I solve this I'll post the answer.
You might want to check this article for an introduction to Node that could help you understand async programming in Node a little better.
As far as async programming goes, async is a very popular module in Node's userland which helps you write asynchronous code effortlessly. For instance (untested pseudo-code):
function scrape(done) {
    // url comes from the surrounding scope in this pseudo-code;
    // http.get's callback receives only res, so wrap it to match
    // the (err, result) signature async expects
    http.get(url, function (res) {
        done(null, res);
    });
}

function parse(res, done) {
    var lines = [];
    res.pipe(new FeedParser([options]))
        .on('readable', function () {
            var stream = this, item;
            while (item = stream.read()) {
                line = item.title;
                lines.push(line);
            }
        })
        .on('end', function () {
            done(null, lines);
        });
}

function done(err, lines) {
    if (err) { throw err; }
    console.log(lines.length);
}

async.waterfall([scrape, parse], done);
This depends on whether you want to scrape all urls in parallel or in series.
If you were to do it in series, you should think of it as this:
Start with the first url. Scrape. In the callback, scrape the next url. In that callback, scrape the next url.
This will give you the notorious callback hell you are talking about, but that is the principle at least. That's where libraries like async remove a lot of the headache.
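For instance, the parallel case collapses to a single async.map call (an untested sketch in the spirit of the waterfall example above; fetchFeed stands in for the http.get/FeedParser logic from the question):

var async = require('async');

function fetchFeed(url, done) {
    // ...http.get + FeedParser as in the question, collecting this feed's
    // titles, then handing them to async via the callback:
    // done(null, titles);
}

// Run fetchFeed for every url in parallel; the final callback fires once,
// after all feeds have finished (or as soon as one errors).
async.map(arrayOfFeedUrls, fetchFeed, function (err, results) {
    if (err) throw err;
    var lines = [].concat.apply([], results); // flatten the per-feed arrays
    console.log(lines.length);
});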
When programming async calls in this manner, functions and instructions that you want to chain onto the end, such as console.log(lines.length);, must also be callbacks. So for instance, try something like this:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrape(url) {
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                    done();
                }
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}

function done() {
    if (lines.length == arrayOfFeedUrls.length) {
        console.log(lines.length);
    }
}
You may also want to look into promises, an alternative programming style to callbacks, which aims to avoid callback hell.
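For example, with a promise-returning version of scrape (call it scrapeOne, a hypothetical name), the "wait for everything" step becomes a single Promise.all (untested sketch):

function scrapeOne(url) {
    return new Promise(function (resolve, reject) {
        // ...http.get + FeedParser as above, then:
        // resolve(titlesForThisFeed); or reject(err);
    });
}

Promise.all(arrayOfFeedUrls.map(scrapeOne))
    .then(function (perFeedLines) {
        var lines = [].concat.apply([], perFeedLines); // flatten
        console.log(lines.length);
    })
    .catch(function (err) { console.error(err); });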
Have to admit that I'm very new to node.js and struggling to grok the callback stuff. In my limited experience, adding one more parameter to the callback function may be the trick. The hard question is, which parameter?
In your example, if the function scrape had an extra boolean "lastOne", then it could call console.log(lines) itself. Or, if it understood that a null url meant to stop. However, I don't think even this works, as I'm not sure everything will get done in order. If the 2nd URL takes forever, the last one may complete first, right??? (You could try it). In other words, I still don't know which parameter to add. Sorry...
What seems more reliable is to set a counter to urls.length, and for scrape() to decrement it each time. When the counter reaches 0, it knows that the entire process is done and it should log (or do whatever) with the result. I'm not 100% sure where to declare this counter. Coming from Java I still have little idea what is a static global, what is an instance, whatever...
Now, a true-blue node.jser would pass a doWhatever function as an extra parameter to scrape(), so that you can do something other than console.log(). :-) But I'd settle for the check for zero.
To elaborate slightly: add a callWhenDone parameter to scrape(), and add (somewhere in all that nesting!)
if (--counter <= 0) {
    callWhenDone(lines);
}
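Put together, the counter approach might look like this (a sketch; the FeedParser internals are elided as above):

var counter = arrayOfFeedUrls.length;

function scrape(url, callWhenDone) {
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                // ...push this feed's titles into lines, as above...
            })
            .on('end', function () {
                // this feed is fully parsed; maybe they all are
                if (--counter <= 0) { callWhenDone(lines); }
            });
    });
}

arrayOfFeedUrls.forEach(function (url) {
    scrape(url, function (allLines) { console.log(allLines.length); });
});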
OK, so here's how I've solved the problem; feel free to comment and tell me if it's right.
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrapeFeeds(array) {
    var url = array.shift();
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            })
            .on('end', function () {
                if (array.length) {
                    scrapeFeeds(array);
                }
            });
    });
}

scrapeFeeds(arrayOfFeedUrls);
Thanks for all the answers. I'm looking more in depth at async, as I've got more complicated stuff to do. Let me know what you think of my code, it's always useful.

Returning values from Javascript modules after ajax call

EDITED: due to my ignorance, this is actually the same as all the other AJAX-type questions out there; I needed to get into the right mindset. Leaving it here for posterity's sake, and maybe it will help others take a second look at callbacks before posting.
So I would like to say up front that I think this is not the standard "how do I return a value from an ajax call" issue where people aren't waiting for the async call to finish. I think this is a variable scope misunderstanding with Javascript module patterns, so any guidance would be appreciated.
I am following this SO post on constructing my ajax call, so I am using deferred objects to crunch my data after the call finishes. And also several tutorials on the Javascript module pattern, like this and this. It seems fairly straightforward to return values from within a private module inside my outer module--however, I always get myObj.roots() as undefined. Even though it is defined as an array of X values when I check with breakpoints. What simple thing am I missing--any hints? Thanks! Sorry for a simple question, I'm entirely new to JS module patterns and trying to build my own library...
My JS code:
var myObj = (function (window, document, $, undefined) {
    var _baseUri = 'http://example.com/',
        _serviceUri = 'services/',
        _bankId = '1234',
        _myUri = _baseUri + _serviceUri + 'objectivebanks/' + _bankId,
        _roots = [];

    function myAjax(requestURL) {
        return $.ajax({
            type: 'GET',
            url: requestURL,
            dataType: 'json',
            async: true
        });
    }

    var getRoots = function () {
        var _url = _myUri + '/objectives/roots';
        _roots = [];
        myAjax(_url).done(function (data) {
            $.each(data, function (index, nugget) {
                _roots.push(nugget);
            });
            return _roots;
        }).fail(function (xhr, textStatus, thrownError) {
            console.log(thrownError.message);
        });
    }

    return {
        roots: getRoots
    };
})(this, document, jQuery);
My error (from Chrome's developer tools' console):
myObj.roots()
undefined
Your "getRoots" function does not return anything. Using the $.ajax(successCallback) or $.ajax.done() patter is the same thing. You are not deferring anything.
There is no way you can do this without callbacks, events or promises.
Callbacks and events are basically the same, only the latter allow better architectural decoupling (highly debatable fact).
Promises mean that you can write var x = getRoots() and x will be undefined until the browser gets back a response from the server. Your application has to account for this. So either you start coding with the async pattern in mind (callbacks, events) or design applications that handle null/undefined values gracefully.
Using callbacks:
function getStuff(callback) {
    $.ajax(...).done(function (data) {
        // maybe process data?
        callback(data);
    });
}

getStuff(function (data) {
    // this is where I can use data
});
This way you can write your getStuff methods in a separate module, say "DataService" so MVC logic is not polluted.
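Since $.ajax already returns a promise, a third option is to return that promise from the module and let the caller attach handlers. A sketch of what getRoots from the question could become (assuming jQuery 1.8+, where .then chains return values):

var getRoots = function () {
    var _url = _myUri + '/objectives/roots';
    // return the promise instead of filling a private array
    return myAjax(_url).then(function (data) {
        return $.map(data, function (nugget) { return nugget; });
    });
};

// caller:
myObj.roots().done(function (roots) {
    console.log(roots.length); // defined here, once the response has arrived
});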

Does anyone know of a JS Chaining library that allows for deferred execution of methods? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
I'm looking for a library that allows me to easily chain together methods but defer their execution until arguments are provided further along in the chain:
chain
    .scanDirectory('/path/to/scan')
    .recursively()
    .for(/\.js$/i)
    .cache()
    .provideTo('0.locals')
    .as('scripts')
    .defer();
The important thing is that the code behind the scanDirectory function isn't actually called until it's defined that it should be recursive and looking for .js files.
I'm not quite sure how to logically set this up so that I can do something like:
chain
    .scanDirectory('/path/to/scan')
    .scanDirectory('/another/path')
    .for(/\.js$/i) // provided to both paths above?
    .doSomethingElse()
which is why I'm looking for a library that may have more mature ideas that accomplish this :)
This post talks about the types of execution in JS; there are links to relevant libraries at the end of it.
Execution in JavaScript
You have two types of execution in JS:
Synchronous - stuff that happens right when it's called
Asynchronous - stuff that happens after the current code is done running; this is what you refer to as deferred.
Synchronous
Synchronously, you can push actions and parameters to a queue structure, and run them with a .run command.
You can do something like:
var chain = function () {
    var queue = []; // hold all the functions

    function a(param) {
        // do stuff, knowing a's param; may also access params other functions set
    }

    return {
        a: function (someParam) {
            queue.push({ action: a, param: someParam });
            return this;
        },
        ... // more methods
        run: function () {
            queue.forEach(function (elem) {           // on each item
                elem.action.call(null, elem.param);   // call the function with its parameter
            });
        }
    };
};
This will execute all the functions in the queue when you call run; the syntax would be something like
chain().a(15).a(17).run();
Asynchronous
You can simply set a timeout; you don't need something like .run for this.
var chainAsync = function () {
    // no need for a queue
    function a(param) {
        // do stuff, knowing a's param; may also access params other functions set
    }

    return {
        a: function (someParam) {
            setTimeout(a, 0, someParam);
            return this;
        },
        ... // more methods
    };
};
Usage would be something like
chainAsync().a(16).a(17);
Some issues:
If you want to share parameters between functions, you can store them somewhere in the object itself (have a var state in addition to the queue); see the sketch after this list.
It's either sync or async; you can't detect one or the other by context. Workarounds are being built for ES6.
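For the first point, here is a sketch of the shared-state idea, extending the synchronous queue example above:

var chain = function () {
    var queue = [],
        state = {}; // shared between all queued functions

    function a(param) {
        state.last = param; // functions queued later can read state.last
    }

    return {
        a: function (someParam) {
            queue.push({ action: a, param: someParam });
            return this;
        },
        run: function () {
            queue.forEach(function (elem) {
                elem.action.call(null, elem.param);
            });
            return state;
        }
    };
};

chain().a(15).a(17).run(); // state.last === 17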
More resources
For an implementation of something similar, you can see this question, where I implement a comparable chain.
Promises tutorial - promises let you use this type of execution called CPS (continuation passing style) to great effect.
Another nice post on promises.
Bluebird - the fastest and likely best promise library.
Q - probably the most well known and widely used library for chaining execution and promises in JavaScript. Used it several times myself.
Question here on promises and their benefits.
How does basic chaining work in JavaScript - another relevant question here in SO.
Not sure you'll find an all-around working solution for this.
It looks like you're looking for a generic solution to something that would need to have been baked into the library already. I mean, I'm sure there are libraries that have this functionality, but they wouldn't hook auto-magically into other libraries (except if they have specifically implemented overrides for the right versions of the libraries you want to target, maybe).
However, in some scenarios, you may want to look at the Stream.js library, which probably covers enough data-related cases to make it interesting for you.
I don't know whether there's a library to build such methods, but you can easily build that feature yourself. Basically, it will be a settings object with setter methods and one execute function (in your case, defer).
function Scanner() {
    this.dirs = [];
    this.recurse = false;
    this.search = "";
    this.useCache = false; // renamed: a property named "cache" would shadow the cache() method below
    this.to = "";
    this.name = "";
}

Scanner.prototype = {
    scanDirectory: function (dir) {
        this.dirs.push(dir);
        return this;
    },
    recursively: function () {
        this.recurse = true;
        return this;
    },
    for: function (name) {
        this.search = name;
        return this;
    },
    cache: function () {
        this.useCache = true;
        return this;
    },
    provideTo: function (service) {
        this.to = service;
        return this;
    },
    as: function (name) {
        this.name = name;
        return this;
    },
    defer: function () {
        // now, do something with all the given settings here
    },
    doSomethingElse: function () {
        // now, do something else with all the given settings here
    }
};
That's the standard way to build a fluent interface. Of course, you could also create a helper function to which you pass a methodname-to-setting map which writes the methods for you if it gets too lengthy :-)
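That helper could be as small as the sketch below (names are illustrative; flag methods like recursively() call the setter with no argument, and accumulating setters like scanDirectory would still need custom handling):

// Generate fluent setters from a methodName -> propertyName map.
function addSetters(proto, map) {
    Object.keys(map).forEach(function (method) {
        proto[method] = function (value) {
            // no argument means "set a boolean flag"
            this[map[method]] = arguments.length ? value : true;
            return this; // keep the chain going
        };
    });
}

addSetters(Scanner.prototype, {
    for: 'search',          // scanner.for(/\.js$/i)
    recursively: 'recurse', // scanner.recursively()
    provideTo: 'to',
    as: 'name'
});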
You need a queue to maintain the async and sync-ness of your method chain.
Here is an implementation using jQuery.queue I did for a project:
function createChainable(options) {
    var queue = [];
    var chainable = {
        method1: function () {
            queue.push(function (done) {
                // code here
                done();
            });
            return chainable;
        },
        exec1: function () {
            queue.push(function (done) {
                // code here
                done();
            });
            $.queue(ELEMENT, QUEUE_NAME, queue).dequeue(QUEUE_NAME);
            return chainable;
        }
    };
    return chainable;
}
As #Jordan Doyle said in his comment:
Just return this
So every method in your objects should return the object in the return statement so that you can chain on another method.
For example:
var obj = new (function () {
    this.methOne = function () {
        //...
        return this;
    };
    this.methTwo = function () {
        //...
        return this;
    };
    this.methThree = function () {
        //...
        return this;
    };
})();

// So you can do:
obj.methOne().methTwo().methThree();

Wait For All synchronization pattern in Javascript

A JS control calls a data service and continues rendering itself without waiting for the result. Sometimes the service returns after the control has been fully rendered, sometimes before. How do you implement WaitForAll in JS? I'm using jQuery.
Here's what I've done myself (Utils.WaitForAll simply counts the number of hits; once it matches the count, it calls handle):
// before we started
var waiter = Utils.WaitFor({ handle: function (e) { alert("got called"); }, count: 2 });
the way it gets triggered:
// place one
waiter.Notify({one: {...}});
and then
// place two (can occur before one though)
waiter.Notify({two: {...}});
which triggers handle; handle gets the values tagged as one and two in its e. The waiter is an extra 'global' var travelling down the stack, which I didn't quite like, and it's another new object after all... Any obvious problems with my approach?
You should take a look at the promise interface of CommonJS (implemented by jQuery.Deferred); it provides a progress callback, which can be used in this case.
sample code:
var waiter = $.Deferred();
var len = 2;

waiter.done(function () {
    alert("Hooray!!!");
});

waiter.progress(function () {
    if (--len === 0) {
        waiter.resolve();
    }
});

// somewhere
$.ajax({
    ...
    data: somedata,
    success: function () {
        waiter.notify();
    }
});

// somewhere else
$.ajax({
    ...
    data: someotherdata,
    success: function () {
        waiter.notify();
    }
});
More about deferred:
jQuery Deferred API
Learn how to use Deferred here
How to use deferred objects in jQuery (from OP's answer to the same question)
I've found exactly what I need in jQuery Deferred; see the article:
http://richardneililagan.com/2011/05/using-deferred-objects-in-jquery-1-5/

How organize asynchronous codes with Dojo?

I am creating a webapp with dynamic tabs (data from a RESTful service), and each tab has a dgrid whose columns and rows I also get from RESTful services. I made everything work well with XHR and MemoryStore, but I now need to change from XHR to JsonRest, because I need to pass an HTTP Range header to the server.
I am having difficulties organizing my code with asynchronous calls in Dojo. I will give you an example:
method1() - Sync
method2() - Async (JsonRest)
method3() - Sync
What is the best way to make sure method3() executes only after method2() has finished?
I have found a class called when. It seems nice. But how do you work with async apps in Dojo?
My biggest problem right now: I can't separate my code into methods; I need to put all my code inside the JsonRest promise's function (then), because inside then I can't access another method.
I would concur with the recommendation of using Dojo's promise implementation.
This might help you make sense of it faster if you are not used to promises: http://jsfiddle.net/27jyf/9/. Another nice feature of this is error handling; I would encourage you to read up on it after you have the basic sequencing down.
require(["dojo/Deferred", "dojo/when"], function(Deferred, when) {
var sayHello = function() { return 'hello' };
var sayWorld = function() {
var deferred = new Deferred();
window.setTimeout(function() {
deferred.resolve('world');
}, 1000);
return deferred.promise;
};
var sayBang = function() { return '!' };
//This will echo 'hello world !'
//That's probably how you want to sequence your methods here
var message = [];
message.push(sayHello());
sayWorld().then(function(part) {
message.push(part);
message.push(sayBang());
console.debug(message.join(' '));
});
//This will also echo 'hello world !'
//This probably not the syntax that you want here,
//but it shows how to sequence promises and what 'when' actually does
var message2 = [];
when(sayHello())
.then(function(part) {
message2.push(part);
return sayWorld();
})
.then(function(part) {
message2.push(part);
return when(sayBang());
})
.then(function(part) {
message2.push(part);
console.debug(message2.join(' '));
});
//Provided the behavior observed above, this will echo 'hello !'
//dojo/when allows you to use the same syntax for sync and async...
//but it does not let you magically write async operations in a sync syntax
//'world' will be pushed into the array a second later, after the message has already been echoed
var message3 = [];
message3.push(sayHello());
when(sayWorld(), function(part) {
message3.push(part);
});
message3.push(sayBang());
console.debug(message3.join(' '));
});
You can use the promise API to run async/sync methods in a specified order.
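Applied to the methods from the question, the sequencing could look like the sketch below (it assumes method2() returns the promise it gets from JsonRest; .otherwise is Dojo's promise error handler):

require(["dojo/when"], function (when) {
    var result1 = method1();          // sync
    when(method2(result1))            // async: JsonRest returns a promise
        .then(function (result2) {
            return method3(result2);  // runs only after method2 resolves
        })
        .otherwise(function (err) {
            console.error(err);
        });
});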
