Need Help Thinking How to Program Asynchronously - javascript

I'm using NodeJS to walk over a list of files and generate an MD5 hash for each one. Here's how I would normally do this synchronously:
// Assume files is already populated with an array of file objects
for (var file in files) {
    var currentFile = files[file];
    currentFile.md5 = md5(currentFile.path);
}
The problem here is that the MD5 function is asynchronous and takes a callback that runs once the MD5 hash has been generated for the file. Thus, all of my currentFile.md5 variables are just going to be set to undefined.
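For reference, the md5 call I'm dealing with looks roughly like this (assuming an (err, hash) callback signature):
md5(currentFile.path, function (err, hash) {
    // the hash is only available inside this callback
    currentFile.md5 = hash;
});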
Once I have gotten all of the MD5 hashes for all of the files I'll need to move onto another function to deal with that information.
How gnarly is the code going to get in order for me to do this asynchronously? What's the cleanest way to accomplish what I want to do? Are there common approaches to this that I should be aware of?

To call an async function multiple times like this, you can wrap it in a function and call that function recursively.
I have assumed your md5 function takes a callback with two parameters, err and result.
var keys = Object.keys(files); // take all the keys in an array

function fn() {
    var currentFile = files[keys.shift()];
    md5(currentFile.path, function (err, result) {
        // Use result, e.g. store it on the file object
        currentFile.md5 = result;
        // Check if there are more files to process
        if (keys.length) {
            fn();
        } else {
            // done - move on to the next step here
        }
    });
}

fn(); // kick off the first call

One great approach is to use async (search for it on npm).
If you want to roll your own:
Count the files and put that count in a var.
Every time fs opens a file and calls your intermediate callback, compute and store the MD5.
Also, decrement that counter.
When counter === 0, call a "final" callback, passing back all the MD5s, as sketched below.
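A minimal sketch of that counter pattern, assuming the same md5(path, callback) signature and a hypothetical allDone(err, files) final callback:
var remaining = files.length; // the counter

files.forEach(function (file) {
    md5(file.path, function (err, hash) {
        if (err) { return allDone(err); }
        file.md5 = hash;   // compute and store the MD5
        remaining--;       // decrement the counter
        if (remaining === 0) {
            allDone(null, files); // all hashes are in, pass everything back
        }
    });
});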

To answer your question (theoretically): in the JavaScript world there are (at the moment) two different ways to deal with asynchronous code.
Using callbacks. This is the most basic way, and the one most people learn when they start using JavaScript. However, there are plenty of libraries that help you deal with callbacks in a less painful way, such as async and step. For your particular problem, assuming that md5 really is asynchronous, you can use https://github.com/caolan/async#parallel to achieve it.
The other way is to use promises; there are also plenty of promise libraries such as q and when. Basically, with promises you have a nicer way to organize your code flow (IMO). For the problem above you can use when.all to gather the results of md5. However, you first need to turn md5 into a promise-returning function.
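For example, a sketch with the async library (again assuming md5(path, callback); processFiles is a hypothetical next step):
var async = require('async');

async.each(files, function (file, done) {
    md5(file.path, function (err, hash) {
        if (err) { return done(err); }
        file.md5 = hash;
        done();
    });
}, function (err) {
    if (err) { throw err; }
    processFiles(files); // every file now has its .md5 set
});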

To avoid "callback hell" you should introduce the world of promises to your Node toolset. I suggest q https://npmjs.org/package/q
Here is a post on SO that can help and give you an idea of the syntax for using q.js promises to work with multiple asynchronous operations.
You would essentially run all your async functions with deferred promises; the chained .then() method fires once all the promises are resolved, and the function passed inside then() can process your MD5'd data.
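A minimal sketch of that idea with Q, assuming the md5(path, callback) signature from above (processFiles is a hypothetical next step):
var Q = require('q');
var md5Async = Q.denodeify(md5); // md5 now returns a promise

var hashing = files.map(function (file) {
    return md5Async(file.path).then(function (hash) {
        file.md5 = hash;
        return file;
    });
});

Q.all(hashing).then(function (hashedFiles) {
    processFiles(hashedFiles); // all MD5s are resolved here
}).done();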
I hope this helps.

Related

Async Function Call in Recursion using node js

How do I call an async function in recursion? I have an async method, async function getSSN(req,res,next), and inside this function's block I need to call the function itself, but when I call it with return await getSSN(req,res,next) the code is not working for me in Node.js. Can anyone give me a solution for that?
So, you can't mix async/await with plain asynchronous callbacks. They do not work properly together. Your async function getSSN() is not properly resolving when your request.get() finishes because that's just a plain asynchronous callback, which the async function knows nothing about.
Instead, you need to use a promise-based request() option. Since, the request library is now deprecated and should not be used for new projects, I would suggest one of the alternatives that are all based on promises already. My favorite is the got() library, but you can look at the list of alternatives and pick the one that has a programming style you like.
In addition, your getSSN() function has a bunch of side effects (modifying higher-scoped variables). This is not a good way to program, particularly for asynchronous operations, because the outside world generally doesn't know when things are done and when the higher-scoped variables hold the values you want. You should make getSSN() return a promise that resolves to its result and then have the caller use that result to modify the higher-scoped variables. Again, if you showed us the rest of the coding context here, we could suggest a better overall way to do this. Please notice that you're just not providing enough code for us to show you the best way to write this. That should be a general lesson for Stack Overflow: err on the side of giving us more than you think is needed, and we can then often make suggestions and improvements far beyond what you even knew to ask about.
Here's your function with all the ill-advised side effects still in it (because you haven't shown us the rest of the code) using got():
const got = require('got'); // promise-based HTTP client

async function getSSN(next) {
    const result = await got('idms.dealersocket.com/api/account/…').json();
    if (count <= totalpage) {
        const data = Object.assign({}, result.Data);
        const len = Object.keys(data).length;
        for (var i = 0; i < len; i++) {
            ssn.push(data[i].Row.Borrower1SSN);
        }
        count++;
    }
}
Now, getSSN() will return a promise that resolves or rejects when the network request is finished.
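A sketch of the side-effect-free version suggested above, returning the data instead of pushing into higher-scoped variables (named getSSNs here to avoid clashing with the original; the URL and field names are carried over from the snippet):
async function getSSNs() {
    const result = await got('idms.dealersocket.com/api/account/…').json();
    const data = Object.assign({}, result.Data);
    // Return the SSNs instead of mutating an outer array
    return Object.keys(data).map(key => data[key].Row.Borrower1SSN);
}

// From inside another async function:
// const ssns = await getSSNs();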

JavaScript initializing callback parameters down the callback chain

Note: I'm bootstrapping a reactjs app but this is a general JavaScript question.
I have a special module "locationUtils" that I am trying to keep in its own package, but keeping that code separate is causing an eyesore with callbacks.
When I access one of its methods I have to send along a callback that only has one of its parameters initially defined, and in that method I get the other data parameter to initialize the remaining parameter.
Can I add in undefined parameters later like that in JavaScript, and is it good practice in general to initialize parameters for a callback method as you go down the callback chain, or am I making a convoluted newbie mistake?
/******************** Module 1 ******************************/
var bootStrapUI = function(callback) {
    locationUtils.findData(findOtherData(callback));
}

// This gets called last to finalize bootstrapping
var findOtherData = function(callback, originalFetchedData) {
    // use originalFetchedData to get more data
    // bootstrapping the program with all rendering data
    callback(); // sends back a boolean confirming all fetched
}

/******************** Module 2 **********************************/
var findData = function(findOtherData) {
    var data = magicGetData();
    findOtherData(findOtherData, data); // I initialized a param late here!
}
It's a good Javascript question, callbacks can become a serious hell for the uninitiated, particularly when they are nested and / or the order in which they return is important.
This is where promises come in: they are an essential tool for JavaScript development and about to become part of the standard (in ECMAScript 6).
In essence: a promise is an object that is returned from a function with a method (callback) that is called when the asynchronous action (e.g. API call) has been completed. The difference between a promise and a callback is that promises allow you to structure how you handle the callbacks and, importantly, in what order.
I recently wrote a method that had to make 30 api calls with each call dependent on the results of the previous one (this was not a well designed api). Can you imagine trying to do that with callbacks? As it was, I created an array of promises and used jQuery.when() to handle things when all the api calls had completed.
For the moment we need to use a library for promises. jQuery: https://api.jquery.com/jquery.deferred/ is the obvious one but there are various other implementations that do much the same thing.
Update:
The question relates more specifically to the passing of arguments between callbacks and modifying the arguments as execution moves between them. This can be done easily by passing whatever info you need as an argument to your resolve method.
Typically it looks something like this (using jQuery):
var myAsyncMethod = function(info) {
    var deferred = $.Deferred();
    $.getJSON(myUrl, function(dataFromServer) {
        // Do stuff with the data
        var newData = doSomething(dataFromServer);
        deferred.resolve(newData);
    });
    return deferred.promise();
};
// Make the initial method call
myAsyncMethod(myInitialData).then(
    function(transformedData) {
        // transformed data from the server is returned here
    }
);
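If several such calls need to finish before you continue (the jQuery.when() case mentioned above), you can collect the promises and wait on all of them; input1/input2/input3 are placeholder arguments:
var requests = [input1, input2, input3].map(myAsyncMethod);

// $.when resolves once every promise in the list has resolved
$.when.apply($, requests).then(function () {
    // each argument is the resolved value of the corresponding promise
    var results = Array.prototype.slice.call(arguments);
    // do something with results here
});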

Node.JS - Q Module - Promises

I'm hoping someone who uses the Q module in Node.js can advise me.
So, my question is:
Is it typically necessary to create more than one promise if you are trying to perform multiple functions in sequence?
For example, I'm trying to create an application that:
1) Read data from a file (then)
2) Opens a database connection (then)
3) Executes a query via the database connection (then)
4) Store JSON dataset to a variable (then)
5) Close Database connection. (then)
6) Perform other code base on the JSON dataset I've stored to a variable.
In raw Node.js, each method of an object expects a callback, and in order to do these tasks in the proper order (and not simultaneously, which wouldn't work), I have to chain these callbacks together using an ugly amount of code nesting.
I discovered the Q module, which prevents nesting via the Promise concept.
However, in my first attempt at using Q, I'm trying to make a promise out of everything, and I think I might be over-complicating my code.
I think that maybe you only really have to create one promise object to perform the steps mentioned above, and that it may be unnecessary for me to convert every method to a promise via the Q.denodeify method.
For example, in my code I will be connecting to a db2 database using the ibm_db module. Probably due to misunderstanding, I've converted all the ibm_db methods to promises like this:
var ibmdb = require('ibm_db');
var q = require('q');
var ibmdbOpen = q.denodeify(ibmdb.open);
var ibmdbConn = q.denodeify(ibmdb.conn);
var ibmdbClose = q.denodeify(ibmdb.close);
var ibmdbQuery = q.denodeify(ibmdb.query);
Is this really necessary?
In order to do one thing after another in Q, is it necessary for me to denodeify each method I'll be using in my script?
Or, can I just create one promise at the beginning of the script and use the q.then method to perform all the asynchronous functions in sequence (without blocking)?
Is it typically necessary to create more than one promise if you are trying to perform multiple functions in sequence?
Yes, definitely. If you didn't have promises for all the intermediate steps, you'd have to use callbacks for them - which is just what you were trying to avoid.
I'm trying to make a promise out of everything
That should work fine. Indeed, you should try to promisify on the lowest possible level - the rule is to make a promise for everything that is asynchronous. However, there is no reason to make promises for synchronous functions.
Especially your steps 4 and 5 trouble me. Storing something in a variable is hardly needed when you have a promise for it - one might even consider this an antipattern. And the close action of a database should not go in a then handler - it should go in a finally handler instead.
I'd recommend not using a linear chain but rather:
readFile(…).then(function(fileContents) {
    return ibmdbOpen(…).then(function(conn) {
        return ibmdbQuery(conn, …).finally(function() {
            return ibmdbClose(conn);
        });
    }).then(function(queriedDataset) {
        …
    });
});

Asserting values in node.js

I have a function,
Edit 1 - Updated the function with the real one; the previous one was a simplified synchronous function, and the code would have worked, as correctly pointed out by @AlexMA in the comments.
'returnSuccessOrFailure': function () {
    return driver.findElement(wd.By.css('div#button')).then(function (button) {
        return button.getAttribute('class').then(function (status) {
            return status;
        });
    });
}
In my node.js test, my assertion is failing because the assert is called before returnSuccessOrFailure finishes execution.
var value = returnSuccessOrFailure();
assert.equal(value,'success', 'Looks like something failed');
If I implement a promise in returnSuccessOrFailure and chain my assert, then that works. My question is: do I have to implement promises all the time for such situations to block the execution? I am new to JavaScript and its async nature, and any insight on when to use promises and when not to would be useful.
You don't have to "implement a promise", just return the one you already have:
'returnSuccessOrFailure': function () {
    return driver.findElement(wd.By.css('div#button')).then(function (button) {
        ...
But then, yes, you do still need to put your assert in a done handler:
returnSuccessOrFailure().done(function(value) {
    assert.equal(value, 'success', 'Looks like something failed');
});
Chaining your asserts will not only make it work but will also make for more readable code. Knowing what happens in what order can be useful when going back to refactor. Not only that, but the structure of callbacks/promises also allows for easily written timer tests.
Also, since your test needs to have the current state of execution, it is more than likely that writing tests with asserts in callbacks is what you will need anyway.
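For example, a minimal Mocha-style test (assuming a test runner that provides it and done) might look like:
it('marks the button as successful', function (done) {
    returnSuccessOrFailure().then(function (value) {
        assert.equal(value, 'success', 'Looks like something failed');
        done();
    }, done); // report failures through done as well
});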
My question is do I have to implement promises all the time for such situations to block the execution?
Notice that promises don't block execution. They defer the execution of the code that depends on the result; notice that you're still chaining callbacks on them.
I am new to Javascript and the async nature of it and any insight when to use promises and when not to would be useful.
Promises are useful wherever you have some code that might run asynchronously and needs to pass back an asynchronous result. Otherwise you would need to use callbacks, which are way more ugly than promises.
This is part of code contracts and representing preconditions (what holds before you execute), postconditions (what holds after you execute), and object invariants (what cannot change). JavaScript does not have native support for this, but you can use third-party libraries (Cerny.js, ecmaDebug, jsContract, or jscategory).
I think it depends on your coding style, is it EAFP(Easier to ask for forgiveness than permission) or LBYL(Look before you leap). Both are viable! In most compiled languages you would use LBYL. However in Python for example you would use EAFP.
Generally if you know you will fail you want to fail fast. If you like to use assertions to ensure code fails fast it is up to you.

Multiple asynchronous public API calls (rails+node.js or reactive js)

I'm trying to make non-blocking calls to three public APIs, i.e. websites A, B, and C, and then forward the results back to the Rails app as JSON data. I asked on another forum whether this is possible in Node.js; it seems it is, and someone pointed me to this solution, which involves using the Node.js Step module and the async library:
Step(
    // Make 3 async calls in parallel
    function loadStuff() {
        getResultFromSiteA(params1, this.parallel());
        getResultFromSiteB(params2, this.parallel());
        getResultFromSiteC(params3, this.parallel());
    },
    // Pass the result to Rails when you're done
    function passOntoRails(err, resultsA, resultsB, resultsC) {
        if (err) { throw err; }
        passResultsToRails(resultsA, resultsB, resultsC);
    }
);
Recently I also found a similar question here. The answer suggests using the forkJoin operator, available in a JS extension I've never heard of: 'reactive js'.
So from what I can understand there are two ways of doing this: the first through Node.js, and the second through multiple simple asynchronous AJAX calls from the client side using 'reactive'.
I'd like to know if one way simply performs better/faster than the other. Thanks, any opinions/answers/suggestions would be appreciated.
Well the idea is the same, but the first approach is for the server (Node.js) and the second one is for the browser (which you don't need in this case).
Since you have N asynchronous tasks that need to be resolved and only then (after all the tasks are executed and the results returned) can you send the data back to Rails, then using either Step or Async is fine.
How do they work behind the scenes? Well, you have N tasks; after each task is resolved, N becomes N-1, and so on until N == 0, at which point the callback function gets executed with the desired data.
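For comparison, the same flow with the async library's parallel looks like this (a sketch; getResultFromSiteA/B/C and passResultsToRails are the same placeholder functions as in the Step snippet above):
var async = require('async');

async.parallel([
    function (cb) { getResultFromSiteA(params1, cb); },
    function (cb) { getResultFromSiteB(params2, cb); },
    function (cb) { getResultFromSiteC(params3, cb); }
], function (err, results) {
    if (err) { throw err; }
    // results[0..2] correspond to sites A, B and C
    passResultsToRails(results[0], results[1], results[2]);
});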
Read more about flow control in Node.js here:
http://howtonode.org/control-flow
http://howtonode.org/step-of-conductor
http://dailyjs.com/2011/11/14/popular-control-flow/
