Node asynchronous calls in sequence - javascript

I have a couple of asynchronous requests that fetch some data from a url. The problem I am having is that I actually want to delay sending the json response until all requests have come back. The code looks something like this:
function getFirstStuff(callback) // imagine this is async
{
    console.log('gettingFirstStuff');
    callback(stuff);
}
function getFurtherStuff(callback) // imagine this is async
{
    console.log('gettingFurtherStuff');
    callback(thing);
}
function getStuff(callback)
{
    getFirstStuff(function(stuff) // async
    {
        // stuff is an array of 3 items
        stuff = stuff.map(function(item) // map is synchronous
        {
            // For each item in stuff make another async request
            getFurtherStuff(function(thing) { // this is also async
                item.thing = thing;
            });
            return item;
        });
        callback(stuff);
    });
}
router.get('/getstuff', function(req, res, next) {
    getStuff(function(stuff)
    {
        console.log('finished stuff');
        // RETURN RESPONSE AS JSON
        res.json(stuff);
    });
});
The output will be:
gettingFirstStuff
finished stuff
gettingFurtherStuff
gettingFurtherStuff
gettingFurtherStuff
but it should be:
gettingFirstStuff
gettingFurtherStuff
gettingFurtherStuff
gettingFurtherStuff
finished stuff
I understand that the reason is that getFurtherStuff is async, so item is returned from map before the getFurtherStuff async calls have come back with a result. My question is: what is the standard way to wait for these calls to finish before calling the final callback, callback(stuff)?

There are a bunch of ways to solve this problem. Libraries like async and queue would probably be the best choice, if you have no problem adding dependencies.
The easiest option without external libs is just to count the async jobs and finish when they're all done:
// assuming stuff is an array
var counter = 0;
var jobCount = stuff.length;
// wrap the final callback in one that checks the counter
var doneCallback = function() {
    if (counter >= jobCount) {
        // all jobs have reported back; we're ready to go
        callback(stuff);
    }
};
// run jobs
stuff.forEach(function(item) {
    getFurtherStuff(item, function(itemThing) {
        // process async response
        item.thing = itemThing;
        // increment counter
        counter++;
        // call the wrapped callback, which won't fire
        // until all jobs are complete
        doneCallback();
    });
});
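This counting pattern is essentially what Promise.all gives you for free. A minimal sketch: getFurtherStuffAsync below is a hypothetical promisified stand-in for getFurtherStuff, simulated with a timer.

```javascript
// Hypothetical promise wrapper standing in for a promisified getFurtherStuff
function getFurtherStuffAsync(item) {
  return new Promise(function (resolve) {
    setTimeout(function () {
      resolve('thing-for-' + item);
    }, 10);
  });
}

function getStuff(callback) {
  var stuff = ['a', 'b', 'c'];
  // start all requests at once, then wait until every one has resolved
  Promise.all(stuff.map(getFurtherStuffAsync)).then(function (things) {
    callback(things);
  });
}

getStuff(function (things) {
  console.log('finished stuff', things);
});
```

Promise.all preserves the order of the input array regardless of which request finishes first.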

npm install async
You would then simply throw your functions into an async.parallel() call.
More info at https://www.npmjs.com/package/async
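Under the hood, async.parallel is essentially the counting pattern shown above. A dependency-free sketch of its contract (each task takes a Node-style callback; a final callback fires once after all tasks finish, or on the first error):

```javascript
// Minimal stand-in for async.parallel: run every task concurrently,
// collect results in task order, call finalCallback exactly once.
function parallel(tasks, finalCallback) {
  const results = new Array(tasks.length);
  let remaining = tasks.length;
  let failed = false;
  tasks.forEach((task, i) => {
    task((err, result) => {
      if (failed) return;            // a previous task already errored
      if (err) {
        failed = true;
        return finalCallback(err);
      }
      results[i] = result;
      if (--remaining === 0) finalCallback(null, results);
    });
  });
}

// usage: both "requests" run concurrently; results keep task order
parallel(
  [
    cb => setTimeout(() => cb(null, 'first'), 20),
    cb => setTimeout(() => cb(null, 'second'), 5),
  ],
  (err, results) => {
    if (err) throw err;
    console.log(results);
  }
);
```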

Related

How to execute variable number of async calls(coming dynamically at runtime) serially?

I am making a Chrome extension (MV3). Based on user activity, content.js passes a message to background.js, which then calls an async function to add data to Google Docs using the Docs API.
I want each request to execute only after the previous one has finished running. I am using chrome.runtime.sendMessage to send a message from content.js and don't see a way of calling background.js serially from there. So I need a way of executing them one by one in background.js only. The order of these requests is also important (but if the order of the requests gets changed by one/two places, I think that would still be okay from a user perspective).
I tried something and it is working, but I am not sure if I am missing some edge cases, because I was unable to find this approach in any other answers:
Semaphore-like queue in javascript?
Run n number of async function before calling another method in nodejs
JavaScript: execute async function one by one
The approach I used is: I use a stack like structure to store requests, use setInterval to check for any pending requests and execute them serially.
content.js:
chrome.runtime.sendMessage({message});
background.js:
let addToDocInterval = "";
let addToDocCalls = [];

async function addToDoc(msg) {
    // Await calls to doc API
}

async function addToDocHelper() {
    if (addToDocCalls.length === 0)
        return;
    clearInterval(addToDocInterval);
    while (addToDocCalls.length > 0) {
        let msg = addToDocCalls.shift();
        await addToDoc(msg);
    }
    addToDocInterval = setInterval(addToDocHelper, 1000);
}

chrome.runtime.onMessage.addListener((msg) => {
    // Some other logic
    addToDocCalls.push(msg);
});

addToDocInterval = setInterval(addToDocHelper, 1000);
Is this approach correct? Or is there any better way to do this?
I'd suggest changing several things.
Don't use timers polling the array. Just initiate processing the array anytime you add a new item to the array.
Keep a flag recording whether you're already processing the array so you don't start duplicate processing.
Use a class to encapsulate this functionality into an object.
Encapsulate the addToDocCalls array and adding to it so your class is managing it and outside code just calls a function to add to it which also triggers the processing. Basically, you're making it so callers don't have to know how the insides work. They just call helper.addMsg(msg) and the class instance does all the work.
Here's an implementation:
async function addToDoc(msg) {
    // Await calls to doc API
}

class docHelper {
    constructor() {
        this.addToDocCalls = [];
        this.loopRunning = false;
    }
    addMsg(msg) {
        // add item to the queue and initiate processing of the queue
        this.addToDocCalls.push(msg);
        this.process();
    }
    async process() {
        // don't run this loop twice if we're already running it
        if (this.loopRunning) return;
        try {
            this.loopRunning = true;
            // process all items currently in the queue
            while (this.addToDocCalls.length > 0) {
                let msg = this.addToDocCalls.shift();
                await addToDoc(msg);
            }
        } finally {
            this.loopRunning = false;
        }
    }
}
const helper = new docHelper();

chrome.runtime.onMessage.addListener((msg) => {
    // Some other logic
    helper.addMsg(msg);
});
So, process() will run until the array is empty. Any interim calls to addMsg() while process() is running will add more items to the array and call process() again, but the loopRunning flag keeps it from starting a duplicate processing loop. If addMsg() is called while process() is not running, it will start the processing loop.
P.S. You also need to figure out what sort of error handling you want if addToDoc(msg) rejects. This code protects the this.loopRunning flag if it rejects, but doesn't actually handle a rejection error. In code like this that is processing a queue, often all you can really do is log the error and move on, but you need to decide what the proper course of action is on a rejection.
You don't need to use setInterval, and you do not even need a while loop.
let addToDocCalls = [];
let running = false;

async function addToDoc(msg) {
    // Await calls to doc API
}

async function addToDocHelper() {
    if (running || addToDocCalls.length === 0)
        return;
    running = true;
    let msg = addToDocCalls.shift();
    try {
        await addToDoc(msg);
    } catch (e) {
        // log and move on so one failure doesn't stall the queue
        console.error(e);
    }
    running = false;
    addToDocHelper();
}

chrome.runtime.onMessage.addListener((msg) => {
    // Some other logic
    addToDocCalls.push(msg);
    addToDocHelper();
});
The code should be self explanatory. There is no magic.
Here is a generic way to run async tasks sequentially (and add more tasks to the queue at any time).
const tasks = [];
let taskInProgress = false;

async function qTask(newTask) {
    if (newTask) tasks.push(newTask);
    if (tasks.length === 0) return;
    if (taskInProgress) return;
    const nextTask = tasks.shift();
    taskInProgress = true;
    try {
        await nextTask();
    } finally {
        taskInProgress = false;
        // use setTimeout so the call stack can't overflow
        setTimeout(qTask, 0);
    }
}
// the code below is just used to demonstrate the code above works
async function test() {
    console.log(`queuing first task`);
    qTask(async () => {
        await delay(500); // pretend this task takes 0.5 seconds
        console.log('first task started');
        throw 'demonstrate error does not ruin task queue';
        console.log('first task finished'); // unreachable, on purpose
    });
    for (let i = 0; i < 5; i++) {
        console.log(`queuing task ${i}`);
        qTask(async () => {
            await delay(200); // pretend this task takes 0.2 seconds
            console.log(`task ${i} ran`);
        });
    }
    await delay(1000); // wait 1 second
    console.log(`queuing extra task`);
    qTask(async () => {
        console.log('extra task ran');
    });
    await delay(3000); // wait 3 seconds
    console.log(`queuing last task`);
    qTask(async () => {
        console.log('last task ran');
    });
}
test();

function delay(ms) {
    return new Promise(resolve => {
        setTimeout(resolve, ms);
    });
}

Nodejs: Fully executing websocket responses sequentially with async calls

Below is a simple example of what I'm currently working with: a websocket stream which does some asynchronous calls as part of the logic when consuming the incoming data. I'm mimicking async calls with a Promise-ified setTimeout function:
function someAsyncWork() {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve('async work done');
        }, 5);
    });
}

async function msg() {
    const msg = await someAsyncWork();
    console.log(msg);
}

const main = async () => {
    web3.eth.subscribe('pendingTransactions').on("data", async function(tx) {
        console.log('1st print: ', tx);
        await msg();
        console.log('2nd print: ', tx);
    });
};
main();
Running the above results in a console output like so:
1st print: 0x8be207fcef...
1st print: 0x753c308980...
1st print: 0x4afa9c548d...
async work done
2nd print: 0x8be207fcef...
async work done
2nd print: 0x753c308980...
async work done
2nd print: 0x4afa9c548d...
.
.
.
I get what's happening here. The '1st print' is executed, followed by await-ing the async calls for each piece of data response. The '2nd print' is only executed after the 'async work done' occurs.
However this isn't quite what I'm looking for.
My logic has conditionals in place: each data response first checks a condition against a global variable, then does some async work if the condition is met. The problem is that some data responses go ahead and execute their async work when they shouldn't have: Node.js's event loop hasn't had a chance to move a previous response's async continuations from the callback queue onto the call stack, because the stack was busy processing new incoming data. This means the '2nd prints' (where the global variable is updated) haven't executed before new incoming data is processed. I imagine someAsyncWork is only resolved when there is a quiet period in the websocket with no data incoming.
My question is: is there a way to ensure full, sequential processing of each piece of new data? Ideally the console output would look something like this:
1st print: 0x8be207fcef...
async work done
2nd print: 0x8be207fcef...
1st print: 0x753c308980...
async work done
2nd print: 0x753c308980...
1st print: 0x4afa9c548d...
async work done
2nd print: 0x4afa9c548d...
.
.
.
You can have a queue-like promise that keeps on accumulating promises to make sure they run sequentially:
let cur = Promise.resolve();

function enqueue(f) {
    // chain each new job onto the previous one; the catch keeps one
    // rejected job from poisoning the rest of the chain
    cur = cur.then(f).catch(console.error);
}

function someAsyncWork() {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve('async work done');
        }, 5);
    });
}

async function msg() {
    const msg = await someAsyncWork();
    console.log(msg);
}

const main = async () => {
    web3.eth.subscribe('pendingTransactions').on("data", function(tx) {
        enqueue(async function() {
            console.log('1st print: ', tx);
            await msg();
            console.log('2nd print: ', tx);
        });
    });
};

main();

JavaScript event handler working with an async data store API causing race condition

I need to update some data every time certain browser event fires (for example, when a browser tab closes):
chrome.tabs.onRemoved.addListener(async (tabId) => {
    let data = await getData(); // async operation
    ...                         // modify data
    await setData(data);        // async operation
});
The problem is, when multiple such events fire in quick succession, the async getData() can return a stale result in a subsequent invocation of the event handler before setData() gets a chance to finish in an earlier one, leading to inconsistent results.
If the event handler can execute synchronously then this problem wouldn't occur, but getData() and setData() both are async operations.
Is this a race condition? What's the recommended pattern to handle this type of logic?
--- Update ---
To provide more context, getData() and setData() are simply promisified versions of some Chrome storage API:
async function getData() {
    return new Promise(resolve => {
        chrome.storage.local.get(key, (data) => {
            resolve(data); // callback
        });
    });
}

async function setData(value) {
    return new Promise(resolve => {
        chrome.storage.local.set({ key: value }, () => {
            resolve(); // callback
        });
    });
}
I wrapped the API call in a Promise for readability purposes, but I think it's an async op either way?
You have a fairly classic race condition for a data store with an asynchronous API, and the race condition is even worse if you use asynchronous operations in the processing of the data (between the getData() and setData()). The asynchronous operations allow another event to run in the middle of your processing, ruining the atomicity of your sequence of operations.
Here's an idea for how to put the incoming tabId in a queue and make sure you're only processing one of these events at a time:
const queue = [];

chrome.tabs.onRemoved.addListener(async (newTabId) => {
    queue.push(newTabId);
    if (queue.length > 1) {
        // already in the middle of processing one of these events;
        // just leave the id in the queue, it will get processed later
        return;
    }
    async function run() {
        // we will only ever have one of these "in-flight" at the same time
        try {
            let tabId = queue[0];
            let data = await getData(); // async operation
            ...                         // modify data
            await setData(data);        // async operation
        } finally {
            queue.shift(); // remove this one from the queue
        }
    }
    while (queue.length) {
        try {
            await run();
        } catch (e) {
            console.log(e);
            // decide what to do if you get an error
        }
    }
});
This could be made more generic so it can be reused in more than one place (each with its own queue), like this:
function enqueue(fn) {
    const queue = [];
    return async function(...args) {
        queue.push(args); // add to end of queue
        if (queue.length > 1) {
            // already processing an item in the queue,
            // leave this new one for later
            return;
        }
        async function run() {
            try {
                const nextArgs = queue[0]; // get oldest item from the queue
                await fn(...nextArgs);     // process this queued item
            } finally {
                queue.shift(); // remove the one we just processed from the queue
            }
        }
        // process all items in the queue
        while (queue.length) {
            try {
                await run();
            } catch (e) {
                console.log(e);
                // decide what to do if you get an error
            }
        }
    };
}

chrome.tabs.onRemoved.addListener(enqueue(async function(tabId) {
    let data = await getData(); // async operation
    ...                         // modify data
    await setData(data);        // async operation
}));
JS async/await does not really turn JS code synchronous.
What you could do is wait on the result of getData using Promise.all.
So,
chrome.tabs.onRemoved.addListener(async (tabId) => {
    ... // turns into a composition
    const [data] = await Promise.all([getData()]);
    await setData(data); // async composition
});
Note that await does not actually block the event loop; the VM keeps its list of pending events and simply suspends this handler until the awaited operation has a result.
Be careful that your code stays readable when using compositions.

AngularJS: Waiting for Multiple asynchronous calls in a single function

I have an AngularJS function in which I am making 2 async calls. The 2 calls are not dependent, but the function should only return when both calls are finished and the result is stored in the return variable.
I tried different solutions and ended up using the one shown below, but it's very slow as it waits for the first one to finish.
function myAngularfunction(a, b) {
    var defer = $q.defer();
    var test = {
        a: "",
        b: ""
    };
    msGraph.getClient()
        .then((client) => {
            // First async call
            client
                .api("https://graph.microsoft.com/v1.0/")
                .get()
                .then((groupResponse) => {
                    var result = groupResponse;
                    test.a = result.displayName;
                    msGraph.getClient()
                        .then((client) => {
                            // Second async call
                            client
                                .api("https://graph.microsoft.com/v1.0/teams/")
                                .get()
                                .then((groupResponse) => {
                                    var result = groupResponse;
                                    test.b = result.displayName;
                                    defer.resolve(test);
                                });
                        });
                });
        });
    return defer.promise;
}
Calling the function
myAngularfunction(a, b).then(function(obj)
How can I wait for both calls in the same function without nesting them? Or call the next one without waiting for the first one to finish?
Maybe you can try $.when, like this: $.when(function1, function2).done(() => {}); but you need to add a deferred in function1 and function2.
I think your situation is best suited to use $q.all([promise1, promise2, promise3]).then() syntax.
You are calling msGraph.getClient() multiple times; I think this can be avoided.
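A sketch of that refactor: fetch the client once, then let $q.all wait on both requests concurrently. The $q and msGraph objects below are mock stand-ins (assumptions, not the real Angular service or Graph client) so the shape is runnable outside Angular; only the $q.all structure is the point.

```javascript
// Mock stand-ins (assumptions): $q.all behaves like Promise.all, and
// msGraph.getClient() resolves to a client whose api(url).get() returns
// a promise for an object with a displayName.
const $q = { all: (promises) => Promise.all(promises) };
const msGraph = {
  getClient: () =>
    Promise.resolve({
      api: (url) => ({
        get: () => Promise.resolve({ displayName: 'name of ' + url }),
      }),
    }),
};

function myAngularfunction() {
  // get the client once, then issue both requests concurrently
  return msGraph.getClient().then((client) =>
    $q.all([
      client.api('https://graph.microsoft.com/v1.0/').get(),
      client.api('https://graph.microsoft.com/v1.0/teams/').get(),
    ]).then(([first, second]) => ({
      a: first.displayName,
      b: second.displayName,
    }))
  );
}

myAngularfunction().then((test) => console.log(test));
```

The total wait becomes the slower of the two requests instead of their sum, and no $q.defer() is needed since the chain itself is the returned promise.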

GET request inside of a loop in JavaScript

So, my code looks something like
for (var n = 0; n < object.length; n++) {
    /* Other code */
    $.get(..., function(data) {
        // do stuff
    });
}
Now, the other code executes multiple times, like it should. However, when the get callback runs, it only runs once n has reached object.length. This causes all sorts of errors. n is only incremented in the for loop.
Can you not loop get/post commands? Or if you can, what am I doing wrong? Thanks.
The for-loop won't wait for the $.get call to finish so you need to add some async flow control around this.
Check out async.eachSeries. The done callback below is the key to controlling the loop. After the success/failure of each $.get request you call done(); (or, if there's an error, done(someErr)). This advances the iterator and moves to the next item in the list.
var async = require("async");
var list = ["foo", "bar", "qux"];

async.eachSeries(list, function(item, done) {
    // perform a GET request for each item in the list
    // GET url?somevar=foo
    // GET url?somevar=bar
    // GET url?somevar=qux
    $.get(url, { somevar: item }, function(data) {
        // do stuff, then call done
        done();
    });
}, function(err) {
    if (err) {
        throw err;
    }
    console.log("All requests are done");
});
Do you want all the get requests to be done at the same time or one after the other?
You can do all at once with a forEach loop. forEach works better than a normal for loop here because it gives you a fresh elem and n binding in each iteration, so the callbacks don't all see the final value of n.
object.forEach(function(elem, n) {
    /* Other code */
    $.get(..., function(data) {
        // do stuff
    });
});
But if you want to do the requests one after the other you could do it this way.
(function loop(n) {
    if (n >= object.length) return;
    /* Other code */
    $.get(..., function(data) {
        // do stuff
        loop(n + 1);
    });
})(0);
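With promises, the sequential version reads even more directly as an async loop. A sketch, where httpGet is a hypothetical promise-returning stand-in for $.get:

```javascript
// httpGet is a hypothetical promise-returning stand-in for $.get
function httpGet(item) {
  return new Promise((resolve) =>
    setTimeout(() => resolve('data for ' + item), 5)
  );
}

async function fetchSequentially(items) {
  const results = [];
  for (const item of items) {
    // the next request starts only after the previous one has resolved
    results.push(await httpGet(item));
  }
  return results;
}

fetchSequentially(['foo', 'bar']).then((results) => console.log(results));
```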
