Node.js: Multiple loops of HTTP requests execute simultaneously - JavaScript

nodejs multiple http requests in loop
The question above explains how to perform a loop of HTTP requests over an array of URLs, which works fine. What I am trying to achieve is to perform a second loop of HTTP requests that runs only after the first loop has completed, i.e. it should wait for all requests in the first loop to finish.
// Import http
var http = require('http');
// First URLs array
var urls_first = ["http://www.google.com", "http://www.example.com"];
// Second URLs array
var urls_second = ["http://www.yahoo.com", "http://www.fb.com"];
var responses = [];
var completed_requests = 0;
function performHTTP(array) {
  for (var i in array) {
    http.get(array[i], function(res) {
      responses.push(res);
      completed_requests++;
      if (completed_requests == array.length) {
        // All downloads done, process the responses array
        console.log(responses);
      }
    });
  }
}
In the snippet above, I added a second array of URLs and wrapped the for loop inside a function so that the array can change on each call. Since I have to wait for the first loop to complete, I tried async/await like below.
async function callMethod() {
  await new Promise(resolve => performHTTP(urls_first)); // executing the function with the first array
  await new Promise(resolve => performHTTP(urls_second)); // executing the function with the second array
}
But in this case both function calls execute simultaneously, i.e. the code does not wait for the first array to finish before starting the second. I need the second loop to run only after the first one has completed.

You need to wrap each request in a Promise:
function request(url) {
  return new Promise((resolve, reject) => {
    http.get(url, function(res) {
      // ... your code here ... //
      // when your work is done, call resolve to settle the promise
      resolve();
    });
  });
}
And then await all your requests:
// Batch all your first requests in parallel and wait for all of them
await Promise.all(urls_first.map(url => request(url)));
// Do the same with the second URL array
await Promise.all(urls_second.map(url => request(url)));
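These awaits need to live inside an async function; a minimal sketch tying it together with the same request helper as above:

async function callMethod() {
  // first batch: all requests run in parallel, and we wait for every one of them
  await Promise.all(urls_first.map(url => request(url)));
  // the second batch starts only after the first has fully completed
  await Promise.all(urls_second.map(url => request(url)));
}

callMethod();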
Note: this code is not tested and may contain some mistakes, but the main principle is there.
More information about Promise : https://developer.mozilla.org/fr/docs/Web/JavaScript/Reference/Objets_globaux/Promise

Check out how to use .then() to call the second performHTTP right after the first one completes.
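For example (a sketch, assuming performHTTP is rewritten to return a Promise that resolves once all of its requests have finished):

performHTTP(urls_first)
  .then(() => performHTTP(urls_second))
  .then(() => console.log('both batches done'))
  .catch(err => console.error(err));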

You can run the calls one after another using the async-series package.
https://www.npmjs.com/package/async-series
var series = require('async-series');

series([
  function(done) {
    console.log('First API call here');
    done(); // if you pass an error to done() it will be caught in the error handler below
  },
  function(done) {
    console.log('Second API call here');
    done();
  }
], function(err) {
  if (err) {
    console.log(err.message);
  } else {
    // handle success here - all calls completed in order
  }
});
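If you would rather iterate over the URL arrays themselves, the main async package (a different module from async-series, so this is only a sketch) provides eachSeries, which runs one iteration at a time:

var async = require('async');
var http = require('http');

// fetch each URL in urls_first one at a time; the next request
// starts only after the previous callback has fired
async.eachSeries(urls_first, function(url, callback) {
  http.get(url, function(res) {
    responses.push(res);
    callback(); // signal that this URL is done
  }).on('error', callback);
}, function(err) {
  if (err) return console.error(err);
  console.log('first batch finished', responses);
});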

Related

return doesn't wait for the cycle to end

I have a problem with my code (TypeScript):
async getAllServers(@Res() response) {
  const servers = await this.serverService.getAllServers();
  let bot = [];
  servers.map(async server => {
    console.log(server.id);
    bot.push(await this.serverService.getInfo(server.id));
    console.log(bot);
  });
  return response.status(HttpStatus.OK).json({
    bot,
    servers
  });
}
This function needs to return two arrays, but the second array (bot) is always empty.
This is because the return is executed before the loop finishes.
How can I execute the return only when the loop has finished?
Thanks in advance.
This happens because your map callback is an async function: each call is scheduled on the microtask queue and only runs to completion after the current call stack is empty, so the return executes first.
To be able to return a populated bot array, you need to wait for these async calls to complete.
async getAllServers(@Res() response) {
  const servers = await this.serverService.getAllServers();
  let bot = [];
  let botServerCalls = [];
  // collect the API call Promises in an array so they can run in parallel
  botServerCalls = servers.map(server => {
    console.log(server.id);
    // assuming this returns a Promise for the API call
    return this.serverService.getInfo(server.id);
  });
  // use Promise.all to run the API calls in parallel and wait for all of them to resolve
  bot = await Promise.all(botServerCalls);
  return response.status(HttpStatus.OK).json({
    bot,
    servers
  });
}
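If the calls had to run one at a time instead of in parallel (for example to avoid hammering the API), a for...of loop with await inside the async method would work as well; a sketch of that variant:

async getAllServers(@Res() response) {
  const servers = await this.serverService.getAllServers();
  const bot = [];
  // each getInfo call starts only after the previous one has resolved
  for (const server of servers) {
    bot.push(await this.serverService.getInfo(server.id));
  }
  return response.status(HttpStatus.OK).json({ bot, servers });
}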

javascript promise executes code synchronously despite the promise

So basically I have a web application that retrieves data from Firebase. Since the queries take time, I used promises in JavaScript so that my code executes at the right time. In the function getDataFirebase, data is retrieved from Firebase and pushed to an array called Collect. After pushing one row, it queries Firebase again for another table of data, then the loop continues. So I create a promise before calling Firebase and resolve it afterwards. But I am not doing that for getDataToUsers.once("value"): there I am firing the event without waiting for it to return, so the for loop keeps going, all the callbacks are processed later, and resolve is called at the end of the for loop, which is too late by then.
I added the async keyword to that query callback, hoping it would make the for loop wait for the job to complete, but all it actually does is make the callback return a promise, which is ignored by once. I have been debugging this for a while, hoping the data would be populated at the right time. Can someone please help me?
var promise = getDataFirebase();
promise.then(function () {
  console.log(Collect);
  console.log("first");
  return getDataFirebaseUser();
}).then(function () {
  console.log("Second");
});
function getDataFirebase() {
  return new Promise(function (resolve, reject) {
    refReview.on("value", function (snap) {
      var data = snap.val();
      for (var key in data) {
        Collect.push({
          "RevieweeName": data[key].revieweeID.firstname.concat(" ", data[key].revieweeID.lastname),
          "ReviewerName": data[key].reviewerID.firstname.concat(" ", data[key].reviewerID.lastname),
          rating: data[key].rating,
          content: data[key].content,
          keyOfReviewee: data[key].revieweeID.userID
        })
        var getDataToUsers = firebase.database().ref("users").child(data[key].revieweeID.userID);
        getDataToUsers.once("value", async function (snap) {
          var fnLn = snap.val();
          var first = fnLn.isTerminated;
          console.log("terminateStatus", first);
        });
      } //end of for loop
      resolve();
    }); //end of snap
  });
}
The console output with this code is as follows:
Collect(array)
first
Second
terminateStatus, 1
but it should be:
Collect(array)
first
terminateStatus, 1
Second
First of all, you have misunderstood the concepts of async/await in JavaScript. Please read the MDN manual and some examples to understand how to use them.
Coming back to your problem: adding the async keyword to the callback function in your example does not change anything. Firebase will call that function when it has fetched the data, whether it is async or not. And before that data arrives, you call resolve at the end of the for loop, settling the promise. So the promise is resolved before you fetch the user data, which is why terminateStatus is logged at the end.
You could instead use the Promise returned by the once method and await it, so the for loop does not move on until each piece of data has been fetched. To do that, you have to make the first callback async. See this:
function getDataFirebase() {
  return new Promise(function (resolve, reject) {
    refReview.on("value", async function (snap) {
      var data = snap.val();
      for (var key in data) {
        Collect.push({
          "RevieweeName": data[key].revieweeID.firstname.concat(" ", data[key].revieweeID.lastname),
          "ReviewerName": data[key].reviewerID.firstname.concat(" ", data[key].reviewerID.lastname),
          rating: data[key].rating,
          content: data[key].content,
          keyOfReviewee: data[key].revieweeID.userID
        })
        var getDataToUsers = firebase.database().ref("users").child(data[key].revieweeID.userID);
        var snapshot = await getDataToUsers.once("value");
        var fnLn = snapshot.val();
        var first = fnLn.isTerminated;
        console.log("terminateStatus", first);
      } //end of for loop
      resolve();
    }); //end of snap
  });
}
However, this is still not a great piece of code, because you call the refReview.on method inside a new promise on every call. You should either call it once in your app and attach a callback, or use the refReview.once method if you only intend to fetch the data once per call.
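For example, since once returns a Promise when no callback is passed, the outer wrapper promise is not needed at all. A rough, untested sketch of that shape, reusing the names from the question:

async function getDataFirebase() {
  // once("value") resolves with the snapshot, so no manual resolve() is needed
  var snap = await refReview.once("value");
  var data = snap.val();
  for (var key in data) {
    Collect.push({
      RevieweeName: data[key].revieweeID.firstname.concat(" ", data[key].revieweeID.lastname),
      ReviewerName: data[key].reviewerID.firstname.concat(" ", data[key].reviewerID.lastname),
      rating: data[key].rating,
      content: data[key].content,
      keyOfReviewee: data[key].revieweeID.userID
    });
    var userSnap = await firebase.database().ref("users").child(data[key].revieweeID.userID).once("value");
    console.log("terminateStatus", userSnap.val().isTerminated);
  }
}

Since an async function returns a promise, the existing promise.then(...) calls at the top still work unchanged.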

Promise to run nested promises sequentially and resolve upon first reject

I'm struggling to send multiple AJAX calls for the following task. The API takes two parameters, userId and offsetValue, and returns the last 10 messages for the specified user, starting from the specified offset. If the offset is greater than the total count of messages for the user, the API returns an empty string.
I wrote a function that returns an individual promise to get 10 messages for a specified userId and offsetValue.
function getMessages(userId, offsetValue) {
  return new Promise(function (resolve, reject) {
    $.ajax({
      url: 'https://example.com/api.php',
      type: 'POST',
      data: {
        action: 'get_messages',
        offset: offsetValue,
        user: userId
      },
      success: function (response) {
        if (response != '') {
          resolve(response);
        } else {
          reject(response);
        }
      },
      error: function (response) {
        reject(response);
      }
    });
  });
}
I need to run parallel tasks using .all() for multiple userId values, but I cannot run parallel subtasks for each userId (incrementing offsetValue by 10 each time), because I don't know in advance how many messages each user has, so the execution should stop when the first individual promise is rejected (i.e. when offsetValue exceeds the total message count). Something like this:
var messages = '';
getMessages('Alex', 0)
  .then(function(result) {
    messages += result;
    getMessages('Alex', 10)
      .then(function(result) {
        messages += result;
        getMessages('Alex', 20)
        ....
      });
  });
So, is there any way to run these promises sequentially, incrementing the parameter one step at a time, and resolve the overall concatenated result upon the first reject?
First off, you want to avoid the promise anti-pattern where you unnecessarily wrap your code in a new promise when $.ajax() already returns a promise that you can just use. To fix that, you would change to this:
// retrieves block of messages starting with offsetValue
// resolved response will be empty if there are no more messages
function getMessages(userId, offsetValue) {
  return $.ajax({
    url: 'https://example.com/api.php',
    type: 'POST',
    data: {
      action: 'get_messages',
      offset: offsetValue,
      user: userId
    }
  });
}
Now, for your main question. Given that you want to stop requesting new items when you get a rejection or empty response and you don't know how many requests there will be in advance, you pretty much have to request things serially and stop requesting the next one when you get an empty response or an error. The key to doing that is to chain sequential promises together by returning a new promise from a .then() handler.
You can do that like this:
function getAllMessagesForUser(userId) {
  var offsetValue = 0;
  var results = [];
  function next() {
    return getMessages(userId, offsetValue).then(function(response) {
      // if response not empty, continue getting more messages
      if (response !== '') {
        // assumes API is returning 10 results at a time
        offsetValue += 10;
        results.push(response);
        // chain next request promise onto prior promise
        return next();
      } else {
        // empty response means we're done retrieving messages
        // so just return results accumulated so far
        return results.join("");
      }
    });
  }
  return next();
}
This creates an internal function that returns a promise and each time it gets some messages, it chains a new promise onto the original promise. So, getAllMessagesForUser() returns a single promise that resolves with all the messages it has retrieved or rejects with an error.
You would use it like this:
getAllMessagesForUser('Bob').then(function(messages) {
  // got all messages here
}, function(err) {
  // error here
});
You could parallelize multiple users (as long as you're sure you're not overloading the server or running into a rate limiting issue) like this:
$.when.apply($, ['Bob', 'Alice', 'Ted'].map(function(item) {
  return getAllMessagesForUser(item);
})).then(function() {
  // get results into a normal array
  var results = Array.prototype.slice.call(arguments);
});
P.S. Promise.all() is much nicer to use than $.when() (since it takes an array and resolves to an array), but since there are already jQuery promises involved here and I didn't know your browser compatibility requirements, I stuck with jQuery promise management rather than ES6 standard promises.
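For reference, the same parallel fan-out with ES6 promises would look roughly like this (assuming Promise.all is available in your target browsers):

Promise.all(['Bob', 'Alice', 'Ted'].map(function(user) {
  return getAllMessagesForUser(user);
})).then(function(results) {
  // results is an array of concatenated message strings, one entry per user
}, function(err) {
  // first error encountered
});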

Node JS, chaining variable number of http requests

I'm using the request library to make HTTP requests to my API. A single API call looks like this:
var options = {
  method: "post",
  url: 'http://example.com',
  json: true,
  headers: headers,
  body: {key: value}
};
request(options, callback);
However, I have an array of options objects that need to be called one after another, and I need to break the whole chain if one of them fails.
When the last one in the chain finishes, I need to output the result to the console.
I know that chaining callbacks can be done with promises, but all the examples I have found use a predefined number of chained requests.
Is it possible?
A recursive function which calls itself in the request callback should work.
var options = [{...}, {...}];

function doRequests(options){
  request(options.shift(), function(error, response, body){
    if (error) {
      return "handle error";
    }
    if (options.length > 0) {
      doRequests(options);
    }
  });
}
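Since you also want to log the result once the last request finishes, the same recursion can carry an accumulator and a completion callback. A sketch (untested; optionsArray stands for your array of request options):

function doRequests(options, results, onDone) {
  request(options.shift(), function(error, response, body) {
    if (error) {
      return onDone(error); // break the chain on the first failure
    }
    results.push(body);
    if (options.length > 0) {
      doRequests(options, results, onDone);
    } else {
      onDone(null, results);
    }
  });
}

doRequests(optionsArray.slice(), [], function(err, results) {
  if (err) return console.error(err);
  console.log(results); // bodies of all responses, in order
});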
The first thing I would do would be to use a request library that returned a promise. Assuming you have such a thing then you just chain the promises together.
First create a resolved promise:
var promise = Promise.resolve();
Then for each new object you want to request:
promise = promise.then(() => requestToPromise(options));
will chain another request onto the existing promise and will fire off a new request only when the previous one has completed.
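Spelled out over an array of options (requestToPromise is whatever promise-returning wrapper you end up with, and optionsArray is your array of options), the loop looks like this, and a single .catch at the end stops the chain on the first failure:

var promise = Promise.resolve();
optionsArray.forEach(function(options) {
  // each request is queued behind the previous one
  promise = promise.then(function() {
    return requestToPromise(options);
  });
});
promise
  .then(function() { console.log('all requests finished'); })
  .catch(function(err) { console.error('chain stopped:', err); });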
If you have an array, you can have an index into that array, and have the callback kick off the next request when the previous one finishes. Roughly:
var index = 0;
var options = [/*...array of options objects...*/];

function doRequest() {
  request(options[index], function(err, result) {
    // ...handle the result/error here. If there's no error, then:
    if (++index < options.length) {
      // Kick off the next request
      doRequest();
    }
  });
}
While the above can be Promise-ified, since your request method appears not to return a promise, doing so would just complicate things.
You can instead use request-promise
and do the following
var request = require('request-promise');

var options = [/*...array of options objects...*/];
var requests = [];

options.forEach(function(option){
  requests.push(request(option));
});

Promise.all(requests).then(function(responses){
  // all requests are done
});
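Note that Promise.all fires all the requests at once. If they really have to run one after another and stop at the first failure, request-promise calls can also be chained sequentially, e.g. with async/await (a sketch, not tested):

var request = require('request-promise');

async function runInOrder(optionsList) {
  var results = [];
  for (var i = 0; i < optionsList.length; i++) {
    // await makes each request wait for the previous one;
    // a rejection throws and breaks the chain immediately
    results.push(await request(optionsList[i]));
  }
  return results;
}

runInOrder(options)
  .then(function(results) { console.log(results); })
  .catch(function(err) { console.error(err); });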

How to hold my response in nodejs until the end of my loop

My code is as follows :
myinsts.forEach(function (myinstId) {
  Organization.getOrgById(myinstId, function (err, insts) {
    res.json(insts);
  });
});
I'm using Node.js and I'm getting the error "Can't set headers after they are sent". Obviously my server sends the response on the first iteration; how can I make it wait until I have the whole data?
It's not going to be pretty. Essentially what you could do is create a variable to hold the data through every iteration and then check if that's the last callback to be called. If it is, then you can output your json. Try something like the following:
var _counter = 0;
var _values = [];

myinsts.forEach(function (myinstId) {
  Organization.getOrgById(myinstId, function (err, insts) {
    _values.push(insts);
    if (++_counter == myinsts.length)
      res.json(_values);
  });
});
You may want to use Promises for this kind of work: you can execute the async calls in parallel with Promise.all() and get the data back in the order the calls were made.
var promises = [];

myinsts.forEach(function (myinstId) {
  promises.push(new Promise((resolve, reject) => {
    Organization.getOrgById(myinstId, function (err, insts) {
      if (err) return reject(err);
      resolve(insts);
    });
  }));
});

Promise.all(promises)
  .then((allInsts) => { res.json(allInsts); }) // all data fetched, in loop order
  .catch((error) => { console.log(error); });
Also consider making your getOrgById handler return a Promise instead of taking a callback.
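For example, with Node's util.promisify (assuming getOrgById follows the standard error-first callback convention), the handler shrinks to something like:

const { promisify } = require('util');
// bind so getOrgById keeps its `this` if it relies on it
const getOrgByIdAsync = promisify(Organization.getOrgById.bind(Organization));

Promise.all(myinsts.map(id => getOrgByIdAsync(id)))
  .then(allInsts => res.json(allInsts))
  .catch(err => console.log(err));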
You have to send the response just once, not several times; your code sends a response on each iteration. Send the response only when a specific condition is fulfilled, like below.
myinsts.forEach(function (myinstId, index) {
  Organization.getOrgById(myinstId, function (err, insts) {
    if (index == myinsts.length - 1) {
      res.json(insts);
    }
  });
});
