Controlling execution flow in a promise - JavaScript

In the following code, why doesn't the promise inside get_dbinfo resolve prior to executing .then(result) in the calling code block?
My understanding is that the code inside the new Promise will complete before returning to the .then part of the calling statement.
dbFuncs.get_dbinfo()
.then((result) => {
count = result.info.doc_count
if (count < 500){perPage = count};
});
function get_dbinfo() {
  return new Promise((resolve, reject) => {
    return db.info()
      .then((result) => {
        resolve(result);
      }).catch((err) => {
        console.log(err);
        reject(err);
      });
  });
}
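As an aside, db.info() in PouchDB already returns a promise, so get_dbinfo() does not strictly need the new Promise wrapper at all. A minimal sketch of that simplification, keeping the behaviour close to the original by still logging and re-throwing errors:
function get_dbinfo() {
  // db.info() already returns a promise, so it can be returned directly
  return db.info().catch((err) => {
    console.log(err);
    throw err; // keep the rejection visible to callers, like reject(err) did
  });
}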

Figured this out. The first issue was that I was not returning the first call to get_dbinfo. I had left the return off because it marked the parts of the function after the promise as unreachable ("two returns within the same function"), which clued me into the second part of the issue: I was trying to include two distinct promise chains in the same function.
The final solution was to bring everything under a single promise chain, with a return on the first call to dbFuncs, as shown below. No changes were made to get_dbinfo().
var count = 0;
var perPage = 500;
var page = req.query.p || 1;
return dbFuncs.get_dbinfo()
  .then((result) => {
    count = result.doc_count;
    if (count < 500) { perPage = count; }
    return db.find({
      selector: {
        $and: [
          { last_name: { '$gt': 1 } },
        ]
      },
      use_index: ['patients'],
      sort: [{ 'last_name': 'asc' }],
      skip: ((perPage * page) - perPage),
      limit: perPage
    }).then((docs) => {
      let pages = Math.ceil(count / perPage);
      res.render("patients", {
        pagination: {
          currentpage: page,
          page: pages
        },
        obj: docs,
        current: page,
      });
    }).catch((err) => {
      console.log('error in patients_controller', err);
    });
  });
};
For posterity, this complete block is an excellent way to paginate a dataset (in this case from PouchDB) into an hbs template.
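For comparison, the same controller reads a little more linearly with async/await. This is only a sketch; the async (req, res) handler signature is an assumption about the surrounding Express-style export suggested by the trailing };
module.exports = async (req, res) => {
  let perPage = 500;
  const page = req.query.p || 1;
  try {
    const info = await dbFuncs.get_dbinfo();
    const count = info.doc_count;
    if (count < 500) { perPage = count; }
    const docs = await db.find({
      selector: { $and: [{ last_name: { '$gt': 1 } }] },
      use_index: ['patients'],
      sort: [{ 'last_name': 'asc' }],
      skip: (perPage * page) - perPage,
      limit: perPage
    });
    const pages = Math.ceil(count / perPage);
    res.render("patients", {
      pagination: { currentpage: page, page: pages },
      obj: docs,
      current: page,
    });
  } catch (err) {
    console.log('error in patients_controller', err);
  }
};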

Related

How to do a promise inside a promise

I am trying to do pagination with promises. Here's my workflow.
Step 1: The total number of pages in the pagination is 50. The URL will be like url?page=1.
Step 2: On each page I will get 50 products, and for each of them I need to call a separate API.
The only condition is that, during pagination, the second page should be fetched only after the first page's 50 API calls are done. Once all 50 pages are fetched, it should return the promise.
What I have so far is:
let promises = [];
paginate(50);
Promise.all(promises)
  .catch(function (err) {
    console.log('err', err);
  })
  .then(() => {
    console.log('All Done!!!!!');
  });
function paginate(loop) {
  promises.push(
    axios(config)
      .then(function (response) {
        // Got 50 products
      })
      .catch(function (error) {
        console.log('err', error);
      })
  );
}
In the place marked // Got 50 products I still need to iterate over the 50 products and call axios for each of them. I am not sure whether it is possible to do a promise inside a promise.
As the server can't bear a sudden load, the only condition is to iterate over the second 50 products (the next page's products) only after the first (previous) page's 50 API calls are done.
Edit:
// Here i have got 50 products
As I said, on each page I will get 50 products, and I will call another API for each of those 50 products. I have given the code below.
The only constraint is that the per-product API (it's like product?product=1) should be called for the first page's 50 products, and the next page's 50 should be called only after the first 50 API calls are done.
for (let index = 0; index < response.length; index++) {
  const element = response[index];
  // Here call the api for one item
  axios(config)
    .then(function (element) {
      // Here i have got 50 products
    })
    .catch(function (error) {
      console.log('err', error);
    });
}
You're probably not looking for Promise.all (which is meant for running things in parallel), but for recursion:
fetchPages(0, 49)
.then(console.log);
function fetchPages(from, to, results = []) {
if (from <= to) {
return fetchPage(from)
.then(res => results.push(...res))
.then(() => fetchPages(from + 1, to, results)); // calls itself
} else {
return results;
}
}
function fetchPage(n) {
console.log(`Fetching page ${n}`);
// Here, you would return axios()...
// But just for this demo, I fake that:
const results = new Array(50).fill(null)
.map((_,i) => `Product ${i} from page ${n}`);
return new Promise(resolve => setTimeout(() => resolve(results), 100));
}
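If you prefer async/await, the same one-page-after-the-other flow can be written as a plain loop. A sketch that reuses the fetchPage helper above (the name fetchPagesAwait is only to avoid clashing with the recursive version):
async function fetchPagesAwait(from, to) {
  const results = [];
  for (let page = from; page <= to; page++) {
    const res = await fetchPage(page); // waits for each page before starting the next
    results.push(...res);
  }
  return results;
}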
Edit
The code above solves the problem of pages needing to be fetched only one after the other. We can keep the fetchPages function as it is. Now, since every page will contain products which will need to be fetched individually, we'll just edit fetchPage a bit.
There are multiple ways this could be done. Here are some:
Solution A: fetch every product in parallel
If the server can handle 50 requests going off at once, you could use Promise.all:
function fetchPage(n) {
console.log(`Fetching page ${n}`);
return axios(`/page?page=${n}`)
.then(productIds => {
return Promise.all(productIds.map(fetchProduct));
});
}
function fetchProduct(id) {
return axios(`/product?product=${id}`);
}
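One caveat: a real axios call resolves with a response object rather than the payload itself, so in practice Solution A's fetchPage would probably read response.data first. A sketch, with the /page endpoint and the ID-array payload carried over as assumptions from above:
function fetchPage(n) {
  return axios(`/page?page=${n}`)
    .then(response => response.data) // axios puts the response body on .data
    .then(productIds => Promise.all(productIds.map(fetchProduct)));
}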
Solution B: fetch every product in sequence
If the server can't handle multiple requests going off at once, you could use recursion once again:
function fetchPage(n) {
console.log(`Fetching page ${n}`);
return axios(`/page?page=${n}`)
.then(fetchProducts);
}
function fetchProducts(productIds, results = []) {
if (productIds.length) {
const productId = productIds.shift();
return fetchProduct(productId)
.then(res => results.push(res))
.then(() => fetchProducts(productIds, results)); // calls itself
} else {
return results;
}
}
function fetchProduct(id) {
return axios(`/product?product=${id}`);
}
Solution C: fetch products X requests at a time
If the server can handle X requests going off at once, you could use a module like queue, which can help you with concurrency:
const queue = require('queue'); // Don't forget to $ npm install queue
const MAX_CONCURRENT_CALLS = 4; // 4 calls max at any given time
function fetchPage(n) {
console.log(`Fetching page ${n}`);
return axios(`/page?page=${n}`)
.then(fetchProducts);
}
function fetchProducts(productIds) {
const q = queue();
q.concurrency = MAX_CONCURRENT_CALLS;
const results = [];
q.push(
...productIds.map(productId => () => {
return fetchProduct(productId)
.then(product => results.push(product));
})
);
return new Promise((resolve, reject) => {
q.start(function (err) {
if (err) return reject(err);
resolve(results);
});
});
}
function fetchProduct(id) {
return axios(`/product?product=${id}`);
}
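Whichever variant of fetchPage you go with, the top-level call from the first snippet stays the same. A minimal usage sketch:
fetchPages(0, 49)
  .then(allProducts => console.log(`Fetched ${allProducts.length} products in total`))
  .catch(err => console.log('err', err));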

Async for-loop not advancing past first iteration in Node JS

I have an array of video info from a YouTube playlist. I'm trying to retrieve the audio of each video using the npm module youtube-mp3-downloader.
I want to iterate over the array asynchronously, in sequence.
The array I'm iterating has a bunch of objects like the following one:
{
snippet: {
title: 'Video Title'
},
contentDetails: {
videoId: 'video-Id'
}
}
I'm running the following loop and it successfully completes the 1st iteration (downloads the audio file and resolves the promise). Then it enters the 2nd iteration: it does console.log("started iteration " + i), but stops right there, before ever reaching await getFromYT().
async function getAudio(list) {
for (let i = 0; i < list.length; i++) {
console.log("started iteration " + i)
await getFromYT(list[i]).then(message =>
console.log(message + " " + i)
).catch(error =>
console.log(error)
)
}
}
Here is what I get in the console:
started iteration 0
Finished 0
started iteration 1
And here is the function that is being awaited:
function getFromYT(item) {
return new Promise((resolve, reject) => {
YD.download(item.contentDetails.videoId, item.snippet.title + '.mp3');
YD.on("error", function (error) {
reject(error);
});
YD.on("finished", function (err, data) {
resolve("Finished");
});
})
}
I have been trying to figure this out for a while now and haven't gotten anywhere.
Ideally I would like to know what's wrong with the code that I have. Why doesn't it continue the loop?
However, if you can suggest a straightforward way to achieve what I'm trying to do, that would be helpful too.
You are mixing the async/await style and promises. The function that loops through should look like this:
async function getAudio(list) {
for (let i = 0; i < list.length; i++) {
try {
console.log("started iteration " + i)
const message = await getFromYT(list[i]);
console.log(message + " " + i);
} catch(err) {
console.error(err);
}
}
}
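With that change, getAudio only moves on to the next item after the previous download has finished, and the returned promise resolves once the whole list is done. A minimal usage sketch (videoList stands in for your playlist array):
getAudio(videoList).then(() => console.log('all downloads finished'));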
If the operations can be done in parallel, consider using Promise.all with .map.
Here's how I'd do the same using a combination of async/await and Promise.
const getFromYT = (item) => new Promise((resolve, reject) => {
YD.download(item.contentDetails.videoId, item.snippet.title + '.mp3');
YD.on("error", (error) => {
reject(new Error(error));
});
YD.on("finished", (err, data) => {
resolve({ id: item.id, data});
});
});
const getAudio = async (list) => {
const promises = list.map((listItem) => {
return getFromYT(listItem)
.catch((err) => console.log(err));
});
let data = null;
try {
data = await Promise.all(promises);
} catch (err) {
console.log(err);
}
return data;
};
So, what I've done here is simply instantiate all the promises together using the map function.
Then I execute all the promises in parallel using Promise.all and wait for them all to resolve.
I ensure that the promise chain in getAudio doesn't break by catching rejections at the individual getFromYT promises themselves.
You can get the data and the corresponding item without any mismatch as follows:
const allData = await getAudio(list);
allData.forEach((datum) => {
const { id, data } = datum;
console.log(id);
console.log(data);
});
Hope this helps!
Edit 1: As #bergi said, I can't return/throw from event emitters; I need to promisify them. Here's how!

JavaScript promise after forEach loop with multiple Mongoose find calls

I'm trying to have a loop with some db calls, and once they're all done I'll send the result, using a promise, but if I put my resolve after the callbacks it doesn't work.
let notuser = [];
let promise = new Promise((resolve, reject) => {
users.forEach((x) => {
User.find({
/* query here */
}, function(err, results) {
if(err) throw err
if(results.length) {
notuser.push(x);
/* resolve(notuser) works here - but we're not done yet */
}
})
});
resolve(notuser); /*not giving me the array */
}).then((notuser) => {
return res.json(notuser)
})
How can I handle this?
Below is a function called findManyUsers which does what you're looking for. Mongo find will return a promise to you, so just collect those promises in a loop and run them together with Promise.all(). So you can see it in action, I've added a mock User class with a promise-returning find method...
// User class pretends to be the mongo user. The find() method
// returns a promise to "find" a user with a given id
class User {
static find(id) {
return new Promise(r => {
setTimeout(() => r({ id: `user-${id}` }), 500);
});
}
}
// return a promise to find all of the users with the given ids
async function findManyUsers(ids) {
let promises = ids.map(id => User.find(id));
return Promise.all(promises);
}
findManyUsers(['A', 'B', 'C']).then(result => console.log(result));
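Adapting that same idea to the notuser check from the question, here is a sketch only; it assumes User.find returns a promise when called without a callback (as real Mongoose queries do), and that users and the query come from the question:
async function findNotUsers(users) {
  const results = await Promise.all(
    users.map((x) => User.find({ /* query here */ }))
  );
  // keep the users whose query returned at least one document
  return users.filter((x, i) => results[i].length);
}

findNotUsers(users).then((notuser) => res.json(notuser));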
I suggest you take a look at async; it's a great library for this sort of thing and more, and I really think you should get used to using it.
I would solve your problem using the following:
const async = require('async')
let notuser = [];
async.forEach(users, (user, callback) => {
  User.find({}, (err, results) => {
    if (err) return callback(err);
    if (results.length) {
      notuser.push(user);
    }
    callback(null);
  });
}, (err) => {
  if (err) throw err;
  return notuser;
});
However, if you don't want to use a 3rd-party library, you are better off using Promise.all and awaiting it to finish.
EDIT: Remember to install async with npm or yarn: yarn add async or npm install async.
I used #danh's solution as the basis for the fix in my scenario (so credit goes there), but thought my code may be relevant to someone else looking to use standard Mongoose without async. I wanted to get a summary of how many reports there are for a certain status and return the last 5 for each, combined into one response.
const { Report } = require('../../models/report');
const Workspace = require('../../models/workspace');
// GET request to return page of items from users report
module.exports = (req, res, next) => {
const workspaceId = req.params.workspaceId || req.workspaceId;
let summary = [];
// returns a mongoose like promise
function addStatusSummary(status) {
let totalItems;
let $regex = `^${status}$`;
let query = {
$and: [{ workspace: workspaceId }, { status: { $regex, $options: 'i' } }],
};
return Report.find(query)
.countDocuments()
.then((numberOfItems) => {
totalItems = numberOfItems;
return Report.find(query)
.sort({ updatedAt: -1 })
.skip(0)
.limit(5);
})
.then((reports) => {
const items = reports.map((r) => r.displayForMember());
summary.push({
status,
items,
totalItems,
});
})
.catch((err) => {
if (!err.statusCode) {
err.statusCode = 500;
}
next(err);
});
}
Workspace.findById(workspaceId)
.then((workspace) => {
let promises = workspace.custom.statusList.map((status) =>
addStatusSummary(status)
);
return Promise.all(promises);
})
.then(() => {
res.status(200).json({
summary,
});
})
.catch((err) => {
if (!err.statusCode) {
err.statusCode = 500;
}
next(err);
});
};

How to run nested asynchronous methods synchronously?

How do I wrap this routine inside a Promise so that I only resolve when I get all the data?
var accounts = [];
getAccounts(userId, accs => {
accs.forEach(acc => {
getAccountTx(acc.id, tx => {
accounts.push({
'id': acc.id,
'tx': tx
});
});
})
});
EDIT: Any issues if I do it like this?
function getAccountsAllAtOnce() {
var accounts = [];
var required = 0;
var done = 0;
getAccounts(userId, accs => {
required = accs.length;
accs.forEach(acc => {
getAccountTx(acc.id, tx => {
accounts.push({
'id': acc.id,
'tx': tx
});
done = done + 1;
});
})
});
while(done < required) {
// wait
}
return accounts;
}
Let's put this routine into a separate function, so it is easier to re-use later. This function should return a promise, which will be resolved with the array of accounts (I'll also modify your code as little as possible):
function getAccountsWithTx(userId) {
return new Promise((resolve, reject) => {
var accounts = [];
getAccounts(userId, accs => {
accs.forEach(acc => {
getAccountTx(acc.id, tx => {
accounts.push({
'id': acc.id,
'tx': tx
});
// resolve after we fetched all accounts
if (accs.length === accounts.length) {
resolve(accounts);
}
});
});
});
});
}
The single difference is just returning a promise and resolving it after all accounts have been fetched. However, callbacks tend to push your codebase toward this "callback hell" style, where you have a lot of nested callbacks and it becomes hard to reason about the code. You can work around that with good discipline, but you can simplify things greatly by switching to returning promises from all async functions. For example, your function would look like the following:
function getAccountsWithTx(userId) {
  // assumes getAccounts and getAccountTx have been promisified (see below)
  return getAccounts(userId)
    .then(accs => {
      const transformTx = acc => getAccountTx(acc.id)
        .then(tx => ({ tx, id: acc.id }));
      return Promise.all(accs.map(transformTx));
    });
}
Both of them are absolutely equivalent, and there are plenty of libraries to "promisify" your current callback-style functions (for example, bluebird or even the native Node util.promisify). Also, with the new async/await syntax it becomes even easier, because it allows you to think in a synchronous flow:
async function getAccountsWithTx(userId) {
  const accs = await getAccounts(userId);
  const transformTx = async (acc) => {
    const tx = await getAccountTx(acc.id);
    return { tx, id: acc.id };
  };
  return Promise.all(accs.map(transformTx));
}
As you can see, we eliminate any nesting! That makes reasoning about the code much easier, because you can read it as it will actually be executed. All three options are equivalent, though, so it is up to you which makes the most sense in your project and environment.
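Usage is the same for all three variants. A minimal sketch, assuming userId is in scope:
getAccountsWithTx(userId)
  .then(accounts => console.log(accounts))
  .catch(err => console.log(err));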
I'd split every step into its own function, and return a promise or promise array from each one. For example, getAccounts becomes:
function getAccountsAndReturnPromise(userId) {
return new Promise((resolve, reject) => {
getAccounts(userId, accounts => {
return resolve(accounts);
});
});
};
And getAccountTx gets a wrapper that resolves to an { id, transactions } object for a single account:
function getAccountTransactionsAndReturnPromise(accountId) {
  return new Promise((resolve, reject) => {
    getAccountTx(accountId, (transactions) => {
      var accountWithTransactions = {
        id: accountId,
        transactions
      };
      return resolve(accountWithTransactions);
    });
  });
};
Then you can use Promise.all() and map() to resolve the last step to an array of values in the format you desire:
function getDataForUser(userId) {
  return getAccountsAndReturnPromise(userId)
    .then(accounts => {
      var accountTransactionPromises = accounts.map(account =>
        getAccountTransactionsAndReturnPromise(account.id)
      );
      return Promise.all(accountTransactionPromises);
    })
    .then(allAccountsWithTransactions => {
      return allAccountsWithTransactions.map(account => {
        return {
          id: account.id,
          tx: account.transactions
        };
      });
    });
}

Node.js sync workflow with async requests

Currently trying to learn Node.js and get my head around the async/sync workflow.
I'm trying to do the following:
Step 1:
- Get data 1 with function 1
- Get data 2 with function 2
- Get data 3 with function 3
Step 2:
- Work out the logic with data 1, 2, 3
Step 3:
- Do the final call
I've been looking at the Q and Async packages but still haven't really found an example.
Can someone show me how they would go about this in Node.js?
Thanks
Not entirely clear on your implementation, but depending on how specific your ordering needs to be you could try something like this:
var data1 = null;
var data2 = null;
var data3 = null;
async.series([
function(httpDoneCallback){
async.parallel([
function(data1Callback){
$http(...).then(function(response){
// some logic here
data1 = response;
data1Callback();
})
},
function(data2Callback){
$http(...).then(function(response){
// some logic here
data2 = response;
data2Callback();
})
},
function(data3Callback){
$http(...).then(function(response){
// some logic here
data3 = response;
data3Callback();
})
}
], function(){
// all requests done, move on to the logic
httpDoneCallback();
})
},
function(logicDoneCallback){
// do some logic, maybe more asynchronous calls with the newly acquired data
logicDoneCallback();
}
], function(){
console.log('all done');
})
Do you want function 1, 2, and 3 to trigger at the same time? If so then this should help:
var async = require('async');
async.parallel([
function(cb1) {
cb1(null, "one")
},
function(cb2){
cb2(null, "two")
},
function(cb3){
cb3(null, "three")
}
], function(err, results) {
console.log(results); // Logs ["one", "two", "three"]
finalCall();
});
To explain: every function in the array submitted as the first param to the parallel method will also receive a callback function. Activating the callback signifies that you're done fetching your data or doing whatever you need to do in said function. All three functions will trigger at the same time, and once all three callbacks are called, the final function is called. The callback accepts two parameters: "error" and "result". If everything's successful, pass null as the error parameter. The results will be given to the final function as an array containing the results of your individual functions.
You can set up a chain of Promises to do things sequentially:
var funcA = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve('some data from A')
}, 1000)
});
}
var funcB = (dataFromA) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(dataFromA + ' data from B')
}, 2000)
})
}
var funcC = (dataFromB) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(dataFromB + ' data from C')
}, 500)
})
}
// Doing the functions one after another
funcA().then(funcB).then(funcC).then((data) => {
console.log(data);
})
Or if you want to do them all at the same time you can use Promise.all():
var promises = [];
promises.push(new Promise((resolve, reject) => {
setTimeout(() => {
resolve('some data from A')
}, 1000)
}));
promises.push(new Promise((resolve, reject) => {
setTimeout(() => {
resolve('some data from B')
}, 1000)
}));
promises.push(new Promise((resolve, reject) => {
setTimeout(() => {
resolve('some data from C')
}, 1000)
}));
// Execute the array of promises at the same time, and wait for them all to complete
Promise.all(promises).then((data) => {
console.log(data);
})
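For completeness, the same three-step shape (fetch in parallel, work out the logic, do the final call) can also be written with async/await. A sketch in which getData1/getData2/getData3 and doFinalCall are placeholders for your own promise-returning functions:
async function run() {
  // Step 1: fetch all three pieces of data in parallel
  const [data1, data2, data3] = await Promise.all([getData1(), getData2(), getData3()]);
  // Step 2: work out the logic with data 1, 2 and 3
  const combined = { data1, data2, data3 };
  // Step 3: do the final call
  return doFinalCall(combined);
}

run().then(() => console.log('all done')).catch(err => console.log(err));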
Probably the best thing to do is use Promises like #Tyler here states. However, for conceptual understanding it is best to first understand the node callback pattern.
Because some tasks take time, we give the task a function and say 'When you are done, put the data you retrieved into this function'. These functions that we give to other functions are called callbacks. They must be constructed to accept the data, and also an error in case there is a problem while fetching the data. In Node the error is the first callback parameter and the data is the second.
fs.readFile('/file/to/read.txt', function callback(error, data) {
if (error) console.log(error);
else console.log(data);
});
In this example, once node reads the file, it will feed that data into the callback function. In the callback we must account for the case that there was a problem and handle the error.
In your question you want to do multiple async tasks and use their results. Therefore you must take this pattern and nest several of them. So, continuing this example, if there is no error you will begin another async task.
fs.readFile('/file/to/read.txt', function callback(error, data) {
if (error) console.log(error);
else {
someOtherAsyncThing(function callback(error, data2) {
if (error) console.log(error);
else {
console.log(data + data2)
}
});
}
});
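For completeness, Node's built-in util.promisify (or the fs.promises API) converts this callback pattern into the promise style shown in the earlier answers. A minimal sketch:
const util = require('util');
const fs = require('fs');

const readFile = util.promisify(fs.readFile);

readFile('/file/to/read.txt', 'utf8')
  .then(data => console.log(data))
  .catch(error => console.log(error));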
