I have a Postgres database. It contains records with a column that refers to other, similar records; the column format is JSON.
So I fetch a record, and it has links to other records. This nesting can be n levels deep, so I cannot know in advance how many queries I will need. All of these queries have to go through an asynchronous method, since I use the "pg" library, but res.send executes on the stack before the first iteration of the loop has completed. As a result, I am sending incomplete data.
I understand what the problem is, but I don't understand the solution.
const dbResult = await db.query("SELECT * FROM docs WHERE id = 17"); // 17 is for example
const doc = dbResult.rows[0];
doc.linked.forEach(linked => {
  linked.assignedDocs = [];
  linked.links.forEach(async link => {
    const dbRes = await db.query("SELECT name, img FROM docs WHERE id = $1", [link.refId])
    linked.assignedDocs.push({
      id: link.refId,
      name: dbRes.rows[0].name,
      img: dbRes.rows[0].img,
    })
  })
});
res.send(doc);
forEach doesn't wait for async callbacks. Use a for...of loop inside an async function instead:
for (const linked of doc.linked) {
  // ...
  const dbRes = await func();
}
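Applied to the code above, a minimal sketch of the fix could look like this (same queries, just sequential loops):

const dbResult = await db.query("SELECT * FROM docs WHERE id = 17"); // 17 is for example
const doc = dbResult.rows[0];

for (const linked of doc.linked) {
  linked.assignedDocs = [];
  for (const link of linked.links) {
    // each query resolves before the next iteration starts
    const dbRes = await db.query("SELECT name, img FROM docs WHERE id = $1", [link.refId]);
    linked.assignedDocs.push({
      id: link.refId,
      name: dbRes.rows[0].name,
      img: dbRes.rows[0].img,
    });
  }
}

// every query has resolved by the time we get here
res.send(doc);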
I need to save a list of dishes in my restaurant's menu. I do it like this:
fromFormMenu.forEach(fromFormMeal => {
  MealAPI.create(fromFormMeal, restaurant.id).then(resp => console.log(resp))
})
In this case, the dishes end up saved to the database in a different order from the one in which the user entered them into the form. Here is what my API method looks like:
static async create(meal, restaurantId) {
  const response = await axios.post(REST_URL + "/" + restaurantId, meal);
  return response
}
I need the order of the records to be preserved exactly as the user entered them. I assume the problem is related to the asynchrony of the requests, which is why the records are stored in random order. I even tried removing the 'async' from the method declaration, but that didn't work.
The reason you're getting the wrong order is that you're firing all the API calls at the same time. If one of them finishes faster than an earlier one, it gets saved first.
So in this situation you want to call the next API only after the previous call has finished.
Something like:
const mainFunction = async () => {
  for (let i = 0; i < fromFormMenu.length; i++) {
    const resp = await MealAPI.create(fromFormMenu[i], restaurant.id)
    console.log(resp)
  }
}
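The same idea reads a little more naturally with for...of. Note that this runs the requests strictly one after another; that is slower than firing them all in parallel, but the sequencing is exactly what preserves the insertion order:

const mainFunction = async () => {
  for (const meal of fromFormMenu) {
    // each POST completes before the next one starts,
    // so the server receives the meals in form order
    const resp = await MealAPI.create(meal, restaurant.id);
    console.log(resp);
  }
};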
I am having trouble writing a scheduled function to read data from Firestore. The function successfully runs every minute, but the problem is reading the data from Firestore. I am using async/await because I want to loop through the data after the read and then do some updates. Kindly help; I am using Firebase Functions for the first time.
Below is my function. I keep getting the error "cannot read property map of undefined".
exports.checkDefaultedPledges = functions.pubsub.schedule("every 1 minutes").onRun(async (context) => {
  console.log("This will be run every 2 minutes!");
  const time = new Date().getTime();
  const snapshot = db.collection("pledges").get();
  const res = await snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));
  console.log(res);
  return null;
});
Is it possible for me to write this function without using .then()? The problem I am having is that no data is being returned when reading the data.
You need to use await when calling get(), since it is an asynchronous method.
On the other hand, you should not use await in const res = await snapshot.docs.map() since docs is a simple property (no asynchronicity here).
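With only those two fixes applied, a minimal corrected version of your function could look like this:

exports.checkDefaultedPledges = functions.pubsub.schedule("every 1 minutes").onRun(async (context) => {
  // get() returns a Promise, so it must be awaited
  const snapshot = await db.collection("pledges").get();
  // docs is a plain array: map() is synchronous, no await needed
  const res = snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));
  console.log(res);
  return null;
});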
If you want to update all the docs in the pledges collection, you can use a batched write as follows:
exports.checkDefaultedPledges = functions.pubsub.schedule("every 1 minutes").onRun(async (context) => {
  const time = new Date().getTime();
  const snapshot = await db.collection("pledges").get();
  const batch = db.batch();
  snapshot.forEach(doc => {
    batch.update(doc.ref, { "updateTime": time });
  });
  return batch.commit();
});
Note that a batched write can contain up to 500 operations, so if you know your collection has (or will have) more than 500 docs, you should use Promise.all() as follows:
exports.checkDefaultedPledges = functions.pubsub.schedule("every 1 minutes").onRun(async (context) => {
  const time = new Date().getTime();
  const snapshot = await db.collection("pledges").get();
  const promisesArray = snapshot.docs.map(doc => doc.ref.update({ "updateTime": time }));
  return Promise.all(promisesArray);
});
Side note:
It is a best practice to use FieldValue.serverTimestamp() instead of using the JS Date(), especially when you are writing to Firestore from a client app. serverTimestamp "returns a sentinel used with set() or update() to include a server-generated timestamp in the written data".
Since a Cloud Function is executed by a server, it is not a must, but you could adapt your code as follows:
const time = admin.firestore.FieldValue.serverTimestamp();
I'd like to:
Make a call to an API resource
Get back an array of records - [arr]
forEach over [arr] and perform some function: another async call to an API
For each iteration, create an object that has elements of the original API call, and the subsequent call for each item
Save each object instance generated to Mongo
At the end of all required save operations, call the complete collection from Mongo
res.render that collection
I have code that looks like this:
// First API call to get [arr]
const results = await getlist();
// Iterate over [arr] and perform a request on each item
_.forEach(results, async function (result) {
  // Secondary request for each item in [arr]
  const record = await item(result.id).fetch();
  // Combined doc from original result and secondary call for record
  let doc = new DocModel({
    item1: result.id,
    item2: record.something,
  });
  // Save doc
  const saveDoc = doc.save();
});
// Call for all docs
const allItems = await DocModel.find();
// Render all docs
res.render(`aView`, {
  recordings: allItems,
});
The problem I am facing is that the render executes before the forEach has completed / has populated Mongo.
To try and work around this, I tried wrapping the forEach block in a promise and calling res.render in a .then(), but that seemed to have no effect.
What is the solution to ensure all function calls have completed before the render occurs?
I placed two marks in the following code, and I removed the _.forEach call.
mark1: use a normal for...of loop instead
mark2: use await here
// First API call to get [arr]
const results = await getlist();
// ########## mark1 ##########: use a normal for...of loop instead
for (const result of results) {
  // Secondary request for each item in [arr]
  const record = await item(result.id).fetch();
  // Combined doc from original result and secondary call for record
  let doc = new DocModel({
    item1: result.id,
    item2: record.something,
  });
  // ########## mark2 ##########: use await here
  // Save doc
  const saveDoc = await doc.save();
}
// Call for all docs
const allItems = await DocModel.find();
// Render all docs
res.render(`aView`, {
  recordings: allItems,
});
You can't use async/await inside forEach; you need to use a for...of loop instead.
Another good solution is to use Promise.all:
await Promise.all(_.map(results, async result => {
  // ... existing code
}));
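Applied to the code above, a sketch with Promise.all could look like this (the saves run in parallel, so use it only when the save order doesn't matter):

const results = await getlist();

// wait for every fetch-and-save to finish before moving on
await Promise.all(_.map(results, async result => {
  const record = await item(result.id).fetch();
  const doc = new DocModel({
    item1: result.id,
    item2: record.something,
  });
  await doc.save();
}));

const allItems = await DocModel.find();
res.render(`aView`, { recordings: allItems });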
I want to fetch 50 users from Cloud Firestore, and I have two ways; both work.
But I don't know which one is more performant, given that we have a poor internet connection in our country, if our focus is only on fetching, not on iterating.
The first way (Single Request)
let tempList = [];
const matchingUsers = [user1, user2, user3, ..., user50];
const snap = await db.collection('users').get();
if (snap.size > 0) {
  snap.docs.forEach(doc => {
    const data = doc.data();
    matchingUsers.forEach(user => {
      if (data.user === user) {
        tempList.push(data.user);
      }
    });
  });
}
The second way (multiple requests)
matchingUsers.forEach(async user => {
  const snap = await db.collection('users').doc(user).get();
  tempList.push(snap.data().user)
});
With the first way, you are actually fetching the entire users collection and transmitting all the corresponding data from the backend (Firestore) to your front end. This is really not efficient, especially if you want to filter 50 users out of 500k! Note also that you will pay for 500k reads instead of 50 (see pricing).
So fetching only the docs you want (i.e. exactly the 50 users) is the most efficient way. Since the get() method is asynchronous and returns a Promise, you can use Promise.all() as follows:
const matchingUsers = [user1, user2, user3, ..., user50];
const promises = matchingUsers.map(u => db.collection('users').doc(u).get());

Promise.all(promises).then(results => {
  // results is an array of DocumentSnapshots
  // use any array method, like map or forEach
  results.map(docSnapshot => {
    console.log(docSnapshot.data());
  });
});
As explained in the doc, the advantage of Promise.all() is that "it returns a single Promise that fulfills when all of the promises passed as an iterable have been fulfilled", making it really easy to manage the different asynchronous parallel calls.
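If you prefer async/await over .then(), the same logic can be written as follows, reusing matchingUsers from above:

const promises = matchingUsers.map(u => db.collection('users').doc(u).get());
// results is an array of DocumentSnapshots, in the same order as matchingUsers
const results = await Promise.all(promises);
const tempList = results.map(docSnapshot => docSnapshot.data().user);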
I queried two collections and stored the results in two arrays, but I can't access them because the queries are asynchronous.
Also, I wanted to perform something like this:
SELECT roll FROM student
db.student.find({}, {roll: 1, _id: 0});
but implementing this doesn't work; it just fetches everything from the collection.
I have tried using async/await, but it didn't work.
I tried the async npm module and used its async.series method, but that didn't work either.
Using setTimeout before console.log does log the values, but I need to perform some comparison afterwards, so it is not helpful.
let collectionOneArr = [];
let collectionTwoArr = [];
let db = client.db('job');

db.collection('one').find({}, {field: 1, name: 0}).toArray((err, data) => {
  data.forEach(val => collectionOneArr.push(val))
});

db.collection('two').find({}).toArray((err, data) => {
  data.forEach(val => collectionTwoArr.push(val))
});

console.log(collectionOneArr) // returns []
console.log(collectionTwoArr) // returns []
// setTimeout(() => console.log(collectionTwoArr, collectionOneArr), 1000);

client.close();
});
It would be helpful to know what the objects being stored in the database look like, but I will try to answer the question anyhow.
There are two issues that I can see:
1. You're not querying correctly
2. You need to work with JavaScript's asynchronous nature :)
1. Incorrect Mongo filtering
You don't appear to be supplying the arguments to the find method in the correct order, and you aren't actually telling Mongo to filter by anything.
The first argument to the MongoDB find method is the filter object. This should properly filter:
db.collection('one').find({field: 1, name: 0}).toArray((err, data) => {
  data.forEach(val => collectionOneArr.push(val))
});
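As a side note: if what you actually wanted here was a projection (the SELECT roll FROM student part of your question), the Node driver (3.x and later) expects it inside the options object rather than as a bare second argument. A sketch of the driver equivalent of your shell example:

// projection goes in options.projection in recent driver versions
db.collection('student').find({}, { projection: { roll: 1, _id: 0 } }).toArray((err, data) => {
  // data now contains only the roll field of each document
});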
2. Asynchronous issue
We can use a readyForCompute flag, for instance, to make sure that we only run the compute method after both database queries have completed.
This is a very primitive solution and does not scale well, but it is sufficient when performing only two queries.
let readyForCompute = false;
let arr1 = [];
let arr2 = [];
let db = client.db('job');

db.collection('one').find({name: "value_to_filter_name_field_by"}).toArray((err, data) => {
  arr1 = data;
  if (readyForCompute) compute();
  readyForCompute = true;
});

db.collection('two').find({}).toArray((err, data) => {
  arr2 = data;
  if (readyForCompute) compute();
  readyForCompute = true;
});

function compute() {
  client.close();
  console.log(arr1);
  console.log(arr2);
  // do your computation...
}
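For completeness, here is a minimal async/await alternative; it assumes a driver version whose toArray() returns a Promise when no callback is passed:

async function run(client) {
  const db = client.db('job');
  // both queries run in parallel; await resolves once both arrays are ready
  const [collectionOneArr, collectionTwoArr] = await Promise.all([
    db.collection('one').find({name: "value_to_filter_name_field_by"}).toArray(),
    db.collection('two').find({}).toArray(),
  ]);
  console.log(collectionOneArr);
  console.log(collectionTwoArr);
  // do your comparison...
  await client.close();
}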