Firebase arrayRemove takes very long time in modular version - javascript

I'm refactoring my firebase code to support the modular firebase version.
After I refactored the "arrayRemove" function, I noticed that it started to take a very long time to execute: about one minute. With v8, it took about one second.
Here is my code:
const userRef = doc(db, 'users', userId)
try {
  await updateDoc(userRef, {
    items: arrayRemove(itemId),
  })
  console.log('REMOVED')
} catch (error) {
  console.log(error)
}
Execution of this code takes around one minute. The same action with the same data in the same environment on the older version takes about one second. The rest of the functions work fast.
Firebase version: 9.6.1
Any ideas why it takes so long and how to make it faster?

After reading the article "Best Practices: Arrays in Firebase",
I found this practice:
to remove keys, we save the entire array instead of using .remove()
So... I removed the item from my local array, then updated the document with the whole array, and it took less than a second!
// remove the item from my local copy of the array
const updArr = arr.filter(
  (item) => item !== itemId
)
const userRef = doc(db, 'collection', 'id')
// update the array field with the filtered copy
await updateDoc(userRef, {
  gifts: updArr,
})
P.S. Obviously there is some bug in the arrayRemove function, but actually we don't have to use that function 🤫

Related

How do we create a document, give it some fields, and give it a sub collection all in a single call in firestore (RNFirestore)?

So, I have a collection in firestore and I add a document to it with some fields like follows:
const chatRef = firestore().collection('CHAT').doc(id);
chatRef.set({
  'field-1': 'something',
  'field-2': 'something',
})
.then(() => { });
Now, to add a sub-collection to this document, I add the following code to "then". The complete code would look like:
const chatRef = firestore().collection('CHAT').doc(id);
chatRef.set({
  'field-1': 'something',
  'field-2': 'something',
})
.then(() => {
  chatRef.collection('MESSAGES').add(initialBotMessage)
    .then(() => { });
});
I would like to do both these actions in a single call, but couldn't find any leads. Is there any way to do this? I was trying to improve performance by reducing the number of promise calls :)
There is no way to create multiple documents in a single API call. Writing to a single document always requires a single API call.
But you can have the two documents be created atomically, by using what Firestore calls a batched write. While this still requires one set call for each document, these will then be sent to the server (and committed or rejected there) as a single operation.
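A sketch of that batched write, reusing the names from the question; the `createChatWithFirstMessage` wrapper is a hypothetical name, not from the original code:

```javascript
// Sketch of the batched-write approach, assuming the RNFirebase API from the
// question; `id` and `initialBotMessage` are placeholders for the caller's data.
function createChatWithFirstMessage(firestore, id, initialBotMessage) {
  const chatRef = firestore().collection('CHAT').doc(id);
  const msgRef = chatRef.collection('MESSAGES').doc(); // auto-generated ID
  const batch = firestore().batch();
  batch.set(chatRef, { 'field-1': 'something', 'field-2': 'something' });
  batch.set(msgRef, initialBotMessage);
  return batch.commit(); // both documents are committed (or rejected) together
}
```

This still issues two set() calls, but they travel to the server and are applied as one atomic operation.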

How to get the ID of a newly created document during a Transaction in Firestore?

I'm reading the docs on Transaction operations, and I figured the t.set() method would work similarly to the docReference.set() documented in the Add data page.
To my surprise, it doesn't:
const newCustomerRef = db.collection('customers').doc();
await db.runTransaction(t => {
  const res = t.set(newCustomerRef, formData)
  console.log(res)
});
The res object above (the return value of t.set()) contains a bunch of props that look obfuscated, and it doesn't look as if they're intended to be worked with.
Is there any way to get the ID of the newly created document within a Transaction?
Update
What I'm trying to achieve is to have multiple data operations in 1 go, and have everything reverted back if it fails.
As per Doug's answer, since newCustomerRef already contains the ID, it seems what I am missing is to delete it in the catch block in case the transaction fails:
// declared outside the try block so it is in scope in the catch block
const newCustomerRef = db.collection('customers').doc();
try {
  await db.runTransaction(t => {
    const res = t.set(newCustomerRef, formData)
    console.log(res)
  });
} catch (e) {
  newCustomerRef.delete()
  // ...error handling...
}
This is sort of a manual thing to do, feels a little hacky. Is there a way to delete it automatically if the transaction fails?
newCustomerRef already contains the ID. It was generated randomly on the client as soon as doc() was called, before the transaction ever started.
const id = newCustomerRef.id
If a transaction fails for any reason, the database is unchanged.
The document is only added by the transaction's set() call, so by using set() on the transaction, everything is rolled back should the transaction fail; there is nothing to delete manually.
This means in the following example
...
await db.runTransaction(t => {
  t.set(newCustomerRef, formData)
  // ... do something ...
  if (someThingWentWrong) {
    throw 'some error'
  }
});
Should someThingWentWrong be true, no document will have been added.

Firebase Cloud Functions: Transactions function not returning promise?

Here is what I am trying to do with a Firebase Cloud Function:
- Listen for any change in one of the documents under the 'user' collection.
- Update the carbon copies of the user info in the relevant documents in both the 'comment' and 'post' collections.
Because I will need to query the relevant documents and update them all at once, I am writing code for transaction operations.
Here is the code that I wrote. It returns the error message 'Function returned undefined, expected Promise or value'.
exports.useInfoUpdate = functions.firestore.document('user/{userid}').onUpdate((change, context) => {
  const olduserinfo = change.before.data();
  const newuserinfo = change.after.data();
  db.runTransaction(t => {
    return t.get(db.collection('comment').where('userinfo', '==', olduserinfo))
      .then((querysnapshot) => {
        querysnapshot.forEach((doc) => {
          doc.ref.update({ userinfo: newuserinfo })
        })
      })
  })
  .then(() => {
    db.runTransaction(t => {
      return t.get(db.collection('post').where('userinfo', '==', olduserinfo))
        .then((querysnapshot) => {
          querysnapshot.forEach((doc) => {
            doc.ref.update({ userinfo: newuserinfo })
          })
        })
    })
  })
});
I am a bit confused because, as far as I know, the 'update' method returns a promise? I might be missing something big, but I only picked up programming last November, so don't be too harsh. :)
Any advice on how to fix this issue? Thanks!
EDIT:
Building on Renaud's excellent answer, I created the below code in case someone may need it.
The complication with transactions is that the same data may be stored under different indices or in different formats, e.g. the same 'map' value can be stored under an index in one collection and as part of an array in another. In this case, each document returned by the query needs a different update method.
I resolved this using doc.ref.path, split, and a switch statement, which makes it possible to apply a different update method based on the collection name. In a nutshell, something like this:
return db.runTransaction(t => {
  return t.getAll(...refs)
    .then(docs => {
      docs.forEach(doc => {
        // doc.ref.path.split('/')[0] returns the collection name;
        // the switch applies the relevant update for each collection
        switch (doc.ref.path.split('/')[0]) {
          case 'A':
            t = t.update(doc.ref, **do whatever is needed for this collection**)
            break;
          case 'B':
            t = t.update(doc.ref, **do whatever is needed for this collection**)
            break;
          default:
            t = t.update(doc.ref, **do whatever is needed for this collection**)
        }
      })
    })
})
Hope this helps!
Preamble: This is a very interesting use case!!
The problem identified by the error message comes from the fact that you don't return the Promise returned by the runTransaction() method. However, there are several other problems in your code.
With the Node.js Server SDK you can indeed pass a query to the transaction's get() method (you cannot with the JavaScript SDK). However, in your case you want to update the documents returned by two queries. You cannot call db.runTransaction() twice because then it is no longer a single transaction.
So you need to use the getAll() method by passing an unpacked array of DocumentReferences. (Again, note that this getAll() method is only available in the Node.js Server SDK and not in the JavaScript SDK).
The following code will do the trick.
We run the two queries and transform the result in one array of DocumentReferences. Then we call the runTransaction() method and use the spread operator to unpack the array of DocumentReferences and pass it to the getAll() method.
Then we loop over the docs and we chain the calls to the transaction's update() method, since it returns the transaction.
However note that, with this approach, if the results of one of the two original queries change during the transaction, any new or removed documents will not be seen by the transaction.
exports.useInfoUpdate = functions.firestore.document('user/{userid}').onUpdate((change, context) => {
  const olduserinfo = change.before.data();
  const newuserinfo = change.after.data();
  const db = admin.firestore();
  const q1 = db.collection('comment').where('userinfo', '==', olduserinfo); // See the remark below: you probably need to use a document field here (e.g. olduserinfo.userinfo)
  const q2 = db.collection('post').where('userinfo', '==', olduserinfo);

  return Promise.all([q1.get(), q2.get()])
    .then(results => {
      const refs = [];
      results.forEach(querySnapshot => {
        querySnapshot.forEach(documentSnapshot => {
          refs.push(documentSnapshot.ref);
        })
      });
      return db.runTransaction(t => {
        return t.getAll(...refs)
          .then(docs => {
            docs.forEach(doc => {
              t = t.update(doc.ref, { userinfo: newuserinfo })
            })
          })
      })
    })
});
Two last remarks:
I am not sure that db.collection('comment').where('userinfo', '==', olduserinfo); will be valid as olduserinfo is obtained through change.before.data(). You probably need to specify one field. This is probably the same for newuserinfo.
Note that you cannot do doc.ref.update() in a transaction; you need to call the transaction's update() method, not the DocumentReference's.

Not able to iterate over array of users pulled from firebase [duplicate]

This question already has an answer here:
How to get data from firestore DB in outside of onSnapshot
(1 answer)
Closed 3 years ago.
I am using Firebase's Cloud Firestore for a web page I'm working on. I have it currently setup to create a new document in the "Users" collection when a new user is added/joined. The issue is when I try to pull the list of users down to iterate over them, I'm not able to.
I have tried iterating over it with different kinds of loops. The loops don't seem to run, as the length of the object is 0 when console logging it.
let temp = [];
db.collection("Users").onSnapshot(res => {
  const changes = res.docChanges();
  changes.forEach(change => {
    if (change.type === "added") {
      temp.push({
        id: change.doc.id,
        email: change.doc.data().email
      });
    }
  });
});
console.log(temp);
console.log(temp.length);
I expected the 2nd console log to be 2 but it outputs 0. The weird thing is when I look at the object from the console log above, it shows it has a length of 2 and shows the current data in it:
Data is loaded from Firestore asynchronously. Since this may take some time, your main code continues to run while the data is loading. Then, when the data is loaded, your callback function is called.
You can easily see this in practice by adding some logging:
console.log("Before starting onSnapshot");
db.collection("Users").onSnapshot(res => {
  console.log("Got data");
});
console.log("After starting onSnapshot");
When you run this code, it logs:
Before starting onSnapshot
After starting onSnapshot
Got data
This is probably not the order you expected, but it completely explains why the log statements don't work as you expected. By the time you log the array length, the data hasn't been loaded from the database yet.
The reason logging the array does seem to work, is that Chrome updates the log output after the data has loaded. If you change it to console.log(JSON.stringify(temp));, you'll see that it logs an empty array.
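The same ordering can be reproduced without Firestore at all; here is a plain-Promise sketch (the email values are made up for illustration):

```javascript
// Firestore-free sketch of the same ordering problem: a resolved Promise
// stands in for the onSnapshot callback.
const order = [];
let temp = [];

order.push('before');
Promise.resolve(['alice@example.com', 'bob@example.com']) // "the data arrives"
  .then(emails => {
    emails.forEach(email => temp.push(email));
    order.push('got data');
    console.log(temp.length); // 2: the array is only populated here
  });
order.push('after');
console.log(temp.length); // 0: the callback has not run yet
```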
This means that all code that needs access to the data, must be inside the callback, or be called from there:
let temp = [];
db.collection("Users").onSnapshot(res => {
  const changes = res.docChanges();
  changes.forEach(change => {
    if (change.type === "added") {
      temp.push({
        id: change.doc.id,
        email: change.doc.data().email
      });
    }
  });
  console.log(temp);
  console.log(temp.length);
});
As you can probably imagine, many developers who are new to asynchronous APIs run into this problem. I recommend studying some of the previous questions on this topic, such as:
How do I return the response from an asynchronous call?
THEN ARRAY EMPTY PROMISSE
How to get data from firestore DB in outside of onSnapshot

Pg-promise inserts/transactions not working within async queue

I have found a lot of things related to the use of pg-promise and await/async but nothing that quite answers my issue with async (the node/npm package) and in particular the interaction between async.queue and pg-promise queries.
My issue: I need to run a few million computations (matching scores) asynchronously and commit their results in the same async process to a postgres db. My main process is a promise that first computes all of the possible distinct combinations of two records from a table and segments them in chunks of a thousand pairs at a time.
These chunks of a thousand pairs (e.g. [[0,1], [0,2], ..., [0,1000]] is the content of the first chunk) are fed to an instance of async.queue that first computes the matching score and then records it in the db.
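The chunking step described above can be sketched with a small helper (`chunkPairs` is a hypothetical name, not part of the original code):

```javascript
// Split a large array of pairs into chunks of `size` elements
// (1000 in the question), preserving order.
function chunkPairs(pairs, size) {
  const chunks = [];
  for (let i = 0; i < pairs.length; i += size) {
    chunks.push(pairs.slice(i, i + size));
  }
  return chunks;
}

// e.g. 2500 pairs → chunks of length 1000, 1000 and 500
```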
The part that has had me scratching my head for hours is that the db committing doesn't work whether it is using insert statements or transactions. I know for sure the functions I use for the db part work since I've written manual tests using them.
My main code is as follows:
'use strict';
const promise = require('bluebird');
const initOptions = {
  promiseLib: promise
};
const pgp = require('pg-promise')(initOptions);
const cn = {connexion parameters...};
const db = pgp(cn);
const async = require('async');

var mainPromise = (db, pgp, param) => {
  return new Promise(resolve => {
    // some code computing the chunksArray using param
    ...
    var q = async.queue((chunk, done) => {
      var scores = performScoresCalculations(chunk);
      // scores is an array containing the 1000 scores for any chunk of 1000 pairs
      performDbCommitting(db, pgp, scores);
      // commit those scores to the db using pg-promise
      done();
    }, 10);
    q.drain = () => {
      resolve(arr);
      // admittedly not quite sure about that part, haven't used async.queue much so far
    }
    q.push(chunksArray);
  }).catch(err => {
    console.error(err);
  });
};
Now my scores array looks like this:
[{column1: 'value1_0', column2: 'value2_0', ..., columnN: 'valueN_0'}, ..., {column1: 'value1_999', column2: 'value2_999', ..., columnN: 'valueN_999'}] with a thousand records in it.
My performDbCommitting function is as follows:
var performDbCommitting = (db, pgp, scores) => {
  console.log('test1');
  // displays 'test1', as expected
  var query = pgp.helpers.insert(scores, ['column1', 'column2', 'column3'], 'myScoreTable');
  console.log(query);
  // displays the full content of the query, as expected
  db.any(query).then(data => {
    console.log('test2');
    // nothing is displayed
    console.log(data);
    // nothing is displayed
    return;
  }).catch(err => {
    console.error(err);
  });
}
So here is my problem:
- when testing "manually", performDbCommitting works perfectly; I've even tried a version with transactions, which works just as flawlessly,
- when used within async.queue, everything in performDbCommitting seems to work until the db.any(query) call, as evidenced by the console.log output being correct up to that point,
- no error is thrown, and the computations over chunksArray keep going in groups of 1000 as expected,
- if I inspect any of the arrays (chunk, chunksArray, scores, etc.), everything is as it should be: the lengths are correct and so are their contents.
pg-promise just doesn't seem to want to push my 1000 records at a time into the database when used with async.queue, and that's where I'm stuck.
I have no trouble imagining the fault lies with me; it's about the first time I'm using async.queue, especially mixed with bluebird promises and pg-promise.
Thank you very much in advance for taking the time to read this and shed any light on this issue if you can.
I was experiencing this same issue on one of my machines in particular but none of the others.
What worked for me was updating pg-promise from version 10.5.0 to version 10.5.6 (via npm update pg-promise).
Your mainPromise doesn't wait for performDbCommitting to finish. It should be like:
// commit those scores to the db using pg-promise
performDbCommitting(db, pgp, scores).then(() => { done(); });
And performDbCommitting needs to return the promise too:
return db.any(query).then(data => {
  console.log('test2');
  console.log(data);
  return null;
}).catch(err => {
  console.error(err);
  return null;
});
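Putting both fixes together, here is a minimal runnable sketch with a stubbed database object (the stub and the query string are placeholders; pg-promise itself is not used here):

```javascript
// Stub standing in for the pg-promise database object: it records queries
// and resolves asynchronously, like a real network call would.
const committed = [];
const fakeDb = {
  any: (query) => Promise.resolve().then(() => { committed.push(query); }),
};

// Fixed: the promise chain is returned to the caller.
function performDbCommitting(db, query) {
  return db.any(query).catch(err => { console.error(err); });
}

// Fixed worker: `done` fires only after the db write has settled, so
// async.queue's drain cannot run before all commits have finished.
function worker(chunk, done) {
  const query = `INSERT ${chunk.length} rows`; // placeholder, not real SQL
  performDbCommitting(fakeDb, query).then(() => done());
}
```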
