Mongoose error: "Topology was destroyed", what's the problem? - javascript

I was coding a leveling system for my Discord bot but ran into a MongoError along the way. Does anyone know why this is occurring? (The addXP function runs whenever someone chats; the error is logged in the removeTimeout function.) I have used Mongoose in many other places in my code and this has never happened before. I also found other people's reports of this error, but none of the solutions there fit my case.
async function removeTimeout(leveldataid) {
    console.log("removetimeout")
    await mongo().then(async (mongoose) => {
        try {
            await levelSchema.findByIdAndUpdate(
                leveldataid,
                {
                    _id: leveldataid,
                    oncooldown: false
                },
                {upsert: true}
            )
        } catch(err) {
            console.log(err)
        } finally {
            mongoose.connection.close()
        }
    })
}
async function addXP(member, guild) {
    let data = undefined
    let changedleveldata = false
    await mongo().then(async (mongoose) => {
        try {
            data = await levelenabledSchema.findById(guild.id)
            if (data) {
                if (data.enabled == true) {
                    let extraxp = Math.floor(Math.random() * 20) + 15
                    let leveldata = await levelSchema.findById(`guild${guild.id}member${member.id}`)
                    if (!leveldata) {
                        await levelSchema.findByIdAndUpdate(
                            `guild${guild.id}member${member.id}`,
                            {
                                _id: `guild${guild.id}member${member.id}`,
                                guild: guild.id,
                                level: 1,
                                maxXp: 113,
                                xp: extraxp,
                                oncooldown: true
                            },
                            {upsert: true}
                        )
                    } else {
                        if (leveldata.oncooldown == false) {
                            let newlevel = leveldata.level
                            let newMaxXp = leveldata.maxXp
                            newMaxXp = Math.floor(newMaxXp)
                            let xp = leveldata.xp
                            xp += extraxp
                            if (xp > leveldata.maxXp) {
                                xp -= Math.floor(100 * 1.135 ** newlevel)
                                newlevel++
                                newMaxXp = 100 * 1.13 ** newlevel
                            }
                            await levelSchema.findByIdAndUpdate(
                                `guild${guild.id}member${member.id}`,
                                {
                                    level: newlevel,
                                    maxXp: newMaxXp,
                                    xp: xp,
                                    oncooldown: true
                                },
                                {upsert: true}
                            )
                        }
                    }
                }
            }
            changedleveldata = true
        } catch(err) {
            console.log(err)
        } finally {
            mongoose.connection.close()
        }
        if (changedleveldata == true) {
            // schedule the cooldown reset to run in 60 seconds
            setTimeout(() => removeTimeout(`guild${guild.id}member${member.id}`), 60000)
        }
    })
}

This basically means the connection between the Node server and MongoDB was interrupted while writing data.
The solution to your problem is to remove the finally block from your code: when the bot is handling many users at once, it closes the connection while a previous write is still in flight, which interrupts that write.
await mongo().then(async (mongoose) => {
    try {
        // code here
    } catch(e) {
        console.log(e);
    }
});
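Going one step further, a common pattern is to connect once at startup and reuse the shared connection pool, rather than opening and closing around every query. A minimal sketch, assuming a single process and a MONGO_URI environment variable (both assumptions; levelSchema is the model from the question):

const mongoose = require('mongoose');

async function start() {
    await mongoose.connect(process.env.MONGO_URI); // one shared connection pool
}

async function removeTimeout(leveldataid) {
    // No per-call connect/close, so concurrent writes can't tear down
    // each other's topology.
    await levelSchema.findByIdAndUpdate(
        leveldataid,
        { _id: leveldataid, oncooldown: false },
        { upsert: true }
    );
}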

Related

typescript node insert on mongodb database

I'm a total newbie in JS (TypeScript, MongoDB, Node). I just found that my code is not behaving as I expected: I'm getting 6 registers in MongoDB instead of just one. It should check whether the register exists and then update it. I don't know if it is something related to await/async or if I am doing something else wrong. Thanks in advance; here is my code.
fields.forEach(async (value) => {
    try {
        const mongoConnection = new DocumentDbRepository();
        let checksIfExists = await mongoConnection.getValue(key, information[uniqueValue]);
        if (checksIfExists == null) {
            let insert = await mongoConnection.insertValue(information);
            console.log(insert);
        }
        if (checksIfExists?.passValue === information.passValue) {
            console.log('---------update---------');
            let sons = Object.values(information.ticketToRide);
            information.ticketToRide = sons;
            let update = await mongoConnection.updateRegister(information, checksIfExists._id);
            console.log(update);
        } else {
            console.log('---------insert---------');
            let sons = Object.values(information.ticketToRide);
            information = sons;
            let insert = await mongoConnection.insertValue(information);
            console.log(insert);
        }
    } catch (error) {
        console.log(error);
    }
});
async getValue(uniqueValue: any, keyValue: any) {
    if (this._connection == null) {
        await this.connect();
    }
    const db = this._connection.db(DocumentDbRepository.DbName);
    const ticketToRide = db.collection("ticketToRide");
    const query = {};
    query[uniqueValue] = '' + keyValue + '';
    const passInfo = await ticketToRide.findOne(query);
    return passInfo;
}
async insertValue(information: any) {
    if (this._connection == null) {
        await this.connect();
    }
    const db = this._connection.db(DocumentDbRepository.DbName);
    const ticketToRide = db.collection("ticketToRide");
    let check = await ticketToRide.insertOne(
        information
    )
    return check;
}
First, you don't need to create a connection inside the loop.
Second, MongoDB has update() and updateMany() methods that take a special option, { upsert: true }. If it is passed, the insert happens automatically when no matching document exists.
Usage example:
Person.update( { name: 'Ted' }, { name: 'Ted', age : 50 }, { upsert: true })
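Applied to the question's repository class, a hedged sketch of the same idea could be a hypothetical upsertValue method (the method name and the filter key derived from uniqueValue are assumptions, written in the style of getValue/insertValue above):

async upsertValue(uniqueValue, information) {
    if (this._connection == null) {
        await this.connect();
    }
    const db = this._connection.db(DocumentDbRepository.DbName);
    const ticketToRide = db.collection("ticketToRide");
    // Sketch only: one atomic upsert replaces the exists-then-insert logic,
    // so concurrent calls can't race each other into duplicate registers.
    return ticketToRide.updateOne(
        { [uniqueValue]: '' + information[uniqueValue] }, // match on the unique key
        { $set: information },                            // apply the new values
        { upsert: true }                                  // insert when nothing matches
    );
}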

Trying to make a matchmaking function

So I am trying to find an opponent for a user based on their trophies. It works fine when the if condition doesn't run, but when the if condition runs, it enters an infinite loop.
const UserProfile = require("../schemas/userProfile");

async function matchmake(user, message) {
    let UserProfileDetails = await UserProfile.findOne({ userID: user.id });
    let userTrophies = UserProfileDetails.trophies;
    let userMatched = await UserProfile.aggregate([
        { $match: { trophies: { $gte: userTrophies - 10, $lte: userTrophies + 10 } } },
        { $sample: { size: 1 } }
    ]);
    let otherUserID = userMatched[0].userID;
    console.log("userID -" + otherUserID);
    if (otherUserID === user.id) {
        otherUserID = await matchmake(user, message);
    }
    return otherUserID;
}
module.exports = { matchmake };
If it's looping infinitely, it seems your aggregation keeps pulling the same user as user.id, which recursively calls the matchmake function over and over.
I would add a condition to your $match that excludes user.id, so the aggregation can't return the original user.
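A hedged sketch of that idea, using $ne to exclude the requester, plus a guard for the case where nobody is within the trophy range (otherwise userMatched[0] would be undefined):

// Sketch only: same pipeline as the question, with the requester excluded.
let userMatched = await UserProfile.aggregate([
    {
        $match: {
            trophies: { $gte: userTrophies - 10, $lte: userTrophies + 10 },
            userID: { $ne: user.id }, // never sample the requesting user
        },
    },
    { $sample: { size: 1 } },
]);
if (userMatched.length === 0) return null; // no opponent in range
let otherUserID = userMatched[0].userID;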

onRequest vs onCall returning null

Please help me figure out the difference in return behaviour between the onCall and onRequest Google Cloud Functions below.
onCall, the problem: it returns null on every return except the first one (as commented below). The DB entries and the rest of the code work fine; it's just the returns.
onRequest returns perfectly fine on every return. The DB entries and the rest of the code also work fine.
Both, as you will see, compare the same things, but I just can't seem to get it to work. Any advice on how to get my returns working for the onCall (and on structuring it better) would be much appreciated.
I am keen on sticking with async/await (as opposed to raw promises), using Node.js 12. I am calling the onCall from Flutter; I don't know if that is relevant to the question.
The onCall:
exports.applyUserDiscount = functions.https.onCall(async (data, context) => {
    if (!context.auth) return {message: "Authentication Required!", code: 401};
    const uid = context.auth.uid;
    const discountCode = data["discountCode"];
    const cartTotal = data["cartTotal"];
    try {
        return await db.collection("discountCodes").where("itemID", "==", discountCode).limit(1).get()
            .then(async (snapshot) => {
                if (snapshot.empty) {
                    return "doesNotExist"; // The only return that works.
                } else { // Everything else from here onwards returns null.
                    snapshot.forEach(async (doc) => {
                        if (doc.data().redeemed == true) {
                            return "codeUsed";
                        } else {
                            const newCartTotal = cartTotal - doc.data().discountAmount;
                            if (newCartTotal < 0) {
                                return "lessThanTotal";
                            } else {
                                doc.ref.update({
                                    redeemed: true,
                                    uid: uid,
                                    redeemDate: fireDateTimeNow,
                                });
                                await db.collection("userdata").doc(uid).set({
                                    cartDiscount: admin.firestore.FieldValue.increment(-doc.data().discountAmount),
                                }, {merge: true});
                                return doc.data().discountAmount.toString();
                            }
                        }
                    });
                }
            });
    } catch (error) {
        console.log("Error:" + error);
        return "error";
    }
});
The onRequest:
exports.applyUserDiscount = functions.https.onRequest(async (req, res) => {
    const uid = req.body.uid;
    const discountCode = req.body.discountCode;
    const cartTotal = req.body.cartTotal;
    try {
        return await db.collection("discountCodes").where("itemID", "==", discountCode).limit(1).get()
            .then(async (snapshot) => {
                if (snapshot.isempty) {
                    res.send("doesNotExist");
                } else {
                    snapshot.forEach(async (doc) => {
                        if (doc.data().redeemed == true) {
                            res.send("codeUsed");
                        } else {
                            const newCartTotal = cartTotal - doc.data().discountAmount;
                            if (newCartTotal < 0) {
                                res.send("lessThanTotal");
                            } else {
                                doc.ref.update({
                                    redeemed: true,
                                    uid: uid,
                                    redeemDate: fireDateTimeNow,
                                });
                                await db.collection("userdata").doc(uid).set({
                                    cartDiscount: admin.firestore.FieldValue.increment(-doc.data().discountAmount),
                                }, {merge: true});
                                res.send(doc.data().discountAmount.toString());
                            }
                        }
                    });
                }
            });
    } catch (error) {
        console.log(error);
        res.send("error");
    }
});
There are several points to note when looking at your code(s):
You should not use async/await within a forEach loop. The problem is that the callback passed to forEach() is not awaited; see more explanations here or here. HOWEVER, in your case you don't need to loop over the QuerySnapshot, since it contains only one doc. You can use the docs property, which returns an array of all the documents in the QuerySnapshot, and take the first (and only) element.
You mix up then() with async/await, which is not recommended.
I would advise throwing exceptions for the "error" cases, like doesNotExist, codeUsed or lessThanTotal, but it's up to you. Whether, for example, the lessThanTotal case is an error or a standard business case is debatable... So if you prefer to send a "text" response, I would advise encapsulating it in an object with one property: that way the response your front-end receives always has the same format.
So, the following should do the trick. Note that I send back an object with a response element, including for the cases that could be considered errors. As said above, you could throw an exception in those cases instead.
exports.applyUserDiscount = functions.https.onCall(async (data, context) => {
    if (!context.auth) ... // See https://firebase.google.com/docs/functions/callable#handle_errors
    const uid = context.auth.uid;
    const discountCode = data["discountCode"];
    const cartTotal = data["cartTotal"];
    try {
        const snapshot = await db.collection("discountCodes").where("itemID", "==", discountCode).limit(1).get();
        if (snapshot.empty) {
            // See https://firebase.google.com/docs/functions/callable#handle_errors
        } else {
            const uniqueDoc = snapshot.docs[0];
            if (uniqueDoc.data().redeemed == true) {
                return { response: "codeUsed" };
            } else {
                const newCartTotal = cartTotal - uniqueDoc.data().discountAmount;
                if (newCartTotal < 0) {
                    return { response: "lessThanTotal" };
                } else {
                    await uniqueDoc.ref.update({ // See await here!!
                        redeemed: true,
                        uid: uid,
                        redeemDate: fireDateTimeNow,
                    });
                    await db.collection("userdata").doc(uid).set({
                        cartDiscount: admin.firestore.FieldValue.increment(-uniqueDoc.data().discountAmount),
                    }, { merge: true });
                    return {
                        response: uniqueDoc.data().discountAmount.toString()
                    };
                }
            }
        }
    } catch (error) {
        console.log("Error:" + error);
        return { response: "error" };
    }
});
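For the two commented branches, the linked docs recommend throwing an HttpsError so the Flutter client receives a proper error instead of null. A minimal hedged sketch (the specific error codes chosen here are assumptions):

// Sketch only: which HttpsError codes to use is a judgment call.
if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Authentication Required!");
}
// ...
if (snapshot.empty) {
    throw new functions.https.HttpsError("not-found", "doesNotExist");
}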

UnhandledPromiseRejectionWarning: MongoError: Cannot use a session that has ended

I am creating a Node.js bot to track how long users have been on a game. When I run this code, the bot sometimes throws UnhandledPromiseRejectionWarning: MongoError: Cannot use a session that has ended.
I can't see unreachable code or a session that is disconnected before everything has a chance to run. Here is my entire code for the function, run on a Discord presence update:
const mongo = require('./mongo');
const exposeSchema = require('./schemas/expose-schema');
const { general } = require('./config.json');

module.exports = async client => {
    client.on('presenceUpdate', async (oldPresence, newPresence) => {
        const username = newPresence.user.username;
        const guild = newPresence.guild.id;
        console.log('Presence update detected:');
        console.log(`Guild ID: ${guild} \nUser ID: ${newPresence.user.id} \nUsername: ${username}`);
        await mongo().then(async mongoose => {
            try {
                await exposeSchema.where({ guildid: guild, username: username }).findOne(async function(err, sent) {
                    if (err) {
                        console.log(err);
                    }
                    if (sent == null) {
                        console.log('Writing to database');
                        await exposeSchema({
                            guildid: guild,
                            username: username,
                            sent: false,
                        }).save();
                    }
                    else if (sent.sent == false) {
                        const act = await newPresence.activities.find(activity => activity.timestamps != null);
                        if (act) {
                            const n = new Date();
                            const g = act.timestamps.start;
                            const hours = Math.abs(n - g) / 36e5;
                            if (hours >= 4) {
                                await exposeSchema.findOneAndUpdate({
                                    guildid: guild,
                                    username: username,
                                }, {
                                    guildid: guild,
                                    username: username,
                                    sent: true,
                                }, {
                                    upsert: true,
                                });
                                console.log(`${newPresence.user.username} has been playing ${act.name} for ${Math.round(hours)} hours, time to wrap it the fuck up and go outside or get some sleep.`);
                                if (newPresence.member.nickname == null) {
                                    await client.channels.cache.get(general).send(`${newPresence.user.username} has been playing ${act.name} for ${Math.round((hours) * 100) / 100} hours, time to wrap it the fuck up and go outside or get some sleep.`);
                                }
                                else {
                                    await client.channels.cache.get(general).send(`${newPresence.user.nickname} has been playing ${act.name} for ${Math.round((hours) * 100) / 100} hours, time to wrap it the fuck up and go outside or get some sleep.`);
                                }
                            }
                        }
                    }
                });
            }
            finally {
                mongoose.connection.close();
            }
        });
    });
};
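This mirrors the "Topology was destroyed" question above: awaiting a callback-style findOne() resolves before the callback finishes, so connection.close() in the finally block runs while the callback's queries are still using the session. A hedged sketch of one fix, awaiting the query directly instead of passing a callback (the rest of the presence logic is unchanged):

// Sketch only: with the query itself awaited, the finally block cannot
// close the connection until every read/write has completed.
await mongo().then(async mongoose => {
    try {
        const sent = await exposeSchema.findOne({ guildid: guild, username: username });
        if (sent == null) {
            console.log('Writing to database');
            await exposeSchema({ guildid: guild, username: username, sent: false }).save();
        }
        // ...the rest of the presence logic from above...
    } finally {
        mongoose.connection.close();
    }
});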

How can I update more than 500 docs in Firestore using Batch?

I'm trying to update a timestamp field with the Firestore Admin SDK's server timestamp in a collection with more than 500 docs.
const batch = db.batch();
const serverTimestamp = admin.firestore.FieldValue.serverTimestamp();

db
    .collection('My Collection')
    .get()
    .then((docs) => {
        docs.forEach((doc) => {
            batch.set(doc.ref, {
                timestamp: serverTimestamp,
            }, {
                merge: true,
            });
        });
        return batch.commit();
    })
    .then(() => res.send('All docs updated'))
    .catch(console.error);
This throws an error
{ Error: 3 INVALID_ARGUMENT: cannot write more than 500 entities in a single call
at Object.exports.createStatusError (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\common.js:87:15)
at Object.onReceiveStatus (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:1188:28)
at InterceptingListener._callNext (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:564:42)
at InterceptingListener.onReceiveStatus (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:614:8)
at callback (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:841:24)
code: 3,
metadata: Metadata { _internal_repr: {} },
details: 'cannot write more than 500 entities in a single call' }
Is there a way I can write a recursive method which creates a batch object updating 500 docs at a time until all the docs are updated?
From the docs I know that a delete operation is possible with the recursive approach, as mentioned here:
https://firebase.google.com/docs/firestore/manage-data/delete-data#collections
But for updating, I'm not sure how to end the execution, since the docs are not being deleted.
I also ran into the problem of updating more than 500 documents inside a Firestore collection, and I would like to share how I solved it.
I use Cloud Functions to update my collection inside Firestore, but this should also work in client-side code.
The solution counts every operation made on the batch, and after the limit is reached a new batch is created and pushed to the batchArray.
After all updates are staged, the code loops through the batchArray and commits every batch inside the array.
It is important to count every operation set(), update(), delete() made on the batch, because they all count toward the 500-operation limit.
const documentSnapshotArray = await firestore.collection('my-collection').get();

const batchArray = [];
batchArray.push(firestore.batch());
let operationCounter = 0;
let batchIndex = 0;

documentSnapshotArray.forEach(documentSnapshot => {
    const documentData = documentSnapshot.data();
    // update document data here...
    batchArray[batchIndex].update(documentSnapshot.ref, documentData);
    operationCounter++;
    if (operationCounter === 499) {
        batchArray.push(firestore.batch());
        batchIndex++;
        operationCounter = 0;
    }
});

// Commit all batches; Promise.all (rather than an async forEach callback,
// which would not be awaited) ensures every commit actually finishes.
await Promise.all(batchArray.map(batch => batch.commit()));
return;
I liked this simple solution:
const users = await db.collection('users').get();

const batches = _.chunk(users.docs, 500).map(userDocs => {
    const batch = db.batch();
    userDocs.forEach(doc => {
        batch.set(doc.ref, { field: 'myNewValue' }, { merge: true });
    });
    return batch.commit();
});

await Promise.all(batches);
Just remember to add import * as _ from "lodash" at the top. Based on this answer.
You can use the built-in BulkWriter. It applies the 500/50/5 rule: traffic starts at 500 writes per second and ramps up by 50% every 5 minutes.
Example:
let bulkWriter = firestore.bulkWriter();

bulkWriter.create(documentRef, {foo: 'bar'});
bulkWriter.update(documentRef2, {foo: 'bar'});
bulkWriter.delete(documentRef3);
await bulkWriter.close().then(() => {
    console.log('Executed all writes');
});
As mentioned above, Sebastian's answer is good and I upvoted it too, although I faced an issue while updating 25,000+ documents in one go.
The tweak to the logic is as below.
console.log(`Updating documents...`);
let collectionRef = db.collection('cities');
try {
    let batch = db.batch();
    const documentSnapshotArray = await collectionRef.get();
    const records = documentSnapshotArray.docs;
    const index = documentSnapshotArray.size;
    console.log(`TOTAL SIZE=====${index}`);
    for (let i = 0; i < index; i++) {
        const docRef = records[i].ref;
        // YOUR UPDATES
        batch.update(docRef, {isDeleted: false});
        if ((i + 1) % 499 === 0) {
            await batch.commit();
            batch = db.batch();
        }
    }
    // Commit the final partial batch, if any operations remain
    if (index % 499 !== 0) {
        await batch.commit();
    }
    console.log('write completed');
} catch (error) {
    console.error(`updateWorkers() errored out : ${error.stack}`);
    reject(error); // assumes this runs inside a Promise executor
}
The explanations given in previous answers already cover the issue.
I'm sharing the final code that I built and that worked for me, since I needed something that worked in a more decoupled manner than most of the solutions presented above.
import { FireDb } from "#services/firebase"; // = firebase.firestore();

type TDocRef = FirebaseFirestore.DocumentReference;
type TDocData = FirebaseFirestore.DocumentData;

let fireBatches = [FireDb.batch()];
let batchSizes = [0];
let batchIdxToUse = 0;

export default class FirebaseUtil {
    static addBatchOperation(
        operation: "create",
        ref: TDocRef,
        data: TDocData
    ): void;
    static addBatchOperation(
        operation: "update",
        ref: TDocRef,
        data: TDocData,
        precondition?: FirebaseFirestore.Precondition
    ): void;
    static addBatchOperation(
        operation: "set",
        ref: TDocRef,
        data: TDocData,
        setOpts?: FirebaseFirestore.SetOptions
    ): void;
    static addBatchOperation(
        operation: "create" | "update" | "set",
        ref: TDocRef,
        data: TDocData,
        opts?: FirebaseFirestore.Precondition | FirebaseFirestore.SetOptions
    ): void {
        // Lines below make sure we stay below the limit of 500 writes per batch
        if (batchSizes[batchIdxToUse] === 500) {
            fireBatches.push(FireDb.batch());
            batchSizes.push(0);
            batchIdxToUse++;
        }
        batchSizes[batchIdxToUse]++;

        const batchArgs: [TDocRef, TDocData] = [ref, data];
        if (opts) batchArgs.push(opts);

        switch (operation) {
            // Specific case for "set" is required because of some weird TS
            // glitch that doesn't allow me to use the arg "operation" to
            // call the function
            case "set":
                fireBatches[batchIdxToUse].set(...batchArgs);
                break;
            default:
                fireBatches[batchIdxToUse][operation](...batchArgs);
                break;
        }
    }

    public static async runBatchOperations() {
        // The lines below clear the globally available batches so we
        // don't run them twice if we call this function more than once
        const currentBatches = [...fireBatches];
        fireBatches = [FireDb.batch()];
        batchSizes = [0];
        batchIdxToUse = 0;

        await Promise.all(currentBatches.map((batch) => batch.commit()));
    }
}
Based on all the above answers, I put together the following pieces of code that one can put into a module, in JavaScript, back-end and front-end, to easily use Firestore batch writes without worrying about the 500-write limit.
Back-end (Node.js)
// The Firebase Admin SDK to access Firestore.
const admin = require("firebase-admin");
admin.initializeApp();

// Firestore does not accept more than 500 writes in a transaction or batch write.
const MAX_TRANSACTION_WRITES = 499;

const isFirestoreDeadlineError = (err) => {
    console.log({ err });
    const errString = err.toString();
    return (
        errString.includes("Error: 13 INTERNAL: Received RST_STREAM") ||
        errString.includes("Error: 4 DEADLINE_EXCEEDED: Deadline exceeded")
    );
};

const db = admin.firestore();

// How many writes out of 500 so far.
// I wrote the following functions to easily use batch writes without worrying about the 500 limit.
let writeCounts = 0;
let batchIndex = 0;
let batchArray = [db.batch()];

// Commit every batch built so far.
const makeCommitBatch = async () => {
    console.log("makeCommitBatch");
    await Promise.all(batchArray.map((bch) => bch.commit()));
};

// Commit the batch write; on a Firestore deadline error, retry every 4 seconds until it resolves.
const commitBatch = async () => {
    try {
        await makeCommitBatch();
    } catch (err) {
        console.log({ err });
        if (isFirestoreDeadlineError(err)) {
            const theInterval = setInterval(async () => {
                try {
                    await makeCommitBatch();
                    clearInterval(theInterval);
                } catch (err) {
                    console.log({ err });
                    if (!isFirestoreDeadlineError(err)) {
                        clearInterval(theInterval);
                        throw err;
                    }
                }
            }, 4000);
        }
    }
};

// If the batch write reaches 499 writes, start a new batch object and reset the counter.
const checkRestartBatchWriteCounts = () => {
    writeCounts += 1;
    if (writeCounts >= MAX_TRANSACTION_WRITES) {
        batchIndex++;
        batchArray.push(db.batch());
        writeCounts = 0;
    }
};

const batchSet = (docRef, docData) => {
    batchArray[batchIndex].set(docRef, docData);
    checkRestartBatchWriteCounts();
};

const batchUpdate = (docRef, docData) => {
    batchArray[batchIndex].update(docRef, docData);
    checkRestartBatchWriteCounts();
};

const batchDelete = (docRef) => {
    batchArray[batchIndex].delete(docRef);
    checkRestartBatchWriteCounts();
};

module.exports = {
    admin,
    db,
    MAX_TRANSACTION_WRITES,
    checkRestartBatchWriteCounts,
    commitBatch,
    isFirestoreDeadlineError,
    batchSet,
    batchUpdate,
    batchDelete,
};
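A hedged usage sketch for the back-end module above; the require path, collection name, and field are all assumptions:

// Sketch only: './firestoreBatch' stands in for wherever the module lives.
const { db, batchUpdate, commitBatch } = require("./firestoreBatch");

async function markAllInactive() {
    const snapshot = await db.collection("users").get(); // collection name is made up
    snapshot.docs.forEach((doc) => {
        batchUpdate(doc.ref, { active: false }); // handles the 500-write rollover internally
    });
    await commitBatch(); // commits every accumulated batch
}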
Front-end
// Firestore does not accept more than 500 writes in a transaction or batch write.
const MAX_TRANSACTION_WRITES = 499;

const isFirestoreDeadlineError = (err) => {
    return (
        err.message.includes("DEADLINE_EXCEEDED") ||
        err.message.includes("Received RST_STREAM")
    );
};

class Firebase {
    constructor(fireConfig, instanceName) {
        let app = fbApp; // fbApp: the imported firebase client SDK namespace
        if (instanceName) {
            app = app.initializeApp(fireConfig, instanceName);
        } else {
            app.initializeApp(fireConfig);
        }
        this.name = app.name;
        this.db = app.firestore();
        this.firestore = app.firestore;
        // How many writes out of 500 so far.
        // I wrote the following functions to easily use batch writes without worrying about the 500 limit.
        this.writeCounts = 0;
        this.batch = this.db.batch();
        this.isCommitting = false;
    }

    async makeCommitBatch() {
        console.log("makeCommitBatch");
        if (!this.isCommitting) {
            this.isCommitting = true;
            await this.batch.commit();
            this.writeCounts = 0;
            this.batch = this.db.batch();
            this.isCommitting = false;
        } else {
            const batchWaitInterval = setInterval(async () => {
                if (!this.isCommitting) {
                    this.isCommitting = true;
                    await this.batch.commit();
                    this.writeCounts = 0;
                    this.batch = this.db.batch();
                    this.isCommitting = false;
                    clearInterval(batchWaitInterval);
                }
            }, 400);
        }
    }

    async commitBatch() {
        try {
            await this.makeCommitBatch();
        } catch (err) {
            console.log({ err });
            if (isFirestoreDeadlineError(err)) {
                const theInterval = setInterval(async () => {
                    try {
                        await this.makeCommitBatch();
                        clearInterval(theInterval);
                    } catch (err) {
                        console.log({ err });
                        if (!isFirestoreDeadlineError(err)) {
                            clearInterval(theInterval);
                            throw err;
                        }
                    }
                }, 4000);
            }
        }
    }

    async checkRestartBatchWriteCounts() {
        this.writeCounts += 1;
        if (this.writeCounts >= MAX_TRANSACTION_WRITES) {
            await this.commitBatch();
        }
    }

    async batchSet(docRef, docData) {
        if (!this.isCommitting) {
            this.batch.set(docRef, docData);
            await this.checkRestartBatchWriteCounts();
        } else {
            const batchWaitInterval = setInterval(async () => {
                if (!this.isCommitting) {
                    this.batch.set(docRef, docData);
                    await this.checkRestartBatchWriteCounts();
                    clearInterval(batchWaitInterval);
                }
            }, 400);
        }
    }

    async batchUpdate(docRef, docData) {
        if (!this.isCommitting) {
            this.batch.update(docRef, docData);
            await this.checkRestartBatchWriteCounts();
        } else {
            const batchWaitInterval = setInterval(async () => {
                if (!this.isCommitting) {
                    this.batch.update(docRef, docData);
                    await this.checkRestartBatchWriteCounts();
                    clearInterval(batchWaitInterval);
                }
            }, 400);
        }
    }

    async batchDelete(docRef) {
        if (!this.isCommitting) {
            this.batch.delete(docRef);
            await this.checkRestartBatchWriteCounts();
        } else {
            const batchWaitInterval = setInterval(async () => {
                if (!this.isCommitting) {
                    this.batch.delete(docRef);
                    await this.checkRestartBatchWriteCounts();
                    clearInterval(batchWaitInterval);
                }
            }, 400);
        }
    }
}
No citations or documentation; I invented this code myself, and for me it worked and looks clean and simple to read and use. If someone likes it, they can use it too.
Better to write an autotest, because the code uses the private var _ops, which can change after a package upgrade. For example, in old versions it was _mutations.
async function commitBatch(batch) {
    const MAX_OPERATIONS_PER_COMMIT = 500;
    while (batch._ops.length > MAX_OPERATIONS_PER_COMMIT) {
        const batchPart = admin.firestore().batch();
        batchPart._ops = batch._ops.splice(0, MAX_OPERATIONS_PER_COMMIT - 1);
        await batchPart.commit();
    }
    await batch.commit();
}
Usage:
const batch = admin.firestore().batch();
batch.delete(someRef);
batch.update(someRef);
// ...
await commitBatch(batch);
Simple solution: just fire twice?
My array is resultsFinal.
I fire the batch once with a limit of 490, and a second time from there to the end of the array (results.length).
Works fine for me :)
How do you check it? You go to Firebase and delete your collection; Firebase says you deleted XXX docs, the same as the length of your array? OK, so you are good to go.
async function quickstart(results) {
    // results comes in as a parameter so the data is available inside quickstart
    const resultsFinal = results;
    // console.log(resultsFinal.length);
    let batch = firestore.batch();
    // Firestore's limit is 500 writes per transaction/batch/send
    for (let i = 0; i < 490; i++) {
        const doc = firestore.collection('testMore490').doc();
        const object = resultsFinal[i];
        batch.set(doc, object);
    }
    await batch.commit();
    // const batchTwo = firestore.batch();
    batch = firestore.batch();
    // second batch picks up where the first left off and runs to the end
    for (let i = 490; i < resultsFinal.length; i++) {
        const objectPartTwo = resultsFinal[i];
        const doc = firestore.collection('testMore490').doc();
        batch.set(doc, objectPartTwo);
    }
    await batch.commit();
}
