I have developed a game using Firestore, but I have noticed some problems in my scheduled cloud function, which deletes rooms that were created more than 5 minutes ago and are not full, OR are finished.
For that, I am running the following code:
async function deleteExpiredRooms() {
  // Delete all rooms that are expired and not full
  await deleteExpiredSingleRooms();
  // Also, delete all rooms that are finished
  await deleteFinishedRooms();
}
Deleting finished rooms seems to work correctly with this:
async function deleteFinishedRooms() {
  const query = firestore
      .collection("gameRooms")
      .where("finished", "==", true);
  const querySnapshot = await query.get();
  console.log(`Deleting ${querySnapshot.size} finished rooms`);
  // Delete the matched documents
  querySnapshot.forEach((doc) => {
    doc.ref.delete();
  });
}
But I am experiencing concurrency problems when deleting rooms that were created more than 5 minutes ago and are not full (a room is full when 2 users are in it, at which point the game can start).
async function deleteExpiredSingleRooms() {
  const currentDate = new Date();
  // Calculate the target date (5 minutes ago)
  const targetDate = new Date(currentDate.getTime() - 5 * 60 * 1000);
  const query = firestore
      .collection("gameRooms")
      .where("full", "==", false)
      .where("createdAt", "<=", targetDate);
  const querySnapshot = await query.get();
  console.log(`Deleting ${querySnapshot.size} expired rooms`);
  // Delete the matched documents
  querySnapshot.forEach((doc) => {
    doc.ref.delete();
  });
}
The problem is that, during the deletion of a room, a user can still join it before it is completely deleted.
Any ideas?
Note: For searching rooms I am using a transaction
firestore.runTransaction(async (transaction) => {
  ...
  const query = firestore
      .collection("gameRooms")
      .where("full", "==", false);
  return transaction.get(query.limit(1));
});
You can use BatchWrites:
const query = firestore
    .collection("gameRooms")
    .where("full", "==", false)
    .where("createdAt", "<=", targetDate);
const querySnapshot = await query.get();
console.log(`Deleting ${querySnapshot.size} expired rooms`);

const batch = firestore.batch();
querySnapshot.forEach((doc) => {
  batch.delete(doc.ref);
});

// Commit the batch
batch.commit().then(() => {
  // ...
});
A batched write can contain up to 500 operations. Each operation in
the batch counts separately towards your Cloud Firestore usage.
This should delete all the rooms matching those criteria at once. Deleting them in a loop might take a while, since the deletes happen one by one.
If you are concerned about the 500 docs limit in a batch write, consider using Promise.all as shown:
const deleteOps = [];
querySnapshot.forEach((doc) => {
  deleteOps.push(doc.ref.delete());
});
await Promise.all(deleteOps);
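Another option, if you prefer batched deletes even when more than 500 rooms can match, is to commit the batch in chunks. A rough sketch (reusing the querySnapshot from above, inside an async function):
// Sketch: commit the deletes in chunks of up to 500 operations per batch
const docs = querySnapshot.docs;
for (let i = 0; i < docs.length; i += 500) {
  const chunkBatch = firestore.batch();
  docs.slice(i, i + 500).forEach((doc) => chunkBatch.delete(doc.ref));
  await chunkBatch.commit();
}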
Now, preventing users from joining rooms that are being deleted is harder to do in a Cloud Function, as all the instances run independently and there may be a race condition.
To avoid that, you may have to manually check, when a user tries to join a room, whether that room is older than 5 minutes and has fewer players than required. This is just a check to make sure the user does not join a room that is being deleted or will be deleted shortly.
function joinRoom() {
  // isOlderThanMin()
  // hasLessNumOfPlayers()
  // return 'Room suspended'
}
Because the logic that decides which rooms should be deleted is the same, this should not be an issue.
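For example, a sketch of that check inside your existing join transaction could look like the following. This is only a sketch: it assumes a hypothetical players array field on the room document (not shown in your snippets) alongside the createdAt and full fields your queries already use, and admin refers to the firebase-admin module.
// Sketch only: reject rooms that the scheduled cleanup is about to delete.
const FIVE_MINUTES = 5 * 60 * 1000;

function joinRoom(roomRef, userId) {
  return firestore.runTransaction(async (transaction) => {
    const snap = await transaction.get(roomRef);
    if (!snap.exists) {
      throw new Error("Room no longer exists");
    }
    const room = snap.data();
    // createdAt is assumed to be a Firestore Timestamp, as in your query
    const isExpired = room.createdAt.toDate().getTime() <= Date.now() - FIVE_MINUTES;
    if (!room.full && isExpired) {
      // The scheduled function will delete (or is deleting) this room.
      throw new Error("Room suspended");
    }
    const players = room.players || []; // hypothetical field
    transaction.update(roomRef, {
      players: admin.firestore.FieldValue.arrayUnion(userId),
      full: players.length + 1 >= 2,
    });
  });
}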
Maybe you are looking for transactions; check out the documentation here: https://firebase.google.com/docs/firestore/manage-data/transactions
Or watch this YouTube video, which explains the concurrency problem and the differences between batched writes and transactions: https://youtu.be/dOVSr0OsAoU
So I have this website that acts like a stat sheet; when I press the -1 button, the score of the game goes down by 1. I'm using JavaScript to access the Firebase database and subtract 1 from the score.
const docRef = doc(db, "Games", game_id);
const docSnap = await getDoc(docRef);
var score = docSnap.data().team2_score;
await updateDoc(docRef, {
  team2_score: score -= number
});
The issue is that if a user clicks it really fast multiple times, Firebase doesn't apply all of those operations. So if a user clicks the button 5 times really fast, the database would only record 4 of them.
Is there a way to make sure that every single one of those clicks is updated in the database, even when clicked really fast?
You have two options:
Option 1
Use Firestore's increment operation; since your code already uses the modular SDK, that is increment() from "firebase/firestore": https://cloud.google.com/firestore/docs/samples/firestore-data-set-numeric-increment
import { doc, updateDoc, increment } from "firebase/firestore";

const docRef = doc(db, "Games", game_id);
await updateDoc(docRef, {
  team2_score: increment(-1)
});
Option 2
Use a Transaction. https://firebase.google.com/docs/firestore/manage-data/transactions#transactions
import { doc, runTransaction } from "firebase/firestore";

const docRef = doc(db, "Games", game_id);
try {
  await runTransaction(db, async (transaction) => {
    // This code may get re-run multiple times if there are conflicts.
    const sfDoc = await transaction.get(docRef);
    if (!sfDoc.exists()) {
      throw new Error("Document does not exist!");
    }
    // Note: this could be done without a transaction
    // by updating the score using increment()
    const newScore = sfDoc.data().team2_score - 1;
    transaction.update(docRef, { team2_score: newScore });
  });
  console.log("Transaction successfully committed!");
} catch (error) {
  console.log("Transaction failed: ", error);
}
I am writing a cloud function that will move expired events from one collection to another. It is not working as expected, and I am very much a novice at JavaScript. Please save me.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
const db = admin.firestore();
exports.expireEvents = functions.region("us-west2").pubsub.schedule('* * * * *').onRun(async (context) => {
  await db.collection("Events").where('endDate', '<=', admin.firestore.Timestamp.now().millisecondsSinceEpoch).get().then((snapshot) => {
    snapshot.forEach(async (doc) => {
      // For all expired events, we will perform operations on some of their fields
      const event = doc.data();
      // Write document to collection of "expired events"
      await db.collection("ArchivedEvents").doc(event.eid).set(event);
      // For all the guests in that expired event, do stuff; guests should be a list of strings.
      event.guests.forEach(async (uid) => {
        // Create a new write batch
        const batch = db.batch();
        // Get user and update some attributes
        const guest = await db.collection("Users").doc(uid);
        // Add all operations to be performed for given user to this batch
        batch.update(guest, {eventsAttended: admin.firestore.FieldValue.arrayUnion(event.eid)});
        batch.update(guest, {eventsAttending: admin.firestore.FieldValue.arrayRemove(event.eid)});
        // Execute batch of operations
        await batch.commit();
      });
      // Delete doc from "not expired" collection
      await db.collection("Events").doc(event.eid).delete();
    });
    console.log(`Successfully expired events ending on ${admin.firestore.Timestamp.now()}.`);
    return true;
  })
  .catch((err) => {
    console.error(`Could not get or update documents. Error ${err}.`);
    return false;
  });
});
Below is the rest of the error. I tried with an empty collection and with a few documents, but I am starting to think that I am getting this error because none of them have expired yet?
Rest of error log
If you want to ignore undefined values, enable `ignoreUndefinedProperties`.
    at Object.validateUserInput (/workspace/node_modules/@google-cloud/firestore/build/src/serializer.js:277:19)
    at validateQueryValue (/workspace/node_modules/@google-cloud/firestore/build/src/reference.js:2230:18)
    at CollectionReference.where (/workspace/node_modules/@google-cloud/firestore/build/src/reference.js:1061:9)
    at /workspace/index.js:139:33
    at cloudFunction (/workspace/node_modules/firebase-functions/lib/cloud-functions.js:131:23)
    at /layers/google.nodejs.functions-framework/functions-framework/node_modules/@google-cloud/functions-framework/build/src/function_wrappers.js:144:25
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
You have at least two problems:
1. I think you want a JavaScript Date object (not a Firestore Timestamp) in your query, i.e. where('endDate', '<=', new Date()).
2. A Firestore Timestamp doesn't have a millisecondsSinceEpoch property, which is, I think, what's causing the "undefined" error you're encountering.
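Putting both fixes together, a minimal sketch of the function could look like this. It keeps your eid and guests fields as they are, compares endDate against a Date, and replaces the async forEach callbacks (which are never awaited) with for...of loops:
exports.expireEvents = functions
    .region("us-west2")
    .pubsub.schedule("* * * * *")
    .onRun(async (context) => {
      try {
        const snapshot = await db
            .collection("Events")
            .where("endDate", "<=", new Date())
            .get();

        for (const doc of snapshot.docs) {
          const event = doc.data();

          // Archive the expired event
          await db.collection("ArchivedEvents").doc(event.eid).set(event);

          // Update all guests of this event in a single batch
          const batch = db.batch();
          for (const uid of event.guests) {
            const guestRef = db.collection("Users").doc(uid);
            batch.update(guestRef, {
              eventsAttended: admin.firestore.FieldValue.arrayUnion(event.eid),
              eventsAttending: admin.firestore.FieldValue.arrayRemove(event.eid),
            });
          }
          await batch.commit();

          // Remove the event from the active collection
          await doc.ref.delete();
        }

        console.log(`Successfully expired events ending on ${admin.firestore.Timestamp.now().toDate()}.`);
        return true;
      } catch (err) {
        console.error(`Could not get or update documents. Error ${err}.`);
        return false;
      }
    });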
I see a similar question here. However, I am very new to coding, and I am trying to delete, from the “users” collection, all the documents that are older than 1 month and not premium. A document should be deleted when its “user_online” field is 30 days or more old and “user_premium” is “no”.
I am using Node.js with the Admin SDK.
I would be very grateful if anyone could help me with Node.js code to achieve the above.
Using some other posts, I came up with the following:
var userdelete_query = db.collection('users').where('user_online', '<=', new Date(Date.now() - 2592000000) && 'user_premium', '==', 'no');

userdelete_query.get().then(function(querySnapshot) {
  querySnapshot.forEach(function(doc) {
    doc.ref.delete();
    console.log(`deleted: ${doc.id}`);
  });
});
The above code shows no errors, but nothing happens. I think the following part is the problem:
&& 'user_premium', '==', 'no');
To combine two where clauses, you need to chain them, as explained in the doc.
var userdelete_query = db.collection('users')
    .where('user_online', '<=', new Date(Date.now() - 2592000000))
    .where('user_premium', '==', 'no');

userdelete_query.get().then(function(querySnapshot) {
  querySnapshot.forEach(function(doc) {
    doc.ref.delete();
  });
});
Note that you will need to create a composite index for this query to work.
Note that with the above code you don't know when all the docs are deleted. If you want to monitor the execution of all the parallel calls to the delete() method, you can use Promise.all() as follows:
const userdelete_query = db.collection('users')
    .where('user_online', '<=', new Date(Date.now() - 2592000000))
    .where('user_premium', '==', 'no');

userdelete_query.get()
    .then(function(querySnapshot) {
        const promises = [];
        querySnapshot.forEach(function(doc) {
            promises.push(doc.ref.delete());
        });
        return Promise.all(promises);
    })
    .then(function() {
        console.log("ALL DOCS ARE DELETED");
    });
A third possible approach would be to use a batched write (or better, a batched deletion).
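For example, a batched deletion could look roughly like this (sketch only; it assumes fewer than 500 matching documents per run, since a single batch is limited to 500 operations):
db.collection('users')
    .where('user_online', '<=', new Date(Date.now() - 2592000000))
    .where('user_premium', '==', 'no')
    .get()
    .then(function(querySnapshot) {
        const batch = db.batch();
        querySnapshot.forEach(function(doc) {
            batch.delete(doc.ref);
        });
        return batch.commit();
    })
    .then(function() {
        console.log("ALL DOCS ARE DELETED IN ONE BATCH");
    });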
I am using Firebase in my React Native app. I have a collection of users whose document IDs are custom (I use the users' phone numbers as IDs), and each document has a blocked-users array field in which I add the users blocked by that user. I want to show a list where the user can only see the users they have not blocked.
I am getting the full users list, and I want to filter it to fetch only the people not blocked by the current user.
var getUsersList = async () => {
  const findUser = await firestore().collection('users').get();
  if (findUser.docs[0] != undefined && findUser.docs[0]._exists) {
    setUserList(findUser.docs);
  }
};
I understand that your Firestore collection looks something like this: each user document is keyed by the user's phone number and contains a blocked_numbers array field.
If that is the case, then I have structured your requirement into the three functions below:
1. readBlockedNumbers returns the array of numbers blocked by the user.
2. show_nonBlockedUsers receives the array of blocked numbers from the previous function and displays the users who are not in it.
3. test coordinates the execution of the two functions above.
const admin = require('firebase-admin');
const serviceAccount = require('/home/keys.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});

const db = admin.firestore();

async function readBlockedNumbers(docId) {
  const userRef = db.collection('users').doc(docId);
  const doc = await userRef.get();
  if (!doc.exists) {
    console.log('No such document!');
    return [];
  } else {
    return doc.data()['blocked_numbers'];
  }
}

async function show_nonBlockedUsers(blocked_users) {
  console.log('Array length:', blocked_users.length);
  if (blocked_users.length == 0)
    return;
  const userRef = db.collection('users');
  const snapshot = await userRef
      .where(admin.firestore.FieldPath.documentId(), 'not-in', blocked_users)
      .get();
  if (snapshot.empty) {
    console.log('No matching documents.');
    return;
  }
  snapshot.forEach(doc => {
    console.log(doc.id, '=>', doc.data());
  });
}

async function test() {
  const docId = '202-555-0146';
  // '202-555-0102'
  const blocked_users = await readBlockedNumbers(docId);
  await show_nonBlockedUsers(blocked_users);
}
The important parts here are the not-in operator and the admin.firestore.FieldPath.documentId() method.
I found the not-in operator here, and the firebase.firestore.FieldPath.documentId() method is referenced in this other Stack Overflow question, since the document ID cannot be passed to the where clause like other document fields.
Please also refer to the Firebase documentation for the limitations of the not-in operator.
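For instance, not-in accepts at most 10 comparison values, so if a user can block more than 10 numbers the query above will fail. A simple fallback in that case (just a sketch, with a hypothetical function name) is to fetch the users and filter them client-side:
// Sketch: fallback when blocked_users has more than 10 entries,
// since 'not-in' is limited to 10 comparison values.
async function show_nonBlockedUsers_clientFilter(blocked_users) {
  const blocked = new Set(blocked_users);
  const snapshot = await db.collection('users').get();
  snapshot.forEach(doc => {
    if (!blocked.has(doc.id)) {
      console.log(doc.id, '=>', doc.data());
    }
  });
}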
I hope you find this useful.
I'd like to make a copy of a collection in Firestore upon an event, using Cloud Functions.
I already have this code that iterates over the collection and copies each document:
const firestore = admin.firestore()
firestore.collection("products").get().then(query => {
query.forEach(function(doc) {
var promise = firestore.collection(uid).doc(doc.data().barcode).set(doc.data());
});
});
Is there a shorter version, to just copy the whole collection at once?
I wrote a small Node.js snippet for this.
const firebaseAdmin = require('firebase-admin');
const serviceAccount = '../../firebase-service-account-key.json';
const firebaseUrl = 'https://my-app.firebaseio.com';
firebaseAdmin.initializeApp({
  credential: firebaseAdmin.credential.cert(require(serviceAccount)),
  databaseURL: firebaseUrl
});

const firestore = firebaseAdmin.firestore();

async function copyCollection(srcCollectionName, destCollectionName) {
  const documents = await firestore.collection(srcCollectionName).get();
  let writeBatch = firebaseAdmin.firestore().batch();
  const destCollection = firestore.collection(destCollectionName);
  let i = 0;
  for (const doc of documents.docs) {
    writeBatch.set(destCollection.doc(doc.id), doc.data());
    i++;
    if (i > 400) { // a write batch only allows a maximum of 500 writes per batch
      i = 0;
      console.log('Intermediate committing of batch operation');
      await writeBatch.commit();
      writeBatch = firebaseAdmin.firestore().batch();
    }
  }
  if (i > 0) {
    console.log('Firebase batch operation completed. Doing final committing of batch operation.');
    await writeBatch.commit();
  } else {
    console.log('Firebase batch operation completed.');
  }
}
copyCollection('customers', 'customers_backup').then(() => console.log('copy complete')).catch(error => console.log('copy failed. ' + error));
Currently, no. Looping through each document using Cloud Functions and then setting a new document to a different collection with the specified data is the only way to do this. Perhaps this would make a good feature request.
How many documents are we talking about? For something like 10,000 it should only take a few minutes, tops.
This is the method I use to copy data to another collection; I used it to move data (such as sales) from an active collection to a 'sells feed' or 'sells history' collection.
At the top I reference the documents, and at the bottom is the quite compact code.
You can simply add a for loop on top for more than one operation.
Hope it helps somebody :)
DocumentReference copyFrom = FirebaseFirestore.instance.collection('curSells').doc('0001');
DocumentReference copyTo = FirebaseFirestore.instance.collection('sellFeed').doc('0001');

copyFrom.get().then((value) {
  copyTo.set(value.data());
});
There is no fast way at the moment. I recommend you rewrite your code like this though:
import { firestore } from "firebase-admin";

async function copyCollection() {
  const products = await firestore().collection("products").get();
  // Use for...of so each set() is actually awaited
  // (forEach does not wait for async callbacks).
  for (const doc of products.docs) {
    await firestore().collection(uid).doc(doc.get('barcode')).set(doc.data());
  }
}