I have a Realtime Database with the following structure:
-- test-b7a6b
   -- locations
      -- 0
         -- logs
            -- alarm_2a330b56-c1b8-4720-902b-df89b82ae13a
            ...
         -- devices
         -- deviceTokens
      -- 1
      -- 2
I am using a Firebase function that gets executed when a new log is written:
let functions = require('firebase-functions');
let admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

exports.sendPush = functions.database.ref('/locations/0/logs/{devicelogs}/{logs}').onWrite((change, context) => {
  let logsData = change.after.val();
  loadUsers();
  getBody(deviceTypeId);
  //managing the results
});
I have other functions that I want to reference the same location as the one where the new log was written:
function loadUsers() {
  let dbRef = admin.database().ref('/locations/0/deviceTokens');
  //managing users
}

function getBody(deviceTypeId) {
  let devicesDB = admin.database().ref('/locations/0/devices');
  //managing devices
}
Putting the location manually in all 3 functions makes it work just fine, but I don't know how to make it listen for the same event on all locations (0, 1 and 2) and possibly more locations in the future.
So my question: is there a way I can get the location key when a log gets written to any location, so I can pass it to the other functions?
To listen to all locations, use a parameter in the path that triggers the function:
exports.sendPush = functions.database.ref('/locations/{location}/logs/{devicelogs}/{logs}').onWrite((change, context) => {
You can then get the parameter from context.params and pass it on:
exports.sendPush = functions.database.ref('/locations/{location}/logs/{devicelogs}/{logs}').onWrite((change, context) => {
  let logsData = change.after.val();
  loadUsers(context.params.location);
  getBody(context.params.location, deviceTypeId);
  //managing the results
});
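The helper functions can then build their database paths from the location they receive. A minimal sketch, assuming the same helper names as in the question (the extra parameter is the only change):

function loadUsers(location) {
  // Build the path from the location that triggered the event
  let dbRef = admin.database().ref(`/locations/${location}/deviceTokens`);
  //managing users
}

function getBody(location, deviceTypeId) {
  let devicesDB = admin.database().ref(`/locations/${location}/devices`);
  //managing devices
}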
Also see the Cloud Functions for Firebase documentation on handling event data.
Transactions and batched writes can be used to write multiple documents by means of an atomic operation.
The documentation says that "Using the Cloud Firestore client libraries, you can group multiple operations into a single transaction."
I cannot understand what the meaning of client libraries is here, and whether it's correct to use transactions and batched writes within a Cloud Function.
Example given: suppose in the database I have 3 documents (with doc IDs A, B, C). Now I need to insert 3 more (with doc IDs C, D, E). The Cloud Function should add just the new ones and send a push notification to the user telling him that 2 new documents are available.
A doc ID could be the same, but since I need to calculate how many documents are new (the ones that will be inserted), I need a way to read each doc ID first and check for its existence. Hence, I'm wondering whether transactions fit Cloud Functions or not.
Also, each transaction or batch of writes can write to a maximum of 500 documents. Is there any other way to overcome this limit within a Cloud Function?
Firestore transaction behaviour differs between the client SDKs (JS SDK, iOS SDK, Android SDK, ...) and the Admin SDK (a set of server libraries), which is the SDK used in a Cloud Function. More explanation on the differences can be found in the documentation.
Because of the type of data contention used in the Admin SDK, you can, with the getAll() method, retrieve multiple documents from Firestore and hold a pessimistic lock on all returned documents.
So this is exactly the method you need to call in your transaction: you use getAll() to fetch documents C, D and E, detect that only C exists, and therefore know that you only need to add D and E.
Concretely, it could be something along the following lines:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const db = admin.firestore();

exports.lorenzoFunction = functions
  .region('europe-west1')
  .firestore
  .document('tempo/{docId}') //Just a way to trigger this test Cloud Function!!
  .onCreate(async (snap, context) => {
    const c = db.doc('coltest/C');
    const d = db.doc('coltest/D');
    const e = db.doc('coltest/E');
    const docRefsArray = [c, d, e];

    return db.runTransaction(transaction => {
      // getAll() locks all returned documents for the duration of the transaction
      return transaction.getAll(...docRefsArray).then(snapsArray => {
        let counter = 0;
        snapsArray.forEach(snap => {
          if (!snap.exists) {
            counter++;
            transaction.set(snap.ref, { foo: "bar" });
          } else {
            console.log(snap.id + " exists");
          }
        });
        console.log(counter);
        return;
      });
    });
  });
To test it: create one of the C, D or E docs in the coltest collection, then create a doc in the tempo collection (just a simple way to trigger this test Cloud Function): the CF is triggered. Then look at the coltest collection: the two missing docs were created; and look at the CF log: counter = 2.
"Also, each transaction or batch of writes can write to a maximum of 500 documents. Is there any other way to overcome this limit within a Cloud Function?"
AFAIK the answer is no.
There also used to be a one-second delay required between 500-record chunks. I wrote this a couple of years ago. The script below reads the CSV file line by line, filling a batch object for each line. A counter starts a new batch write per 500 objects, and async/await is used to rate-limit the writes to 1 per second. Lastly, we notify the user of the write progress with console logging. I published an article on this here >> https://hightekk.com/articles/firebase-admin-sdk-bulk-import
NOTE: In my case I am reading a huge flat text file (a manufacturer's part number catalog) for import. You can use this as a working template, though, and modify it to suit your data source. Also, you may need to increase the memory allocated to Node for this to run:
node --max_old_space_size=8000 app.js
The script looks like:
var admin = require("firebase-admin");
var serviceAccount = require("./your-firebase-project-service-account-key.json");
var fs = require('fs');
var csvFile = "./my-huge-file.csv";
var parse = require('csv-parse');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://your-project.firebaseio.com"
});

var firestore = admin.firestore();
var thisRef;
var counter = 0;
var commitCounter = 0;
var batches = [];
batches[commitCounter] = firestore.batch();

fs.createReadStream(csvFile).pipe(
  parse({delimiter: '|', relax_column_count: true, quote: ''})
).on('data', function(csvrow) {
  // Start a new batch before we hit the 500-write limit,
  // then fall through so the current row is still written
  if (counter > 498) {
    counter = 0;
    commitCounter = commitCounter + 1;
    batches[commitCounter] = firestore.batch();
  }
  // Build a fresh object per row so fields from previous rows don't leak in
  var obj = {};
  if (csvrow[1]) { obj.family = csvrow[1]; }
  if (csvrow[2]) { obj.series = csvrow[2]; }
  if (csvrow[3]) { obj.sku = csvrow[3]; }
  if (csvrow[4]) { obj.description = csvrow[4]; }
  if (csvrow[6]) { obj.price = csvrow[6]; }
  thisRef = firestore.collection("your-collection-name").doc();
  batches[commitCounter].set(thisRef, obj);
  counter = counter + 1;
}).on('end', function() {
  writeToDb(batches);
});

function oneSecond() {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve('resolved');
    }, 1010);
  });
}

async function writeToDb(arr) {
  console.log("beginning write");
  for (let i = 0; i < arr.length; i++) {
    // Wait one second between commits to rate-limit the writes
    await oneSecond();
    await arr[i].commit();
    console.log("wrote batch " + i);
  }
  console.log("done.");
}
I'm using Firebase as the backend for my iOS app and can't figure out how to construct a batch write through their Cloud Functions.
I have two collections in my Firestore, drinks and customers. Each new drink and each new customer is assigned a userId property that corresponds to the uid of the currently logged-in user. This userId is used with a query to Firestore to fetch only the drinks and customers connected to the logged-in user, like so: Firestore.firestore().collection("customers").whereField("userId", isEqualTo: Auth.auth().currentUser.uid)
Users are able to log in anonymously and also subscribe while anonymous. The problem is that if they log out, there's no way to log back in to the same anonymous uid. The uid is also stored as an appUserID with the RevenueCat SDK so I can still access it, but since I can't log the user back in to their anonymous account using the uid, the only way to help a user access their data after a restore of purchases is to update the userId field of their data from the old uid to the new uid. This is where the need for a batch write comes in.
I'm relatively new to programming in general, but I'm super fresh when it comes to Cloud Functions, JavaScript and Node.js. I dove around the web though and thought I found a solution where I make a callable Cloud Function and send both the old and new userID in the data object, query the collections for documents with the old userID, and update their userId fields to the new one. Unfortunately it's not working and I can't figure out why.
Here's what my code looks like:
// Cloud Function
exports.transferData = functions.https.onCall((data, context) => {
  const firestore = admin.firestore();
  const customerQuery = firestore.collection('customers').where('userId', '==', `${data.oldUser}`);
  const drinkQuery = firestore.collection('drinks').where('userId', '==', `${data.oldUser}`);

  const customerSnapshot = customerQuery.get();
  const drinkSnapshot = drinkQuery.get();

  const batch = firestore.batch();

  for (const documentSnapshot of customerSnapshot.docs) {
    batch.update(documentSnapshot.ref, { 'userId': `${data.newUser}` });
  }

  for (const documentSnapshot of drinkSnapshot.docs) {
    batch.update(documentSnapshot.ref, { 'userId': `${data.newUser}` });
  }

  return batch.commit();
});
// Call from app
func transferData(from oldUser: String, to newUser: String) {
    let functions = Functions.functions()
    functions.httpsCallable("transferData").call(["oldUser": oldUser, "newUser": newUser]) { _, error in
        if let error = error as NSError? {
            if error.domain == FunctionsErrorDomain {
                let code = FunctionsErrorCode(rawValue: error.code)
                let message = error.localizedDescription
                let details = error.userInfo[FunctionsErrorDetailsKey]
                print(code)
                print(message)
                print(details)
            }
        }
    }
}
This is the error message from the Cloud Functions log:
Unhandled error TypeError: customerSnapshot.docs is not iterable
at /workspace/index.js:22:51
at fixedLen (/workspace/node_modules/firebase-functions/lib/providers/https.js:66:41)
at /workspace/node_modules/firebase-functions/lib/common/providers/https.js:385:32
at processTicksAndRejections (internal/process/task_queues.js:95:5)
From what I understand, customerSnapshot is something called a Promise, which I'm guessing is why I can't iterate over it. By now I'm in way too deep for my sparse knowledge and don't know how to handle these Promises returned by the queries.
I guess I could just force users to create a login before they subscribe, but that feels like a coward's way out now that I've come this far. I'd rather have both options available and make a decision instead of going down a forced path. Plus, I'll learn some more JavaScript if I figure this out!
Any and all help is greatly appreciated!
EDIT:
Solution:
// Cloud Function
exports.transferData = functions.https.onCall(async (data, context) => {
  const firestore = admin.firestore();
  const customerQuery = firestore.collection('customers').where('userId', '==', `${data.oldUser}`);
  const drinkQuery = firestore.collection('drinks').where('userId', '==', `${data.oldUser}`);

  const customerSnapshot = await customerQuery.get();
  const drinkSnapshot = await drinkQuery.get();

  const batch = firestore.batch();

  for (const documentSnapshot of customerSnapshot.docs.concat(drinkSnapshot.docs)) {
    batch.update(documentSnapshot.ref, { 'userId': `${data.newUser}` });
  }

  return batch.commit();
});
As you already guessed, the call customerQuery.get() returns a promise.
In order to understand what you need, you should first get familiar with the concept of promises here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
For your use case, you will probably end up either using the then callback:
customerQuery.get().then((result) => {
  // now you can access the result
});
or awaiting the promise with the await keyword inside an async function:
const result = await customerQuery.get();
// now you can access the result
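Putting the two together for this use case, a then-based version of the function could look something like the sketch below (same collection and field names as in the question; Promise.all runs both queries in parallel and waits for both results):

exports.transferData = functions.https.onCall((data, context) => {
  const firestore = admin.firestore();
  const customerQuery = firestore.collection('customers').where('userId', '==', data.oldUser);
  const drinkQuery = firestore.collection('drinks').where('userId', '==', data.oldUser);

  // Wait for both query snapshots before building the batch
  return Promise.all([customerQuery.get(), drinkQuery.get()])
    .then(([customerSnapshot, drinkSnapshot]) => {
      const batch = firestore.batch();
      for (const documentSnapshot of customerSnapshot.docs.concat(drinkSnapshot.docs)) {
        batch.update(documentSnapshot.ref, { userId: data.newUser });
      }
      return batch.commit();
    });
});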
I have three Firebase database trigger functions to push notifications to users. However, I have noticed that .onCreate() gets triggered on database updates and deletes. Is this expected? Is there a way to prevent this?
const functions = require('firebase-functions')

exports.onNoteCreate = functions
  .region('europe-west1')
  .database
  .ref('/notes/{noteId}')
  .onCreate((snapshot, context) => {
    ...
    //Push notification to affected users
    //Compose notification object
    const notificationObject = { "test": true }
    membersToAlert.forEach((memberId, index) => {
      let isAlreadyPresent = false
      //Do not write if already present! - This code should not be needed?
      const ref = snapshot.ref.root.child(`/notes/${personId}/noteAdditions`)
      ref.orderByChild('originId')
        .equalTo(noteId)
        .on("value", (removeSnapshot) => {
          isAlreadyPresent = true
        })
      //Write notification to all affected users
      if(!isAlreadyPresent) {
        snapshot.ref.root.child(`/notifications/${personId}/noteAdditions`).push(notificationObject)
      }
    })
    return true
  })
My .onUpdate() and .onDelete() triggers are also listening to .ref('/notes/{noteId}'). Is that a problem?
How can I make sure .onCreate() only gets triggered when a new object is inserted?
EDIT:
My testing workflow is as follows:
1. Create a new node in /notes using .push() -> works as expected
2. Update the same node using .update() -> works as expected
3. Delete the node in /notes/{noteId} directly from the Firebase Console
Step 3 triggers both .onCreate() and .onUpdate(). See log below:
I 2019-08-12T17:17:25.867Z onNoteCreate 670055884755913 onNoteCreate ... onNoteCreate 670055884755913
I 2019-08-12T17:17:26.053Z onNoteUpdate 670048941917608 onNoteUpdate ... onNoteUpdate 670048941917608
D 2019-08-12T17:17:26.843878505Z onNoteDelete 670054292162841 Function execution started onNoteDelete 670054292162841
D 2019-08-12T17:17:26.849773576Z onNoteDelete 670054292162841 Function execution took 7 ms, finished with status: 'ok' onNoteDelete 670054292162841
Database before delete
-notifications
   -userId
      -noteAdditions
         -guid01
            -notificationData
      -noteUpdates
         -guid03
            -notificationData
Database after delete
//guid01 gets deleted by .onDelete() as expected
//guid03 gets deleted by .onDelete() as expected
-notifications
   -userId
      -noteAdditions
         -guid02
            -notificationData //Inserted by .onCreate() upon delete
      -noteUpdates
         -guid04
            -notificationData //Inserted by .onUpdate() upon delete
The listeners are attached to /notes/{noteId} and updates are being made at /notifications/{userId}/...
onNoteCreate
exports.onNoteCreate = functions
  .region('europe-west1')
  .database
  .ref('/notes/{noteId}')
  .onCreate((snapshot, context) => {
    ...
    snapshot.ref.root.child(`/notifications/${personId}/noteAdditions`).push(notificationObject)
    ...
    console.log('onNoteCreate', '...')
    ...
  })
onNoteUpdate
exports.onNoteUpdate = functions
  .region('europe-west1')
  .database
  .ref('/notes/{noteId}')
  .onUpdate((change, context) => {
    ...
    change.after.ref.root.child(`/notifications/${personId}/noteUpdates`).push(notificationObject)
    ...
    console.log('onNoteUpdate', '...')
    ...
  })
Does it matter that I import the functions like so?
const create = require('./src/db-functions/notes').onNoteCreate
const update = require('./src/db-functions/notes').onNoteUpdate
const del = require('./src/db-functions/notes').onNoteDelete // 'delete' is a reserved word in JavaScript

exports.onNoteCreate = create
exports.onNoteUpdate = update
exports.onNoteDelete = del
I failed to include the code in my example where I call:
//Get user data - also updated by onNoteCreate, onNoteUpdate and onNoteDelete
dbRoot.child(`users/${userId}`)
  .on('value', (snapshot) => {
.on() attached a listener that was triggered each time the value was updated, thus running the code in "onNoteCreate", "onNoteUpdate" and "onNoteDelete" when not expected. I should have used .once(), since I did not actually want to attach a persistent listener; see the sketch after the link below.
All credit to @Doug for pointing this out to me in this post:
Firebase database trigger - how to wait for calls
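For reference, a minimal sketch of the corrected read, using the same users path as above:

// .once() reads the value a single time and returns a promise,
// rather than attaching a persistent listener like .on()
dbRoot.child(`users/${userId}`)
  .once('value')
  .then((snapshot) => {
    // work with snapshot.val() here, inside the function's promise chain
  })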
I am trying to implement a push notification trigger using Cloud Functions in Firebase, but each time I try, the val() function returns null. It is not recognizing the pushId. I write to the database from Android using the push() method.
This is my database structure:
And this is my code for push notifications whenever a Post is created:
//import firebase functions modules
const functions = require('firebase-functions');
//import admin module
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

// Listens for new messages added to messages/:pushId
exports.pushNotification = functions.database.ref('/Posts/Posts/{pushId}').onWrite( event => {
  console.log('Push notification event triggered');

  // Grab the current value of what was written to the Realtime Database.
  var valueObject = event.data.val();
  // if(valueObject.photoUrl != null) {
  //   valueObject.photoUrl = "Sent you a photo!";
  // }

  // Create a notification
  const payload = {
    notification: {
      title: valueObject.tittle,
      body: valueObject.details,
      sound: "default"
    },
  };

  //Create an options object that contains the time to live for the notification and the priority
  const options = {
    priority: "high",
    timeToLive: 60 * 60 * 24
  };

  return admin.messaging().sendToTopic("pushNotifications", payload, options);
});
This is the error in the Cloud Functions console:
Edited after using onCreate:
exports.pushNotification = functions.database.ref('/Posts/Posts/{pushid}').onCreate((snap, context) => {
  // The callback parameter is named snap, so read it with snap.val();
  // the original edit had "const original = snapshot.val();", which throws
  // a ReferenceError because snapshot is not defined here.
  console.log('Push notification event triggered');

  // Grab the current value of what was written to the Realtime Database.
  var valueObject = snap.val();
  // if(valueObject.photoUrl != null) {
  //   valueObject.photoUrl = "Sent you a photo!";
  // }

  // Create a notification
  const payload = {
    notification: {
      title: valueObject.tittle,
      body: valueObject.details,
      sound: "default"
    },
  };
It looks like you haven't adapted your code to the new Functions 1.0 SDK. The differences are detailed here: https://firebase.google.com/docs/functions/beta-v1-diff
As you can see from that doc in the Realtime Database section, onWrite triggers now give you a Change object with before and after properties that you use to get the value of the database location before or after the update.
Also consider if you actually want an onCreate trigger instead, which is more straightforward to deal with, and only triggers once when data at the matching location is newly created.
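For illustration, the function from the question adapted to the 1.0 SDK with an onCreate trigger might look roughly like this (a sketch; the path and the field names tittle and details are taken from the question):

exports.pushNotification = functions.database.ref('/Posts/Posts/{pushId}')
  .onCreate((snap, context) => {
    // snap is a DataSnapshot of the newly created data
    const valueObject = snap.val();

    const payload = {
      notification: {
        title: valueObject.tittle,
        body: valueObject.details,
        sound: "default"
      }
    };
    const options = {
      priority: "high",
      timeToLive: 60 * 60 * 24
    };

    // Return the promise so the function waits for the send to complete
    return admin.messaging().sendToTopic("pushNotifications", payload, options);
  });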
How can I remove a ref after my function has finished running? Is it necessary? I want my function to run as quickly as possible, and don't want "things" piling up.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const _ = require('lodash'); // lodash is used below but was not required in the original
admin.initializeApp(functions.config().firebase);

exports.myFunction = functions.database.ref('/path/{uid}').onWrite(event => {
  const ref = event.data.adminRef.root.child('something').child(event.params.uid);
  const data = event.data.val(); // the written value, read by _.get() below
  return ref.transaction(current => {
    if (event.data.exists() && !event.data.previous.exists()) {
      return _.toInteger(current) + _.toInteger(_.get(data, 'value', 0));
    }
  }).then(() => {
    return null; // Avoid "Error serializing return value: TypeError: Converting circular structure to JSON"
  });
});
A DatabaseReference is not something you can "remove". It is just a pointer to a location in your database. The documentation has a page for it:
https://firebase.google.com/docs/reference/admin/node/admin.database.Reference
The only thing you can remove/detach is a callback you registered with ref.on(...), by calling ref.off(...), but there is no such callback in your code, and ref.once() should get the job done most of the time in Cloud Functions.
To be clear: ref.transaction() calls do not have to be detached; they run once, i.e. there is no persistent callback. The same goes for ref.set() and ref.once().
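To illustrate the difference, a minimal sketch (the /counters/{uid} location is hypothetical):

const ref = admin.database().ref('/counters/' + uid);

// A transaction runs once and returns a promise; there is nothing to detach
ref.transaction(current => (current || 0) + 1);

// A one-time read also returns a promise and needs no cleanup
ref.once('value').then(snapshot => console.log(snapshot.val()));

// Only .on() attaches a persistent listener, which must be detached with .off()
const callback = snapshot => console.log(snapshot.val());
ref.on('value', callback);
ref.off('value', callback);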