I am new to Firestore and still learning. On my learning path I have reached the section on events for metadata changes in the Firebase documentation.
It looks very useful, but I am unable to understand how to test it. This is the code from the documentation:
db.collection("cities").doc("SF")
.onSnapshot({
// Listen for document metadata changes
includeMetadataChanges: true
}, function(doc) {
// ...
});
I added a simple update command to see what happens, and it keeps updating every second. What is this listener trying to give me back, in what cases would I use it, and why is it updating every second?
firebase.firestore().collection("cities").doc("DC")
.onSnapshot({
// Listen for document metadata changes
includeMetadataChanges: true
}, function(doc) {
// ...
var docRef = firebase.firestore().collection('cities').doc('DC');
var updateTimestamp = docRef.update({
timestamp: firebase.firestore.FieldValue.serverTimestamp()
});
});
What you have actually created is a modern infinite loop.
You subscribed to snapshot changes of the DC document (for both data and metadata), which means you will receive every change to that document. As soon as you run it for the first time, the current data arrives as the initial snapshot of the subscription.
Then, inside the callback, you update the same document, so the subscription fires again, and again, and again.
firebase.firestore().collection("cities").doc("DC")
.onSnapshot({includeMetadataChanges: true}, (docSnapshot) => {
console.log(docSnapshot);
});
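To see what the metadata is actually giving you back, you could log the two flags it carries. This is just a sketch using the SnapshotMetadata properties fromCache and hasPendingWrites:

firebase.firestore().collection("cities").doc("DC")
  .onSnapshot({includeMetadataChanges: true}, (docSnapshot) => {
    // fromCache: true when the snapshot was served from the local cache
    // hasPendingWrites: true while a local change has not yet been acknowledged by the server
    console.log('fromCache:', docSnapshot.metadata.fromCache);
    console.log('hasPendingWrites:', docSnapshot.metadata.hasPendingWrites);
    console.log('data:', docSnapshot.data());
  });

A metadata-only transition (for example hasPendingWrites flipping from true to false) is exactly the kind of extra event that includeMetadataChanges: true makes visible.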
Related
How can I listen to a specific field change with the Firestore JS SDK?
In the documentation, they only seem to show how to listen to the whole document; if any field of the "SF" document changes, it will trigger the callback.
db.collection("cities").doc("SF")
.onSnapshot(function(doc) {
console.log("Current data: ", doc && doc.data());
});
You can't. All operations in Firestore are on an entire document.
This is also true for Cloud Functions Firestore triggers (you can only receive an entire document that's changed in some way).
If you need to narrow the scope of some data to retrieve from a document, place that in a document within a subcollection, and query for that document individually.
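For example, a minimal sketch of that idea (the stats subcollection and counters document names here are made up for illustration): keep the frequently-changing data in its own small document and listen to just that document.

// Hypothetical layout: cities/SF/stats/counters holds only the field you care about,
// so listeners on it are not triggered by changes to the rest of the city data.
var statsRef = firebase.firestore()
  .collection('cities').doc('SF')
  .collection('stats').doc('counters');

statsRef.onSnapshot(function(doc) {
  console.log('Counters changed:', doc.data());
});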
As Doug mentioned above, the entire document will be received in your function. However, I have created a filter function, which I named field, to ignore document changes when they happen in fields I am not interested in.
You can copy and use the function field linked above in your code. Example:
export const yourCloudFunction = functions.firestore
  .document('/your-path')
  .onUpdate(
    field('foo', 'REMOVED', (change, context) => {
      console.log('Will get here only if foo was removed');
    }),
  );
Important: the field function does not prevent your function from being executed when changes happen in other fields; it simply ignores the changes that are not the ones you want. If your document is too big, you should probably consider Doug's suggestion.
Listen to the document, then add a conditional on the field you're interested in:
firebase.firestore()
  .collection('Dictionaries').doc('Spanish')
  .collection('Words').doc(word)
  .collection('Pronunciations').doc('Castilian-female-IBM')
  .onSnapshot(function(snapshot) {
    if (snapshot.data().audioFiles) { // eliminates an error message
      if (snapshot.data().audioFiles.length === 2) {
        audioFilesReady++;
        if (audioFilesReady === 3) {
          $scope.showNextWord();
        }
      }
    }
  }, function(error) {
    console.error(error);
  });
I'm listening to a document for a voice (Castilian-female-IBM), which contains an array of audio files in webm and mp3 formats. When both of those audio files have come back asynchronously, snapshot.data().audioFiles.length === 2 and the counter is incremented. When two more voices come back (Castilian-male-IBM and Latin_American-female-IBM), audioFilesReady === 3 and the next function, $scope.showNextWord(), fires.
Out of the box, what I do is compare the data before and after the change using the before and after methods:
const clientDataBefore = change.before.data();
console.log("Info database before ", clientDataBefore);

const clientDataAfter = change.after.data();
console.log("Info database after ", clientDataAfter);
From there you can compare a specific field between the two snapshots and act on the change, or simply return. Some more about before.data() and after.data() here.
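As a rough sketch of that comparison (the clients path and the status field are just placeholders for your own data):

const functions = require('firebase-functions');

exports.onClientUpdate = functions.firestore
  .document('/clients/{clientId}')
  .onUpdate((change, context) => {
    const before = change.before.data();
    const after = change.after.data();

    // Only react when the example field 'status' actually changed
    if (before.status === after.status) {
      return null; // nothing we care about changed
    }

    console.log('status changed from', before.status, 'to', after.status);
    return null;
  });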
I want to set up a basic chat with Firestore, and while implementing it I noticed that the listener actually registers 2 events for 1 document being added.
Here is the code to add a document:
const chatRef = this.$fireStore.collection('chats/global/messages');
chatRef.add({
  displayName: 'Prof Dr Barcode',
  content: this.newMsg,
  timestamp: this.$fireStoreObj.FieldValue.serverTimestamp()
});
And my listener looks like this:
const chatRef = this.$fireStore.collection('chats/global/messages');
const initialRef = chatRef.limit(10).orderBy('timestamp');
initialRef.onSnapshot(querySnapshot => {
  querySnapshot.docChanges().forEach(change => {
    console.log(change.type);
    if (change.type === 'added') {
      this.messages.push(change.doc.data());
    }
  });
});
Now the change.type console log triggers twice for each document added: once with added and once with modified. While investigating, I saw that when I log the document in the listener it gets logged twice; with the first added event the timestamp field is null, then it instantly triggers again with modified and the timestamp field is populated.
I don't think this is how Firestore should behave, is it?
What you're observing is the expected behavior. This is because the Firestore client SDK writes the new document to the local cache first, immediately, and all listeners are notified of the change as an "added" change. Because you wrote a server timestamp, that final value gets computed after the document is synchronized to the server (not on the client), which results in a change to the timestamp field. That change gets synchronized back to the client, resulting in a "modified" change on the client.
If you remove the server timestamp, you should not see the additional modification, as the server now has nothing to add to the document that the client didn't already know about.
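If you want to keep the server timestamp, one possible approach (a sketch, not part of the original answer) is to use the snapshot metadata to tell local, not-yet-acknowledged events apart from server-confirmed ones:

// Same query as above; hasPendingWrites tells you whether the event
// came from the local cache or was confirmed by the server.
const chatRef = this.$fireStore.collection('chats/global/messages');
chatRef.limit(10).orderBy('timestamp').onSnapshot(querySnapshot => {
  querySnapshot.docChanges().forEach(change => {
    const source = change.doc.metadata.hasPendingWrites ? 'local' : 'server';
    console.log(change.type, 'from', source, change.doc.data());
  });
});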
I need to fetch the subset of documents in a Firestore collection that were modified after some moment. I tried going these ways:
It seems that native filtering can only work with real fields in the stored document; even though the Firestore API internally has DocumentSnapshot.getUpdateTime(), I cannot use this information in my query.
I tried adding a _lastModifiedAt 'service field' via a server-side Firestore Cloud Function, but ... updating _lastModifiedAt causes recursive invocation of the onWrite() function, so it also does not work as needed (the recursion finally stops with Error: quota exceeded (Function invocations : per 100 seconds)).
Are there other ideas for how to filter a collection by 'lastModifiedTime'?
Here is my 'cloud function' for reference
It would work if I could identify who is modifying the document, i.e. ignore my own updates of the _lastModified field, but I see no way to check for this.
_lastModifiedBy is set to null because of Firestore's current inability to provide auth information (see here).
const functions = require('firebase-functions');

exports.updateLastModifiedBy = functions.firestore.document('/{collId}/{documentId}').onWrite(event => {
  console.log(event.data.data());
  var now = new Date(); // time of this invocation
  var lastModified = {
    _lastModifiedBy: null,
    _lastModifiedAt: now
  };
  return event.data.ref.set(lastModified, {merge: true});
});
I've found a way to prevent recursion while updating '_lastModifiedAt'.
Note: this will not work reliably if the client can also update '_lastModifiedAt'. It does not matter much in my environment, but in the general case I think writing to '_lastModifiedAt' should be allowed only to service accounts.
exports.updateLastModifiedBy = functions.firestore.document('/{collId}/{documentId}').onWrite(event => {
  var doc = event.data.data();
  var prevDoc = event.data.previous.data();
  if (doc && prevDoc && (doc._lastModifiedAt != prevDoc._lastModifiedAt)) {
    // this is my own change
    return 0;
  }
  var lastModified = getLastModified(event);
  return event.data.ref.set(lastModified, {merge: true});
});
Update: Warning - updating lastModified in an onWrite() handler causes infinite recursion when trying to delete all documents in the Firebase console. This happens because onWrite() is also triggered for deletes, and writing lastModified into a deleted document actually resurrects it. The resurrected document propagates back into the console and is deleted once again, indefinitely (until the web page is closed).
To fix that issue, the code above has to be split into separate onCreate() and onUpdate() handlers, as sketched below.
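A rough sketch of that split, assuming the same pre-v1.0 functions API used above and the getLastModified helper defined elsewhere in my code:

// Stamping only on create and update avoids the onWrite/delete recursion,
// because deletes no longer trigger a write that resurrects the document.
exports.stampOnCreate = functions.firestore
  .document('/{collId}/{documentId}')
  .onCreate(event => {
    return event.data.ref.set(getLastModified(event), {merge: true});
  });

exports.stampOnUpdate = functions.firestore
  .document('/{collId}/{documentId}')
  .onUpdate(event => {
    var doc = event.data.data();
    var prevDoc = event.data.previous.data();
    // Skip the invocation caused by our own _lastModifiedAt write
    if (doc && prevDoc && (doc._lastModifiedAt != prevDoc._lastModifiedAt)) {
      return 0;
    }
    return event.data.ref.set(getLastModified(event), {merge: true});
  });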
How about letting the client write the timestamp with FieldValue.serverTimestamp() and then validating in security rules that the written value equals the request time?
Also see Mike's answer here for an example: Firestore Security Rules: If timestamp (FieldValue.serverTimestamp) equals now
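The client-side part of that idea could look like the sketch below (the document path is just an example; the rules side, not shown here, would then require that _lastModifiedAt equals the request time):

// Client writes the server timestamp itself; security rules can then
// reject any write where _lastModifiedAt is not the current server time.
var docRef = firebase.firestore().collection('cities').doc('DC');
docRef.update({
  name: 'Washington D.C.',
  _lastModifiedAt: firebase.firestore.FieldValue.serverTimestamp()
});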
You could try the following function, which will not update _lastModifiedAt if it has been marked as modified within the last 5 seconds. This should ensure that the function only runs once per update (as long as you don't update more than once every 5 seconds).
exports.updateLastModifiedBy = functions.firestore.document('/{collId}/{documentId}').onWrite(event => {
  console.log(event.data.data());
  var now = Date.now(); // current time in milliseconds
  // Skip the invocation triggered by our own write within the last 5 seconds
  if ((now - 5000) < event.data.data()._lastModifiedAt) { return null; }
  var lastModified = {
    _lastModifiedBy: null,
    _lastModifiedAt: now
  };
  return event.data.ref.set(lastModified, {merge: true});
});
I am trying to remove an item from $firebaseArray (boxes).
The remove function:
function remove(boxJson) {
  return boxes.$remove(boxJson);
}
It works; however, the item is immediately added back.
This is the method that retrieves the array:
function getBoxes(screenIndex) {
  var boxesRef = screens
    .child("s-" + screenIndex)
    .child("boxes");
  return $firebaseArray(boxesRef);
}
I thought perhaps I'm holding multiple references to the $firebaseArray, and when one deletes the item, the other adds it back; but then I thought Firebase should handle that, no?
Anyway, I'm lost on this. Any ideas?
UPDATE
When I hack it and delete twice (with a timeout) it seems to work:
function removeForce(screenIndex, boxId) {
  setTimeout(function () {
    API.removeBox(screenIndex, boxId);
  }, 1000);
  return API.removeBox(screenIndex, boxId);
}
and the API.removeBox:
function removeBox(screenIndex, boxId) {
  var boxRef = screens
    .child("s-" + screenIndex)
    .child("boxes")
    .child(boxId);
  return boxRef.remove();
}
When you remove something from Firebase, the operation is asynchronous. Per the docs, the proper way to remove an item from Firebase using AngularFire is:
var obj = $firebaseObject(ref);
obj.$remove().then(function(ref) {
  // data has been deleted locally and in the database
}, function(error) {
  console.log("Error:", error);
});
$remove() ... Removes the entire object locally and from the database. This method returns a promise that will be fulfilled when the data has been removed from the server. The promise will be resolved with a Firebase reference for the exterminated record.
Link to docs: https://www.firebase.com/docs/web/libraries/angular/api.html#angularfire-firebaseobject-remove
The most likely cause is that you have a security rule that disallows the deletion.
When you call boxes.$remove Firebase immediately fires the child_removed event locally, to ensure the UI is updated quickly. It then sends the command to the Firebase servers to check it and update the database.
On the server a security rule disallows this deletion, so the server sends an "it failed" response back to the client, which then raises a child_added event to restore the item in the UI.
Apparently I was saving the items again after deleting them. Clearly my mistake:
function removeSelected(boxes) {
  var selectedBoxes = Selector.getSelectedBoxes(boxes);
  angular.forEach(selectedBoxes, function (box) {
    BoxManager.remove(box);
  });
  Selector.clearSelection(boxes, true);
}
In the clearSelection method I was updating a field on the boxes and saving them again.
Besides the obvious mistake, this was a lesson for me on how to work with Firebase: if some part of the system keeps a copy of your deleted item, saving that copy won't raise an error, it will simply revive the deleted item.
For those who have a similar issue but haven't solved it yet: there are two methods for listening to events, .on() and .once(). In my case that was the cause of the problem.
I was working on a migration procedure that should run once:
writeRef
  .orderByChild('text_hash')
  .equalTo(addItem.text_hash)
  .on('value', val => { // <--
    if (!val.exists()) {
      writeRef.push(addItem)
    }
  });
So the problem was precisely the .on() method: it fires again after every data manipulation, including ones made from the Firebase console.
Changing it to .once() solved that.
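For reference, a sketch of the same migration check using .once(), keeping the writeRef and addItem variables from the snippet above:

// .once() reads the value a single time instead of re-firing on every change,
// so later edits from the Firebase console no longer re-run the migration.
writeRef
  .orderByChild('text_hash')
  .equalTo(addItem.text_hash)
  .once('value')
  .then(val => {
    if (!val.exists()) {
      writeRef.push(addItem);
    }
  });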
I need to keep track of the count of documents in a collection that has a huge number of documents and is constantly being updated (think a giant list of logs). What I don't want is for the server to send me a list of 250k documents; I just want to see a counter rising.
I found a very similar question here, and I've also looked into .observeChanges() in the docs, but once again it seems that .observe() as well as .observeChanges() actually return the whole set before tracking what's been added, changed, or deleted.
In the linked example, the "added" callback will fire once for every document returned in order to increment a counter.
This is unacceptable with a large set; I only want to keep track of changes in the count, as I understand .count() bypasses fetching the entire set of documents. The linked example also counts only documents related to a room, which isn't what I want (nor was I able to reproduce it and get it working, for that matter).
I've gotta be missing something simple, I've been stumped for hours.
Would really appreciate any feedback.
You could accomplish this with the meteor-streams smart package by Arunoda. It lets you do pub/sub without needing the database, so one thing you could send over is a reactive number, for instance.
Alternatively, and this is slightly more hacky but useful if you've got a number of things you need to count or something similar, you could have a separate "Statistics" collection (name it whatever) with a document containing that count.
There is an example in the documentation about this use case. I've modified it to your particular question:
// server: publish the current size of a collection
Meteor.publish("nbLogs", function () {
  var self = this;
  var count = 0;
  var initializing = true;
  // "logs" is used as a fixed document id below, since we count the
  // whole collection rather than per-room documents.
  var handle = Messages.find({}).observeChanges({
    added: function (id) {
      count++;
      if (!initializing)
        self.changed("counts", "logs", {nbLogs: count});
    },
    removed: function (id) {
      count--;
      self.changed("counts", "logs", {nbLogs: count});
    }
    // don't care about moved or changed
  });

  // Observe only returns after the initial added callbacks have
  // run. Now return an initial value and mark the subscription
  // as ready.
  initializing = false;
  self.added("counts", "logs", {nbLogs: count});
  self.ready();

  // Stop observing the cursor when client unsubs.
  // Stopping a subscription automatically takes
  // care of sending the client any removed messages.
  self.onStop(function () {
    handle.stop();
  });
});
// client: declare collection to hold count object
Counts = new Meteor.Collection("counts");

// client: subscribe to the log count
Meteor.subscribe("nbLogs");

// client: use the new collection (guard against the doc not having arrived yet)
Deps.autorun(function() {
  var counts = Counts.findOne();
  if (counts)
    console.log("nbLogs: " + counts.nbLogs);
});
There might be some higher level ways to do this in the future.