As I understand it, when the following line of code is interpreted/executed by JavaScript:
ref.on('value',callback)
(similar to document.addEventListener('click', callback)), the callback gets attached to the element/object for that event, so that when the event occurs the attached callback (event handler) is fired.
But I observe that the Firebase 'value' event fires automatically when there is some data at this ref, as soon as the above line of code is executed, even though NO trigger such as an add/delete/modify operation has happened at that ref.
Is this interpretation/assumption correct, or does the value event work just like any other event that is triggered by add/delete/modify operations? In that case, what would the trigger be?
Also, if the value event fires automatically, does it actually make an async/network call to the Firebase database for that ref and fetch the data (snapshot), or is the ref's data cached on the client side, i.e. no async/network request?
Can anybody clarify both of these points? Your help is appreciated.
According to the documentation:
You can use the value event to read a static snapshot of the contents
at a given path, as they existed at the time of the event. This method
is triggered once when the listener is attached and again every time
the data, including children, changes. The event callback is passed a
snapshot containing all data at that location, including child data.
If there is no data, the snapshot will return false when you call
exists() and null when you call val() on it.
When you attach a listener, the SDK uses its persistent connection to the database to check whether there is new data. If there is no new data, the locally cached data is provided.
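In other words, the callback fires once right after the listener is attached, with whatever data currently exists at the ref, and then again on every subsequent change. A minimal sketch of this (the database setup and 'some/path' are placeholders, not from the question):

// The callback runs once as soon as the listener is attached,
// with the current contents of the ref, and again on every later change.
var ref = firebase.database().ref('some/path'); // placeholder path

ref.on('value', function (snapshot) {
  if (snapshot.exists()) {
    console.log('current data:', snapshot.val());
  } else {
    console.log('no data at this ref yet');
  }
});

// Detach the listener when you no longer need updates:
// ref.off('value');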
I'm trying to use a document snapshot listener for Firebase Firestore. I want to perform some action based on the current document value from the server, but also listen for changes to the document and enable offline cache when possible.
The listener works to update a state when the document changes, but for some reason it always operates from a previous cache of what it saw when it was last listening:
let unsub = firebase.firestore().collection('myCol').doc('myDoc').onSnapshot((doc) => {
  if (doc.data().myVal) myFunction(); // myVal is always what the last listener thought it was, not updated from the current server value
});
So if I then call unsub() and make a change to the document in the console, the next time the listener is started up it returns the last cached value from when it was previously listening, instead of the first load coming from the server.
How can I force the listener to get the first value from the server instead of its old local cache?
The only way I can currently force the listener logic to load from the server first is by manually triggering a get() on the document beforehand. This simply updates the local cache with any changes that happened while the listener wasn't listening.
If you're having similar issues with your code, add this before setting the listener logic:
await firebase.firestore().collection('myCol').doc('myDoc').get({source: 'server'}).catch(e => {});
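Putting it together, the pattern might look like the sketch below. myFunction and the collection/document names come from the question; the error handling and exists check are just guesses at sensible defaults:

const docRef = firebase.firestore().collection('myCol').doc('myDoc');

async function startListening() {
  // Force one read from the server to refresh the local cache.
  // Errors (e.g. being offline) are ignored so the listener still attaches.
  await docRef.get({ source: 'server' }).catch(() => {});

  // Now the first snapshot reflects the refreshed cache rather than
  // whatever was left over from the previous listening session.
  const unsub = docRef.onSnapshot((doc) => {
    if (doc.exists && doc.data().myVal) myFunction();
  });
  return unsub; // call unsub() later to stop listening
}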
I'm using fullcalendar (the v4 alpha) to arrange events.
I have an eventDragStop callback that I'm trying to use to set an extendedProp for an event, marking that the event has been altered.
eventDragStop: function (info) {
calendar.getEventById(info.event.id).setExtendedProp("extra2", true)
}
Using the code above, it doesn't work. If I alert(info.event.id), I can see that the correct ID is being passed for the event that was dragged, and I get no errors.
If I have three events on the calendar, with IDs: 1, 2, 3, and use the following code:
eventDragStop: function (info) {
calendar.getEventById(1).setExtendedProp("extra2", true)
}
So here I'm explicitly stating to change event ID 1, rather than the event from the callback.
If I drag event number 1, this doesn't work either. However, if I drag event 2 or 3, it will work and change event 1.
Likewise, for any event I explicitly specify, it is able to change that event, provided it was not also the event that triggered the eventDragStop callback.
Can anyone tell me why this is?
https://fullcalendar.io/docs/v4/eventDragStop says (of itself as a callback)
"It is triggered before the event’s information has been modified"
So I think what is happening here is that fullCalendar effectively overwrites any change you make to the event data during this callback.
I think this is because the event object maybe gets replaced with a new version (constructed based on its final resting place) some time after this callback runs.
I haven't verified this by looking at the source code, but it's a logical explanation for the issue you're seeing. It also makes sense that the event object would be updated (with new dates/times etc.) after dragging is complete, and that this might in fact involve a full refresh of the object's data at that time.
Anyway, that's why dragging event 1 fails to persist any updates to event 1's other data, while dragging event 2 or 3 lets you persist changes to event 1: in that case event 1's data is not replaced later as a result of the drag being completed.
Instead of using eventDragStop, you should modify the event during eventDrop (https://fullcalendar.io/docs/v4/eventDrop) instead. That callback fires after FullCalendar has completely finished processing the drag/drop and has updated the event times, so I would expect any further changes you make to the event data to be preserved.
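For example, the handler from the question could be moved to eventDrop, something like this sketch (extra2 comes from the question; here the event is taken straight from the callback's info object rather than looked up by ID):

eventDrop: function (info) {
  // By this point FullCalendar has already updated the event's dates,
  // so changes made here should not be overwritten afterwards.
  info.event.setExtendedProp("extra2", true);
}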
I have a large data set (~100k entries) that is being subscribed to using the 'child_added' event. Using Node 7 and Firebase 3.6.1, doing this seems to download all 100k entries before a single child_added event is fired.
Memory consumption grows significantly for a few dozen seconds, and then all the child_added events fire in quick succession.
This is slow:
require('firebase')
  .initializeApp({databaseURL: 'https://someproject.firebaseio.com'})
  .database().ref('data')
  .on('child_added', (snap) => console.log(snap.key));
Limiting is still fast (few seconds delay):
require('firebase')
  .initializeApp({databaseURL: 'https://someproject.firebaseio.com'})
  .database().ref('data').limitToFirst(10)
  .on('child_added', (snap) => console.log(snap.key));
Given the streaming nature of Firebase, I assume it is not intended behaviour for child_added subscriptions to download the entire data set to the client before anything is done.
Am I doing something wrong, or is this a bug?
Although the child_added section of the Firebase documentation says:
The child_added event is typically used when retrieving a list of items from the database. Unlike value which returns the entire contents of the location, child_added is triggered once for each existing child and then again every time a new child is added to the specified path. The event callback is passed a snapshot containing the new child's data. For ordering purposes, it is also passed a second argument containing the key of the previous child.
In the first lines of that page, we can find this:
Data stored in a Firebase Realtime Database is retrieved by attaching an asynchronous listener to a database reference. The listener will be triggered once for the initial state of the data and again anytime the data changes.
This seems to be its normal behaviour: it first retrieves all the existing data.
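If downloading everything up front is the problem, one workaround (not from the original answer, just a rough sketch building on the limitToFirst call in the question) is to page through the children with orderByKey()/startAt() instead of attaching child_added to the whole ref:

// Untested sketch: fetch the children in pages of PAGE_SIZE keys.
const firebase = require('firebase');
// Assumes initializeApp({...}) has already been called as in the question.
const ref = firebase.database().ref('data');
const PAGE_SIZE = 100; // placeholder batch size

function fetchPage(startKey) {
  let query = ref.orderByKey().limitToFirst(PAGE_SIZE + 1);
  if (startKey) query = query.startAt(startKey);
  return query.once('value').then((snap) => {
    const keys = [];
    snap.forEach((child) => { keys.push(child.key); });
    // After the first page, the first key is the previous page's last key,
    // so drop it to avoid handling it twice.
    const pageKeys = startKey ? keys.slice(1) : keys;
    pageKeys.forEach((key) => console.log(key));
    if (keys.length === PAGE_SIZE + 1) {
      return fetchPage(keys[keys.length - 1]); // more pages may remain
    }
  });
}

fetchPage();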
I am in the same situation, waiting nearly 40 seconds for the first child to fire. The only solution I could come up with was to get the keys using Firebase rest API and shallow query parameter, then loop over each key and call Firebase. Here is basically what I did.
console.log('start', Date.now());
fetch('https://[firebase_app].firebaseio.com/[your_path].json?shallow=true')
  .then((response) => {
    return response.json();
  }).then(function (j) {
    Object.keys(j).forEach(function (key) {
      console.log(key, 'start', Date.now());
      firebase_reference.child(key).on("child_added", function (snapshot) {
        console.log(key, Date.now());
        // now you have the first response without waiting for everything
      });
    });
  });
I know this doesn't answer your question about child_added functionality, but it does what you would expect to happen with child_added. I am going to submit a feature request to Firebase and link this SO question.
According to the Firebase documentation:
Value events are always triggered last and are guaranteed to contain updates from any other events which occurred before that snapshot was taken.
Here is a simple example (jsbin) where value fires before child_added. This behavior was confirmed using the currently latest Firebase version (2.3.1):
var ref = new Firebase("https://reform.firebaseio.com");
ref.child('pets').once('value', function (snapshot) {
  snapshot.forEach(function (pet) {
    console.log("Pet: " + pet.key());
    pet.ref().child('food').once('value', function (foods) {
      console.log('value event called for ' + pet.key());
    });
    pet.ref().child('food').once('child_added', function (foods) {
      console.log('child_added event called for ' + pet.key());
    });
  });
});
In this example, the console log will be:
Pet: cat
Pet: dog
value event called for cat
child_added event called for cat
value event called for dog
child_added event called for dog
Why does the child_added event fire last in this case? Does this not violate the guarantee per the documentation?
To summarize the excellent feedback in the comments and by Firebase Support:
It makes most sense here to use on() instead of once() to register the event listener. In the example of the original post, and to quote Firebase Support:
The on() callback is registered, and when the once() callback is registered it is ordered in reference to the on() callback. Once the once() callback is fired, it's automatically deregistered. Even though the execution of the events are done in a certain order (due to javascript being single threaded), they are being calculated separately from each other.
Frank's correction to that example shows this in action.
The modified example again breaks the "guarantee" because (Firebase Support):
The data is already locally on the client. So once you have run ref.child('pets').on() and the callback happens, all the data under /pets has been retrieved to the client. Now in the callback processing, you are adding additional callbacks to the existing data. When the callback is being added, the client library is immediately firing the callback without waiting for the second one to be registered since all the data is available.
Since I would like to enforce the guarantee in this case where the data is local, I simply register the child_added listener before the value listener, as demonstrated in the correction to the modified example.
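In code, that registration order would look something like this (a sketch using the same 2.x-era API and names as the example above):

// Register child_added before value, so that even when the data is
// already cached locally the callbacks fire in the documented order
// (child_added first, value last).
var foodRef = pet.ref().child('food'); // 'pet' as in the example above

foodRef.once('child_added', function (foods) {
  console.log('child_added event called for ' + pet.key());
});
foodRef.once('value', function (foods) {
  console.log('value event called for ' + pet.key());
});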
I'm trying to get data with server-sent events. What is the difference between using
source.onmessage vs source.addEventListener?
source.onmessage is the built-in handler property on EventSource that is triggered when new data is sent to the client. It fires for messages that have no event field (the default) and doesn't fire when one is set.
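For example, a default message handler might look like this (a sketch; the endpoint URL is a placeholder):

var source = new EventSource('/events'); // placeholder endpoint

// Fires only for messages sent without an "event:" field.
source.onmessage = function (e) {
  var returnedData = JSON.parse(e.data);
  console.log(returnedData);
};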
addEventListener is similar, but differs in that it listens for a specific event name and fires when a message with that name arrives, allowing you to separate your functionality across multiple events. You can then parse the JSON data returned. It can be used with any event type. Have a look at this example:
source.addEventListener("login", function(e) {
// do your login specific logic
var returnedData = JSON.parse(e);
console.log(returnedData);
}, false);
This snippet listens for server messages whose event field is set to login, and then triggers the callback function.
More info:
https://developer.mozilla.org/en-US/docs/Server-sent_events/Using_server-sent_events
http://html5doctor.com/server-sent-events/
I assume you're talking about addEventListener('message') vs onmessage. They do the same thing, but I'd recommend using onmessage because with addEventListener, there's always a possibility of unexpectedly adding the same listener twice, e.g. due to a laggy page reload, or some hot-reload during development. In those cases the handler function could fire twice on every event, which leads to weird behaviors.
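To illustrate the difference (a sketch; handleMessage is a hypothetical handler function):

// Reassigning onmessage keeps only the last handler, so accidental
// re-registration is harmless:
source.onmessage = handleMessage;
source.onmessage = handleMessage; // still just one handler

// addEventListener registers a new listener each time it is called with a
// distinct function object (e.g. an inline anonymous function), so the
// logic below would run twice for every message:
source.addEventListener('message', function (e) { handleMessage(e); });
source.addEventListener('message', function (e) { handleMessage(e); });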