I have a list of items on a page, and I want to add a listener to detect inserts into the collection.
The code below runs on the client side.
Messages.find().observeChanges({
  added: function() {
    console.log('Message added');
  }
});
But if the collection already contains items, the callback fires for every one of them when the page loads. Why is the 'added' callback fired for items that are already in the collection, and how can I detect only genuinely new inserts?
There are two solutions. For the first one, look here: https://github.com/oortcloud/unofficial-meteor-faq#why-does-observe-fire-a-bunch-of-added-events-for-existing-documents
The second one is this:
var isInitial = true;
Messages.find().observeChanges({
  added: function() {
    if (!isInitial) {
      console.log('Message added');
    }
  }
});
isInitial = false;
This works because the call to observeChanges fires the added callback for the documents already in Minimongo before it returns; only after that does isInitial get set to false.
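If you need this pattern in more than one place, it can be wrapped in a small helper (a sketch, not part of the original answer) so the flag is scoped to each call:
function observeNewDocuments(cursor, onAdded) {
  var initializing = true;
  var handle = cursor.observeChanges({
    added: function (id, fields) {
      // skip the synchronous burst of 'added' calls for pre-existing documents
      if (!initializing) {
        onAdded(id, fields);
      }
    }
  });
  initializing = false; // observeChanges has returned; anything from now on is new
  return handle;
}

observeNewDocuments(Messages.find(), function () {
  console.log('Message added');
});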
You can try using the collection-hooks package, which will do exactly what you want:
https://atmospherejs.com/mrt/collection-hooks
Unfortunately this package hasn't yet been updated to work with Meteor 0.9 and later.
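For reference, usage looks roughly like this (a sketch assuming the package's after.insert API; names other than Messages come from the question):
Messages.after.insert(function (userId, doc) {
  // runs after a successful insert on this collection
  console.log('Message added', doc._id);
});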
Related
I am trying to remove an item from a $firebaseArray (boxes).
The remove function:
function remove(boxJson) {
  return boxes.$remove(boxJson);
}
It works; however, the item is immediately added back.
This is the method that returns the array:
function getBoxes(screenIndex) {
  var boxesRef = screens
    .child("s-" + screenIndex)
    .child("boxes");
  return $firebaseArray(boxesRef);
}
I thought perhaps I'm holding multiple references to the $firebaseArray, and when one deletes the item the other adds it back, but then I thought Firebase should handle that, no?
Anyway, I'm lost on this. Any ideas?
UPDATE
When I hack it and delete twice (with a timeout) it seems to work:
function removeForce(screenIndex, boxId) {
  setTimeout(function () {
    API.removeBox(screenIndex, boxId);
  }, 1000);
  return API.removeBox(screenIndex, boxId);
}
and the API.removeBox:
function removeBox(screenIndex, boxId) {
  var boxRef = screens
    .child("s-" + screenIndex)
    .child("boxes")
    .child(boxId);
  return boxRef.remove();
}
Removing something from Firebase is asynchronous. Per the docs, the proper way to remove an item from Firebase using AngularFire is:
var obj = $firebaseObject(ref);
obj.$remove().then(function(ref) {
  // data has been deleted locally and in the database
}, function(error) {
  console.log("Error:", error);
});
$remove() ... Removes the entire object locally and from the database. This method returns a promise that will be fulfilled when the data has been removed from the server. The promise will be resolved with a Firebase reference for the exterminated record.
Link to docs: https://www.firebase.com/docs/web/libraries/angular/api.html#angularfire-firebaseobject-remove
The most likely cause is that you have a security rule that disallows the deletion.
When you call boxes.$remove, Firebase immediately fires the child_removed event locally, to ensure the UI is updated quickly. It then sends the command to the Firebase servers to validate it and update the database.
On the server, a security rule disallows this deletion. The server sends an "it failed" response back to the client, which then raises a child_added event to restore the UI.
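For illustration only (the paths here are assumptions based on the question's structure), a rule shaped like this would accept updates but reject deletions, because newData is empty when a node is removed:
{
  "rules": {
    "screens": {
      "$screenId": {
        "boxes": {
          "$boxId": {
            ".write": "newData.exists()"
          }
        }
      }
    }
  }
}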
Apparently I was saving the items again after deleting them. Clearly my mistake:
function removeSelected(boxes) {
  var selectedBoxes = Selector.getSelectedBoxes(boxes);
  angular.forEach(selectedBoxes, function (box) {
    BoxManager.remove(box);
  });
  Selector.clearSelection(boxes, true);
}
In the clearSelection method I was updating a field on the boxes and saving them again.
Besides the obvious mistake, this was a lesson for me on how to work with Firebase: if some part of the system keeps a copy of your deleted item and saves it, that won't produce an error; it will simply revive the deleted item.
For those who have a similar issue but haven't solved it yet:
There are two methods for listening to events: .on() and .once(). In my case, that was the cause of the problem.
I was working on a migration procedure that should run once:
writeRef
  .orderByChild('text_hash')
  .equalTo(addItem.text_hash)
  .on('value', val => { // <--
    if (!val.exists()) {
      writeRef.push(addItem);
    }
  });
So the problem was precisely the .on() method: its callback fires again after every data manipulation, including ones made from the Firebase console.
Changing it to .once() solved that.
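For reference, the corrected query looks roughly like this:
writeRef
  .orderByChild('text_hash')
  .equalTo(addItem.text_hash)
  .once('value', val => { // fires a single time, then detaches
    if (!val.exists()) {
      writeRef.push(addItem);
    }
  });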
I've made a working chat with Meteor and MongoDB, but I want to play a sound or something when there is a new message. However, I don't know how to check whether the data has been updated. I could check for a new message by counting the messages before and after an update, but I just don't know how to detect the update itself.
So my question here is: How do I check for an update in the data?
I have a website that needs to pop up a toastr alert whenever a new message arrives. My collection is called "Alerts". This is what I do:
Alerts.find({notified: false}).observeChanges({
  added: function(id, doc) {
    Alerts.update(id, {
      $set: {
        notified: true
      }
    });
    toastr.info(foo, bar);
  }
});
Whenever a new alert is created whose field "notified" is false, a toastr alert will be created and that alert will be marked as "notified: true".
Alternatively, you could do the same thing with a separate "notifications" collection, distinct from your chat messages collection, whose documents are removed once they have been observed.
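A sketch of that alternative (the Notifications collection name is an assumption, not from the original answer):
Notifications.find().observeChanges({
  added: function (id, fields) {
    // play a sound / show a toast, then consume the notification
    // so it never fires again, even across reloads
    toastr.info(fields.message);
    Notifications.remove(id);
  }
});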
You could create a tailing cursor on the oplog collection, so you get a new document whenever something (anything!) in the database changes. But that's not really an elegant solution, because that handler would need to process a lot of junk.
It might be better to have the routine which writes the message to the database also inform any currently online users. There is really no good reason to take a detour through the database.
I have two jQuery Mobile pages (#list and #show). There are several items on the #list page with different IDs. If I click on item no. 5, the ID no5 is stored in localStorage and I am redirected to the page #show.
Now the problem:
Storing the ID in localStorage works, but the next page does not show me item no. 5; it shows me an old item that was in localStorage before.
Script from page #list:
localStorage.setItem("garageID", $(this).attr('id'));
window.location.replace("#show");
I encountered this problem too (and not on mobile: on Chromium/Linux).
As there doesn't seem to be a callback-based API, I "fixed" it with a timeout, which "prevents" the page from being closed before the setItem action is done:
localStorage.setItem(name, value);
setTimeout(function(){
  // change location
}, 50);
A timeout of 0 might be enough, but as I didn't find any specification (it's probably in the realm of bugs) and the problem isn't consistently reproducible, I didn't take any chances. If you want, you might test in a loop:
function setLocalStorageAndLeave(name, value, newLocation){
  value = value.toString(); // to prevent infinite loops
  localStorage.setItem(name, value);
  (function one(){
    if (localStorage.getItem(name) === value) {
      window.location = newLocation;
    } else {
      setTimeout(one, 30);
    }
  })();
}
But I don't see how the fact that localStorage.getItem returns the right value would guarantee the data has really been written permanently, as there is no specification of the interruptible behavior. I don't know whether the following part of the spec can legitimately be interpreted as meaning the browser is allowed to skip flushing to disk when it leaves the page:
This specification does not require that the above methods wait until
the data has been physically written to disk. Only consistency in what
different scripts accessing the same underlying list of key/value
pairs see is required.
In your precise case, a solution might be to simply scroll to the element with that given ID, to avoid changing the page at all.
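A minimal sketch of that idea (the selector is an assumption), staying on #list and scrolling instead of navigating:
var id = $(this).attr('id'); // the clicked item's ID, as in the question
$('html, body').animate({ scrollTop: $('#' + id).offset().top }, 250);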
Note on the presumed bug:
I didn't find or file any bug report, as I find the problem hard to reproduce. In the cases I observed on Chromium/Linux, it happened with the delete operation.
Disclaimer: This solution isn't official and has only been tested for demo purposes, not for production.
You can pass data between pages using $.mobile.changePage("target", { data: "anything" });. However, it only works when target is a URL (i.e. the single-page model).
Nevertheless, you can still pass data between pages - even if you're using the multi-page model - but you need to retrieve it manually.
When a page is changed, it goes through several stages; one of them is pagebeforechange. That event carries two objects, event and data. The latter holds all details related to the page you're moving from and the page you're moving to.
Since $.mobile.changePage() ignores passed parameters in the multi-page model, you need to push your own property into the data.options object through $.mobile.changePage("#", { options }) and then retrieve it when pagebeforechange is triggered. This way you won't need localStorage, callbacks, or setTimeout.
Step one:
Pass data upon changing page. Use a unique property in order not to conflict with jQM ones. I have used stuff.
/* jQM <= v1.3.2 */
$.mobile.changePage("#page", { stuff: "id-123" });
/* jQM >= v1.4.0 */
$.mobile.pageContainer.pagecontainer("change", "#page", { stuff: "id-123" });
Step two:
Retrieve data when pagebeforechange is triggered on the page you're moving to, in your case #show.
$(document).on("pagebeforechange", function (event, data) {
/* check if page to be shown is #show */
if (data.toPage[0].id == "show") {
/* retrieve .stuff from data.options object */
var stuff = data.options.stuff;
/* returns id-123 */
console.log(stuff);
}
});
I may be completely missing something here, but I have the following:
a Model which encapsulates 'all' the data (all JSON loaded from one URL)
the model has one (or more) Collections which it is instantiating with the data it got on construction
some code which I want to run on the Collection when the data is initialized and loaded
My question is about the composed Collection. I could do this outside the scope of the Collection, but I'd rather encapsulate it (otherwise what's the point of making it a 'class' with an initializer etc).
I thought I could put that code in the initialize() function, but that runs before the model has been populated, so I don't have access to the models that comprise the collection (this.models is empty).
Then I thought I could bind to an event, but no events are triggered after initialization. They would be if I loaded the Collection with a fetch from its own endpoint, but I'm not doing that, I'm initializing the collection from pre-existing data.
My question: how do I get initialization code to run on the Collection immediately after it has been populated with data (i.e. when this.models isn't empty)?
Is it possible to do this without having to get 'external' code involved?
Okay here is the demo code, perhaps this will explain things better.
var Everything = Backbone.Model.extend({
  url: "/static/data/mydata.json",
  parse: function(data)
  {
    this.set("things", new Things(data.things, {controller: this}));
  }
});

var Thing = Backbone.Model.extend({
});

var Things = Backbone.Collection.extend({
  model: Thing,
  initialize: function(data, options)
  {
    // HERE I want access to this.models.
    // Unfortunately it has not yet been populated.
    console.log("initialize");
    console.log(this.models);
    // result: []

    // And this event never gets triggered either!
    this.on("all", function(eventType)
    {
      console.log("Some kind of event happened!", eventType);
    });
  }
});
var everything = new Everything();
everything.fetch();
// Some manual poking to prove that the demo code above works:
// Run after everything has happened, to prove collection does get created with data
setTimeout(function(){console.log("outside data", everything.get("things").models);}, 1000);
// This has the expected result, prints a load of models.
// Prove that the event hander works.
setTimeout(function(){console.log("outside trigger", everything.get("things").trigger("change"));}, 1000);
// This triggers the event callback.
Unfortunately for you, the collection is populated with data only after it has been properly initialized, and the models are reset with the silent: true flag, which means the event won't fire.
If you really want to use this approach, you can cheat a bit by deferring execution of whatever you want to do to the next tick of the browser's event loop, using setTimeout(..., 0) or Underscore's _.defer method.
initialize: function(data, options) {
  _.defer(_.bind(this.doSomething, this));
},
doSomething: function() {
  // now the models are going to be available
}
Digging up an old question: I had a similar problem and got some help to create this solution.
By extending the set function we can know when the collection's data has been converted to real models. (set gets called from .add and .reset, which means it is called both by the core code that instantiates the Collection class AND by fetch, regardless of whether the fetch options use reset or set. A dive into the annotated Backbone source, following the function flow, helped here.)
This way we can have control over when / how we get notified without hacking the execution flow.
var MyCollection = Backbone.Collection.extend({
  url: "http://private-a2993-test958.apiary-mock.com/notes",
  initialize: function () {
    this.listenToOnce(this, 'set', this.onInitialized);
  },
  onInitialized: function () {
    console.log("collection models have been initialized:", this.models);
  },
  set: function (models, options) {
    Backbone.Collection.prototype.set.call(this, models, options);
    this.trigger("set");
  }
});
// Works with fetch
var fetchCollection = new MyCollection();
fetchCollection.fetch();

// Works with initializing data
var colData = new MyCollection([
  {id: 5, name: 'five'},
  {id: 6, name: 'six'},
  {id: 7, name: 'seven'},
  {id: 8, name: 'eight'}
]);

// Doesn't trigger the initialized function
colData.add(new Backbone.Model({id: 9, name: 'nine'}));
Note: if we don't use .listenToOnce, onInitialized will also be called every time a model is added to or changed in the collection.
I have a fetch in my Backbone collection as follows.
var Items = Backbone.Collection.extend({
  get_items: function(data) {
    this.fetch({
      data: data,
      success: function() {
        console.log(items);
      }
    });
  }
});
var items = new Items();
items.get_items({id:1});
items.get_items({id:2});
In each of the console.log statements, I expect the contents of the collection to be different because I passed different parameters. But in Chrome, no matter what "id" value I give, the logged contents of the collection don't change.
However, if I do
var Items = Backbone.Collection.extend({
  get_items: function(data) {
    this.fetch({
      data: data,
      success: function() {
        console.log(items.models);
      }
    });
  }
});
var items = new Items();
items.get_items({id:1});
items.get_items({id:2});
where I specifically print out "items.models", I can see that the list of models in the collection has indeed changed.
What's going on here?
You're describing two different scenarios here: the collection is not just an array of models; it has a lot of stuff attached to it. The collection's 'models' property is where all the data is. Only that is the absolute source of truth - anything else is either lying or doing something it's not supposed to.
Edit:
To add to this, why not reference 'this' instead of 'items', i.e. this.models? After all, if you're wondering what is inside the collection you are working with, having to reference a global variable is a bit silly :) (and rather bad practice).
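For example, a sketch of the same method without the global (assuming Backbone 1.0+, where the success callback receives the collection as its first argument):
var Items = Backbone.Collection.extend({
  get_items: function (data) {
    var self = this; // keep a handle on the collection
    this.fetch({
      data: data,
      success: function (collection, response) {
        // `collection` and `self` both point at the collection being fetched,
        // so there is no need to reach for the global `items` variable
        console.log(self.models);
      }
    });
  }
});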
Edit #2:
Also, 'fetch' is asynchronous unless specified otherwise, so if you're trying to get a consistent, reproducible result, firing two fetches in a row isn't going to give it to you. If one of the requests is even a millisecond slower than the other, for whatever reason, your results will arrive out of order anyway.
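If you need a deterministic order, one option (a sketch, assuming jQuery is the transport, so fetch returns a jqXHR promise) is to chain the requests:
items.fetch({ data: { id: 1 } }).then(function () {
  console.log('after first fetch:', items.models);
  return items.fetch({ data: { id: 2 } }); // second request starts only after the first finishes
}).then(function () {
  console.log('after second fetch:', items.models);
});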