Disclaimer: I tried to make a jsfiddle of this, but without a public source for the RESTAdapter, I couldn't really make it work.
I have a model with a hasMany array of child models. I need to add a new model to this child array and save to the server:
App.FooModel = DS.Model.extend({
'name': DS.attr('string'),
'bars': DS.hasMany('App.BarModel')
});
App.BarModel = DS.Model.extend({
'name': DS.attr('string'),
});
App.ApplicationController = Ember.Controller.extend({
init: function() {
var foo = App.FooModel.find(101); // -- currently has bars[201, 202, 203]
var newBar = loadFixture( App.BarModel, 204 );
var self = this;
setTimeout( function() { // -- just to be sure our models are loaded before we try this
// foo.currentState: saved
foo.get('bars').addObject(newBar);
// foo.currentState: saved
foo.store.commit(); // -- nothing happens
}, 1000);
}
});
App = Ember.Application.create({
store: DS.Store.create({
revision: 11
})
});
But nothing happens. My parent model doesn't get marked as dirty, so the store never attempts a commit. Is there a different way I should be adding this relationship to the parent? Is this a bug?
Current Workaround:
foo.get('bars').addObject(newBar);
var save = foo.get('name');
foo.set('name', (save + '!'));
foo.set('name', save); // -- this marks our record as dirty, so a save will actually happen
foo.store.commit();
Edit 1: I'm aware that ember-data will only serialize this data if it was embedded to begin with (https://stackoverflow.com/a/15145803/84762), but I have overridden my serializer to handle this. The issue I'm having is that the store never even attempts to save this change so we never even get to the serializer.
Edit 2: I suspect this might have something to do with this bug, but at the same time that would mean this wouldn't work for anyone, and I have a hard time believing no one else has run into it yet.
It looks like you're modeling a one-to-many relationship, yet you didn't include the belongsTo side on App.BarModel. Check this link out:
http://emberjs.com/guides/models/defining-models/#toc_one-to-many
App.Post = DS.Model.extend({
comments: DS.hasMany('App.Comment')
});
App.Comment = DS.Model.extend({
post: DS.belongsTo('App.Post')
});
From what I understand, you did not use the embedded feature of the relationship but overrode your serializer to handle the serialization of the bars objects into the foo object.
I think your bug probably comes from here: if your relation is not embedded, there is no reason to mark the foo object dirty when you add an object to its bars association. What should change is usually a foo_id key on the bar object you added, so there are no changes on the foo object to send to the API.
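In other words, once the inverse belongsTo from the first answer is in place, the record that becomes dirty is the bar, not the foo. A minimal sketch of what that might look like with the question's names (whether addObject sets the inverse for you depends on the ember-data revision, so it is also set explicitly here):
App.BarModel = DS.Model.extend({
  'name': DS.attr('string'),
  'foo': DS.belongsTo('App.FooModel')
});

// later, when adding the new child:
foo.get('bars').addObject(newBar);
newBar.set('foo', foo);   // the change lives on the bar (its foo_id), so the bar is the dirty record
foo.store.commit();       // the store saves the dirty bar; foo itself has nothing to send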
Related
Newish to Backbone, and having some trouble. Going to try to ask this in a generic, no-code way since the application I'm tasked with maintaining is several thousand lines long... hope I can be clear.
I have a method myMethod(), that belongs to a model App.Person.
I have a collection App.PersonList that holds several instances of App.Person.
I have an instance (myPersonList) App.PersonList that I'm creating within an object (myDonationForm) that is an instance of an object App.DonationForm (and here we roam even further outside my comfort zone: App.DonationForm extends an object named Controller which extends an object called Base which seems to be a base.js thing and I have very little idea what's happening here but I hope it doesn't matter for my immediate need).
Also in App.DonationForm, I have an instance (myErrorMsg) of a model App.Errors. I would like to be able to set an attribute of myErrorMsg from myMethod(), but I can't work out the syntax to refer to it, traversing up the tree of nested objects and then back down a parallel branch.
I hope that made sense. To visualize it:
myDonationForm, inst of App.DonationForm, ext Controller
|--myPersonList, inst of App.PersonList, ext Collection
| |--myPerson[1], inst of App.Person, ext Model // I want to change from here
| | +---myMethod()
| |--myPerson[2], inst of App.Person, ext Model // or from here
| +---myMethod()
+myErrorMsg, inst of App.Errors, ext Model // an attribute of this.
Thank you in advance for any pointers you can offer.
Edited to add a code snippet (and I accidentally tried to edit Hoyen's answer, not my own question! didn't realize it til I got the peer review screen, ugh)
App.SpecialDonationForm = App.DonationForm.extend({
[...]
initialize: function(options){
App.DonationForm.prototype.initialize.call(this, options);
[...]
},
start: function(){
App.DonationForm.prototype.start.call(this);
[...]
this.myPersonList = new App.PersonList(this.initialData);
this.myErrorMsg = new App.Errors();
[...]
Basically since myErrorMsg is an instance of a Backbone model you need to use Backbone's model methods to set the attributes. So, it should look something like this:
App.Person = Backbone.Model.extend(
{
[...]
defaults: {
[...],
errorMsg: new App.Errors()
},
[...]
myMethod: function(){
var value = ""; // set it to what ever you like
this.get("errorMsg").set("message",value); // "message", is the attribute you want to update with the value
this.trigger('change:errorMsg'); // not sure if this is needed, but this will ensure that any event listeners on errorMsg are triggered
}
});
App.SpecialDonationForm = App.DonationForm.extend({
[...]
initialize: function(options){
App.DonationForm.prototype.initialize.call(this, options);
[...]
},
start: function(){
App.DonationForm.prototype.start.call(this);
[...]
this.myPersonList = new App.PersonList(this.initialData);
this.myErrorMsg = new App.Errors();
this.myPersonList.on('change:errorMsg',showError.bind(this));
function showError(model,val,options){
this.myErrorMsg.set(_.extend({}, this.myErrorMsg.defaults, model.toJSON())); // extend into a fresh object so defaults aren't mutated
}
[...]
An event aggregator may work in this scenario.
Set up the aggregator by extending Backbone.Events:
App.eventAgg = _.extend({}, Backbone.Events);
During App.Errors's initialize, listenTo the event:
this.listenTo(App.eventAgg, 'someEvent', this.doSomeUpdate);
In App.Person's myMethod, trigger the event
App.eventAgg.trigger('someEvent');
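Putting the three steps together, a minimal sketch might look like this (doSomeUpdate, 'someEvent', and the message attribute are placeholders, not names from the original app):
App.eventAgg = _.extend({}, Backbone.Events);

App.Errors = Backbone.Model.extend({
  initialize: function () {
    // the Errors model subscribes itself; nobody needs a reference to it
    this.listenTo(App.eventAgg, 'someEvent', this.doSomeUpdate);
  },
  doSomeUpdate: function (message) {
    this.set('message', message);
  }
});

App.Person = Backbone.Model.extend({
  myMethod: function () {
    // the Person only announces that something happened
    App.eventAgg.trigger('someEvent', 'something went wrong');
  }
});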
To answer your generic question even more generically,
Based on your hierarchy, a Person object doesn't know anything at all about the Errors object. Child objects don't (and shouldn't) know anything about their parents.
A Person object does (presumably) know, however, that an "error" has occurred and someone might want to know about it.
Since the Person object doesn't know any more than that, all it can do is trigger a custom event.
Some other object needs to listen for this custom event. Based on your hierarchy, the only other object that has access to the Person object is the PersonList collection.
The PersonList collection is now in the same situation. It has learned (from the custom event) that an error has occurred, but it can't do anything about it and it doesn't know about its parents. So all it can do is trigger another custom event.
The DonationForm object has to listen for the custom events from the PersonList collection. When it receives the event, it can then pass it to the Errors object.
Voila.
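A sketch of that bubbling chain, using the names from the question (the event names person:error and personlist:error are made up for illustration):
App.Person = Backbone.Model.extend({
  myMethod: function () {
    // all the Person knows is that an error happened
    this.trigger('person:error', this, 'something went wrong');
  }
});

App.PersonList = Backbone.Collection.extend({
  model: App.Person,
  initialize: function () {
    // a collection hears its models' events, so re-broadcast at the collection level
    this.on('person:error', function (person, msg) {
      this.trigger('personlist:error', person, msg);
    }, this);
  }
});

// inside App.DonationForm's start(), once myPersonList and myErrorMsg exist:
// this.myPersonList.on('personlist:error', function (person, msg) {
//   this.myErrorMsg.set('message', msg);
// }, this);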
Given the following models:
(note: these are simplified for illustration purposes)
App.CustomerOrder = DS.Model.extend({
deliveries: DS.hasMany('delivery'),
total: DS.attr('number')
});
App.Delivery = DS.Model.extend({
orderlines: DS.hasMany('orderline')
});
App.OrderLine = DS.Model.extend({
productid: DS.attr('string'),
qtyordered: DS.attr('number')
});
When the app first loads I'm querying an API that sends me information about which dependencies should trigger an update. So for example it'll send me something like:
CustomerOrder: ["deliveries", "deliveries.orderlines", "deliveries.orderlines.qtyordered"...]
This means: if deliveries are added to or removed from a customerorder, if lines are added to or removed from a delivery attached to a customer order, or if the qtyordered on an orderline on a delivery attached to a customer order changes, then the API expects me to serialize the CustomerOrder (along with the entire chain of relationships) and send it to an 'update' service (i.e. a server/customerorder/updates type thing) that will run various routines, fill in pieces of data, and send the entire chain of objects back.
For illustration purposes I've put a simple example here of an order total (I realize this is easily calculated client-side, but there's a bunch of other stuff that would duplicate code from the server). So, if the qtyordered on an orderline changes, I need to send the customerorder instance to the server, where it will update my total field.
One of the challenges is that I can't hard-code that dependency list by setting up observer functions with .observes(); it has to be done dynamically after that dependency data is loaded (presumably using addObserver). The other is that observers won't dig multiple layers deep like that.
I've tried using a mix-in to the models that overrides the init function and does exactly that.
clientchangeset: DS.attr('raw'),
init: function() {
this._super.apply(this, arguments);
var className = this.auth.camelizedModelString(this.constructor.toString());
var watchlist = this.auth.dependencies[className] || []; // default to an empty list so the forEach below is safe
var self = this;
watchlist.forEach(function(watch) {
if(watch.hasOwnProperty('attributeName') && watch.hasOwnProperty('collectionFlag')) {
// {attributeName: attributeName, collectionFlag: collectionFlag}
if(watch['collectionFlag']) {
console.log(className+'.addObserver('+watch['attributeName']+'.@each.clientchangeset)');
self.addObserver(watch['attributeName']+'.@each.clientchangeset', null, 'updateChangelist');
} else {
console.log(className+'.addObserver('+watch['attributeName']+')');
self.addObserver(watch['attributeName'], null, 'updateChangelist');
}
}
});
},
This appears to work, but only one layer deep. For completeness, here's the updateChangelist function:
updateChangelist: function(src, field, value) { //jshint ignore:line
if(this.get('pauseUpdates')) {
return;
}
var className = this.auth.camelizedModelString(this.constructor.toString());
var oldclientchangeset = this.get('clientchangeset') || [];
console.log('Before: '+className+'.[clientchangeset]= '+oldclientchangeset);
oldclientchangeset.pushObject(field);
this.set('clientchangeset', oldclientchangeset);
console.log('After: '+className+'.[clientchangeset]= '+oldclientchangeset);
}
So in general the way I got this to work was to create the observers as indicated, but the handlers simply update a property called '_needsUpdate' on each level of the relationships whenever they are triggered. '_needsUpdate' is just a date so when triggered I do:
this.set('_needsUpdate', +new Date());
Then, when setting up observers at each level for that level's children, I just set up a single observer on child.@each._needsUpdate.
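To make that concrete, here is a rough sketch of the cascade using the simplified models from the question (the observer wiring is written out by hand here rather than built from the API-supplied dependency list, so treat it as an illustration of the idea, not the production code):
App.Delivery = DS.Model.extend({
  orderlines: DS.hasMany('orderline'),

  init: function() {
    this._super.apply(this, arguments);
    // only watch direct children; each level bubbles by bumping its own timestamp
    this.addObserver('orderlines.@each._needsUpdate', null, 'touch');
  },

  touch: function() {
    this.set('_needsUpdate', +new Date());
  }
});

App.CustomerOrder = DS.Model.extend({
  deliveries: DS.hasMany('delivery'),
  total: DS.attr('number'),

  init: function() {
    this._super.apply(this, arguments);
    this.addObserver('deliveries.@each._needsUpdate', null, 'markForSave');
  },

  markForSave: function() {
    // at the root, this is where the serialize-and-send described above would be queued
    this.set('_needsUpdate', +new Date());
  }
});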
When saving a model, Backbone determines whether to send an HTTP POST or PUT request by whether or not the model's ID attribute is set. If there is an ID, the model is considered to already exist.
For my application, this logic is incorrect because I must allow the user to specify an ID (as I interact with a poorly designed legacy system).
How should I handle this problem? I still would like to use PUT if the model is changed.
I am considering the following options:
Override isNew, which is the Backbone method that simply checks if an ID is present.
Override sync.
Determine if the concept of cid would somehow solve the problem.
One solution is to address the symptoms rather than the cause. Consider adding to your model a new create method:
var FooModel = Backbone.Model.extend({
urlRoot: '/api/foo',
create: function () {
return this.save(null, {
type: 'post', // make it a POST rather than PUT
url: this.urlRoot // send the request to /api/foo rather than /api/foo/:id
});
}
});
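Usage would then look something like this (a sketch; the id and name values are placeholders):
var foo = new FooModel({ id: 'user-chosen-id', name: 'whatever' });

foo.create().done(function () {
  // the model now "exists", so later edits go through the normal save(), which issues a PUT
  foo.save({ name: 'renamed' });
});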
This is the solution I use, but I don't consider it ideal because the view logic/caller now needs to call create rather than save when creating (and mixing the two up is rather easy to do). This extended API bothers me for my use case (despite working and being rather small), but perhaps it'll work for yours.
I'd love to see some additional answers to this question.
So I went down the path of trying to change up isNew.
I came up with new criteria that would answer whether a model is new:
Was the model created via a fetch from a collection? Then it's definitely not new.
Was the model created with an ID attribute? This is a choice I made for my case (see disadvantages below for the effect of doing this), but I wanted new Model({ id: 1, name: 'bob' }) to not be considered new, while setting the ID later on (new Model({ name: 'bob' }).set('id', 1)) would be.
Was the model ever synced? If the model was successfully synced at any point, it's definitely not new because the server knows about it.
Here's what this looks like:
var UserDefinedIDModel = Backbone.Model.extend({
// Properties
_wasCreatedWithID: false,
_wasConstructedByFetch: false,
_wasSynced: false,
// Backbone Overrides
idAttribute: 'some_id',
urlRoot: '/api/foo',
constructor: function (obj, options) {
this._wasCreatedWithID = !!(obj && obj[this.idAttribute]);
this._wasConstructedByFetch = !!(options && options.xhr && options.xhr.status === 200);
// Preserve default constructor
return Backbone.Model.prototype.constructor.apply(this, arguments);
},
initialize: function () {
this.on('sync', this.onSync.bind(this));
},
isNew: function () {
// We definitely know it's not new
if (this._wasSynced || this._wasConstructedByFetch) return false;
// It might be new based on this. Take your pick as to whether its new or not.
return !this._wasCreatedWithID;
},
// Backbone Events
onSync: function () {
this._wasSynced = true;
}
});
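As a quick sanity check, here is roughly how the three criteria above play out (a sketch; 'abc' and 'bob' are placeholder values):
var fresh = new UserDefinedIDModel({ name: 'bob' });
fresh.isNew();   // true  -> save() issues a POST

var withId = new UserDefinedIDModel({ some_id: 'abc', name: 'bob' });
withId.isNew();  // false -> save() issues a PUT

var later = new UserDefinedIDModel({ name: 'bob' });
later.set('some_id', 'abc');
later.isNew();   // still true until a successful sync, after which it flips to false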
Advantages over the other answers
No logic outside of the Backbone model for handling this odd use case.
No server-side changes to support this
No new pseudo properties
Disadvantages
This is a lot of code when you could just create a new create method as per my other answer.
Currently myCollection.create({ some_id: 'something' }); issues a PUT. If you need support for this, I think you'll have to do myCollection.create({ some_id: 'something' }, { url: '/api/foo', type: 'post' }); instead. You can remove the _wasCreatedWithID check to fix this, but then any construction of a new model that derives its data from an existing one will be treated as new (in my case, this is undesirable).
Here's another solution:
In your model, define an idAttribute that doesn't exist in your server-side model/table/... and that wouldn't be displayed in the DOM.
So let's suppose the JSON you send to the server is as follows:
{
'id': 1,
'name': 'My name',
'description': 'a description'
}
Your model should look like:
var MyModel = Backbone.Model.extend({
idAttribute: 'fakeId'
});
Now, when you create a new model and try to save it to the server, nothing will have initialized fakeId, so it will be considered a new object (POST).
When you fetch your model from the server, you have to set the fakeId in your model; your server must duplicate the id into fakeId so that the model will be considered as existing (PUT).
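If changing the server is not an option, a variation on the same idea is to copy the id into fakeId client-side in parse, so only fetched (or previously saved) models get it (a sketch, assuming the payload shown above):
var MyModel = Backbone.Model.extend({
  idAttribute: 'fakeId',
  parse: function (response) {
    // anything coming back from the server already exists, so mark it as such
    response.fakeId = response.id;
    return response;
  }
});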
I got nested JSON data from the server like this:
{
name: "Alice",
profile: {
something: "abc"
}
}
and I have the following model:
App.User = Ember.Object.extend({
name: null,
profile: Ember.Object.extend({
something: null
})
})
If I simply do App.User.create(attrs) or user.setProperties(attrs), the profile object gets overwritten by a plain JS object, so currently I'm doing this:
var profileAttrs = attrs.profile;
delete attrs.profile
user.setProperties(attrs); // or user = App.User.create(attrs);
user.get('profile').setProperties(profileAttrs);
It works, but I've got it in a few places and in the real code I've got more than one nested object, so I was wondering if it's ok to override User#create and User#setProperties methods to do it automatically. Maybe there's some better way?
Based on your comment, you want the automatic merging behaviour you get with models (the sort of thing you get with .extend()). In that case, you could try registering a custom transformer, something like:
App.ObjectTransform = DS.Transform.extend({
deserialize: function(json){
return Ember.Object.create(json);
}
});
App.User = DS.Model.extend({
profile: DS.attr('object')
});
See: https://github.com/emberjs/data/blob/master/TRANSITION.md#json-transforms
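With that in place, the nested hash from the payload above should come back wrapped, so something like this would hold (a sketch, assuming the transform is registered under the 'object' name as in the snippet and user is a loaded App.User record):
user.get('profile') instanceof Ember.Object;  // => true
user.get('profile.something');                // => "abc"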
If you are doing your server requests without an adapter you can use the model class method load() with either an array of json objects or a single object. This will refresh any known records already cached and stash away the JSON for future primary key based lookups. You can also call load() on a model instance with a JSON hash as well but it will only update that single model instance.
It's unclear why you are not using an adapter; you can extend one of the Ember Model adapters and override the record loading there, e.g. extend from the RESTAdapter and do any required transform on the JSON by overriding _loadRecordFromData.
You can also override your model's load function to transform received data if required. The Ember Model source is fairly easy to read, so it's not hard to extend to your requirements.
In Backbone we have an app that uses an event aggregator, located on window.App.Events.
Now, in many views, we bind to that aggregator, and I manually wrote a destroy function on a view which handles unbinding from that event aggregator and then removing the view (instead of removing the view directly).
Now there are certain models where we need this functionality as well, but I can't figure out how to tackle it.
Certain models need to bind to certain events, but (maybe I'm mistaken) if we delete a model from a collection it stays in memory due to these bindings to the event aggregator, which are still in place.
There isn't really a remove function on a model like a view has.
So how would I tackle this?
EDIT
On request, a code example:
App = {
Events: _.extend({}, Backbone.Events)
};
var User = Backbone.Model.extend({
initialize: function(){
_.bindAll(this, 'hide');
App.Events.bind('burglar-enters-the-building', this.hide);
},
hide: function(burglarName){
this.set({'isHidden': true});
console.warn("%s is hiding... because %s entered the house", this.get('name'), burglarName);
}
});
var Users = Backbone.Collection.extend({
model: User
});
var House = Backbone.Model.extend({
initialize: function(){
this.set({'inhabitants': new Users()});
},
evacuate: function(){
this.get('inhabitants').reset();
}
});
$(function(){
var myHouse = new House({});
myHouse.get('inhabitants').reset([{id: 1, name: 'John'}, {id: 2, name: 'Jane'}]);
console.log('currently living in the house: ', myHouse.get('inhabitants').toJSON());
App.Events.trigger('burglar-enters-the-building', 'burglar1');
myHouse.evacuate();
console.log('currently living in the house: ', myHouse.get('inhabitants').toJSON());
App.Events.trigger('burglar-enters-the-building', 'burglar2');
});
View this code in action on jsFiddle (output in the console): http://jsfiddle.net/saelfaer/szvFY/1/
As you can see, I don't bind to events on the model, but to an event aggregator.
Unbinding events from the model itself is not necessary, because once it's removed nobody will ever trigger an event on it again. But the event aggregator is always in place, for the ease of passing events through the entire app.
The code example shows that even though they are removed from the collection and don't live in the house anymore, they still execute the hide command when a burglar enters the house.
I see that even when the event binding runs in the direction Object1 -> listening -> Object2, the binding has to be removed in order for Object1 to lose any live reference.
And since listening to the Model's remove event is not a solution, because it is not fired on a Collection.reset() call, we have two options:
1. Overwrite the normal Collection cleanup
As @dira says here, you can overwrite Collection._removeReference to do a more proper cleanup of the model.
I don't like this solution for two reasons:
I don't like to overwrite a method that has to call super afterwards.
I don't like to overwrite private methods.
2. Over-wrapping your Collection.reset() calls
Which is the opposite: instead of adding deeper functionality, add functionality on top.
Then, instead of calling Collection.reset() directly, you can call an implementation that cleans up the models before they are silently removed:
cleanUp: function( data ){
this.each( function( model ) { model.unlink(); } );
this.reset( data );
}
A shorter version of your code could look like this:
AppEvents = {};
_.extend(AppEvents, Backbone.Events)
var User = Backbone.Model.extend({
initialize: function(){
AppEvents.on('my_event', this.listen, this);
},
listen: function(){
console.log("%s still listening...", this.get('name'));
},
unlink: function(){
AppEvents.off( null, null, this );
}
});
var Users = Backbone.Collection.extend({
model: User,
cleanUp: function( data ){
this.each( function( model ) { model.unlink(); } );
this.reset( data );
}
});
// testing
var users = new Users([{name: 'John'}]);
console.log('users.size: ', users.size()); // 1
AppEvents.trigger('my_event'); // John still listening...
users.cleanUp();
console.log('users.size: ', users.size()); // 0
AppEvents.trigger('my_event'); // (nothing)
Check the jsFiddle.
Update: Verification that the Model is removed after removing the event-binding link.
First we verify that Object1 listening to an event on Object2 creates a link in the direction Object2 -> Object1:
In the above image we see that the Model (#314019) is retained not only by the users collection but also by the AppEvents object that it is observing. From a programmer's perspective the event link looks like object that listens -> to -> object that is listened to, but in fact it is exactly the opposite: object that is listened to -> to -> object that is listening.
Now, if we use Collection.reset() to empty the Collection, we see that the users link has been removed but the AppEvents link remains:
The users link has disappeared, and so has the OurModel.collection link, which I think is part of Collection._removeReference()'s job.
When we use our Collection.cleanUp() method the object disappears from memory. I can't get the Chrome profiler to explicitly tell me that object #314019 has been removed, but I can see that it is no longer among the objects in memory.
I think cleaning up references is a tricky part of Backbone.
When you remove a Model from a Collection, the Collection takes care to unbind any event on the Model that the Collection itself has bound. Check this private Collection method.
Maybe you can use such a same technique in your Aggregator:
// ... Aggregator code
the_model.on( "remove", this.unlinkModel, this );
// ... more Aggregator code
unlinkModel: function( model ){
model.off( null, null, this );
}
This is for the case where the direction of the binding is Aggregator -> Model. If the direction is the opposite, I don't think you have to do any cleanup after the Model is removed.
Instead of wrapping Collection's reset with cleanUp as fguillen suggested, I prefer extending Collection and overriding reset directly. The reason is that
cleanUp takes effect only in the client's code, not in the library's (i.e. Backbone's).
For example, Collection.fetch may internally call Collection.reset. Without modifying Backbone's source code, we cannot unbind models from events (as in cleanUp) after calling Collection.fetch.
Basically, my suggested snippet is as follows:
var MyCollection = Backbone.Collection.extend({
reset: function(models, options) {
this.each(function(model) {
model.unlink(); // same as fguillen's code
});
Backbone.Collection.prototype.reset.apply(this, arguments);
}
});
Later, we can create new collections based on MyCollection.
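A sketch of how a collection based on MyCollection might be used (User is the model from the earlier snippets; the url is hypothetical):
var Users = MyCollection.extend({
  model: User,
  url: '/api/users'   // hypothetical endpoint
});

var users = new Users([{ name: 'John' }]);

users.reset();                 // explicit resets now unlink the models first
users.fetch({ reset: true });  // so does a fetch that resets (pass {reset: true} on Backbone versions where fetch merges by default)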