Persisting & loading metadata in a backbone.js collection - javascript

I have a situation using backbone.js where I have a collection of models, plus some additional information about the models. For example, imagine that I'm returning a list of amounts: each model has a quantity associated with it. Assume now that the unit for each of the amounts is always the same: say quarts. Then the JSON object I get back from my service might be something like:
{
    dataPoints: [
        { quantity: 5 },
        { quantity: 10 },
        ...
    ],
    unit: "quarts"
}
Now Backbone collections have no native mechanism for associating this metadata with the collection, but it was suggested to me in this question: Setting attributes on a collection - backbone js that I can extend the collection with a .meta(property, [value]) style function - which is a great solution. Naturally, it follows that we'd like to be able to cleanly retrieve this data from a JSON response like the one above.
Backbone.js gives us the parse(response) function, which, in combination with the url attribute, lets us specify where to extract the collection's list of models from. There is no way that I'm aware of, however, to make that function any more intelligent without overriding fetch(), which would throw away the partial functionality that is already available.
My question is this: is there a better option than overriding fetch() (and trying to get it to call its superclass implementation) to achieve what I want to achieve?
Thanks

Personally, I would wrap the Collection inside another Model, and then override parse, like so:
var DataPointsCollection = Backbone.Collection.extend({ /* etc etc */ });

var CollectionContainer = Backbone.Model.extend({
    defaults: {
        dataPoints: new DataPointsCollection(),
        unit: "quarts"
    },
    parse: function(obj) {
        // update the inner collection
        this.get("dataPoints").refresh(obj.dataPoints);
        // this might not be necessary
        delete obj.dataPoints;
        return obj;
    }
});
The Collection.refresh() call replaces the collection's contents with the new values (in later versions of Backbone, refresh was renamed reset). Passing a custom meta value into the Collection, as previously suggested, might stop you from being able to bind to those meta values.
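For example, usage might then look something like this (a minimal sketch; the endpoint and the bindings are assumptions, and it targets the same older Backbone version in which the collection event/method is still called refresh):
var container = new CollectionContainer();
container.url = "/dataPoints"; // hypothetical endpoint returning the JSON from the question

// fetch() runs the response through parse(), which refreshes the inner collection
// and leaves "unit" as an ordinary model attribute you can read or bind to
container.fetch();

container.bind("change:unit", function(model, unit) { console.log(unit); });
container.get("dataPoints").bind("refresh", function(dataPoints) { /* re-render */ });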

This metadata does not belong on the collection. It belongs in the name or some other descriptor of the code. Your code should declaratively know that the collection it has is only full of quarts elements. It already does, since the url points to quarts elements.
var quartsCollection = new FooCollection();
quartsCollection.url = quartsUrl;
quartsCollection.fetch();
If you really need to get this data, why don't you just call
_.uniq(quartsCollection.pluck("unit"))[0];

Related

Creating and updating nested objects in Ember

I got nested JSON data from the server like this:
{
    name: "Alice",
    profile: {
        something: "abc"
    }
}
and I have the following model:
App.User = Ember.Object.extend({
    name: null,
    profile: Ember.Object.extend({
        something: null
    })
});
If I simply do App.User.create(attrs) or user.setProperties(attrs), the profile object gets overwritten by a plain JS object, so currently I'm doing this:
var profileAttrs = attrs.profile;
delete attrs.profile;

user.setProperties(attrs); // or user = App.User.create(attrs);
user.get('profile').setProperties(profileAttrs);
It works, but I've got it in a few places, and in the real code I've got more than one nested object, so I was wondering if it's OK to override the User#create and User#setProperties methods to do this automatically. Maybe there's a better way?
Based on your comment, you want the automatic merging behaviour you get with models (the sort of thing you get with .extend()). In that case, you could try registering a custom transformer, something like:
App.ObjectTransform = DS.Transform.extend({
    deserialize: function(json) {
        return Ember.Object.create(json);
    }
});

App.User = DS.Model.extend({
    profile: DS.attr('object')
});
See: https://github.com/emberjs/data/blob/master/TRANSITION.md#json-transforms
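With the transform registered, the nested profile hash from the question comes back as a real Ember.Object, so nested gets and sets behave as you'd expect. A rough, untested sketch:
// assuming a user record materialized from { profile: { something: "abc" } }
this.store.find('user', 1).then(function(user) {
    user.get('profile.something');          // "abc" - profile is an Ember.Object now
    user.set('profile.something', 'xyz');   // nested set works too
});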
If you are doing your server requests without an adapter, you can use the model class method load() with either an array of JSON objects or a single object. This will refresh any known records already cached and stash away the JSON for future primary-key-based lookups. You can also call load() on a model instance with a JSON hash, but it will only update that single model instance.
It's unclear why you are not using an adapter. You can extend one of the Ember Model adapters and override the record loading there, e.g. extend from the RESTAdapter and do any required transform on the JSON by overriding _loadRecordFromData.
You can also override your model's load function to transform received data if required. The Ember Model source is fairly easy to read, so it's not hard to extend to your requirements.

Breeze JS: Is there a way to query entities from data.results?

I have a Breeze Web API controller with methods that accept parameters and do some work, filtering, sorting, etc., on the server.
In querySucceeded, I'd like to do further querying on data.results. Is there a way to accomplish this? I got this working by exporting/importing data.results to a local manager and doing the projection from there. The projection is needed in order to use the observable collection in a vendor grid control.
var query = datacontext.EntityQuery.from("GetActiveCustomers")
    .withParameters({ organizationID: "1" })
    .toType("Customer")
    .expand("Organization")
    .orderBy('name');

var queryProjection = query
    .select("customerID, organizationID, name, isActive, organization.name");

return manager.executeQuery(query)
    .then(querySucceeded)
    .fail(queryFailed);

function querySucceeded(data) {
    var exportData = manager.exportEntities(data.results);
    var localManager = breeze.EntityManager.importEntities(exportData);
    var resultProjection = localManager.executeQueryLocally(queryProjection);
    // This is the way I came up with to query data.results (exporting/importing
    // the result entities to a local manager).
    // Is there a better way to do this? Querying data.results directly, e.g.
    // data.results.where(...).select("customerID, organizationID...")
    if (collectionObservable) {
        collectionObservable(resultProjection);
    }
    log('Retrieved Data from remote data source', data, true);
}
You've taken an interesting approach. Normally a projection returns uncacheable objects, not entities. But you cast these to Customer (with the toType clause), which means you've created PARTIAL Customer entities with missing data.
I must hope you know what you are doing and have no intention of saving changes to these customer entities while they remain partial else calamity may ensue.
Note that when you imported the selected Customers to the "localManager" you did not bring along their related Organization entities. That means an expression such as resultProjection[0].organization will return null. That doesn't seem correct.
I understand that you want to hold on to a subset of the Customer partial entities and that there is no local query that could select that subset from cache because the selection criteria are only fully known on the server.
I think I would handle this need differently.
First, I would bury all of this logic inside the DataContext itself; the purpose of a DataContext is to encapsulate the details of data access so that callers (such as ViewModels) don't have to know internals. The DataContext is an example of the UnitOfWork (UoW) pattern, an abstraction that helps isolate the data access/manipulation concerns from ViewModel concerns.
Then I would store it either in a named array of the DataContext (DC) or of the ViewModel (VM), depending upon whether this subset was of narrow or broad interest in the application.
If only the VM instance cares about this subset, then the DC should return the data.results and let the VM hold them.
I do not understand why you are re-querying a local EntityManager for this set, nor why your local query is ALSO applying a projection ... which would return non-entity data objects to the caller. What is wrong with returning the (partial) Customer entities?
It seems you intend to further filter the subset on the client. Hey ... it's a JavaScript array. You can call stuffArray.filter(filterFunction).
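For example, something along these lines is often enough (a sketch; it assumes the Knockout model library, where entity properties are observable functions):
function querySucceeded(data) {
    // plain JavaScript array methods on data.results - no second EntityManager needed
    var active = data.results.filter(function (c) { return c.isActive(); });
    var slimmed = active.map(function (c) {
        return { customerID: c.customerID(), name: c.name() };
    });
    collectionObservable(slimmed);
}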
Sure that doesn't give you the Breeze LINQ-like query syntax ... but do you really need that? Why do you need ".select" over that set?
If that REALLY is your need, then I guess I understand why you're dumping the results into a separate EntityManager for local use. In that case, I believe you'll need more code in your query callback method to import the related Organization entities into that local EM so that someCustomer.organization returns a value. The ever-increasing trickiness of this approach makes me uncomfortable but it is your application.
If you continue down this road, I strongly encourage you to encapsulate it either in the DC or in some kind of service class. I wouldn't want my VMs to know about any of these shenanigans.
Best of luck.
Update 3 Oct 2013: Local cache query filtering on unmapped property
After sleeping on it, I have another idea for you that eliminates your need for a second EM in this use case.
You can add an unmapped property to the client-side Customer entity and set that property with a subset marker after querying the "GetActiveCustomers" endpoint on the server; you'd set the marker in the query callback.
Then you can compose a local query that filters on the marker value to ensure you only consider Customer objects from that subset.
Reference the marker value only in local queries. I don't know if a remote query filtering on the marker value will fail or simply ignore that criterion.
You won't need a separate local EntityManager; the Customer entities in your main manager carry the evidence of the server-side filtering. Of course the server will never have to deal with your unmapped property value.
Yes, a breeze local query can target unmapped properties as well as mapped properties.
Here's a small demonstration. Register a custom constructor like this:
function Customer() { /* custom constructor ... which you register with the metadataStore */
    // Add unmapped 'subset' property to be queried locally.
    this.subset = Math.ceil(Math.random() * 3); // simulate values {1..3}
}
Later you query it locally. Here are examples of queries that do and do not reference that property:
// All customers in cache
var x = breeze.EntityQuery.from("Customers").using(manager).executeLocally();
// All customers in cache whose unmapped 'subset' property === 1.
var y = breeze.EntityQuery.from("Customers")
    .where("subset", 'eq', 1) // more criteria; knock yourself out
    .using(manager).executeLocally();
I trust you'll know how to set the subset property appropriately in your callback to your "GetActiveCustomers" query.
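Something like this, perhaps (a sketch; it assumes the 'backingStore' model library, so properties are plain values - with Knockout you'd write customer.subset(1) instead):
function querySucceeded(data) {
    data.results.forEach(function (customer) {
        customer.subset = 1; // mark: "came back from GetActiveCustomers"
    });
    // later, local-only:
    // breeze.EntityQuery.from("Customers").toType("Customer")
    //     .where("subset", "eq", 1).using(manager).executeLocally();
}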
HTH
Once you have queried for some data, Breeze stores those entities in local memory.
All you have to do is query locally when you need to filter the data some more.
You do this by telling the manager to query locally:
manager.executeQueryLocally(query);
Because querying the database is done asynchronously, you have to make sure you only query local memory once something is actually there. Follow the promises.
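Roughly, the flow looks like this (a sketch reusing names from the question; the extra where clause is just an example of further local filtering):
// hit the server once...
return manager.executeQuery(query)
    .then(function () {
        // ...then filter further against the cache, with no extra round trip
        var localQuery = breeze.EntityQuery.from("Customers")
            .toType("Customer")
            .where("isActive", "==", true);
        collectionObservable(manager.executeQueryLocally(localQuery));
    })
    .fail(queryFailed);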

Optimal URL structure for Backbone.js and Backbone implementation

I'm developing a RESTful API for a Quiz app, which is going to be built with Backbone.js and Marionette. I'm quite new to Backbone and was wondering what the best URL structure would be. I have the following resources:
Answer,
Question which contains Answers,
Question Group which contains Questions,
Quiz which contains Question Groups.
Two possible URL structures come to mind:
GET /quizzes/:id
GET /quizzes/:id/questiongroups
GET /quizzes/:id/questiongroups/:id
GET /quizzes/:id/questiongroups/:id/questions
GET /quizzes/:id/questiongroups/:id/questions/:id
GET /quizzes/:id/questiongroups/:id/questions/:id/answers
or:
GET /quizzes/:id
GET /quizzes/:id/questiongroups
GET /questiongroups/:id
GET /questiongroups/:id/questions
...
Now, I have been trying to use both of these options. With the first one, I can't figure out how to define the collections as a property of the parent models in Backbone so that I can use fetch() on them. The problem with the second option is a bit different: as I understand it, Backbone derives the url for a model from its collection, but the collection is a child of another resource, whereas the url for getting a single resource uses another collection, namely the global set of resources.
I'm pretty sure I'd have to override url() in both cases. I tried some things but didn't come up with anything usable at all. Also, I'd rather not override url() on every single model in the app; changing the API structure to suit the preferences of Backbone seems like a better option to me.
Any pointers as to what seems the right way to do it with Backbone would be great!
Thanks
If questiongroups can only appear in a single quiz, then the first option (the hierarchical one) is an obvious choice. To comply with RESTful conventions, you might want to consider using singular nouns instead: /quiz/:id/questiongroups/:id/question/:id/answer/:id
To solve your fetching problem, I would recommend using nested backbone models as per this answer: https://stackoverflow.com/a/9904874/1941552. I've also added a cheeky little parentModel attribute.
For example, your QuizModel could look something like this:
var Quiz = Backbone.Model.extend({
    urlRoot: '/quiz/', // backbone appends the id automatically :)
    defaults: {
        title: 'My Quiz',
        description: 'A quiz containing some question groups.'
    },
    model: {
        questionGroups: QuestionGroups
    },
    parse: function(response) {
        for (var key in this.model) {
            var embeddedClass = this.model[key];
            var embeddedData = response[key];
            response[key] = new embeddedClass(embeddedData, {
                parse: true,
                parentModel: this
            });
        }
        return response;
    }
});
Then, your QuestionGroups model could have the following url() function:
var QuestionGroups = Backbone.Model.extend({
    // store metadata and each individual question group
    url: function() {
        return this.parentModel.url() + '/questiongroup/' + this.id;
    }
});
Alternatively, if you don't need to store any metadata, you could use a Backbone.Collection:
var QuestionGroups = Backbone.Collection.extend({
    model: QuestionGroup,
    url: function() {
        return this.parentModel.url() + '/questiongroup/' + this.id;
    }
});
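Usage would then presumably look something like this (untested; it also assumes QuestionGroups stores the parentModel option it is handed, e.g. in initialize, which the snippets above leave implicit):
// e.g. initialize: function(attrs, options) { this.parentModel = options.parentModel; }
var quiz = new Quiz({ id: 3 });
quiz.fetch().done(function() {
    var groups = quiz.get('questionGroups');  // wrapped by parse()
    console.log(groups.parentModel === quiz); // true
});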
I'm afraid I haven't tested any of this, but I hope it can be useful anyway!

How to update the whole Backbone.js collection that is in the database?

I have a collection (an object list) in the database. I can fetch it like: collectionModel.fetch()
But then the user changes something in that collection. When the user clicks the save button, the whole collection list must be updated in the database. I thought maybe I could delete() the old one first and then create() it with the new one, but I couldn't achieve it. I can't use the update() method, because in that case I would have to find which collection elements have changed, and I want to update the whole list. How can I do that? Thanks for the help.
Do you have a REST API in front of that database? That's what Backbone is designed to work with. When your JavaScript code runs model.save(), a PUT request is made to your API for that model.
Your question is about saving the whole collection; to do that and stay within the default implementation of Backbone, you will have to go over all the models in the collection and call save on each of them.
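That could be as simple as the following (each save() issues its own request):
collection.each(function(model) {
    model.save();
});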
If you want to make one single request to your server you will have to implement a custom method inside your collection. Something like:
MyCollection = Backbone.Collection.extend({
    saveAll: function() {
        var data = this.toJSON();
        return Backbone.$.ajax({
            data: { objects: data },
            url: '/url/in/your/server/to/update/db'
        });
    }
});
That's going to send the array of all models in your collection converted to JSON to your server.
Again, you want to have a RESTful API on the server side if you want to make your life with Backbone easy.
If you want to reset the collection, you have to specify the "reset" option.
collectionList.fetch({
    reset: true,
    ...
});
But I think it's better to just update it:
collectionList.fetch({
    remove: false,
    update: true,
    merge: true,
    ...
});
This is a very old question, but I had another approach so I thought I'd post it.
Sometimes my collections have a lot of data and the server doesn't get it all. I solved this by using one of the Underscore methods that Backbone collections have, invoke (this also relies on jQuery):
MyCollection = Backbone.Collection.extend({
    update: function(callback) {
        // Invoke the update method on all models
        $.when.apply($, this.invoke('update')).then(() => {
            // After completion, call the callback method (if passed)
            if (callback) {
                callback();
            }
        });
    }
});
You can use it by calling collection.update() when the collection has models in it. A similar method can be used for creating or deleting collections, and this should be modifiable to catch errors but I didn't account for that.

Any recommendations for deep data structures with Backbone?

I've run into a headache with Backbone. I have a collection of specified records, which have subrecords, for example: surgeons have scheduled procedures, procedures have equipment, some equipment has consumable needs (gases, liquids, etc). If I have a Backbone collection of surgeons, then each surgeon has a model, but his procedures and equipment and consumables will all be plain ol' JavaScript arrays and objects after being unpacked from JSON.
I suppose I could, in the SurgeonsCollection, use parse() to make new ProcedureCollections, and in turn make new EquipmentCollections, but after a while this turns into a hairball. To make it sensible server-side there's a single point of contact that takes one surgeon and his stuff as a POST, so propagating the 'set' on a ConsumableModel to automagically trigger a 'save' down the hierarchy also makes the whole hierarchical approach fuzzy.
Has anyone else encountered a problem like this? How did you solve it?
This can be helpful in your case: https://github.com/PaulUithol/Backbone-relational
You specify the relations (1:1, 1:n, n:n) and it will parse the JSON accordingly. It also creates a global store to keep track of all records.
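For the surgeon/procedure part of the question, the relation definitions might look roughly like this (model and key names are assumptions, not from the question):
var Surgeon = Backbone.RelationalModel.extend({
    relations: [{
        type: Backbone.HasMany,
        key: 'procedures',                     // attribute in the surgeon JSON
        relatedModel: 'Procedure',
        collectionType: 'ProcedureCollection',
        reverseRelation: {
            key: 'surgeon'                     // each procedure points back to its surgeon
        }
    }]
});
// Procedure could declare a similar HasMany relation for its equipment, and so on.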
So, one way I solved this problem is by doing the following:
Have all models inherit from a custom BaseModel and put the following function in BaseModel:
convertToModel: function(dataType, modelType) {
    if (this.get(dataType)) {
        var map = { };
        map[dataType] = new modelType(this.get(dataType));
        this.set(map);
    }
}
Override Backbone.sync and at first let the Model serialize as it normally would:
model.set(response, { silent: true });
Then check to see if the model has an onUpdate function:
if (model.onUpdate) {
    model.onUpdate();
}
Then, whenever you have a model that you want to generate submodels and subcollections, implement onUpdate in the model with something like this:
onUpdate: function() {
    this.convertToModel('nameOfAttribute1', SomeCustomModel1);
    this.convertToModel('nameOfAttribute2', SomeCustomModel2);
}
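Applied to the question's structure, that might look like the following (hypothetical names, not from the answer):
var Surgeon = BaseModel.extend({
    onUpdate: function() {
        // turn the plain arrays from the JSON into real collections
        this.convertToModel('procedures', ProcedureCollection);
        this.convertToModel('equipment', EquipmentCollection);
    }
});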
I would separate out the different surgeons, procedures, equipment, etc. as different resources in your web service. If you only need to update the equipment for a particular procedure, you can update that one procedure.
Also, if you didn't always need all the information, I would also lazy-load data as needed, but send down fully-populated objects where needed to increase performance.
