Breeze Create Entity with PascalCased initial values - javascript

I have an Angular + Breeze + WebApi(EF) setup which works very well when querying for data. The breezeProvider is set for camelCase on the client by default, and PascalCase on the server side.
bp.NamingConvention.camelCase.setAsDefault();
The trouble I'm having is when pushing new data over SignalR to my app. The data arrives PascalCased from the .NET stack (Breeze explicitly says not to mess with the casing on the server). I then use the standard factory to create a new entity and try to initialize it with the passed values. Since the initializer hash is all PascalCased, most of the properties fail to initialize properly. Is there a way to tell Breeze that it should convert this data the same way it does when querying? I have not yet been successful. Basically, I just want Breeze to treat this data the same as it treats all of the data it receives.
function onSignalRData(call) {
    var callType = manager.metadataStore.getEntityType("Call");
    // call json from server is like { Id: 1, Name: "Paul", IsActive: true }
    // I think Breeze expects { id: 1, name: "Paul", isActive: true }
    var newCall = callType.createEntity(call);
}
Any suggestions? I see Breeze mentions manually defined naming conventions, but again I'm not doing anything exotic here, just doing the same thing that Breeze does on queries but in a creation initializer.

This may not be the best solution, but it seems to work for me. I used the same approach as for camelCasing query results and re-parsed the data with a reviver function that camelCases the keys. I'm surprised that there isn't a pipeline hook in breeze entityManager.createEntity(...) that can handle this.
// Lower-case the first letter of every key: assigning to this[...] adds the
// camelCased copy, and returning undefined drops the original PascalCased key.
// The root value (empty key) is returned unchanged.
function reviver(key, val) {
    if (key)
        this[key.charAt(0).toLowerCase() + key.slice(1)] = val;
    else
        return val;
}
var parsed = JSON.parse(myjson, reviver);
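For completeness, here is how the reviver might be combined with the SignalR handler from the question. This is just a sketch: the round trip through JSON.stringify is only needed because the payload arrives as an already-parsed PascalCased object, and the addEntity call is shown only for illustration (attach the new entity however your app normally does).
function onSignalRData(call) {
    var callType = manager.metadataStore.getEntityType("Call");
    // Re-serialize and re-parse so the reviver can camelCase every key:
    // { Id: 1, Name: "Paul", IsActive: true } -> { id: 1, name: "Paul", isActive: true }
    var camelCased = JSON.parse(JSON.stringify(call), reviver);
    var newCall = callType.createEntity(camelCased);
    manager.addEntity(newCall);
}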

Related

AWS DynamoDB data with and/or without types?

I'm using the aws-sdk for NodeJS to interact with a DynamoDB table. This is my first look at DynamoDB.
When using a call like getItem() or updateItem(), the data structure includes types, like this:
{
    'a': { S: 'My string' }
}
Is there any way to pass and receive these objects without the types..? So...
{
    'a': 'My string'
}
Or, any helper functions already written that will convert objects to and from this format..?
const data = dbToObj({ 'a': { S: 'My string' } })
// { 'a': 'My string' }
So I could convert to it when populating call params, and convert from it when receiving data.
Trying to avoid accessing my data like:
const myData = data.Item.something.M.somethingElse.M.qwerty.S
I know I could write something myself, just wondering if anyone knows of functions/options already available that would do this. I couldn't find any.
Found this:
https://github.com/kayomarz/dynamodb-data-types
https://www.npmjs.com/package/dynamodb-data-types
Exactly what I was looking for.
Install: npm i dynamodb-data-types
Provides wrap({ 'a': 'My string' }) and unwrap({ 'a': { S: 'My string' } }) methods for doing the conversion to and from plain objects.
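A quick sketch of how that looks in practice (the require path and method names are taken from the package README):
var attr = require('dynamodb-data-types').AttributeValue;

attr.wrap({ a: 'My string', n: 5 });
// => { a: { S: 'My string' }, n: { N: '5' } }

attr.unwrap({ a: { S: 'My string' }, n: { N: '5' } });
// => { a: 'My string', n: 5 }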
UPDATE
I've now also found this: AWS.DynamoDB.DocumentClient, which is part of the aws-sdk.
The document client simplifies working with items in Amazon DynamoDB by abstracting away the notion of attribute values. This abstraction annotates native JavaScript types supplied as input parameters, as well as converts annotated response data to native JavaScript types.
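Roughly, usage looks like this with the v2 SDK (the table and key names here are made up):
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

docClient.get({
    TableName: 'MyTable',      // hypothetical table
    Key: { id: '123' }
}, (err, data) => {
    if (err) return console.error(err);
    // data.Item is a plain JS object, e.g. { id: '123', a: 'My string' }
    console.log(data.Item);
});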
UPDATE 2
This is being worked on by Amazon under their awslabs github page:
Amazon DynamoDB Automarshaller
This library provides a Marshaller class that converts native JavaScript values to DynamoDB AttributeValues and back again. It's designed to work with ES6 features like sets, maps, and iterables, and can be configured to support data types only supported by JavaScript (such as empty binary buffers) or by Amazon DynamoDB (such as numbers of arbitrary size) with minimal tradeoffs.
It's part of their DynamoDB DataMapper For JavaScript package.
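If I'm reading its README correctly, usage is roughly as follows (package and method names may have changed since this was written):
const { Marshaller } = require('@aws/dynamodb-auto-marshaller');
const marshaller = new Marshaller();

marshaller.marshallItem({ a: 'My string' });
// => { a: { S: 'My string' } }

marshaller.unmarshallItem({ a: { S: 'My string' } });
// => { a: 'My string' }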
We use dynamo-easy with TypeScript for our production applications (directly from the browser or inside Lambda functions).
It provides the mapping from JS to DynamoDB types but also some nice abstraction for the request API.
import { Model, PartitionKey, DynamoStore } from '@shiftcoders/dynamo-easy'

@Model()
export class Person {
  @PartitionKey()
  id: string
  name: string
  yearOfBirth: number
}

const personStore = new DynamoStore(Person)

personStore
  .scan()
  .whereAttribute('yearOfBirth').equals(1958)
  .exec()
  .then(res => console.log('ALL items with yearOfBirth == 1958', res))
full disclosure: I am one of the authors of the library
Dynogels offers a cleaner way to handle this without worrying about types.
We use it in production and it works without any issues.
https://github.com/clarkie/dynogels
Account.create({ email: 'foo@example.com', name: 'Foo Bar', age: 21 }, function (err, acc) {
    console.log('created account in DynamoDB', acc.get('email'));
});
It is an ODM (Object Data Mapper) for DynamoDB.
Hope it helps.
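For context, the Account model used above is defined something like this (schema details are illustrative; dynogels uses Joi for validation):
const dynogels = require('dynogels');
const Joi = require('joi');

dynogels.AWS.config.update({ region: 'us-east-1' }); // adjust for your setup

const Account = dynogels.define('Account', {
    hashKey: 'email',
    timestamps: true,
    schema: {
        email: Joi.string().email(),
        name: Joi.string(),
        age: Joi.number()
    }
});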
Dynamoose is another modelling tool that can abstract even more extraneous code away; see its GitHub repo for details. I think it's built on top of the AWS DocumentClient, although I haven't done much digging.

Sending a list of custom objects from server to client

I made a Character class shared by the client and server in the same js file.
The server instantiates these characters and stores them in a characterList object. I send it to the client with socket.emit( 'updated_characters', characterList ), but the client gets back an object of plain Objects instead of an object of Characters, so I can't use my Character methods on them.
How can I get around that?
You can't send custom object types directly using socket.io. socket.io uses JSON as the transport format for objects and JSON does not have the ability to record or rebuild custom object types.
So, what you need to do is to create a serialized format for your objects (which will also likely be JSON) and then when you read the data on the other end, you will construct a custom object in your own code and then pass it this data in the constructor from which it will initialize itself.
The canonical way to do this is to create a serialize method for your custom object that creates a new plain object containing all the relevant info that needs to be sent over the wire and no references to other objects (since those can't be serialized). Keep in mind that objects often carry housekeeping information that doesn't need to be sent to the other end. Then create an initialize method for your custom object that can take this serialized data and properly initialize the object you use on the other end.
Here's a simple example:
function MyObject(data) {
    if (data) {
        this.init(data);
    } else {
        this.count = 0;
        this.greeting = "hello";
    }
}

MyObject.prototype = {
    init: function(data) {
        this.greeting = data.greeting;
        this.count = data.count;
    },
    getData: function() {
        return {greeting: this.greeting, count: this.count};
    },
    talk: function() {
        say(this.greeting);
    }
};
Then, on the sending end of things, if you have an instance of your object in item, you would send the data with:
socket.emit('updated_characters', item.getData());
And, on the receiving side of things, you would have this:
socket.on('updated_characters', function(data) {
    var item = new MyObject(data);
    // do something with the item here
});
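Since the question is about a whole characterList rather than a single item, the same pattern can be mapped over the collection. A sketch, assuming characterList is a plain object keyed by id and that Character follows the same constructor-takes-data pattern as MyObject above:
// server: serialize every character before emitting
var payload = {};
Object.keys(characterList).forEach(function(id) {
    payload[id] = characterList[id].getData();
});
socket.emit('updated_characters', payload);

// client: rebuild real Character instances from the plain data
socket.on('updated_characters', function(payload) {
    var characters = {};
    Object.keys(payload).forEach(function(id) {
        characters[id] = new Character(payload[id]);
    });
    // characters[id].talk() etc. work again here
});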
socket.emit( 'updated_characters', JSON.stringify(characterList) );
We have to JSON.stringify the list so that socket.io sends it as JSON and the front end can display it.

Dojo JsonRestStore with array not at root-level of JSON response

Is there a way to configure a JsonRestStore to work with an existing web service that returns an array of objects which is not at the root-level of the JSON response?
My JSON response is currently similar to this:
{
    message: "",
    success: true,
    data: [
        { name: "Bugs Bunny", id: 1 },
        { name: "Daffy Duck", id: 2 }
    ],
    total: 2
}
I need to tell the JsonRestStore that it will find the rows under "data", but I can't see a way to do this from looking at the documentation. Schema seems like a possibility but I can't make sense of it through the docs (or what I find in Google).
My web services return data in a format expected by stores in Ext JS, but I can't refactor years worth of web services now (dealing with pagination via HTTP headers instead of query string values will probably be fun, too, but that's a problem for another day).
Thanks.
While it's only barely called out in the API docs, there is an internal method in dojox/data/JsonRestStore named _processResults that happens to be easily overridable for this purpose. It receives the data returned by the service and the original Deferred from the request, and is expected to return an object containing items and totalCount.
Based on your data above, something like this ought to work:
// declare comes from dojo/_base/declare; JsonRestStore from dojox/data/JsonRestStore
var CustomRestStore = declare(JsonRestStore, {
    _processResults: function (results) {
        return {
            items: results.data,
            totalCount: results.total
        };
    }
});
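Usage is then the same as for a stock JsonRestStore, just instantiating the subclass (the target URL is a placeholder):
var store = new CustomRestStore({
    target: "/services/invoices/"   // placeholder service URL
});

store.fetch({
    onComplete: function (items) {
        // items come from results.data; results.total feeds the store's total count
        console.log(items);
    }
});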
The idea with dojo/store is that reference stores are provided, but they are intended to be customized to match whatever data format you want. For example, https://github.com/sitepen/dojo-smore has a few additional stores (e.g. one that handles Csv data). These stores provide good examples for how to handle data that is offered under a different structure.
There's also the new dstore project, http://dstorejs.io/, which is going to eventually replace dojo/store in Dojo 2, but works today against Dojo 1.x. This might be easier for creating a custom store to match your data structure.

AngularJS/Mongoose: how to assign objects if you have the ID

I have a way that works, it just seems like a stupid way of doing things.
I have an Invoice object ($resource) that has a client value, which, because of the way Mongoose works, is assigned to an ID. But, I want to use the client values (e.g. client.name, client.address, etc.) in the invoice views in Angular. Here's the code I have, which takes invoice.client (an ID) and reassigns the whole client object:
Invoices.query(function(invoices) {
    angular.forEach(invoices, function(invoice) {
        Clients.get({
            clientId: invoice.client
        }, function(client) {
            invoice.client = client;
        });
    });
    $scope.invoices = invoices;
});
This seems very redundant, because I already have all the clients loaded in $scope.clients, but I couldn't think of a way to use those instead, which would require far fewer database calls.
$scope.clients is a promise object, since it is populated like this:
Clients.query(function(clients) {
    $scope.clients = clients;
});
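One way to avoid the per-invoice Clients.get() calls is to build an id-to-client map once the clients have loaded and do the lookup locally. A sketch (it assumes the client documents expose _id and that invoice.client holds that same id):
Clients.query(function(clients) {
    $scope.clients = clients;
    var clientsById = {};
    angular.forEach(clients, function(client) {
        clientsById[client._id] = client;
    });

    Invoices.query(function(invoices) {
        angular.forEach(invoices, function(invoice) {
            invoice.client = clientsById[invoice.client];  // swap the id for the object
        });
        $scope.invoices = invoices;
    });
});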

Creating and updating nested objects in Ember

I got nested JSON data from the server like this:
{
    name: "Alice",
    profile: {
        something: "abc"
    }
}
and I have the following model:
App.User = Ember.Object.extend({
    name: null,
    profile: Ember.Object.extend({
        something: null
    })
});
If I simply do App.User.create(attrs) or user.setProperties(attrs), the profile object gets overwritten by a plain JS object, so currently I'm doing this:
var profileAttrs = attrs.profile;
delete attrs.profile;
user.setProperties(attrs); // or user = App.User.create(attrs);
user.get('profile').setProperties(profileAttrs);
It works, but I've got it in a few places and in the real code I've got more than one nested object, so I was wondering if it's ok to override User#create and User#setProperties methods to do it automatically. Maybe there's some better way?
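If overriding setProperties turns out to be the way to go, a merge-aware version might look roughly like this. It is an untested sketch: it only handles one level of nesting, assumes the nested values are already Ember.Objects on the instance, and covers the setProperties path only (create would still need similar treatment):
App.User = Ember.Object.extend({
    name: null,

    setProperties: function(attrs) {
        var self = this;
        Object.keys(attrs).forEach(function(key) {
            var current = self.get(key);
            var value = attrs[key];
            if (current instanceof Ember.Object && value && typeof value === 'object') {
                current.setProperties(value);   // merge into the nested Ember.Object
            } else {
                self.set(key, value);           // plain value: behave as before
            }
        });
        return this;
    }
});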
Based on your comment, you want the automatic merging behaviour you get with models (the sort of thing you get with .extend()). In that case, you could try registering a custom transformer, something like:
App.ObjectTransform = DS.Transform.extend({
    deserialize: function(json) {
        return Ember.Object.create(json);
    }
});

App.User = DS.Model.extend({
    profile: DS.attr('object')
});
See: https://github.com/emberjs/data/blob/master/TRANSITION.md#json-transforms
If you are doing your server requests without an adapter you can use the model class method load() with either an array of json objects or a single object. This will refresh any known records already cached and stash away the JSON for future primary key based lookups. You can also call load() on a model instance with a JSON hash as well but it will only update that single model instance.
It's unclear why you are not using an adapter; you can extend one of the Ember Model adapters and override the record loading there, e.g. extend from the RESTAdapter and do any required transform on the JSON by overriding _loadRecordFromData.
You can also override your model's load function to transform received data if required. The Ember Model source is fairly easy to read, so it's not hard to extend to your requirements.
