I'm creating a record that should have a reference to another record.
I already created a record with recordName 'France' and record type 'Countries'. The record I now want to create looks like this:
var operations = container.publicCloudDatabase.newRecordsBatch(); // I'm normally creating many cities at once; newRecordsBatch() also works with only one record.
operations.create({
    recordName: 'Paris',
    recordType: 'Cities',
    fields: {
        Country: {
            value: 'France'
        }
    }
});
operations.commit().then(function(response) {
    if (response.hasErrors) {
        console.log(response.errors[0]);
    }
});
In the CloudKit Dashboard I have set Cities to have one reference to Countries via the field Country. However, when I run the code the server responds with a status of 400 (Bad Request).
I watched the WWDC video, and the only thing Apple says about references in CloudKit JS is to use a Reference object. I don't know what that is; I guess it's a JSON object, but does someone know what the keys/values of this object are?
A Reference object is a dictionary with the following keys:
recordName: The unique name used to identify the record within a zone. Required.
zoneID: Dictionary that identifies a record zone in the database.
action: The delete action for the reference object. NONE or DELETE_SELF or VALIDATE. Required.
Example of a good syntax for the Country field:
Country: {
    value: {
        recordName: 'France',
        action: 'DELETE_SELF'
    }
}
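Applied to the create call from the question, the whole batch would look like this (assuming the Countries record really is named 'France' and lives in the default zone):

var operations = container.publicCloudDatabase.newRecordsBatch();
operations.create({
    recordName: 'Paris',
    recordType: 'Cities',
    fields: {
        Country: {
            value: {
                recordName: 'France',
                action: 'DELETE_SELF'
            }
        }
    }
});
operations.commit().then(function(response) {
    if (response.hasErrors) {
        console.log(response.errors[0]);
    }
});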
More info available in the documentation, pages 68-69.
How can I write a join so that, when I run my query, it returns plain column names instead of entity_columnName?
I'm using TypeORM, and I tried this:
const data = this.conn.getRepository(User).createQueryBuilder('user');
data.leftJoinAndSelect('user.orders', 'orders');
data.getRawMany();
but it returns:
firstName: ...
lastName: ...
age: ...
order_name: ...
order_price: ...
instead of:
firstName: ...
lastName: ...
age: ...
name: ...
price: ...
Can someone tell me how to do this? Thanks for any help.
How are your User and Order entities defined?
If you define a relation (e.g. OneToMany) with the eager: true option, then TypeORM will automatically include the related entities when you query using the repository's find methods. It won't do this when you use the QueryBuilder, where you have to add the joins yourself, e.g. with leftJoinAndSelect() (as you have done).
An example from an Invoice entity that has OneToMany line items:
@OneToMany(
    () => InvoiceLineItem,
    (item: InvoiceLineItem) => item.invoice,
    { eager: true }
)
items: InvoiceLineItem[];
Per the example, if I were then to load invoices with the repository's find methods (e.g. find() or findOne()), the related objects in items would be included as well, because of eager: true.
This behaviour might translate well to your situation with users and orders.
Also be aware of the differences between getMany() and getRawMany() when using the query builder.
If you use getMany() then TypeORM will automagically give you entities back (e.g. you'd get an instance of User with property orders that is an array of Order instances). The property names will be correct.
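For example, a sketch based on the query from your question (it assumes the User entity really has an orders relation, and that this runs inside the same method as your snippet):

const users = await this.conn
    .getRepository(User)
    .createQueryBuilder('user')
    .leftJoinAndSelect('user.orders', 'orders')
    .getMany();

// users is User[]; each user.orders is Order[] with the entity's own
// property names (name, price), not the alias-prefixed column names
// (e.g. order_name) that getRawMany() returns.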
Since you added the NestJS tag to your question, also understand serialization:
https://docs.nestjs.com/techniques/serialization
There is a built-in ClassSerializerInterceptor that comes with NestJS that you might find useful.
In your controller you can decorate the class or any of its methods e.g.
@UseInterceptors(ClassSerializerInterceptor)
This will transform the response to JSON, and will apply the rules specified with class-transformer decorators on the entity/DTO class.
If you were to use this interceptor, the response sent to the client will have your desired property names.
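As a rough sketch (the controller, service, and entity names here are assumptions, not taken from your code):

import { ClassSerializerInterceptor, Controller, Get, UseInterceptors } from '@nestjs/common';
import { User } from './user.entity';           // hypothetical TypeORM entity path
import { UsersService } from './users.service'; // hypothetical service wrapping the repository

@Controller('users')
@UseInterceptors(ClassSerializerInterceptor)
export class UsersController {
    constructor(private readonly usersService: UsersService) {}

    @Get()
    async findAll(): Promise<User[]> {
        // Return entity instances (e.g. from repository.find()); the interceptor
        // then serializes them with class-transformer, applying any @Exclude/@Expose
        // rules declared on the User entity.
        return this.usersService.findAll();
    }
}

Note that the interceptor only applies class-transformer rules to class instances, so return entities or DTO instances rather than plain objects.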
If you really want to modify the response that your client gets back, you can also look into writing your own Interceptor.
I am developing a Node.js project in which the end user can create and update outlets in a MongoDB database. Now, let's say there are hundreds of fields in the outlet schema, but the user may not update each and every field, i.e. one may update 2 fields, another may update 3 fields. So I want to create a function that can handle each type of request. I don't know whether this can be done or whether there is some other way; I am new to this, so can you suggest something suitable for my project? Thanks in advance!
Sorry for the confusion earlier.
I am developing a project in Node.js for retail outlets. Each outlet has over 50 fields in the database when registering. Registration is fine: the POST request via the API specifies all the data required.
But when I want to update any of those fields, I am not sure what approach to take. Sometimes only 1 field will be changed; the next time, a bunch of them, in no particular order/sequence.
Example :
{"id":"ABCD", "name":"MNOPQR"}
{"id":"ABCD", "time":123, "name":"ZYX"}
So here, in the first request I need to update only the name, while in the next I need to update both name and time.
Is there any way I can handle the dynamic JSON parsing on the server side and update only those fields (in the database) that are mentioned in the request?
You have several approaches you can use.
One is to accept an object with the changes:
function doTheChanges(changes) {
    Object.keys(changes).forEach(name => {
        const value = changes[name];
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges({foo: "bar", biz: "baz"});
...to change the foo field to "bar" and the biz field to "baz". (Names that have invalid identifier chars, etc., can be put in double quotes: {"I'm an interesting field name": "bar"}.)
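Since the data lives in MongoDB, the per-field loop can even collapse into a single update, because $set accepts the whole changes object. A minimal sketch, assuming the official MongoDB driver and a collection named "outlets" (the names are illustrative):

// Apply only the provided fields in one update.
// `db` is assumed to be a connected MongoDB database handle.
async function doTheChanges(db, outletId, changes) {
    // e.g. changes = { name: "MNOPQR", time: 123 }
    // becomes { $set: { name: "MNOPQR", time: 123 } }
    await db.collection('outlets').updateOne(
        { _id: outletId },
        { $set: changes }
    );
}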
Alternately you could accept an array of changes with name and value properties:
function doTheChanges(changes) {
    changes.forEach(({name, value}) => {
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges([
    {
        name: "foo",
        value: "bar"
    },
    {
        name: "biz",
        value: "baz"
    }
]);
You could also just accept a varying number of parameters, where each parameter is an object with name and value properties:
function doTheChanges(...changes) {
    changes.forEach(({name, value}) => {
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges(
    {
        name: "foo",
        value: "bar"
    },
    {
        name: "biz",
        value: "baz"
    }
);
Note that that's very similar to the array option.
Or use the builder pattern, but it's probably overkill here.
Instead of a PUT request you can use a PATCH request, so that only the values you change get affected in the database.
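For example, a minimal Express route (a sketch; it assumes express.json() body parsing, a connected MongoDB db handle, and an "outlets" collection, all of which are illustrative):

// PATCH /outlets/ABCD with body {"name": "MNOPQR"} updates only "name".
app.patch('/outlets/:id', async (req, res) => {
    await db.collection('outlets').updateOne(
        { _id: req.params.id },
        { $set: req.body } // only the submitted fields are written
    );
    res.sendStatus(204);
});

In real code you would validate or whitelist req.body first so a client can't set arbitrary fields.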
First, we set up a scenario like so:
setupProject(server, []);
visit('/items');
This all works fine. The issue occurs when trying to update attributes of the current user prior to running the test.
Then update the current user with:
let user = server.create('user', 'organization', { enableManage: true });
This is intended to go to the specific user, go to an attribute object on that user called 'organization', and update an attribute of 'organization' called 'enableManage' to true.
Any help is appreciated.
You can always access Mirage's ORM via server.schema to mutate data in the database, prior to running a test.
let user = server.schema.users.find(1);
user.update({ organization: { enableManage: true } });
That would update the organization property of this user record in the db.
If organization is an object you might want to do a clone, something like:
user.update({ organization: Object.assign({}, user.organization, { enableManage: true }) });
By the way, depending on your API it looks like you might want to consider making organization a separate model, instead of a POJO that lives in each User's record.
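For instance, with ember-cli-mirage that could look roughly like this (a sketch; the model and attribute names are assumptions based on your description):

// mirage/models/user.js
import { Model, belongsTo } from 'ember-cli-mirage';
export default Model.extend({
    organization: belongsTo()
});

// mirage/models/organization.js
import { Model } from 'ember-cli-mirage';
export default Model.extend({});

// In a test, create the organization first and associate it with the user:
const organization = server.create('organization', { enableManage: true });
const user = server.create('user', { organization });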
Is there a way to configure a JsonRestStore to work with an existing web service that returns an array of objects which is not at the root-level of the JSON response?
My JSON response is currently similar to this:
{
    message: "",
    success: true,
    data: [
        { name: "Bugs Bunny", id: 1 },
        { name: "Daffy Duck", id: 2 }
    ],
    total: 2
}
I need to tell the JsonRestStore that it will find the rows under "data", but I can't see a way to do this from looking at the documentation. Schema seems like a possibility but I can't make sense of it through the docs (or what I find in Google).
My web services return data in a format expected by stores in Ext JS, but I can't refactor years worth of web services now (dealing with pagination via HTTP headers instead of query string values will probably be fun, too, but that's a problem for another day).
Thanks.
While it's only barely called out in the API docs, there is an internal method in dojox/data/JsonRestStore named _processResults that happens to be easily overridable for this purpose. It receives the data returned by the service and the original Deferred from the request, and is expected to return an object containing items and totalCount.
Based on your data above, something like this ought to work:
require([
    'dojo/_base/declare',
    'dojox/data/JsonRestStore'
], function (declare, JsonRestStore) {
    var CustomRestStore = declare(JsonRestStore, {
        _processResults: function (results) {
            return {
                items: results.data,
                totalCount: results.total
            };
        }
    });
});
The idea with dojo/store is that reference stores are provided, but they are intended to be customized to match whatever data format you want. For example, https://github.com/sitepen/dojo-smore has a few additional stores (e.g. one that handles Csv data). These stores provide good examples for how to handle data that is offered under a different structure.
There's also the new dstore project, http://dstorejs.io/ , which is going to eventually replace dojo/store in Dojo 2, but works today against Dojo 1.x. This might be easier for creating a custom store to match your data structure.
I successfully implemented loading and showing relations with Backbone Relational from an API I created, mostly by trial and error. I do think the docs are lacking some clarity, though; it took a lot of time to figure out how things work, especially how to map things to the API.
Problem
Adding a bookmark works; it's the editing and deletion that don't. The PUT becomes a POST and the DELETE simply doesn't fire at all. When I hardcode an id on the model it does work, so the id is missing, which makes sense for the PUT becoming a POST.
The problem seems to be that the id doesn't hold an actual id, but a collection. The view where the problem occurs does not require the BookmarkBinding; it's used somewhere else. Simply the fact that it has Bookmark as a relation makes the DELETE and PUT break.
BookmarkBinding model:
App.Model.BookmarkBinding = Backbone.RelationalModel.extend({
    defaults: {
        set_id: null,
        bookmark_id: null
    },
    relations: [{
        type: Backbone.HasOne,
        key: 'bookmark',
        relatedModel: 'App.Model.Bookmark',
        reverseRelation: {
            type: Backbone.HasOne,
            key: 'id'
        }
    }],
    urlRoot: 'http://api.testapi.com/api/v1/bookmark-bindings'
});
Bookmark model:
App.Model.Bookmark = Backbone.RelationalModel.extend({
    defaults: {
        url: 'undefined',
        description: 'undefined',
        visits: 0
    },
    relations: [{
        type: Backbone.HasMany,
        key: 'termbindings',
        relatedModel: 'App.Model.TermBinding',
        reverseRelation: {
            key: 'bookmark_id'
        }
    }],
    urlRoot: 'http://api.testapi.com/api/v1/bookmarks'
});
From Backbonejs.org
The default sync handler maps CRUD to RESTful HTTP methods like so:
create → POST /collection
read → GET /collection[/id]
update → PUT /collection/id
delete → DELETE /collection/id
Your question suggests that you're making an HTTP PUT request, and therefore a Backbone update. If you want to make an HTTP POST, use Backbone create. The PUT request maps onto update, and requires that an id be sent in the URL, which isn't happening according to your server log. If you're creating a new object, then most server-side frameworks such as Rails / Sinatra / Zend will create an id for the object.
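In plain Backbone terms, using the Bookmark model from your question (a sketch; the URL and attribute values are illustrative):

var bookmark = new App.Model.Bookmark({ url: 'http://example.com' });

// No id yet, so the model is "new": save() maps to POST .../bookmarks.
bookmark.save(null, {
    success: function (model) {
        // The server response populated `id`, so subsequent calls map to
        // PUT .../bookmarks/<id> and DELETE .../bookmarks/<id>.
        model.save({ description: 'updated' });
        // model.destroy();
    }
});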
Another possible source of error is the keys that you chose for the relations, like you suspected.
A Bookmark has many BookmarkBindings, and it seems that Backbone-relational will store them in the field that you specify in BookmarkBindings.relations.reverseRelation.key, which is currently defined as 'id'.
So the collection of related BookmarkBinding ids will be stored on the same attribute as Bookmark.id, creating a collision. Backbone.sync will send an undefined value to the server (which you see in your logs), because it finds a collection there instead of an integer.
First suggestion - You may not need a bidirectional relation, in which case drop it from the BookmarkBinding model.
Second suggestion - define the reverse relation on another key, so that it doesn't collide with Bookmark.id, such as BookmarkBindings.relations.reverseRelation.key: 'binding_ids' (sketched below).
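A rough sketch of that second suggestion, applied to the BookmarkBinding model from the question (the HasMany type on the reverse side is my assumption, since a Bookmark apparently has many bindings):

App.Model.BookmarkBinding = Backbone.RelationalModel.extend({
    defaults: {
        set_id: null,
        bookmark_id: null
    },
    relations: [{
        type: Backbone.HasOne,
        key: 'bookmark',
        relatedModel: 'App.Model.Bookmark',
        reverseRelation: {
            type: Backbone.HasMany, // assumed: one Bookmark, many bindings
            key: 'binding_ids'      // no longer collides with Bookmark.id
        }
    }],
    urlRoot: 'http://api.testapi.com/api/v1/bookmark-bindings'
});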
Full disclosure - I've never used Backbone-relational.js, only Backbone.js.
The problem was that on editing or deleting the bookmark model, the bookmark binding model wanted to do its work too, since it is related to the bookmark from its side. I already tried removing the reverse relation, which didn't prove to be a solution, since in the other part of my application where I use the bookmark bindings things wouldn't work anymore.
Solution
I did end up removing the reverse relation (@jarede +1 for that!), but the crux was how to implement the foreign key to fetch relations from the API without a reverse relation. I ended up adding the keySource and keyDestination, which made everything work out.
Sidenote
Backbone Relational cannot handle identical foreign keys either, which gave me some problems too. The lastly declared foreign key will overwrite all the previous ones. This can be quite impractical, since within an API it's not uncommon that models are related to a column named id. The idAttribute can be set with idAttribute: '_id', for example, but the foreign key has to be unique across your application.
BookmarkBinding model:
App.Model.BookmarkBinding = Backbone.RelationalModel.extend({
    defaults: {
        set_id: null,
        bookmark_id: null
    },
    relations: [{
        type: Backbone.HasOne,
        key: 'bookmark',
        keySource: 'id',
        keyDestination: 'bookmark',
        relatedModel: 'App.Model.Bookmark'
    }],
    urlRoot: 'http://api.testapi.com/api/v1/bookmark-bindings'
});