Hi, once again I have a delete action to take care of.
I have an array, and a parameter holding the item to be removed:
[
    Class,
    Class,
    Class,
    Object
        __ember1393425759417_meta: Meta
        formId: 4
        __proto__: Object
]
The Class entries are existing records in the database; those I can delete and it works (the else block below). PS! The parameter is the exact same object/class that needs to be deleted.
deleteFieldset: function(formID) {
    if (this.get('controller').get('isEditController')) {
        // Checks if edit controller
        var allPersonArray = this.get('controller').content._data.persons;
        if (allPersonArray[formID.formId - 1] !== undefined) {
            // TODO: Delete generated objects
        } else {
            formID.destroyRecord(); // Deleting records works.
        }
    } else {}
}
I tried removeAt and removeObject, but no luck.
This is the output when calling removeAt on the newly generated object in the array:
TypeError: Object function Object() { [native code] } has no method 'inverseFor'
On the other hand, I can't splice them either, because that does not update the .hbs template...
It looks like you're having trouble creating records, so I'll try to explain. First, in the latest versions of Ember-Data, the store is supposed to create new records for you. Example:
// createRecord takes the model name first ('person' here is just an example),
// followed by the initial property values
var record = this.get('store').createRecord('person', {
    prop1: 'value1',
    prop2: 'value2'
});
record.set('belongsToRelationship', otherRecord);
record.set('hasManyRelationship', new Em.Set());
record.save();
This will create a proper record for you, and when you call .save() on that record, the adapter will persist it to the server for you.
As far as primary keys go, you can't create those on the client side. Whatever you think might be working right now by just incrementing the last primary key will not work when you have more than one client. (Hell, it probably won't even work now.) Your server is supposed to create the primary keys. You create a record, like I did above. When you save it to the server, your server should respond with a payload that includes the saved record, primary keys included. Ember-Data will then load that payload into the record you just created, populating the primary key fields.
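For instance, a minimal sketch of such a payload, assuming a model named "person" and the default REST adapter conventions (neither of which is stated in the question); note that the "id" is assigned by the server, not the client:
{
    "person": {
        "id": 42,
        "prop1": "value1",
        "prop2": "value2"
    }
}
Ember-Data merges this response into the record you created, so the primary key field is populated once the save resolves.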
Related
I am using mockapi.io for my first big learning project (an e-commerce site), and I can do GET and POST, but seemingly not DELETE. It deletes the row in the admin page table, but not the product from the online API. It seems impossible to find info about this issue online, or I just don't know how to ask the right question. Maybe it's not even possible to really delete from a mock API?!
I am getting this in the console: http.js:21 DELETE https://6060b8b904b05d0017ba2dfb.mockapi.io/products?id=50 400 (Bad Request)
So it does compose the right link to that specific product, but it just doesn't delete it.
function deleteProduct(e) {
    if (e.target.classList.contains("delete")) {
        const id = e.target.id;
        e.target.parentElement.parentElement.remove(id);
        const productToDelete = `https://6060b8b904b05d0017ba2dfb.mockapi.io/products?id=${id}`;
        http
            .delete(productToDelete)
            .then((data) => getProductsAdmin())
            .catch("Error on delete!");
    }
}
Let me know if you need screenshots or more code. Thanks!
I think your URL was wrong; it should be:
https://6060b8b904b05d0017ba2dfb.mockapi.io/products/${id}
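Applied to the function from the question, a rough sketch could look like this (it assumes the same http helper and getProductsAdmin() from the original code, and also passes a proper callback to .catch()):
function deleteProduct(e) {
    if (e.target.classList.contains("delete")) {
        const id = e.target.id;
        e.target.parentElement.parentElement.remove();
        // the id goes in the path, not in a query string
        const productToDelete = `https://6060b8b904b05d0017ba2dfb.mockapi.io/products/${id}`;
        http
            .delete(productToDelete)
            .then(() => getProductsAdmin())
            .catch((err) => console.error("Error on delete!", err));
    }
}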
The answer above is correct. Finding info about mockapi.io on the internet was hard for me, so I just wanted to add to it for anyone with related problems deleting a specific object from a mockapi.io resource.
When you add new objects to the resource, mockapi.io assigns an id to them, and when you want to delete such an object you should make the request (/products/${id}) with that assigned id. The problem is that if you have your own id in the object, and mockapi.io assigns another id when adding it to the resource, your request will address a different object.
I solved it this way:
First, you change the name of the id that mockapi.io assigns automatically
For example, I changed it to "index"
https://i.stack.imgur.com/N6zY2.png
const { data } = await axios.get(`https://UNIQUE_PROJECT.mockapi.io/cart?id=${thisCard.id}`);
await axios.delete(`https://UNIQUE_PROJECT.mockapi.io/cart/${data[0].index}`);
The GET finds the object by my own assigned id; then I make the delete request with index (the id that mockapi.io assigns automatically to new objects).
Important to remember: a delete request (/cart/${id}) works only with the identifiers (id, index, etc.) that mockapi.io assigns itself.
What is the correct approach when working with a "new object" that is to be saved in a collection? Say I have a collection Cars. I have a /cars/new-car URL and then a form with:
name: __
parts: list of parts here
If I want to make this form "reactive", in the sense that adding a new part to the parts array triggers a rerender, is the best approach to make the whole "Car" a reactive object? Or should one just add a new row to the DOM?
I don't want to automatically insert the whole thing into the "Cars" collection until it has a name and a list of parts.
Most examples show the very simple case of adding to a collection -> rerender of the DOM, which is very straightforward.
Edit: The same concept may apply when editing a car: fetching the car from a collection, setting things up so the returned object is reactive (so I can add/remove parts), and when done, getting all the values and storing the edited car information.
Start out by initializing an "empty" car as a reactive variable.
Template.cars.onCreated(function () {
this.car = new ReactiveVar({}); // empty car
});
Say your DOM has some sort of attribute on each field describing which key it is:
<input data-key="name" placeholder="Car name"/>
Then you can bind an event that will use the data from this to update the reactive variable.
Template.cars.events({
    'change input': function (e, template) {
        template.car.set(_.extend(template.car.get(), {
            [$(e.target).data('key')]: $(e.target).val()
        }));
    }
});
This will construct the object as you fill in your inputs.
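To cover the "don't insert until it has a name and a list of parts" requirement, a hedged sketch of the save step might look like this (the Cars collection, the submit selector, and the assumption that parts live on the same reactive object are illustrative, not part of the answer above):
Template.cars.events({
    'submit form': function (e, template) {
        e.preventDefault();
        var car = template.car.get();
        // only insert once the car has a name and at least one part
        if (car.name && car.parts && car.parts.length) {
            Cars.insert(car);
            template.car.set({}); // start over with an empty car
        }
    }
});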
Consider using Session for your /cars/new-car page
When the page first loads
Session.set('parts', []);
Session.set('name', '');
When the user saves a part
var addedPart = getPart();
var update = Session.get('parts');
update.push(addedPart); // note: push returns the new length, not the array
Session.set('parts', update);
Then your template helper functions can get everything they need to render the view by calling Session.get().
Template.view.helpers({
    currentParts: function() {
        return Session.get('parts');
    }
});
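When the user finally saves, a rough sketch (the Cars collection and the .save-car button are assumptions, not part of this answer) could pull everything back out of the Session:
Template.view.events({
    'click .save-car': function () {
        var name = Session.get('name');
        var parts = Session.get('parts');
        // only insert once both the name and at least one part are present
        if (name && parts.length) {
            Cars.insert({ name: name, parts: parts });
        }
    }
});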
What do you think? I'm fairly new to Meteor myself, so there may be even more clever ways to do batch updates on the Session, but this is the general gist.
UPDATE 1: 5 votes have been received, so I have submitted a feature request: https://github.com/LearnBoost/mongoose/issues/2637
Please cast your +1 votes there to let the core team know you want this feature.
UPDATE 2: See answer below...
ORIGINAL POST:
Let's say I do a "lean" query on a collection, OR receive some data from a REST service, and I get an array of plain objects (not mongoose documents).
These objects already exist in the database, but I need to convert some/all of those objects to mongoose documents for individual editing/saving.
I have read through the source and there is a lot going on once mongoose has data from the database (populating, casting, initializing, etc), but there doesn't seem to be a method for 'exposing' this to the outside world.
I am using the following, but it just seems hacky ($data is a plain object):
// What other properties am I not setting? Is this enough?
var doc = new MyModel( $data );
doc.isNew = false;
// mimicking mongoose internals
// "init" is called internally after a document is loaded from the database
// This method is not documented, but seems like the most "proper" way to do this.
var doc = new MyModel( undefined );
doc.init( $data );
UPDATE: After more searching I don't think there is a way to do this yet, and the first method above is your best bet (mongoose v3.8.8). If anybody else is interested in this, I will make a feature request for something like this (leave a comment or upvote please):
var doc = MyModel.hydrate( $data );
Posting my own answer so this doesn't stay open:
Version 4 models (stable, released on 2015-03-25) now expose a hydrate() method. None of the fields will be marked as dirty initially, meaning a call to save() will do nothing until a field is mutated.
https://github.com/LearnBoost/mongoose/blob/41ea6010c4a84716aec7a5798c7c35ef21aa294f/lib/model.js#L1639-1657
It is very important to note that this is intended to be used to convert a plain JS object loaded from the database into a mongoose document. If you are receiving a document from a REST service or something like that, you should use findById() and update().
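As a rough illustration, assuming the MyModel and lean-query setup from the original post:
MyModel.find().lean().exec(function (err, plainObjects) {
    // each plain object becomes a full mongoose document, with no dirty fields
    var docs = plainObjects.map(function ($data) {
        return MyModel.hydrate($data);
    });
    // docs[0].save() is a no-op until a field is actually changed
});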
For those who live dangerously:
If you really want to update an existing document without touching the database, I suppose you could call hydrate(), mark fields as dirty, and then call save(). This is not too different than the method of setting doc.isNew = false; as I suggested in my original question. However, Valeri (from the mongoose team) suggested not doing this. It could cause validation errors and other edge case issues and generally isn't good practice. findById is really fast and will not be your bottleneck.
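Purely for illustration, that discouraged path would look roughly like this (same MyModel/$data assumptions as above; 'someField' is a placeholder):
var doc = MyModel.hydrate($data);
doc.someField = 'new value';   // assigning through a schema path marks it dirty
doc.markModified('someField'); // or mark it dirty explicitly
doc.save(function (err, saved) {
    // save() now issues an update for the dirty paths only
});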
If you are getting a response from a REST service, and say you have a User mongoose model:
var User = mongoose.model('User');
var fields = res.body; //Response JSON
var newUser = new User(fields);
newUser.save(function (err, resource) {
    console.log(resource);
});
In the other case, say you have an array of user JSON objects from User.find() that you want to query or populate:
var query = User.find({});
query.exec(function (err, users) {
    // deep-populate referenced docs (assumes the mongoose-deep-populate plugin is registered)
    User.deepPopulate(users, 'email_id phone_number', function (err, users) {
        // query through the populated user objects here
    });
});
MongoDB doesn't support joins or transactions, so for now you can't cast the values to documents directly in the query itself, although you can work around it by iterating over the results with forEach.
Using Breeze with Entity Framework code first to return data from calls to a web service.
I have a data model that's several levels deep. In this instance I'm returning a "schedule" object, which has a number of child "DefaultItems", each one of which has a cost and a single "type" child with its own properties.
If you call the web service for one of these directly, you get something like this, which is as expected:
{
    $id: "1",
    $type: "Schedule_06B188AC55B213FE4B13EA5B77D9C039007E80E9DB6F6841C055777A028C5F95, EntityFrameworkDynamicProxies-Core",
    DefaultItems: [
        {
            $id: "2",
            $type: "DefaultItem, Core",
            RowId: "d422af5d-d6ca-46a3-a142-1feb93348e1d",
            Cost: 1,
            Type: {
                $id: "3",
                $type: "Type, Core",
                RowId: "38ed6d1b-d0b7-43cb-b958-2b2424b97759",
                Type: "Type1"
            },
            Schedule: {
                $ref: "1"
            }
        },
        // more DefaultItem objects
        {},
        {}
    ],
    RowId: "627eb2f2-ec74-4646-b3d1-d6423f84a2cd",
    Start: "2010-01-18T00:00:00.000",
    End: "2019-01-18T00:00:00.000"
}
This then comes down to the browser, where Knockout is used to bind it to data objects. The trouble is that at this point, the data only seems to be one level deep.
So I can get at Schedule.Start and Schedule.End without issue. I can also iterate through the DefaultItem objects inside my Schedule and get their Costs out. But the Type objects inside DefaultItem just aren't there.
It's not about using an incorrect name to bind them: if you pause in the browser debugger and drill down into the JSON that the browser has, there are no Type objects at all, not even empty objects where they should be.
How come they come out of the web service, but don't seem to be in the data that Breeze passes back to the browser?
Apparently in Breeze, relationships have to be defined both ways in order to propagate. So I had to ensure that the primary key in my Type class was marked as a foreign key to the DefaultItem class.
I believe this is currently registered as a bug. It's certainly a bit annoying.
I have read and read the docs on these two methods (fetch and reset), but for the life of me cannot work out why you might use one over the other.
Could someone just give me a basic code situation where one would be applicable and the other wouldn't?
reset sets the collection with an array of models that you specify:
collection.reset( [ { name: "model1" }, { name: "model2" } ] );
fetch retrieves the collection data from the server, using the URL you've specified for the collection.
collection.fetch({
    url: someUrl,
    success: function(collection) {
        // collection has values from someUrl
    }
});
Here's a Fiddle illustrating the difference.
We're assuming here that you've read the documentation; otherwise it'll be a little confusing.
If you look at the documentation for fetch and reset, what it says is: suppose you have specified the url property of the collection, pointing to some server code that returns a JSON array of models, and you want the collection to be filled with the models being returned; then you will use fetch.
For example, say you have the following JSON being returned from the server on the collection url:
[{
    id: 1,
    name: "a"
}, {
    id: 2,
    name: "b"
}, {
    id: 3,
    name: "c"
}]
This will create 3 models in your collection after a successful fetch. If you hunt for the code of Collection.fetch, you will see that fetch gets the response and internally calls either reset or add, based on the options specified.
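For example (hedged, since the exact default behavior depends on your Backbone version; newer versions merge via set unless told otherwise):
collection.fetch();                // merge the server response into the collection
collection.fetch({ reset: true }); // replace the contents with the server response via reset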
So, coming back to the discussion: reset assumes that we already have the JSON of the models we want stored in the collection, and we pass it as a parameter. If you ever want to update the collection and you already have the models on the client side, then you don't need to use fetch; reset will do the job.
Hence, if you want the same JSON to be loaded into the collection with the help of reset, you can do something like this:
var _self = this;
$.getJSON("url", function (response) {
    _self.reset(response); // assuming the response returns the same JSON as above
});
Well, this is not a practice to be followed; for this scenario fetch is better, it's just used as an example.
Another example of reset is on the documentation page.
Hope it gives a little bit of an idea and makes your life better :)
reset() is used for replacing a collection's contents with a new array of models. For example:
#collection.reset(#full_collection.models)
would load #full_collection's models, however
#collection.reset()
would empty the collection.
And fetch() retrieves the collection's default set of models from the server.