I'm adding ObjectId to an array from another array that I receive as the body.
exports.updateBasket = function (req, res) {
    Basket.findOne({ _id: req.params.id }, function (err, basket) {
        for (var i = 0, len = req.body.length; i < len; i++) {
            basket.update({ $addToSet: { "items": req.body[i] } }, { upsert: true, safe: true });
        }
        if (err) {
            res.send(err);
        } else {
            res.json({ message: 'Successfully added' });
        }
    });
};
I have two questions about this:
Is there any upside to doing the loop in Angular and issuing multiple PUT requests?
What is the right way to update this same array when removing ObjectIds?
One way I thought of was to loop over the ObjectIds that have to be removed and check whether they are in the document's array; if so, delete them.
Another way would be to clear the array when PUT is called and repopulate it with the new ObjectId list (the ids that were there minus the ones the user removed).
Neither feels right...
Thanks!
Your code looks a bit odd. You fetch the document asynchronously by req.params.id and queue up req.body.length updates, but you send 'success' before any of those updates has actually come back.
If you want to filter arrays, look at lodash; if you want to run multiple updates asynchronously and collect their results, use the async module.
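For what it's worth, both operations can also be collapsed into a single update each, which sidesteps the loop entirely and only reports success once the write has completed. A minimal sketch, assuming req.body is a plain array of id strings and that removal is wired to its own route (removeFromBasket is a hypothetical name):

exports.updateBasket = function (req, res) {
    // $addToSet with $each adds every id from the body in one round trip,
    // skipping any ids already present in the array.
    Basket.findByIdAndUpdate(
        req.params.id,
        { $addToSet: { items: { $each: req.body } } },
        { new: true },
        function (err, basket) {
            if (err) return res.send(err);
            res.json({ message: 'Successfully added' });
        }
    );
};

// Removal is the mirror image: $pull with $in removes every matching id.
exports.removeFromBasket = function (req, res) {
    Basket.findByIdAndUpdate(
        req.params.id,
        { $pull: { items: { $in: req.body } } },
        { new: true },
        function (err, basket) {
            if (err) return res.send(err);
            res.json({ message: 'Successfully removed' });
        }
    );
};

That also answers the first question: one PUT carrying the whole list beats multiple PUTs from Angular, since it is a single round trip and a single atomic update.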
I'm running a Node.js server, connecting to a MongoDB database with mongoose.
Inside my controller I have several methods that perform operations on the database. One of them is this:
async findMultiple(req, res) {
    const [baseSkillsArray] = Array(req.body);
    try {
        // if there is no baseSkillsArray, skip
        if (!baseSkillsArray) {
            return res.status(200).send([]);
        }
        // find all baseSkills using the ids in the baseSkillsArray
        const allBaseSkills = await BaseSkill.find({
            _id: { $in: [baseSkillsArray.baseSkillArray] }
        });
        console.log('test ' + allBaseSkills);
        res.status(200).send(allBaseSkills);
    } catch (error) {
        console.error(error.message);
        res.status(500).send('Server error find BaseSkills');
    }
}
However, this returns nothing. I did some debugging and found that the cause is the $in on the array of ids. So I tried hard-coding a value, like '2'.
// find all baseSkills using the ids in the baseSkillsArray
const allBaseSkills = await BaseSkill.find({ _id: { $in: ['2'] } });
No success. So I went to MongoDB Atlas, where my DB is stored, and tried filtering with the same query on my collection.
{ _id: { $in: ['2'] } }
Surprisingly, it returns my document as I wanted!
The issue is that I need to make it work with mongoose. Any ideas? Is this a known bug?
There is nothing wrong with the query, and there is no bug with $in.
What's wrong is the actual collection name. I had manually created a collection in MongoDB Atlas called "baseSkills". However, Mongoose by default lowercases your model name and pluralizes it to derive the collection name.
So every time I started my server, I noticed there was a new collection called "baseskills". I assumed it was a bug and deleted it. Only after making this post did I realize the collection was back again.
So I exported my documents into that collection and the query worked fine.
FYI, there is a way to enforce the collection name in Mongoose: when you declare your model, pass an options object as the second argument to the Schema constructor with a "collection" property. Here is an example:
const BaseSkillSchema = new mongoose.Schema({
    _id: {
        type: String,
        required: true
    }, ...
}, { collection: 'baseSkills' })
That's it! Sorry for the mess and thank you for your help!
If you want to query MongoDB ObjectIds, you should construct a new ObjectId for each value:
import { Types } from 'mongoose';
{ _id: { $in: [new Types.ObjectId('2')] } }
Or, if you have two ids, one auto-generated ObjectId and one custom value stored in an id field, you can query the custom one without constructing anything:
{ id: { $in: ['2'] } }
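If the ids really are stored as ObjectIds and arrive as strings, a sketch of the conversion might look like this (note that in the question above _id was declared as a String in the schema, so plain strings were actually correct there; baseSkillArray is the field name from the question):

const { Types } = require('mongoose');

// Convert each incoming id string to an ObjectId before querying. The
// constructor throws on anything that is not a valid 24-character hex
// string, so validate or wrap in try/catch for untrusted input.
const ids = req.body.baseSkillArray.map((id) => new Types.ObjectId(id));
const allBaseSkills = await BaseSkill.find({ _id: { $in: ids } });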
Let's suppose I have a model with a field called draftFields. It is an object (but it could be an array).
I will create a PUT request to add data to draftFields. My question is: can I add data to draftFields and preserve the previous value?
Let's say I have added the first piece of data to draftFields, e.g.:
draftFields = {
    someRandomValue: 'hi'
}
and after that I'm going to make another PUT request, and it should look like this:
draftFields = {
    someRandomValue: "hi",
    anotherRandomValue: "hey"
}
How can I do that? Every time I update my draftFields object it removes the previous value. I had to keep it in my frontend state just to preserve the previous value. Is there any workaround or method to preserve the values on the backend?
This is my code atm:
app.put('/api/save-draft/:id', function (req, res) {
    User.findOneAndUpdate(
        { _id: req.params.id },
        { $set: { draftFields: req.body.draftFields } },
        { new: true },
        (err, doc) => {
            if (err) {
                console.log('Something wrong when updating data!');
                return res.status(400).send('Error');
            }
            res.status(200).send('All good!');
            console.log(doc);
        },
    );
});
I'm using JavaScript (ReactJS) and Node.js, if that's relevant.
I realized I can use the $push method if I change draftFields from an object to an array.
https://docs.mongodb.com/manual/reference/operator/update/push/
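For the record, there is also a way to merge into draftFields without turning it into an array: build the $set with dot-notation keys so only the submitted keys are written and the rest survive. A minimal sketch of that variant, on the same route:

app.put('/api/save-draft/:id', function (req, res) {
    // Target each key individually ("draftFields.someRandomValue") so
    // existing keys in draftFields are left untouched by the update.
    var updates = {};
    Object.keys(req.body.draftFields).forEach(function (key) {
        updates['draftFields.' + key] = req.body.draftFields[key];
    });
    User.findOneAndUpdate(
        { _id: req.params.id },
        { $set: updates },
        { new: true },
        function (err, doc) {
            if (err) return res.status(400).send('Error');
            res.status(200).send('All good!');
        }
    );
});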
I have failed at searching, sorry.
I have documents in a collection with a field that is an array (foo). This is an array of subdocuments. I want to set the same field (bar) on each subdocument in each document to the same value. That value comes from a checkbox.
So... my client-side code is something like
'click #checkAll'(e, template) {
    const target = e.target;
    const checked = $(target).prop('checked');
    // Call Server Method to update list of Docs
    const docIds = getIds();
    Meteor.call('updateAllSubDocs', docIds, checked);
}
I tried using https://docs.mongodb.com/manual/reference/operator/update/positional-all/#positional-update-all
And came up with the following for my Server helper method.
'updateAllSubDocs'(ids, checked) {
    Items.update({ _id: { $in: ids } }, { $set: { "foo.$[].bar": checked } },
        { multi: true }, function (err, result) {
            if (err) {
                throw new Meteor.Error('error updating');
            }
        });
}
But that throws an error 'foo.$[].bar is not allowed by the Schema'. Any ideas?
I'm using SimpleSchema for both the parent and subdocument
Thanks!
Try passing an option to bypass Simple Schema. It might be lacking support for this (somewhat) newer Mongo feature.
bypassCollection2
Example:
Items.update({ _id: { $in: ids } }, { $set: { "foo.$[].bar": checked } },
    { multi: true, bypassCollection2: true }, function (err, result) {
        if (err) {
            throw new Meteor.Error('error updating');
        }
    });
Old answer:
Since you say you need to make a unique update for each document, it sounds like bulk updating is the way to go in this case. Here's an example of how to do this in Meteor.
if (docsToUpdate.length < 1) return
const bulk = MyCollection.rawCollection().initializeUnorderedBulkOp()
for (const myDoc of docsToUpdate) {
    bulk.find({ _id: myDoc._id }).updateOne({ $set: update })
}
Promise.await(bulk.execute()) // or use regular await if you want...
Note we exit the function early if there are no docs, because bulk.execute() throws an exception if there are no operations to process.
If you have different data in the $set for each entry in the array, I think you need a loop on the server side.
Mongo has bulk operations, but I don't know if you can call them using Collection.rawCollection().XXXXX.
I've used rawCollection() to access aggregate and it works fine for me. It will probably work with bulk operations too.
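For illustration, a per-document bulk update through rawCollection() could look like the sketch below, assuming a hypothetical updates array where each entry carries its own _id and $set payload:

// updates: [{ _id: 'abc', fields: { checked: true } }, ...] -- hypothetical shape
const bulk = Items.rawCollection().initializeUnorderedBulkOp();
for (const u of updates) {
    // Each document gets its own $set, unlike a single multi update.
    bulk.find({ _id: u._id }).updateOne({ $set: u.fields });
}
Promise.await(bulk.execute()); // inside a Meteor method, as in the answer above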
So I've got two mongoose models:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var eventSchema = new mongoose.Schema({
    name: String,
    date: String,
    dogs: [{ type: Schema.Types.ObjectId, ref: 'Dog' }]
});
module.exports = mongoose.model('Event', eventSchema);
and
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var dogSchema = new mongoose.Schema({
    name: String,
    age: String,
    gender: String,
});
module.exports = mongoose.model('Dog', dogSchema);
Event contains an array of dogs, and I'm trying to figure out how to add/delete dogs in this array.
On the client I have this method:
$.ajax({
    url: "http://localhost:3000/api/events/",
    dataType: 'json',
    type: 'POST', // Not sure if I should Post or Put...
    data: {
        event_Id: this.props.choosenEvent._id, // The Id of the Event that I want to update...
        dog_Id: this.props.events[dog]._id     // ...by adding this dog, whose Id is here
    },
    success: function(data) {
    }.bind(this),
});
On the server (Node.js) I've got my routes to the API. To me, it makes sense to use a PUT method and start by fetching the right Event with the event_Id passed as a param. Something like:
router.route('/events/:event_id')
    .put(function(req, res) {
        Event
            .findById({ _id: req.params.event_id })
            .populate('dogs')
    });
But I'm stuck at this point. Any help appreciated. Thanks!
Update!
Thank you! Your code helped a lot. You used lodash's _.remove to delete a dog from the array; is there a similar way to add an item with lodash?
I gave the add method a go like this:
router.route('/events')
    .post(function(req, res) {
        // Your data is inside req.body
        Event
            .findById({ _id: req.body.event_Id })
            // execute the query
            .exec(function(err, eventData) {
                // Do some error handling
                // Your dogs are inside eventData.dogs
                eventData.dogs.push(req.body.dog_Id);
                console.log(eventData)
            });
        // Update your eventData here
        Event.update({ _id: req.body.event_id }, eventData)
            .exec(function(err, update) {
                // Do some error handling
                // And send your response
            });
    });
When I hit the console.log(eventData) I can see that dog_Id gets added to the array as it should. However, it does not get saved to the DB, and the error says that eventData is not defined in Event.update. I suspect this is a JS scope issue.
One thing that boggles me is this:
Obviously I would like to be able to add and remove dogs from the array, and the route is router.route('/events'). But if both the add method and the remove method are on the same route, how can the code know which one I am going for?
There are a few mistakes here. First of all, you are making a POST request but your route accepts a PUT; I have updated your code so it accepts a POST.
When posting objects, your data is inside req.body; req.params is used for URL parameters. This is also the case when using a PUT request.
Populating dogs is not really necessary. You are sending dog_Id to your function, so you can just remove that item from the array, which removes the dog from your event. This should do the trick. Please note that this does not remove the dog from your DB, only from your event.
Last but not least, I am using lodash; _.remove is a lodash function. You should definitely check it out, it will help you a lot.
Take a look at my code. It should get you going:
router.route('/events/:event_id')
    // Since you are posting, you should use POST from JavaScript instead of PUT
    .post(function(req, res) {
        // Your data is inside req.body
        Event
            .findById({ _id: req.body.event_id })
            // execute the query
            .exec(function(err, eventData) {
                // Do some error handling
                // Your dogs are inside eventData.dogs; without populate they
                // are plain ObjectIds, so compare their string form
                _.remove(eventData.dogs, function(d) {
                    return d.toString() === req.body.dog_Id;
                });
                // Update your eventData here
                Event.update({ _id: req.body.event_id }, eventData)
                    .exec(function(err, update) {
                        // Do some error handling
                        // And send your response
                    });
            });
    });
UPDATE:
I do not think there is a way to add items to an array with lodash, but you can simply use push like you did in your code example. That works just fine.
Your update is not working because you are executing findById and update at the same time. You have to find the item first, push the id, and THEN update the item :) Move your update call inside the callback of your findById and that's fixed. It looks like this:
router.route('/events')
    .post(function(req, res) {
        // Your data is inside req.body
        Event
            .findById({ _id: req.body.event_Id })
            // execute the query
            .exec(function(err, eventData) {
                // Do some error handling
                // Your dogs are inside eventData.dogs
                eventData.dogs.push(req.body.dog_Id);
                console.log(eventData)
                // Update your eventData here (same event_Id key as above)
                Event.update({ _id: req.body.event_Id }, eventData)
                    .exec(function(err, update) {
                        // Do some error handling
                        // And send your response
                    });
            });
    });
You can attach different handlers to the same route as long as the HTTP method differs. Take a look at this answer on REST. You can have a GET, POST, PUT and DELETE on /events, each defined by a rule like:
router.route('/events').post();
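As an aside, Mongoose's array operators can shrink both handlers considerably. A hedged sketch, chaining POST and DELETE on one route and still reading dog_Id from the body:

router.route('/events/:event_id')
    // POST adds a dog; $addToSet skips ids already in the array.
    .post(function(req, res) {
        Event.findByIdAndUpdate(
            req.params.event_id,
            { $addToSet: { dogs: req.body.dog_Id } },
            { new: true },
            function(err, eventData) {
                if (err) return res.status(500).send(err);
                res.json(eventData);
            }
        );
    })
    // DELETE removes it again with $pull.
    .delete(function(req, res) {
        Event.findByIdAndUpdate(
            req.params.event_id,
            { $pull: { dogs: req.body.dog_Id } },
            { new: true },
            function(err, eventData) {
                if (err) return res.status(500).send(err);
                res.json(eventData);
            }
        );
    });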
On the site I am creating, users can enter different tags separated by commas. Express should then check whether each tag exists, and create an object for each one that does not. I iterate through the array with a for loop, but only one object gets created, thanks to the callback... Is there any way to create multiple objects at once, depending on the array's length?
for (i=0;i<postTopics.length;i++) {
var postTopic = postTopics[i],
postTopicUrl = postTopic.toString().toLowerCase().replace(' ', '-');
Topic.findOne({ "title": postTopics[i] }, function (err, topic) {
if (err) throw err;
if (!topic) {
Topic.create({
title: postTopic,
url: postTopicUrl
}, function (err, topic) {
if (err) throw err;
res.redirect('/');
});
}
});
}
Try out async.parallel.
$ npm install async
// Get the async module so we can do our parallel asynchronous queries much easier.
var async = require('async');

// Create a hash to store your query functions on.
var topicQueries = {};

// Loop through your postTopics once to create a query function for each one.
postTopics.forEach(function (postTopic) {
    // Use postTopic as the key for the query function so we can grab it later.
    topicQueries[postTopic] = function (cb) {
        // cb is the callback function passed in by async.parallel. It accepts err as the first argument and the result as the second.
        Topic.findOne({ title: postTopic }, cb);
    };
});

// Call async.parallel and pass in our topicQueries object.
// If any of the queries passed an error to cb then the rest of the queries will be aborted and this result function will be called with an err argument.
async.parallel(topicQueries, function (err, results) {
    if (err) throw err;
    // Create an array to store our Topic.create query functions. We don't need a hash because we don't need to tie the results back to anything else like we had to do with postTopics in order to check if a topic existed or not.
    var createQueries = [];
    // All our parallel queries have completed.
    // Loop through postTopics again, using postTopic to retrieve the resulting document from the results object, which has postTopic as the key.
    postTopics.forEach(function (postTopic) {
        // If there is a document at results[postTopic] then the topic already exists, so skip it.
        if (results[postTopic]) return;
        // I changed .replace to use a regular expression. Passing a string only replaces the first space in the string whereas my regex searches the whole string.
        var postTopicUrl = postTopic.toString().toLowerCase().replace(/ /g, '-');
        // Since this code is executing, we know there is no topic in the DB with the title you searched for, so create a new query to create a new topic and add it to the createQueries array.
        createQueries.push(function (cb) {
            Topic.create({
                title: postTopic,
                url: postTopicUrl
            }, cb);
        });
    });
    // Pass our createQueries array to async.parallel so it can run them all simultaneously (so to speak).
    async.parallel(createQueries, function (err, results) {
        // If any one of the parallel create queries passes an error to the callback, this function will be immediately invoked with that err argument.
        if (err) throw err;
        // If we made it this far, no errors were made during topic creation, so redirect.
        res.redirect('/');
    });
});
First we create an object called topicQueries and we attach a query function to it for each postTopic title in your postTopics array. Then we pass the completed topicQueries object to async.parallel which will run each query and gather the results in a results object.
The results object ends up being a simple object hash with each of your postTopic titles as the key and the DB result as the value. The if (results[postTopic]) return; line returns early if results already holds a document under that postTopic key. Meaning, the code below it only runs if no topic with that title came back from the DB. If there was no matching topic then we add a query function to our createQueries array.
We don't want your page to redirect after just one of those new topics finishes saving. We want to wait until all your create queries have finished, so we use async.parallel yet again, but this time we use an array instead of an object hash because we don't need to tie the results to anything. When you pass an array to async.parallel the results argument will also be an array containing the results of each query, though we don't really care about the results in this example, only that no errors were thrown. If the parallel function finishes and there is no err argument then all the topics finished creating successfully and we can finally redirect the user to the new page.
PS - If you ever run into a similar situation, except each subsequent query requires data from the query before it, then check out async.waterfall :)
If you really want to check whether things exist already and avoid errors on duplicates, note that the .create() method already accepts a list. You don't seem to care about getting the created documents back in the response, so just check which documents are already there and send in the new ones.
So, with "finding first", run the tasks in succession; async.waterfall is there just to tame the indent creep:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];

async.waterfall(
    [
        function(callback) {
            Topic.find(
                { "title": { "$in": topics } },
                function(err, found) {
                    // assume ["B Topic", "D Topic"] are found
                    found = found.map(function(x) {
                        return x.title;
                    });
                    var newList = topics.filter(function(x) {
                        return found.indexOf(x) == -1;
                    });
                    callback(err, newList);
                }
            );
        },
        function(newList, callback) {
            Topic.create(
                newList.map(function(x) {
                    return {
                        "title": x,
                        "url": x.toString().toLowerCase().replace(' ','-')
                    };
                }),
                function(err) {
                    if (err) throw err;
                    console.log("done");
                    callback();
                }
            );
        }
    ]
);
You could move the "url" generation into a "pre" save hook on the schema. But again, if you really don't need the validation rules, go for "bulk API" operations, provided your target MongoDB and mongoose versions are new enough to support them, which really means getting a handle on the underlying driver:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];

async.waterfall(
    [
        function(callback) {
            Topic.find(
                { "title": { "$in": topics } },
                function(err, found) {
                    // assume ["B Topic", "D Topic"] are found
                    found = found.map(function(x) {
                        return x.title;
                    });
                    var newList = topics.filter(function(x) {
                        return found.indexOf(x) == -1;
                    });
                    callback(err, newList);
                }
            );
        },
        function(newList, callback) {
            var bulk = Topic.collection.initializeOrderedBulkOp();
            newList.forEach(function(x) {
                bulk.insert({
                    "title": x,
                    "url": x.toString().toLowerCase().replace(' ','-')
                });
            });
            bulk.execute(function(err, results) {
                console.log("done");
                callback();
            });
        }
    ]
);
That is a single write operation to the server, though of course all the inserts are done in order and checked for errors.
Otherwise, just let the duplicates fail, insert as an "unordered op", and check for "non-duplicate" errors afterwards if you want:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];

var bulk = Topic.collection.initializeUnorderedBulkOp();
topics.forEach(function(x) {
    bulk.insert({
        "title": x,
        "url": x.toString().toLowerCase().replace(' ','-')
    });
});

bulk.execute(function(err, results) {
    if (err) throw err;
    console.log(JSON.stringify(results, undefined, 4));
});
The output in results looks something like the following, indicating the "duplicate" errors; the error is not "thrown", as that is not set in this case:
{
    "ok": 1,
    "writeErrors": [
        {
            "code": 11000,
            "index": 1,
            "errmsg": "insertDocument :: caused by :: 11000 E11000 duplicate key error index: test.topic.$title_1 dup key: { : \"B Topic\" }",
            "op": {
                "title": "B Topic",
                "url": "b-topic",
                "_id": "53b396d70fd421057200e610"
            }
        },
        {
            "code": 11000,
            "index": 3,
            "errmsg": "insertDocument :: caused by :: 11000 E11000 duplicate key error index: test.topic.$title_1 dup key: { : \"D Topic\" }",
            "op": {
                "title": "D Topic",
                "url": "d-topic",
                "_id": "53b396d70fd421057200e612"
            }
        }
    ],
    "writeConcernErrors": [],
    "nInserted": 2,
    "nUpserted": 0,
    "nMatched": 0,
    "nModified": 0,
    "nRemoved": 0,
    "upserted": []
}
Note that when using the native collection methods, you need to make sure a connection is already established. The mongoose methods "queue" until the connection is made, but these will not. This is more of a testing concern, unless there is a chance this is the first code to execute.
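One way to guard against that, sketched here on the assumption of a default mongoose connection:

// mongoose.connection is an EventEmitter; 'open' fires once the initial
// connection succeeds, so native collection handles are safe to use inside.
mongoose.connection.once('open', function () {
    var bulk = Topic.collection.initializeUnorderedBulkOp();
    // ... queue up bulk.insert(...) calls and call bulk.execute() here ...
});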
Hopefully versions of these bulk operations will be exposed in the mongoose API soon, but the underlying functionality does require MongoDB 2.6 or greater on the server. Generally it is going to be the best way to process this.
Of course, in all but the last sample (which doesn't need it), you can go absolutely "async nuts" by using the versions of "filter", "map" and "forEach" that the async library provides. That's unlikely to be a real issue unless you are feeding in really long input lists, though.
The .initializeOrderedBulkOp() and .initializeUnorderedBulkOp() methods are covered in the node native driver manual. Also see the main manual for general descriptions of bulk operations.