This is my Operator model:
const operatorSchema = new Schema({
  operatorName: {
    type: String
  },
  users: [{
    email: String,
    payment: Number,
    paymentsData: Date,
    product: String,
  }],
});
I need to filter by operatorName and by email inside the users block. But when I try this, I get all users for the matching operatorName. How can I query correctly?
Operators.find({ $and: [{ operatorName: operatorName }, { 'users.email': 'super#m.com' }] }, function (err, docs) {
  if (err) console.log(err)
  else {
    docs.forEach(function (data) {
      console.log(data)
    })
    // res.render('total_earns_operator_tables', { operators: docs });
  }
});
EDIT
I also tried the aggregate method like this, but again I get the same result and a whole bunch of user data, when I want only demouser#mail.com:
Operators.aggregate([
  { $match: { $and: [{ operatorName: operatorName }, { 'users.email': 'demouser#mail.com' }] } }
], function (err, docs) {
  // console.log(Operators)
  // Operators.find( { $and: [{operatorName: operatorName}, {users: {$elemMatch: {email:['super#m.com']}}}]}, function (err, docs) {
  // Operators.find( {operatorName: operatorName, "users.email": "demouser#mail.com"}, function (err, docs) {
  if (err) console.log(err)
  else {
    docs.forEach(function (data) {
      console.log(data)
    })
    // res.render('total_earns_operator_tables', { operators: docs });
  }
});
It is very basic, but I couldn't find a solution.
Your query is doing exactly what it is supposed to do. It is returning all documents that satisfy your two criteria: 1. having the specified operatorName, and 2. having a users array with at least one user matching the specified email.
If you want to reshape your documents by filtering the users array to only include the users matching your condition, you'll have to use an aggregation.
EDIT
As per your edit: your aggregation only has a $match stage, which is identical to your query above. To change the shape of a document, the aggregation framework provides the $project stage; see the example below:
Operators.aggregate([
{
$match: {
operatorName: operatorName,
"users.email": "demouser#mail.com"
}
},
{
$project: {
operatorName: '$operatorName',
users: {
$filter: {
input: "$users",
as: "user",
cond: {
$eq: [
"$$user.email",
"demouser#mail.com"
]
}
}
}
}
}
])
Here, we first filter the collection to get only the documents you want, using the $match stage; then we use the $filter operator in the $project stage to return only the matching users within the array.
See the working playground
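If returning a single matching user per operator document is enough, a plain find() with the $elemMatch projection operator can avoid the aggregation entirely. A minimal sketch (note that $elemMatch in a projection returns only the first matching array element):
Operators.find(
  { operatorName: operatorName, 'users.email': 'demouser#mail.com' },
  { operatorName: 1, users: { $elemMatch: { email: 'demouser#mail.com' } } },
  function (err, docs) {
    if (err) return console.log(err);
    docs.forEach(function (data) { console.log(data); });
  }
);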
I'm working on a system where I use a schema with the type Map much like this:
const Product = mongoose.model('Product', {
  name: String,
  data: {
    type: Map,
    of: String
  }
});
I'm trying to query certain parts of the docs in this collection with a projection:
Product.findOne({
_id: req.params.id
}, {
name:true,
data.xy*
})
I would like to only get the fields in data starting with xy. Do you have any hint how to do that?
Since Mongoose's Map type is represented as a nested object in MongoDB, you can convert it using $objectToArray, which results in an array of key-value pairs and lets you query by the keys starting with xy. Something like:
Product.aggregate([
{
$project: {
data: {
$objectToArray: "$data"
}
}
},
{
"$unwind": "$data"
},
{
$match: {
"data.k": {
$regex: "^xy"
}
}
}
])
Here's a working example on mongoplayground:
https://mongoplayground.net/p/yCJLhzalOXI
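If you would rather keep the original map/object shape instead of unwound key-value pairs, one option is to filter the pairs and convert them back with $arrayToObject. This is a sketch, assuming MongoDB 3.4.4+ (for $arrayToObject and $substrCP) and the same "xy" prefix:
Product.aggregate([
  {
    $project: {
      name: 1,
      data: {
        $arrayToObject: {
          $filter: {
            input: { $objectToArray: "$data" },
            as: "kv",
            // keep only the entries whose key starts with "xy"
            cond: { $eq: [{ $substrCP: ["$$kv.k", 0, 2] }, "xy"] }
          }
        }
      }
    }
  }
])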
In MongoDB, is it possible to update the value of a field using the value from another field? The equivalent SQL would be something like:
UPDATE Person SET Name = FirstName + ' ' + LastName
And the MongoDB pseudo-code would be:
db.person.update( {}, { $set : { name : firstName + ' ' + lastName } } );
The best way to do this is in version 4.2+, which allows using an aggregation pipeline in the update document of the updateOne, updateMany, or update (deprecated in most, if not all, language drivers) collection methods.
MongoDB 4.2+
Version 4.2 also introduced the $set pipeline stage operator, which is an alias for $addFields. I will use $set here as it maps to what we are trying to achieve.
db.collection.<update method>(
{},
[
{"$set": {"name": { "$concat": ["$firstName", " ", "$lastName"]}}}
]
)
Note that the square brackets in the second argument to the method specify an aggregation pipeline instead of a plain update document; a plain update document will not work correctly here.
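For instance, a concrete call with updateMany (assuming a person collection with firstName and lastName fields, as in the question) might look like this:
db.person.updateMany(
  {},
  [
    { "$set": { "name": { "$concat": [ "$firstName", " ", "$lastName" ] } } }
  ]
)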
MongoDB 3.4+
In 3.4+, you can use $addFields and the $out aggregation pipeline operators.
db.collection.aggregate(
[
{ "$addFields": {
"name": { "$concat": [ "$firstName", " ", "$lastName" ] }
}},
{ "$out": <output collection name> }
]
)
Note that this does not update your collection but instead replaces the existing collection or creates a new one. Also, for update operations that require "typecasting", you will need client-side processing, and depending on the operation, you may need to use the find() method instead of the .aggregate() method.
MongoDB 3.2 and 3.0
The way we do this is by $projecting our documents and using the $concat string aggregation operator to return the concatenated string.
You then iterate the cursor and use the $set update operator to add the new field to your documents using bulk operations for maximum efficiency.
Aggregation query:
var cursor = db.collection.aggregate([
{ "$project": {
"name": { "$concat": [ "$firstName", " ", "$lastName" ] }
}}
])
MongoDB 3.2 or newer
You need to use the bulkWrite method.
var requests = [];
cursor.forEach(document => {
requests.push( {
'updateOne': {
'filter': { '_id': document._id },
'update': { '$set': { 'name': document.name } }
}
});
if (requests.length === 500) {
//Execute per 500 operations and re-init
db.collection.bulkWrite(requests);
requests = [];
}
});
if(requests.length > 0) {
db.collection.bulkWrite(requests);
}
MongoDB 2.6 and 3.0
For these versions, you need to use the now deprecated Bulk API and its associated methods.
var bulk = db.collection.initializeUnorderedBulkOp();
var count = 0;
cursor.snapshot().forEach(function(document) {
bulk.find({ '_id': document._id }).updateOne( {
'$set': { 'name': document.name }
});
count++;
if(count%500 === 0) {
// Execute per 500 operations and re-init
bulk.execute();
bulk = db.collection.initializeUnorderedBulkOp();
}
})
// clean up queues
if(count > 0) {
bulk.execute();
}
MongoDB 2.4
cursor["result"].forEach(function(document) {
db.collection.update(
{ "_id": document._id },
{ "$set": { "name": document.name } }
);
})
You should iterate through. For your specific case:
db.person.find().snapshot().forEach(
function (elem) {
db.person.update(
{
_id: elem._id
},
{
$set: {
name: elem.firstname + ' ' + elem.lastname
}
}
);
}
);
Apparently there is a way to do this efficiently since MongoDB 3.4; see styvane's answer.
Obsolete answer below
You cannot refer to the document itself in an update (yet). You'll need to iterate through the documents and update each document using a function. See this answer for an example, or this one for server-side eval().
For a database with high activity, you may run into issues where your updates affect actively changing records; for this reason, I recommend using snapshot():
db.person.find().snapshot().forEach( function (hombre) {
hombre.name = hombre.firstName + ' ' + hombre.lastName;
db.person.save(hombre);
});
http://docs.mongodb.org/manual/reference/method/cursor.snapshot/
Starting with Mongo 4.2, db.collection.update() can accept an aggregation pipeline, finally allowing the update/creation of a field based on another field:
// { firstName: "Hello", lastName: "World" }
db.collection.updateMany(
{},
[{ $set: { name: { $concat: [ "$firstName", " ", "$lastName" ] } } }]
)
// { "firstName" : "Hello", "lastName" : "World", "name" : "Hello World" }
The first part {} is the match query, filtering which documents to update (in our case all documents).
The second part [{ $set: { name: { ... } } }] is the update aggregation pipeline (note the square brackets signifying the use of an aggregation pipeline). $set is a new aggregation operator and an alias of $addFields.
Regarding this answer, the snapshot function is deprecated in version 3.6, according to this update. So, on version 3.6 and above, it is possible to perform the operation this way:
db.person.find().forEach(
function (elem) {
db.person.update(
{
_id: elem._id
},
{
$set: {
name: elem.firstname + ' ' + elem.lastname
}
}
);
}
);
I tried the above solution but I found it unsuitable for large amounts of data. I then discovered the stream feature:
MongoClient.connect("...", function(err, db){
var c = db.collection('yourCollection');
var s = c.find({/* your query */}).stream();
s.on('data', function(doc){
    c.update({_id: doc._id}, {$set: {name: doc.firstName + ' ' + doc.lastName}}, function(err, result) { /* result == true? */ });
});
s.on('end', function(){
// stream can end before all your updates do if you have a lot
})
})
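To work around that caveat, one approach (a rough sketch, using the same hypothetical collection and fields as above) is to count in-flight updates and only consider the job finished once the stream has ended and every update callback has fired:
MongoClient.connect("...", function (err, db) {
  var c = db.collection('yourCollection');
  var s = c.find({/* your query */}).stream();
  var pending = 0;
  var ended = false;

  function maybeDone() {
    // done only when the stream has ended AND all updates have called back
    if (ended && pending === 0) {
      console.log('all updates finished');
    }
  }

  s.on('data', function (doc) {
    pending++;
    c.update({ _id: doc._id }, { $set: { name: doc.firstName + ' ' + doc.lastName } }, function () {
      pending--;
      maybeDone();
    });
  });

  s.on('end', function () {
    ended = true;
    maybeDone();
  });
});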
The update() method takes an aggregation pipeline as a parameter, like this:
db.collection_name.update(
{
// Query
},
[
// Aggregation pipeline
{ "$set": { "id": "$_id" } }
],
{
// Options
"multi": true // false when a single doc has to be updated
}
)
Fields can be set or unset based on existing values using the aggregation pipeline.
Note: prefix a field name with $ to reference that field's existing value.
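For example, a hedged sketch that copies two existing fields into a new one and then removes the originals in the same pipeline update (the field names are just illustrative):
db.person.update(
  {},
  [
    { "$set": { "name": { "$concat": ["$firstName", " ", "$lastName"] } } },
    { "$unset": ["firstName", "lastName"] }
  ],
  { "multi": true }
)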
Here's what we came up with for copying one field to another for ~150_000 records. It took about 6 minutes, but is still significantly less resource intensive than it would have been to instantiate and iterate over the same number of Ruby objects.
js_query = %({
$or : [
{
'settings.mobile_notifications' : { $exists : false },
'settings.mobile_admin_notifications' : { $exists : false }
}
]
})
js_for_each = %(function(user) {
if (!user.settings.hasOwnProperty('mobile_notifications')) {
user.settings.mobile_notifications = user.settings.email_notifications;
}
if (!user.settings.hasOwnProperty('mobile_admin_notifications')) {
user.settings.mobile_admin_notifications = user.settings.email_admin_notifications;
}
db.users.save(user);
})
js = "db.users.find(#{js_query}).forEach(#{js_for_each});"
Mongoid::Sessions.default.command('$eval' => js)
With MongoDB version 4.2+, updates are more flexible, as it allows the use of an aggregation pipeline in update, updateOne and updateMany. You can now transform your documents using the aggregation operators and then update, without the need to explicitly state a $set command (instead we use $replaceRoot: {newRoot: "$$ROOT"}).
Here we use the aggregation query to extract the timestamp from MongoDB's ObjectID "_id" field and update the documents (I am not an expert in SQL, but I think SQL does not provide an auto-generated ObjectID that carries a timestamp; you would have to create that date yourself).
var collection = "person"
agg_query = [
{
"$addFields" : {
"_last_updated" : {
"$toDate" : "$_id"
}
}
},
{
$replaceRoot: {
newRoot: "$$ROOT"
}
}
]
db.getCollection(collection).updateMany({}, agg_query, {upsert: true})
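To spot-check the result, the shell's ObjectId getTimestamp() helper should agree with the newly written field. A quick sketch, not part of the original answer:
var doc = db.getCollection(collection).findOne();
printjson(doc._id.getTimestamp()); // timestamp embedded in the ObjectId
printjson(doc._last_updated);      // should be the same instant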
(I would have posted this as a comment, but couldn't)
For anyone who lands here trying to update one field using another in the document with the C# driver...
I could not figure out how to use any of the UpdateXXX methods and their associated overloads since they take an UpdateDefinition as an argument.
// we want to set Prop1 to Prop2
class Foo { public string Prop1 { get; set; } public string Prop2 { get; set;} }
void Test()
{
var update = new UpdateDefinitionBuilder<Foo>();
update.Set(x => x.Prop1, <new value; no way to get a hold of the object that I can find>)
}
As a workaround, I found that you can use the RunCommand method on an IMongoDatabase (https://docs.mongodb.com/manual/reference/command/update/#dbcmd.update).
var command = new BsonDocument
{
{ "update", "CollectionToUpdate" },
{ "updates", new BsonArray
{
new BsonDocument
{
// Any filter; here the check is if Prop1 does not exist
{ "q", new BsonDocument{ ["Prop1"] = new BsonDocument("$exists", false) }},
// set it to the value of Prop2
{ "u", new BsonArray { new BsonDocument { ["$set"] = new BsonDocument("Prop1", "$Prop2") }}},
{ "multi", true }
}
}
}
};
database.RunCommand<BsonDocument>(command);
MongoDB 4.2+ Golang
result, err := collection.UpdateMany(ctx, bson.M{},
    mongo.Pipeline{
        bson.D{{"$set", bson.M{
            "name": bson.M{"$concat": []string{"$lastName", " ", "$firstName"}},
        }}},
    },
)
How can I access an array of objects, each with their own _id, and update one of them with Mongo/Mongoose?
Take a look at my update query and check if there's something wrong, because this code doesn't return any error, but it doesn't actually update the field:
modelUser.findOneAndUpdate(
{ userName: body.author, "portfolio._id": body.id },
{ new: true },
{
$set: { // I think the problem is over here
"portfolio.$.profitLoss": profitLoss,
"portfolio.$.percentage": percentage
}
},
(err, user) => {
if (err) {
console.log(err);
}
console.log(`Done`);
}
);
This is my User Schema:
const userSchema = new Schema({
...stuff,
portfolio: [
{
coin: String,
amount: String,
price: String,
bought: Date,
profitLoss: String,
percentage: String
}
],
});
Basically, I think Mongo just doesn't know which of these subdocuments it should update. I don't know if there's something like another findOneAndUpdate for a sub object/document by id.
I just changed findOneAndUpdate to updateOne and everything works.
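For reference, the likely root cause in the original snippet is argument order: Mongoose's findOneAndUpdate expects (filter, update, options, callback), while the code above passes { new: true } where the update document belongs. A sketch with the arguments reordered (same names as in the question, untested):
modelUser.findOneAndUpdate(
  { userName: body.author, "portfolio._id": body.id },
  {
    $set: {
      "portfolio.$.profitLoss": profitLoss,
      "portfolio.$.percentage": percentage
    }
  },
  { new: true },
  (err, user) => {
    if (err) console.log(err);
    console.log('Done');
  }
);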
Basic problem
I have a bunch of records and I need to get the latest (most recent) and the oldest (least recent).
When googling I found this topic where I saw a couple of queries:
// option 1
Tweet.findOne({}, [], { $orderby : { 'created_at' : -1 } }, function(err, post) {
console.log( post );
});
// option 2
Tweet.find({}, [], {sort:[['arrival',-1]]}, function(err, post) {
console.log( post );
});
Unfortunately, they both error:
TypeError: Invalid select() argument. Must be a string or object.
The link also has this one:
Tweet.find().sort('_id','descending').limit(15).find(function(err, post) {
console.log( post );
});
and that one errors:
TypeError: Invalid sort() argument. Must be a string or object.
So how can I get those records?
Timespan
Even more ideally, I just want the difference in time (seconds?) between the oldest and the newest record, but I have no clue how to start writing a query like that.
This is the schema:
var Tweet = new Schema({
body: String
, fid: { type: String, index: { unique: true } }
, username: { type: String, index: true }
, userid: Number
, created_at: Date
, source: String
});
I'm pretty sure I have the most recent versions of MongoDB and Mongoose.
EDIT
This is how I calculate the timespan, based on the answer provided by JohnnyHK:
var calcDays = function( cb ) {
var getOldest = function( cb ) {
Tweet.findOne({}, {}, { sort: { 'created_at' : 1 } }, function(err, post) {
cb( null, post.created_at.getTime() );
});
}
, getNewest = function( cb ) {
Tweet.findOne({}, {}, { sort: { 'created_at' : -1 } }, function(err, post) {
cb( null, post.created_at.getTime() );
});
}
async.parallel({
oldest: getOldest
, newest: getNewest
}
, function( err, results ) {
var days = ( results.newest - results.oldest ) / 1000 / 60 / 60 / 24;
// days = Math.round( days );
cb( null, days );
}
);
}
Mongoose 3.x is complaining about the [] parameter in your findOne calls as the array format is no longer supported for the parameter that selects the fields to include.
Try this instead to find the newest:
Tweet.findOne({}, {}, { sort: { 'created_at' : -1 } }, function(err, post) {
console.log( post );
});
Change the -1 to a 1 to find the oldest.
But because you're not using any field selection, it's somewhat cleaner to chain a couple of calls together:
Tweet.findOne().sort({created_at: -1}).exec(function(err, post) { ... });
Or even pass a string to sort:
Tweet.findOne().sort('-created_at').exec(function(err, post) { ... });
Fast and Simple - One Line Solution
Get 10 latest documents
MySchema.find().sort({ _id: -1 }).limit(10)
Get 10 oldest documents
MySchema.find().sort({ _id: 1 }).limit(10)
In case you want to sort based on some other property, e.g. createdAt, and get the oldest or latest, it is similar to the above query:
MySchema.find().sort({ createdAt: -1 }).limit(10) // 10 latest docs
MySchema.find().sort({ createdAt: 1 }).limit(10) // 10 oldest docs
For Mongoose version ~3.8,
to find the last entry:
model.findOne().sort({ field: 'asc', _id: -1 }).limit(1)
or using
model.findOne().sort('field -_id').limit(1)
collectionName.findOne().sort({$natural: -1}).limit(1).exec(function(err, res){
    if(err){
        console.log(err);
    }
    else{
        console.log(res);
    }
});
This will give you the last document recorded in the database. Just follow the same concept.
await Model.find().sort({$natural:-1}).limit(1); //for the latest record
await Model.find().sort({$natural:1}).limit(1); //for the oldest record
This one works for me, using MongoDB's natural order: https://docs.mongodb.com/manual/reference/operator/meta/natural/
Mongoose has a sort method; with it, you can get the first element (oldest document) by passing 1 for the sort field, or the last element (newest document) by passing -1 for the sort field of the collection.
The best way is to have an async function like this:
async function findLastElement () {
return await Mymodel.findOne().sort('-_id');
}
This way you get the last element and ensure reusability.
Here is the answer with async/await:
const olderDoc: any = await Model.findOne().sort({ createdAt: 1 }).lean().exec()
console.log('olderDoc', olderDoc)
const newerDoc: any = await Model.findOne().sort({ createdAt: -1 }).lean().exec()
console.log('newerDoc', newerDoc)