I have the following query that works fine without $max and $min; however, when $max and $min are included, nothing happens.
Summary.update({
productId: _product._id,
},{
$addToSet: {
attrs: _variant.attrs,
vars: {
variantId: _variant._id,
title: _variant.title,
imgs: _variant.imgs,
}
},
$max: { 'price.highest': _price.highest },
$min: { 'price.lowest': _price.lowest },
$setOnInsert: self.summary
},{
upsert: true
},function(err,update){
...
});
Any ideas would be greatly appreciated.
Thanks.
Your query does come with some caveats, so it really only remains to explain what could be going wrong.
First and foremost, your MongoDB server must be at least version 2.6.x in order to have the $min and $max update operators available.
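If in doubt, you can confirm the server version directly from the shell (and, since you appear to be using mongoose as in the listing further below, the installed driver version with npm ls mongoose from your project directory):
> db.version()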
Now consider the basic test conditions:
> db.test.update(
{ "a": 1 },
{
"$setOnInsert": { "b": 2 },
"$min": { "c": 1 },
"$max": { "d": 1 }
},
{ "upsert": true }
)
WriteResult({
"nMatched" : 0,
"nUpserted" : 1,
"nModified" : 0,
"_id" : ObjectId("559f114dbe78f212535e2f5f")
})
On a first execution, since there is no document with a matching value of "a", an upsert is performed, creating the new object in the collection:
{
"_id" : ObjectId("559f114dbe78f212535e2f5f"),
"a" : 1,
"d" : 1,
"c" : 1,
"b" : 2,
}
Now if you change the values of $min or $max, the expected behaviour is to modify only those fields where the value falls within the constraints, like so:
> db.test.update(
{ "a": 1 },
{
"$setOnInsert": { "b": 3 },
"$min": { "c": 2 },
"$max": { "d": 2 }
},
{ "upsert": true }
)
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 1 })
Since the value given to $min is larger than the stored value, this field is not changed. However, the value given to $max is larger than the stored value, so that field is modified. A different value was given in $setOnInsert, but this does not affect the data since the operation is not an upsert this time:
{
"_id" : ObjectId("559f114dbe78f212535e2f5f"),
"a" : 1,
"d" : 2,
"c" : 1,
"b" : 2
}
If you then issue a statement with either the same values for $min and $max, or values that otherwise fall outside the constraints of being respectively "lower" or "higher", then nothing is updated:
> db.test.update(
{ "a": 1 },
{
"$setOnInsert": { "b": 3 },
"$min": { "c": 2 },
"$max": { "d": 2 }
},
{ "upsert": true }
)
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 0 })
That is the expected behavior of the operators in this context. Of course if you submit either $min or $max on a field that does not exist yet, then the field is added to the document with the specified value, just like with $set or $push or similar.
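As a quick illustration of that last point (the field name "e" here is just an example), applying $max to a field not present in the test document simply adds it:
> db.test.update(
  { "a": 1 },
  { "$max": { "e": 5 } }
)
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 1 })
The document now also contains "e" : 5.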
So either you are submitting values that do not meet the requirements for updating, or your server and/or driver versions are not capable of handling the operators. Those are the things to check to work out why you are not getting the expected results.
As a side note, make sure you know what to expect from $addToSet as well. In a similar way, if the complete object already exists then nothing is modified. If, however, you have an object with several keys as you do, then changing any one of those values makes the whole object "unique" and a new member is added. If you mean to do something else, such as requiring only "one" of the keys to be unique, then there is other logic you need to apply and you cannot simply use $addToSet to handle it.
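As a quick illustration with a hypothetical collection and values:
> db.settest.insert({ "a": 1, "vars": [{ "variantId": 1, "title": "A" }] })
> db.settest.update(
  { "a": 1 },
  { "$addToSet": { "vars": { "variantId": 1, "title": "A" } } }
)
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 0 })
> db.settest.update(
  { "a": 1 },
  { "$addToSet": { "vars": { "variantId": 1, "title": "B" } } }
)
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 1 })
The first update changes nothing because an identical object is already in the set; the second appends a new member because changing "title" makes the whole object different.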
Also, here is a full listing you can run with node and mongoose in addition to the above test conditions. This is tested against MongoDB 3.x and mongoose 4.0.6:
var async = require('async'),
mongoose = require('mongoose'),
Schema = mongoose.Schema;
mongoose.connect('mongodb://localhost/test');
var testSchema = new Schema({
"a": Number,
"b": Number,
"c": Number,
"d": Number
});
var Test = mongoose.model('Test',testSchema,"test");
async.series(
[
function(callback) {
Test.remove({},callback);
},
function(callback) {
Test.findOneAndUpdate(
{ "a": 1 },
{
"$setOnInsert": { "b": 2 },
"$min": { "c": 1 },
"$max": { "d": 1 }
},
{ "upsert": true, "new": true },
callback
);
},
function(callback) {
Test.findOneAndUpdate(
{ "a": 1 },
{
"$setOnInsert": { "b": 3 },
"$min": { "c": 2 },
"$max": { "d": 2 }
},
{ "upsert": true, "new": true },
callback
);
},
function(callback) {
Test.findOneAndUpdate(
{ "a": 1 },
{
"$setOnInsert": { "b": 3 },
"$min": { "c": 2 },
"$max": { "d": 2 }
},
{ "upsert": true, "new": true },
callback
);
}
],
function(err,results) {
if (err) throw err;
console.log( JSON.stringify( results, undefined, 2 ) );
process.exit();
}
);
I have a collection with the following structure:
{
"_id" : "Pd2fl7xcT3iWEmpAafv4DA",
"slot" : 1,
"stat" : [
{
"unitStat" : "5"
"value" : 13
},
{
"unitStat" : "18",
"value" : 1.96
},
{
"unitStat" : "28",
"value" : 1373
},
{
"unitStat" : "41",
"roll" : 2,
"value" : 69
}
]
}
I want to get 5 sorted objects (by any unitStat type) for every slot.
At the moment I can perform 6 calls to the db, but that isn't a good idea.
I tried to use aggregation, but I can perform it only for one slot:
db.collection.aggregate(
{
$match: {
slot: 1,
secondaryStat: {
$elemMatch: {
unitStat:'5'
}
}
}
},
{
$unwind: '$secondaryStat'
},
{
$match: {
'secondaryStat.unitStat' : '5'
}
},
{
$sort: {
'secondaryStat.value': -1
}
},
{
$limit: 5
}
)
Can I find, for example, the top 5 sorted objects from 6 different slots?
The following query can get us the expected output:
db.collection.aggregate([
{
$unwind:"$stat"
},
{
$match:{
"stat.unitStat":"5"
}
},
{
$sort:{
"slot":1,
"stat.value":1
}
},
{
$group:{
"_id":"$slot",
"slot":{
$first:"$slot"
},
"stat":{
$push:"$stat"
}
}
},
{
$project:{
"_id":0,
"slot":1,
"stat":{
$slice:["$stat",0,5]
}
}
}
]).pretty()
Aggregation stage details:
Stage I: Unwind the stat array.
Stage II: Filter unitStat for the specified value, "5" in this case.
Stage III: Sort the data in ascending order on the basis of slot and stat.value.
Stage IV: Group the data back on the basis of slot and push all filtered stat entries into an array named 'stat'.
Stage V: Slice the stat array to the specified length, 5 in this case.
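Run against the sample document in the question (slot 1 has a single entry with unitStat "5"), the output should look something like this:
{ "slot" : 1, "stat" : [ { "unitStat" : "5", "value" : 13 } ] }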
I created a spreadsheet for work using Google Sheets. It works well, but I'm trying to enhance it by learning JavaScript, so I wrote a simple copy-paste function that works really well.
To make it easier to change values in the future, I'm trying to add everything to an array and create a for loop. I've never done this before, so forgive me if it looks terrible, but here is the array I created:
var rangesAndValues = {
"0" : {
range : "C3",
value : ""
},
"1" : {
range : "C5:C7",
value : 0
},
"2" : {
range : "C9",
value : 0
},
"3" : {
range : "G3:G10",
value : 0
},
"4" : {
range : "J3",
value : "No"
},
"5" : {
range : "J5",
value : "No"
},
"6" : {
range : "J7",
value : 0
},
"7" : {
range : "J9",
value : 0
},
"8" : {
range : "M3:M10",
value : 0
},
"9" : {
range : "O3:O10",
value : 0
},
"10" : {
range : "Q3:Q10",
value : ""
},
"11" : {
range : "S3:S10",
value : 0
}
};
I then tried to loop it and console.log it to make sure it worked.
rangesAndValues.forEach(function(element){
console.log(element);
});
I got this error in the console:
Uncaught TypeError: rangesAndValues.forEach is not a function at window.onload
Eventually I'm going to want it to loop a function for 0-11, and another loop for 1-11 (skipping 0).
I'm not sure how to skip the first one when it does work either, so brownie points if you answer that too. Thanks!!
forEach is a method on the prototype of an Array, but you have an Object there. You can get the keys of your object, loop through them, and get the value by key, like so:
var keys = Object.keys(rangesAndValues)
keys.forEach(function(key){
var element = rangesAndValues[key];
console.log(element);
});
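For the follow-up about skipping the first entry, one option (a sketch, assuming the key to skip is always the one that sorts first, "0" here) is to slice the keys array before iterating:
Object.keys(rangesAndValues).slice(1).forEach(function(key){
  var element = rangesAndValues[key];
  console.log(element);
});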
You need for...in to loop through an object:
var rangesAndValues = {
"0": {
range: "C3",
value: ""
},
"1": {
range: "C5:C7",
value: 0
},
"2": {
range: "C9",
value: 0
},
"3": {
range: "G3:G10",
value: 0
},
"4": {
range: "J3",
value: "No"
},
"5": {
range: "J5",
value: "No"
},
"6": {
range: "J7",
value: 0
},
"7": {
range: "J9",
value: 0
},
"8": {
range: "M3:M10",
value: 0
},
"9": {
range: "O3:O10",
value: 0
},
"10": {
range: "Q3:Q10",
value: ""
},
"11": {
range: "S3:S10",
value: 0
}
};
for (let keys in rangesAndValues) {
console.log(rangesAndValues[keys])
}
You are trying to use the Array method forEach on an object.
To iterate over the object's values you can do:
Object.values(rangesAndValues).forEach(function(element){
console.log(element);
});
Object.values gives you an array, and then you can use forEach on it.
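Note that Object.values is relatively new (ES2017), so if your environment does not provide it you can build the same array from Object.keys instead:
Object.keys(rangesAndValues).map(function(key) {
  return rangesAndValues[key];
}).forEach(function(element) {
  console.log(element);
});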
I want to find all key names from a collection that partially match a certain string.
The closest I got was to check if a certain key exists, but that's an exact match:
db.collection.find({ "fkClientID": { $exists:1 }})
I'd like to get all keys that start with fk instead.
You can do that using mapReduce:
To get just the field names at root level:
db.collection.mapReduce(function () {
Object.keys(this).map(function(key) {
if (key.match(/^fk/)) emit(key, null);
// OR: key.indexOf("fk") === 0
});
}, function(/* key, values */) {
// No need for params or to return anything in the
// reduce, just pass an empty function.
}, { out: { inline: 1 }});
This will output something like this:
{
"results": [{
"_id": "fkKey1",
"value": null
}, {
"_id": "fkKey2",
"value": null
}, {
"_id": "fkKey3",
"value": null
}],
"timeMillis": W,
"counts": {
"input": X,
"emit": Y,
"reduce": Z,
"output": 3
},
"ok" : 1
}
To get field names and any or all (whole doc) its values:
db.test.mapReduce(function () {
var obj = this;
Object.keys(this).map(function(key) {
// With `obj[key]` you will get the value of the field as well.
// You can change `obj[key]` for:
// - `obj` to return the whole document.
// - `obj._id` (or any other field) to return its value.
if (key.match(/^fk/)) emit(key, obj[key]);
});
}, function(key, values) {
// We can't return values or an array directly yet:
return { values: values };
}, { out: { inline: 1 }});
This will output something like this:
{
"results": [{
"_id": "fkKey1",
"value": {
"values": [1, 4, 6]
}
}, {
"_id": "fkKey2",
"value": {
"values": ["foo", "bar"]
}
}],
"timeMillis": W,
"counts": {
"input": X,
"emit": Y,
"reduce": Z,
"output": 2
},
"ok" : 1
}
To get field names in subdocuments (without path):
To do that you will have to store JavaScript functions on the server:
db.system.js.save({ _id: "hasChildren", value: function(obj) {
return typeof obj === "object";
}});
db.system.js.save({ _id: "getFields", value: function(doc) {
Object.keys(doc).map(function(key) {
if (key.match(/^fk/)) emit(key, null);
if (hasChildren(doc[key])) getFields(doc[key])
});
}});
And change your map to:
function () {
getFields(this);
}
Now run db.loadServerScripts() to load them.
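Putting those pieces together, the mapReduce call would then look something like this (the reduce is still a no-op):
db.collection.mapReduce(function () {
  getFields(this);
}, function(/* key, values */) {
  // Nothing to reduce; just emit the key names.
}, { out: { inline: 1 }});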
To get field names in subdocuments (with path):
The previous version will just return field names, not the whole path to get them, which you will need if what you want to do is rename those keys. To get the path:
db.system.js.save({ _id: "getFields", value: function(doc, prefix) {
Object.keys(doc).map(function(key) {
if (key.match(/^fk/)) emit(prefix + key, null);
if (hasChildren(doc[key]))
getFields(doc[key], prefix + key + '.')
});
}});
And change your map to:
function () {
getFields(this, '');
}
To exclude overlapping path matches:
Note that if you have a field fkfoo.fkbar, it will return fkfoo and fkfoo.fkbar. If you don't want overlapping path matches, then:
db.system.js.save({ _id: "getFields", value: function(doc, prefix) {
Object.keys(doc).map(function(key) {
if (hasChildren(doc[key]))
getFields(doc[key], prefix + key + '.')
else if (key.match(/^fk/)) emit(prefix + key, null);
});
}});
Going back to your question, renaming those fields:
With this last option, you get all the paths that include keys that start with fk, so you can use $rename for that.
However, $rename doesn't work for those that contain arrays, so for those you could use forEach to do the update. See MongoDB rename database field within array
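As a rough sketch of that forEach approach (the collection, array and key names here are placeholders; adapt them to your documents), you rewrite the array on the client and $set it back:
db.collection.find({ "items.fkOld": { "$exists": true } }).forEach(function(doc) {
  // Rebuild the array, renaming "fkOld" to "fkNew" in each element.
  var items = doc.items.map(function(el) {
    if (el.hasOwnProperty("fkOld")) {
      el.fkNew = el.fkOld;
      delete el.fkOld;
    }
    return el;
  });
  db.collection.update({ "_id": doc._id }, { "$set": { "items": items } });
});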
Performance note:
MapReduce is not particularly fast though, so you may want to specify { out: "fk_fields" } to output the results into a new collection called fk_fields and query those results later, but that will depend on your use case.
Possible optimisations for specific cases (consistent schema):
Also, note that if you know that the schema of your documents is always the same, then you just need to check one of them to get its fields, so you can do that by adding limit: 1 to the options object, or just by retrieving one document with findOne and reading its fields at the application level.
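In that case something as simple as this sketch, which only inspects a single document, would do:
var fkFields = Object.keys(db.collection.findOne()).filter(function(k) {
  return /^fk/.test(k);
});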
If you have the latest MongoDB, 3.4.4, then you can use $objectToArray in an aggregate statement with $redact as the fastest way this can possibly be done with native operators. Not that scanning the collection is "fast", but it is about as fast as you can get for this:
db[collname].aggregate([
{ "$redact": {
"$cond": {
"if": {
"$gt": [
{ "$size": { "$filter": {
"input": { "$objectToArray": "$$ROOT" },
"as": "doc",
"cond": {
"$eq": [ { "$substr": [ "$$doc.k", 0, 2 ] }, "fk" ]
}
}}},
0
]
},
"then": "$$KEEP",
"else": "$$PRUNE"
}
}}
])
The presently undocumented $objectToArray translates an "object" into "key" and "value" form in an array. So this:
{ "a": 1, "b": 2 }
Becomes this:
[{ "k": "a", "v": 1 }, { "k": "b", "v": 2 }]
Used with $$ROOT, which is a special variable referring to the current document "object", we translate the document to an array so that the values of "k" can be inspected.
Then it's just a matter of applying $filter and using $substr to get the preceding characters of the "key" string.
For the record, this would be the MongoDB 3.4.4 optimal way of obtaining a unique list of the matching keys:
db[collname].aggregate([
{ "$redact": {
"$cond": {
"if": {
"$gt": [
{ "$size": { "$filter": {
"input": { "$objectToArray": "$$ROOT" },
"as": "doc",
"cond": {
"$eq": [ { "$substr": [ "$$doc.k", 0, 2 ] }, "fk" ]
}
}}},
0
]
},
"then": "$$KEEP",
"else": "$$PRUNE"
}
}},
{ "$project": {
"j": {
"$filter": {
"input": { "$objectToArray": "$$ROOT" },
"as": "doc",
"cond": {
"$eq": [ { "$substr": [ "$$doc.k", 0, 2 ] }, "fk" ]
}
}
}
}},
{ "$unwind": "$j" },
{ "$group": { "_id": "$j.k" }}
])
That's the safe approach, which considers that the key may not be present in all documents and that there could possibly be multiple matching keys in a document.
If you are absolutely certain that you "always" have the key present in the document and that there will only be one, then you can shorten to just $group:
db[colname].aggregate([
{ "$group": {
"_id": {
"$arrayElemAt": [
{ "$map": {
"input": { "$filter": {
"input": { "$objectToArray": "$$ROOT" },
"as": "doc",
"cond": {
"$eq": [ { "$substr": [ "$$doc.k", 0, 2 ] }, "fk" ]
}
}},
"as": "el",
"in": "$$el.k"
}},
0
]
}
}}
])
The most efficient way in earlier versions would be to use the $where syntax, which allows a JavaScript expression to be evaluated. Not that anything that evaluates JavaScript is the "most" efficient thing you can do, but analyzing "keys" as opposed to "data" is not optimal for any data store:
db[collname].find(function() { return Object.keys(this).some( k => /^fk/.test(k) ) })
The inline function there is just shell shorthand and this could also be written as:
db[collname].find({ "$where": "return Object.keys(this).some( k => /^fk/.test(k) )" })
The only requirement for $where is that the expression returns a true value for any document you want to return, so the documents return unaltered.
I have the following data. I want to get objects from the od array based on some condition. Along with that I want to get the em and name fields as well.
I am not very familiar with MongoDB's aggregate, so I need help to solve my problem.
{
_id : 1,
em : 'abc#12s.net',
name : 'NewName',
od :
[
{
"oid" : ObjectId("1234"),
"ca" : ISODate("2016-05-05T13:20:10.718Z")
},
{
"oid" : ObjectId("2345"),
"ca" : ISODate("2016-05-11T13:20:10.718Z")
},
{
"oid" : ObjectId("57766"),
"ca" : ISODate("2016-05-13T13:20:10.718Z")
}
]
},
{
_id : 2,
em : 'ab6c#xyz.net',
name : 'NewName2',
od :
[
{
"oid" : ObjectId("1234"),
"ca" : ISODate("2016-05-11T13:20:10.718Z")
},
{
"oid" : ObjectId("2345"),
"ca" : ISODate("2016-05-12T13:20:10.718Z")
},
{
"oid" : ObjectId("57766"),
"ca" : ISODate("2016-05-05T13:20:10.718Z")
}
]
}
I have tried using the $match, $project and $unwind stages of aggregate to get the desired result. My query is given below:
db.collection.aggregate([
{
$match : {
"od.ca" : {
'$gte': '10/05/2016',
'$lte': '15/05/2016'
}
}
},
{
$project:{
_id: '$_id',
em: 1,
name : 1,
od : 1
}
},
{
$unwind : "$od"
},
{
$match : {
"od.ca" : {
'$gte': '10/05/2016',
'$lte': '15/05/2016'
}
}
}])
The result I got includes em and name, but the od array contains only one of the objects from od, i.e. there are multiple records for the same email id.
{
_id : 1,
em : 'abc#12s.net',
name : 'NewName',
od :
[
{
"oid" : ObjectId("57766"),
"ca" : ISODate("2016-05-13T13:20:10.718Z")
}
]
}
{
_id : 1,
em : 'abc#12s.net',
name : 'NewName',
od :
[
{
"oid" : ObjectId("2345"),
"ca" : ISODate("2016-05-11T13:20:10.718Z")
}
]
}
But what I want is one returned result per email id, with the od array containing all of the objects matching the condition. A sample of the output I want is:
{
_id : 1,
em : 'abc#12s.net',
name : 'NewName',
od :
[
{
"oid" : ObjectId("2345"),
"ca" : ISODate("2016-05-11T13:20:10.718Z")
},
{
"oid" : ObjectId("57766"),
"ca" : ISODate("2016-05-13T13:20:10.718Z")
}
]
}
Is there anything wrong with my query? If the query is supposed to return results like this, how can I get the result I want? Can someone tell me what I should try, or what changes to the query would help me get the result I want?
You don't necessarily need a cohort of those aggregation operators except when your MongoDB version is older than the 2.6.X releases. The $filter operator will do the job just fine.
Consider the following example, where the $filter operator, applied in the $project pipeline stage, filters the od array to only include the elements that have a ca date greater than or equal to '2016-05-10' and less than or equal to '2016-05-15':
var start = new Date('2016-05-10'),
end = new Date('2016-05-15');
db.collection.aggregate([
{
"$match": {
"od.ca": { "$gte": start, "$lte": end }
}
},
{
"$project": {
"em": 1,
"name": 1,
"od": {
"$filter": {
"input": "$od",
"as": "o",
"cond": {
"$and": [
{ "$gte": [ "$$o.ca", start ] },
{ "$lte": [ "$$o.ca", end ] }
]
}
}
}
}
}
])
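Against the two sample documents in the question, this should return something like the following, with only the matching array elements kept:
{
  "_id" : 1,
  "em" : "abc#12s.net",
  "name" : "NewName",
  "od" : [
    { "oid" : ObjectId("2345"), "ca" : ISODate("2016-05-11T13:20:10.718Z") },
    { "oid" : ObjectId("57766"), "ca" : ISODate("2016-05-13T13:20:10.718Z") }
  ]
}
{
  "_id" : 2,
  "em" : "ab6c#xyz.net",
  "name" : "NewName2",
  "od" : [
    { "oid" : ObjectId("1234"), "ca" : ISODate("2016-05-11T13:20:10.718Z") },
    { "oid" : ObjectId("2345"), "ca" : ISODate("2016-05-12T13:20:10.718Z") }
  ]
}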
Bear in mind this operator is only available for MongoDB versions 3.2.X and newer.
Otherwise, for versions 2.6.X up to 3.0.X, you can combine the $map and $setDifference operators to "filter" the documents in the od array.
The $map operator basically maps each element, evaluated by the $cond operator, to either false or the element itself when it passes the given condition. The $setDifference operator then returns the difference of that result from the set [false], removing the false entries. Check how this pans out with the same example:
var start = new Date('2016-05-10'),
end = new Date('2016-05-15');
db.collection.aggregate([
{
"$match": {
"od.ca": { "$gte": start, "$lte": end }
}
},
{
"$project": {
"em": 1,
"name": 1,
"od": {
"$setDifference": [
{
"$map": {
"input": "$od",
"as": "o",
"in": {
"$cond": [
{
"$and": [
{ "$gte": [ "$$o.ca", start ] },
{ "$lte": [ "$$o.ca", end ] }
]
},
"$$o",
false
]
}
}
},
[false]
]
}
}
}
])
For versions 2.4.X and older, you may have to use a combination of the $match, $unwind and $group operators to achieve the same result, since the above operators do not exist there.
The following example demonstrates this; it is what you were attempting, but it was just left short of a $group pipeline step to group all the flattened documents back into the original document shape, albeit minus the filtered-out array elements:
db.collection.aggregate([
{
"$match": {
"od.ca": { "$gte": start, "$lte": end }
}
},
{ "$unwind": "$od" },
{
"$match": {
"od.ca": { "$gte": start, "$lte": end }
}
},
{
"$group": {
"$_id": "$_id",
"em": { "$first": "$em" },
"name": { "$first": "$name" },
"od": { "$push": "$od" }
}
}
])
I have the following User object:
{
"_id" : ObjectId("someId"),
"name" : "Bob",
"password" : "fakePassword",
"follower" : [...],
"following" : [..]
}
I need to paginate over the follower list, so I use the $slice projection operator, but I only need the paginated follower list to be returned. I don't know if I am doing it the wrong way, or if this can't be done, but limiting fields doesn't work with the $slice projection.
Following are a couple of queries I tried:
collection.findOne(
{
_id: new ObjectId(userId)
},
{
follower: {$slice:[skip, parseInt(pageSize)]},
follower: 1
},..
and
collection.findOne(
{
_id: new ObjectId(userId)
},
{
follower: 1,
follower: {$slice:[skip, parseInt(pageSize)]}
},
But these return all the values in the object and do not limit the fields, although the slice works fine in both cases.
Also, when I do something like _id:0, following:0, that part works, but I don't want to mention each and every field in the query like this; it may create problems once I decide to change the schema.
How do I get this to work? What would the syntax for the query be to get this working?
Not sure I'm getting your usage pattern here. Perhaps we can simplify the example a little. So considering the document:
{
"_id" : ObjectId("537dd763f95ddda3208798c5"),
"name" : "Bob",
"password" : "fakePassword",
"follower" : [
"A",
"B",
"C",
"D",
"E",
"F",
"G",
"H",
"I",
"J",
"K"
]
}
So the simple query like this:
db.paging.find(
{ "name": "Bob" },
{
"_id": 0,
"name": 0,
"password": 0,
"follower": { "$slice": [0,3] }
}).pretty()
Gives results:
{
"follower" : [
"A",
"B",
"C"
]
}
And similarly from the following page:
db.paging.find(
{ "name": "Bob" },
{
"_id": 0,
"name": 0,
"password": 0,
"follower": { "$slice": [3,3] }
}).pretty()
Gives the results:
{
"follower" : [
"D",
"E",
"F"
]
}
So personally I am not sure whether you were asking about the field exclusion or about "paging" the array results, but either way, both of those examples are shown here.
One way is to actually use _id here by saying {_id:1}:
{ "_id" : ObjectId("537de1bc08eb9d89a7d3a1b2"), "f" : [ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16, 17, 18, 19, 20 ], "d" : 1 }
> db.test.findOne({ "_id" : ObjectId("537de1bc08eb9d89a7d3a1b2")},{f:{$slice:[0,2]}})
{
"_id" : ObjectId("537de1bc08eb9d89a7d3a1b2"),
"f" : [
1,
2
],
"d" : 1
}
> db.test.findOne({ "_id" : ObjectId("537de1bc08eb9d89a7d3a1b2")},{_id:0, f:{$slice:[0,2]}})
{ "f" : [ 1, 2 ], "d" : 1 }
> db.test.findOne({ "_id" : ObjectId("537de1bc08eb9d89a7d3a1b2")},{_id:1, f:{$slice:[0,2]}})
{ "_id" : ObjectId("537de1bc08eb9d89a7d3a1b2"), "f" : [ 1, 2 ] }