Summing and Outputting Totals in Mongo View - javascript

In my MongoDB backend, I have a view that, after multiple aggregation stages, outputs documents that look like this:
{
"_id" : 25k3ejfjyi32132f9z3,
"customer_id" : 15cgrd582950jj493g5,
"openBalance": 24,
// other data...
},
{
"_id" : 35g6ejfjfj32132f8s4,
"customer_id" : 23gtrd684563jj494f4
"openBalance": 20,
// other data...
}
What I need to do, as a last step, is total up all of the "openBalance" amounts across all records and output that number in a new field alongside the other data. In other words, based on the above data, I want to return 44 in a field titled totalOpenBalance.
Is there a way I can handle this aggregation logic in a Mongo view? I'm not sure how to do this, because I don't want to add a field to each record returned, but rather return a single value based on the total across all the records. It would look something like this:
{
"_id" : 25k3ejfjyi32132f9z3,
"customer_id" : 15cgrd582950jj493g5,
"openBalance": 24,
// other data...
},
{
"_id" : 35g6ejfjfj32132f8s4,
"customer_id" : 23gtrd684563jj494f4
"openBalance": 20,
// other data...
},
"totalOpenBalance": 44

If you add the following stage to the end of your pipeline
{
$group: {
_id: null, // do not really group but throw all documents into the same bucket
documents: { $push: "$$ROOT" }, // push each encountered document into the group
totalOpenBalance: { $sum: "$openBalance" } // sum up all "openBalance" values
}
}
you will get something that you might be able to use:
{
"_id" : null,
"documents" : [
{
"_id" : 25k3ejfjyi32132f9z3,
"customer_id" : 15cgrd582950jj493g5,
"openBalance" : 24
},
{
"_id" : 35g6ejfjfj32132f8s4,
"customer_id" : 23gtrd684563jj494f4,
"openBalance" : 20
}
],
"totalOpenBalance" : 44
}
If you want to go completely crazy, which I would not really recommend, then read on. By adding the following stages
{
$group: {
_id: null, // do not really group but throw all documents into the same bucket
documents: { $push: "$$ROOT" }, // push each encountered document into the group
totalOpenBalance: { $sum: "$openBalance" } // sum up all "openBalance" values
}
}, {
$project: {
"_id": 0, // remove the "_id" field
"documents": { $concatArrays: [ "$documents", [ { "totalOpenBalance": "$totalOpenBalance" } ] ] } // append a magic subdocument to the the existing documents
}
}, {
$unwind: "$documents" // just so we can flatten the resulting array into separate documents
}, {
$replaceRoot: {
newRoot: "$documents" // and move the content of our documents field to the root
}
}
you get exactly what you asked for:
{
"_id" : 25k3ejfjyi32132f9z3,
"customer_id" : 15cgrd582950jj493g5,
"openBalance" : 24
},
{
"_id" : 35g6ejfjfj32132f8s4,
"customer_id" : 23gtrd684563jj494f4,
"openBalance" : 20
},
{
"totalOpenBalance" : 44
}
This, however, is probably overkill...
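As a side note, a similar shape can be produced with a single $facet stage, which runs several sub-pipelines over the same input documents. This is only a minimal sketch, not part of the answer above, and assumes the same preceding pipeline:
{
$facet: {
documents: [ { $match: {} } ], // pass every document through unchanged
totals: [ { $group: { _id: null, totalOpenBalance: { $sum: "$openBalance" } } } ] // compute the sum on its own
}
}
With the sample data this yields a single document of the form { "documents": [ ... ], "totals": [ { "_id": null, "totalOpenBalance": 44 } ] }.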

Related

Mongodb Sum query returns nothing

I have this example document in my activities collection:
{
"_id" : ObjectId("5ec90b5258a37c002509b27d"),
"user_hash" : "asdsc4be9fe7xxx",
"type" : "Expense",
"name" : "Lorem",
"amount" : 10000,
"date_created" : 1590233938
}
I'd like to compute the total amount of these activities with this aggregate query:
db.activities.aggregate(
[
{
$group:
{
_id: "$id",
total: { $sum: "$amount" }
}
},
{
$match: { type: "Expense", "user_hash": "asdsc4be9fe7xxx" }
}
]
)
Expected result: {_id: null, total: xxxxx }
Actual result: nothing is returned.
Any solution for this? Thank you in advance.
There are two problems with your query:
You're making the sum aggregation on each individual document instead of on the whole collection, because you specify _id: "$id" while you need to specify _id: null.
You're performing the $match stage after the $group stage, but you need to perform it before, because after grouping the result will be something like:
{
"_id": null,
"total": 15
}
As you can see, this object doesn't have any of the fields that the original documents have, therefore 0 results will be matched. The order of stages is important because each stage essentially operates on the result of the previous stage (there are some exceptions where MongoDB automatically optimizes stages, but only when reordering them does not change the result).
So the query should be:
db.activities.aggregate(
[
{
$match: { type: "Expense", "user_hash": "asdsc4be9fe7xxx" }
},
{
$group:
{
_id: null,
total: { $sum: "$amount" }
}
},
]
)
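With just the single example document shown above, the corrected query would return something like:
{ "_id" : null, "total" : 10000 }
With more matching documents, total becomes the sum of all of their amount values.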

Aggregate a collection of timestamps in MongoDB using the Aggregation Pipeline

I have a collection of timestamps which records which actions are performed by users at which time. For now, the collection consists of only two actions, start and end. There can only be a single end action, while there can be multiple start actions per user.
Now I want to generate a list of users where the time difference between the last start action and the end action is, for example, less than a minute.
The simplified documents in my collection timestamps look like this:
document #1
{
id: 123,
user: "user1",
type: "start",
date: 2019-09-10
}
document #2
{
id: 234,
user: "user1",
type: "end",
date: 2019-09-11
}
Now the result I want should look like this:
{
id: null,
list: ["user1, user2"]
}
The field list should contain every user, where the time difference between the start and end action is less than a minute.
I am having trouble combining the documents which contain the start and end attributes. I was trying to combine them into documents that look like this:
{
id: 345,
user: "user1",
date_start: 2019-09-10,
date_end: 2019-09-11
}
I don't know where to start with the aggregation pipeline and how to split and combine the different types of timestamps. Furthermore, I still need to add a field that contains the difference between both dates.
The following query can get us the expected output:
db.collection.aggregate([
{
$sort:{
"date":-1
}
},
{
$group:{
"_id":{
"id":"$id",
"type":"$type"
},
"id":{
$first:"$id"
},
"user":{
$first:"$user"
},
"type":{
$first:"$type"
},
"date":{
$first:"$date"
}
}
},
{
$group:{
"_id":"$id",
"user":{
$first:"$user"
},
"info":{
$push:{
"k":"$type",
"v":"$date"
}
}
}
},
{
$addFields:{
"info":{
$arrayToObject:"$info"
}
}
},
{
$match:{
$expr:{
$lt:[
{
$subtract:[
{
$toDate:"$info.end"
},
{
$toDate:"$info.start"
}
]
},
60000
]
}
}
},
{
$group:{
"_id":null,
"users":{
$push:"$user"
}
}
},
{
$project:{
"_id":0
}
}
]).pretty()
Data set:
{
"_id" : ObjectId("5d77a117bd4e75c58d598214"),
"id" : 123,
"user" : "user1",
"type" : "start",
"date" : "2019-09-10T13:01:14.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d598215"),
"id" : 123,
"user" : "user1",
"type" : "start",
"date" : "2019-09-10T13:04:14.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d598216"),
"id" : 123,
"user" : "user1",
"type" : "start",
"date" : "2019-09-10T13:09:02.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d598217"),
"id" : 123,
"user" : "user1",
"type" : "end",
"date" : "2019-09-10T13:09:14.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d598218"),
"id" : 234,
"user" : "user2",
"type" : "start",
"date" : "2019-09-10T13:02:02.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d598219"),
"id" : 234,
"user" : "user2",
"type" : "end",
"date" : "2019-09-10T13:09:14.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d59821a"),
"id" : 345,
"user" : "user3",
"type" : "start",
"date" : "2019-09-10T13:08:55.242Z"
}
{
"_id" : ObjectId("5d77a117bd4e75c58d59821b"),
"id" : 345,
"user" : "user3",
"type" : "end",
"date" : "2019-09-10T13:09:14.242Z"
}
Output:
{ "users" : [ "user3", "user1" ] }
Query analysis:
Stage I: Sorting the documents in descending order of date
Stage II: Grouping on [id, type] and picking the first date for each type, i.e. the latest date for each type
Stage III: Grouping only on id and pushing the type and associated date into an array as key-value pairs
Stage IV: Converting the array of key-value pairs into an object (a sample intermediate document is shown after this list)
Stage V: Filtering documents which have a difference between the end and start dates of less than 60000 ms (the millisecond equivalent of 1 minute)
Stage VI: Pushing all filtered user names into an array
Stage VII: Removing the _id field from the output
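For reference, with the data set above, the document for id 123 after Stage IV looks roughly like this (field order may vary):
{
"_id" : 123,
"user" : "user1",
"info" : {
"end" : "2019-09-10T13:09:14.242Z",
"start" : "2019-09-10T13:09:02.242Z"
}
}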

How to add within an array information using findOneAndUpdate without deleting information that was previously contained [duplicate]

I am working on an Express.js application where I need to update a nested array.
1) Schema:
// Creating a mongoose schema
var userSchema = mongoose.Schema({
_id: {type: String, required: true},
name: String,
sensors: [{
sensor_name: {type: String, required: true},
measurements: [{time: String}]
}]
});
2) Here is the code snippet; the explanation is below:
router.route('/sensors_update/:_id/:sensor_name/')
.post(function (req, res) {
User.findOneAndUpdate(
{_id: req.body._id},
{$push: {"sensors": {"sensor_name": req.body.sensor_name, "measurements.0.time": req.body.time}}},
{new: true},
function (err, newSensor) {
if (err)
return res.send(err);
res.send(newSensor);
});
});
I am able to successfully add a value to the measurements array using findOneAndUpdate with the $push technique, but I'm failing when I try to add multiple measurements to the sensors array.
Here is the current JSON I get when I post a second measurement to the sensors array:
{
"_id": "Manasa",
"name": "Manasa Sub",
"__v": 0,
"sensors": [
{
"sensor_name": "ras",
"_id": "57da0a4bf3884d1fb2234c74",
"measurements": [
{
"time": "8:00"
}
]
},
{
"sensor_name": "ras",
"_id": "57da0a68f3884d1fb2234c75",
"measurements": [
{
"time": "9:00"
}
]
}]}
But the format I actually want, with multiple measurements posted into the same sensors array entry, would be:
{
"_id" : "Manasa",
"name" : "Manasa Sub",
"sensors" : [
{
"sensor_name" : "ras",
"_id" : ObjectId("57da0a4bf3884d1fb2234c74"),
"measurements" : [
{
"time" : "8:00"
}
],
"measurements" : [
{
"time" : "9:00"
}
]
}],
"__v" : 0 }
Please suggest some ideas regarding this. Thanks in advance.
You might want to rethink your data model. As it is currently, you cannot accomplish what you want. The sensors field refers to an array. In the ideal document format that you have provided, you have a single object inside that array. Then inside that object, you have two fields with the exact same key. In a JSON object, or mongo document in this context, you can't have duplicate keys within the same object.
It's not clear exactly what you're looking for here, but perhaps it would be best to go for something like this:
{
"_id" : "Manasa",
"name" : "Manasa Sub",
"sensors" : [
{
"sensor_name" : "ras",
"_id" : ObjectId("57da0a4bf3884d1fb2234c74"),
"measurements" : [
{
"time" : "8:00"
},
{
"time" : "9:00"
}
]
},
{
// next sensor in the sensors array with similar format
"_id": "",
"name": "",
"measurements": []
}],
}
If this is what you want, then you can try this:
User.findOneAndUpdate(
{ _id: req.body._id, "sensors.sensor_name": req.body.sensor_name },
{ $push: { "sensors.0.measurements": { "time": req.body.time } } }
);
And as a side note, if you're only ever going to store a single string in each object in the measurements array, you might want to just store the actual values instead of the whole object { time: "value" }. You might find the data easier to handle this way.
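For illustration, here is a minimal sketch of that simplification; the schema change and the plain-value push are assumptions, not part of the original code:
// Hypothetical simplified schema: measurements holds plain strings.
measurements: [String]
// Pushing a plain value instead of a { time: ... } object:
User.findOneAndUpdate(
{ _id: req.body._id, "sensors.sensor_name": req.body.sensor_name },
{ $push: { "sensors.0.measurements": req.body.time } }
);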
Instead of hardcoding the index of the array, it is possible to use an identifier with the filtered positional operator $[<identifier>].
Example:
User.findOneAndUpdate(
{ _id: "Manasa" },
{ $push: { "sensors.$[outer].measurements": { "time": req.body.time } } },
{ arrayFilters: [ { "outer._id": ObjectId("57da0a4bf3884d1fb2234c74") } ] }
);
You may notice that instead of getting the first element of the array, I specified which element of the sensors array I would like to update by providing its ObjectId.
Note that arrayFilters are passed as the third argument to the update query as an option.
You could now make "outer._id" dynamic by passing the ObjectId of the sensor like so: {"outer._id": req.body.sensorId}
In general, with the use of identifier, you can get to even deeper nested array elements by following the same procedure and adding more filters.
If there were a third level of nesting, you could then do something like:
User.findOneAndUpdate(
{ _id: "Manasa" },
{ $push: { "sensors.$[outer].measurements.$[inner].example": { "time": req.body.time } } },
{ arrayFilters: [ { "outer._id": ObjectId("57da0a4bf3884d1fb2234c74") }, { "inner._id": ObjectId("57da0a4bf3884d1fb2234c74") } ] }
);
You can find more details here in the answer written by Neil Lunn.
Refer to the all positional operator ($[]), documented as positional-all:
conditions: { other_conditions, 'array1.array2.field_to_be_checked': 'value' }
updateData: { $push: { 'array1.$[].array2.$[].array3': 'value_to_be_pushed' } }
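For illustration, a minimal sketch of how those two pieces fit into an update call; the collection and field names are placeholders taken from the snippet above, not from the original question:
// Hypothetical collection and field names; $[] touches every matching nested element
// and, unlike $[<identifier>], does not require arrayFilters.
db.collection.updateOne(
{ 'array1.array2.field_to_be_checked': 'value' },
{ $push: { 'array1.$[].array2.$[].array3': 'value_to_be_pushed' } }
);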

Mongo Aggregate: how to compare with a field from another collection?

I am trying to implement a function that collects unread messages from an articles collection. Each article in the collection has a "discussions" entry with discussion comment subdocuments. An example of such a subdocument is:
{
"id": NumberLong(7534),
"user": DBRef("users", ObjectId("...")),
"dt_create": ISODate("2015-01-26T00:10:44Z"),
"content": "The discussion comment content"
}
The parent document has the following (partial) structure:
{
model: {
id: 17676,
title: "Article title",
author: DBRef("users", ObjectId(...)),
// a bunch of other fields here
},
statistics: {
// Statistics will be stored here (pageviews, etc)
},
discussions: [
// Array of discussion subdocuments, like the one above
]
}
Each user also has a last_viewed entry which is a document; an example is as follows:
{
"17676" : "2015-01-10T00:00:00.000Z",
"18038" : "2015-01-10T00:00:00.000Z",
"18242" : "2015-01-20T00:00:00.000Z",
"18325" : "2015-01-20T00:00:00.000Z"
}
This means that the user has looked at discussion comments for the last time on January 10th 2015 for articles with IDs 17676 and 18038, and on January 20th 2015 for articles with IDs 18242 and 18325.
So I want to collect discussion entries from the article documents, and for article with ID 17676, I want to collect the discussion entries that were created after 2015-01-10, and for article with ID 18242, I want to show the discussion entries created after 2015-01-20.
UPDATED
Based on Neil Lunn's reply, the function I have created so far is:
function getUnreadDiscussions(userid) {
user = db.users.findOne({ 'model.id': userid });
last_viewed = [];
for(var i in user.last_viewed) {
last_viewed.push({
'id': parseInt(i),
'dt': user.last_viewed[i]
});
}
result = db.articles.aggregate([
// For now, collect just articles the user has written
{ $match: { 'model.author': DBRef('users', user._id) } },
{ $unwind: '$discussions' },
{ $project: {
'model': '$model',
'discussions': '$discussions',
'last_viewed': {
'$let': {
'vars': { 'last_viewed': last_viewed },
'in': {
'$setDifference': [
{ '$map': {
'input': '$$last_viewed',
'as': 'last_viewed',
'in': {
'$cond': [
{ '$eq': [ '$$last_viewed.id', '$model.id' ] },
'$$last_viewed.dt',
false
]
}
} },
[ false ]
]
}
}
}
}
},
// To get a scalar instead of a 1-element array:
{ $unwind: '$last_viewed' },
// Match only those that were created after last_viewed
{ $match: { 'discussions.dt_create': { $gt: '$last_viewed' } } },
{ $project: {
'model.id': 1,
'model.title': 1,
'discussions': 1,
'last_viewed': 1
} }
]);
return result.toArray();
}
The whole $let thing, and the $unwind after that, transforms the data into the following partial projection (with the last $match commented out):
{
"_id" : ObjectId("54d9af1dca71d8054c8d0ee3"),
"model" : {
"id" : NumberLong(18325),
"title" : "Article title"
},
"discussions" : {
"id" : NumberLong(7543),
"user" : DBRef("users", ObjectId("54d9ae24ca71d8054c8b4567")),
"dt_create" : ISODate("2015-01-26T00:10:44Z"),
"content" : "Some comment here"
},
"last_viewed" : ISODate("2015-01-20T00:00:00Z")
},
{
"_id" : ObjectId("54d9af1dca71d8054c8d0ee3"),
"model" : {
"id" : NumberLong(18325),
"title" : "Article title"
},
"discussions" : {
"id" : NumberLong(7554),
"user" : DBRef("users", ObjectId("54d9ae24ca71d8054c8b4567")),
"dt_create" : ISODate("2015-01-26T02:03:22Z"),
"content" : "Another comment here"
},
"last_viewed" : ISODate("2015-01-20T00:00:00Z")
}
So far so good here. But the problem now is that the $match to select only the discussions created after the last_viewed date is not working. I am getting an empty array response. However, if I hard-code the date and put in $match: { 'discussions.dt_create': { $gt: ISODate("2015-01-20 00:00:00") } }, it works. But I want it to take the date from last_viewed.
I found another SO thread where this issue has been resolved by using the $cmp operator.
The final part of the aggregation would be:
[
{ /* $match, $unwind, $project, $unwind as before */ },
{ $project: {
'model': 1,
'discussions': 1,
'last_viewed': 1,
'compare': {
$cmp: [ '$discussions.dt_create', '$last_viewed' ]
}
} },
{ $match: { 'compare': { $gt: 0 } } }
]
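As a side note, the plain $match failed because in a regular query document '$last_viewed' is treated as a literal string rather than a field reference, so no date ever matches it. On MongoDB 3.6 and newer, the same field-to-field comparison can be written with $expr inside $match, avoiding the extra $project; this is a sketch, not the solution used above:
{ $match: { $expr: { $gt: [ '$discussions.dt_create', '$last_viewed' ] } } }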
The aggregation framework is great, but it takes quite a different approach in problem-solving. Hope this helps anyone!
I'll keep the question unanswered in case anyone else has a better answer/method. If this answer has been upvoted enough times, I'll accept this one.

taking the difference between adjacent documents in mongoDB

How do I take the difference between adjacent records in MongoDB using JavaScript? For example, if I have the following three documents in a collection:
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z")
}
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z")
}
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z")
}
I want to take the difference in the "time" field between adjacent values to get:
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z"),
"time_difference" : null
}
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z"),
"time_difference" : 1
}
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z"),
"time_difference" : 3
}
Any ideas on how to do this efficiently in JavaScript/MongoDB? Thanks.
I don't know whether this was true when the question was asked seven years ago, but this can be solved completely within the aggregation framework. Assuming the collection name is AdjacentDocument, the following aggregation will get the results you're looking for:
db.AdjacentDocument.aggregate(
{$sort: {time: 1}},
{$group: {_id: 0, document: {$push: '$$ROOT'}}},
{$project: {documentAndPrevTime: {$zip: {inputs: ['$document', {$concatArrays: [[null], '$document.time']}]}}}},
{$unwind: {path: '$documentAndPrevTime'}},
{$replaceWith: {$mergeObjects: [{$arrayElemAt: ['$documentAndPrevTime', 0]}, {prevTime: {$arrayElemAt: ['$documentAndPrevTime', 1]}}]}},
{$set: {time_difference: {$trunc: [{$divide: [{$subtract: ['$time', '$prevTime']}, 1000]}]}}},
{$unset: 'prevTime'}
);
Aggregation pipeline walkthrough
First, the documents are sorted from oldest to newest. They are grouped into a single document with the documents stored in an ordered array field:
{$sort: {time: 1}},
{$group: {_id: 0, document: {$push: '$$ROOT'}}}
/*
{
"_id" : 0,
"document" : [
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z")
},
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z")
},
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z")
}
]
}
*/
Next, the previous times are zipped into the document array, creating an array of [document, previousTime]:
{$project: {documentAndPrevTime: {$zip: {inputs: ['$document', {$concatArrays: [[null], '$document.time']}]}}}}
/*
{
"_id" : 0,
"documentAndPrevTime" : [
[
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z")
},
null
],
[
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z")
},
ISODate("2013-02-13T15:45:41.148Z")
],
[
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z")
},
ISODate("2013-02-13T15:45:42.148Z")
]
]
}
*/
Next, the document & time array is unwound, creating a document for each of the initial documents:
{$unwind: {path: '$documentAndPrevTime'}}
/*
{
"_id" : 0,
"documentAndPrevTime" : [
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z")
},
null
]
}
{
"_id" : 0,
"documentAndPrevTime" : [
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z")
},
ISODate("2013-02-13T15:45:41.148Z")
]
}
{
"_id" : 0,
"documentAndPrevTime" : [
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z")
},
ISODate("2013-02-13T15:45:42.148Z")
]
}
*/
Next, we replace the document with the value of the document array element, merged with previous time element (using null if it's the initial index):
{$replaceWith: {$mergeObjects: [{$arrayElemAt: ['$documentAndPrevTime', 0]}, {prevTime: {$arrayElemAt: ['$documentAndPrevTime', 1]}}]}}
/*
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z"),
"prevTime" : null
}
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z"),
"prevTime" : ISODate("2013-02-13T15:45:41.148Z")
}
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z"),
"prevTime" : ISODate("2013-02-13T15:45:42.148Z")
}
*/
Finally, we update the document by setting the time_difference to the difference of the two time fields, and removing the temporary prevTime field. Since the difference between two dates is in milliseconds and your example uses seconds, we calculate the seconds by dividing by 1000 and truncating.
{$set: {time_difference: {$trunc: [{$divide: [{$subtract: ['$time', '$prevTime']}, 1000]}]}}},
{$unset: 'prevTime'}
/*
{
"_id" : ObjectId("50ed90a55502684f440001ac"),
"time" : ISODate("2013-02-13T15:45:41.148Z"),
"time_difference" : null
}
{
"_id" : ObjectId("50ed90a55502684f440001ad"),
"time" : ISODate("2013-02-13T15:45:42.148Z"),
"time_difference" : 1
}
{
"_id" : ObjectId("50ed90a55502684f440001ae"),
"time" : ISODate("2013-02-13T15:45:45.148Z"),
"time_difference" : 3
}
*/
The one thing you will want to make sure of here is that you have a sort on the query you use to fetch your records. If no sort is used it will actually use find order, which is not $natural order.
Find order can differ between queries, so if you run the query twice within a couple of minutes you might find that they don't return the same order. It does seem, however, that your query would logically be sorted on time_difference.
It should also be noted that this is not possible through normal querying. I also do not see an easy way of doing this through the aggregation framework.
So it seems the next plausible method is either using multiple queries or client-side processing. Client-side processing is probably the better option here, using a function like the one defined by #Marlon above.
One thing I want to make clear: unlike MySQL, MongoDB does not guarantee document order. Without an explicit sort, MongoDB may return documents in a different order each time, so comparing adjacent documents may give different results on every read.
If you are fine with that and you still want to compare, then try MongoDB's MapReduce http://docs.mongodb.org/manual/applications/map-reduce/
Assuming those 3 objects are coming through in an array, you could do something like the below:
var prevTime;
var currentTime;
for (var i = 0; i < records.length; i++)
{
currentTime = new Date(records[i].time).getTime();
// First record has no predecessor; divide by 1000 to report the gap in seconds.
records[i].time_difference = (prevTime === undefined) ? null : (currentTime - prevTime) / 1000;
prevTime = currentTime;
}
Of course you'll need to swap bits out to make it use the records from mongo.
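For example, a minimal sketch of driving that loop from a sorted query; the collection name here is only an assumption:
// Hypothetical collection name; the sort guarantees a stable adjacency order.
var records = db.times.find().sort({ time: 1 }).toArray();
// ...then run the loop above over records.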
If you need to do any more complex date calculations, I highly suggest checking out datejs (which you can get a node wrapper for if you want).