MongoDB TTL index does not remove an expired record - javascript

I created a TTL index on my collection's created_at field to auto-remove documents after 30 seconds, but even after some time had passed (5 minutes later) the document still hadn't been removed. I read some related questions and I'm sure there is no typo in my collection.
Here is the JSON after running db[collectionName].getIndexes():
[
    {
        "v" : 2,
        "key" : {
            "_id" : 1
        },
        "name" : "_id_",
        "ns" : "record_db.collectionName"
    },
    {
        "v" : 2,
        "key" : {
            "created_at" : 1.0
        },
        "name" : "created_at_1",
        "ns" : "record_db.collectionName",
        "expireAfterSeconds" : 30.0
    }
]
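(For reference, a TTL index like the second one is typically created with something along these lines:

db.collectionName.createIndex({ created_at: 1 }, { expireAfterSeconds: 30 })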
And here is an example document:
{
    "_id" : ObjectId("5fc3823af29f9d9eb1518857"),
    "record_path" : "path/to/file",
    "timestamp" : 1606648392,
    "humantime" : "29/11/2020 18:12:58",
    "created_at" : ISODate("2020-11-29T18:12:58.859Z")
}

ISODate("2020-11-29T18:12:58.859Z") hasn't happened yet. MongoDB stores dates in UTC, and the TTL monitor (which runs roughly every 60 seconds) only removes documents whose indexed date is more than expireAfterSeconds in the past. Your created_at looks like a local timestamp written as if it were UTC, so from the server's point of view it is still in the future:
const iso = "2020-11-29T18:12:58.859Z";
const date = new Date(iso);
console.log(date.toString()); // shows the stored instant in your local time zone: it is still in the future
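If the goal is to expire documents 30 seconds after insertion, a minimal sketch (using the collection name from your getIndexes() output) is to store the current time directly, since new Date() already represents the current UTC instant:

db.collectionName.insertOne({
    record_path: "path/to/file",
    created_at: new Date()   // the TTL monitor compares this UTC instant against the current time
});
// With { expireAfterSeconds: 30 } on created_at, the background TTL task
// (which runs about once a minute) removes the document shortly after 30 seconds.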

Related

change only one field of entire array of embedded document in mongoose

I have a list schema and a questionSet schema. The questionSet schema is embedded inside the list schema. It's working fine, but how can I update a field inside the array of embedded documents, i.e. here I want to change the listname of all the documents inside questionSet (the array of questionSet documents)?
Here is an example of my list document model:
{ "_id" : ObjectId("60f2cc07275bbb30d8cb268e"),
"listName" : "dsa",
"aboutList" : "dsa queestions",
questionSet" : [ { "solved" : false,
"_id" : ObjectId("60f2cc12275bbb30d8cb2695"),
"topic" : "array",
"name" : "array is best",
"url" : "www.arr.com",
"listname" : "dsa",
"__v" : 0 },
{ "solved" : false,
"_id" : ObjectId("60f2cc1b275bbb30d8cb269d"),
"topic" : "linked list",
"name" : "reverse list",
"url" : "www.list.com",
"listname" : "dsa",
"__v" : 0 }
],
"__v" : 2
}
You can use the following in your case; the all-positional operator $[] applies the $set to every element of the questionSet array:
db.<collection_name>.updateOne(
    { "_id" : ObjectId("60f2cc07275bbb30d8cb268e") },
    {
        $set: {
            'questionSet.$[].listname': "javascript"
        }
    }
)
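If you ever need to update only some elements instead of all of them, a filtered positional operator with arrayFilters works the same way. A sketch, where the "q.topic": "array" condition is just an example:

db.<collection_name>.updateOne(
    { "_id" : ObjectId("60f2cc07275bbb30d8cb268e") },
    { $set: { "questionSet.$[q].listname": "javascript" } },
    // only array elements matching this filter are updated
    { arrayFilters: [ { "q.topic": "array" } ] }
)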

MongoDB - Do I need an index on a second field when using deleteOne

I have a simple collection, events:
{
    "_id" : ObjectId("5e2a9bb9dcb646448f9409b3"),
    "year" : 2020,
    "employee_id" : "5e1afe5ab7bad92b20365476",
    "event" : ["Holidays"],
    "total" : 21,
    "used" : 1
}
and I don't want to be able to delete documents that have a used field greater than 0.
I use this:
db.collection('events').deleteOne({_id: ObjectId("5e2a9bb9dcb646448f9409b3"), used: 0});
Do I need to set an index on the used field if I already use _id?
Thanks
Not necessary. MongoDB already uses the _id index (the unique index it creates automatically) to locate the document; the used: 0 condition is then applied as a filter on the fetched document, as the explain output shows:
db.collection('events').find({_id: ObjectId("5e2a9bb9dcb646448f9409b3"), used: 0}).explain();
"winningPlan" : {
"stage" : "FETCH",
"filter" : {
"used" : {
"$eq" : 0.0
}
},
"inputStage" : {
"stage" : "IXSCAN",
"keyPattern" : {
"_id" : 1
},
"indexName" : "_id_",
"isMultiKey" : false,
"multiKeyPaths" : {
"_id" : []
},
"isUnique" : true,
"isSparse" : false,
"isPartial" : false,
"indexVersion" : 2,
"direction" : "forward",
"indexBounds" : {
"_id" : [
"[ObjectId('5e2a9bb9dcb646448f9409b3'), ObjectId('5e2a9bb9dcb646448f9409b3')]"
]
}
}
}
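A minimal sketch of the same check from the Node.js driver (the connection string and database name are assumptions):

const { MongoClient, ObjectId } = require("mongodb");

async function checkPlan() {
    const client = new MongoClient("mongodb://localhost:27017"); // assumed connection string
    await client.connect();
    const events = client.db("mydb").collection("events"); // "mydb" is assumed
    const explanation = await events
        .find({ _id: new ObjectId("5e2a9bb9dcb646448f9409b3"), used: 0 })
        .explain();
    // Expect an IXSCAN on _id_ with `used` applied as a FETCH filter, as above
    console.log(JSON.stringify(explanation.queryPlanner.winningPlan, null, 2));
    await client.close();
}

checkPlan();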

How to lazy load with angular using mongodb data

I want to implement lazy loading in angular.js. I am sending the list of data from the backend to the UI using Node.js, and I need to load 10 items at a time on scroll. Are there any examples of how to achieve this? Please share any links. Can anybody help me with this?
Lazy loading has nothing to do with the DB itself; it belongs in the DAO layer, whereas the DB is only concerned with returning the data for the query submitted to it.
My approach to achieving lazy loading from the UI is pagination:
1) Find the total number of documents in your collection.
2) Each time you load the next set of data, tell the DB from which document it should start returning results.
3) Repeat step 2 until you reach the total number of documents in your collection.
An example: let us take a collection with a few records.
db.mycollection.find();
{ "_id" : ObjectId("58947e7e93cbb73057657d60"), "name" : "Clement" }
{ "_id" : ObjectId("58947e7e93cbb73057657d61"), "name" : "Rockin" }
{ "_id" : ObjectId("58947e7e93cbb73057657d62"), "name" : "Gowri" }
{ "_id" : ObjectId("58947e7e93cbb73057657d63"), "name" : "Inbaraj" }
{ "_id" : ObjectId("58947e7e93cbb73057657d64"), "name" : "Siva" }
{ "_id" : ObjectId("58947e7e93cbb73057657d65"), "name" : "Rani" }
{ "_id" : ObjectId("58947e7e93cbb73057657d66"), "name" : "Rose" }
{ "_id" : ObjectId("58947e7e93cbb73057657d67"), "name" : "Rekha" }
{ "_id" : ObjectId("58947e7e93cbb73057657d68"), "name" : "Dev" }
{ "_id" : ObjectId("58947f6f93cbb73057657d69"), "name" : "Joe" }
{ "_id" : ObjectId("58947f8393cbb73057657d6a"), "name" : "Beniton" }
Prerequisite for doing pagination
db.mycollection.find().count()
11
Let the initial load size be 5.
My first query to the DB would be:
db.mycollection.find().sort({"_id":1}).limit(5);
{ "_id" : ObjectId("58947e7e93cbb73057657d60"), "name" : "Clement" }
{ "_id" : ObjectId("58947e7e93cbb73057657d61"), "name" : "Rockin" }
{ "_id" : ObjectId("58947e7e93cbb73057657d62"), "name" : "Gowri" }
{ "_id" : ObjectId("58947e7e93cbb73057657d63"), "name" : "Inbaraj" }
{ "_id" : ObjectId("58947e7e93cbb73057657d64"), "name" : "Siva" }
My next query to the DB:
db.mycollection.find().sort({"_id":1}).skip(5).limit(5);
{ "_id" : ObjectId("58947e7e93cbb73057657d65"), "name" : "Rani" }
{ "_id" : ObjectId("58947e7e93cbb73057657d66"), "name" : "Rose" }
{ "_id" : ObjectId("58947e7e93cbb73057657d67"), "name" : "Rekha" }
{ "_id" : ObjectId("58947e7e93cbb73057657d68"), "name" : "Dev" }
{ "_id" : ObjectId("58947f6f93cbb73057657d69"), "name" : "Joe" }
The final query would be:
db.mycollection.find().sort({"_id":1}).skip(10).limit(5);
{ "_id" : ObjectId("58947f8393cbb73057657d6a"), "name" : "Beniton" }
In this example, a sort on _id is used. Since the default ObjectId encodes the insertion time, this gives a stable ordering of the documents across the subsequent queries.
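On the Node.js side, a minimal sketch of such a paginated endpoint (assuming Express and the official MongoDB driver; the /items route, database name, and page size are made up) could look like this, with the Angular side simply requesting the next page number whenever the user scrolls near the bottom:

const express = require("express");
const { MongoClient } = require("mongodb");

const app = express();
const client = new MongoClient("mongodb://localhost:27017"); // connection string is an assumption

// GET /items?page=0 returns documents 10 at a time, sorted by _id
app.get("/items", async (req, res) => {
    const page = parseInt(req.query.page, 10) || 0;
    const pageSize = 10;
    const docs = await client.db("mydb").collection("mycollection")
        .find({})
        .sort({ _id: 1 })
        .skip(page * pageSize)
        .limit(pageSize)
        .toArray();
    res.json(docs);
});

client.connect().then(() => app.listen(3000));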
Hope it Helps!
References:
https://www.mongodb.com/blog/post/the-mean-stack-mongodb-expressjs-angularjs-and
Lazy Loading/More Data Scroll in Mongoose/Nodejs
http://adrichman.github.io/implementing-a-lazy-load-and-infinite-scroll-in-angularjs/

Meteor find is returning null with variable [duplicate]

This question already has an answer here:
Updating template with session variable and subscription/publication only updates with previous query and appends information
(1 answer)
Closed 7 years ago.
Description of the problem:
I have two collections, videos and specs. The videos collection has a key called spec which corresponds to a spec's id. Neither collection is empty.
My Template Helper:
Template.List.helpers({
    videos: function () {
        var vids = Videos.find({ online: false }).fetch();
        return vids.map(function (value) {
            console.log("id: " + value.spec);
            console.log(Specs.find({ id: value.spec }).fetch());
            // Issue here
            value.specName = Specs.find({ id: value.spec }).fetch()[0].name;
            return value;
        });
    }
});
As you can see, in my template helper I loop through the videos array and add specName to each element.
The main problem comes from this specific line:
value.specName = Specs.find({ id: value.spec }).fetch()[0].name;
and more specifically this line: Specs.find({ id: value.spec }).fetch()
which will return null every time.
What I've tried:
Naturally, my first thought was to check what value.spec returns. It returns an int between 1 and 15 (inclusive), which is right. If value.spec is right, then how come find() doesn't return anything?
I then decided to hard-code it and tried this:
Specs.find({ id: 2 }).fetch()
And this worked, which is weird because on multiple occasions value.spec returns 2...
I've also tried parseInt(value.spec) just in case; however, that didn't work either.
Question
Why does Specs.find({ id: value.spec }).fetch() return nothing when value.spec is set correctly and a hard-coded number works?
Requested JSON data (from meteor mongo):
specs:
{ "_id" : "XKXHtQuiFsAew3dDy", "id" : 1, "name" : "Endocrine surgery" }
{ "_id" : "68jFidAMXTXpQtQye", "id" : 2, "name" : "General and digestive" }
{ "_id" : "GZSXToRXMfJgnH3CY", "id" : 3, "name" : "Pediatric surgery" }
{ "_id" : "T2mBz2gsXEqQaybmq", "id" : 4, "name" : "Thoracic surgery" }
{ "_id" : "hnuQzZiPKvYYDZhc8", "id" : 5, "name" : "Equipment" }
{ "_id" : "byE3A6HchvfhKdmR8", "id" : 6, "name" : "Gynecology" }
{ "_id" : "u5rrPB7asGW3NC6B2", "id" : 7, "name" : "Urology" }
{ "_id" : "umxKvR66oEx5dRppf", "id" : 8, "name" : "Cardiovascular surgery" }
{ "_id" : "bPcBTZn3t5ubRRcrQ", "id" : 9, "name" : "Endoscopic surgery" }
{ "_id" : "yNyAqQPoreNtdRZ34", "id" : 10, "name" : "NOTES" }
{ "_id" : "KG794eakRaztEqehG", "id" : 11, "name" : "Robotic surgery" }
{ "_id" : "QBrtvTg4GT7Tf7cAJ", "id" : 12, "name" : "Skull base surgery" }
{ "_id" : "HEhq6oBjuuMnrxE5a", "id" : 13, "name" : "Arthroscopy and upper limb surgery" }
{ "_id" : "xwpgHqZpBQP7WAnd5", "id" : 14, "name" : "Single port surgery" }
{ "_id" : "K4BgFupwNdDGD3449", "id" : 15, "name" : "Telemicrosurgery" }
videos:
{ "_id" : "L5Qi7YRRhn6Sfcjk8", "id" : "vd01en1065e", "title" : "Right inguinal hernia: open plug technique", "authors" : [ "E Pelissier" ], "date_published" : "2004-09-27", "abstract" : "", "tags" : [ "" ], "spec" : 2, "private" : true, "online" : false }
{ "_id" : "M8cuLW6KNCqKeP9vF", "id" : "vd01en1074e", "title" : "Laparoscopic splenectomy, posterior approach", "authors" : [ "D Mutter", " F Rubino" ], "date_published" : "2004-09-27", "abstract" : "", "tags" : [ "" ], "spec" : 2, "private" : true, "online" : false }
{ "_id" : "Ptzrxw8GifeMvQk9k", "id" : "vd01en1090e", "title" : "Intussusception of the intestine in the newborn", "authors" : [ "F Becmeur", " D Christmann", " I Kauffmann" ], "date_published" : "2004-09-27", "abstract" : "", "tags" : [ "" ], "spec" : 3, "private" : true, "online" : false }
{ "_id" : "oHWcX3vCBHuZQM9hR", "id" : "vd01en1103e_2", "title" : "Appendicular peritonitis: laparoscopic conversion", "authors" : [ "B Navez" ], "date_published" : "2001-11-05", "abstract" : "", "tags" : [ "" ], "spec" : 2, "private" : true, "online" : false }
{ "_id" : "6uzmxYxhd5DDuS2gG", "id" : "vd01en1108e", "title" : "Diaphragmatic hernias", "authors" : [ "F Becmeur" ], "date_published" : "2001-11-28", "abstract" : "", "tags" : [ "" ], "spec" : 3, "private" : true, "online" : false }
{ "_id" : "yHqruiQYeeQ9SDHpH", "id" : "vd01en1112e", "title" : "Laparoscopic excision of the cystic stump", "authors" : [ "J Leroy" ], "date_published" : "2004-09-27", "abstract" : "", "tags" : [ "" ], "spec" : 2, "private" : true, "online" : false}
{ "_id" : "fmjtk5WAEKitMxyGj", "id" : "vd01en1114e", "title" : "Laparoscopic gastric banding in a patient with a BMI of 40", "authors" : [ "JM Zimmermann", " D Fölscher" ], "date_published" : "2004-09-27", "abstract" : "", "tags" : [ "" ], "spec" : 2, "private" : true, "online" : false}
I've been stuck on this problem for a couple of hours. I didn't want to post this on SO since I believe it's a simple problem; however, it is mind-boggling.
The problem here is a publication/subscription issue. You didn't mention how you handle publications and subscriptions, but that is most likely the cause. When you query your videos collection, its subscription is ready (since you get data back from Videos.find({ online: false })), but that does not necessarily mean that, at that precise moment, the subscription handling the Specs collection is ready as well. So even if the query works on the server, on the client it returns nothing, because the data has not been synced between client and server yet. You have to wait until both subscriptions are ready somehow; you can use template-level subscriptions, or the waitOn function in your router.
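For example, a minimal sketch with template-level subscriptions (the publication names 'videos' and 'specs' are assumptions; use whatever your server actually publishes):

Template.List.onCreated(function () {
    // Subscribe to both collections this template needs
    this.subscribe('videos');
    this.subscribe('specs');
});

Template.List.helpers({
    videos: function () {
        // Don't run the join until both subscriptions have delivered their data
        if (!Template.instance().subscriptionsReady()) {
            return [];
        }
        return Videos.find({ online: false }).map(function (value) {
            var spec = Specs.findOne({ id: value.spec });
            value.specName = spec ? spec.name : null;
            return value;
        });
    }
});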
First of all, check on the server side that your publication is actually returning data, for example like this:
Meteor.publish('videos', function () {
    var result = Videos.find({});
    console.log('videos published:', result.count()); // should log the number of matching documents on the server
    return result;
});
If the console.log shows records on the server side, then the solution to your problem is this:
Template.List.helpers({
    videos: function () {
        var vids = Videos.find({ online: false }).map(function (value) {
            console.log("id: " + value.spec);
            var idValue = Specs.findOne({ "id": value.spec });
            console.log("idValue ===", idValue);
            return _.extend(value, idValue);
        });
        console.log(vids); // the merged data of both collections, with the relevant
                           // spec fields attached, is returned here
        return vids;
    }
});
Just implement it and check the data in your template; it will be ready for you. And vote for me then :)

Unable to print BSON object from javascript

My MongoDB collection looks like this:
{
    "_id" : ObjectId("5070310e0f3350482b00011d"),
    "emails" : [
        {
            "_id" : ObjectId("5070310e0f3350482b000120"),
            "_type" : "Email",
            "name" : "work",
            "email" : "peter.loescher#siemens.com",
            "current" : true
        }
    ]
}
and this is the .js code I use to print the contents:
c = db.contacts.findOne( { "emails.email" : { $ne : null } }, { "emails" : 1 } )
print(c._id.toString() + " " + c.emails[0]);
When I run this JavaScript file, it displays only the id, not the emails array.
Output:
5070310e0f3350482b00011d [object bson_object]
But when I try c.emails[0].email it gives the proper result, i.e. peter.loescher#siemens.com.
All I need is to display the whole embedded emails array, i.e.
"emails" : [
{
"_id" : ObjectId("5070310e0f3350482b000120"),
"_type" : "Email",
"name" : "work",
"email" : "peter.loescher#siemens.com",
"current" : true
}
]
Where am I going wrong? Any help would be appreciated.
You need printjson to output nicely formatted JSON:
printjson(c.emails[0]);
Here is the documentation.
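To print the whole embedded emails array (which is what you asked for), the same helper works on the array itself; tojson is an alternative if you need the result as a string:

printjson(c.emails);        // pretty-prints the whole array
print(tojson(c.emails));    // tojson returns the same representation as a string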
