Duplicate key error while using _id from another collection - javascript

I have 2 collections so far.
The first is a 'Users' collection (that works well), and the other is a 'Rooms' collection (both created for a chat application).
I want every room to have a "users" array containing the user._id of every user in that room,
meaning I should be able to put the same user._id (from the Users collection) in any number of rooms, right?
After creating a room successfully with 2 user._ids in the "users" array,
I tried making another one using one of the user._ids I used in the first room.
Then I got this error:
MongoError: E11000 duplicate key error collection: ratchat.rooms index: users_1 dup key:
{ users: "5fe08d452f34530e641d8f8c" }
After checking with a debugger I've found that the error occurs only when I use a user._id that is already used in another room's "users" array.
The only thing I could think of that could cause this problem is the Room schema,
maybe there's something I missed while reading the docs...
At first my Room schema looked like this:
const roomSchema = new mongoose.Schema({
  users: [String],
  hasLeft: [String],
  isGroup: { type: Boolean, default: false },
});
const Room = mongoose.model("Room", roomSchema);
Then I thought maybe MongoDB needs to know that the ObjectIds in the users array are just references to another collection:
const roomSchema = new mongoose.Schema({
  users: [{ type: mongoose.Schema.Types.ObjectId, ref: "User" }],
  hasLeft: [String],
  isGroup: { type: Boolean, default: false },
});
const Room = mongoose.model("Room", roomSchema);
No luck so far...

{autoIndex: false}
After some research, I found the reason for this error: mongoose automatically creates indexes.
This is not only the cause of the duplicate key error; it can also have a significant performance impact later in production.
According to the mongoose docs, you can easily disable this behavior by setting the autoIndex option of your schema to false, or globally on the connection by setting its autoIndex option to false:
mongoose.connect('mongodb://user:pass@localhost:port/database', { autoIndex: false });
// or
mongoose.createConnection('mongodb://user:pass@localhost:port/database', { autoIndex: false });
// or
animalSchema.set('autoIndex', false);
// or
new Schema({..}, { autoIndex: false });
Don't forget to drop the entire collection before trying again.
Because the index has already been built on the collection, just emptying it won't remove the index.
You have to drop the collection itself.

Related

Refresh Tokens in MongoDB

What is the best way to remove a refresh token from MongoDB automatically?
On login the user is given a temporary auth token which lasts 30 seconds. They are also given a permanent token, stored in MongoDB, which currently lasts until they log out.
I want to remove the permanent one at the end of every day, but I'm not sure how to do that without having a cron job running (to monitor what time it is). This seems a bit complex for such a small task. Is there a way mongo would know what the time is and then remove the refresh token?
This is how the token collection looks:
Thank you
To auto-delete MongoDB documents after some time, you should use the TTL (time-to-live) index feature documented here.
Essentially, you need to create an index on the collection that stores the token documents. For your use case, you can do something like this:
// This deletes a token document 3600 seconds (1 hour) after creation;
// tweak the time as you wish.
db.tokens.createIndex({ "createdAt": 1 }, { expireAfterSeconds: 3600 });
In Node.js with mongoose, simply create a model for each token:
const mongoose = require('mongoose')
const Schema = mongoose.Schema

const tokenSchema = new Schema({
  _userId: {
    type: Schema.Types.ObjectId,
    required: true,
    ref: 'user'
  },
  token: {
    type: String,
    required: true
  },
  expireAt: {
    type: Date,
    default: Date.now,
    index: { expires: 60 * 60 * 24 }
  }
})

module.exports = mongoose.model('tokens', tokenSchema)

How to make strict associations in Sails.js

Here are my two models that were generated using sails generate api model_name
Guitar.js
module.exports = {
  schema: true,
  attributes: {
    brand: {
      type: 'string',
      required: true
    },
    messages: {
      collection: 'message',
      via: 'guitar'
    }
  }
};
Message.js
module.exports = {
  schema: true,
  attributes: {
    text: {
      type: 'string',
      required: true
    },
    author: {
      type: 'string',
      defaultsTo: 'Anonymous author'
    },
    guitar: {
      model: 'guitar',
      required: true
    }
  }
};
Basically, a guitar can have many messages.
The problem comes when I insert new messages into the DB:
POST http://localhost:1337/message/
With JSON content:
{
  "text": 4,
  "author": "33434",
  "guitar": null,
  "extra": "This attribute will be removed because schema: true"
}
If I send this, the server will throw an error because the message must have a guitar.
However, if I instead send "guitar": 34, where 34 is a guitar ID that doesn't exist, the message is added anyway and guitar is changed to null. Weird.
This seems to be a bug, or maybe it's the intended behaviour with a NoSQL database in mind.
I need to make strict associations so that all data makes sense. I hope I don't have to create my own controller manually that handles this logic the way I want.
Basically what I want is that Sails throws an error when the ID of the association doesn't exist.
As I write this, I've come up with one solution: configure this on the DB server (MySQL, etc.) so that the foreign key must exist, and then handle the error in Sails. But I'm not very happy with it, since it depends on the DB server; I'd like this to work even with localDiskDb.
My other solution would be to manually write a controller and see what happens using something like Guitar.find(id).add(new_message) (maybe it's wrong, I haven't tested this).
I'm not sure about this, but have you tried something like:
http://localhost:1337/message/create?text=4&author=3343&guitar=somerandomString
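Since Waterline itself won't verify the foreign id, another workaround is a `beforeCreate` lifecycle callback on `Message` that looks the guitar up first. Below is a minimal sketch of that check, written as a plain function so the lookup can be injected; `validateGuitarExists` and `findGuitarById` are hypothetical names standing in for a `Guitar.findOne(...).exec(...)` call:

```javascript
// Hypothetical helper (not a Sails built-in): reject creation when the
// referenced guitar id does not resolve to a record. In Sails this would be
// invoked from Message's beforeCreate(values, next) lifecycle callback.
function validateGuitarExists(findGuitarById, guitarId, next) {
  findGuitarById(guitarId, function (err, guitar) {
    if (err) return next(err);
    if (!guitar) return next(new Error('Guitar ' + guitarId + ' does not exist'));
    return next();
  });
}

// Stubbed usage: pretend only guitar id 1 exists.
const fakeLookup = (id, cb) => cb(null, id === 1 ? { id: 1, brand: 'Fender' } : null);

validateGuitarExists(fakeLookup, 34, function (err) {
  console.log(err.message); // "Guitar 34 does not exist"
});
validateGuitarExists(fakeLookup, 1, function (err) {
  console.log(err); // undefined — safe to proceed with the create
});
```

This keeps the referential check in the model layer, so it works with any adapter, including localDiskDb.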

Compound index which overwrites the previous document

I am trying to create a model for MongoDB using mongoose, where I want to ensure that only one document exists for a particular user and file.
var FileStatusSchema = new mongoose.Schema({
  file: mongoose.Schema.Types.ObjectId,
  user: mongoose.Schema.Types.ObjectId,
  hasSeen: { type: Boolean, default: false }
})

FileStatusSchema.index({ file: 1, user: 1 }, { unique: true })
Now, if I try to save a document with a combination of file and user which already exists, it raises a duplicate key error.
Is there some way with which I can configure MongoDB to overwrite the document rather than raising an exception?
If the document doesn't exist, this command will create a new one:
collection.updateOne(
  { file: 2112, user: 21421 },
  { $set: { hasSeen: true } },
  { upsert: true }
);

Mongoose/Node server restart and duplicates

OK, so after a ton of trial and error, I've determined that when I drop a collection and then recreate it through my app, unique doesn't work until I restart my local node server. Here's my schema:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var Services = new Schema({
  type: { type: String },
  subscriptionInfo: Schema.Types.Mixed,
  data: Schema.Types.Mixed
}, { _id: false });

var Hashtags = new Schema({
  name: { type: String },
  services: [Services]
}, { _id: false });

var SubscriptionSchema = new Schema({
  eventId: { type: String, index: { unique: true, dropDups: true } },
  hashtags: [Hashtags]
});

module.exports = mongoose.model('Subscription', SubscriptionSchema);
And here's my route...
router.route('/')
  .post(function(req, res) {
    var subscription = new subscribeModel();
    subscription.eventId = eventId;
    subscription.save(function(err, subscription) {
      if (err)
        res.send(err);
      else
        res.json({
          message: subscription
        });
    });
  });
If I drop the collection and then hit the /subscribe endpoint seen above, it will create the entry but will not enforce the unique constraint. It's not until I restart the server that it starts enforcing it. Any ideas why this is? Thanks!
When your application starts and mongoose initializes, it scans the schema definitions of the registered models and calls the .ensureIndexes() method for the indexes they define. This is "by design" behavior and is also covered by this statement:
When your application starts up, Mongoose automatically calls ensureIndex for each defined index in your schema. While nice for development, it is recommended this behavior be disabled in production since index creation can cause a significant performance impact. Disable the behavior by setting the autoIndex option of your schema to false.
So your general options here are:
Don't "drop" the collection and call .remove() which leaves the indexes intact.
Manually call .ensureIndexes() when you issue a drop on a collection in order to rebuild them.
The warning in the document is generally that creating indexes for large collections can take some time and take up server resources. If the index exists this is more or less a "no-op" to MongoDB, but beware of small changes to the index definition which would result in creating "additional" indexes.
As such, it is generally best to have a deployment plan for production systems where you determine what needs to be done.
This post seems to argue that indexes are not re-built when you restart: Are MongoDB indexes persistent across restarts?

Ratings not updating as expected in Mongo

I'm trying to implement a rating system along the lines of upvotes/downvotes.
Users can vote on a lesson only once. They can switch their votes between up and down. Voting the same as their previous vote removes their vote.
I'm trying to accomplish this with pull(), but it empties out the entire ratings array, including other users' votes.
Rating Schema
var RatingSchema = new Schema({
  rating: Boolean,
  user: {
    type: Schema.ObjectId,
    ref: 'User'
  }
});
Lesson Schema
var LessonSchema = new Schema({
  ...,
  ratings: [RatingSchema]
});
Problem code
// assuming lesson.ratings looks like this
[{ user: 123..., rating: true },
 { user: 321..., rating: true }];

// assuming lesson was loaded from a query
lesson.ratings.pull({ user: 123... });

// resulting ratings
[]
I don't know if this is expected behavior, but I just want to remove the matching rating, not all of the subdocs.
Found a working solution.
The problem was the way the rating schema was originally defined and how save() works:
Original Rating Schema
var Ratings = new Schema({
  _id: false,
  rating: Boolean,
  user: {
    type: Schema.ObjectId,
    ref: 'User'
  }
});
I thought the user ref would be unique enough to work with when modifying ratings.
I still don't know why pull() deleted all subdocs regardless of user id.
When it was time to save, because of the missing _id, mongoose updated the entire ratings array, overwriting it altogether.
To fix this, instead of using user to target the update, I switched to using _id.
Also, be careful when toggling the _id flag on and off: the driver will throw undefined ObjectId errors.
