I am trying to migrate some data from an old MongoDB schema to a new one. All the schema stuff is working fine. What I cannot understand is why the old documents are not being converted before being saved as new documents. I am reading all the documents, using a map function to convert them to the new schema, then saving them. But they are all failing validation because, it turns out, they haven't been converted to the new schema at all. Is this an async issue? Any clues would be great.
let User = require('./api/models/user.model');
let newUser;

let mapUsers = () => {
  let makeUser = (u) => {
    return {
      firstName: u.first_name,
      lastName: u.last_name,
      avatarUrl: u.avatar_url,
      email: u.email,
      loginCount: u.login_count,
      loginTime: u.login_time,
      logoutTime: u.logout_time
    }
  };

  h2User.find({}).limit(1).exec((err, users) => {
    if (err) {
      console.error(err);
    } else {
      users.map(user => {
        newUser = new User(makeUser(user)); // like this doesn't work
        newUser.save((err, nu) => {
          if (err) {
            console.log(err);
          } else {
            console.log(nu._id)
          }
        });
      });
    }
  });
};

mapUsers();
You would have to convert the Mongoose document into a plain object with new User(makeUser(user.toObject())).
Because Mongoose returns Document instances, they carry other internal properties that may not be apparent. When you do console.log(user) it usually prints roughly the output of toObject(), which is why it can get confusing.
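For reference, a minimal sketch of the migration loop with that change applied (assuming h2User is the model for the old-schema collection, as in the question):

h2User.find({}).exec((err, users) => {
  if (err) return console.error(err);
  users.forEach(user => {
    // toObject() gives a plain object, so u.first_name etc. are real properties
    const newUser = new User(makeUser(user.toObject()));
    newUser.save((err, nu) => {
      if (err) console.error(err);
      else console.log(nu._id);
    });
  });
});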
const Location = require("../models/locations");

getLocation = async (req, res) => {
  await Location.findOne(
    { name: req.query.locationName }, // req.query.locationName is "Gurgaon"
    { restaurantIds: 1 },
    (err, location) => {
      if (err) throw err;
      else {
        console.log(location);
        /*
        {
          _id: 6004f9cff6ae9921f89f0f81,
          restaurantIds: [ 6004fb53f6ae9921f89f0f83, 600792321b229bae25a66497 ]
        }
        */
        console.log(location._id); // 6004f9cff6ae9921f89f0f81
        console.log(location.restaurantIds); // undefined
        return location;
      }
    }
  );
}

module.exports = { getLocation };
Screenshot of the output
This is what the locations collection looks like.
{
  "_id" : ObjectId("6004f9cff6ae9921f89f0f81"),
  "name" : "Gurgaon",
  "restaurantIds" : [
    ObjectId("6004fb53f6ae9921f89f0f83"),
    ObjectId("600792321b229bae25a66497")
  ]
}
Here is the locations schema.
const mongoose = require("mongoose");
const Schema = mongoose.Schema;
const Locations = new Schema({}, { strict: false });
module.exports = mongoose.model("locations", Locations);
I don't know why location.restaurantIds is returning undefined. Please help me with this; I am new to MongoDB.
One likely reason is that you have not specified this field in your Mongoose schema. Add the field to your schema and you will be able to access it on the query results (see the sketch after the lean() example below).
A second option, if you don't want to specify that field in the schema, is to use lean():
By default, Mongoose queries return an instance of the Mongoose Document class. Enabling the lean option tells Mongoose to skip instantiating a full Mongoose document and just give you the POJO.
await Location.findOne(.. your query).lean();
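As an illustration of the first option, here is a sketch of the locations schema with the field declared (the field type is assumed from the ObjectIds stored in the collection shown above):

const mongoose = require("mongoose");
const Schema = mongoose.Schema;

// Declaring restaurantIds lets Mongoose expose it on returned documents
const Locations = new Schema({
  name: String,
  restaurantIds: [Schema.Types.ObjectId]
}, { strict: false });

module.exports = mongoose.model("locations", Locations);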
restaurantIds is a nested object, so you must populate it:
const Location = require("../models/locations");

getLocation = async (req, res) => {
  await Location.findOne(
    { name: req.query.locationName },
    { restaurantIds: 1 })
    .populate('restaurantIds')
    .then(location => {
      console.log(location);
      console.log(location._id);
      console.log(location.restaurantIds);
      return location;
    })
    .catch(err => {
      throw err;
    });
}
module.exports = { getLocation };
It looks like your restaurantIds is an array, so when you print it you must treat it as an array. Start by changing:
console.log(location.restaurantIds);
to:
console.log(location.restaurantIds[0]);
In this example you will print the first object in the array; use it in your code as an array and it will be OK.
Edit:
Now that the restaurantIds have been added to the question, we know it's an array of ObjectIds, and that kind of array cannot be printed directly.
I would like to know how to assign a date object in this scenario. I need to update lastUpdated whenever the user changes their details.
I also tried Object.assign(user.lastUpdated, new Date());
exports.edit = (req, res, next) => {
  const userid = req.params.id;

  const errorHandler = (error) => {
    next(error);
  };

  const updateUser = (user) => {
    Object.assign(user, req.body);
    Object.assign(user.lastUpdated, new Date()); // not working
    user.lastUpdated = new Date(); // not able to save this in database
    user.save().then(() => {
      res.json({
        uid: user.id,
        username: user.username,
        displayName: user.displayName,
        password: user.password,
        lastUpdated: user.lastUpdated // result should be last updated Date
      });
    }).catch(errorHandler);
  };
};
The Object.assign() method is used to copy the values of all enumerable own properties from one or more source objects to a target object. It will return the target object. ( https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/assign ).
But in your code, Object.assign(user.lastUpdated, new Date()); is trying to join two values together, so it won't work.
Try it like this: Object.assign(user, { lastUpdated: new Date() });
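A minimal sketch of the update handler with that change applied, based on the code in the question (the user lookup is left out, as in the original):

exports.edit = (req, res, next) => {
  const updateUser = (user) => {
    // combine the two assignments from the question into one call
    Object.assign(user, req.body, { lastUpdated: new Date() });
    user.save().then(() => {
      res.json({
        uid: user.id,
        username: user.username,
        displayName: user.displayName,
        lastUpdated: user.lastUpdated
      });
    }).catch(next);
  };
  // ... look up the user by req.params.id and call updateUser(user), as in the original
};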
I am currently using Mongoose; however, all these hidden keys are driving me crazy and disrupting my workflow when they pop up out of nowhere. Here is my code - it simply logs the docs from the find function:
const mongoose = require('mongoose')
const Kitten = mongoose.model('Kitten', mongoose.Schema({ name: String }));

mongoose.connect('mongodb://localhost/test')
mongoose.connection.on('error', console.log)
mongoose.connection.once('open', function() {
  var fluffy = new Kitten({ name: 'fluffy' })
  fluffy.save((err, fluffy) => {
    if (err) return console.error(err);
    Kitten.find({}, (err, docs) => {
      for (var i = 0; i < docs.length; ++i) {
        const doc = docs[i]
        console.log('Object.getOwnPropertyNames ', Object.getOwnPropertyNames(doc))
        console.log('Object.keys ', Object.keys(doc))
        console.log(doc)
        console.log('--')
      }
    })
  })
})
And one of the docs that's logged is
Why are the keys shown by console.log present in neither Object.keys nor Object.getOwnPropertyNames? The console.log output is the one that reflects what's actually in the MongoDB document.
Edit: Edited to use more reasonable code
docs is a list of Mongoose document objects. They don't expose the fields as enumerable own properties; instead, accessors are defined that make them available as doc.fieldName.
There are document toObject and toJSON methods to convert a document object to a plain object when needed.
The actual problem here is that since document objects aren't needed, they shouldn't be queried in the first place. Plain objects can be retrieved with lean:
Kitten.find({}).lean().exec((err, docs) => {
  for (var i = 0; i < docs.length; ++i) {
    const doc = docs[i]
    ...
  }
});
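And a quick sketch of the toObject route mentioned above, for cases where you do want Mongoose documents but need plain keys for logging (names taken from the question's code):

Kitten.find({}, (err, docs) => {
  if (err) return console.error(err);
  // toObject() returns a plain object whose own keys are the schema fields
  const plain = docs.map(doc => doc.toObject());
  console.log('Object.keys ', Object.keys(plain[0]));
});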
If you go through the native MongoDB driver, the results you get from "find" are cursors. You need to use "toArray" to load the documents into memory as plain objects.
const mongoose = require('mongoose')
const Kitten = mongoose.model('Kitten', mongoose.Schema({ name: String }));

mongoose.connect('mongodb://localhost/test')
mongoose.connection.on('error', console.log)
mongoose.connection.once('open', function() {
  var fluffy = new Kitten({ name: 'fluffy' })
  fluffy.save((err, fluffy) => {
    if (err) return console.error(err);
    // Kitten.collection is the underlying native-driver collection; its find() returns a cursor
    Kitten.collection.find({}).toArray().then(docs => {
      for (const doc of docs) {
        console.log('Object.getOwnPropertyNames ', Object.getOwnPropertyNames(doc))
        console.log('Object.keys ', Object.keys(doc))
        console.log(doc)
        console.log('--')
      }
    })
  })
})
I'm building a GraphQL server where I need to do some validation before committing data to the database (MongoDB with Mongoose).
One of these checks is related to unique fields. A model may have one or more unique fields, and I need to be able to check for that before saving to the database.
So I have built some helper functions to do it; the code is below.
Helper code:
import mongoose from "mongoose";

const isFieldUnique = (modelName, fieldName, fieldValue) => {
  let model = mongoose.model(modelName);
  let query = {};
  query[fieldName] = fieldValue;
  return model.findOne(query).exec();
};

const executeUniquePromises = (uniques, modelName, data) => {
  let promises = [];

  uniques.map(name => {
    let value = data[name];
    if (!value)
      throw new Error("Cannot test uniqueness for a null field.");

    promises.push(
      isFieldUnique(modelName, name, value)
        .then(value => {
          if (value) {
            let error = name + ' is not unique';
            console.log(error);
            return error;
          }
          console.log(name + ' is unique');
          return null;
        })
        .catch(error => {
          throw new Error(error);
        })
    )
  });

  return Promise.all(promises);
};

export const checkUniqueness = (uniques, modelName, data) => {
  return new Promise((resolve, reject) => {
    executeUniquePromises(uniques, modelName, data).then(result => {
      let errors = [];

      // Check for errors
      result.map((error) => {
        if (error)
          errors.push(error);
      });

      if (errors.length > 0)
        return reject(errors);
      else
        resolve();
    });
  });
}
Mongoose static create function:
import * as helper from './helper';

schema.statics.create = function (data) {
  let uniques = ['name', 'email'];

  helper.checkUniqueness(uniques, 'Company', data)
    .then(result => {
      let user = new this(data);
      return company.save();
    })
    .catch(error => {
      throw new Error(error);
    });
}
GraphQL code:
const createUser = {
  type: UserType,
  description: "Create a user",
  args: {
    data: {
      name: "user",
      type: new GraphQLNonNull(UserInputType)
    }
  },
  resolve(root, args) {
    return UserModel.create(args.data);
  }
};
The helper code seems confused, and I'm not sure my usage of promises inside other promises is the correct way of doing it.
Remember that I may need to check several fields for uniqueness; that is why I've created the promise array.
One problem is that when I'm inserting data where no unique values clash, I get no return in my GraphQL server.
I want to find out a better way of doing it and to discover why I'm not getting back the saved object.
MongoDB already handles uniqueness out of the box. Set the field to unique: true in the Mongoose schema. You can use mongoose-beautiful-unique to make the error messages similar to the validation error messages. And finally, read this when you can't get unique: true to work.
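For example, a minimal sketch of what that looks like on the schema from the question, using the two field names in the uniques array:

import mongoose from 'mongoose';

const companySchema = new mongoose.Schema({
  name: { type: String, unique: true },  // unique index, enforced by MongoDB
  email: { type: String, unique: true }
});

export default mongoose.model('Company', companySchema);

Note that unique: true creates a MongoDB index rather than a Mongoose validator, so a duplicate insert is rejected by the database itself with an E11000 duplicate key error.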
var jobskill_ref = db.collection('job_skills').where('job_id','==',post.job_id);
jobskill_ref.delete();
Error thrown
jobskill_ref.delete is not a function
You can only delete a document once you have a DocumentReference to it. To get that you must first execute the query, then loop over the QuerySnapshot and finally delete each DocumentSnapshot based on its ref.
var jobskill_query = db.collection('job_skills').where('job_id', '==', post.job_id);

jobskill_query.get().then(function(querySnapshot) {
  querySnapshot.forEach(function(doc) {
    doc.ref.delete();
  });
});
I use batched writes for this. For example:
var jobskill_ref = db.collection('job_skills').where('job_id', '==', post.job_id);
let batch = db.batch(); // same Firestore instance as the query above

jobskill_ref
  .get()
  .then(snapshot => {
    snapshot.docs.forEach(doc => {
      batch.delete(doc.ref);
    });
    return batch.commit();
  })
ES6 async/await:
const jobskills = await store
  .collection('job_skills')
  .where('job_id', '==', post.job_id)
  .get();

const batch = store.batch();

jobskills.forEach(doc => {
  batch.delete(doc.ref);
});

await batch.commit();
// The following code will find and delete the document from Firestore
const doc = await this.noteRef.where('userId', '==', userId).get();
doc.forEach(element => {
  element.ref.delete();
  console.log(`deleted: ${element.id}`);
});
The key part of Frank's answer that fixed my issue was the .ref in doc.ref.delete().
I originally had only doc.delete(), which gave a "not a function" error. Now my code looks like this and works perfectly:
let fs = firebase.firestore();
let collectionRef = fs.collection(<your collection here>);

collectionRef.where("name", "==", name)
  .get()
  .then(querySnapshot => {
    querySnapshot.forEach((doc) => {
      doc.ref.delete().then(() => {
        console.log("Document successfully deleted!");
      }).catch(function(error) {
        console.error("Error removing document: ", error);
      });
    });
  })
  .catch(function(error) {
    console.log("Error getting documents: ", error);
  });
Or try this, but you must have the ID beforehand:
export const deleteDocument = (id) => {
  return (dispatch) => {
    firebase.firestore()
      .collection("contracts")
      .doc(id)
      .delete()
  }
}
You can now do this:
db.collection("cities").doc("DC").delete().then(function() {
console.log("Document successfully deleted!");
}).catch(function(error) {
console.error("Error removing document: ", error);
});
And of course, you can use await/async:
exports.delete = functions.https.onRequest(async (req, res) => {
  try {
    // await the query, otherwise jobskill_ref is a pending Promise
    var jobskill_ref = await db.collection('job_skills').where('job_id', '==', post.job_id).get();
    jobskill_ref.forEach((doc) => {
      doc.ref.delete();
    });
  } catch (error) {
    return res.json({
      status: 'error', msg: 'Error while deleting', data: error,
    });
  }
});
I have no idea why you have to get() the documents, loop over them, and then delete() them one by one, when you could prepare a single query with a where clause and delete in one step like any SQL statement, but Google decided to do it that way. So, for now, this is the only option.
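If you need this in several places, one way to live with it is to wrap the query-then-delete pattern from the answers above into a small helper. This is only a sketch; deleteByQuery is a hypothetical name, and db is assumed to be the Firestore instance used in the other snippets:

// Hypothetical helper wrapping the query-then-batch-delete pattern shown above
async function deleteByQuery(query) {
  const snapshot = await query.get();        // run the query
  const batch = db.batch();                  // stage all deletes in one batch
  snapshot.forEach(doc => batch.delete(doc.ref));
  return batch.commit();
}

// usage, mirroring the question:
// await deleteByQuery(db.collection('job_skills').where('job_id', '==', post.job_id));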
If you're using Cloud Firestore on the client side, you can use a unique key generator package like uuid to generate an ID. Then you set the ID of the document to the ID generated from uuid and store that ID on the object you're saving to Firestore.
For example:
If you wanted to save a person object to Firestore, you would first use uuid to generate an ID for the person before saving, like below.
const { v4: uuidv4 } = require('uuid') // named export in current versions of the uuid package

const person = { name: "Adebola Adeniran", age: 19 }
const id = uuidv4() // generates a unique random ID of type string
const personObjWithId = { ...person, id } // keep the generated ID on the stored object

export const sendToFireStore = async (person) => {
  await db.collection("people").doc(id).set(personObjWithId);
};

// To delete, get the ID you've stored with the object and call the following Firestore query
export const deleteFromFireStore = async (id) => {
  await db.collection("people").doc(id).delete();
};
Hope this helps anyone using firestore on the Client side.
The way I resolved this is by giving each document a uniqueID, querying on that field, getting the documentID of the returned document, and using that in the delete. Like so:
(Swift)
func rejectFriendRequest(request: Request) {
    DispatchQueue.global().async {
        self.db.collection("requests")
            .whereField("uniqueID", isEqualTo: request.uniqueID)
            .getDocuments { querySnapshot, error in
                if let e = error {
                    print("There was an error fetching that document: \(e)")
                } else {
                    self.db.collection("requests")
                        .document(querySnapshot!.documents.first!.documentID)
                        .delete() { err in
                            if let e = err {
                                print("There was an error deleting that document: \(e)")
                            } else {
                                print("Document successfully deleted!")
                            }
                        }
                }
            }
    }
}
The code could be cleaned up a bit, but this is the solution I came up with. Hope it can help someone in the future!
const firestoreCollection = db.collection('job_skills')

var docIds = (await firestoreCollection.where("folderId", "==", folderId).get()).docs.map((doc) => doc.id)

// for a single result
await firestoreCollection.doc(docIds[0]).delete()

// for multiple results
await Promise.all(
  docIds.map(
    async (docId) => await firestoreCollection.doc(docId).delete()
  )
)
delete(seccion: string, subseccion: string) {
  const deletlist = this.db.collection('seccionesclass', ref => ref
    .where('seccion', '==', seccion)
    .where('subseccion', '==', subseccion))

  deletlist.get().subscribe(delitems => delitems.forEach(doc => doc.ref.delete()));
  alert('record erased');
}
The code for Kotlin, including failure listeners (both for the query and for the delete of each document):
fun deleteJobs(jobId: String) {
    db.collection("jobs").whereEqualTo("job_id", jobId).get()
        .addOnSuccessListener { documentSnapshots ->
            for (documentSnapshot in documentSnapshots)
                documentSnapshot.reference.delete().addOnFailureListener { e ->
                    Log.e(TAG, "deleteJobs: failed to delete document ${documentSnapshot.reference.id}", e)
                }
        }.addOnFailureListener { e ->
            Log.e(TAG, "deleteJobs: query failed", e)
        }
}