I have roughly the following schema (without _id):
{
  uid: String,
  inbox: [{msgid: String, someval: String}]
}
In the request I receive the msgid, and I use it in the following mongoose query:
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'}, function (err, doc) {
  console.log(doc);
  return !0;
});
The problem is that I get back the whole document, which contains the specific message along with all the other messages in inbox:
Output:
{
  uid: 'xyz',
  inbox: [
    {msgid: ..., someval: ...},
    {msgid: 'our queried msgid', someval: ...}, // required subarray element
    {msgid: ..., someval: ...}
  ]
}
What query can I use to get only the specific subarray element? The inbox array is too large to loop through.
Use the $ positional selection operator to have the returned doc only include the matched inbox element:
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'}, {'inbox.$': 1}, function (err, doc) {
  console.log(doc);
  return !0;
});
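With that projection, the returned doc should contain only the first matching inbox element (plus _id, which an inclusion projection returns by default), roughly:
{
  _id: ...,
  inbox: [{msgid: 'msgidvaluexyz', someval: ...}]
}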
If I understood your question correctly:
// find each person with a last name matching 'Ghost'
var query = Person.findOne({ 'name.last': 'Ghost' });
// selecting the `name` and `occupation` fields
query.select('name occupation');
http://mongoosejs.com/docs/queries.html
You can select which fields you want to get back. You could get only the inbox array or everything except that.
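For example, sticking with the model from the question, a quick sketch of both options:
// return only the inbox array
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'}).select('inbox').exec(function (err, doc) {
  console.log(doc);
});

// or everything except the inbox array
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'}).select('-inbox').exec(function (err, doc) {
  console.log(doc);
});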
To summarize, I am working with two collections, 'usercollection' and 'groupcollection', and I would like to associate users with groups. I don't want to keep two copies of all the user documents, so each user has a unique ID attribute ('uid') that I want to use to associate specific users with specific groups. This all runs on a localhost web server, and the input comes from an HTML page with a form where you enter 'username' and 'groupname'. I tried using the .distinct() function with the query on 'username' and the target field/attribute as 'uid'.
// Set our internal DB variable
var db = req.db;
// Get our form values. These rely on the "name" attributes
var userName = req.body.username;
// Set query and options for searching usercollection
var query = {"username" : userName};
const fieldName = "uid";
// Set our collections
var users = db.get('usercollection');
// Get UID corresponding to username
var uidToAdd = users.distinct(fieldName, query);
This is what I attempted (with some irrelevant lines taken out), but it just returned a null object, so I'm at a bit of a loss. Also, I'm still a beginner with Node.js/JavaScript/MongoDB, so the more informative the answer the better! When I run the same query in the mongo shell I get the actual value of the 'uid' attribute, so I really don't know what's going wrong.
I am not sure I am following you, but if I understood correctly: if you want to create a relationship between 'usercollection' and 'groupcollection', you can simply create those two collections and give each user in 'usercollection' a field holding a 'groupId' reference. That way you can access 'groupcollection' easily.
Here is an example using mongoose.
In the User model:
...
groupId: {
  type: mongoose.Schema.Types.ObjectId,
  ref: "Group"
}
...
Later you can also use 'populate' to fetch 'Group' information.
...
let data = await User.findById(id).populate('groupId');
...
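A fuller sketch tying this together (the model and field names here are illustrative assumptions, not taken from your code):
const mongoose = require('mongoose');

// Hypothetical Group model
const Group = mongoose.model('Group', new mongoose.Schema({
  groupname: String
}));

// User model holding a reference to a Group
const User = mongoose.model('User', new mongoose.Schema({
  username: String,
  uid: String,
  groupId: { type: mongoose.Schema.Types.ObjectId, ref: 'Group' }
}));

// Later: load a user and pull in its group in one call
async function findUserWithGroup(userName) {
  const user = await User.findOne({ username: userName }).populate('groupId');
  console.log(user.uid, user.groupId.groupname);
  return user;
}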
I tried to create a stored procedure using the sample SP creation code from the Azure docs, but I couldn't fetch the collection details. It always returns null.
Stored Procedure
// SAMPLE STORED PROCEDURE
function sample(prefix) {
  var collection = getContext().getCollection();
  console.log(JSON.stringify(collection));

  // Query documents and take 1st item.
  var isAccepted = collection.queryDocuments(
    collection.getSelfLink(),
    'SELECT * FROM root r',
    function (err, feed, options) {
      if (err) throw err;

      // Check the feed and if empty, set the body to 'no docs found',
      // else take 1st element from feed
      if (!feed || !feed.length) {
        var response = getContext().getResponse();
        response.setBody('no docs found');
      }
      else {
        var response = getContext().getResponse();
        var body = { prefix: prefix, feed: feed[0] };
        response.setBody(JSON.stringify(body));
      }
    });

  if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
The console shows only this: "{\"spatial\":{}}".
The result shows 'no docs found' because the collection isn't being retrieved. I have passed the partition key at execution time via the explorer.
I had a similar issue. I think the Azure portal doesn't execute stored procedures properly when the partition key is not a string.
In my case I had a partitionKey that is a number. When I executed the stored procedure via the portal I always got an empty resultSet, even though I had documents in my database. When I changed the structure a little, and made my partitionKey a string, the stored procedure worked fine.
Did you create the ToDoList Database with the Items Collection? You can do this from the Quick start blade in the Azure portal.
And then create an SP to run against that collection. There is no partition key required, so no additional params are required (leave blank).
The Collection is created without any documents. You may choose to add documents via the Query Explorer blade or via the sample ToDoList App that is available via the Quick start blade.
You are debugging this the wrong way.
It is perfectly fine to see "{\"spatial\":{}}" in your console log, even if the collection has items, because that is simply a property of the collection object.
So regarding what you said:
"the result shows 'no docs found' because the collection isn't being retrieved"
that is not the case. I have the same console log text, but I do have items in my collection.
There are two scenarios in which your stored procedure returns no items:
1. I had the same issue trying it in the Azure portal UI (in the browser), and to my surprise I had to insert an item without the key in order for my stored procedure to see it.
2. In code, you specified the partition key as a string, i.e. new PartitionKey("/UserId"), instead of the actual value from your object, i.e. new PartitionKey(stock.UserId).
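If you are calling the stored procedure from code, a minimal sketch with the @azure/cosmos JavaScript SDK (the account, key, database, container, and partition key value below are placeholders) passes the partition key value rather than the path:
const { CosmosClient } = require("@azure/cosmos");

async function runSample() {
  const client = new CosmosClient({ endpoint: "https://<account>.documents.azure.com", key: "<key>" });
  const container = client.database("ToDoList").container("Items");

  // Pass the partition key *value* of the documents the SP should see,
  // not the partition key path such as "/UserId".
  const { resource } = await container.scripts
    .storedProcedure("sample")
    .execute("some-partition-key-value", ["prefixValue"]);

  console.log(resource);
}

runSample().catch(console.error);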
Hello, is what the comments in the code below describe possible with Parse Cloud Code?
Parse.Cloud.beforeFind('Note', function(req) {
  var query = req.query;
  var user = req.user;
  // if a given 'Note' visibility is set to 'Unlisted',
  // return only the Notes whose 'user' field matches the calling _User
});
The documentation only shows how to filter the fields that are returned, not how to remove items from the query result in Cloud Code.
I know this can be done through ACLs, but the caveat is that if the request is a retrieve (get) rather than a query, the Note should still be returned.
Assuming you've saved the user as an object relationship (not a string id), just add the qualification you need, such as:
query.equalTo("your_user_pointer_col_on_Note", user)
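Putting that into the trigger from the question, a minimal sketch could look like this (the pointer column name is the placeholder from above, and the req.master check is an assumption so master-key requests stay unrestricted):
Parse.Cloud.beforeFind('Note', function (req) {
  var query = req.query;
  var user = req.user;

  // Restrict results to Notes owned by the calling user,
  // except for master-key requests.
  if (user && !req.master) {
    query.equalTo('your_user_pointer_col_on_Note', user);
  }
  return query;
});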
I'm using Node.js to push values to a MySQL table like:
for (var i = 0; i < data.length; i++) {
  flattenedData.push([
    data[i].id, data[i].adult, data[i].backdrop_path, JSON.stringify(data[i].genre_ids),
    data[i].original_language, data[i].original_title, data[i].overview, data[i].popularity,
    data[i].poster_path, data[i].release_date, data[i].title, data[i].video,
    data[i].vote_average, data[i].vote_count
  ]);
  //console.log(flattenedData);
}
db.query("INSERT INTO movies (id, adult, backdrop_path, genre_ids, original_language, original_title, overview, popularity, poster_path, release_date, title, video, vote_average, vote_count ) values ?", [flattenedData], function (err, result) {
if (err) {
throw err;
}
else {
console.log('data inserted' + result);
}
});
I want to add ON DUPLICATE KEY UPDATE to the query, but I keep getting syntax errors, can anyone show me the proper way?
I'm going to shorten this to three columns and assume id is the only unique column.
db.query("INSERT INTO movies (id, adult, backdrop_path) VALUES (?, ?, ?)
ON DUPLICATE KEY UPDATE adult=VALUES(adult), backdrop_path=VALUES(backdrop_path)",
flattenedData, function (err, result) {
...
This means: if the insert results in a duplicate on the primary/unique column (id), then copy the other columns from the values you tried to insert into their respective columns, overwriting whatever values the existing row had previously.
There's no shortcut for that; you have to spell out all such column assignments.
I'm pretty sure the argument to query() for parameters should be an array, but it looks like your flattenedData is already an array since you're pushing to it. So I don't think you need to put it in square-brackets.
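If you would rather keep the original bulk-insert form with VALUES ? and the nested flattenedData array (the node mysql driver expands a nested array of rows for a single ?), a sketch, again trimmed to three columns, might look like this:
// flattenedData is an array of rows, e.g. [[id1, adult1, path1], [id2, adult2, path2], ...]
db.query(
  "INSERT INTO movies (id, adult, backdrop_path) VALUES ? " +
  "ON DUPLICATE KEY UPDATE adult=VALUES(adult), backdrop_path=VALUES(backdrop_path)",
  [flattenedData], // extra brackets: the driver expects [rows] when expanding a bulk VALUES ?
  function (err, result) {
    if (err) throw err;
    console.log('rows affected: ' + result.affectedRows);
  }
);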
I was confused by how to get this to work in a react/redux app and eventually came to the "correct" method.
My implementation required me to update one field value per record for an arbitrary number of records in a table with 21 fields.
If you are passing data as an array, structure it like [['dataString',666.66],['dataString2',666666.66],['dataString3',666666666.66]] and then make sure you pass this whole thing as an array to the query function. See itemQtyData in my code sample below.
Another thing that tripped me up was the use of brackets around the values replacement string. I didn't need them. The examples I looked at showed implementations that needed them. I also only used a single ? for all the values. So instead of using (?,?) to represent the values in the query, which didn't work, I used ?.
I found it unnecessary to supply all the field names and the corresponding values for the table. MySQL will warn you if fields don't have a default value. I haven't found this to be an issue in this case.
You can console.log the formatted SQL inside the SqlString.format function in node_modules/sqlstring/lib/SqlString.js. I found this useful for seeing exactly why the query wasn't working, and for getting something I could paste into MySQL Workbench to experiment with.
Edit: You can also do console.log(connection.query(yourQuery, [someData], callback)) and you get the SQL and lots more when the function executes. That might make more sense than adding console.log calls to the module code.
Hope this helps!
let itemQtyData = order.map(item => {
  return [
    `${item.id}`,
    `${Number(item.products_quantity - Number(item.quantity_to_add))}`
  ];
});

const updateQtyQuery = `INSERT INTO products (products_id, products_quantity) VALUES ? ON DUPLICATE KEY UPDATE products_quantity=VALUES(products_quantity)`;

connectionOSC.query(
  updateQtyQuery,
  [itemQtyData],
  (error, results, fields) => {
    if (error) throw error;
    const response = {
      statusCode: 200,
      headers: {
        "Access-Control-Allow-Origin": "*"
      },
      body: saleId,
      response: results,
      isBase64Encoded: false
    };
    context.succeed(response);
  });
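Related to the debugging tip above: with the mysql driver, the query object returned by connection.query exposes the generated SQL via its sql property, so you can log the final statement without editing the module code (a sketch reusing connectionOSC and the data from the example above):
const q = connectionOSC.query(updateQtyQuery, [itemQtyData], (error, results) => {
  if (error) throw error;
  console.log('rows affected: ' + results.affectedRows);
});

// The fully interpolated statement the driver sent to MySQL
console.log(q.sql);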
I'm having some trouble querying documents by values that live in referenced documents populated by mongoose.
My schemas are something like this:
var EmailSchema = new mongoose.Schema({
  type: String
});

var UserSchema = new mongoose.Schema({
  name: String,
  email: [{type: Schema.Types.ObjectId, ref: 'Email'}]
});
I would like to get all users that have an email with type = "Gmail", for example.
The following query returns empty results:
Users.find({'email.type': 'Gmail'}).populate('email').exec(function(err, users) {
  res.json(users);
});
I have had to resort to filtering the results in JS like this:
users = users.filter(function(user) {
  for (var index = 0; index < user.email.length; index++) {
    var email = user.email[index];
    if (email.type === "Gmail") {
      return true;
    }
  }
  return false;
});
Is there any way to query something like this straight from mongoose?
@Jason Cust explained it pretty well already: in this situation the best solution is often to alter the schema so you don't have to query Users by properties of documents stored in a separate collection.
Here's the best solution I can think of that will not force you to do that, though (because you said in the comment that you can't).
Users.find().populate({
  path: 'email',
  match: {
    type: 'Gmail'
  }
}).exec(function(err, users) {
  users = users.filter(function(user) {
    // keep only users with at least one populated email matching the 'type: "Gmail"' query
    return user.email && user.email.length;
  });
});
What we're doing here is populating only the emails matching the additional query (the match option in the .populate() call); emails that don't match are left out, so a user with no matching emails ends up with an empty (or null) email field.
All that's left is to .filter the returned users array, as in your original question, only with a much simpler, very generic check: either a matching email is there or it isn't.
Mongoose's populate function doesn't execute directly in Mongo. Instead, after the initial find query returns a set of documents, populate creates an array of individual find queries on the referenced collection to execute and then merges the results back into the original documents. So essentially your find query is attempting to use a property of the referenced document (which hasn't been fetched yet and is therefore undefined) to filter the original result set.
In this use case it seems more appropriate to store emails as a subdocument array rather than in a separate collection. As a general document-store design pattern, this is one of the cases where storing an array of subdocuments makes sense: limited size and very few modifications.
Updating your schema to:
var EmailSchema = new mongoose.Schema({
type: String
});
var UserSchema = new mongoose.Schema({
name: String,
email: [EmailSchema]
});
Then the following query should work:
Users.find({'email.type': 'Gmail'}).exec(function(err, users) {
  res.json(users);
});
I couldn't find any solution other than using the aggregation framework. It is more troublesome, but we can use $lookup.
{
  $lookup: {
    from: <collection to join>,
    localField: <field from the input documents>,
    foreignField: <field from the documents of the "from" collection>,
    as: <output array field>
  }
}
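Applied to the schemas from the question, a sketch could look like the following (assuming the referenced collection is named 'emails', mongoose's default pluralization of the Email model, and 'emailDocs' is just an illustrative output field name):
Users.aggregate([
  {
    $lookup: {
      from: 'emails',        // collection backing the Email model (assumed name)
      localField: 'email',   // array of ObjectId refs on the user
      foreignField: '_id',
      as: 'emailDocs'
    }
  },
  { $match: { 'emailDocs.type': 'Gmail' } }
]).exec(function(err, users) {
  res.json(users);
});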