LinkedIn API - get number of connections - JavaScript

I can successfully download LinkedIn connections, but now I am trying to get one extra object for the user: their number of connections. The documentation seems to suggest that I need "total" or "_total", but I have tried all combinations with no success. All I get is "undefined". Here's the code, with examples of how I'm trying to get the total count:
IN.API.Connections("me")
  .fields(["id", "firstName", "lastName", "mainAddress", "dateOfBirth", "phoneNumbers", "positions", "pictureUrl"])
  .params({"count": 500})
  .error(displayError)
  .result(function(result) {
    document.write("1 ", result.values._total, "<br>");
    document.write("2 ", result.values.total, "<br>");
  });

The returned JSON is of the format:
{
  "_count": #,
  "_start": #,
  "_total": #,
  "values": [
    {
      ...
    },
    {
      ...
    }
  ]
}
So you would access the total number of connections returned via result._total.
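A minimal sketch of the corrected callback, keeping the same call shape as the question; _total is read from the top-level result object rather than from result.values:
IN.API.Connections("me")
  .fields(["id", "firstName", "lastName", "pictureUrl"])
  .params({"count": 500})
  .error(displayError)
  .result(function(result) {
    // _total, _count and _start sit on the top-level result object
    document.write("Total connections: ", result._total, "<br>");
    // result.values holds the individual connection records
    result.values.forEach(function(connection) {
      document.write(connection.firstName, " ", connection.lastName, "<br>");
    });
  });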

Related

How to Make a ShopifyQL query?

Shopify recently announced ShopifyQL for easier access to analytics data. However, I'm unclear on how to actually make a ShopifyQL call. They do include an example:
{
  # "FROM sales SHOW total_sales BY month SINCE -1y UNTIL today" passes a ShopifyQL query to the GraphQL query.
  shopifyqlQuery(query: "FROM sales SHOW total_sales BY month SINCE -1y UNTIL today") {
    __typename
    ... on TableResponse {
      tableData {
        rowData
        columns {
          # Elements in the columns section describe which column properties you want to return.
          name
          dataType
          displayName
        }
      }
    }
    # parseErrors specifies that you want errors returned, if there were any, and which error properties you want to return.
    parseErrors {
      code
      message
      range {
        start {
          line
          character
        }
        end {
          line
          character
        }
      }
    }
  }
}
However, using the GraphiQL tool to run the query hits a number of errors:
{
  "errors": [
    {
      "message": "Field 'shopifyqlQuery' doesn't exist on type 'QueryRoot'",
      "locations": [
        {
          "line": 3,
          "column": 3
        }
      ],
      "path": [
        "query",
        "shopifyqlQuery"
      ],
      "extensions": {
        "code": "undefinedField",
        "typeName": "QueryRoot",
        "fieldName": "shopifyqlQuery"
      }
    }
  ]
}
I also tried making an authenticated call with the example query above using my app's Node server, but ran into the same issues.
What am I missing here?
Looks like it only works with the unstable version currently:
https://"+shop_name+".myshopify.com/admin/api/unstable/graphql.json
Got a successful response with this.
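For reference, a minimal Node sketch of an authenticated call against the unstable Admin GraphQL endpoint; the shop name and access token are placeholders, and this assumes Node 18+ for the built-in fetch:
// Hypothetical sketch: POST the ShopifyQL query to the unstable Admin GraphQL endpoint.
const shopName = "your-store";                         // placeholder
const accessToken = process.env.SHOPIFY_ADMIN_TOKEN;   // placeholder

const query = `{
  shopifyqlQuery(query: "FROM sales SHOW total_sales BY month SINCE -1y UNTIL today") {
    __typename
    ... on TableResponse {
      tableData { rowData columns { name dataType displayName } }
    }
    parseErrors { code message }
  }
}`;

fetch(`https://${shopName}.myshopify.com/admin/api/unstable/graphql.json`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-Shopify-Access-Token": accessToken
  },
  body: JSON.stringify({ query })
})
  .then(res => res.json())
  .then(json => console.log(JSON.stringify(json, null, 2)))
  .catch(console.error);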
Is this still working for you?
https://storename.myshopify.com/admin/api/unstable/graphql.json
This was working for the past 6 months, but now it's getting access denied:
{
  "code": "ACCESS_DENIED",
  "documentation": "https://shopify.dev/api/usage/access-scopes",
  "requiredAccess": "read_reports access scope, read_customers access scope, read_fulfillments access scope, read_inventory access scope, read_orders access scope, read_products access scope and read_all_orders access scope"
}

How do I query an index properly with Dynamoose

I'm using Dynamoose to simplify my interactions with DynamoDB in a node.js application. I'm trying to write a query using Dynamoose's Model.query function that will search a table using an index, but it seems like Dynamoose is not including all of the info required to process the query and I'm not sure what I'm doing wrong.
Here's what the schema looks like:
const UserSchema = new dynamoose.Schema({
  "user_id": {
    "hashKey": true,
    "type": String
  },
  "email": {
    "type": String,
    "index": {
      "global": true,
      "name": "email-index"
    }
  },
  "first_name": {
    "type": String,
    "index": {
      "global": true,
      "name": "first_name-index"
    }
  },
  "last_name": {
    "type": String,
    "index": {
      "global": true,
      "name": "last_name-index"
    }
  }
})
module.exports = dynamoose.model(config.usersTable, UserSchema)
I'd like to be able to search for users by their email address, so I'm writing a query that looks like this:
Users.query("email").contains(query.email)
  .using("email-index")
  .all()
  .exec()
  .then(results => {
    res.status(200).json(results)
  }).catch(err => {
    res.status(500).send("Error searching for users: " + err)
  })
I have a global secondary index defined for the email field.
When I try to execute this query, I'm getting the following error:
Error searching for users: ValidationException: Either the KeyConditions or KeyConditionExpression parameter must be specified in the request.
Using the Dynamoose debugging output, I can see that the query winds up looking like this:
aws:dynamodb:query:request - {
  "FilterExpression": "contains (#a0, :v0)",
  "ExpressionAttributeNames": {
    "#a0": "email"
  },
  "ExpressionAttributeValues": {
    ":v0": {
      "S": "mel"
    }
  },
  "TableName": "user_qa",
  "IndexName": "email-index"
}
I note that the actual query sent to DynamoDB does not contain KeyConditions or KeyConditionExpression, as the error message indicates. What am I doing wrong that prevents this query from being written correctly such that it executes the query against the global secondary index I've added for this table?
As it turns out, calls like .contains(text) are used as filters, not query parameters. DynamoDB can't figure out if the text in the index contains the text I'm searching for without looking at every single record, which is a scan, not a query. So it doesn't make sense to try to use .contains(text) in this context, even though it's possible to call it in a chain like the one I constructed. What I ultimately needed to do to make this work is turn my call into a table scan with the .contains(text) filter:
Users.scan({ email: { contains: query.email }}).all().exec().then( ... )
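If an exact match on the email is enough, a key-condition query against the GSI should also work without scanning; a rough sketch, assuming the same Users model and email-index as above:
// Exact-match query against the global secondary index (no table scan).
Users.query("email").eq(query.email)
  .using("email-index")
  .exec()
  .then(results => res.status(200).json(results))
  .catch(err => res.status(500).send("Error searching for users: " + err))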
I am not too familiar with Dynamoose, but the following code will do an update on a record using Node.js and DynamoDB. See the Key parameter below; judging by the error message you got, it seems you are missing this.
To my knowledge, you must specify a key for an update request. You can check the AWS DynamoDB docs to confirm.
var params = {
  TableName: table,
  Key: {
    "id": customerID
  },
  UpdateExpression: "set customer_name = :s, customer_address = :p, customer_phone = :u",
  ExpressionAttributeValues: {
    ":s": customer_name,
    ":p": customer_address,
    ":u": customer_phone
  },
  ReturnValues: "UPDATED_NEW"
};
await docClient.update(params).promise();

Can't access property from object in Javascript

EDIT: Restructured the question to be clearer and cleaner.
I have a data object from Sequelize that is sent by node-express:
{
  "page": 0,
  "limit": 10,
  "total": 4,
  "data": [
    {
      "id": 1,
      "title": "movies",
      "isActive": true,
      "createdAt": "2020-05-30T19:26:04.000Z",
      "updatedAt": "2020-05-30T19:26:04.000Z",
      "questions": [
        {
          "questionsCount": 4
        }
      ]
    }
  ]
}
The BIG question is, how do I get the value of questionsCount?
The PROBLEM is, I just can't extract it; these two methods give me an undefined result:
category.questions[0].questionsCount
category.questions[0]['questionsCount']
I WAS ABLE to get it using toJSON() (From Sequelize lib I think), like so:
category.questions[0].toJSON().questionsCount
But I'd like to know the answer to the question, or at least a clear explanation of why I have to use toJSON() just to get the questionsCount.
More context:
I have this GET in my controller:
exports.getCategories = (req, res) => {
  const page = myUtil.parser.tryParseInt(req.query.page, 0)
  const limit = myUtil.parser.tryParseInt(req.query.limit, 10)
  db.Category.findAndCountAll({
    where: {},
    include: [
      {
        model: db.Question,
        as: "questions",
        attributes: [[db.Sequelize.fn('COUNT', 'id'), 'questionsCount']]
      }
    ],
    offset: limit * page,
    limit: limit,
    order: [["id", "ASC"]],
  })
    .then(data => {
      data.rows.forEach(function(category) {
        console.log("------ May 31 ----> " + JSON.stringify(category.questions[0]) + " -->" + category.questions[0].hasOwnProperty('questionsCount'))
        console.log(JSON.stringify(category))
        console.log(category.questions[0].toJSON().questionsCount)
      })
      res.json(myUtil.response.paging(data, page, limit))
    })
    .catch(err => {
      console.log("Error get categories: " + err.message)
      res.status(500).send({
        message: "An error has occurred while retrieving data."
      })
    })
}
I loop through the data.rows to get each category object.
The console.log outputs are:
------ May 31 ----> {"questionsCount":4} -->false
{"id":1,"title":"movies","isActive":true,"createdAt":"2020-05-30T19:26:04.000Z","updatedAt":"2020-05-30T19:26:04.000Z","questions":[{"questionsCount":4}]}
4
https://github.com/sequelize/sequelize/blob/master/docs/manual/core-concepts/model-querying-finders.md
By default, the results of all finder methods are instances of the model class (as opposed to being just plain JavaScript objects). This means that after the database returns the results, Sequelize automatically wraps everything in proper instance objects. In a few cases, when there are too many results, this wrapping can be inefficient. To disable this wrapping and receive a plain response instead, pass { raw: true } as an option to the finder method.
(emphasis by me)
Or directly in the source code, https://github.com/sequelize/sequelize/blob/59b8a7bfa018b94ccfa6e30e1040de91d1e3d3dd/lib/model.js#L2028
@returns {Promise<{count: number, rows: Model[]}>}
So the thing is that you get an array of Model objects which you could navigate with their get() method. It's an unfortunate coincidence that you expected an array and got an array, so you thought it was "that" array. Try the {raw:true} thing; I guess it looks something like this:
db.Category.findAndCountAll({
  where: {},
  include: [
    {
      model: db.Question,
      as: "questions",
      attributes: [[db.Sequelize.fn('COUNT', 'id'), 'questionsCount']]
    }
  ],
  offset: limit * page,
  limit: limit,
  order: [["id", "ASC"]],
  raw: true // <--- hopefully it is this simple
}) [...]
toJSON() is nearby too, https://github.com/sequelize/sequelize/blob/59b8a7bfa018b94ccfa6e30e1040de91d1e3d3dd/lib/model.js#L4341
/**
 * Convert the instance to a JSON representation.
 * Proxies to calling `get` with no keys.
 * This means get all values gotten from the DB, and apply all custom getters.
 *
 * @see
 * {@link Model#get}
 *
 * @returns {object}
 */
toJSON() {
  return _.cloneDeep(
    this.get({
      plain: true
    })
  );
}
So it worked exactly because it did what you needed: it removed the get() wrapping and provided an actual JavaScript object matching your structure (a POJSO? Sorry, I could not resist). I rarely use it and thus always forget, but the key background "trick" is that, a bit contrary to its name, toJSON() is not expected to create the actual JSON string; it provides a replacement object which still gets stringified by JSON.stringify() (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify#toJSON_behavior).
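For reference, a short sketch of pulling the value off the wrapped instance without toJSON(), using the instance get() method mentioned above; this assumes the same data.rows loop as in the question:
data.rows.forEach(function(category) {
  // get({ plain: true }) unwraps the nested instance into a plain object
  const firstQuestion = category.questions[0].get({ plain: true });
  console.log(firstQuestion.questionsCount);

  // or ask the instance for a single attribute directly
  console.log(category.questions[0].get("questionsCount"));
});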
Try category.data[0].questions[0].questionCount
As mentioned by others already, you need category.data[0].questions[0].questionCount.
Let me add to that by showing you why. Look at your object, I annotated it with how each part would be accessed:
category = { // category
  "page": 0,
  "limit": 10,
  "total": 2,
  "data": [ // category.data
    { // category.data[0]
      "id": 1,
      "title": "movies",
      "createdAt": "2020-05-30T19:26:04.000Z",
      "updatedAt": "2020-05-30T19:26:04.000Z",
      "questions": [ // category.data[0].questions
        { // category.data[0].questions[0]
          "questionCount": 2 // category.data[0].questions[0].questionCount
        }
      ],
      "questionsCount": "newValue here!"
    }
  ]
}
try this
category.data[0].questions[0].questionCount
The reason why you have to use toJSON() is that it is sometimes used to customise the stringification behavior, for example doing some calculation before assigning the value to the object that will be returned. So it has most likely been used here to calculate the number of questions and then return an object with the property questionsCount and the calculated number.
So the object you retrieved more or less looks like this:
var category = {
  data: 'data',
  questions: [{
    // some calculation here to get the questionsCount
    result: 4,
    toJSON () {
      return {"questionsCount": this.result}
    }
  }]
};
console.log(category.questions[0].toJSON().questionsCount) // 4
console.log(JSON.stringify(category)) // {"data":"data","questions":[{"questionsCount":4}]}
console.log("------ May 31 ----> " + JSON.stringify(category.questions[0]) + " -->" + category.questions[0].hasOwnProperty('questionsCount')) // false

Searching for multiple facetsValues in Algolia

I have an array of facet values that I need to gather from an Algolia index.
For example, these are: "Beds", "Occupancy", and "Floor".
At the moment, I've got the code below, which will go to my Algolia index and grab all of the possible values for each of the above, but I have to make a separate query into Algolia for each one. This results in 3 network calls to Algolia.
index.searchForFacetValues(
  {
    facetName: val,
    facetQuery: "",
    maxFacetHits: 100,
    query: "2019"
  },
  function(err, content) {
    return content
  })
Is there a way that I can get all the facet values for "Beds", "Occupancy", and "Floor" in a single query, resulting in only one network call?
Also, I'm using https://www.algolia.com/doc/api-client/getting-started/install/javascript/
You can use an empty search and specify which facets you want to receive values for:
client.search({
  query: '',
  facets: ['Beds', 'Occupancy', 'Floor'],
  attributesToRetrieve: [], // a little optimisation for response transfer speed
});
The response will contain something like this:
{
  "facets": {
    "Beds": {
      "2": 1245,
      "4": 893,
      ...
    },
    "Floor": {
      ...
    },
    ...
  }
}
So the first level of keys in facets are your facet names, and within each nested facet object you have one key/value per facet value/hits count.
If you don't know in advance the list of facet names you want to get, use facets: '*' in your query parameters.
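For completeness, a rough sketch of reading the facet values back out of the response, assuming the algoliasearch v4 JavaScript client where index.search(query, requestOptions) returns a promise; the app ID, API key, and index name are placeholders:
// Hypothetical sketch with the algoliasearch v4 client.
const algoliasearch = require("algoliasearch");

const client = algoliasearch("YOUR_APP_ID", "YOUR_SEARCH_KEY"); // placeholders
const index = client.initIndex("your_index");                   // placeholder

index
  .search("2019", {
    facets: ["Beds", "Occupancy", "Floor"], // must be declared in attributesForFaceting
    attributesToRetrieve: [],
    hitsPerPage: 0 // only the facet counts are needed, not the hits
  })
  .then(({ facets }) => {
    // facets looks like { Beds: { "2": 1245, ... }, Occupancy: { ... }, Floor: { ... } }
    Object.entries(facets || {}).forEach(([facetName, values]) => {
      console.log(facetName, Object.keys(values));
    });
  });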

mongodb pull an array from several object in an array of one document [duplicate]

I have a Mongo document which holds an array of elements.
I'd like to reset the .handled attribute of all objects in the array where .profile = XX.
The document is in the following form:
{
  "_id": ObjectId("4d2d8deff4e6c1d71fc29a07"),
  "user_id": "714638ba-2e08-2168-2b99-00002f3d43c0",
  "events": [
    {
      "handled": 1,
      "profile": 10,
      "data": "....."
    },
    {
      "handled": 1,
      "profile": 10,
      "data": "....."
    },
    {
      "handled": 1,
      "profile": 20,
      "data": "....."
    }
    ...
  ]
}
so, I tried the following:
.update({"events.profile":10},{$set:{"events.$.handled":0}},false,true)
However it updates only the first matched array element in each document. (That's the defined behaviour for $ - the positional operator.)
How can I update all matched array elements?
With the release of MongoDB 3.6 ( and available in the development branch from MongoDB 3.5.12 ) you can now update multiple array elements in a single request.
This uses the filtered positional $[<identifier>] update operator syntax introduced in this version:
db.collection.update(
  { "events.profile": 10 },
  { "$set": { "events.$[elem].handled": 0 } },
  { "arrayFilters": [{ "elem.profile": 10 }], "multi": true }
)
The "arrayFilters" as passed to the options for .update() or even
.updateOne(), .updateMany(), .findOneAndUpdate() or .bulkWrite() method specifies the conditions to match on the identifier given in the update statement. Any elements that match the condition given will be updated.
Noting that the "multi" given in the context of the question was used in the expectation that this would "update multiple elements", but this was not and still is not the case. Its usage here applies to "multiple documents", as has always been the case, or as now otherwise specified by the mandatory setting of .updateMany() in modern API versions.
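For reference, the same statement written against the modern updateMany() API, where the "multi" behavior is implied:
db.collection.updateMany(
  { "events.profile": 10 },
  { "$set": { "events.$[elem].handled": 0 } },
  { "arrayFilters": [{ "elem.profile": 10 }] }
)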
NOTE Somewhat ironically, since this is specified in the "options" argument for .update() and like methods, the syntax is generally compatible with all recent release driver versions.
However this is not true of the mongo shell, since the way the method is implemented there ( "ironically for backward compatibility" ) the arrayFilters argument is not recognized and removed by an internal method that parses the options in order to deliver "backward compatibility" with prior MongoDB server versions and a "legacy" .update() API call syntax.
So if you want to use the command in the mongo shell or other "shell based" products ( notably Robo 3T ) you need the latest version from either the development branch or production release as of 3.6 or greater.
See also positional all $[] which also updates "multiple array elements" but without applying to specified conditions and applies to all elements in the array where that is the desired action.
Also see Updating a Nested Array with MongoDB for how these new positional operators apply to "nested" array structures, where "arrays are within other arrays".
IMPORTANT - Upgraded installations from previous versions "may" have not enabled MongoDB features, which can also cause statements to fail. You should ensure your upgrade procedure is complete with details such as index upgrades and then run
db.adminCommand( { setFeatureCompatibilityVersion: "3.6" } )
Or a higher version as applicable to your installed version, i.e. "4.0" for version 4 and onwards at present. This enables such features as the new positional update operators and others. You can also check with:
db.adminCommand( { getParameter: 1, featureCompatibilityVersion: 1 } )
to return the current setting.
UPDATE:
As of Mongo version 3.6, this answer is no longer valid as the mentioned issue was fixed and there are ways to achieve this. Please check other answers.
At this moment it is not possible to use the positional operator to update all items in an array. See JIRA http://jira.mongodb.org/browse/SERVER-1243
As a workaround you can: 1) update each item individually (events.0.handled, events.1.handled, ...), or 2) read the document, do the edits manually, and save it, replacing the older one (check "Update if Current" if you want to ensure atomic updates).
What worked for me was this:
db.collection.find({ _id: ObjectId('4d2d8deff4e6c1d71fc29a07') })
.forEach(function (doc) {
doc.events.forEach(function (event) {
if (event.profile === 10) {
event.handled=0;
}
});
db.collection.save(doc);
});
I think it's clearer for Mongo newbies and anyone familiar with jQuery & friends.
This can also be accomplished with a while loop which checks to see if any documents remain that still have subdocuments that have not been updated. This method preserves the atomicity of your updates (which many of the other solutions here do not).
var query = {
events: {
$elemMatch: {
profile: 10,
handled: { $ne: 0 }
}
}
};
while (db.yourCollection.find(query).count() > 0) {
db.yourCollection.update(
query,
{ $set: { "events.$.handled": 0 } },
{ multi: true }
);
}
The number of times the loop is executed will equal the maximum number of times subdocuments with profile equal to 10 and handled not equal to 0 occur in any of the documents in your collection. So if you have 100 documents in your collection and one of them has three subdocuments that match query and all the other documents have fewer matching subdocuments, the loop will execute three times.
This method avoids the danger of clobbering other data that may be updated by another process while this script executes. It also minimizes the amount of data being transferred between client and server.
This does in fact relate to the long standing issue at http://jira.mongodb.org/browse/SERVER-1243 where there are in fact a number of challenges to a clear syntax that supports "all cases" where multiple array matches are found. There are in fact methods already in place that "aid" in solutions to this problem, such as Bulk Operations which have been implemented after this original post.
It is still not possible to update more than a single matched array element in a single update statement, so even with a "multi" update all you will ever be able to update is just one matched element in the array for each document in that single statement.
The best possible solution at present is to find and loop all matched documents and process Bulk updates which will at least allow many operations to be sent in a single request with a singular response. You can optionally use .aggregate() to reduce the array content returned in the search result to just those that match the conditions for the update selection:
// Initialize the bulk builder and a counter before the loop
var bulk = db.collection.initializeOrderedBulkOp();
var count = 0;

db.collection.aggregate([
  { "$match": { "events.handled": 1 } },
  { "$project": {
    "events": {
      "$setDifference": [
        { "$map": {
          "input": "$events",
          "as": "event",
          "in": {
            "$cond": [
              { "$eq": [ "$$event.handled", 1 ] },
              "$$event",
              false
            ]
          }
        }},
        [false]
      ]
    }
  }}
]).forEach(function(doc) {
  doc.events.forEach(function(event) {
    bulk.find({ "_id": doc._id, "events.handled": 1 }).updateOne({
      "$set": { "events.$.handled": 0 }
    });
    count++;
    if ( count % 1000 == 0 ) {
      bulk.execute();
      bulk = db.collection.initializeOrderedBulkOp();
    }
  });
});

if ( count % 1000 != 0 )
  bulk.execute();
The .aggregate() portion there will work when there is a "unique" identifier for the array or all content for each element forms a "unique" element itself. This is due to the "set" operator in $setDifference used to filter any false values returned from the $map operation used to process the array for matches.
If your array content does not have unique elements you can try an alternate approach with $redact:
db.collection.aggregate([
  { "$match": { "events.handled": 1 } },
  { "$redact": {
    "$cond": {
      "if": {
        "$eq": [ { "$ifNull": [ "$handled", 1 ] }, 1 ]
      },
      "then": "$$DESCEND",
      "else": "$$PRUNE"
    }
  }}
])
Its limitation is that if "handled" was in fact a field meant to be present at other document levels then you are likely going to get unexpected results, but it is fine where that field appears only in one document position and is an equality match.
Future releases (post MongoDB 3.1) as of writing will have a $filter operation that is simpler:
db.collection.aggregate([
  { "$match": { "events.handled": 1 } },
  { "$project": {
    "events": {
      "$filter": {
        "input": "$events",
        "as": "event",
        "cond": { "$eq": [ "$$event.handled", 1 ] }
      }
    }
  }}
])
And all releases that support .aggregate() can use the following approach with $unwind, but the usage of that operator makes it the least efficient approach due to the array expansion in the pipeline:
db.collection.aggregate([
  { "$match": { "events.handled": 1 } },
  { "$unwind": "$events" },
  { "$match": { "events.handled": 1 } },
  { "$group": {
    "_id": "$_id",
    "events": { "$push": "$events" }
  }}
])
In all cases where the MongoDB version supports a "cursor" from aggregate output, then this is just a matter of choosing an approach and iterating the results with the same block of code shown to process the Bulk update statements. Bulk Operations and "cursors" from aggregate output are introduced in the same version ( MongoDB 2.6 ) and therefore usually work hand in hand for processing.
In even earlier versions, it is probably best to just use .find() to return the cursor, and filter the execution of statements to just the number of times the array element is matched for the .update() iterations:
db.collection.find({ "events.handled": 1 }).forEach(function(doc){
  doc.events.filter(function(event){ return event.handled == 1 }).forEach(function(event){
    db.collection.update({ "_id": doc._id },{ "$set": { "events.$.handled": 0 }});
  });
});
If you are absolutely determined to do "multi" updates or deem that to be ultimately more efficient than processing multiple updates for each matched document, then you can always determine the maximum number of possible array matches and just execute a "multi" update that many times, until basically there are no more documents to update.
A valid approach for MongoDB 2.4 and 2.2 versions could also use .aggregate() to find this value:
var result = db.collection.aggregate([
  { "$match": { "events.handled": 1 } },
  { "$unwind": "$events" },
  { "$match": { "events.handled": 1 } },
  { "$group": {
    "_id": "$_id",
    "count": { "$sum": 1 }
  }},
  { "$group": {
    "_id": null,
    "count": { "$max": "$count" }
  }}
]);

var max = result.result[0].count;

while ( max-- ) {
  db.collection.update({ "events.handled": 1 },{ "$set": { "events.$.handled": 0 }},{ "multi": true })
}
Whatever the case, there are certain things you do not want to do within the update:
Do not "one shot" update the array: Where if you think it might be more efficient to update the whole array content in code and then just $set the whole array in each document. This might seem faster to process, but there is no guarantee that the array content has not changed since it was read and the update is performed. Though $set is still an atomic operator, it will only update the array with what it "thinks" is the correct data, and thus is likely to overwrite any changes occurring between read and write.
Do not calculate index values to update: Similar to the "one shot" approach, you just work out that position 0 and position 2 ( and so on ) are the elements to update and code these into an eventual statement like:
{ "$set": {
"events.0.handled": 0,
"events.2.handled": 0
}}
Again the problem here is the "presumption" that those index values found when the document was read are the same index values in the array at the time of update. If new items are added to the array in a way that changes the order then those positions are no longer valid and the wrong items are in fact updated.
So until there is a reasonable syntax determined for allowing multiple matched array elements to be processed in a single update statement, the basic approach is to either update each matched array element in an individual statement ( ideally in Bulk ) or essentially work out the maximum array elements to update or keep updating until no more modified results are returned. At any rate, you should "always" be processing positional $ updates on the matched array element, even if that is only updating one element per statement.
Bulk Operations are in fact the "generalized" solution to processing any operations that work out to be "multiple operations", and since there are more applications for this than merely updating multiple array elements with the same value, it has of course been implemented already, and it is presently the best approach to solve this problem.
First: your code did not work because you were using the positional operator $, which only identifies an element to update in an array but does not explicitly specify its position in the array.
What you need is the filtered positional operator $[<identifier>]. It updates all elements that match an array filter condition.
Solution:
db.collection.update({"events.profile":10}, { $set: { "events.$[elem].handled" : 0 } },
{
multi: true,
arrayFilters: [ { "elem.profile": 10 } ]
})
Visit mongodb doc here
What the code does:
{"events.profile":10} filters your collection and return the documents matching the filter
The $set update operator: modifies matching fields of documents it acts on.
{multi:true} It makes .update() modifies all documents matching the filter hence behaving like updateMany()
{ "events.$[elem].handled" : 0 } and arrayFilters: [ { "elem.profile": 10 } ]
This technique involves the use of the filtered positional array with arrayFilters. the filtered positional array here $[elem] acts as a placeholder for all elements in the array fields that match the conditions specified in the array filter.
Array filters
You can update all elements in MongoDB
db.collectioname.updateOne(
  { "key": /vikas/i },
  { $set: {
    "arr.$[].status": "completed"
  } }
)
It will update all the "status" values to "completed" in the "arr" array.
If only one document:
db.collectioname.updateOne(
  { key: "someunique", "arr.key": "myuniq" },
  { $set: {
    "arr.$.status": "completed",
    "arr.$.msgs": {
      "result": ""
    }
  } }
)
But if it is not just one, and you also don't want all of the elements in the array to be updated, then you need to loop through the elements and update inside an if block:
db.collectioname.find({ findCriteria })
  .forEach(function (doc) {
    doc.arr.forEach(function (singlearr) {
      if (/* singlearr check */) {
        singlearr.handled = 0
      }
    });
    db.collection.save(doc);
  });
I'm amazed this still hasn't been addressed in Mongo. Overall Mongo doesn't seem to be great when dealing with sub-arrays. You can't simply count sub-arrays, for example.
I used Javier's first solution. Read the array into events, then loop through and build the set expression:
var set = {}, i, l;
for (i = 0, l = events.length; i < l; i++) {
  if (events[i].profile == 10) {
    set['events.' + i + '.handled'] = 0;
  }
}
.update(objId, {$set: set});
This can be abstracted into a function using a callback for the conditional test
The thread is very old, but I came looking for an answer here, hence providing a new solution.
With MongoDB version 3.6+, it is now possible to use the all positional operator $[] to update all items in an array. See the official documentation here.
The following query would work for the question asked here. I have also verified it with the Java MongoDB driver and it works successfully.
.update( // or updateMany directly, removing the flag for 'multi'
  {"events.profile":10},
  {$set:{"events.$[].handled":0}}, // notice the empty brackets after the '$' operator
  false,
  true
)
Hope this helps someone like me.
I've been looking for a solution to this using the newest driver for C# 3.6 and here's the fix I eventually settled on. The key here is using "$[]" which according to MongoDB is new as of version 3.6. See https://docs.mongodb.com/manual/reference/operator/update/positional-all/#up.S[] for more information.
Here's the code:
{
var filter = Builders<Scene>.Filter.Where(i => i.ID != null);
var update = Builders<Scene>.Update.Unset("area.$[].discoveredBy");
var result = collection.UpdateMany(filter, update, new UpdateOptions { IsUpsert = true});
}
For more context see my original post here:
Remove array element from ALL documents using MongoDB C# driver
The $[] operator selects all elements in a nested array. You can update all array items with '$[]':
.update({"events.profile":10},{$set:{"events.$[].handled":0}},false,true)
Reference
Please be aware that some answers in this thread suggesting the use of $[] are WRONG:
db.collection.update(
  { "events.profile": 10 },
  { $set: { "events.$[].handled": 0 } },
  { multi: true }
)
The above code will update "handled" to 0 for all elements in the "events" array, regardless of their "profile" value. The query {"events.profile":10} only filters the whole document, not the elements in the array. In this situation it is a must to use $[elem] with arrayFilters to specify the condition on array items, so Neil Lunn's answer is correct.
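To make the contrast concrete, here is the matching-elements-only form of the same update, using $[elem] with arrayFilters as in the correct answers above:
db.collection.update(
  { "events.profile": 10 },
  { "$set": { "events.$[elem].handled": 0 } }, // only elements matching the filter below
  { "arrayFilters": [{ "elem.profile": 10 }], "multi": true }
)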
Actually, the save command is only available on an instance of the Document class. That has a lot of methods and attributes, so you can use the lean() function to reduce the workload.
Refer here: https://hashnode.com/post/why-are-mongoose-mongodb-odm-lean-queries-faster-than-normal-queries-cillvawhq0062kj53asxoyn7j
Another problem with the save function is that it can produce conflicting data when multiple saves happen at the same time. Model.update will keep the data consistent.
So, to update multiple items in a document's array, use your familiar programming language and try something like this (I use mongoose here):
User.findOne({'_id': '4d2d8deff4e6c1d71fc29a07'}).lean().exec()
  .then(usr => {
    if (!usr) return
    usr.events.forEach(e => {
      if (e && e.profile == 10) e.handled = 0
    })
    User.findOneAndUpdate(
      {'_id': '4d2d8deff4e6c1d71fc29a07'},
      {$set: {events: usr.events}},
      {new: true}
    ).lean().exec().then(updatedUsr => console.log(updatedUsr))
  })
Update an array field in multiple documents in MongoDB:
Use $pull or $push with an updateMany query to update array elements in MongoDB.
Notification.updateMany(
  { "_id": { $in: req.body.notificationIds } },
  {
    $pull: { "receiversId": req.body.userId }
  }, function (err) {
    if (err) {
      res.status(500).json({ "msg": err });
    } else {
      res.status(200).json({
        "msg": "Notification Deleted Successfully."
      });
    }
  });
If you want to update an array inside an array:
await Booking.updateOne(
  {
    userId: req.currentUser?.id,
    cart: {
      $elemMatch: {
        id: cartId,
        date: date,
        //timeSlots: {
        //  $elemMatch: {
        //    id: timeSlotId,
        //  },
        //},
      },
    },
  },
  {
    $set: {
      version: booking.version + 1,
      'cart.$[i].timeSlots.$[j].spots': spots,
    },
  },
  {
    arrayFilters: [
      {
        'i.id': cartId,
      },
      {
        'j.id': timeSlotId,
      },
    ],
    new: true,
  }
);
I tried the following and it's working fine:
.update({'events.profile': 10}, { '$set': {'events.$.handled': 0 }}, { safe: true, multi: true }, callback);
// callback function in case of Node.js
