I am stumped on this one. I have an images array within my collection; users can rearrange the order of images on the client side and I am trying to save the new order to the database. The imagesOrder array holds the images in their new order, and it only has the url, so I want to match each url against the urls in the database. I am not sure how to make the index a variable, or whether this is even possible.
This is what I have so far. My code editor shows an error on [index], so I know that is not the proper format, but I am not sure what is:
imagesOrder.forEach((index, image) => {
  const imageUrl = image.url
  const index = index
  Users.update({
    id
  }, {
    $set: {
      images[index]: imageUrl
    }
  })
});
That is not the way you would do this. There is no need to send an update request to the server for every single indexed position of the array. Also, the update() method is asynchronous, so it is not something you ever put inside a forEach(), which does not wait for an asynchronous call to complete.
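If you really did need one request per element, the pattern would be a for...of loop with await (or Promise.all() over the collected promises), never a bare forEach(). A minimal sketch, assuming Users.updateOne() returns a Promise as it does with Mongoose or the native driver:

for (const [index, image] of imagesOrder.entries()) {
  // await each update in turn; the 'images.<index>' dot-notation key is covered further below
  await Users.updateOne({ id }, { $set: { ['images.' + index]: image.url } });
}

That said, there is no need for a loop at all here.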
Instead, the most practical solution is usually to just $set the entire array content in one request. I am also mocking up your imagesOrder into something practical, since forEach() actually has the signature .forEach((<element>, <index>) => ..., which is different from what you were expecting given the code in the question.
var imagesOrder = [
{ index: 0, url: '/one' }, { index: 1, url: '/two' }, { index: 2, url: '/three' }
];
let response = await Users.updateOne(
{ id },
{ "$set": { "images": imagesOrder.map(({ url }) => url) } }
);
// { "$set": { "images": ["/one","/two","/three"] } }
Much like forEach(), map() performs the same array iteration, with the difference that it returns a new array generated by the processing function. That is what you want here, since all that is needed is to extract the value of the url property from each object.
Note that the index properties are already in order and really redundant here, but I'm just approximating what it sounds like you have from your question. Since an "array" maintains its own order, such a property "should" be redundant, and it would be advisable for your source array data to conform to this.
If, however, you managed to record such index values in a way that they are out of order, then the best solution is to add a sort():
var imagesOrder = [
{ index: 2, url: '/three' }, { index: 0, url: '/one' }, { index: 1, url: '/two' }
];
let response = await Users.updateOne(
{ id },
{ "$set": {
"images": imagesOrder.sort((a,b) => a.index - b.index)
.map(({ url }) => url)
}}
);
// { "$set": { "images": ["/one","/two","/three"] } }
As for what you "attempted", it does not benefit you in any way to update each element at a given position. But if you really wanted to see it done, you would again just build up a single update request instead:
var imagesOrder = [
{ index: 2, url: '/three' }, { index: 0, url: '/one' }, { index: 1, url: '/two' }
];
var update = { $set: {} };
for ( let { url, index } of imagesOrder.sort((a,b) => a.index - b.index) ) {
  update.$set['images.'+index] = url;
}
/*
* Creates:
*
* { "$set": {
* "images.0": "/one",
* "images.1": "/two",
* "images.2": "/three"
* }}
*/
let response = await Users.updateOne({ id }, update);
Or, in the case where the index property was not there or is irrelevant because the array is already ordered:
var imagesOrder = [
  { index: 0, url: '/one' }, { index: 1, url: '/two' }, { index: 2, url: '/three' }
];

var update = { $set: {} };

for ( let [index, { url }] of Object.entries(imagesOrder) ) {
  update.$set['images.'+index] = url;
}
/*
* Creates:
*
* { "$set": {
* "images.0": "/one",
* "images.1": "/two",
* "images.2": "/three"
* }}
*/
let response = await Users.updateOne({ id }, update);
So it's all pretty much the same thing. Note that the common form of notation is a "string" key which includes the index position numerically. This is described under Dot Notation in the core documentation for the MongoDB query language.
The main caveat with that second form is that "dot notation" only touches the index positions you explicitly name, so if the stored array does not have the same length as your new order you can end up with leftover elements that were never overwritten, or with positions written beyond what currently exists.
For this reason, even though "replacing" the array as the original examples show has its own pitfalls, it is generally safer than updating via positional indexes in the stored document.
This should be enough to at least get you started in the right direction. Making this work with multiple users possibly updating the data at once can become pretty complicated, in terms of update statements that both check for and merge concurrent changes.
In most cases the simple "replace" will be more than adequate, at least for a while. And of course the main lesson here is not to loop "async" methods where it is completely unnecessary. Most of the time what you really want to "loop" is the construction of the statement, if any looping is required at all, and most of the time it isn't.
Addendum
Just in case you or anyone had it in mind to store an array of objects with the index position values kept inside them, this becomes a little more complex, but it also serves as an example of how to issue an update statement which does not "replace" the array and is still safe, because it does not rely on indexed positions of the array but on matching conditions instead.
This is possible with the filtered positional $[<identifier>] syntax introduced in MongoDB 3.6. It allows conditions to specify which element to update (i.e. by matching url) instead of embedding index positions in the statement directly. It is safer because, if no matching element is found, nothing is changed at all.
As a demonstration, the method to $sort the elements based on the updated index values is also shown. Note that this uses the $push modifier even though the statement does not actually add anything to the array; it just reorders the elements. But that is how you do it atomically.
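Stripped of the surrounding setup, the two core statements are roughly these (a sketch using updateOne() with the arrayFilters option; the identifiers one and two and the matching urls mirror the sample data further below):

// 1. Match each element by its url and rewrite its index value
await User.updateOne(
  { _id },
  { $set: { 'images.$[one].index': 0, 'images.$[two].index': 1 } },
  { arrayFilters: [ { 'one.url': '/one' }, { 'two.url': '/two' } ] }
);

// 2. Re-sort the array in place by the new index values
await User.updateOne(
  { _id },
  { $push: { images: { $each: [], $sort: { index: 1 } } } }
);

The full listing below puts this together with Mongoose, including the setup and sample data: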
const { Schema, Types: { ObjectId } } = mongoose = require('mongoose');

const uri = 'mongodb://localhost:27017/longorder';
const opts = { useNewUrlParser: true };

// sensible defaults
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
mongoose.set('useFindAndModify', false);
mongoose.set('useCreateIndex', true);

// schema defs
const imageSchema = new Schema({
  index: Number,
  url: String
})

const userSchema = new Schema({
  images: [imageSchema]
});

const User = mongoose.model('User', userSchema);

// log helper
const log = data => console.log(JSON.stringify(data, undefined, 2));

(async function() {

  try {
    const conn = await mongoose.connect(uri, opts);

    // clean models
    await Promise.all(
      Object.entries(conn.models).map(([k,m]) => m.deleteMany())
    );

    // Create data
    let _id = new ObjectId();
    let user = await User.findOneAndUpdate(
      { _id },
      {
        '$push': {
          'images': {
            '$each': [
              { index: 2, url: '/one' },
              { index: 0, url: '/three' },
              { index: 1, url: '/two' }
            ],
            '$sort': { 'index': 1 }
          }
        }
      },
      { 'new': true, 'upsert': true }
    );
    log(user);

    // Change order request
    let orderImages = [
      { index: 2, url: '/three' },
      { index: 0, url: '/one' },
      { index: 1, url: '/two' }
    ];

    let $set = { };
    let arrayFilters = [];

    for ( let { index, url } of orderImages ) {
      let key = url.replace(/^\//,'');
      arrayFilters.push({ [`${key}.url`]: url });
      $set[`images.$[${key}].index`] = index;
    }

    let ops = [
      // Update the index value of each matching item
      { 'updateOne': {
        'filter': { _id },
        'update': { $set },
        arrayFilters
      }},
      // Re-sort the array by index value
      { 'updateOne': {
        'filter': { _id },
        'update': {
          '$push': {
            'images': { '$each': [], '$sort': { 'index': 1 } }
          }
        }
      }}
    ];
    log(ops);

    let response = await User.bulkWrite(ops);
    log(response);

    let newuser = await User.findOne({ _id });
    log(newuser);

  } catch(e) {
    console.error(e)
  } finally {
    mongoose.disconnect()
  }

})()
And the output, showing initial document state, the update and actual changes made:
Mongoose: users.deleteMany({}, {})
Mongoose: users.findOneAndUpdate({ _id: ObjectId("5bf6116621293f2ab3dec3d3") }, { '$setOnInsert': { __v: 0 }, '$push': { images: { '$each': [ { _id: ObjectId("5bf6116621293f2ab3dec3d6"), index: 2, url: '/one' }, { _id: ObjectId("5bf6116621293f2ab3dec3d5"), index: 0, url: '/three' }, { _id: ObjectId("5bf6116621293f2ab3dec3d4"), index: 1, url: '/two' } ], '$sort': { index: 1 } } } }, { upsert: true, remove: false, projection: {}, returnOriginal: false })
{
"_id": "5bf6116621293f2ab3dec3d3",
"__v": 0,
"images": [
{
"_id": "5bf6116621293f2ab3dec3d5",
"index": 0,
"url": "/three"
},
{
"_id": "5bf6116621293f2ab3dec3d4",
"index": 1,
"url": "/two"
},
{
"_id": "5bf6116621293f2ab3dec3d6",
"index": 2,
"url": "/one"
}
]
}
[
{
"updateOne": {
"filter": {
"_id": "5bf6116621293f2ab3dec3d3"
},
"update": {
"$set": {
"images.$[three].index": 2,
"images.$[one].index": 0,
"images.$[two].index": 1
}
},
"arrayFilters": [
{
"three.url": "/three"
},
{
"one.url": "/one"
},
{
"two.url": "/two"
}
]
}
},
{
"updateOne": {
"filter": {
"_id": "5bf6116621293f2ab3dec3d3"
},
"update": {
"$push": {
"images": {
"$each": [],
"$sort": {
"index": 1
}
}
}
}
}
}
]
Mongoose: users.bulkWrite([ { updateOne: { filter: { _id: 5bf6116621293f2ab3dec3d3 }, update: { '$set': { 'images.$[three].index': 2, 'images.$[one].index': 0, 'images.$[two].index': 1 } }, arrayFilters: [ { 'three.url': '/three' }, { 'one.url': '/one' }, { 'two.url': '/two' } ] } }, { updateOne: { filter: { _id: 5bf6116621293f2ab3dec3d3 }, update: { '$push': { images: { '$each': [], '$sort': { index: 1 } } } } } } ], {})
{
"ok": 1,
"writeErrors": [],
"writeConcernErrors": [],
"insertedIds": [],
"nInserted": 0,
"nUpserted": 0,
"nMatched": 2,
"nModified": 2,
"nRemoved": 0,
"upserted": [],
"lastOp": {
"ts": "6626503031506599940",
"t": 139
}
}
Mongoose: users.findOne({ _id: ObjectId("5bf6116621293f2ab3dec3d3") }, { projection: {} })
{
"_id": "5bf6116621293f2ab3dec3d3",
"__v": 0,
"images": [
{
"_id": "5bf6116621293f2ab3dec3d6",
"index": 0,
"url": "/one"
},
{
"_id": "5bf6116621293f2ab3dec3d4",
"index": 1,
"url": "/two"
},
{
"_id": "5bf6116621293f2ab3dec3d5",
"index": 2,
"url": "/three"
}
]
}
Related
Hello, I am trying to make a calculated field in MongoDB, however I get this error:
MongoError: The dollar ($) prefixed field '$add' in '$add' is not valid for storage.
This is the code:
router.post('/char1btn', ensureAuthenticated, function(req, res) {
  const { MongoClient } = require("mongodb");

  // Replace the uri string with your MongoDB deployment's connection string.
  const uri =
    'mongodb+srv://test:test1234#databasetest.5f9jh.mongodb.net/Databasetest?retryWrites=true&w=majority';
  const client = new MongoClient(uri);

  async function run() {
    try {
      await client.connect();
      const database = client.db("Databasetest");
      const collection = database.collection("users");

      // create a filter for charactername to update
      const filter = { characterimg: "" };

      // this option instructs the method to create a document if no documents match the filter
      const options = { upsert: false };

      const updateDoc = {
        $set: {
          health: 150,
          attack: 3,
          defence: 3,
          endurance: 10,
          characterimg: "https://i.ibb.co/MPg2SMp/Apocaliptic1.png",
        },
        $set: {
          $add: ["$power", "$attack", "$defence", {
            $devide: ["$endurance", 3]
          }]
        }
      }

      const result = await collection.updateOne(filter, updateDoc, options);
      console.log(
        `${result.matchedCount} document(s) matched the filter, updated ${result.modifiedCount} document(s)`,
      );
    } finally {
      res.redirect('/main');
      await client.close();
    }
  }
  run().catch(console.dir);
})
Does anyone know how to fix this?
Try this one:
const updateDoc = [
{
$set: {
health: 150,
attack: 3,
defence: 3,
endurance: 10,
characterimg: "https://i.ibb.co/MPg2SMp/Apocaliptic1.png",
}
},
{
$set: {
result: { $sum: ["$power", "$attack", "$defence", { $divide: ["$endurance", 3] }] }
}
}
];
const result = await collection.updateOne(filter, updateDoc, options);
You need two $set stages. Otherwise $sum: [...] (or $add) will use the old values, or fail if the fields did not exist before. Also be aware that updateDoc needs to be an array (an aggregation pipeline); see updateOne().
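To illustrate, a single combined stage like the hypothetical sketch below would compute power from the document as it looked before this stage ran, not from the values being set alongside it:

// one stage only: $attack, $defence and $endurance resolve against the
// incoming document, so power is built from the old (or missing) values
{
  $set: {
    attack: 3,
    defence: 3,
    endurance: 10,
    power: { $add: ["$attack", "$defence", { $divide: ["$endurance", 3] }] }
  }
}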
Update based on a comment by @WernfriedDomscheit:
If you are using MongoDB version 4.2 or newer, then you can use a pipeline inside the update, so the update query looks like this:
db.users.updateOne({},
[
{
$set: {
attack: 1,
defence: 2,
endurance: 3
}
},
{
$set: {
power: {
"$add": ["$attack", "$defence", { $divide: ["$endurance", 3] }]
}
}
}
],
{ upsert: true }
);
will have the output:
{
"_id" : ObjectId("603124a22391a75d9a2ddec0"),
"attack" : 1,
"defence" : 2,
"endurance" : 3,
"power" : 4
}
So in your case:
const updateDoc = [
{
$set: {
health: 150,
attack: 3,
defence: 3,
endurance: 10,
characterimg: "https://i.ibb.co/MPg2SMp/Apocaliptic1.png",
}
},
{
$set: {
power: {
$add: ["$attack", "$defence", { $divide: ["$endurance", 3] }]
}
}
}
];
I want to check whether the current user is present in a nested array or not.
This is a small part of my sample array, which I get from an API:
[
{
"owner":"abc",
"_id":"xyz77",
"comments":[
],
"likes":[
{
"_id":"9999",
"username":"user1"
},
{
"_id":"9998",
"username":"user2"
}
]
},
{
"owner":"bcd"
}
]
I want to see whether user1 is present in the likes array or not.
If yes, then it should give output like this:
[
{
"owner":"abc",
"user1":true
},
{
"owner":"bcd",
"user1":true
},
{
"owner":"def",
"user1":false
}
]
In the result above, the likes array of owner abc contains user1, but it is not present for owner def.
I tried array.some on the likes array inside a forEach over the owner array, but I am not getting the proper result.
Help is appreciated.
You can use a combination of Array.prototype.map and Array.prototype.some to create a resulting array which checks if any of the users in the likes array of each owner object matches your username:
const data = [
{
"owner":"abc",
"_id":"xyz77",
"comments":[],
"likes":[
{
"_id":"9999",
"username":"user1"
},
{
"_id":"9998",
"username":"user2"
}
]
},
{
"owner":"bcd",
"_id":"xyz88",
"comments":[],
"likes":[
{
"_id":"9998",
"username":"user2"
},
{
"_id":"9997",
"username":"user3"
}
]
},
];
const checkUsername = (data, username) => {
return data.map(e => {
const x = { owner: e.owner };
x[username] = e.likes.some(el => el.username === username);
return x;
});
};
console.log(checkUsername(data, 'user1'));
console.log(checkUsername(data, 'user2'));
It's similar to @awarrier99's answer: use destructuring along with map and some.
const data = [
{
owner: "abc",
_id: "xyz77",
comments: [],
likes: [
{
_id: "9999",
username: "user1",
},
{
_id: "9998",
username: "user2",
},
],
},
{
owner: "bcd",
},
];
const update = (arr, user) =>
  arr.map(({ likes = [], owner }) => ({
    [user]: likes.some(({ username }) => username === user),
    owner,
  }));
console.log(update(data, "user1"));
How do I check whether a JSON key's value is an array when I don't know the key's name?
I may receive 2 different formats of a JSON object:
{
"person": [{
"id": "1",
"x": "abc",
"attributes": ["something"]
},
{
"id": "1",
"x": "abc"
}
]
}
Depending on the format I will parse it differently. My goal is to create an if statement that detects whether the value of a key is an array or just a single value, without knowing the name of the key (in the code above, let's assume I don't actually know the name of "attributes"). How do I achieve that? Not only do I have to loop through all the person objects, but also through all of their keys.
I found a solution that works when you know the name of the attribute and there is just one "person" object, but I don't know how to build on that with multiple "person" objects and without knowing the name of the key:
if (Array.isArray(json.person['attributes'])) // assuming I hold the JSON in json var and I parse it with JSON.parse
{
}
You can try something like this:
Data payload:
const payload = {
person: [
{
id: 1,
x: 'abc',
attributes: ['something']
},
{
id: 1,
x: 'abc'
}
]
};
A function that will report whether any entry has an array as its value:
const arrayEntries = json => {
let response = [{
isArray: false,
jsonKey: null,
jsonValue: null
}];
Object.entries(json).map(entry => {
if (Array.isArray(entry[1])) {
response.push({
isArray: true,
jsonKey: entry[0],
jsonValue: entry[1]
});
}
});
return response;
}
Usage example:
payload.person.map(person => {
console.log(arrayEntries(person));
});
Return will be something like this:
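[ { isArray: false, jsonKey: null, jsonValue: null },
  { isArray: true, jsonKey: 'attributes', jsonValue: [ 'something' ] } ]
[ { isArray: false, jsonKey: null, jsonValue: null } ]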
Codepen link: https://codepen.io/WIS-Graphics/pen/ExjVPEz
ES5 Version:
function arrayEntries(json) {
var response = [{
isArray: false,
jsonKey: null,
jsonValue: null
}];
Object.entries(json).map(function(entry) {
if (Array.isArray(entry[1])) {
response.push({
isArray: true,
jsonKey: entry[0],
jsonValue: entry[1]
});
}
});
return response;
}
payload.person.map(function(person) {
console.log(arrayEntries(person));
});
Is there a way to transform the output of a DynamoDB query (using the document client in Lambda) during the query itself? Specifically, I wish to extract the first item of a list and put it in a new attribute.
As a simplified example the DynamoDB has the following entries:
{
"Id": 1
"Items": [ "item-1", "item-2", "item-3" ]
},
{
"Id": 2,
"Items": [ "item-x" ]
},
{
"Id": 3,
// "Items" is potentially optional
}
And using the following Lambda function:
// ...
exports.handler = async (event, context) => {
return dynamoDoc.query({
TableName : 'some-table',
Select : 'SPECIFIC_ATTRIBUTES',
KeyConditionExpression : 'Id = :id',
ExpressionAttributeValues : {
':id' : event.Id,
},
ProjectionExpression : `
Id,
Items[0]
`,
}).promise();
};
However this returns a list with items which look like:
"Items": [
{
"Id": 1,
"Items": [
"item-1"
]
},
{
"Id": 2,
"Items": [
"item-x"
]
},
{
"Id": 3
}
]
Is there a way to remap attribute names using some kind of expression, such that I could get the data output in the form of:
Items: [
{
"Id": 1,
"FirstItem: "item-1"
},
{
"Id": 2,
"FirstItem: "item-x"
},
{
"Id": 3
}
]
I am currently using Javascript's array.forEach on the data afterward, however, I am trying to avoid this and would rather leverage DynamoDB for this computation.
No, there is no way to do that within the query itself.
Best practice is to create a helper function, like a repository: the helper wraps all queries to DynamoDB and returns the final object to the handler function.
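A rough sketch of such a helper, reusing the query from the question (the getFirstItems name and file layout are illustrative only, not part of the SDK):

// repository.js (hypothetical module)
const getFirstItems = async (id) => {
  const result = await dynamoDoc.query({
    TableName: 'some-table',
    Select: 'SPECIFIC_ATTRIBUTES',
    KeyConditionExpression: 'Id = :id',
    ExpressionAttributeValues: { ':id': id },
    ProjectionExpression: 'Id, Items[0]',
  }).promise();

  // reshape here, so handlers only ever see the final form
  return result.Items.map(({ Id, Items }) => ({
    Id,
    FirstItem: Items && Items.length ? Items[0] : undefined,
  }));
};

// handler.js
exports.handler = async (event) => getFirstItems(event.Id);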
As a quick fix, use .map instead of .forEach:
exports.handler = async (event, context) => {
const result = await dynamoDoc.query({
TableName: 'some-table',
Select: 'SPECIFIC_ATTRIBUTES',
KeyConditionExpression: 'Id = :id',
ExpressionAttributeValues: {
':id': event.Id,
},
ProjectionExpression: `
Id,
Items[0]
`,
}).promise();
result.Items = result.Items.map((i) => {
return {
Id: i.Id,
FirstItem: i.Items && i.Items.length ? i.Items[0] : undefined,
}
});
return result; // final result
};
Based on the data below, I'm looking to do something like "find block 1 where the parent object's name is 'Panel'".
So, I tried setting up a compound index like this:
objStore.createIndex('by_name_and_block', ['Name', 'blocks.Name']);
And then calling it (sort-of) like this:
var index = objStore.index("by_name_and_block");
var request = index.get("Panel", "1");
// I've also tried:
// var request = index.get(["Panel","1"]);
...
But this doesn't work. Is there a way to set up this compound index in IndexedDB?
Sample data:
[
{
Name: "Post",
blocks: [
{
Name:"1",
Arrays:[]
},
{
Name:"2",
Arrays:[]
},
]
},
{
Name: "Panel",
blocks: [
{
Name:"1",
Arrays:[]
},
{
Name:"2",
Arrays:[]
},
]
},
]
Your data cannot be indexed that way under the current specification. See the steps for extracting a key from a keyPath: an object is not a valid key value, so the 'blocks.Name' key path, which would have to reach inside an array of objects, cannot produce index keys.
In v2 of the spec, you will be able to use an index function expression for this.
Currently, you will have to generate an extra field before you persist the record into the database and remove it after retrieval, and use a multiEntry index rather than a compound index.
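A minimal sketch of that workaround, assuming an object store named 'panels' and a derived blockNames field added purely for indexing (both names are illustrative):

// in onupgradeneeded: index the derived array of block names
objStore.createIndex('by_block_name', 'blockNames', { multiEntry: true });

// before persisting: generate the extra field from the nested objects
record.blockNames = record.blocks.map(function(b) { return b.Name; });
objStore.put(record);

// querying: find records that contain a block named "1", then pick the parent by Name
var request = db.transaction('panels')
  .objectStore('panels')
  .index('by_block_name')
  .getAll('1');

request.onsuccess = function() {
  var parent = request.result.find(function(rec) { return rec.Name === 'Panel'; });
  // remove the helper field after retrieval if you don't want it in your objects
  if (parent) delete parent.blockNames;
  console.log(parent);
};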
Is this script what you want?
var obj = [
{
Name: "Post",
blocks: [
{
Name:"1",
Arrays:[]
},
{
Name:"2",
Arrays:[]
},
]
},
{
Name: "Panel",
blocks: [
{
Name:"1",
Arrays:[]
},
{
Name:"2",
Arrays:[]
},
]
},
];
function getBlockByName(objName, index){
for(var i = 0; i < obj.length; i++){
if(obj[i].Name == objName)
return obj[i].blocks[index];
}
return false;
}
//Index starting from 0
console.log(getBlockByName("Panel", 1));
//Will return {Name:"2", Arrays:[]} object