Javascript Array of object value into multidimensional array - javascript

I am in the process of learning Express.js with Node.js.
I am trying to insert multiple rows into a MySQL table. Since the bulk insert query requires data like
[["a",1], ["b",2], ["c",3]]
How can I transform my array of objects into that form? Here is my JSON POST data:
[
{
"productID" : 1,
"stock": -3
},
{
"productID" : 1,
"stock": 5
}
]
How can I transform such a JSON object into the following multidimensional array?
[[1,-3],[1,5]]
Here is what I have tried so far.
let promises = []
req.body.map((n) => {
promises.push(new Promise(resolve => {
let { productID, stock } = n
let values = {
PRODUCT_ID: productID,
STOCK: stock
}
let sql = 'INSERT INTO product_stock_history SET ?'
db.connection.query(sql, values, (err, results) => {
if (err) {
console.log("Failed to add stocks record: " + err)
res.sendStatus(500)
return
} else {
res.send("Stock record has been added")
}
})
}))
})
The above code is working, but in the end I have an error with the MySQL syntax which I believe has something to do with the promise. I am not familiar with promises :)
Error: Can't set headers after they are sent.
So what I want to achieve is just the mapping, without the Promise.
Thanks.

You could pass Object.values as a parameter to map, like this:
const input = [
{
"productID" : 1,
"stock": -3
},
{
"productID" : 1,
"stock": 5
}
]
const output = input.map(Object.values)
console.log(output)
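Once you have the nested array, you can also hand it to a single bulk INSERT instead of running one query per row, which avoids calling res.send more than once (the cause of the "Can't set headers after they are sent" error). A minimal sketch, assuming the mysql driver (which expands a nested array bound to a single ? placeholder into grouped value lists) and the PRODUCT_ID/STOCK columns from your query:
// Sketch only: assumes the `mysql` driver and the table/columns used above.
const values = req.body.map(({ productID, stock }) => [productID, stock]);
// values is now [[1, -3], [1, 5]]

const sql = 'INSERT INTO product_stock_history (PRODUCT_ID, STOCK) VALUES ?';
db.connection.query(sql, [values], (err, results) => {
  if (err) {
    console.log('Failed to add stock records: ' + err);
    return res.sendStatus(500);
  }
  res.send('Stock records have been added'); // single response for the whole batch
});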

Related

How to loop through objects and count unique values of key?

I have a directory of JSON log files which are objects that look like:
{
  "logs": [
    {
      "id": "12321321321321",
      "email": "test#email.com",
      "message": "ahahaha"
    },
    {
      "id": "12321321312",
      "email": "test#email.com",
      "message": "hahahaha."
    }
  ],
  "id": "12321321321"
}
I need to return a new object that contains
{
"hello_id": outer id of the json file,
"array": [
{
"email": "test#me.com",
"total": 2
}
]
}
So far I am looping through the json files and have
jsonsInDirectory.forEach((file) => {
  const fileData = fs.readFileSync(path.join("./logs", file), "utf8");
  const jsonData = JSON.parse(fileData);
});
The key is "logs" and "id" and the values are the objects in the "logs" and the value of "id"
How can I count and return a new object at the same time?
You can try this approach: make a hash object that counts emails. Then just map it to an array of objects.
const data = {
logs: [{
id: "89004ef9-e825-4547-a83a-c9e9429e8f95",
email: "noah.sanchez#me.com",
message: "successfully handled skipped operation."
},
{
id: "89004ef9-e825-4547-a83a-c9e9429e8f95",
email: "noah.sanchez#me.com",
message: "successfully handled skipped operation."
},
{
id: "89004ef9-e825-4547-a83a-c9e9429e8f95",
email: "noname#me.com",
message: "successfully handled skipped operation."
}],
id: "56f83bed-3705-4115-9067-73930cbecbc0",
};
const emails = data.logs.reduce((acc, { email }) => {
acc[email] = (acc[email] ?? 0) + 1;
return acc;
}, {});
const tally = Object.entries(emails)
.map(([email, total]) => ({ email, total }));
const result = { logs_id: data.id, tally };
console.log(result)
When you do const jsonData = JSON.parse(fileData);, you get the file data as a JSON object, and knowing the structure of that JSON you can easily get the info.
I have created an example: https://codesandbox.io/s/stackoverflow-logs-count-example-jys2vg?file=/src/index.js
It may not solve exactly what you want.
To solve this problem with the most time efficiency, you can first build a map of the occurrences of each email, with the email as the key and the number of occurrences as the value (each lookup and update takes constant, O(1), time), after which you can create the tally array from the map, as given below:
output = []
jsonsInDirectory.forEach((file) => {
const fileData = fs.readFileSync(path.join("./logs", file), "utf8");
const jsonData = JSON.parse(fileData);
var map = {}
jsonData.logs.forEach((log) => {
if(log.email in map){
map[log.email] += 1
}
else {
map[log.email] = 1
}
});
var tally = []
for(var email in map){
tally.push({email: email, total: map[email]})
}
output.push({logs_id: jsonData['id'], tally: tally});
})

How to change my object from being only one object with arrays inside, to an array of objects?

So I've got this part of the code where I'm creating a response for my project. I've managed to create the data, but I've got a response that I need to change.
First here is my code:
exports.getById = (req, res) => {
const id = req.params.a_id;
articleService
.getById(id)
.then((article) => {
bankService
.getRates()
.then((list) => {
let prr = article.price;
let price = parseFloat(prr.replace(/\.| ?€$/g, '').replace(',', '.'));
let mjeseci = req.body.months;
let ratanks = list.map((rata) =>
LoanJS.Loan(price, !mjeseci ? 60 : mjeseci, rata.NKS)
);
const kreditNKS = ratanks.map((index) => index.sum);
const rataNKS = ratanks.map(
(index) => index.installments[0].installment
);
let eks = list.map((stopa) => stopa.EKS);
let name = list.map((ime) => ime.bank.name);
let nks = list.map((stopa) => stopa.NKS);
let type = list.map((ime) => ime.interest_type.name);
res.status(200).json({
kredit: {
kreditNKS: kreditNKS,
rataNKS: rataNKS,
stopaEKS: eks,
stopaNKS: nks,
tip: type,
ime: name,
}
})
.catch((err) => {
res.status(500).send('Error 1 ->' + err);
});
})
.catch((err) => {
res.status(500).send('Error ->' + err);
});
});
};
Explanation of what it does: I'm fetching a single article from my DB, which has a price inside it, then I'm also getting data about the loan from the DB. I'm using that data with the .map function to get the values one by one and calculating my final loan for those values (that is the ratanks part). I'm also extracting some other data that I need to present to the user on the frontend.
Now my problem: it's sending my res as an object containing one object, which has key:value pairs whose values are arrays of data. But I want it to be an array with multiple objects.
My response in postman right now:
{
"kredit": {
"kreditNKS": [
118406.54,
118348.2,
119400.33,
118022.46,
118262.44,
118811.84
],
"rataNKS": [
19734.42,
19724.7,
19900.05,
19670.41,
19710.41,
19801.97
],
"stopaEKS": [
"6.24",
"5.65",
"8.26",
"3.13",
"4.03",
"5.68"
],
"stopaNKS": [
"4.11",
"3.94",
"7",
"2.99",
"3.69",
"5.29"
],
"tip": [
"Fiksna",
"Promjenjiva",
"Fiksna",
"Promjenjiva",
"Fiksna",
"Fiksna"
],
"ime": [
"ZiraatBank",
"ZiraatBank",
"UniCredit",
"Raiffeisen Bank",
"Raiffeisen Bank",
"ASA Banka"
]
}
}
Where I need it to be something like this:
[
{
"kreditNKS":118406.54,
"rataNKS": 19734.42,
"stopaEKS": "6.24",
"stopaNKS": "4.11",
"tip": "Fiksna",
"ime": "ZiraatBank"
},
{
"kreditNKS":118348.2,
"rataNKS": 19724.7,
"stopaEKS": "5.65",
"stopaNKS": "3.94",
"tip": "Promjenjiva",
"ime": "ZiraatBank"
},
{
"kreditNKS":119400.33,
"rataNKS": 19900,05,
"stopaEKS": "8.26",
"stopaNKS": "7",
"tip": "Fiksna",
"ime": "UniCredit"
}
etc.....
]
Is it possible to modify something like this?
Any tips are welcome!
Thanks!
It looks like you're mapping your list into four arrays and then putting them inside a single object, where each array is a property of said object.
let eks = list.map((stopa) => stopa.EKS);
let name = list.map((ime) => ime.bank.name);
let nks = list.map((stopa) => stopa.NKS);
let type = list.map((ime) => ime.interest_type.name);
res.status(200).json({
kredit: {
kreditNKS: kreditNKS,
rataNKS: rataNKS,
stopaEKS: eks,
stopaNKS: nks,
tip: type,
ime: name,
}
})
The way someArray.map(eachThing => doSomethingWithThing(eachThing)) works is that you iterate over the entire array "someArray" and execute a function for each thing inside it.
This means that instead of doing LoanJS.Loan(price, !mjeseci ? 60 : mjeseci, rata.NKS) for all the items of the list and writing that to a new array called "ratanks", you can write your own function that runs for every item of the list and, during the iteration of each item, do something like const loan = LoanJS.Loan(price, !mjeseci ? 60 : mjeseci, eachItem.NKS).
This being said, you should be able to get the object you want by mapping your list into an array of objects like this:
const mappedKredits = list.map((eachObject) => {
const computedLoan = LoanJS.Loan(price, !mjeseci ? 60 : mjeseci, eachObject.NKS);
const kreditNKS = computedLoan.sum;
const rataNKS = computedLoan.installments[0].installment;
return {
"kreditNKS": kreditNKS,
"rataNKS": rataNKS,
"stopaEKS": eachObject.EKS,
"stopaNKS": eachObject.NKS,
"tip": eachObject.interest_type.name,
"ime": eachObject.bank.name,
}
});
res.status(200).json(mappedKredits);
Btw, try to use more descriptive names for the variables; if they're in English it'll be even better. Otherwise it makes it a bit harder for folks like me to understand what the code is doing, and thus harder to help you.

How to count the total number of pages? [duplicate]

I am interested in optimizing a "pagination" solution I'm working on with MongoDB. My problem is straightforward. I usually limit the number of documents returned using the limit() functionality. This forces me to issue a redundant query without the limit() function in order to also capture the total number of documents in the query, so I can pass that to the client, letting them know they'll have to issue additional request(s) to retrieve the rest of the documents.
Is there a way to condense this into one query? Get the total number of documents but at the same time only retrieve a subset using limit()? Is there a different way to think about this problem than the way I am approaching it?
MongoDB 3.4 has introduced the $facet aggregation stage, which processes multiple aggregation pipelines within a single stage on the same set of input documents. Using $facet and $group you can find documents with $limit and also get the total count.
You can use the aggregation below in MongoDB 3.4:
db.collection.aggregate([
{ "$facet": {
"totalData": [
{ "$match": { }},
{ "$skip": 10 },
{ "$limit": 10 }
],
"totalCount": [
{ "$group": {
"_id": null,
"count": { "$sum": 1 }
}}
]
}}
])
You can even use the $count aggregation stage, which was introduced in MongoDB 3.6.
You can use the aggregation below in MongoDB 3.6:
db.collection.aggregate([
{ "$facet": {
"totalData": [
{ "$match": { }},
{ "$skip": 10 },
{ "$limit": 10 }
],
"totalCount": [
{ "$count": "count" }
]
}}
])
No, there is no other way: two queries, one for the count and one with the limit. Or you have to use a different database. Apache Solr, for instance, works the way you want: every query there is limited and returns totalCount.
MongoDB allows you to use cursor.count() even when you pass limit() or skip().
Let's say you have a db.collection with 10 items.
You can do:
async function getQuery() {
let query = await db.collection.find({}).skip(5).limit(5); // returns last 5 items in db
let countTotal = await query.count() // returns 10-- will not take `skip` or `limit` into consideration
let countWithConstraints = await query.count(true) // returns 5 -- will take into consideration `skip` and `limit`
return { query, countTotal }
}
Here's how to do this with MongoDB 3.4+ (with Mongoose) using $facet. This example returns a $count based on the documents after they have been matched.
const facetedPipeline = [
{ "$match": { "dateCreated": { $gte: new Date('2021-01-01') } } },
{ "$project": { 'exclude.some.field': 0 } },
{
"$facet": {
"data": [
{ "$skip": 10 },
{ "$limit": 10 }
],
"pagination": [
{ "$count": "total" }
]
}
}
];
const results = await Model.aggregate(facetedPipeline);
This pattern is useful for getting pagination information to return from a REST API.
Reference: MongoDB $facet
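A short sketch of unpacking that result, under the assumption that results comes from the pipeline above: $facet returns a single document whose fields are arrays, so the total sits inside pagination[0].
// Unpack the single $facet result document (assumes `results` from the aggregate above).
const [{ data, pagination }] = results;
const total = pagination[0]?.total ?? 0; // pagination is an empty array when nothing matched
console.log(`Showing ${data.length} of ${total} documents`);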
Times have changed, and I believe you can achieve what the OP is asking by using aggregation with $sort, $group and $project. For my system, I needed to also grab some user info from my users collection. Hopefully this can answer any questions around that as well. Below is the aggregation pipeline. The last three objects (sort, group and project) are what handle getting the total count, then providing pagination capabilities.
db.posts.aggregate([
{ $match: { public: true } },
{ $lookup: {
from: 'users',
localField: 'userId',
foreignField: 'userId',
as: 'userInfo'
} },
{ $project: {
postId: 1,
title: 1,
description: 1,
updated: 1,
userInfo: {
$let: {
vars: {
firstUser: {
$arrayElemAt: ['$userInfo', 0]
}
},
in: {
username: '$$firstUser.username'
}
}
}
} },
{ $sort: { updated: -1 } },
{ $group: {
_id: null,
postCount: { $sum: 1 },
posts: {
$push: '$$ROOT'
}
} },
{ $project: {
_id: 0,
postCount: 1,
posts: {
$slice: [
'$posts',
currentPage ? (currentPage - 1) * RESULTS_PER_PAGE : 0,
RESULTS_PER_PAGE
]
}
} }
])
There is a way in MongoDB 3.4: $facet.
You can do:
db.collection.aggregate([
{
$facet: {
data: [{ $match: {} }],
total: { $count: 'total' }
}
}
])
Then you will be able to run the two aggregations at the same time.
By default, the count() method ignores the effects of cursor.skip() and cursor.limit() (MongoDB docs).
As the count method excludes the effects of limit and skip, you can use cursor.count() to get the total count:
const cursor = await database.collection(collectionName).find(query).skip(offset).limit(limit)
return {
data: await cursor.toArray(),
count: await cursor.count() // this will give count of all the documents before .skip() and limit()
};
It all depends on the pagination experience you need as to whether or not you need to do two queries.
Do you need to list every single page, or even a range of pages? Does anyone even go to page 1051? Conceptually, what does that actually mean?
There's been a lot of UX work on pagination patterns. Avoid the pains of pagination covers various types of pagination and their scenarios, and many don't need a count query to know if there's a next page. For example, if you display 10 items on a page and you limit the query to 13, you'll know if there's another page.
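A minimal sketch of that "fetch one extra" idea, assuming the Node.js MongoDB driver; collection, query and pageSize are placeholders:
// Ask for one more document than the page size; its presence tells us whether
// a next page exists, without a separate count query.
async function getPage(collection, query, page, pageSize = 10) {
  const docs = await collection
    .find(query)
    .skip(page * pageSize)
    .limit(pageSize + 1) // one extra, never shown to the user
    .toArray();

  return {
    items: docs.slice(0, pageSize),      // the page actually displayed
    hasNextPage: docs.length > pageSize  // true if the extra document came back
  };
}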
MongoDB has introduced a new method for getting only the count of the documents matching a given query and it goes as follows:
const result = await db.collection('foo').count({name: 'bar'});
console.log('result:', result) // prints the matching doc count
Recipe for usage in pagination:
const query = {name: 'bar'};
const skip = (pageNo - 1) * pageSize; // assuming pageNo starts from 1
const limit = pageSize;
const [listResult, countResult] = await Promise.all([
db.collection('foo')
.find(query)
.skip(skip)
.limit(limit),
db.collection('foo').count(query)
])
return {
totalCount: countResult,
list: listResult
}
For more details on db.collection.count visit this page
It is possible to get the total result size without the effect of limit() using count() as answered here:
Limiting results in MongoDB but still getting the full count?
According to the documentation you can even control whether limit/pagination is taken into account when calling count():
https://docs.mongodb.com/manual/reference/method/cursor.count/#cursor.count
Edit: in contrast to what is written elsewhere, the docs clearly state that "The operation does not perform the query but instead counts the results that would be returned by the query", which, from my understanding, means that only one query is executed.
Example:
> db.createCollection("test")
{ "ok" : 1 }
> db.test.insert([{name: "first"}, {name: "second"}, {name: "third"},
{name: "forth"}, {name: "fifth"}])
BulkWriteResult({
"writeErrors" : [ ],
"writeConcernErrors" : [ ],
"nInserted" : 5,
"nUpserted" : 0,
"nMatched" : 0,
"nModified" : 0,
"nRemoved" : 0,
"upserted" : [ ]
})
> db.test.find()
{ "_id" : ObjectId("58ff00918f5e60ff211521c5"), "name" : "first" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c6"), "name" : "second" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c7"), "name" : "third" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c8"), "name" : "forth" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c9"), "name" : "fifth" }
> db.test.count()
5
> var result = db.test.find().limit(3)
> result
{ "_id" : ObjectId("58ff00918f5e60ff211521c5"), "name" : "first" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c6"), "name" : "second" }
{ "_id" : ObjectId("58ff00918f5e60ff211521c7"), "name" : "third" }
> result.count()
5 (total result size of the query without limit)
> result.count(1)
3 (result size with limit(3) taken into account)
Try as below:
cursor.count(false, function(err, total){ console.log("total", total) })
core.db.users.find(query, {}, {skip:0, limit:1}, function(err, cursor){
if(err)
return callback(err);
cursor.toArray(function(err, items){
if(err)
return callback(err);
cursor.count(false, function(err, total){
if(err)
return callback(err);
console.log("cursor", total)
callback(null, {items: items, total:total})
})
})
})
A word of caution when using aggregation for pagination: it's better to use two queries if the API is used frequently by users to fetch data. This is at least 50 times faster than getting the data using an aggregation on a production server when more users are accessing the system online. Aggregation and $facet are better suited for dashboards, reports, and cron jobs that are called less frequently.
We can do it using two queries:
const limit = parseInt(req.query.limit || 50, 10);
let page = parseInt(req.query.page || 0, 10);
if (page > 0) { page = page - 1}
let doc = await req.db.collection('bookings').find().sort({ _id: -1 }).skip(page * limit).limit(limit).toArray();
let count = await req.db.collection('bookings').find().count();
res.json({data: [...doc], count: count});
I took the two queries approach, and the following code has been taken straight out of a project I'm working on, using MongoDB Atlas and a full-text search index:
return new Promise( async (resolve, reject) => {
try {
const search = {
$search: {
index: 'assets',
compound: {
should: [{
text: {
query: args.phraseToSearch,
path: [
'title', 'note'
]
}
}]
}
}
}
const project = {
$project: {
_id: 0,
id: '$_id',
userId: 1,
title: 1,
note: 1,
score: {
$meta: 'searchScore'
}
}
}
const match = {
$match: {
userId: args.userId
}
}
const skip = {
$skip: args.skip
}
const limit = {
$limit: args.first
}
const group = {
$group: {
_id: null,
count: { $sum: 1 }
}
}
const searchAllAssets = await Models.Assets.schema.aggregate([
search, project, match, skip, limit
])
const [ totalNumberOfAssets ] = await Models.Assets.schema.aggregate([
search, project, match, group
])
return await resolve({
searchAllAssets: searchAllAssets,
totalNumberOfAssets: totalNumberOfAssets.count
})
} catch (exception) {
return reject(new Error(exception))
}
})
I had the same problem and came across this question. The correct solution to this problem is posted here.
You can do this in one query. First you run a count and within that run the limit() function.
In Node.js and Express.js, you will have to use it like this to be able to use the "count" function along with the toArray's "result".
var curFind = db.collection('tasks').find({query});
Then you can run two functions after it like this (one nested in the other)
curFind.count(function (e, count) {
// Use count here
curFind.skip(0).limit(10).toArray(function(err, result) {
// Use result here and count here
});
});

How to access nested array inside a JSON object from javascript

I am working with a JSON object that has nested arrays as well as names with spaces, such as Account ID. I need to display just the Account IDs in my Vue.js application. I am able to get my entire response.data JSON object but am not too sure how to get just the Account ID when it's nested like the example below.
JSON
"response": {
"result": {
"Accounts": {
"row": [
{
"no": "1",
"FL": [
{
"val": "ACCOUNT ID",
"content": "123456789"
},
...
Vue.js
<script>
import axios from "axios";
export default {
name: 'HelloWorld',
data () {
return {
accounts: [],
accountIDs: []
}
},
mounted() {
var self = this;
axios.get('https://MYAPIGETREQUEST')
.then( function(res){
self.accounts = res.data;
self.accountIDs = //This is where I want to get the Account ID
console.log('Data: ', res.data);
})
.catch( function(error){
console.log('Error: ', error);
})
}
}
</script>
Try something like this
if(res.data.response.result.Accounts.row[0].FL[0].val === 'ACCOUNT ID') {
self.accountIDs = res.data.response.result.Accounts.row[0].FL[0].content;
...
}
You can also try something like this:
let rowAccounts = response.result.Accounts.row
.map(row => row.FL
.filter(FL => FL.val === 'ACCOUNT ID')
.map(acc => acc.content)
);
self.accountIDs = [].concat.apply([], rowAccounts);
In rowAccounts, you get an array of account arrays, one per row, like:
[
0: ['acc row 1', 'another acc row1'],
1: ['acc row 2'....]
]
Now it all depends upon your implementation and the way you like it.
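As a side note, on newer runtimes (ES2019+) the flatten step can be done with Array.prototype.flatMap instead of [].concat.apply; a sketch of the same filter/map logic:
// Same idea as above, but flatMap flattens the per-row arrays in one pass.
self.accountIDs = res.data.response.result.Accounts.row.flatMap(row =>
  row.FL
    .filter(fl => fl.val === 'ACCOUNT ID') // keep only the Account ID entries
    .map(fl => fl.content)                 // extract the ID values
);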

Put Javascript Array Values into Mongodb Collection Values

I have a JavaScript array filled with mean values and I want to insert them into a collection with a field named "mean". The field "mean" already exists and already has values in it, and now I want to update them with the values of the array. To be more specific: I want the first value of the array to be in the first document under the field "mean", and so on. I have 98 documents and the array also has a length of 98.
The collection, named "cmean", looks like this:
{
"_id" : "000",
"mean" : 33.825645389680915
}
{
"_id" : "001",
"mean" : 5.046005719077798
}
and the Array:
[
33.89923155012405,
5.063347068609219
]
You can use the forEach method on the array to iterate it and update the collection. Use the index to get the _id to be used in the update query, something like the following:
meansArray.forEach(function(mean, idx) {
var id = db.cmean.find({}).skip(idx).limit(1).toArray()[0]["_id"];
db.cmean.updateOne(
{ "_id": id },
{ "$set": { "mean": mean } },
{ "upsert": true }
);
});
For large collections, you can streamline your db performance using bulkWrite as follows:
var ops = [];
meansArray.forEach(function(mean, idx) {
var id = db.cmean.find({}).skip(idx).limit(1).toArray()[0]["_id"];
ops.push({
"updateOne": {
"filter": { "_id": id },
"update": { "$set": { "mean": mean } },
"upsert": true
}
});
if (ops.length === 1000 ) {
db.cmean.bulkWrite(ops);
ops = [];
}
})
if (ops.length > 0)
db.cmean.bulkWrite(ops);
update({"_id": id}, $set: {"mean": myArray}, function(res, err) { ... });
If you are using mongoose, you also have to change data model from string to array.
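A minimal sketch of that mongoose variant, with a hypothetical Cmean model; the mean field is declared as an array of numbers and the whole array (myArray from the answer above) is written in one update:
const mongoose = require('mongoose');

// Hypothetical model/schema; `mean` now holds an array instead of a single value.
const Cmean = mongoose.model('Cmean', new mongoose.Schema({
  _id: String,
  mean: [Number]
}));

// Write the complete array of means into one document.
Cmean.updateOne({ _id: '000' }, { $set: { mean: myArray } })
  .then(() => console.log('updated'))
  .catch(err => console.error(err));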
