I am using https://github.com/mysqljs/mysql to access my database through JavaScript.
This is my current code:
connection.query(`SELECT cash FROM users WHERE id = '${userID}';`, function (error, rows, fields) {
  if (error) errorHandler(msg, error);
  // round-trip through JSON to turn the RowDataPacket objects into plain data
  rows = JSON.stringify(rows);
  moneyArray = JSON.parse(rows);
  console.log(`Your cash: ` + moneyArray[0].cash);
});
Since rows comes back as [ RowDataPacket { cash: 0 } ], I had to stringify it to be able to use it, and then parse it back into a normal array to actually get the data out. I was wondering if there is a more efficient way to handle the RowDataPacket data, because stringifying and parsing it every time I need to pull data seems inefficient.
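For what it's worth, plain property access does seem to work in a quick test, since each RowDataPacket exposes its columns as ordinary properties; this is the kind of direct use I am hoping is safe (a sketch only, with the driver's ? placeholder standing in for my string interpolation):

connection.query('SELECT cash FROM users WHERE id = ?', [userID], function (error, rows, fields) {
  if (error) return errorHandler(msg, error);
  // rows[0] is a RowDataPacket, but its columns are plain properties:
  console.log('Your cash: ' + rows[0].cash);
});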
Related
I am trying to parse a CSV file in Node.js. I am able to parse the file and print the contents, which come out as JSON objects. My target is to iterate over the JSON, take specific keys and values from each block, and use them in a query that does some DB operations. The problem is that while iterating, only the first key and values of the first block are printed. Let me post the code of what I have done:
fs.createReadStream(path)
  .pipe(csv.parse({ headers: true, ignoreEmpty: true }))
  .on("error", (error) => {
    throw error;
  })
  .on("data", function (data) {
    // data !== {} is always true (object identity), so compare by key count
    if (data && Object.keys(data).length > 0) {
      Object.keys(data).forEach(function (k) {
        if (k === 'name' || k === 'Office') {
          // note: this runs once for 'name' and once for 'Office',
          // so the same query fires twice per row
          let selectQury = `select name,Office from myTable where name = '${data['name']}' and Office = '${data['Office']}'`;
          db.query(selectQury, (err, res) => {
            if (err) {
              console.log('error', null);
            }
          });
        }
      });
    }
  });
This is what the JSON I parse from the CSV looks like:
{
  id: 1,
  name: "AS",
  Office: "NJ",
  ........
  ACTIVE: 1
},
{
  id: 2,
  name: "AKJS",
  Office: "NK",
  ........
  ACTIVE: 2
}
So now what I want is for the parameters to be passed into the select query like this:
let selectQury = `select name,Office from myTable where name = "AS" and Office = "NJ"`;
in the first iteration,
let selectQury = `select name,Office from myTable where name = "AKJS" and Office = "NK"`;
in the second iteration, and so on as the CSV grows.
I am not able to do it, please help. Thanks in advance. I am new to Node.js and tricky JavaScript operations.
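In other words, per row I effectively want to run something like the following (a sketch using the driver's ? placeholders, which handle the quoting for me; db is the same MySQL connection as above):

// data is one parsed CSV row, e.g. { id: '1', name: 'AS', Office: 'NJ', ... }
const selectQury = 'select name,Office from myTable where name = ? and Office = ?';
db.query(selectQury, [data.name, data.Office], (err, res) => {
  if (err) {
    console.log('error', err);
    return;
  }
  console.log(res); // one result set per CSV row
});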
I have arrays stacked in one array, and I would like to insert each array into its own column in MySQL.
I have managed to insert all the data from the arrays into one column, but I want to insert one array per column.
Please see the screenshot and code below.
Image of array stack
con.connect(async (err) => {
  const x = await getStock();
  if (err) {
    console.log(err);
    return err;
  } else {
    console.log("DB ok");
  }
  console.log("connected");
  // drop undefined entries, then insert every value into the single column testCol1
  x.filter(item => item !== undefined).map(item => item.forEach(item => {
    const sql = "INSERT INTO test (testCol1) VALUES ?";
    const values = [
      [item]
    ];
    con.query(sql, [values], (err, result) => {
      if (err) throw err;
      console.log("this has been recorded: " + result);
    });
  }));
});
I just found a solution for this case.
It can be a little confusing, but it works.
There may be an easier, more understandable solution, but this is the one I have at the moment.
In the code above I was using only one array iterator, which returned an array. I decided to iterate over the returned array to get each integer and insert the data into MySQL; the `test${i+1}` template picks the column each array goes into.
x.filter(item => item !== undefined).forEach((item, i) => {
  item.forEach(item => {
    // test${i+1} resolves to test1, test2, ... so each outer array
    // lands in its own column
    const sql = `INSERT INTO test (test${i + 1}) VALUES ?`;
    const values = [
      [item]
    ];
    con.query(sql, [values], (err, result) => {
      if (err) throw err;
      console.log("this has been recorded: " + result);
    });
  });
});
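A possible refinement, sketched but untested against the same test table: the mysql driver's bulk-insert form accepts an array of rows, so each inner array could be written with one query per column instead of one query per value:

x.filter(item => item !== undefined).forEach((item, i) => {
  // values must be an array of rows, e.g. [[1], [2], [3]] for three rows
  const sql = `INSERT INTO test (test${i + 1}) VALUES ?`;
  const values = item.map(v => [v]);
  con.query(sql, [values], (err, result) => {
    if (err) throw err;
    console.log("bulk insert done: " + result.affectedRows + " rows");
  });
});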
Forgive me, but I have been trying to solve this for several days and I cannot locate the problem.
I am fetching two arrays with complementary data from MongoDB (with Mongoose). One contains the user data, the other the user survey records.
Both arrays are related through the user's email:
/***************************************************************
 This is what I download from MongoDB with Node.js
***************************************************************/
const user = require('../models/user');
const test = require('../models/test');

let users = await user.find({ email: userEmail });
let tests = await test.find({ email: testEmail });

/* Result:
let users = [
  { name: 'pp', email: 'pp@gmail.com' },
  { name: 'aa', email: 'aa@gmail.com' }
];
let tests = [
  { email: 'pp@gmail.com', satisfaction: '5' },
  { email: 'aa@gmail.com', satisfaction: '2' }
];
*/
Now I try to relate both JSON arrays using:
for (let i = 0; i < users.length; i++) {
  for (let z = 0; z < tests.length; z++) {
    if (users[i].email == tests[z].email) {
      users[i]['satisfaction'] = tests[z].satisfaction;
    }
  }
}
If I do a console.log(users[0]), what I want is:
{ name: 'pp', email: 'pp@gmail.com', satisfaction: '5' }
But I receive:
{ name: 'pp', email: 'pp@gmail.com' }
But note: if I do a console.log(users[0].satisfaction),
the result is 5.
Please, can someone help me? Thank you very much.
Note:
If, instead of fetching the arrays from MongoDB, I write them by hand, it works perfectly. Could it be some kind of lock on the models?
FURTHER INFORMATION
Although I have given a simple example, the user and test arrays are much more complex in my application; they are, however, modeled and correctly managed in MongoDB.
The reason for having two documents in MongoDB is that there are more than 20 variables in each of them. In user I save fixed user data, and in test the data I collect over time. Later, I pull both arrays down to my server, where I perform statistical calculations; for this I need to generate a single array with data from both. I take the test data and add it to the user to relate all the parameters. I thought that this way I would unburden MongoDB from continuous queries and subsequent writes, since I perform the operations on the server and finally update the test array in MongoDB.
When I query the database I receive the following array of documents:
let registroG = await datosProf.find({ idAdmin: userAdmin });
res.json(registroG);
This is the response received in the front-end client, and opening document [0] shows objects shaped like the arrays above (screenshots omitted).
THE QUESTION:
Why, when I try to add a key and value to the object, does it not get included?
You could use Array.map with the ES6 spread operator to merge the two objects:
let users = [{ name: 'pp', email: 'pp@gmail.com' }, { name: 'aa', email: 'aa@gmail.com' }];
let tests = [{ email: 'pp@gmail.com', satisfaction: '5' }, { email: 'aa@gmail.com', satisfaction: '2' }];
let result = users.map(v => ({
...v,
...tests.find(e => e.email == v.email)
}))
console.log(result)
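As a side note on the original symptom (the key that prints on its own but not inside the logged document): this is consistent with Mongoose's find() returning Document instances rather than plain objects, so console.log shows only the schema fields. If that is the cause, converting to plain objects first, with .lean() on the query (or .toObject() on each document), should make the fetched arrays behave like the hand-written ones:

// .lean() makes find() return plain JavaScript objects instead of
// Mongoose Documents, so keys added later show up when logging:
let users = await user.find({ email: userEmail }).lean();
let tests = await test.find({ email: testEmail }).lean();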
I am using Sequelize in the Express JavaScript framework.
const data = await db.Tour.findAll();
res.status(200).json(data)
If I do this, I can nicely retrieve the data in the front-end Vue JavaScript SPA like this:
{
  tour_name: "Bali",
  activities: [
    { name: "Swimming" },
    { name: "Beach volleyball" },
  ]
}
The above is how I retrieve the data for the front-end.
If I need to get the data and make some changes in the controller before sending it, I use raw: true; then I can get the same output in my controller. But the problem is that raw: true does not go well with joins, so at that point getting the data in the controller and making changes to it is very hard: I have to access so many nested objects to find the data I want. Is there a smarter way (there should be) to get the above format in the controller without using raw: true?
I hope there is a nice way to pass that data object to something and convert it to the format.
How do I achieve this?
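One commonly suggested alternative to raw: true is to convert each model instance to a plain object after querying, which keeps joined associations nested (a sketch, not tested against this schema; the 'activities' include alias is an assumption based on the sample output):

const instances = await db.Tour.findAll({ include: ['activities'] });
// get({ plain: true }) (or toJSON()) turns an instance plus its eagerly
// loaded associations into an ordinary nested object:
const data = instances.map(tour => tour.get({ plain: true }));
res.status(200).json(data);
// data[0].activities is a plain array of { name: ... } objects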
In the code below I had to retrieve a shop's product images and their IDs, along with some other data like ratings, counts, etc.
exports.filterShopListings = async (data) => {
  const sql =
    "SELECT productImages, S.shopId, S.currency, S.mainImage, S.mainImageThumb, S.coverImage, S.coverImageThumb, S.name, S.location, S.approved, S.locationLatitude, " +
    "CONCAT('" + process.env.QR_URL + "', S.qrCode) as qrCode, S.locationLongitude, COALESCE(productCount,0) as productCount, COALESCE(ratingCount,0) as ratingCount, " +
    "COALESCE(ROUND(UR.ratingAvg,1), 0) as ratings, COALESCE(shopFollowing,0) as followingCount " +
    "FROM shops as S JOIN users U ON (U.userId=S.userId AND U.blocked='0') " +
    "LEFT JOIN (SELECT COUNT(*) as productCount, shopId, GROUP_CONCAT(mainImageThumb,'--',shopProductId) as productImages FROM shopProducts WHERE shopProducts.deleted='0' AND shopProducts.blocked='0' GROUP BY shopId) SP ON (SP.shopId=S.shopId) " +
    "LEFT JOIN (SELECT COUNT(*) as ratingCount, AVG(ratings) as ratingAvg, shopId FROM userRatings WHERE userRatings.blocked='0' AND userRatings.deleted='0' GROUP BY shopId) UR ON (UR.shopId=S.shopId) " +
    "LEFT JOIN (SELECT COUNT(*) as shopFollowing, shopId FROM shopFollowings GROUP BY shopId) SF ON (SF.shopId=S.shopId) " +
    "WHERE " + data.whereString + " HAVING " + data.havingString + " ORDER BY " + data.orderingParam + " " + data.orderingSort + " LIMIT " + data.skip + ", " + data.take;
  return Db.sequelize.query(sql, { type: Sequelize.QueryTypes.SELECT })
    .then((shops) => {
      shops = JSON.parse(JSON.stringify(shops));
      shops.forEach((shop) => {
        // GROUP_CONCAT packs "thumb--id,thumb--id,..."; expand it to objects
        shop.productImages = shopImagesFunc(shop.productImages);
      });
      return shops;
    });
};
And the shopImagesFunc code:
var shopImagesFunc = (productImages) => {
  if (productImages == null)
    return [];
  // input looks like "thumb--id,thumb--id,..." (from GROUP_CONCAT)
  var images = productImages.split(",").filter(Boolean);
  var newImages = [];
  images.forEach(image => {
    let temp = image.split("--").filter(Boolean);
    newImages.push({
      shopProductId: parseInt(temp[1], 10),
      mainImage: temp[0],
    });
  });
  return newImages;
};
The SQL query is a little complicated, but creating a common function to format it into the required output is very useful.
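For illustration, a hypothetical input/output pair for shopImagesFunc (the thumbnail names and ids are made up):

shopImagesFunc("thumb1.jpg--101,thumb2.jpg--102");
// -> [ { shopProductId: 101, mainImage: "thumb1.jpg" },
//      { shopProductId: 102, mainImage: "thumb2.jpg" } ]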
I have a table in my Parse database with columns validFrom and uniqueId. There can be multiple records with the same uniqueId (like a name).
What query do I have to use to get the items with the latest validFrom for a given set of uniqueIds? I tried the following, but it limits my search to one item for the entire set rather than one item per unique_id record:
var UpdateObject = Parse.Object.extend("UpdateObject");
var query = new Parse.Query(UpdateObject);
query.containedIn("unique_id", uniqueIdsArray).select('status', 'unique_id').descending("validFrom").limit(1);
The query semantics are limited, so the only approach is to query for a superset and manipulate the result to what you need. This is better done on the server to limit the transmission of extra objects.
Big caveat: I did this with pencil and paper, not a running Parse app, so it may be wrong. But the big idea is to get all of the matching objects for all of the uniqueIds, group them by uniqueId, and then for each group return the one with the maximum validFrom date...
function updateObjectsInSet(uniqueIdsArray) {
  var query = new Parse.Query("UpdateObject");
  // find all of the UpdateObjects with the given ids
  query.containedIn("unique_id", uniqueIdsArray);
  query.limit(1000);
  return query.find().then(function(allUpdates) {
    // group the result by id
    var byId = _.groupBy(allUpdates, function(update) { return update.get("unique_id"); });
    // for each group, keep the object with the newest validFrom date
    return _.map(byId, function(updates) {
      return _.max(updates, function(update) { return update.get("validFrom").getTime(); });
    });
  });
}
To place this in the cloud, just wrap it:
Parse.Cloud.define("updateObjectsInSet", function(request, response) {
  updateObjectsInSet(request.params.uniqueIdsArray).then(function(result) {
    response.success(result);
  }, function(error) {
    response.error(error);
  });
});
Then use Parse.Cloud.run() from the client to call it.
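For example, from the client (the parameter name uniqueIdsArray matches what the cloud function reads from request.params; the ids are placeholders):

Parse.Cloud.run("updateObjectsInSet", { uniqueIdsArray: ["id-1", "id-2"] })
  .then(function(results) {
    // one UpdateObject per unique_id, each with the latest validFrom
    console.log(results);
  }, function(error) {
    console.error(error);
  });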