Node.js / Express: format data returned from a Sequelize query - javascript

I am using Sequelize with the Express framework.
const data = await db.Tour.findAll();
res.status(200).json(data)
If I do this, I can nicely retrieve the data in the front-end Vue SPA like this:
{
  tour_name: "Bali",
  activities: [
    { name: "Swimming" },
    { name: "Beach volleyball" },
  ]
}
The above is how I retrieve the data for the front-end.
If I need to get the data and make some changes in the controller before sending it, I use raw: true,
and then I get the same output inside my controller. But the problem is that raw: true does not play well with
joins, and at that point getting the data in the controller and modifying it becomes very hard:
I have to dig through many nested objects to find the data I want. Is there a smarter way (there should be) to get the above format
in the controller without using raw: true?
I hope there is a nice way to pass that data object to something and convert it to that format.
How do I achieve this?
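Not part of the original question, but for reference: Sequelize model instances expose get({ plain: true }) (and toJSON()), so one common pattern is to keep the eager-loaded findAll and flatten each instance before editing it. A minimal sketch, where the db.Activity model and the "activities" alias are assumptions based on the sample output:
// Sketch: fetch with the join intact, then convert instances to plain nested objects.
// "Activity" / "activities" is an assumed association; adjust to your models.
const tours = await db.Tour.findAll({
  include: [{ model: db.Activity, as: "activities", attributes: ["name"] }],
});

// Each Sequelize instance can be flattened to a plain object,
// preserving the nested include structure (unlike raw: true).
const plainTours = tours.map((tour) => tour.get({ plain: true }));

// Now the data can be modified freely before sending it.
plainTours.forEach((tour) => {
  tour.activityCount = tour.activities.length;
});

res.status(200).json(plainTours);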

In the code below I had to retrieve a shop's product images and their ids, along with some other data like ratings, counts, etc.
exports.filterShopListings = async (data) => {
return Db.sequelize.query("SELECT productImages, S.shopId, S.currency, S.mainImage, S.mainImageThumb, S.coverImage, S.coverImageThumb, S.name, S.location, S.approved, S.locationLatitude, CONCAT('"+process.env.QR_URL+"',S.qrCode) as qrCode, S.locationLongitude, COALESCE(productCount,0) as productCount, COALESCE(ratingCount,0) as ratingCount, COALESCE(ROUND(UR.ratingAvg,1) ,0) as ratings, COALESCE(shopFollowing,0) as followingCount FROM shops as S JOIN users U ON (U.userId=S.userId AND U.blocked='0') LEFT JOIN ( SELECT COUNT(*) as productCount, shopId, GROUP_CONCAT(mainImageThumb,'--',shopProductId) as productImages FROM shopProducts WHERE shopProducts.deleted='0' AND shopProducts.blocked='0' GROUP BY shopId) SP ON (SP.shopId=S.shopId) LEFT JOIN ( SELECT COUNT(*) as ratingCount, AVG(ratings) as ratingAvg, shopId FROM userRatings WHERE userRatings.blocked='0' AND userRatings.deleted='0' GROUP BY shopId ) UR ON (UR.shopId=S.shopId) LEFT JOIN ( SELECT COUNT(*) as shopFollowing, shopId FROM shopFollowings GROUP BY shopId) SF ON (SF.shopId=S.shopId) WHERE "+data.whereString+" HAVING "+data.havingString+" ORDER BY "+data.orderingParam+" "+data.orderingSort+" LIMIT "+data.skip+", "+data.take+" ",{ type: Sequelize.QueryTypes.SELECT})
.then( (shops) => {
  shops = JSON.parse(JSON.stringify(shops));
  shops.forEach( (shop) => {
    //shop.productImagesTemp = shop.productImages;
    shop.productImages = shopImagesFunc(shop.productImages);
  });
  return shops;
});
};
And the shopImagesFunc code:
var shopImagesFunc = (productImages) => {
  if (productImages == null)
    return [];
  var images = productImages.split(",").filter(Boolean);
  var newImages = [];
  images.forEach(image => {
    let temp = image.split("--").filter(Boolean);
    newImages.push({
      shopProductId: parseInt(temp[1]),
      mainImage: temp[0],
    });
  });
  return newImages;
};
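For context, a small illustration (hypothetical values) of what shopImagesFunc does with the GROUP_CONCAT string produced by the query:
// Hypothetical GROUP_CONCAT output: "<thumbUrl>--<shopProductId>,..."
const raw = "thumb_a.jpg--101,thumb_b.jpg--102";

console.log(shopImagesFunc(raw));
// [
//   { shopProductId: 101, mainImage: "thumb_a.jpg" },
//   { shopProductId: 102, mainImage: "thumb_b.jpg" }
// ]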
The SQL query is a little complicated, but creating a common function to format the result into the required output would be very useful.

Related

How to get specific key and value from a long json object while iterating it in node.js

I am trying to parse a CSV file in node.js. I am able to parse the file and print its content; the contents come in the form of a JSON object. Now my target is to iterate the JSON object, take specific keys and values from each block, and use them in a query which will do some DB operations. But the problem is that while iterating the JSON, only the keys and values of the first block are printed. Let me post the code of what I have done:
fs.createReadStream(path)
  .pipe(csv.parse({ headers: true, ignoreEmpty: true }))
  .on("error", (error) => {
    throw error.message;
  })
  .on("data", function (data) {
    if (data && data !== {}) {
      Object.keys(data).forEach(function (k) {
        if (k === 'name' || k === 'Office') {
          let selectQury = `select name,Office from myTable where name = ${data['name']} and Office = ${data['Office']}`;
          db.query(selectQury, (err, res) => {
            if (err) {
              console.log('error', null);
            }
          });
        }
      });
    }
  });
This is what my JSON parsed from the CSV looks like:
{
  id: 1,
  name: "AS",
  Office: "NJ",
  ........
  ACTIVE: 1
},
{
  id: 2,
  name: "AKJS",
  Office: "NK",
  ........
  ACTIVE: 2
}
So now what I want is that in the select query the parameters are passed like
let selectQury = `select name,Office from myTable where name = "AS" and Office = "NJ"`;
in the first iteration, and
let selectQury = `select name,Office from myTable where name = "AKJS" and Office = "NK"`;
in the second iteration, and so on as the CSV grows.
I am not able to do it, please help. Thanks in advance. I am new to node.js and tricky JavaScript operations.
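A minimal sketch of one way to do this, assuming the mysql (or mysql2) driver, whose query() accepts an array of placeholder values: read name and Office directly off each parsed row instead of iterating its keys, and pass the values as ? placeholders so they are quoted safely.
const fs = require("fs");
const csv = require("fast-csv");
// `db` is assumed to be an existing mysql/mysql2 connection, as in the question.

fs.createReadStream(path)
  .pipe(csv.parse({ headers: true, ignoreEmpty: true }))
  .on("error", (error) => console.error(error.message))
  .on("data", (row) => {
    // Each parsed row already carries the fields we need; no key loop required.
    const selectQury = "select name, Office from myTable where name = ? and Office = ?";
    db.query(selectQury, [row.name, row.Office], (err, res) => {
      if (err) {
        console.error("query error", err);
        return;
      }
      console.log(res); // one result set per CSV row
    });
  });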

How to count a huge list of items

I have a huge list of items covering almost all the crops, and this data is to be plotted using maps and charts. I would like to count the number of each crop, say, how many times cabbage was planted. I use the Firebase database to store the data and I retrieve it using the function below:
database = firebase.database();
var ref = database.ref('Planting-Calendar-Entries');
ref.on('value', gotData, errData);

function gotData(data){
  console.log(data.val());
  var veggie = data.val();
  var keys = Object.keys(veggie);
  console.log(keys);
  let counter = 0;
  for (var i = 0; i < keys.length; i++){
    var k = keys[i];
    var Veg_planted = veggie[k].Veg_planted;
    var coordinates = veggie[k].coordinates;
    if (Veg_planted == 'Cabbage'){
      counter++;
    }
    // vegAll = Veg_planted.count()
    console.log(Veg_planted, coordinates);
  }
  console.log(counter);
}

function errData(err){
  console.log('Error!');
  console.log(err);
}
I retrieve this data from the database, where it gets updated whenever someone submits their planting information. The code above only works if my list is small, but I have a list of about 170 items and it would be hard to count each crop individually using something like let counter = 0, counter++. Is there a way I could get around this?
I'm assuming data.val() returns an array, not an object, and you're misusing Object.keys() on an array instead of just looping over the array itself. If that's true, then it sounds like you want to group by the Veg_planted key and count the groupings:
const counts = Object.values(veggie).reduce((counts, { Veg_planted }) => ({
  ...counts,
  [Veg_planted]: (counts[Veg_planted] || 0) + 1
}), {});
Usage:
const veggie = [{ Veg_planted: 'Cabbage' }, { Veg_planted: 'Cabbage' }, { Veg_planted: 'Corn' }];
// result of counts:
// {Cabbage: 2, Corn: 1}
Actually: the code to count the items is probably going to be the same, no matter how many items there are. The thing that is going to be a problem as you scale though is the amount of data that you have to retrieve that you're not displaying to the user.
Firebase does not support aggregation queries, and your approach only works for short lists of items. For a more scalable solution, you should store the actual count itself in the database too.
So:
Have a blaCount property for each bla that exists.
Increment/decrement the counter each time you write/remove a bla to/from the database.
Now you can read only the counters, instead of having to read the individual items. A sketch of the increment step is shown below.
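A minimal sketch of that increment step (not from the original answer), using the Realtime Database transaction() API; the cropCounts path and entry shape are assumptions for illustration:
// Sketch: keep a counter per crop and bump it atomically whenever an entry is written.
// The "cropCounts" path and the entry shape are assumed.
function addPlantingEntry(entry) {
  const db = firebase.database();

  // Write the entry itself.
  const entryRef = db.ref('Planting-Calendar-Entries').push(entry);

  // Atomically increment the counter for this crop.
  const counterRef = db.ref('cropCounts/' + entry.Veg_planted);
  counterRef.transaction((current) => (current || 0) + 1);

  return entryRef;
}

// Reading the counters is then a single small read:
// db.ref('cropCounts').once('value').then((snap) => console.log(snap.val()));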
Firestore would be a better option. You can query based on the field value:
var plantingRef = db.collection("PlantingCalendarEntries");
var query = plantingRef.where("Veg_planted", "==", "Cabbage");
If you still want to stick with the Realtime Database:
Save counters to the database.
Or use Cloud Functions to count.

Javascript Query Builder with MongoDB "$in" operator

So I have set up a query builder that builds a query based on the user's interaction with the data filtration area on the front end, which contains a lot of radio buttons, dropdown boxes, etc., similar to the data filtering eBay provides on their website.
My Query Builder so far:
app.post('/user/test', function(req, res) {
  var query = {};
  if (req.body.region){
    query.region = req.body.region;
    console.log(query.region);
  }
  if (req.body.sector){
    query.sector = req.body.sector;
    console.log(query.sector);
  }
  if (req.body.client){
    query.client = req.body.client;
    console.log(query.client);
  }
  Project.find(query, function(err, project){
    if (err){
      res.send(err);
    }
    console.log(project);
    res.json(project);
  });
});
Now the above works very well. I can send filtration options in any scenario and it will bring back the required result. For example, I can send only the region name and it will give me all the data that belongs to that region, or I can send the region name and sector name and it will further filter the data down to what matches both, and so on.
The Issue:
Now my database contains an array of data like:
words: ["book", "table", "pen"]
Each object in the database has this array, so if there are 100 objects in the database, each one will have a "words" array with different or similar values.
I want to be able to send multiple options like "table", "pen" to my database and get all the objects that contain those two options within that array.
To achieve that I did the following:
if (req.body.sol){
  var arr = [];
  arr.push(req.body.sol);
  query.words = {words: {$in: arr}};
}
The above did not work.
But if I make the following changes to this line:
From
query.words = {words: {$in: arr}}
to
query = {words: {$in: arr}}
Making the above change does work, but then it does not build the remaining queries; it only builds the "$in" query.
Any idea how I can fix this?
You can simply write the query like
query.words = {$in: arr}
This way you would be able to build the rest of the query.
The reason why query.words = {words: {$in: arr}} fails is that the query becomes {words: {words: {$in: arr}}}, which is not what you want, since it is trying to find words inside words.
Instead, using query.words = {$in: arr} will make your query {words: {$in: arr}}.
You can use the bracket notation to add the $in operator in your query properties:
if (req.body.sol){
  var arr = [],
      obj = {};
  arr.push(req.body.sol);
  obj["$in"] = arr;
  query.words = obj;
}
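For illustration (hypothetical request values, not from the original answers), the builder then produces a single query document that combines the plain equality fields with the $in clause, which Project.find() can consume directly:
// Hypothetical request body: { region: "EU", sector: "Retail", sol: ["table", "pen"] }
// After the builder runs, query looks like:
const query = {
  region: "EU",
  sector: "Retail",
  words: { $in: ["table", "pen"] },
};

// Matches documents in the given region/sector whose "words" array
// contains "table" or "pen".
Project.find(query, function (err, project) {
  if (err) return console.error(err);
  console.log(project);
});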

Get latest record per field in Parse.com JS query

I have a table in my Parse database with columns validFrom and uniqueId. There can be multiple records with the same uniqueId (like a name).
What query do I have to use to get the items with the latest validFrom for a given set of uniqueIds? I tried the following, but this limits my search to 1 item for the entire set rather than 1 item per unique_id record:
var UpdateObject = Parse.Object.extend("UpdateObject");
var query = new Parse.Query(UpdateObject);
query.containedIn("unique_id", uniqueIdsArray).select('status', 'unique_id').descending("validFrom").limit(1);
The query semantics are limited, so the only approach is to query for a superset and manipulate the result to what you need. This is better done on the server to limit the transmission of extra objects.
Big caveat: did this with pencil and paper, not a running parse.app, so it may be wrong. But the big idea is to get all of the matching objects for all of the uniqueIds, group them by uniqueId, and then for each group return the one with the maximum validFrom date...
function updateObjectsInSet(uniqueIdsArray) {
  var query = new Parse.Query("UpdateObject");
  // find all of the UpdateObjects with the given ids
  query.containedIn("unique_id", uniqueIdsArray);
  query.limit(1000);
  return query.find().then(function(allUpdates) {
    // group the result by id
    var byId = _.groupBy(allUpdates, function(update) { return update.get("unique_id"); });
    // for each group, return the object with the newest validFrom date
    return _.map(byId, function(updates) {
      return _.max(updates, function(update) { return update.get("validFrom").getTime(); });
    });
  });
}
To place this in the cloud, just wrap it:
Parse.Cloud.define("updateObjectsInSet", function(request, response) {
  updateObjectsInSet(request.params.uniqueIdsArray).then(function(result) {
    response.success(result);
  }, function(error) {
    response.error(error);
  });
});
Then use Parse.Cloud.run() from the client to call it.
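As a sketch, a client-side call (with hypothetical ids) might look like this:
// Hypothetical ids; resolves with one newest UpdateObject per unique_id.
Parse.Cloud.run("updateObjectsInSet", { uniqueIdsArray: ["id-1", "id-2", "id-3"] })
  .then(function(results) {
    results.forEach(function(update) {
      console.log(update.get("unique_id"), update.get("validFrom"));
    });
  }, function(error) {
    console.error(error);
  });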

Laravel: upload multiple rows created from arrays

I'm building an application with Laravel which communicates with an OpenCPU server to perform some calculations. The OpenCPU server returns data in JSON format, which I then process to pull out the relevant information. This data returns a sku, retailer, date and sales. These are then posted to a controller using AJAX. Within this controller I then want to upload this data into the database by creating a new array of data to upload in one go. Each row should have a sku, retailer, date and sales. The date field in the database is called date, but called obs in the code.
OpenCPU returns JSON, which is then parsed to a JavaScript object using
var data = JSON.parse(output);
After logging to the Javascript console I get an array of the correct length, with the sales numbers.
The data is then sent to a Laravel controller via AJAX
$('#save').click(function(){
  var id = $('#filter option:selected').text();
  var json = $.ajax({
    url: 'sales/' + id + '/update',
    type: 'POST',
    data: {
      'sku': $('#sku').text(),
      'retailer': $('#retailer').text(),
      'obs': data.OBS,
      'sales': data.Sales,
    },
    async: false
  }).responseText;
  var message = JSON.parse(json);
  $('.flash').html(message).fadeIn(300).delay(2500).fadeOut(300);
});
In Laravel I then try to store the data in a MySQL database using the following
$sku = Input::get('sku');
$retailer = Input::get('retailer');
$obs = Input::get('obs');
$sales = Input::get('sales');

foreach($obs as $key => $n){
    $arrayData[] = array(
        'sku' => $sku,
        'retailer' => $retailer,
        'date' => $obs[$key],
        'sales' => $sales[$key]
    );
}

Chart::create($arrayData);
However the above code doesn't appear to work. The following code will create the correct number of rows in the database with the sku and retailer populated, but the sales figure is just the loop number, rather than the number of sales
$sku = Input::get('sku');
$retailer = Input::get('retailer');
$dates = Input::get('obs');
$sales = Input::get('sales');

foreach(range(1, count($dates)) as $key){
    DB::table('charts')->insert(
        [
            'sku' => $sku,
            'retailer' => $retailer,
            'date' => DateTime($obs[$key]),
            'sales' => $sales[$key]
        ]
    );
}
Given that the sku and retailer are a single input and simply repeated, I expect it's either an issue with passing the array to Laravel or with the way I'm trying to access the elements in the 'obs' and 'sales' arrays.
It looks like you have the right steps, get the inputs:
$sku = Input::get('sku');
$retailer = Input::get('retailer');
$dates = Input::get('obs');
$sales= Input::get('sales');
But now you try to forcibly insert them into the database. Why not use Eloquent for the database insertion? (Keep in mind you'd need to have a model for the charts table called Chart.php.)
$chart = new Chart;
$chart->sku = $sku;
$chart->retailer = $retailer;
$chart->dates = $dates;
$chart->save();
That being said, I do realize that you're trying to pass arrays to the database, so that might take some experimentation. If you can't figure out what's (attempting to be) passed to the database, you can always use:
die($variable);
to check what's up. Good luck!
