How to properly use ON DUPLICATE KEY UPDATE in Node.js MySQL - javascript

I'm using Node.js to push values to a MySQL table like:
for (var i = 0; i < data.length; i++) {
  flattenedData.push([data[i].id, data[i].adult, data[i].backdrop_path, JSON.stringify(data[i].genre_ids), data[i].original_language, data[i].original_title, data[i].overview, data[i].popularity, data[i].poster_path, data[i].release_date, data[i].title, data[i].video, data[i].vote_average, data[i].vote_count]);
  //console.log(flattenedData);
}
db.query("INSERT INTO movies (id, adult, backdrop_path, genre_ids, original_language, original_title, overview, popularity, poster_path, release_date, title, video, vote_average, vote_count) VALUES ?", [flattenedData], function (err, result) {
  if (err) {
    throw err;
  } else {
    console.log('data inserted' + result);
  }
});
I want to add ON DUPLICATE KEY UPDATE to the query, but I keep getting syntax errors. Can anyone show me the proper way?

I'm going to shorten this to three columns, and assume id is the only unique column.
db.query("INSERT INTO movies (id, adult, backdrop_path) VALUES ?
ON DUPLICATE KEY UPDATE adult=VALUES(adult), backdrop_path=VALUES(backdrop_path)",
  [flattenedData], function (err, result) {
...
This means that if the insert would produce a duplicate in the primary/unique column (id), the other columns are copied from the values you tried to insert into their respective columns, overwriting whatever the existing row held.
There's no shortcut for that; you have to spell out all such column assignments.
One note on the square brackets: the second argument to query() is an array of parameter values, one per placeholder. With the bulk VALUES ? form there is a single placeholder whose value is the whole array of rows, so the extra brackets in [flattenedData] are in fact needed — they make flattenedData the first (and only) parameter.
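Since every non-key column has to be spelled out in the ON DUPLICATE KEY UPDATE clause, a small helper can generate the assignment list from your column names. A sketch (buildUpsertAssignments is a made-up helper, not part of the mysql module):

```javascript
// Build the "col=VALUES(col), ..." list for every non-key column.
function buildUpsertAssignments(columns, keyColumn) {
  return columns
    .filter((col) => col !== keyColumn)
    .map((col) => `${col}=VALUES(${col})`)
    .join(", ");
}

const columns = ["id", "adult", "backdrop_path"];
const assignments = buildUpsertAssignments(columns, "id");
// "adult=VALUES(adult), backdrop_path=VALUES(backdrop_path)"

const sql =
  `INSERT INTO movies (${columns.join(", ")}) VALUES ? ` +
  `ON DUPLICATE KEY UPDATE ${assignments}`;
console.log(sql);
```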

I was confused by how to get this to work in a react/redux app and eventually came to the "correct" method.
My implementation required me to update one field value per record for an arbitrary number of records in a table with 21 fields.
If you are passing data as an array structure it like [['dataString',666.66],['dataString2',666666.66],['dataString3',666666666.66]] and then make sure you pass this whole thing as an array to the query function. See itemQtyData in my code sample below.
Another thing that tripped me up was the use of parentheses around the values placeholder. I didn't need them, even though the examples I looked at did. I also used a single ? for all the values: instead of (?,?), which didn't work, I used just ?.
I found it unnecessary to supply all the field names and corresponding values for the table. MySQL will warn you about fields without a default value, but I haven't found that to be an issue in this case.
You can console.log the formatted sql in SqlString.format function in the file node_modules/sqlstring/lib/SqlString.js. I found this useful to see exactly why the query wasn't working and to have something that I could plug into MySQL Workbench to mess around with.
Edit: You can also do console.log(connection.query(yourQuery, [someData], callback)) and you get the sql and lots more when the function executes. That might make more sense than adding console.log calls to the module code.
Hope this helps!
let itemQtyData = order.map(item => {
  return [
    `${item.id}`,
    `${Number(item.products_quantity - Number(item.quantity_to_add))}`
  ];
});
const updateQtyQuery = `INSERT INTO products (products_id, products_quantity) VALUES ? ON DUPLICATE KEY UPDATE products_quantity=VALUES(products_quantity)`;
connectionOSC.query(
  updateQtyQuery,
  [itemQtyData],
  (error, results, fields) => {
    if (error) throw error;
    const response = {
      statusCode: 200,
      headers: {
        "Access-Control-Allow-Origin": "*"
      },
      body: saleId,
      response: results,
      isBase64Encoded: false
    };
    context.succeed(response);
  });

Related

Upsert in prisma without ID creates duplicate records (promise error?)

I need to insert records into a PG database from a JSON file with 500k records.
Since the file is huge, I'm creating a read stream and using JSONStream.parse to send the JSON objects through a pipe.
So far so good. This is not the problem; I'm just providing context.
After parsing each object, I need to insert the information using Prisma, but I cannot insert a record if a certain field value is already in the table. So the first thing I think of is an upsert.
The problem is that this field is not a unique key of the table, therefore I cannot use it in the where clause of Prisma's upsert.
Then, I did the following:
await prisma.pets
.findFirst({
where: { name: nameFromJson },
})
.then(async (existing_pet) => {
if (!existing_pet) {
await prisma.pets.create({
data: {
name: nameFromJson,
legs: numberOfLegs,
isAlive: isAlive,
},
})
}
})
.catch((error) => {
throw new Error(error)
})
My idea is to find the record with the same field first and when that promise is resolved, send the result to another promise where I can check the record and if it does not exist, then just shoot the create.
But I keep getting duplicates in the table.
I'd like to understand what I'm doing wrong. And of course, what would be the right way to proceed in a scenario like this.
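The duplicates are consistent with a race in this check-then-create pattern: when several records with the same name are in flight at once, each findFirst can complete before any of the creates has committed, so every branch decides to insert. A self-contained simulation of that interleaving (the in-memory table and pet names are made up; this is not Prisma itself):

```javascript
// Simulate check-then-create against an in-memory "table".
const table = [];

function findFirst(name) {
  // The lookup sees the table as it is *right now*.
  return Promise.resolve(table.find((row) => row.name === name));
}

function create(row) {
  // The insert lands one tick later, like a real round-trip.
  return new Promise((resolve) =>
    setImmediate(() => {
      table.push(row);
      resolve(row);
    })
  );
}

async function naiveUpsert(name) {
  const existing = await findFirst(name);
  if (!existing) await create({ name });
}

// Two concurrent upserts for the same name both pass the check.
Promise.all([naiveUpsert("rex"), naiveUpsert("rex")]).then(() => {
  console.log(table.length); // 2 — a duplicate slipped in
});
```

The usual fixes are to put a real unique constraint on the field (after which createMany with skipDuplicates, or catching the unique-violation error, becomes possible) or to serialize the inserts instead of firing them concurrently.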

NodeJS + MySql nested queries

I'm trying to get users from the database and then loop through each user and get their images. However, when I try to assign the images array to the user.images property, nothing happens. I still get only users with an empty images array.
Currently my code is the following:
user.getAll().then(result => {
  const userCollection = result[0];
  for (let i = 0; i < userCollection.length; i++) {
    userCollection[i].images = [{}];
    image.getImagesByUserId(userCollection[i].Id).then(res => {
      userCollection[i].images = res[0];
    }).catch(err => {
      console.log(err);
      res.status(400).json(err);
    });
  }
  res.json(userCollection);
}).catch(err => {
  console.log(err);
  res.status(400).json(err);
});
Why I can't assign images array to it's property?
It looks to me like you need to run your res.json(userCollection) from inside the callback.
This will of course mean you need to rename your res variable in the callback
This is assuming you are somehow using res.json(userCollection) to export the information. Which I am inferring based on the fact I don’t see res defined anywhere
May be a silly question, but are you assigning it? I only see
userCollection[i].images = [{}];
above and never see:
userCollection[i].images = res;
assignment once you grab the images.
The problem is that getImagesByUserId() is asynchronous so by the time you get to the assignment, you've already sent the response.
The order of events you have currently is:
Assign an empty image list to all the users
Queue up a load of database requests
Send the response (res.json())
Reassign the images for all the users once the database results come back.
The easy fix is to look at Promise.all() and aggregate the results when they all come back.
However, this isn't the most efficient way to deal with SQL databases, I would suggest you restructure your query so that you get all the images for all the users in 1 trip to the database and then format the response based on those results.
Say you have an image table and a user table, something like:
SELECT image_name, user_id
FROM image
then replace your loop in your js with:
getImagesForUsers().then(result => {
  // create your response from the rows
  res.json(response);
});
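Once all image rows come back from that single query, stitching them onto the users is a plain grouping step. A sketch (attachImages and the sample rows are made up for illustration):

```javascript
// Group flat (image_name, user_id) rows onto their users.
function attachImages(users, imageRows) {
  const byUser = new Map();
  for (const row of imageRows) {
    if (!byUser.has(row.user_id)) byUser.set(row.user_id, []);
    byUser.get(row.user_id).push(row.image_name);
  }
  return users.map((u) => ({ ...u, images: byUser.get(u.Id) || [] }));
}

const users = [{ Id: 1, name: "Ann" }, { Id: 2, name: "Bo" }];
const imageRows = [
  { image_name: "a.png", user_id: 1 },
  { image_name: "b.png", user_id: 1 },
];
console.log(attachImages(users, imageRows));
// Ann gets both images, Bo gets an empty array
```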

Deleting a row based on ID from a local csv file - javascript

I'm not a super experienced coder, so forgive me if the question is rather too simple.
I have a csv with many rows, and one of its columns is 'id'. How can I remove just one row based on the id (i.e. code should search for id and delete that row)?
I got the following so far (not too helpful since on one day I may need to remove id 5 and on another I may need to remove id 2...) Thank you so much!
var fs = require('fs')
fs.readFile(filename, 'utf8', function(err, data) {
  if (err) {
    // check and handle err
  }
  var linesExceptFirst = data.split('\n').slice(1).join('\n');
  fs.writeFile(filename, linesExceptFirst);
});
PS: it must be in javascript as the code is running on a nodejs server
You'll need to parse the CSV which is simple with Array.prototype.map()
Then you'll need to use Array.prototype.filter() to find the column value you are after.
It is just a couple lines of code and you are all set:
var fs = require('fs')
// Set this up someplace
var idToSearchFor = 2;
// read the file
fs.readFile('csv.csv', 'utf8', function(err, data) {
  if (err) {
    // check and handle err
  }
  // Get an array of comma separated lines
  let linesExceptFirst = data.split('\n').slice(1);
  // Turn that into a data structure we can parse (array of arrays)
  let linesArr = linesExceptFirst.map(line => line.split(','));
  // Use filter to keep only the rows whose first column doesn't match the id,
  // deleting the found match.
  // Join them back into a string with newlines (each inner array
  // stringifies back to a comma-separated row)
  let output = linesArr.filter(line => parseInt(line[0]) !== idToSearchFor).join('\n');
  // Write out new file
  fs.writeFileSync('new.csv', output);
});
Note that I removed the call to .join() so we can operate on the array created from the call to .split(). The rest is commented.
And finally, a working example can be found here: https://repl.it/#randycasburn/Parse-CSV-and-Find-row-by-column-value-ID
EDIT: The code will now return all rows except the found id. Hence, in essence, deleting the row. (Per OPs comment request).
EDIT2: Now outputting to new CSV file per request.
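The parse-filter-rejoin step can be exercised without touching the filesystem. A sketch (the sample CSV and id are made up; unlike the snippet above, this version keeps the header row in the output):

```javascript
// Remove the row whose first column matches the given id, keeping the header.
function removeRowById(csv, idToRemove) {
  const [header, ...rows] = csv.trim().split('\n');
  const kept = rows.filter(
    (line) => parseInt(line.split(',')[0], 10) !== idToRemove
  );
  return [header, ...kept].join('\n');
}

const csv = 'id,name\n1,apple\n2,banana\n3,cherry';
console.log(removeRowById(csv, 2));
// id,name
// 1,apple
// 3,cherry
```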

How to set input variable as key in json while creating view in couchBase

I am trying to create user specific views on couchBase from Node.js. Here is the code I have
app.post("/todo/:id", function(req, res) {
  console.log("hey" + req.body.userId)
  baseview.setDesign('design_users', {
    "$req.body.userId": {
      'map': "function (doc, meta) { if(doc.toUserName == req.body.userId) {emit(doc.status, doc.title);}}"
    }
  },
  function(err, result) {
    if (err != null) console.log(err);
    else res.send(result)
  });
})
On executing the following curl
curl -H "Content-Type: application/json" -X POST -d '{"userId": "uidam231"}' http://localhost:3000/todo/id
Though the design document design_users is created on couchBase, the expected view (uidam231) is not. There are no errors on the console.
I suspect that the assignment of the variable req.body.userId in the json is probably the cause. Please note that the view name ("$req.body.userId": ) and the filter criteria (doc.toUserName == req.body.userId) both need the passed in variable.
Any suggestions please?
It is not possible to pass a variable to the map function of a view, because map() and reduce() are not used to query data; they build the index, a materialized view of the data in the bucket. Later you can filter, range, and sort, but only on the keys you chose to put into that index when it was built (i.e. when map was executed). Put simply, the database runs your map function once over the whole data set, and then again for each new or changed document, to keep the index up to date. This is why you cannot pass in or use parameters from the query, and why things like new Date() or Math.random() are often meaningless in a map function: their values are evaluated once per document at index-build time, not at query time.
What you should do in your case, is to build map function like this:
function (doc, meta) {
  emit(doc.toUserName, [doc.status, doc.title]);
}
This will give you a view index where the user name is the key and the status and title are the value, so you can query the view with the key= argument and pass in the user ID.
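One way to see why the request parameter can't reach the map function: the index is built once by running map over every document, and a key= query is just a lookup against the rows it produced. A self-contained simulation (not the Couchbase SDK; buildIndex and the sample documents are made up):

```javascript
// The view's map function: runs at index-build time, once per document.
function map(doc, emit) {
  emit(doc.toUserName, [doc.status, doc.title]);
}

// Build the "index": every document is mapped exactly once.
function buildIndex(docs) {
  const rows = [];
  for (const doc of docs) {
    map(doc, (key, value) => rows.push({ key, value }));
  }
  return rows;
}

const docs = [
  { toUserName: 'uidam231', status: 'open', title: 'Buy milk' },
  { toUserName: 'other', status: 'done', title: 'Ship box' },
];
const index = buildIndex(docs);

// The equivalent of querying the view with key="uidam231":
const result = index.filter((row) => row.key === 'uidam231');
console.log(result);
// one row: { key: 'uidam231', value: ['open', 'Buy milk'] }
```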
More information:
Querying Data with Views
MapReduce Views

Update specific values in a nested array within mongo through mongoose

I have roughly the following schema (without _id):
{
  uid: String,
  inbox: [{msgid: String, someval: String}]
}
Now, in the request I get the msgid and I use it in the following mongoose query like this-
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'},
  function(err, doc) {
    console.log(doc);
    return !0;
  })
Now, the problem is that I get the whole document, which has the specific message along with all the other messages in the inbox.
Output:
{
  uid: 'xyz',
  inbox: [
    {msgid:, someval},
    {msgid: 'our queried msgid', someval}, // required sub array
    {msgid:, someval},
  ]
}
What query can I use to get only the specific sub-array, since the inbox array is too large to loop through?
Use the $ positional selection operator to have the returned doc only include the matched inbox element:
my_model.findOne({'inbox.msgid': 'msgidvaluexyz'},
  {'inbox.$': 1},
  function(err, doc) {
    console.log(doc);
    return !0;
  })
If I understood your question correctly:
// find each person with a last name matching 'Ghost'
var query = Person.findOne({ 'name.last': 'Ghost' });
// selecting the `name` and `occupation` fields
query.select('name occupation');
http://mongoosejs.com/docs/queries.html
You can select which fields you want to get back. You could get only the inbox array or everything except that.
