How to check update and insert while using bulkCreate in node.js - javascript

I am using Node.js to upload an Excel file into the database. In my service I am using bulkCreate to insert the data into the MySQL DB. Let me post the table structure:
table name : customer_details
columns:
customer_org_id INT,
customer_name VARCHAR,
customer_type char,
active boolean,
customer_slot VARCHAR,
service_start_time DATE,
service_end_time DATE
I have one additional requirement: while uploading the Excel file and pushing the rows into the DB, the service must check whether the combination of customer_org_id and customer_name already exists in the database. If the combination exists, the existing record must be updated so that its active column becomes false, and a new row must be inserted with the same customer_org_id and customer_name with active set to true. I am able to do the individual operations like create, update, and delete, but I don't understand where to put these operations together while doing a bulkCreate. I am posting my code:
const upload = async (req, res) => {
  try {
    if (req.file == undefined) {
      return res.status(400).send("Please upload an excel file");
    }
    let path =
      __basedir + "/resources/static/assets/uploads/" + req.file.filename;
    readXlsxFile(path).then((rows) => {
      rows.shift(); // drop the header row
      let custDetails = [];
      rows.forEach((row) => {
        let custDetail = {
          customer_org_id: row[0],
          customer_name: row[1],
          customer_type: row[2],
          active: row[3],
          customer_slot: row[4],
        };
        custDetails.push(custDetail);
      });
      CustomerDetails.bulkCreate(custDetails)
        .then(() => {
          res.status(200).send({
            message: "Uploaded the file successfully: " + req.file.originalname,
          });
        })
        .catch((error) => {
          res.status(500).send({
            message: "Fail to import data into DB",
            error: error.message,
          });
        });
    });
  } catch (error) {
    console.log(error);
    res.status(500).send({
      message: "Could not upload the file: " + req.file.originalname,
    });
  }
};
Can anyone let me know how I can do these operations before adding the data to the DB? I am new to Node.js.

If I understood it correctly, bulkCreate is not the best solution for your problem, because you need to run a validation and a create/update for each line of your array.
I didn't understand all your requirements, but the code would be something close to this:
const upload = async (rows) => {
  const promises = rows.map(async (singleRow) => {
    const customer = await CustomerDetails.findOne({
      where: { customer_org_id: singleRow[0], customer_name: singleRow[1] },
    });
    if (customer !== null) { // CUSTOMER FOUND
      // deactivate the existing record
      await CustomerDetails.update({ active: false }, { where: { id: customer.id } });
      // insert a fresh, active record
      await CustomerDetails.create({ customer_org_id: singleRow[0], customer_name: singleRow[1], active: true });
    }
    return;
  });
  return await Promise.all(promises);
};
Important: This code is only an example.
Your scenario does not seem to benefit from the use of a bulkCreate, because the data needs to be individually verified.
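To make the per-row decision easy to reason about (and to unit-test without a database), the check/deactivate/insert logic can be factored into a pure function that only plans the operations. This is a sketch; `planRowOps` and the op objects are illustrative names, not Sequelize API:

```javascript
// Given the id of an already-existing (customer_org_id, customer_name) row
// (or null if none exists) and the uploaded Excel row, return the list of
// DB operations to perform, per the requirement in the question.
function planRowOps(existingId, row) {
  const ops = [];
  if (existingId != null) {
    // deactivate the record that is already in the table
    ops.push({ op: "update", id: existingId, data: { active: false } });
  }
  // always insert a fresh, active row for the uploaded data
  ops.push({
    op: "create",
    data: { customer_org_id: row[0], customer_name: row[1], active: true },
  });
  return ops;
}
```

The upload handler can then map each planned op onto the corresponding Sequelize call (`update` or `create`), keeping the DB code in one place.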

Related

How to retrieve binary data from an image stored in a MongoDB document to then display it on the client?

The user has a form. He submits a file for his profile picture. The instance of Blob is received by the server, whose role is to store it in a MongoDB document.
If I'm not mistaken, it is something like this:
import { serialize } from "bson";
// ...
// with `db` my database
// and `picture` the Blob.
await db
.collection("users")
.updateOne({ email: user.email }, { $set: { picture: serialize(picture) } });
So far, storing data seems to be working fine. The problem occurs when I want to retrieve the data in order to display it on the client side.
I fetch the data and get this in my "users" collection:
{
  email: "randomuser@foobar.com",
  picture: BinData(0, 'LgAAAAJuYW1lAAkAAABsb2dvLnBuZwABbGFzdE1vZGlmaWVkAADQM5z8XHhCAA==')
}
I'm not sure I understand what any of this means. All I want is to be able to transfer the data to the client side and then reconstruct the image in order to display it.
Note that the data is fetched server-side, using SvelteKit's load function:
export const load: PageServerLoad = async ({ cookies }) => {
  const { user } = await getUser(cookies);
  console.log("user.picture =", user.picture);
  return {
    user: {
      picture: user.picture, // throws an error here because `picture` is non-serialisable
      email: user.email
    },
  }
}
In the console it looks like this :
user.picture = new Binary(Buffer.from("2e000000026e616d6500090000006c6f676f2e706e6700016c6173744d6f6469666965640000d0339cfc5c784200", "hex"), 0)
What I tried
I tried to send the JSON of the buffer:
export const load: PageServerLoad = async ({ cookies }) => {
  const { user } = await getUser(cookies);
  console.log("user.picture =", user.picture);
  console.log("user.picture =", user.picture!.buffer);
  return {
    user: {
      picture: user.picture!.buffer.toJSON(),
      email: user.email
    },
  }
}
And then, on the client side:
onMount(() => {
  if (user.picture != undefined) {
    const arr = new Uint8Array(user.picture.data);
    console.log("on the client side : ", user.picture);
    console.log("in Uint8Array : ", arr);
    pictureURL = URL.createObjectURL(new Blob([arr], { type: "image/png" }));
    console.log("pictureURL = ", pictureURL);
  }
});
In the console, it gives something like this:
pictureURL = blob:http://localhost:5173/0ed581fc-87be-46ab-ade7-f44d78b98646
Therefore, I give that to the img element:
<img src={pictureURL} alt="Profile picture" />
But the image is not displayed, only the alt description.
To sum up
I have the binary data of an image in a document of my database. How do I display the corresponding image with the fetched data?
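One way around SvelteKit's "non-serialisable" complaint, assuming the raw image bytes are available as a Node.js Buffer on the server, is to encode them as a base64 data URL: data URLs are plain strings, so they pass through serialisation unchanged and can be assigned directly to an `<img src>`. This is a sketch; `bufferToDataUrl` is an illustrative helper name, not a SvelteKit or MongoDB API:

```javascript
// Turn raw image bytes into a string an <img src> can consume directly.
// Buffer#toString("base64") is built into Node.js.
function bufferToDataUrl(buf, mimeType) {
  return `data:${mimeType};base64,${buf.toString("base64")}`;
}
```

On the client, `<img src={user.picture} alt="Profile picture" />` would then work without any Uint8Array/Blob reconstruction.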

How do you replace and save documents in MongoDB?

I currently have a database that holds 5 entries per document in my MongoDB. Here is an example.
I would like to update the documents in my database with more fields and information. Currently I am doing that with the code below.
NFT.findOne({ "name": name }, (err, value) => {
  if (err) { console.error(err) }
  if (!value) {
    console.log('no value')
    let newNFT = new NFT({ name, slug, symbol, description, verifiedStatus, bannerUrl, logoUrl, floorPrice, stats, social })
    newNFT.save()
  } else {
    let newNFT = new NFT({ name, slug, symbol, description, verifiedStatus, bannerUrl, logoUrl, floorPrice, stats, social })
    NFT.replaceOne({ "name": name }, newNFT, { returnDocument: "after" })
  }
})
The reason for this question is that I have run console.log(NFT.find({"name": name})) and gotten an object back with all of the fields; however, the documents don't seem to update in my online database. Because it pulls information from the web database, I know I'm connected, but they just aren't updating. What am I doing wrong here?
save() returns a Promise; you should await it.
Also, try changing the code that updates your existing value:
NFT.findOne({ name }, async (err, value) => {
  if (err) {
    console.error(err);
  }
  if (!value) {
    // Create new NFT
    let newNFT = new NFT({
      name,
      slug,
      symbol,
      description,
      verifiedStatus,
      bannerUrl,
      logoUrl,
      floorPrice,
      stats,
      social,
    });
    await newNFT.save();
  } else {
    // Update existing NFT
    value.name = name;
    value.slug = slug;
    value.symbol = symbol;
    value.description = description;
    value.verifiedStatus = verifiedStatus;
    value.bannerUrl = bannerUrl;
    value.logoUrl = logoUrl;
    value.floorPrice = floorPrice;
    value.stats = stats;
    value.social = social;
    await value.save();
  }
});
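Both branches above build the same field set, so it can be factored out once; the resulting object can feed either branch, or a single upsert such as Mongoose's findOneAndUpdate with { upsert: true }. A minimal sketch; `buildNftFields` is an illustrative helper, not part of Mongoose:

```javascript
// Collect the NFT fields from an input object, skipping anything undefined
// and ignoring keys the schema does not use.
function buildNftFields(src) {
  const keys = ["name", "slug", "symbol", "description", "verifiedStatus",
                "bannerUrl", "logoUrl", "floorPrice", "stats", "social"];
  const fields = {};
  for (const k of keys) {
    if (src[k] !== undefined) fields[k] = src[k];
  }
  return fields;
}

// With Mongoose, the whole findOne/save dance then collapses to one call:
//   await NFT.findOneAndUpdate({ name }, buildNftFields(input),
//                              { upsert: true, new: true });
```

The { upsert: true } option inserts when no document matches, and { new: true } returns the updated document instead of the pre-update one.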

Retrieve an array from a Firestore document and store it to Node.Js then use it as tokens to send notifications

I've been trying to figure this out for hours and I just can't. I'm still a beginner with Node.js and Firebase. I need your help to retrieve the tokens array from my "userdata" collection into Node.js and use it to send notifications in the Cloud Function. So far, this is what I've been working on. Here is what my database looks like:
The receiverId is gathered from an onCreate function that fires whenever a user sends a new message. I then use it to access the userdata of the specific user whose uid is the receiverId.
In the cloud function, I was able to start the function, retrieve the receiverId, and print userToken[key]. However, when I try to push the token it doesn't go through, and it results in an error saying that the token list is empty. See the image:
Your help would mean a lot. Thank you!
newData = snapshot.data();
console.log("Retrieving Receiver Id");
console.log(newData.receiverId); // uid of the user
const tokens = [];
const docRef = db.collection('userdata').doc(newData.receiverId);
docRef.get().then((doc) => {
  if (doc.exists) {
    console.log("DocRef exist");
    const userToken = doc.data().tokens;
    for (var key in userToken) {
      console.log(userToken[key]);
      tokens.push(userToken[key]);
    }
  } else {
    // doc.data() will be undefined in this case
    console.log("No such document!");
  }
}).catch((error) => {
  console.log("Error getting document:", error);
});
// Notification Payload
var payload = {
  notification: {
    title: newData.sendBy,
    body: 'Sent you a message',
    sound: 'default',
  },
  data: {
    click_action: 'FLUTTER_NOTIFICATION_CLICK',
    route: '/telconsultinbox',
  }
};
console.log("Sending Notification now.");
console.log(tokens);
try {
  // send to device
  const response = await admin.messaging().sendToDevice(tokens, payload);
  console.log('Notification sent successfully');
  console.log(newData.sendBy);
} catch (err) {
  console.log(err);
}
I think you should avoid using for..in to iterate through an array (you can read more about it in this answer). Try one of these two options:
You could use forEach(), which is more elegant:
userToken.forEach((token) => {
  console.log(token);
  tokens.push(token);
});
for-of statement:
for (const token of userToken) {
  console.log(token);
  tokens.push(token);
}
Also, I would consider renaming userToken to userTokens, since it should contain multiple values. Makes the code a bit more readable.
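A hedged guess at another contributor to the empty-token error, based on the snippet above: `admin.messaging().sendToDevice(tokens, payload)` runs before the `docRef.get().then(...)` callback has pushed anything into `tokens`, because the fetch is never awaited. A sketch of fetching first, then sending; `fetchTokens` is an illustrative wrapper, not a Firebase API:

```javascript
// Await the Firestore read before building the notification, so the token
// array is fully populated by the time it is used.
async function fetchTokens(docRef) {
  const doc = await docRef.get();
  if (!doc.exists) return [];     // no user document -> nothing to notify
  return [...doc.data().tokens];  // copy the array of device tokens
}

// usage sketch inside the cloud function:
//   const tokens = await fetchTokens(db.collection('userdata').doc(newData.receiverId));
//   if (tokens.length > 0) await admin.messaging().sendToDevice(tokens, payload);
```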

node.js redis.get(key,callback) returns null randomly even after data and key both exist in redis db

I am stuck in a very abnormal situation in Node.js. I am using the ioredis npm module to connect to the Redis database, saving keys in Redis DB 1.
In my application, multiple APIs first get the data from a key and then get the data from another key that was returned by the first key.
In this case, when two or more API hits arrive on the Node instance, Node.js randomly returns null for a key even though the key exists in the Redis database.
I have also tried, as a workaround, adding a setTimeout to re-fetch the key after some time, but still no success. Please check the code below:
const userDataKey: any = await getUserDataFromRedis(`${UserConfig.USER_LOGIN_REDIS_PREFIX}${decodedData.id}_${decodedData.uuid}`)
const userData: any = await getUserDataFromRedis(`${userDataKey}`)

export const getUserDataFromRedis = async (key: string) => {
  return new Promise((resolve, reject) => {
    redisConnection.select(UserConfig.REDIS_DB, (err) => {
      if (err) reject(err)
      console.log("this is the key : ", key);
      redisConnection.get(key, (error, data: any) => {
        if (error) reject(error)
        console.log("this is data : ", data);
        if (_.isNull(data)) {
          console.log("NULL CONDITION MATCHED:", data);
          setTimeout(() => {
            redisConnection.get(key, (error, data: any) => {
              if (error) reject(error)
              console.log(`this is key and data if first time found NULL:${key},${data}`);
              resolve(JSON.parse(data));
            });
          }, 1000);
        } else {
          resolve(JSON.parse(data));
        }
      });
    })
  })
}
Can you please help me figure out why I get null sometimes even though the key exists in the Redis database? Here is a sample of the logs:
this is data : "rm_user_92"
this is data : "rm_user_92"
this is data : {"id":92,"name":"VinayRM1","email":"vinayrm20@yopmail.com","password":"$2b$10$0h5i.TiB6AENQF1c7XB1DOrBh/yNcKW9m3H.CM77wz.59IVfOjrEe","phone":"943434343434","is_active":"1","image":null,"user_type":2,"created_at":"2021-01-11T11:56:44.000Z","updated_at":"2021-01-11T06:26:44.046Z","added_by":1,"company":null,"country_code":91,"password_updated_at":"2021-01-29T19:14:01.000Z","device_token":"asdfdsafsadfasfsaf"}
Executing (default): Select U.*,A."id" as rm_id, A."name" as rm_name, A."email" as rm_email, A."phone" as rm_phone
FROM "user" U
LEFT JOIN "admin" A ON U."rm_id" = A."id" where U."id" = 172
Executing (default): SELECT * FROM "augment_view_analytics" WHERE import_date='2021-11-08' and user_id=172 and file_type=1;
Executing (default): SELECT * FROM "augment_view_analytics" where user_id = 172 and file_type=1 and import_date <= '2021-11-08' ORDER BY import_date DESC LIMIT 1;
this is data : null
NULL CONDITION MATCHED: null
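A hedged guess at the cause: `select` switches the current database for the whole shared connection, so with concurrent requests one request's GET can run after another request has SELECTed a different DB, and a key that does exist in DB 1 comes back null. One common fix is to pin the DB when the ioredis client is created (ioredis accepts a `db` option), instead of calling select per request. The client construction is shown in comments since it needs a live Redis server; the small pure parser is shown as code so it can be tested on its own:

```javascript
// Pin the database at connection time so every command runs against DB 1
// and no per-request SELECT race is possible:
//
//   const Redis = require("ioredis");
//   const redisConnection = new Redis({ host: "127.0.0.1", port: 6379, db: 1 });
//
// The getter then shrinks to one awaited GET plus this parser:
function parseRedisValue(data) {
  // ioredis resolves GET with null for a missing key; pass that through
  return data === null ? null : JSON.parse(data);
}

// const getUserDataFromRedis = async (key) =>
//   parseRedisValue(await redisConnection.get(key));
```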

Is there a way to do this without using nested promises?

In the cloud function below, I am populating Collection-1 with an auto-generated ID and 5 field values. While adding each document, I also populate another collection whose document name is one of the properties, containing the earlier auto-generated document ID as a field:
Collection-1
-auto-id
-property1
-property2
-property3
Collection-2
property2
-auto-id from collection-1
Collection-2 is maintained for faster lookup of the data.
exports.addSafe = functions.https.onCall((data, context) => {
  // The HTTP endpoint receives an object with an attribute "data", which contains an array of objects with every single safe data point to add
  for (let i = 0; i < data.length; i++) {
    db.collection('Safes').add(data[i])
      .then((docRef) => {
        db.collection('Safes-Hardware').doc(data[i]['Mac address Check']).set({
          "ID": docRef.id
        })
          .then((value) => {
            console.log("Reference added with ID: ", value.id);
            return { message: "Successful" }
          })
          .catch(err => {
            console.log('Oops!, error while adding lookup details', err);
            return { message: "Error while adding lookup details", err }
          })
        console.log('Mac written with ID: ', docRef.id);
        return { message: "Success is within the palm of our hands." }
      })
      .catch(err => {
        console.log('Error logged', err);
      })
  }
})
Updated Code - Using nested async-await
exports.addSafe = functions.https.onCall((data, context) => {
  // The HTTP endpoint receives an object with an attribute "data", which contains an array of objects with every single safe data point to add
  const attributesToDelete = ["CARTON#", "NO#"] // This first function call is implemented because of the first CSV file that I was given, which includes unnecessary columns, like "Carton" or "No". The factory producing the safes should send a CSV file with no unnecessary extra data. If they do, this function should theoretically take care of removing those data points, to ensure that the database only holds the necessary data points ;)
  deleteAttributes(data, attributesToDelete);
  let validated = true;
  //validateForm(data);
  if (validated === false) {
    console.log('Data cannot be validated. Misses the correct attributes')
  } else {
    for (let i = 0; i < data.length; i++) {
      try {
        // eslint-disable-next-line no-await-in-loop
        var ifPresent = db.collection("Safes-Hardware").doc(data[i]['Mac address Check']);
        ifPresent.get()
          .then(async (doc) => {
            if (!doc.exists) {
              console.log("Document does not exist. Proceeding to add");
              try {
                // eslint-disable-next-line no-await-in-loop
                const docRef = await db.collection('Safes').add(data[i])
                console.log('Mac written with ID: ', docRef.id);
                try {
                  // eslint-disable-next-line no-await-in-loop
                  await db.collection('Safes-Hardware').doc(data[i]['Mac address Check'])
                    .set({
                      "ID": docRef.id
                    })
                  console.log("Reference added");
                } catch (err) {
                  console.log("Error while adding reference", err)
                }
              } catch (err) {
                console.log("Error while adding data to 'Safe' collection")
              }
            } else {
              console.log("Document exists in database. Skipping safe with MAC Address: ", data[i]['Mac address Check']);
            }
            return { message: "Success is within the palm of our hands." }
          })
          .catch((error) => {
            console.log("Error while checking for duplicates", error);
          });
      } catch (error) {
        console.log("Error logged", error)
      }
    }
  }
})
What would be a better way to do this instead of using nested promises?
When I am not populating the second collection, the code works flawlessly. But when the second collection is also being populated, I get the following error once in a while (3 out of 10 times):
Error:
Error logged { Error: The referenced transaction has expired or is no longer valid.
    at Http2CallStream.call.on (/srv/node_modules/@grpc/grpc-js/build/src/client.js:96:45)
    at emitOne (events.js:121:20)
    at Http2CallStream.emit (events.js:211:7)
    at process.nextTick (/srv/node_modules/@grpc/grpc-js/build/src/call-stream.js:71:22)
    at _combinedTickCallback (internal/process/next_tick.js:132:7)
    at process._tickDomainCallback (internal/process/next_tick.js:219:9)
  code: 3,
  details: 'The referenced transaction has expired or is no longer valid.',
  metadata: Metadata { options: undefined, internalRepr: Map {} } }
Collections - Safe
Safes-Hardware
Please try to first create the document reference with the custom document name, and then set the data into the document, as follows:
const doc = db.collection('Safes').doc(data[i]['Mac address Check'])
doc.set({ "ID": docRef.id })
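The nesting in the question can also be flattened entirely with async/await: each row becomes one straight-line async function, and the rows run through Promise.all. This is a sketch assuming the Firebase Admin SDK's Firestore API (`get`/`add`/`set`); `processRow` and `addSafes` are illustrative names, and the collection names come from the question. It is written against the `db` object so it can be exercised with a fake in tests:

```javascript
// Handle one CSV row: skip if the MAC is already registered, otherwise
// create the safe and then its lookup document - no .then() nesting.
async function processRow(db, row) {
  const hardwareRef = db.collection('Safes-Hardware').doc(row['Mac address Check']);
  const existing = await hardwareRef.get();
  if (existing.exists) {
    return { message: "skipped", mac: row['Mac address Check'] };
  }
  const safeRef = await db.collection('Safes').add(row); // create the safe
  await hardwareRef.set({ ID: safeRef.id });             // then the lookup doc
  return { message: "added", id: safeRef.id };
}

async function addSafes(db, data) {
  // every step above is a plain awaited statement; rows run concurrently
  return Promise.all(data.map((row) => processRow(db, row)));
}
```

Note this still writes the two documents non-atomically; if the intermittent transaction error persists, a Firestore batched write or transaction around the pair would be the next thing to try.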
