How can I write a JavaScript object into an array that lives inside a JSON file?
What I mean is: I'm making a Discord (messaging app) bot. When the user uses the command "/add", the bot asks for two inputs, a "name" and an "artist". Together these inputs make up a song, so I'm creating an object called "data" for that song.
I also have a JSON file that acts as my database. What I want is that every time this command is used, my object gets pushed into an array in that JSON file, so later on I can retrieve a random object from the array. How can I do that? I hope the question is not too confusing, thanks!
module.exports = {
  data: new SlashCommandBuilder()
    .setName('add')
    .setDescription('Add a song to the database.')
    .addStringOption(option =>
      option.setName('artist')
        .setDescription('The artist of the song')
        .setRequired(true))
    .addStringOption(option =>
      option.setName('name')
        .setDescription('The name of the song')
        .setRequired(true)),
  async execute(interaction) {
    let name = interaction.options.getString('name');
    let artist = interaction.options.getString('artist');
    const data = { name: name, artist: artist };
    await interaction.reply(`**${artist}** - **${name}** was added to the database.`);
  },
};
// WHAT FOLLOWS IS A DIFFERENT FILE: a JSON file called data.json, with some examples of what it should look like
[
  {
    "name": "Die for You",
    "artist": "The Weeknd"
  },
  {
    "name": "FEAR",
    "artist": "Kendrick Lamar"
  }
]
You have to use the Node filesystem module.
Import fs and use the writeFile function.
https://nodejs.org/api/fs.html#filehandlewritefiledata-options
Don't forget about JSON.stringify to turn your object into a string.
const fs = require('fs');

const data = { name: name, artist: artist };

fs.writeFile("output.json", JSON.stringify(data), 'utf8', function (err) {
  if (err) {
    console.log("An error occurred while writing JSON Object to File.");
    return console.log(err);
  }
  console.log("JSON file has been saved.");
});
// Edit (adding new objects to the .json)
You have to read the data from your file, add something, and save it again.
let rawdata = fs.readFileSync('data.json');
let data = JSON.parse(rawdata);
data.push({ name: name, artist: artist });
// To use the push() function, data has to be an array, so make sure your .json file contains "[]";
// now you can add elements.
fs.writeFile("data.json", JSON.stringify(data), 'utf8', function (err) { // write back to the same file
  if (err) {
    console.log("An error occurred while writing JSON Object to File.");
    return console.log(err);
  }
  console.log("JSON file has been saved.");
});
If you want to add something to an array, you .push it in there.
var arr = [];
arr.push(123);
arr.push({a:1,b:2});
arr.push(null);
arr.push("string");
console.dir(arr);
I am using a delete function in a scenario where I delete cached data from MongoDB, but how can I delete a specific object from Redis? I used lRem() but it didn't work. I would appreciate it if you could help me.
const deleteComment = async (req, res) => {
let key = `Comments/${req.query.postId}`;
try {
const deleteValue = await Comment.findOneAndDelete({
_id: req.params.id,
$or: [{ writer: req.user.id }, { authorId: req.user.id }],
})
.populate("responseTo", "writer")
.populate("postId", "authorId")
.populate("writer");
const jsonData = JSON.stringify(deleteValue);
await client.lRem(key, 0, jsonData);
res
.status(200)
.json({ success: true, message: "Comment Deleted", item: deleteValue });
} catch (err) {
res.status(500).json({ message: err });
}
};
Do you have control over the code that inserts the JSON? If so, you'll need to make sure they generate the same strings. LREM requires an exact match to work. That said, you might want to consider a different data structure or maybe a combination of data structures.
One option is that you could store the JSON as a String and then the List could contain the keys to those Strings. Then, when you delete a comment, you call LREM to remove it from the List and UNLINK or DEL to remove the JSON. The List serves to store an ordered index of the comments. The comments themselves are each stored in a String.
If you can use RedisJSON, you could just store a JSON document with an array of the comments. That would give you ordered data and the data itself and then you could delete a particular comment using JSON.DEL and JSONPath.
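The list-of-keys idea can be sketched with plain in-memory structures before wiring it to Redis: here a Map stands in for the Strings and an array for the List, and the comments name the Redis command each step corresponds to (store, index, addComment, and removeComment are illustrative names, not node-redis APIs).

```javascript
const store = new Map(); // comment key -> JSON string  (SET / GET / DEL)
const index = [];        // ordered list of comment keys (RPUSH / LREM)

function addComment(key, comment) {
  store.set(key, JSON.stringify(comment)); // SET key json
  index.push(key);                         // RPUSH list key
}

function removeComment(key) {
  const pos = index.indexOf(key);          // LREM list 0 key
  if (pos !== -1) index.splice(pos, 1);
  store.delete(key);                       // DEL key
}

addComment('comment:1', { writer: 'a', text: 'hi' });
addComment('comment:2', { writer: 'b', text: 'yo' });
removeComment('comment:1');
console.log(index); // only 'comment:2' remains
```

Deleting by a short key also sidesteps the exact-match problem: LREM only has to match the key string, never the serialized JSON.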
I am using Node.js to upload an Excel file into the database. In my service I am using bulkCreate to upload the data into the MySQL DB. Let me post the table structure:
table name : customer_details
columns:
customer_org_id INT,
customer_name VARCHAR,
customer_type char,
active boolean,
customer_slot VARCHAR,
service_start_time DATE,
service_end_time DATE
I have one additional requirement: while uploading the Excel file and pushing into the DB, it must check whether the combination of customer_org_id and customer_name already exists in the database. If the combination exists, the existing record will be updated with the active column set to false, and a new row will be inserted with that customer_org_id and customer_name and active set to true. I am able to do the individual operations like create, update, and delete, but I don't understand where to put these operations together when doing a bulkCreate. I am posting my code:
const upload = async (req, res) => {
  try {
    if (req.file == undefined) {
      return res.status(400).send("Please upload an excel file");
    }
    let path =
      __basedir + "/resources/static/assets/uploads/" + req.file.filename;
    readXlsxFile(path).then((rows) => {
      rows.shift(); // drop the header row
      let custdetails = [];
      rows.forEach((row) => {
        let custdetail = {
          customer_org_id: row[0],
          customer_name: row[1],
          customer_type: row[2],
          active: row[3],
          customer_slot: row[4],
        };
        custdetails.push(custdetail);
      });
      CustomerDetails.bulkCreate(custdetails)
        .then(() => {
          res.status(200).send({
            message: "Uploaded the file successfully: " + req.file.originalname,
          });
        })
        .catch((error) => {
          res.status(500).send({
            message: "Fail to import data into DB",
            error: error.message,
          });
        });
    });
  } catch (error) {
    console.log(error);
    res.status(500).send({
      message: "Could not upload the file: " + req.file.originalname,
    });
  }
};
Can anyone let me know how I can do these operations before adding data to the DB? I am new to Node.js.
If I understood it correctly, bulkCreate is not the best solution for your problem, because you need to do a validation and a create/update for each line of your array.
I didn't understand all your requirements, but the code would be something close to this:
const upload = async (rows) => {
  const promises = rows.map(async (singleRow) => {
    const customer = await CustomerDetails.findOne({
      where: { customer_org_id: singleRow[0], customer_name: singleRow[1] },
    });
    if (customer) { // CUSTOMER FOUND
      await CustomerDetails.update({ active: false }, { where: { id: customer.id } });
      await CustomerDetails.create({ customer_org_id: singleRow[0], customer_name: singleRow[1], active: true });
    }
    return;
  });
  return await Promise.all(promises);
};
Important: This code is only an example.
Your scenario does not seem to benefit from the use of a bulkCreate, because the data needs to be individually verified.
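Independently of Sequelize, the per-row logic (deactivate any existing (customer_org_id, customer_name) match, then insert a fresh active row) can be sketched against a plain in-memory array; upsertCustomer and db are illustrative names, not Sequelize APIs.

```javascript
// Sketch of the per-row "deactivate old, insert new" logic, with an
// in-memory array standing in for the customer_details table.
function upsertCustomer(db, orgId, name) {
  // Deactivate existing rows with the same (customer_org_id, customer_name)
  for (const row of db) {
    if (row.customer_org_id === orgId && row.customer_name === name) {
      row.active = false;
    }
  }
  // Insert a fresh active row for that combination
  db.push({ customer_org_id: orgId, customer_name: name, active: true });
  return db;
}

const db = [];
upsertCustomer(db, 1, 'Acme');
upsertCustomer(db, 1, 'Acme');
// The first row is now inactive; the second is the single active one.
console.log(db);
```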
Actually, I have a JSON file test.json with information like this:
{
"id": ["1234"]
}
Now I want to add another id, 2345, using discord.js. I am saving user IDs in a JSON file, and now I want to make a command that will push more IDs into the test.json file.
For example: with that command I can add another user ID "2345" so that the JSON file will look like this:
{
"id": ["1234", "2345"]
}
Please help me with this!
There are several options below, all of which will have to be worked on further to achieve the exact result you need, but this will get you pointed in the right direction.
const fs = require('fs');
const cName = './path_to_json_from_bot_root_folder.json';

function requireUncached(module) {
  delete require.cache[require.resolve(module)];
  return require(module);
}

let file;
setInterval(() => {
  file = requireUncached('../../path_to_json_from_this_file.json');
}, 500);

// Strips the final closing bracket so new data can be appended before re-closing it
function prep(file) {
  const myJson = JSON.stringify(file);
  const JsonCut = myJson.slice(0, myJson.length - 1);
  return JsonCut;
}

// Option 1: completely overwrite the file
fs.writeFile(cName, 'information written here will completely overwrite the file', (err) => {
  if (err) throw err;
});

// Option 2: append to the stringified contents; work through it to make sure you restore all the closing brackets
fs.writeFile(cName, prep(file) + 'add new information here' + '}', (err) => {
  if (err) throw err;
});

// Option 3: replace an old value with a new one
const oldInfo = file['subsectionName'];
const newInfo = message.content; // or whatever method you choose, like args if you make it a command
fs.writeFile(cName, prep(file).replace(`${oldInfo}`, `${newInfo}`) + '}', (err) => {
  if (err) throw err;
});
I have a simple JSON object like so:
{
"gymData": {
"previousWorkouts": [
],
"exercises": [
]
}
}
Both of the arrays above are full of objects. I have two endpoints, /workouts and /exercises.
My backend is just a simple Express server with custom endpoints. When I add a new workout in the frontend and click submit, the /workouts endpoint behaves correctly.
However, the /exercises endpoint is not updating the JSON correctly, resulting in a JSON error.
This is my Node endpoint code:
app.post("/exercises", function(req, res) {
fs.readFile('db.json', 'utf8', function (err, data) {
if (err) throw err;
let databaseData = JSON.parse(data);
const workoutExercises = req.body.workoutExercises.workouts
workoutExercises.forEach(exercise => {
const filteredExerciseDatabase = databaseData.gymData.exercises.filter(ex => ex.name === exercise.name)
filteredExerciseDatabase[0].previousWeights.push({date: req.body.date, weight: exercise.weight})
})
const updatedData = JSON.stringify(databaseData)
console.log(updatedData)
fs.writeFile('db.json', updatedData, 'utf8', function(err, data) {
if (err) throw err;
res.status(200).send("Basket was updated");
});
});
})
Instead of writing the whole file again, I was wondering if I can just update the specific object key that I need?
Also, for reference, the error seems to be appending extra stuff onto the JSON object, so it's breaking. But when I copy the data logged by this line, console.log(updatedData), into a JSON validator, it is valid, so I'm confused as to why it's not writing the correct thing. :/
I have a JSON object like this:
{ "messages":["{date: 2017-8-23 12:50:05, name: aaa, msg: zzz}"]}
I want to push more objects to the messages array after every message from the user in Node.js, but I only clone the JSON object. How do I fix this?
This is my code:
const fs = require('fs');

const JSONTemplate = (filePath, date, name, msg) => {
  let obj = {
    messages: [],
  };
  obj.messages.push(`{date: ${date}, name: ${name}, msg: ${msg}}`); // add some data
  const json = JSON.stringify(obj);
  fs.appendFile(filePath, json, (err) => {
    if (err) throw err;
  });
};
You can try to push a real object instead of a string representation:
obj.messages.push({date, name, msg});
But your question is still unclear to me. There is a mistake in the string representation: you cannot parse your serialized object back, because the quotes are missing in your string.
And maybe you cannot get an array of multiple values because you rewrite your array on every function call.
const JSONTemplate = (filePath, date, name, msg, array = []) => {
  let obj = {
    messages: array,
  };
  obj.messages.push({ date, name, msg }); // add some data
  const json = JSON.stringify(obj);
  fs.appendFile(filePath, json, (err) => {
    if (err) throw err;
  });
};
If you want to append JSON data, first read the file, and after that you can overwrite it:
const JSONTemplate = (filePath, date, name, msg) => {
  fs.readFile(filePath, (err, data) => {
    if (err) throw err;
    const json = JSON.parse(data);
    // add some data
    json.messages.push({ date, name, msg });
    fs.writeFile(filePath, JSON.stringify(json), (err) => {
      if (err) throw err;
    });
  });
};
Also, it's good practice to use try...catch for error handling when dealing with sync functions like JSON.parse(), fs.readFileSync, etc.
try{
var json = JSON.parse(data);
}catch(e){
throw e
}
Happy coding