I'm new to Node.js. I have a JSON object which looks like the following:
var results = [
{ key: 'Name 1', value: '1' },
{ key: 'Name 2', value: '25%' },
{ key: 'Name 3', value: 'some string' },
...
];
The above object may or may not have different values. Still, I need to get them into a format that looks exactly like the following:
{"Name 1":"1","Name 2":"25%","Name 3":"some string"}
In other words, I'm looping through each key/value pair in results and adding it to a single line. From my understanding this single line approach (with double quotes) is called "JSON Event" syntax. Regardless, I have to print my JSON object out in that way into a text file. If the text file exists, I need to append to it.
I do not know how to append to a text file in Node.js. How do I do that?
Thank you!
You can use JSON.stringify to convert a JavaScript object to JSON and fs.appendFile to append the JSON string to a file.
// write all the data to the file
var fs = require('fs');
var str = JSON.stringify(results);

fs.appendFile('file.json', str, function(err) {
  if (err) {
    console.log('there was an error: ', err);
    return;
  }
  console.log('data was appended to file');
});
If you want to add just one item at a time, do
// Just pick the first element
var fs = require('fs');
var str = JSON.stringify(results[0]);
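Note that JSON.stringify(results) on the array above yields [{"key":"Name 1","value":"1"},...], not the single-object shape the question asks for. A small sketch, assuming results has the {key, value} shape shown in the question, that folds the pairs into one object before stringifying:

```javascript
var results = [
  { key: 'Name 1', value: '1' },
  { key: 'Name 2', value: '25%' },
  { key: 'Name 3', value: 'some string' }
];

// Fold the {key, value} pairs into a single object
var merged = results.reduce(function(acc, pair) {
  acc[pair.key] = pair.value;
  return acc;
}, {});

var str = JSON.stringify(merged);
console.log(str); // {"Name 1":"1","Name 2":"25%","Name 3":"some string"}
```

Appending str + '\n' with fs.appendFile then writes one such object per line, which matches the one-line-per-object format described in the question.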
Related
I have a problem writing a JSON file using createWriteStream in Node.js. The JSON in the file is fine, but it appends to the file, and I need to overwrite the content instead. I am storing the selected answer when a button is clicked.
Example:
Question A :
Evaluated selected : D ANSWER (Save this in json)
Question B:
Evaluated selected : C ANSWER (Save this in json)
I GOT:
{"IdTest":"1021","answers":[{"questionID":"1","answerSelected":"D"}]}\n {"IdTest":"1021","answers":[{"questionID":"1","answerSelected":"D"},{"questionID":"2","answerSelected":"C"}]}
This is my code:
//init variables
let jsonObject = {};
const JSON_ANSWERS_FILE = fs.createWriteStream("path/to/jsons/file.json", {
  flags: 'w',
  encoding: 'utf8'
});

//I created the key and value for the json object
jsonObject = {
  "id": 123,
  "answers": []
}

// BUTTON TO SAVE ANSWER IN JSON
jsonObject.answers.push({
  "qID": $("#element").text(),
  "selected": $("input:radio[name=radioName]:checked").attr("value")
})

JSON_ANSWERS_FILE.write(JSON.stringify(jsonObject), (error) => {
  if (error) {
    Swal.fire({
      icon: 'error',
      title: 'Oops...',
      text: 'message'
    })
  }
})
I am able to generate a CSV file with the data below. I am using a Node.js library, "csv-writer", that generates the file quite well. My problem is that I need a way to get back a buffer instead of the file itself, because I need to upload the file to a remote server via SFTP.
How do I go about modifying this piece of code to enable a buffer response? Thanks.
...
const csvWriter = createCsvWriter({
  path: 'AuthHistoryReport.csv',
  header: [
    {id: 'NAME', title: 'msg_datetime_date'},
    {id: 'AGE', title: 'msg_datetime'}
  ]
});

var rows = [
  { NAME: "Paul", AGE: 21 },
  { NAME: "Charles", AGE: 28 },
  { NAME: "Teresa", AGE: 27 },
];

csvWriter
  .writeRecords(rows)
  .then(() => {
    console.log('The CSV file was written successfully');
  });
...
Read your own file with fs.readFile('AuthHistoryReport.csv', (err, data) => ... );. If you don't specify an encoding, then the returned data is a Buffer, not a string. (Note the callback's first parameter is the error; the data is the second.)
fs.readFile('AuthHistoryReport.csv', 'utf8', (err, data) => ... ); gives you a string
fs.readFile('AuthHistoryReport.csv', (err, data) => ... ); gives you a Buffer
Nodejs file system #fs.readFile
You need to store your created file in a buffer using the native package fs
const fs = require('fs');
const buffer = fs.readFileSync('AuthHistoryReport.csv');
This question already has answers here:
Parse JSON in JavaScript? [duplicate]
(16 answers)
Closed 4 years ago.
I am very new to programming and I can't find the solution to my issue. Can you give me the solution, please?
I have this JSON file:
{
"groups": "[{ id: 1, title: 'group 1' }, { id: 2, title: 'group 2' }]"
}
And I need something like this in my JS (but I want to import my JSON to get an array like this):
const groups = [{ id: 1, title: 'group 1' }, { id: 2, title: 'group 2' }]
I don't know how to do this without using jQuery.
I already tried with this:
const json = require('./test.json');
This code returns an object. It's almost what I want, but it doesn't work in my code, because I don't want an object but an array, as I said above.
How can I achieve this?
The value of groups is not valid JSON: string values should be surrounded by double-quote marks, and so should keys. The file with valid JSON in that string would look like this:
{
"groups": "[{ \"id\": 1, \"title\": \"group 1\" }, { \"id\": 2, \"title\": \"group 2\" }]"
}
Of course if you have control over the creation of this file, it would be better to have this value as part of the native JSON content, rather than JSON-in-a-string-inside-JSON. If that's not possible, you will need to correct the quoting in the string yourself, which can be done with a couple of Regular Expression replacements.
/* obj is the object in the JSON file */
var json_str = obj.groups.replace(/'/g,"\"").replace(/([a-z]+)\:/g,"\"$1\":");
var groups = JSON.parse(json_str);
Alternatively, although the string is not valid JSON it is a valid Javascript expression, so if the contents of the file are trustworthy, you can also do it with eval:
var groups = eval(obj.groups);
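A quick sketch of the replacement approach above, run end to end (note the key regex only handles lowercase keys, and a colon inside a string value would also get mangled):

```javascript
const obj = {
  groups: "[{ id: 1, title: 'group 1' }, { id: 2, title: 'group 2' }]"
};

// Swap single quotes for double quotes, then double-quote the bare keys
const jsonStr = obj.groups
  .replace(/'/g, '"')
  .replace(/([a-z]+):/g, '"$1":');

const groups = JSON.parse(jsonStr);
console.log(groups[1].title); // group 2
```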
I just need to fill my array "groups" like this:
[{ id: 1, title: 'group 1' }, { id: 2, title: 'group 2' }]
with a JSON file and without jQuery.
Since I didn't notice the "without jQuery" in the original question, here is a new answer:
var request = new XMLHttpRequest();
request.open('GET', './test.json', true);
request.onload = function() {
  if (request.status >= 200 && request.status < 400) {
    // Success!
    var data = JSON.parse(request.responseText);
  } else {
    // We reached our target server, but it returned an error
  }
};
request.send();
It does two things:
1. Loads the json file
2. Parses the loaded string into a javascript object.
Before you can even do anything with your JSON file, you need to load it. jQuery has a shortcut, which will even automatically parse the JSON-string into a native JS object for you:
$.getJSON('./test.json', function(data) {
console.dir(data);
});
https://api.jquery.com/jquery.getjson/
If you can't edit the original text, you need to replace ' with " and then do JSON.parse(json.groups);
Otherwise you can change your JSON a little, like this:
JSON.parse('[{ "id": 1, "title": "group 1" }, { "id": 2, "title": "group 2" }]')
Be careful with " and ' quotes:
{
  "key": "string"
}
A string defined with ' quotes is not valid, and in JSON, keys must also be in double quotes.
You can convert the object's values to an array this way:
Object.values(YOUR_OBJECT)
docs: https://developer.mozilla.org/es/docs/Web/JavaScript/Referencia/Objetos_globales/Object/values
I want to read data from a file and add it to an Object stored in memory. The data in the file text.txt looks roughly like this:
One: {title: 'One' ,
contributor: 'Fred',
summary: 'blah' ,
comments: 'words' },
Two: {title: 'Two' ,
contributor: 'Chris' ,
summary: 'blah blah i'm a blah' ,
comments: '' },
I'm trying to set it to an empty Object like so:
var fs = require('fs');
var text = Object.create(null);
fs.readFile("./public/text.txt", "utf-8", function(error, data) {
  text = { data };
});
However, when I log text to the console, it comes out looking like this:
{ data: 'One: {title: \'One\' ,\ncontributor: \'Fred\',\nsummary: \'blah\' ,\ncomments: \'words\' },\n \nTwo: {title: \'Two\' ,\ncontributor: \'Chris\' ,\nsummary: \'blah blah i\'m a blah\' ,\ncomments: \'\' },\n\n' }
Apparently, it's reading data as a key. What I really want, though, is something more like so:
{
One: {title: 'One' ,
contributor: 'Fred',
summary: 'blah' ,
comments: 'words' },
Two: {title: 'Two' ,
contributor: 'Chris' ,
summary: 'blah blah i'm a blah' ,
comments: '' },
}
Any advice here would be much appreciated.
If you are using a newer version of Node, then you have support for ES6.
So your code
text = { data }
is actually a shortcut for
text = { data: data }
That's why you end up with an object that has the key data and the value is a string version of what was found in the file. Instead, just use JSON.parse on the data parameter (which is a string) and it'll convert it to an Object, which you can store in text. Like this
var fs = require('fs');
var text = Object.create(null);
fs.readFile("./public/text.txt", "utf-8", function(error, data) {
  text = JSON.parse(data);
});
You'll also need to make the file valid JSON, which means keys need quotes around them, as do string values.
{
  "One": {
    "title": "One",
    "contributor": "Fred",
    "summary": "blah",
    "comments": "words"
  },
  "Two": {
    "title": "Two",
    "contributor": "Chris",
    "summary": "blah blah i'm a blah",
    "comments": ""
  }
}
What you are trying to do is use eval, which is the only way if you really don't want to edit the file to be valid JSON or to export an object as #Spidy suggested. Just be sure the file is valid JavaScript, because the example you gave had
summary: 'blah blah i'm a blah'
but you need to escape i'm like i\'m.
var fs = require('fs');
var text = {};
fs.readFile('./public/text.txt', 'utf-8', function (error, data) {
  eval(`text = {${data}}`);
  //eval('text = {' + data + '}');
});
But I wouldn't necessarily recommend that because that allows arbitrary javascript to get executed. Depending on how the data in the file gets there it would be a huge security risk.
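For illustration, a minimal sketch of the eval approach with the apostrophe escaped as described; the data string is a stand-in for what fs.readFile would hand back:

```javascript
// Stand-in for the file contents, with i'm escaped as i\'m
const data =
  "One: {title: 'One', contributor: 'Fred'},\n" +
  "Two: {title: 'Two', summary: 'blah blah i\\'m a blah'}";

let text = {};
// eval runs the string as JavaScript, so the relaxed quoting is accepted
eval(`text = {${data}}`);
console.log(text.Two.summary); // blah blah i'm a blah
```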
I have a file named "file.csv", which has the data below:
ID Full name
1 Steve
2 John
3 nam
4 Hạnh
5 Thủy
I use the code segment below to parse this file to JSON, but my results are not UTF-8.
Code:
var fastCsv = require("fast-csv");
var fs = require("fs");
var iconv = require('iconv-lite');
var fileStream = fs.createReadStream("file.csv");
fastCsv
  .fromStream(fileStream, {headers: ["id", "full_name"]})
  .on("data", function(data) {
    console.log("------------------------");
    console.log("data: ", data);
  })
  .on("end", function() {
    console.log("done");
  });
Results:
data: { id: '��I\u0000D\u0000', full_name: '\u0000F\u0000u\u0000l\u0000l\u0000 \u0000n\u0000a\u0000m\u0000e\u0000' }
data: { id: '\u00001\u0000',full_name: '\u0000S\u0000t\u0000e\u0000v\u0000e\u0000' }
data: { id: '\u00002\u0000',full_name: '\u0000J\u0000o\u0000h\u0000n\u0000' }
data: { id: '\u00003\u0000',full_name: '\u0000n\u0000a\u0000m\u0000' }
data: { id: '\u00004\u0000', full_name: '\u0000H\u0000�\u001en\u0000h\u0000' }
data: { id: '\u00005\u0000',full_name: '\u0000T\u0000h\u0000�\u001ey\u0000' }
data: { id: '\u0000', full_name: '' }
How to convert my result to utf8?
Your input file is encoded in UTF-16LE, but it has been read as if it were UTF-8.
Try opening the file with fs.createReadStream('file.csv', {encoding: 'utf-16le'}).
Take a look at Javascript Has a Unicode Problem
In your case you need to decode the escaped unicode chars. A library included with node called punycode can handle this.
Import punycode via:
var punycode = require("punycode");
Change:
console.log("data: ", data);
To:
console.log("data: ", punycode.ucs2.decode(data));
You might have to break down the data object further to decode its properties, but I can't tell from your question what their structure is.