Writing generated JSON to file, BabyParse - javascript

I'm using BabyParse to convert a local CSV file to JSON. Here's the js file I've written that does it:
var Baby = require('babyparse');
var fs = require('fs');

var file = 'test2.csv';
var content = fs.readFileSync(file, { encoding: 'binary' });
var parsed = Baby.parse(content, { fastMode: false });
var rows = parsed.data;
console.log(rows);

fs.writeFile("blahblahblah.json", rows, function(err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
The JSON printed by the console.log(rows) line seems to be correct(ish). However, when I write rows to a file, all the JSON boilerplate disappears. For example, here's what I get when trying to convert the following CSV file:
col1,col2,col3
"val1","val2","val3"
"val1","val2","val3"
"val1","val2","val3"
This is what gets printed to console:
[ [ 'col1', 'col2', 'col3' ],
[ 'val1', 'val2', 'val3' ],
[ 'val1', 'val2', 'val3' ],
[ 'val1', 'val2', 'val3' ],
[ '' ] ]
But this is what gets written to the file:
col1,col2,col3,val1,val2,val3,val1,val2,val3,val1,val2,val3,
Does anyone know what's happening here? Why is the JSON-specific syntax being stripped out?

You need to convert your data to a JSON string before you save it. When fs.writeFile is handed an array, it coerces it to a string, and Array#toString joins all the elements with commas — which is exactly why the brackets and quotes disappear from your output.
rows = JSON.stringify(parsed.data);

This should do the trick!
fs.writeFile("blahblahblah.json", JSON.stringify(Baby.parse(content, { fastMode: false }).data), function(err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
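As a side note (not part of the original answer): JSON.stringify also takes optional arguments that pretty-print the output, which makes the written file much easier to inspect by eye. A small illustration:

```javascript
// The same nested-array data the parser produces.
const rows = [
  ['col1', 'col2', 'col3'],
  ['val1', 'val2', 'val3'],
];

const compact = JSON.stringify(rows);        // single line
const pretty = JSON.stringify(rows, null, 2); // indented with 2 spaces

console.log(compact); // [["col1","col2","col3"],["val1","val2","val3"]]
console.log(pretty);  // spans multiple lines, one value per line
```

Either string can be passed to fs.writeFile; only the formatting of the saved file differs.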

Related

Add value to array in JSON file with createWriteStream node

I have a problem writing a JSON file using createWriteStream in Node.js. The JSON in the file is OK, but it gets appended to the file, and I need to overwrite the content. I'm storing the selected answer when a button is clicked.
Example:
Question A :
Evaluated selected : D ANSWER (Save this in json)
Question B:
Evaluated selected : C ANSWER (Save this in json)
I GOT:
{"IdTest":"1021","answers":[{"questionID":"1","answerSelected":"D"}]}\n {"IdTest":"1021","answers":[{"questionID":"1","answerSelected":"D"},{"questionID":"2","answerSelected":"C"}]}
This is my code:
// init variables
let jsonObject = {};
const JSON_ANSWERS_FILE = fs.createWriteStream("path/to/jsons/file.json", {
    flags: 'w',
    encoding: 'utf8'
});

// create the keys and values for the JSON object
jsonObject = {
    "id": 123,
    "answers": []
};

// BUTTON TO SAVE ANSWER IN JSON
jsonObject.answers.push({
    "qID": $("#element").text(),
    "selected": $("input:radio[name=radioName]:checked").attr("value")
});

JSON_ANSWERS_FILE.write(JSON.stringify(jsonObject), (error) => {
    if (error) {
        Swal.fire({
            icon: 'error',
            title: 'Oops...',
            text: 'message'
        });
    }
});

Problem generating buffer for nodejs csv file creation

I am able to generate a CSV file with the data below, using the Node.js library "csv-writer", which generates the file quite well. My problem is that I need a way to get back a buffer instead of the file itself, because I need to upload the file to a remote server via SFTP.
How do I go about modifying this piece of code to get a buffer back? Thanks.
...
const csvWriter = createCsvWriter({
    path: 'AuthHistoryReport.csv',
    header: [
        { id: 'NAME', title: 'msg_datetime_date' },
        { id: 'AGE', title: 'msg_datetime' }
    ]
});

var rows = [
    { NAME: "Paul", AGE: 21 },
    { NAME: "Charles", AGE: 28 },
    { NAME: "Teresa", AGE: 27 },
];

csvWriter
    .writeRecords(rows)
    .then(() => {
        console.log('The CSV file was written successfully');
    });
...
Read your own file back with fs.readFile('AuthHistoryReport.csv', (err, data) => ...). If you don't specify an encoding, the returned data is a buffer, not a string. Note that the callback receives (err, data):
fs.readFile('AuthHistoryReport.csv', 'utf8', (err, data) => ...); // data is a string
fs.readFile('AuthHistoryReport.csv', (err, data) => ...); // data is a buffer
See the Node.js file system docs: fs.readFile.
You need to read your created file into a buffer using the native fs module:
const fs = require('fs');
const buffer = fs.readFileSync('AuthHistoryReport.csv');
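If touching the disk at all is undesirable, for flat data like this the CSV text can also be assembled in memory and wrapped in a Buffer directly — a sketch using only built-ins rather than csv-writer, and without the quoting/escaping a real CSV library would add:

```javascript
const rows = [
  { NAME: 'Paul', AGE: 21 },
  { NAME: 'Charles', AGE: 28 },
  { NAME: 'Teresa', AGE: 27 },
];

// Naive CSV assembly: fine for simple values with no commas or quotes.
const header = ['NAME', 'AGE'];
const lines = [header.join(',')]
  .concat(rows.map(r => header.map(h => r[h]).join(',')));

// A Buffer is what most SFTP client libraries accept as an upload body.
const buffer = Buffer.from(lines.join('\n') + '\n', 'utf8');
console.log(buffer.toString('utf8'));
```

This sidesteps the temporary file entirely; the buffer can be handed straight to the SFTP upload call.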

JS/Node: Updating a JSON file with the square brackets

My application needs to write the new users who sign up into a JSON file, so that later I can send this file to users.
What I want is something like:
[
{"username": "A"},
{"username": "B"},
{"username": "C"}
]
When a new user "D" signs up, NodeJS will update the file as:
[
{"username": "A"},
{"username": "B"},
{"username": "C"},
{"username": "D"}
]
However I'm having problems implementing this: although I can append to the file, I cannot write a username just before the closing ']'.
I tried doing it without the square brackets and JSON.parse(arrayFromFileRead), but it gives me an
'unexpected token {'
error.
Could somebody help with either:
Writing to the file one line before the last line, i.e. the line before the closing square bracket, or
Reading the file as a JSON object without the enclosing square brackets.
Thank you.
In order to write proper JSON (and be able to parse it as such with JSON.parse), you need to have commas between objects inside an array.
[
{"username": "A"},
{"username": "B"},
{"username": "C"}
]
Checkout this example:
var fs = require('fs');

function addUser(user, callback) {
    var usersFile = './users.json';
    fs.readFile(usersFile, function(err, users) {
        if (err) {
            return (callback) ? callback(err) : console.error(err);
        }
        users = (users) ? JSON.parse(users) : [];
        users.push(user);
        // note: fs.writeFile's callback receives only an error argument
        fs.writeFile(usersFile, JSON.stringify(users), function(err) {
            (callback) ? callback(err) : console.error(err);
        });
    });
}

addUser({username: 'D', password: 'blablabla'});
addUser({username: 'D', password: 'blablabla'});
logic:
Keep all user data in one users.json file, serialized with the JSON.stringify() function.
To do so you have to read the whole file into a variable, parse it, push the new record, serialize (stringify) it, and save it back to the file.
benefits:
There is no benefit! As your file gets bigger you'll waste more memory and CPU to read it, push, serialize, and write it back. Your file is also effectively locked during each read/write.
SO IT'S BETTER TO DO THIS:
1) create a users folder
2) change your code to something like this:
var fs = require('fs');
var path = require('path');
var md5 = require('md5');

var usersDir = './users';

function addUser(user, callback) {
    var userFile = path.join(usersDir, md5(user.username) + '.json');
    fs.writeFile(userFile, JSON.stringify(user), function(err) {
        (callback) ? callback(err) : console.error(err);
    });
}

addUser({username: 'D', password: 'blablabla'});
logic: you have a users folder where you keep user records, one file per user.
benefits:
With a single users.json file you have the problem of parallel access to the same file.
With separate files, the filesystem itself acts as a database, where each JSON file is a row and its content is the document.

Converting a file utf8 with fast-csv module

I have a file named "file.csv" with the data below:
ID Full name
1 Steve
2 John
3 nam
4 Hạnh
5 Thủy
I use the code below to parse this file to JSON, but my results are not UTF-8.
Code:
var fastCsv = require("fast-csv");
var fs = require("fs");
var iconv = require('iconv-lite');

var fileStream = fs.createReadStream("file.csv");

fastCsv
    .fromStream(fileStream, {headers: ["id", "full_name"]})
    .on("data", function(data) {
        console.log("------------------------");
        console.log("data: ", data);
    })
    .on("end", function() {
        console.log("done");
    });
Results:
data: { id: '��I\u0000D\u0000', full_name: '\u0000F\u0000u\u0000l\u0000l\u0000 \u0000n\u0000a\u0000m\u0000e\u0000' }
data: { id: '\u00001\u0000',full_name: '\u0000S\u0000t\u0000e\u0000v\u0000e\u0000' }
data: { id: '\u00002\u0000',full_name: '\u0000J\u0000o\u0000h\u0000n\u0000' }
data: { id: '\u00003\u0000',full_name: '\u0000n\u0000a\u0000m\u0000' }
data: { id: '\u00004\u0000', full_name: '\u0000H\u0000�\u001en\u0000h\u0000' }
data: { id: '\u00005\u0000',full_name: '\u0000T\u0000h\u0000�\u001ey\u0000' }
data: { id: '\u0000', full_name: '' }
How to convert my result to utf8?
Your input file is encoded in UTF-16LE, but it has been read as if it were UTF-8.
Try opening the file with fs.createReadStream('file.csv', {encoding: 'utf-16le'}).
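To illustrate the point, Node's Buffer can decode UTF-16LE natively, so the raw bytes can also be read first and decoded explicitly before parsing — a sketch with built-ins only, where the sample buffer stands in for the contents of file.csv:

```javascript
// Stand-in for the raw bytes of file.csv, which are UTF-16LE encoded.
const raw = Buffer.from('ID\tFull name\n4\tHạnh\n', 'utf16le');

// Misreading the bytes as UTF-8 produces the NUL-riddled garbage
// shown in the question; decoding as UTF-16LE recovers the text.
const text = raw.toString('utf16le');
console.log(text.includes('Hạnh')); // true
```

The decoded string (or a stream opened with the right encoding, as above) can then be handed to fast-csv as usual.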
Take a look at Javascript Has a Unicode Problem
In your case you need to decode the escaped unicode chars. A library included with node called punycode can handle this.
Import punycode via:
var punycode = require("punycode");
Change:
console.log("firstName: ", data);
To:
console.log("firstName: ", punycode.ucs2.decode(data));
You might have to break the data object down further to decode its properties, but I can't tell from your question what their structure is.

How to create a CSV file on the server in nodejs

I have the following code, which works well, but I'd like it to save and write a CSV file in the folder I'm in. I'm running the JS in Node. Big thanks!
var jsonexport = require('jsonexport');

var json = [{
    uniq_id: ['test'],
    product_url: ['http://www.here.com'],
    manufacturer: ['Disney']
}];

jsonexport(json, function(err, csv) {
    if (err) return console.log(err);
    console.log(csv);
});
Note: jsonexport is a JSON-to-CSV converter.
UPDATE: you can use something like this:
var fs = require('fs');

jsonexport(json, function(err, csv) {
    if (err) return console.log(err);
    fs.writeFile("/tmp/test.csv", csv, function(err) {
        if (err) {
            console.log(err);
        }
    });
});
