I have a file named "file.csv", and this file has the data below:
ID Full name
1 Steve
2 John
3 nam
4 Hạnh
5 Thủy
I use the code segment below to parse this file to a JSON file, but my results are not UTF-8.
Code:
var fastCsv = require("fast-csv");
var fs = require("fs");
var iconv = require('iconv-lite');
var fileStream = fs.createReadStream("file.csv");
fastCsv
.fromStream(fileStream, {headers : ["id", "full_name"]})
.on("data", function(data){
console.log("------------------------");
console.log("data: ", data);
})
.on("end", function(){
console.log("done");
});
Results:
data: { id: '��I\u0000D\u0000', full_name: '\u0000F\u0000u\u0000l\u0000l\u0000 \u0000n\u0000a\u0000m\u0000e\u0000' }
data: { id: '\u00001\u0000',full_name: '\u0000S\u0000t\u0000e\u0000v\u0000e\u0000' }
data: { id: '\u00002\u0000',full_name: '\u0000J\u0000o\u0000h\u0000n\u0000' }
data: { id: '\u00003\u0000',full_name: '\u0000n\u0000a\u0000m\u0000' }
data: { id: '\u00004\u0000', full_name: '\u0000H\u0000�\u001en\u0000h\u0000' }
data: { id: '\u00005\u0000',full_name: '\u0000T\u0000h\u0000�\u001ey\u0000' }
data: { id: '\u0000', full_name: '' }
How do I convert my result to UTF-8?
Your input file is encoded in UTF-16LE, but it has been read as if it were UTF-8.
Try opening the file with fs.createReadStream('file.csv', {encoding: 'utf-16le'}).
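For example, a sketch based on your own snippet (untested; 'utf16le' is the canonical Node name for this encoding, and 'utf-16le' is accepted as an alias on recent versions):

var fastCsv = require("fast-csv");
var fs = require("fs");

// Let Node decode the UTF-16LE bytes while reading.
// Note: the file may start with a BOM character that you may need to strip.
var fileStream = fs.createReadStream("file.csv", { encoding: "utf16le" });

fastCsv
    .fromStream(fileStream, {headers : ["id", "full_name"]})
    .on("data", function(data){
        console.log("data: ", data);
    })
    .on("end", function(){
        console.log("done");
    });

Since you already require iconv-lite, piping the raw stream through iconv.decodeStream("utf16le") would be an equivalent alternative.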
Take a look at JavaScript has a Unicode problem.
In your case you need to decode the escaped Unicode characters. A library bundled with Node called punycode can handle this.
Import punycode via:
var punycode = require("punycode");
Change:
console.log("firstName: ", data);
To:
console.log("firstName: ", punycode.ucs2.decode(data));
You might have to break down the data object further to decode it's properties but I can't tell from your answer what their structure is.
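For instance, a sketch of applying it per property (hypothetical, and note that ucs2.decode returns an array of numeric code points, so the encoding fix above is likely the more practical route):

var punycode = require("punycode");

fastCsv
    .fromStream(fileStream, {headers : ["id", "full_name"]})
    .on("data", function(data){
        // Decode each string property into an array of code points
        console.log("id: ", punycode.ucs2.decode(data.id));
        console.log("full_name: ", punycode.ucs2.decode(data.full_name));
    });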
Related
I am reading a CSV file and want to convert it into an array of strings. My file looks like this:
https://www.google.com/imghp?hl=en&tab=wi
https://maps.google.com/maps?hl=en&tab=wl
https://play.google.com/?hl=en&tab=w8
https://www.youtube.com/?gl=US&tab=w1
https://news.google.com/nwshp?hl=en&tab=wn
https://mail.google.com/mail/?tab=wm
https://drive.google.com/?tab=wo
https://www.google.com/intl/en/about/products?tab=wh
https://accounts.google.com/ServiceLogin?hl=en&passive=true&continue=https://www.google.com/
https://www.google.com/url?q=https://lifeinaday.youtube/%3Futm_source%3Dgoogle%26utm_medium%3Dhppcta%26utm_campaign%3D2020&source=hpp&id=19019062&ct=3&usg=AFQjCNEJMAD58Mjdnro8Mjm-RtJ3nfEIZA&sa=X&ved=0ahUKEwi98PWM4-HqAhVh1uAKHeYGCPwQ8IcBCAU
I am running this code to read the above CSV file and push the data into the csvData array:
const fs = require('fs');
const Papa = require('papaparse');

const csvFilePath = 'anchors.csv';
const file = fs.createReadStream(csvFilePath);
var csvData = [];

Papa.parse(file, {
    header: true,
    step: function(result) {
        csvData.push(result.data);
    },
    complete: function(results, file) {
        console.log('Complete', csvData.length, 'records.');
        console.log(csvData);
    }
});
However, I am getting this weird output separated with colons and curly brackets:
[
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://maps.google.com/maps?hl=en&tab=wl'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://play.google.com/?hl=en&tab=w8'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://www.youtube.com/?gl=US&tab=w1'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://news.google.com/nwshp?hl=en&tab=wn'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://mail.google.com/mail/?tab=wm'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://drive.google.com/?tab=wo'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://www.google.com/intl/en/about/products?tab=wh'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://accounts.google.com/ServiceLogin?hl=en&passive=true&continue=https://www.google.com/'
},
{
'https://www.google.com/imghp?hl=en&tab=wi': 'https://www.google.com/url?q=https://lifeinaday.youtube/%3Futm_source%3Dgoogle%26utm_medium%3Dhppcta%26utm_campaign%3D2020&source=hpp&id=19019062&ct=3&usg=AFQjCNEJMAD58Mjdnro8Mjm-RtJ3nfEIZA&sa=X&ved=0ahUKEwi98PWM4-HqAhVh1uAKHeYGCPwQ8IcBCAU'
}
]
What am I doing wrong here? I just need the whole strings separated by commas. Any help would be appreciated. Thanks.
Nothing is wrong here: in a CSV file, the first line is the header row (the names of your columns). Here you have a single column named 'https://www.google.com/imghp?hl=en&tab=wi', and all subsequent lines are value rows.
Try passing header: false.
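For example, a sketch of the same snippet with header: false (assuming Papa Parse 5, where the step callback receives one row at a time):

const fs = require('fs');
const Papa = require('papaparse');

const file = fs.createReadStream('anchors.csv');
const csvData = [];

Papa.parse(file, {
    header: false,
    step: function(result) {
        // With header: false each row is a plain array of fields,
        // so the URL is the first (and only) field of the row
        csvData.push(result.data[0]);
    },
    complete: function() {
        console.log('Complete', csvData.length, 'records.');
        console.log(csvData); // [ 'https://www.google.com/imghp?hl=en&tab=wi', ... ]
    }
});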
I would like to store accounts in a JSON file.
Something like: accounts{[user: user1, email: email1], [user: user2, email: email2]}
Javascript file
Accounts = {
    Nickname: form.getElementsByTagName('input')[0].value,
    Email: form.getElementsByTagName('input')[3].value
};

var json = JSON.stringify(Accounts);
fs.appendFile('Accounts.json', json, function(err){});
When I add a second user, this code makes a new object, and the file looks like this:
JSON file
{"NickName": "user1", "Email": "Email1"}{"NickName": "user2", "Email": "Email2"}
Then, when I try to read the file and parse it, I receive an unexpected token error: {
I think the JSON file should look like this:
{
{"Name":"value"},
{"Name2":"value2"}
}
I fixed the issue.
Here is my solution, which mixes in Michael Brune's solution below.
// Read Accounts.json
const Accounts = fs.readFileSync('Accounts.json', 'utf-8');

// Check if the JSON file is empty
var ParsedAccounts;
if (Accounts.length !== 0) {
    ParsedAccounts = JSON.parse(Accounts);
} else {
    ParsedAccounts = [];
}

// Account is the new account object built from the form, as in the question
ParsedAccounts.push(Account);

const NewData = JSON.stringify(ParsedAccounts, null, 4);

// Write the new data back to Accounts.json
fs.writeFileSync('Accounts.json', NewData);
Basically, I push the new data into ParsedAccounts and then write it back to the JSON file.
Maybe there is another way, but if your file is pretty small, then try it like this:
Accounts.json:
{
    "other": "other data"
}
index.js
const fs = require('fs');

const fileData = fs.readFileSync('Accounts.json');
let parsedFileData;

try {
    parsedFileData = JSON.parse(fileData);
} catch (error) {
    if (error instanceof SyntaxError) {
        // Create an empty object to append to, since parsing was not a success
        parsedFileData = {};
        // Create a backup of the old, possibly corrupt file
        fs.writeFileSync('Accounts.backup.json', fileData);
    }
}

// newDataToAppend is the new account object you want to add
const newFileData = JSON.stringify({
    ...parsedFileData,
    ...newDataToAppend,
}, null, 4);

fs.writeFileSync('Accounts.json', newFileData);
First you parse the file.
Then you assign the new and old data to one object.
Then you convert that to a string with JSON.stringify; the null and 4 arguments are there to write a nice, pretty file.
Finally, you write it back to the file directly.
I am able to generate a CSV file with the data below. I am using the Node.js library "csv-writer", which generates the file quite well. My problem is that I need a way to get back a buffer instead of the file itself, because I need to upload the file to a remote server via SFTP.
How do I go about modifying this piece of code to get a buffer back? Thanks.
...
const csvWriter = createCsvWriter({
    path: 'AuthHistoryReport.csv',
    header: [
        {id: 'NAME', title: 'msg_datetime_date'},
        {id: 'AGE', title: 'msg_datetime'}
    ]
});

var rows = [
    { NAME: "Paul", AGE: 21 },
    { NAME: "Charles", AGE: 28 },
    { NAME: "Teresa", AGE: 27 },
];

csvWriter
    .writeRecords(rows)
    .then(() => {
        console.log('The CSV file was written successfully');
    });
...
Read your own file back with fs.readFile('AuthHistoryReport.csv', (err, data) => ... ). If you don't specify an encoding, the returned data is a buffer, not a string.
fs.readFile('AuthHistoryReport.csv', 'utf8', (err, data) => ... ); returns a string.
fs.readFile('AuthHistoryReport.csv', (err, data) => ... ); returns a buffer.
See the Node.js File System docs: fs.readFile.
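For example, a sketch (untested) using the csvWriter and rows from your snippet; reading inside .then() matters, because the file only exists once writeRecords has finished:

const fs = require('fs'); // assumed to be required alongside csv-writer

csvWriter
    .writeRecords(rows)
    .then(() => {
        fs.readFile('AuthHistoryReport.csv', (err, data) => {
            if (err) throw err;
            console.log(Buffer.isBuffer(data)); // true, since no encoding was given
            // hand `data` to your SFTP upload here
        });
    });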
You can load your created file into a buffer using the built-in fs module:
const fs = require('fs');
const buffer = fs.readFileSync('AuthHistoryReport.csv');
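If the end goal is the SFTP upload mentioned in the question, a hedged sketch (the package choice and connection details are assumptions, not from the question) using ssh2-sftp-client, whose put() accepts a Buffer:

const fs = require('fs');
const SftpClient = require('ssh2-sftp-client');

async function uploadReport() {
    // Read the generated CSV into a buffer (no encoding => Buffer)
    const buffer = fs.readFileSync('AuthHistoryReport.csv');

    const sftp = new SftpClient();
    await sftp.connect({ host: 'example.com', username: 'user', password: 'secret' });
    await sftp.put(buffer, '/remote/path/AuthHistoryReport.csv');
    await sftp.end();
}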
I have a data format that I receive from the jQuery DataTables Editor, which looks like the sample below, and I need to parse it so that I can store it in a DB, but I have not figured out a way of doing so.
{
    action: 'edit',
    'data[1][Name]': 'Some Text ',
    'data[1][Rating]': '1',
    'data[1][Division]': 'Some Text '
}
What is the best way to parse this form of data using JavaScript? The Editor library comes with a PHP library for parsing the data, but I am using Node.js for the backend.
If you want to convert data[] into an object literal, you could do something like this:
var prop, fieldName, literal = {};

for (prop in data) {
    if (prop != 'action') {
        fieldName = prop.match(/\[(.*?)\]/g)[1].replace(/\]|\[/g, '');
        literal[fieldName] = data[prop];
    }
}
It will produce a literal like
{Name: "Some Text ", Rating: "1", Division: "Some Text "}
that can be used to be inserted in a mongodb for example.
It simply loops through data, extracts the second [...] group from each key, and takes the content of that bracket as the property name in the literal. I do not claim at all that this is the best method.
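A self-contained version of the loop above, using the request body from the question, might look like this (a sketch, untested):

var data = {
    action: 'edit',
    'data[1][Name]': 'Some Text ',
    'data[1][Rating]': '1',
    'data[1][Division]': 'Some Text '
};

var prop, fieldName, literal = {};

for (prop in data) {
    if (prop != 'action') {
        // 'data[1][Name]' -> ['[1]', '[Name]'] -> 'Name'
        fieldName = prop.match(/\[(.*?)\]/g)[1].replace(/\]|\[/g, '');
        literal[fieldName] = data[prop];
    }
}

console.log(literal); // { Name: 'Some Text ', Rating: '1', Division: 'Some Text ' }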
I have a new and maybe a bit more systematic approach that avoids the risk of '[]' characters appearing in the regexed strings. A very simple way is to use a custom ajax function; I have used my own data:
const editor = new $.fn.dataTable.Editor({
    ajax: (method, url, data, success, error) => {
        $.ajax({
            type: 'POST',
            url: '/updateproductcode',
            data: JSON.stringify(data),
            success: (json) => {
                success(json);
            },
            error: (xhr, error, thrown) => {
                error(xhr, error, thrown);
            }
        });
    },
    table: '#mytable',
    idSrc: 'productcode',
    fields: ...
Then on the server side you receive an object whose key is your stringified data:
{'{"action":"edit","data":{"08588001339265":{"productcode":"08588001339265","name":"does_not_existasdfadsf","pdkname":"Prokain Penicilin G 1.5 Biotika ims.inj.s.10x1.5MU","suklcode":"0201964","pdkcode":"2895002"}
}:''}
If you parse the key of it with JSON.parse(Object.keys(req.body)[0]), you get your results:
{ action: 'edit',
data:
{ '08588001339265':
{ productcode: '08588001339265',
name: 'does_not_existasdfadsf',
pdkname: 'Prokain Penicilin G 1.5 Biotika ims.inj.s.10x1.5MU',
suklcode: '0201964',
pdkcode: '2895002' } } }
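A hedged sketch of the matching server side (an Express app with app.use(express.urlencoded({ extended: true })) is assumed here, not taken from the question):

app.post('/updateproductcode', (req, res) => {
    // The whole stringified payload arrives as the single key of req.body
    const payload = JSON.parse(Object.keys(req.body)[0]);

    console.log(payload.action); // 'edit'
    console.log(payload.data);   // rows keyed by productcode

    res.json({ data: [] });      // Editor expects a JSON reply
});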
I'm new to Node.js. I have a JSON object which looks like the following:
var results = [
    { key: 'Name 1', value: '1' },
    { key: 'Name 2', value: '25%' },
    { key: 'Name 3', value: 'some string' },
    ...
];
The above object may or may not have different values. Still, I need to get them into a format that looks exactly like the following:
{"Name 1":"1","Name 2":"25%","Name 3":"some string"}
In other words, I'm looping through each key/value pair in results and adding it to a single line. From my understanding this single line approach (with double quotes) is called "JSON Event" syntax. Regardless, I have to print my JSON object out in that way into a text file. If the text file exists, I need to append to it.
I do not know how to do this, so: how do I append to a text file in Node.js?
Thank you!
You can use JSON.stringify to convert a JavaScript object to JSON and fs.appendFile to append the JSON string to a file.
// write all the data to the file
var fs = require('fs');
var str = JSON.stringify(results);

fs.appendFile('file.json', str, function(err) {
    if (err) {
        console.log('there was an error: ', err);
        return;
    }
    console.log('data was appended to file');
});
If you want to add just one item at a time, just do
// Just pick the first element
var fs = require('fs');
var str = JSON.stringify(results[0]);
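If you need the exact single-line shape from the question ({"Name 1":"1","Name 2":"25%",...}), a sketch (untested) that folds the key/value pairs into one object before appending:

var fs = require('fs');

// Fold [{key, value}, ...] into a single {key: value, ...} object
var line = JSON.stringify(results.reduce(function(obj, item) {
    obj[item.key] = item.value;
    return obj;
}, {}));

// Append as one line; the trailing newline keeps later entries on separate lines
fs.appendFile('file.json', line + '\n', function(err) {
    if (err) {
        console.log('there was an error: ', err);
        return;
    }
    console.log('data was appended to file');
});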