Get CSV File From URL & Convert to JSON Array - javascript

I need to get data from a CSV file hosted at a URL and convert it to a JSON array. So far I'm using this:
import request from "request-promise";
import encode from "nodejs-base64-encode";

let url = 'https://example.com/filename.csv';

function conversion(url) {
  return request.get(url, {encoding: null})
    .then(function (res) {
      //var base64File = encode.encode(res, 'base64');
      //how to convert the received data to json?
    });
}
I have tried converting to base64 first and then decoding the string, but that just gives me a string like the one below.
name,value,email
jason,23,email@email.com
jeremy,45,email@email.com
nigel,23,email@email.com
jason,23,email@email.com
I need it as a JSON array, i.e. something like this:
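[
  { "name": "jason", "value": "23", "email": "email@email.com" },
  { "name": "jeremy", "value": "45", "email": "email@email.com" },
  ...
]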

You can include a package like csvtojson to handle this conversion for you.
/** csv file
a,b,c
1,2,3
4,5,6
*/
const csvFilePath = '<path to csv file>';
const csv = require('csvtojson');

csv()
  .fromFile(csvFilePath)
  .then((jsonObj) => {
    console.log(jsonObj);
    /**
     * [
     *   { a: "1", b: "2", c: "3" },
     *   { a: "4", b: "5", c: "6" }
     * ]
     */
  });
// Async / await usage (note: await must be inside an async function)
const jsonArray = await csv().fromFile(csvFilePath);
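Since your CSV comes from a URL rather than a local file, note that csvtojson can also parse a string directly with fromString, so you could plug it into your existing request call (a sketch, assuming res is the downloaded body from your .then callback):

const jsonArray = await csv().fromString(res.toString());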

You're halfway there. You just need to:

1. Split the string by the end-of-line character \n. You get an array of strings, each representing a line in your CSV file.
2. Split each line by ,.
3. In your case, don't forget to ignore the first line (the header).

A minimal sketch of these steps follows the link below.
Check this article out:
https://medium.com/@sanderdebr/converting-csv-to-a-2d-array-of-objects-94d43c56b12d
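This sketch assumes res is the raw CSV string received in your .then callback, with the name,value,email header row from the question:

const [header, ...lines] = res.toString().split('\n');
const keys = header.split(',');              // ['name', 'value', 'email']
const jsonArray = lines
  .filter((line) => line.trim() !== '')      // skip trailing blank lines
  .map((line) => {
    const values = line.split(',');          // split each line by ','
    return Object.fromEntries(keys.map((k, i) => [k, values[i]]));
  });
console.log(jsonArray); // [{ name: 'jason', value: '23', email: 'email@email.com' }, ...]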

Split the string by \r\n to get an array of strings, then use Array.prototype.map to turn each row into an object. Here is an example:
var newArray = decodedString.split('\r\n').slice(1).map(row => {
  var temp = row.split(',');
  return {
    name: temp[0],
    value: temp[1],
    email: temp[2]
  };
});
Hope this helps :)

Include a package called csvtojson
const request = require('request');
const csv = require('csvtojson');

csv()
  .fromStream(request.get('http://mywebsite.com/mycsvfile.csv'))
  .subscribe((json) => {
    return new Promise((resolve, reject) => {
      // long operation for each json row, e.g. transform / write into database
    });
  }, onError, onComplete); // onError / onComplete are callbacks you supply
With async and await:
async function ConvertCSVtoJSON() {
  const jsonArray = await csv().fromStream(request.get('http://mywebsite.com/mycsvfile.csv'));
  console.log(jsonArray);
}

Related

How to format the file containing json objects to json array

I have a file containing multiple JSON objects, and I want to convert them to a JSON array. How can I do this in JavaScript? My overall goal is to generate a CSV file from it.
Here is the sample file content:
{
  Name: "nom1",
  Cities: ['city1', 'city2']
}
{
  Name: "nom2",
  Cities: ['city4', 'city5']
}
Note: the above is the raw data dump and is not valid JSON yet. I want to convert it to the following format:
var data = [ { Name: "nom1", Cities: ['city1', 'city2'] }, { Name: "nom2", Cities: ['city3', 'city4'] } ];
Then pass it to the below script.
I have thousands of such objects in a single file. I want to generate CSV data as follows
|Name|Cities|
|nom1|city1 |
|nom1|city2 |
|nom2|city3 |
|nom2|city4 |
I am using Node.js to achieve this:
const { Parser, transforms: { unwind } } = require('json2csv');

const data = {
  Name: "nom1",
  Cities: ['city1', 'city2']
};

const fields = ['Name', 'Cities'];
const transforms = [unwind({ paths: ['Cities'] })];
const json2csvParser = new Parser({ fields, transforms });
const csv = json2csvParser.parse(data);
console.log(csv);
This would do the trick with a file exactly like the one you presented:
log-parser.js
const fs = require('fs');

async function fileToArray (file) {
  const fileContent = await fs.promises.readFile(file);
  const singleLined = fileContent
    .toString()
    .replace(/\n/g, '')            // collapse the file onto a single line
    .replace(/\}\{/g, '},{')       // put a comma between adjacent objects
    .replace(/(\w+):/g, '"$1":')   // quote the bare keys
    .replace(/'/g, '"');           // single quotes -> double quotes
  console.log(singleLined);
  const array = JSON.parse(`[${singleLined}]`);
  console.log(array);
  return array;
}

fileToArray(process.argv[2]);
node log-parser.js my-log-file.log
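If the end goal is the CSV from your question, a sketch along these lines should connect the two pieces (it reuses the json2csv Parser and unwind transform from your own snippet; the Name and Cities fields come from your sample data):

const { Parser, transforms: { unwind } } = require('json2csv');

async function logToCsv (file) {
  const data = await fileToArray(file);  // the parsed array from above
  const parser = new Parser({
    fields: ['Name', 'Cities'],
    transforms: [unwind({ paths: ['Cities'] })]  // one output row per city
  });
  console.log(parser.parse(data));
}

logToCsv(process.argv[2]);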

Node.js write to CSV file

I have an array of text lines in JavaScript inside a variable named "dataArray":
[
'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
'2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
'2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
'2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
'2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
'2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;',
...
]
First line contains headers, and other lines contain data.
I need to write this into .csv file and store that file. How can I do this (I'm using Node.js)?
First I convert the array to valid CSV data by replacing every ; with ,. Then I join all entries with a newline (csv.join("\r\n")) and write the result to a file.
const fs = require("fs");
const data = [
'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
'2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
'2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
'2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
'2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
'2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;'
];
const csv = data.map((e) => {
return e.replace(/;/g, ",");
});
fs.writeFile("./data.csv", csv.join("\r\n"), (err) => {
console.log(err || "done");
});

JavaScript Unicode Matching

I am using Node.js to read data from an XML file. But when I try to compare the data from the file with a literal, it is not matching, even though it looks the same:
const parser: xml2js.Parser = new xml2js.Parser();
const suggestedMatchesXml: any = fs.readFileSync(`${inputSuggMatchXmlFile}`, 'utf8');

parser.parseString(suggestedMatchesXml, (_parseErr: any, result: any) => {
  // console.debug(`result = ${util.inspect(result, false, null)}`);
  suggestedMatchesObjFromXml = JSON.parse(JSON.stringify(result));
  // console.debug(`suggestedMatchesObjFromXml BEFORE = ${JSON.stringify(suggestedMatchesObjFromXml)}`);
});

const destinations: Array<any> = suggestedMatchesObjFromXml.suggestedmatches.destination;
let docIdForUrl: string | undefined;

_.each(destinations, (destination: any) => {
  const { url }: { url: string } = destination;
  if (!destination.docId) {
    console.debug(`processInputSmXmlFile(): url = ${url} ; index = ${_.indexOf(url, 'meetings')}`);
Here's the log:
processInputSmXmlFile(): url = https://apps.na.collabserv.com/meetings/ ; index = -1
I'm not sure how this could happen, unless one of those strings is unicode, and the other isn't.
How can I convert this one way or the other so that the data matches?
Because I was doing a JSON.parse(), url was not a string; it was an object. Once I did a _.toString(url) and replaced _.indexOf(url, 'meetings') with _.includes(url, 'meetings') (Lodash's indexOf() is only for arrays), everything worked.
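A minimal sketch of that fix (assuming lodash is imported as _):

const urlString = _.toString(url);       // coerce the parsed value to a real string
if (_.includes(urlString, 'meetings')) { // _.includes works on strings; _.indexOf is for arrays
  // ... the url contains 'meetings'
}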

How can I write an array to a file in nodejs and keep the square brackets?

I want to write a matrix to a .js file. When I use console.log(matrix) everything is fine but when I write it to the file it comes out differently.
var fs = require("fs");

var matrix = new Array(10);
for (var i = 0; i < matrix.length; i++) matrix[i] = [];
for (var i = 0; i < 100; i++) {
  var n = i % 10;
  matrix[n].push(i);
}

console.log(matrix);

// write it as a js array and export it (can't get brackets to stay)
fs.writeFile("./matrixtest.js", matrix, function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("Output saved to /matrixtest.js.");
  }
});
So the console.log gives me [[0,10,20,30,...100],...,[1,11,21,31,...91]] and so on. But opening up matrixtest.js it's only this:
0,10,20,30,40,50...
All the numbers separated by commas with no brackets. How do I prevent it from converting to that format? Thank you.
When you write an array to a file, it gets converted to a string, because fs can only write strings (or Buffers). An array stringifies via its toString() method, which joins the elements with commas and drops the brackets; that is why it loses the format. You can convert an array to a string yourself and check:
var array = [1, 2, 3, 4];
console.log(array.toString());
// 1,2,3,4
So, to solve this problem, convert it to a JSON string like this:
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function(err) {
...
}
Stringify it (JSON.stringify) before saving it, then parse it (JSON.parse) when reading it back in.
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function(err) {
if(err) {
console.log(err);
}
else {
console.log("Output saved to /matrixtest.js.");
}
});
Then, when reading it back in:
var matrix = JSON.parse(contents);
fs.writeFile doesn't know that you want to store the array with its brackets; it just writes the array's contents to the file. So you first need to convert the data you want to write into JSON format. The JSON.stringify() method converts a JavaScript value to a JSON string. Write that JSON string to the file, and when reading it back, use JSON.parse(), which parses a JSON string and reconstructs the JavaScript value or object it describes:
fs.writeFile('./matrix.js', JSON.stringify(matrix), function (err) {
  if (err) {
    console.log(err);
  }
});

fs.readFile('./matrix.js', function (err, data) {
  console.log(JSON.parse(data));
  // Do whatever you want with JSON.parse(data)
});

Write objects into file with Node.js

I've searched all over stackoverflow / google for this, but can't seem to figure it out.
I'm scraping social media links of a given URL page, and the function returns an object with a list of URLs.
When I try to write this data into a different file, it outputs to the file as [object Object] instead of the expected:
[ 'https://twitter.com/#!/101Cookbooks',
'http://www.facebook.com/101cookbooks']
as it does when I console.log() the results.
This is my sad attempt to read and write a file in Node, trying to read each line (the URL) and pass it through a function call request(line, gotHTML):
fs.readFileSync('./urls.txt').toString().split('\n').forEach(function (line) {
  console.log(line);
  var obj = request(line, gotHTML);
  console.log(obj);
  fs.writeFileSync('./data.json', obj, 'utf-8');
});
For reference, the gotHTML function:
function gotHTML(err, resp, html) {
  var social_ids = [];

  if (err) {
    return console.log(err);
  } else if (resp.statusCode === 200) {
    var parsedHTML = $.load(html);
    parsedHTML('a').map(function(i, link) {
      var href = $(link).attr('href');
      for (var i = 0; i < socialurls.length; i++) {
        if (socialurls[i].test(href) && social_ids.indexOf(href) < 0) {
          social_ids.push(href);
        }
      }
    });
  }

  return social_ids;
}
Building on what deb2fast said, I would also pass a couple of extra parameters to JSON.stringify() to get pretty formatting:
fs.writeFileSync('./data.json', JSON.stringify(obj, null, 2) , 'utf-8');
The second param is an optional replacer function which you don't need in this case so null works.
The third param is the number of spaces to use for indentation. 2 and 4 seem to be popular choices.
obj is an array in your example.
fs.writeFileSync(filename, data, [options]) requires either a String or a Buffer as the data parameter (see the docs).
Try to write the array in a string format:
// writes 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks'
fs.writeFileSync('./data.json', obj.join(',') , 'utf-8');
Or:
// writes ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks']
var util = require('util');
fs.writeFileSync('./data.json', util.inspect(obj) , 'utf-8');
Edit: the reason you see the array in your example is that Node's implementation of console.log doesn't just call toString(); it calls util.format (see the console.js source).
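A quick way to see the difference, using the URLs from the question:

var arr = ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks'];
console.log(arr.toString()); // https://twitter.com/#!/101Cookbooks,http://www.facebook.com/101cookbooks
console.log(arr);            // [ 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks' ]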
If you're getting [object Object] then use JSON.stringify:
fs.writeFile('./data.json', JSON.stringify(obj) , 'utf-8');
It worked for me.
In my experience JSON.stringify is slightly faster than util.inspect.
I had to save the result object of a DB2 query as a JSON file. The query returned an object of 92k rows, and the conversion took very long to complete with util.inspect, so I ran the following test, writing the same 1000-record object to a file with both methods.
JSON.stringify
fs.writeFile('./data.json', JSON.stringify(obj, null, 2));
Time: 3:57 (3 min 57 sec)
Result's format:
[
{
"PROB": "00001",
"BO": "AXZ",
"CNTRY": "649"
},
...
]
util.inspect
var util = require('util');
fs.writeFile('./data.json', util.inspect(obj, false, 2, false));
Time: 4:12 (4 min 12 sec)
Result's format:
[ { PROB: '00001',
BO: 'AXZ',
CNTRY: '649' },
...
]
Could you try doing JSON.stringify(obj);
Like this:
var stringify = JSON.stringify(obj);
fs.writeFileSync('./data.json', stringify, 'utf-8');
Just in case anyone else stumbles across this: I use the fs-extra library in Node and write JavaScript objects to a file like this:
const fse = require('fs-extra');
fse.outputJsonSync('path/to/output/file.json', objectToWriteToFile);
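(A nice side effect of fse.outputJsonSync compared to plain fs.writeFileSync is that it also creates the parent directories for you if they don't exist yet.)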
Further to @Jim Schubert's and @deb2fast's answers:
To be able to write out large objects (on the order of ~100 MB or larger), you'll need to use for...of as shown below, adapted to your requirements.
const fsPromises = require('fs').promises;

const sampleData = { firstName: "John", lastName: "Doe", age: 50, eyeColor: "blue" };

const writeToFile = async () => {
  // append one key/value pair at a time instead of stringifying the whole object at once
  for (const dataObject of Object.keys(sampleData)) {
    console.log(sampleData[dataObject]);
    await fsPromises.appendFile("out.json", dataObject + ": " + JSON.stringify(sampleData[dataObject]));
  }
};

writeToFile();
See https://stackoverflow.com/a/67699911/3152654 for a full reference on Node.js limits.
