I have a JSON file containing multiple objects in a single file. I want to convert them to a JSON array. How do I do this in JavaScript? My overall goal is to generate a CSV file out of it.
Here is the sample file content:
{
Name:"nom1",
Cities:['city1','city2']
}
{
Name:"nom2",
Cities:['city4','city5']
}
Note: the above is the raw data dump; it is not in a correct format yet. I want to convert it to the following format:
var data = [ { Name:"nom1", Cities:['city1','city2'] }, { Name:"nom2", Cities:['city3','city4'] } ];
Then pass it to the script below.
I have thousands of such objects in a single file. I want to generate CSV data as follows:
|Name|Cities|
|nom1|city1 |
|nom1|city2 |
|nom2|city3 |
|nom2|city4 |
I am using Node.js to achieve this:
const { Parser, transforms: { unwind } } = require('json2csv');
const data = {
Name:"nom1",
Cities:['city1','city2']
}
const fields = ['Name', 'Cities'];
const transforms = [unwind({ paths: ['Cities'] })]; // unwind Cities so each city gets its own row
const json2csvParser = new Parser({ fields, transforms });
const csv = json2csvParser.parse(data);
console.log(csv);
This should do the trick with a file exactly like the one you presented:
log-parser.js
const fs = require('fs')
async function fileToArray (file) {
  const fileContent = await fs.promises.readFile(file)
  const singleLined = fileContent
    .toString()
    .replace(/\n/g, '')          // join everything into one line
    .replace(/\}\{/g, '},{')     // put a comma between adjacent objects
    .replace(/(\w+):/g, '"$1":') // quote the keys
    .replace(/'/g, '"')          // single quotes -> double quotes
  const array = JSON.parse(`[${singleLined}]`)
  console.log(array)
  return array
}
fileToArray(process.argv[2])
node log-parser.js my-log-file.log
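Building on that, here is a rough sketch of the whole pipeline: parse the dump as above, then feed the array to json2csv with an unwind on Cities so each city becomes its own row, as in the table from the question. It assumes the file parses cleanly with the regex approach from log-parser.js.
csv-from-log.js
const fs = require('fs')
const { Parser, transforms: { unwind } } = require('json2csv')

async function fileToArray (file) {
  const fileContent = await fs.promises.readFile(file)
  const singleLined = fileContent
    .toString()
    .replace(/\n/g, '')
    .replace(/\}\{/g, '},{')
    .replace(/(\w+):/g, '"$1":')
    .replace(/'/g, '"')
  return JSON.parse(`[${singleLined}]`)
}

async function main () {
  const data = await fileToArray(process.argv[2])
  const parser = new Parser({
    fields: ['Name', 'Cities'],
    transforms: [unwind({ paths: ['Cities'] })] // one CSV row per city
  })
  console.log(parser.parse(data))
}

main()
node csv-from-log.js my-log-file.log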
Related
From a txt file with values but no keys, I want to add those values to an empty JSON file. I have to add the key for each value, save the JSON file, and then export the JSON to a new txt file, this time with keys.
So I have this JSON file:
{ "agenda" : []
}
I have this txt file (all fake data in case you wonder):
"Fidel","Oltra","fidel#gmail.com","6650403234"
"Merxe","Sanz","merxe#gmail.com","65345235"
"David","Garcia","dgarcia#gmail.com","69823422"
"Amparo","López","alopez#gmail.com","67234234"
"Antonio","Gómez","antoniog#gmail.com","69929292"
And I want the JSON file to look like this:
{
"agenda":[
{
"Name": "Fidel",
"Surname": "Oltra",
"Email": "fidel#gmail.com",
"Phone": 6650403234
},
{
...
},
...
]
}
I have this code that is kind of working, but I don't know how to push the data properly, because the final JSON file doesn't look as expected.
const archivo = require("fs");
const json = require("fs");
const file = archivo.readFileSync('agenda.txt', 'utf8');
console.log(file);
const lines = file.split('\n');
console.log(lines);
let campos;
let rawdata = json.readFileSync('agenda.json');
let json1 = JSON.parse(rawdata);
console.log(json1);
for (i in lines) {
campos = lines[i].split(",");
json1.agenda.push('Nombre:', campos[0]);
json1.agenda.push('Apellido:', campos[1]);
json1.agenda.push('Email:', campos[2]);
json1.agenda.push('Teléfono:', campos[3]);
console.log(campos);
console.log(json1);
};
let data = JSON.stringify(json1);
json.writeFileSync('agenda2.json', data);
And the json when I open it is:
{"agenda":["Nombre:","\"Fidel\"","Apellido:","\"Oltra\"","Email:","\"fidel#gmail.com\"","Teléfono:","\"6650403234\"\r","Nombre:","\"Merxe\"","Apellido:","\"Sanz\"","Email:","\"merxe#gmail.com\"","Teléfono:","\"65345235\"\r","Nombre:","\"David\"","Apellido:","\"Garcia\"","Email:","\"dgarcia#gmail.com\"","Teléfono:","\"69823422\"\r","Nombre:","\"Amparo\"","Apellido:","\"López\"","Email:","\"alopez#gmail.com\"","Teléfono:","\"67234234\"\r","Nombre:","\"Antonio\"","Apellido:","\"Gómez\"","Email:","\"antoniog#gmail.com\"","Teléfono:","\"69929292\""]}
So I would like some help to make it work and to learn what I am doing wrong... and also to know how to write the final JSON back to a txt file.
There are two issues I found.
You never removed the quotation marks from agenda.txt, so your values still contained literal quotes and the resulting JSON escaped them for you: a value like "Fidel" ends up as "\"Fidel\"" in the output.
In the for loop, you were pushing your keys and values directly into the array instead of building an object and then pushing that object into the array.
const fs = require("fs");
const file = fs.readFileSync("agenda.txt", "utf8");
console.log(file);
const sinCotiz = file.replace(/"/g, ""); // remove quotes
const lines = sinCotiz.split(/\r?\n/).filter(Boolean); // handle \r\n line endings and skip empty lines
console.log(lines);
const rawdata = fs.readFileSync("agenda.json");
const json1 = JSON.parse(rawdata);
console.log(json1);
for (const i in lines) {
const obj = {}; // create an object
const campos = lines[i].split(",");
obj["Nombre"] = campos[0]; // create keys and values for the object
obj["Apellido"] = campos[1];
obj["Email"] = campos[2];
obj["Teléfono"] = campos[3];
json1.agenda.push(obj); // push the entire object in the array
console.log(campos);
console.log(json1);
}
const data = JSON.stringify(json1);
fs.writeFileSync("agenda2.json", data);
I think there is a simpler solution,
but you can treat your file as if it were a CSV file. Put a first pass in front that strips the quotes, then add a header line; in your case the header will be
Nombre,Apellido,Email,Teléfono
then you use this function:
function csvJSON(csv){
var lines=csv.split("\n");
var result = [];
var headers=lines[0].split(",");
for(var i=1;i<lines.length;i++){
var obj = {};
var currentline=lines[i].split(",");
for(var j=0;j<headers.length;j++){
obj[headers[j]] = currentline[j];
}
result.push(obj);
}
//return result; //JavaScript object
return JSON.stringify(result); //JSON
}
This will give you a formatted JSON with the following structure:
[{"Nombre":"Fidel","Apellido":"Oltra", ...}, ...]
and you can push the resulting array directly into your agenda :)
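A minimal usage sketch (the cleaned, header-prefixed file name is hypothetical; csvJSON is the function above, which returns a JSON string):
const fs = require("fs");

const cleaned = fs.readFileSync("agenda_clean.txt", "utf8"); // hypothetical: quotes stripped, header line added
const agenda = JSON.parse(fs.readFileSync("agenda.json", "utf8"));

agenda.agenda = JSON.parse(csvJSON(cleaned)); // csvJSON returns a string, so parse it before pushing
fs.writeFileSync("agenda2.json", JSON.stringify(agenda));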
Again, I think there is a simpler solution, but maybe this will help.
const archivo = require("fs");
const json = require("fs");
const file = archivo.readFileSync('agenda.txt', 'utf8');
const lines = file.split(/\r?\n/); // handle \r\n line endings
let campos;
let rawdata = json.readFileSync('agenda.json');
let json1 = JSON.parse(rawdata);
for (const i in lines) {
if(lines[i]) {
campos = lines[i].split(",");
let payload = {
'Name' : campos[0].substr(1,campos[0].length-2), // to remove double quotes from the string
'Surname' : campos[1].substr(1,campos[1].length-2),
'Email' : campos[2].substr(1,campos[2].length-2),
'Phone' : campos[3].substr(1,campos[3].length-2),
}
json1.agenda.push(payload);
}
};
let data = JSON.stringify(json1);
json.writeFileSync('agenda2.json', data);
How can I convert this text to JSON with Node.js?
Input :
---
title:Hello World
tags:java,C#,python
---
## Hello World
```C#
Console.WriteLine(""Hello World"");
```
Expected output:
{
title:"Hello World",
tags:["java","C#","python"],
content:"## Hello World\n```C#\nConsole.WriteLine(\"Hello World\"");\n```"
}
What I've thought of trying:
use a regex to get a key:value array, like below:
---
{key}:{value}
---
then check whether the key equals tags; if so, use String.split on , to get the array of tag values, otherwise keep the value as-is.
The remaining part is the content value.
But I have no idea how to implement this in Node.js.
If the input is in a known format, you should use a battle-tested library to convert it into JSON, especially if the input is extremely dynamic in nature. Otherwise, depending on how dynamic the input is, you might be able to build a parser easily yourself.
Assuming the input has the static structure you posted, the following should do the job:
function convertToJson(str) {
const arr = str.split('---').filter(str => str !== '')
const tagsAndTitle = arr[0]
const tagsAndTitleArr = tagsAndTitle.split('\n').filter(str => str !== '')
const titleWithTitleLabel = tagsAndTitleArr[0]
const tagsWithTagsLabel = tagsAndTitleArr[1]
const tagsWithoutTagsLabel = tagsWithTagsLabel.slice(tagsWithTagsLabel.indexOf(':') + 1)
const titleWithoutTitleLabel = titleWithTitleLabel.slice(titleWithTitleLabel.indexOf(':') + 1)
const tags = tagsWithoutTagsLabel.split(',')
const result = {
title: titleWithoutTitleLabel,
tags,
content: arr[1].slice(0, arr[1].length - 1).slice(1) // get rid of the first new line, and last new line
}
return JSON.stringify(result)
}
const x = `---
title:Hello World
tags:java,C#,python
---
## Hello World
\`\`\`C#
Console.WriteLine(""Hello World"");
\`\`\`
`
console.log(convertToJson(x))
Looks like you're trying to convert markdown to JSON. Take a look at markdown-to-json.
You can also use a markdown parser (like markdown-it) to get tokens out of the text which you'd have to parse further.
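For instance, a minimal sketch with markdown-it, just to show the token stream; the front-matter lines at the top are not standard markdown, so they would still need the custom handling described above.
const MarkdownIt = require("markdown-it");
const md = new MarkdownIt();

// body of the post, without the front matter
const src = "## Hello World\n```C#\nConsole.WriteLine(\"Hello World\");\n```\n";

// parse() returns an array of Token objects you can walk yourself
const tokens = md.parse(src, {});
tokens.forEach((t) => console.log(t.type, JSON.stringify(t.content)));
// e.g. heading_open / inline ("Hello World") / heading_close / fence (the C# snippet)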
In this specific case, if your data is precisely structured like that, you can try this:
const fs = require("fs");
fs.readFile("input.txt", "utf8", function (err, data) {
if (err) {
return console.log(err);
}
const obj = {
title: "",
tags: [],
content: "",
};
const content = [];
data.split("\n").map((line) => {
if (!line.startsWith("---")) {
if (line.startsWith("title:")) {
obj.title = line.substring(6);
} else if (line.startsWith("tags:")) {
obj.tags = line.substring(5).split(",");
} else {
content.push(line);
}
}
});
obj.content = content.join("\n");
fs.writeFileSync("output.json", JSON.stringify(obj));
});
Then you just wrap the whole fs.readFile in a loop to process multiple inputs.
Note that you need each input to be in a separate file and structured EXACTLY the way you mentioned in your question for this to work. For more general usage, probably try some existing npm packages like others suggest so you do not reinvent the wheel.
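For example, a rough sketch of that loop, with the parsing pulled into a helper (the input file names are hypothetical):
const fs = require("fs");

// same parsing logic as above, wrapped so it can be reused per file
function convert(data) {
  const obj = { title: "", tags: [], content: "" };
  const content = [];
  data.split("\n").forEach((line) => {
    if (line.startsWith("---")) return;
    if (line.startsWith("title:")) obj.title = line.substring(6);
    else if (line.startsWith("tags:")) obj.tags = line.substring(5).split(",");
    else content.push(line);
  });
  obj.content = content.join("\n");
  return obj;
}

// hypothetical input files
["input1.txt", "input2.txt"].forEach((name, i) => {
  const data = fs.readFileSync(name, "utf8");
  fs.writeFileSync(`output${i + 1}.json`, JSON.stringify(convert(data)));
});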
I need to get data from a CSV file hosted at a URL and convert it to a JSON array. So far I'm using this:
import request from "request-promise";
import encode from "nodejs-base64-encode";
let url = 'https://example.com/filename.csv';
function conversion(url) {
return request.get(url, {encoding: null})
.then( function (res) {
//var base64File = encode.encode(res, 'base64');
//how to convert the received data to json
});
}
I have tried converting to base64 first and then decoding the string, but that just gives me a string like the one below.
name,value,email
jason,23,email#email.com
jeremy,45,email#email.com
nigel,23,email#email.com
jason,23,email#email.com
I need it as a JSON array.
You can include a package like csvtojson to handle this conversion for you.
/** csv file
a,b,c
1,2,3
4,5,6
*/
const csvFilePath='<path to csv file>'
const csv=require('csvtojson')
csv()
.fromFile(csvFilePath)
.then((jsonObj)=>{
console.log(jsonObj);
/**
* [
* {a:"1", b:"2", c:"3"},
* {a:"4", b:"5". c:"6"}
* ]
*/
})
// Async / await usage
const jsonArray=await csv().fromFile(csvFilePath);
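Since the CSV in the question is already downloaded in memory via request-promise, a sketch along those lines could use csvtojson's fromString instead of fromFile (the URL is the placeholder from the question):
const request = require("request-promise");
const csv = require("csvtojson");

function conversion(url) {
  return request.get(url, { encoding: null })
    .then((res) => csv().fromString(res.toString())) // parse the downloaded CSV text
    .then((jsonArray) => {
      console.log(jsonArray); // e.g. [{ name: 'jason', value: '23', email: 'email#email.com' }, ...]
      return jsonArray;
    });
}

conversion("https://example.com/filename.csv");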
You're halfway there.
You just need to:
Split the string by end-of-line \n. You get an array of strings, each representing a line in your csv file.
Split each line by ,.
And in your case, don't forget to ignore the first line (the header).
Check this article out:
https://medium.com/#sanderdebr/converting-csv-to-a-2d-array-of-objects-94d43c56b12d
Split the string by \r\n to get an array of strings, then use Array.prototype.map to turn it into an array of objects.
Here is an example:
var newArray = decodedString.split('\r\n').slice(1).map(row => { // slice(1) skips the header line
var temp = row.split(',')
return {
name: temp[0],
value: temp[1],
email: temp[2]
}
})
Hope this helps :)
Include a package called csvtojson
const request=require('request')
const csv=require('csvtojson')
csv()
.fromStream(request.get('http://mywebsite.com/mycsvfile.csv'))
.subscribe((json)=>{
return new Promise((resolve,reject)=>{
// long operation for each json e.g. transform / write into database.
})
},onError,onComplete);
With async/await:
async function ConvertCSVtoJSON() {
const jsonArray = await csv().fromStream(request.get('http://mywebsite.com/mycsvfile.csv'));
console.log(jsonArray);
}
I have an array of text lines in JavaScript inside a variable named "dataArray":
[
'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
'2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
'2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
'2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
'2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
'2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;',
...
]
The first line contains headers, and the other lines contain data.
I need to write this into a .csv file and store that file. How can I do this (I'm using Node.js)?
First I convert the array to valid CSV data; for that I replace all ; with ,.
Then I join all entries together with newlines (csv.join("\r\n")) and write the result to a file.
const fs = require("fs");
const data = [
'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
'2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
'2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
'2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
'2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
'2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;'
];
const csv = data.map((e) => {
return e.replace(/;/g, ",");
});
fs.writeFile("./data.csv", csv.join("\r\n"), (err) => {
console.log(err || "done");
});
I am using Node.js to read data from an XML file. But when I try to compare the data from the file with a literal, it is not matching, even though it looks the same:
const parser: xml2js.Parser = new xml2js.Parser();
const suggestedMatchesXml: any
= fs.readFileSync(`${inputSuggMatchXmlFile}`, 'utf8');
parser.parseString(suggestedMatchesXml, (_parseErr: any, result: any) => {
// console.debug(`result = ${util.inspect(result, false, null)}`);
suggestedMatchesObjFromXml = JSON.parse(JSON.stringify(result));
// console.debug(`suggestedMatchesObjFromXml BEFORE = ${JSON.stringify(suggestedMatchesObjFromXml)}`);
});
const destinations: Array<any> = suggestedMatchesObjFromXml.suggestedmatches.destination;
let docIdForUrl: string | undefined;
_.each(destinations, (destination: any) => {
const { url }: { url: string } = destination;
if (!destination.docId) {
console.debug(`processInputSmXmlFile(): url = ${url} ; index = ${_.indexOf(url, 'meetings')}`);
Here's the log:
processInputSmXmlFile(): url = https://apps.na.collabserv.com/meetings/ ; index = -1
I'm not sure how this could happen, unless one of those strings is Unicode and the other isn't.
How can I convert this one way or the other so that the data matches?
Because I was doing a JSON.parse(), url was not a string; it was an object. Once I did a _.toString(url) and replaced _.indexOf(url, 'meetings') with _.includes(url, 'meetings') (since Lodash's indexOf() is only for arrays), everything started working.
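An illustrative sketch of that fix (the url value here is a stand-in for what comes out of the parsed XML, assumed to be an array-wrapped string):
const _ = require("lodash");

const url = ["https://apps.na.collabserv.com/meetings/"]; // stand-in for the parsed, non-string value

const urlString = _.toString(url); // coerce the value to a string first

// _.indexOf() is for arrays; _.includes() also works on strings
if (_.includes(urlString, "meetings")) {
  console.debug("url points at a meetings page");
}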