I have an array of text lines in JavaScript inside a variable named "dataArray":
[
'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
'2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
'2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
'2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
'2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
'2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;',
...
]
The first line contains headers, and the other lines contain data.
I need to write this to a .csv file and save it. How can I do this (I'm using Node.js)?
First I convert the array to valid CSV data by replacing every ; with ,.
Then I join all entries with a newline (csv.join("\r\n")) and write the result to a file.
const fs = require("fs");

const data = [
  'freq[Hz];re:Trc1_S11;im:Trc1_S11;re:Trc2_S21;im:Trc2_S21;',
  '2.400000000000000E+009;1.548880785703659E-001;1.067966520786285E-001;1.141964457929134E-003;5.855074618011713E-003;',
  '2.400166666666667E+009;1.546109169721603E-001;1.043454632163048E-001;1.287244027480483E-003;5.807569250464439E-003;',
  '2.400333333333334E+009;1.546102017164230E-001;1.018797382712364E-001;1.497663557529450E-003;5.986513104289770E-003;',
  '2.400500000000000E+009;1.545133888721466E-001;9.928287565708160E-002;1.647840370424092E-003;5.912321619689465E-003;',
  '2.400666666666667E+009;1.544111520051956E-001;9.671460092067719E-002;1.589289400726557E-003;5.917594302445650E-003;'
];

const csv = data.map((line) => line.replace(/;/g, ","));

fs.writeFile("./data.csv", csv.join("\r\n"), (err) => {
  console.log(err || "done");
});
With Node.js I need to fill an Excel file with data fetched from a csv file.
I am using the ExcelJS npm package.
I successfully read the data from the csv file and print it with console.log(), but the problem is that it comes out in a very strange format.
Code:
var Excel = require("exceljs");

exports.generateExcel = async () => {
  let workbookNew = new Excel.Workbook();
  let data = await workbookNew.csv.readFile("./utilities/file.csv");
  const worksheet = workbookNew.worksheets[0];
  worksheet.eachRow(function (row: any, rowNumber: number) {
    console.log(JSON.stringify(row.values));
  });
};
Data looks like this:
[null,"Users;1;"]
[null,"name1;2;"]
[null,"name2;3;"]
[null,"name3;4;"]
[null,"Classes;5;"]
[null,"class1;6;"]
[null,"class2;7;"]
[null,"class3;8;"]
[null,"Teachers;9;"]
[null,"teacher1;10;"]
[null,"teacher2;11;"]
[null,"Grades;12;"]
[null,"grade1;13;"]
[null,"grade2;14;"]
[null,"grade3;15;"]
The Excel file I need to fill with this data is quite complex: in specific cells I need to insert the users, in another sheet I need some images with grades, etc.
The main question for me is:
How can I parse the data shown in my console.log() and store it in separate variables, e.g. Users in one variable, Grades in another, and Teachers in another?
Example for users:
users = {
  title: "Users",
  names: ["name1", "name2", "name3"],
};
It doesn't need to look exactly like the example, just something reusable, so that when I read different csv files with the same structure I can easily access specific data and put it in specific cells of the Excel file.
Thank you very much.
I prepared an example of how you could parse your file. As proposed in another answer, we use fast-csv. The parsing is quite simple: you split each row by the separator and take line[0], the first element.
const fs = require('fs');
const csv = require('@fast-csv/parse');

fs.createReadStream('Test_12345.csv')
  .pipe(csv.parse())
  .on('error', error => console.error(error))
  .on('data', function (row) {
    const line = String(row).split(';');
    console.log(`${line[0]}`);
  })
  .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));
Given input like this:
Value1;1101;
Value2;2202;
Value3;3303;
Value4;4404;
the output is:
Value1
Value2
Value3
Value4
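To go from these first elements to the grouped structure the question asks for, here is a minimal sketch (assuming the section titles Users/Classes/Teachers/Grades are known in advance; groupSections is a hypothetical helper name):

```javascript
// Group lines like "Users;1;" / "name1;2;" into { title, names } objects.
// Assumes the section titles are known in advance.
const SECTION_TITLES = new Set(["Users", "Classes", "Teachers", "Grades"]);

function groupSections(lines) {
  const sections = {};
  let current = null;
  for (const line of lines) {
    const first = String(line).split(";")[0]; // first field, e.g. "Users" or "name1"
    if (SECTION_TITLES.has(first)) {
      current = { title: first, names: [] };
      sections[first] = current;
    } else if (current) {
      current.names.push(first);
    }
  }
  return sections;
}

const result = groupSections(["Users;1;", "name1;2;", "name2;3;", "Grades;4;", "grade1;5;"]);
console.log(result.Users); // { title: 'Users', names: [ 'name1', 'name2' ] }
```

You would feed it the rows collected in the fast-csv 'data' handler instead of the inline array.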
How can I convert this text to JSON with Node.js?
Input:
---
title:Hello World
tags:java,C#,python
---
## Hello World
```C#
Console.WriteLine("Hello World");
```
Expected output:
{
  title: "Hello World",
  tags: ["java", "C#", "python"],
  content: "## Hello World\n```C#\nConsole.WriteLine(\"Hello World\");\n```"
}
What I've thought of trying:
use a regex to get the key:value pairs, like below:
---
{key}:{value}
---
then, if the key equals tags, use String.split on , to get the tags array; otherwise return the value as-is.
The remaining part is the content value.
But I have no idea how to implement this in Node.js.
If the input is in a known format, you should use a battle-tested library to convert it to JSON, especially if the input is extremely dynamic in nature; otherwise, depending on how dynamic the input is, you might be able to build a parser yourself easily.
Assuming the input has the static structure you posted, the following should do the work:
function convertToJson(str) {
  const arr = str.split('---').filter(str => str !== '')
  const tagsAndTitle = arr[0]
  const tagsAndTitleArr = tagsAndTitle.split('\n').filter(str => str !== '')
  const titleWithTitleLabel = tagsAndTitleArr[0]
  const tagsWithTagsLabel = tagsAndTitleArr[1]
  const tagsWithoutTagsLabel = tagsWithTagsLabel.slice(tagsWithTagsLabel.indexOf(':') + 1)
  const titleWithoutTitleLabel = titleWithTitleLabel.slice(titleWithTitleLabel.indexOf(':') + 1)
  const tags = tagsWithoutTagsLabel.split(',')
  const result = {
    title: titleWithoutTitleLabel,
    tags,
    content: arr[1].slice(0, arr[1].length - 1).slice(1) // strip the leading and trailing newline
  }
  return JSON.stringify(result)
}
const x = `---
title:Hello World
tags:java,C#,python
---
## Hello World
\`\`\`C#
Console.WriteLine("Hello World");
\`\`\`
`
console.log(convertToJson(x))
Looks like you're trying to convert markdown to JSON. Take a look at markdown-to-json.
You can also use a markdown parser (like markdown-it) to get tokens out of the text which you'd have to parse further.
In this specific case, if your data is precisely structured like that, you can try this:
const fs = require("fs");

fs.readFile("input.txt", "utf8", function (err, data) {
  if (err) {
    return console.log(err);
  }
  const obj = {
    title: "",
    tags: [],
    content: "",
  };
  const content = [];
  data.split("\n").forEach((line) => {
    if (!line.startsWith("---")) {
      if (line.startsWith("title:")) {
        obj.title = line.substring(6);
      } else if (line.startsWith("tags:")) {
        obj.tags = line.substring(5).split(","); // skip the "tags:" label, then split
      } else {
        content.push(line);
      }
    }
  });
  obj.content = content.join("\n");
  fs.writeFileSync("output.json", JSON.stringify(obj));
});
Then you just wrap the whole fs.readFile in a loop to process multiple inputs.
Note that you need each input to be in a separate file and structured EXACTLY the way you mentioned in your question for this to work. For more general usage, probably try some existing npm packages like others suggest so you do not reinvent the wheel.
I need to get data from a csv file hosted on a url and convert it to a json array. So far I'm using this:
import request from "request-promise";
import encode from "nodejs-base64-encode";

let url = 'https://example.com/filename.csv';

function conversion(url) {
  return request.get(url, { encoding: null })
    .then(function (res) {
      //var base64File = encode.encode(res, 'base64');
      //how to convert the received data to json?
    });
}
I have tried converting to base64 first and then decoding the string, but that just gives me a string like below:
name,value,email
jason,23,email@email.com
jeremy,45,email@email.com
nigel,23,email@email.com
jason,23,email@email.com
I need it as a JSON array.
You can include a package like csvtojson to handle this conversion for you.
/** csv file
a,b,c
1,2,3
4,5,6
*/
const csvFilePath = '<path to csv file>'
const csv = require('csvtojson')

csv()
  .fromFile(csvFilePath)
  .then((jsonObj) => {
    console.log(jsonObj);
    /**
     * [
     *   { a: "1", b: "2", c: "3" },
     *   { a: "4", b: "5", c: "6" }
     * ]
     */
  })

// Async / await usage (inside an async function)
const jsonArray = await csv().fromFile(csvFilePath);
You're halfway there.
You just need to:
split the string by the end-of-line \n. You get an array of strings, each representing a line in your csv file.
split each line by ,.
And in your case, don't forget to ignore the first line (the header).
Check this article out:
https://medium.com/@sanderdebr/converting-csv-to-a-2d-array-of-objects-94d43c56b12d
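The steps above can be sketched as a small helper (csvToObjects is a hypothetical name; it assumes a simple file with no quoted fields):

```javascript
// Split into lines, take the first line as headers, and map each remaining
// line to an object keyed by those headers. No quoted-field handling.
function csvToObjects(text) {
  const lines = text.split(/\r?\n/).filter((l) => l.trim() !== "");
  const headers = lines[0].split(",");
  return lines.slice(1).map((line) => {
    const fields = line.split(",");
    const obj = {};
    headers.forEach((h, i) => { obj[h] = fields[i]; });
    return obj;
  });
}

const rows = csvToObjects("name,value,email\njason,23,a@b.com\njeremy,45,c@d.com");
console.log(rows[0]); // { name: 'jason', value: '23', email: 'a@b.com' }
```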
Split the string by \r\n to get an array of strings, then use Array.prototype.map to turn each row into an object.
Here is an example:
var newArray = decodedString.split('\r\n').slice(1).map(row => {
  var temp = row.split(',')
  return {
    name: temp[0],
    value: temp[1],
    email: temp[2]
  }
})
Hope this helps :)
Include a package called csvtojson
const request = require('request')
const csv = require('csvtojson')

csv()
  .fromStream(request.get('http://mywebsite.com/mycsvfile.csv'))
  .subscribe((json) => {
    return new Promise((resolve, reject) => {
      // long operation for each json row, e.g. transform / write into database
    })
  }, onError, onComplete); // onError and onComplete are your own handlers
With async/await:
async function ConvertCSVtoJSON() {
  const jsonArray = await csv().fromStream(request.get('http://mywebsite.com/mycsvfile.csv'));
  console.log(jsonArray);
}
I have a file containing multiple JSON-like objects, and I want to convert them to a JSON array. How do I do this in JavaScript? My overall goal is to generate a CSV file out of it.
Here is the sample file content,
{
Name:"nom1",
Cities:['city1','city2']
}
{
Name:"nom2",
Cities:['city4','city5']
}
Note: the above is a data dump and not in a correct format yet. I want to convert it to the following format:
var data = [ { Name: "nom1", Cities: ['city1','city2'] }, { Name: "nom2", Cities: ['city3','city4'] } ];
Then pass it to the below script.
I have thousands of such objects in a single file. I want to generate CSV data as follows
|Name|Cities|
|nom1|city1 |
|nom1|city2 |
|nom2|city3 |
|nom2|city4 |
I am using Node.js to achieve this:
const { Parser, transforms: { unwind } } = require('json2csv');

const data = {
  Name: "nom1",
  Cities: ['city1', 'city2']
}

const fields = ['Name', 'Cities'];
const transforms = [unwind({ paths: ['Cities'] })];
const json2csvParser = new Parser({ fields, transforms });
const csv = json2csvParser.parse(data);
console.log(csv);
This would do the trick with the file exactly like the one you presented:
log-parser.js
const fs = require('fs')

async function fileToArray (file) {
  const fileContent = await fs.promises.readFile(file)
  const singleLined = fileContent
    .toString()
    .replace(/\n/g, '')
    .replace(/\}\{/g, '},{')
    .replace(/(\w+):/g, '"$1":')
    .replace(/'/g, '"')
  console.log(singleLined)
  const array = JSON.parse(`[${singleLined}]`)
  console.log(array)
}

fileToArray(process.argv[2])
node log-parser.js my-log-file.log
I want to write a matrix to a .js file. When I use console.log(matrix) everything is fine but when I write it to the file it comes out differently.
var fs = require("fs");

var matrix = new Array(10);
for (var i = 0; i < matrix.length; i++) matrix[i] = [];
for (var i = 0; i < 100; i++) {
  var n = i % 10;
  matrix[n].push(i);
}
console.log(matrix);

// write it as a js array and export it (can't get brackets to stay)
fs.writeFile("./matrixtest.js", matrix, function (err) {
  if (err) {
    console.log(err);
  } else {
    console.log("Output saved to /matrixtest.js.");
  }
});
So the console.log gives me [[0,10,20,30,...100],...,[1,11,21,31,...91]] and so on. But opening up matrixtest.js it's only this:
0,10,20,30,40,50...
All the numbers separated by commas with no brackets. How do I prevent it from converting to that format? Thank you.
When you write an Array to a file, it gets converted to a string, because fs cannot write an array as-is. That is why it loses the format. You can convert an array to a string yourself and check:
var array = [1, 2, 3, 4];
console.log(array.toString());
// 1,2,3,4
So, to solve this problem, you might want to convert it to a JSON string like this
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function (err) {
  // ...
});
Stringify it (JSON.stringify) before saving, then parse it (JSON.parse) when reading it back in.
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function (err) {
  if (err) {
    console.log(err);
  } else {
    console.log("Output saved to /matrixtest.js.");
  }
});
then when reading back in
var matrix = JSON.parse(contents);
The system doesn't know that you want to store the array in the file with its brackets; it just writes the contents of the array to the file.
So first you need to convert the data you want to write into JSON format.
The JSON.stringify() method converts a JavaScript value to a JSON string.
Then write the JSON string to the file.
When reading it back, use JSON.parse(), which parses a JSON string and constructs the JavaScript value or object described by it.
fs.writeFile('./matrix.js', JSON.stringify(matrix), function (err) {
  if (err) {
    console.log(err);
  }
})

fs.readFile('./matrix.js', function (err, data) {
  console.log(JSON.parse(data));
  // Do whatever you want with JSON.parse(data)
});