I'm trying to convert flattened JSON data to CSV with JS.
JS code:
import { saveAs } from "file-saver";
let data = [{}, {}] //data export from firebase as an array filled with objects
let dataJson = JSON.stringify(data);
let fileToSave = new Blob([dataJson ], {
type: "csv",
name: 'data.csv'
});
saveAs(fileToSave, 'data.csv');
My example JSON:
[
{"kaab":{"ka11":6,"ka12":6,"ka10":6},"ae":{"a6":2,"a5":2,"a4":6},"kg3":"fdsf","kg2":4,"solz":"2","kg1":5,"ges":1,"kaak":{"ka4":5,"ka1":4,"ka5":3,"ka6":5,"ka3":5,"ka2":4},"eink":"","kawe":{"ka9":4,"ka7":5,"ka8":5},"soz2":"","alt":3,"zul":{"infl":1,"spi":1,"int":1,"les":1,"mer":1,"aut":1,"inf2":1},"kg4":2,"am":{"a1":5,"a3":2,"a2":2}},
{"kaab":{"ka11":6,"ka12":6,"ka10":6},"ae":{"a6":2,"a5":2,"a4":6},"kg3":"fdsf","kg2":4,"solz":"2","kg1":5,"ges":1,"kaak":{"ka4":5,"ka1":4,"ka5":3,"ka6":5,"ka3":5,"ka2":4},"eink":"","kawe":{"ka9":4,"ka7":5,"ka8":5},"soz2":"","alt":3,"zul":{"infl":1,"spi":1,"int":1,"les":1,"mer":1,"aut":1,"inf2":1},"kg4":2,"am":{"a1":5,"a3":2,"a2":2}}
]
I have used file-saver for this, but with no great success.
I get a CSV file that opens in Excel, but something is still wrong:
(Excel screenshot)
But I need it to look like this instead:
(expected result screenshot)
If any of you can help me, I'd really appreciate it. It doesn't need to be done with file-saver.
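The missing step is flattening each object and building an actual CSV string before saving; JSON.stringify alone just writes JSON into a file with a .csv extension, which is why Excel shows garbage. A minimal sketch under that assumption (the flatten helper and the dot-separated column names are one possible choice, not the only way to do this):

```javascript
// Flatten nested objects into dot-separated keys,
// e.g. { kaab: { ka11: 6 } } -> { "kaab.ka11": 6 }
function flatten(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object") {
      Object.assign(out, flatten(value, name));
    } else {
      out[name] = value;
    }
  }
  return out;
}

// Turn an array of (possibly nested) objects into one CSV string.
function toCsv(data) {
  const flatRows = data.map((row) => flatten(row));
  // Union of all keys, so rows with missing fields still line up
  const headers = [...new Set(flatRows.flatMap((row) => Object.keys(row)))];
  const escape = (v) => `"${String(v ?? "").replace(/"/g, '""')}"`;
  const lines = [
    headers.map(escape).join(","),
    ...flatRows.map((row) => headers.map((h) => escape(row[h])).join(",")),
  ];
  return lines.join("\r\n");
}
```

From there the file-saver part stays the same, except you save the CSV string with a proper MIME type: new Blob([toCsv(data)], { type: "text/csv;charset=utf-8" }) passed to saveAs(blob, "data.csv").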
Related
I recently exported all my user data from Firebase and now I want to reformat the JSON file to keep only the relevant fields I need for my data model.
The file I got from Firebase is currently structured like this:
{
"Users": {
"00uniqueuserid3": {
"1UserName": "Pusername",
"2Password": "password",
"3Email": "email#gmail.com",
"4City": "dubai"
}
}
}
The issue is that the JSON file has over 5,000 users, so I can't possibly reformat them manually. Is there any JavaScript script or tool I can use to reformat all the data in the file? I would like to format each user like this:
{"id": uniqueid, "name": name, "email": email, "city": city}
You can create a new NodeJS project (npm init -y) and install mongodb. Then read the JSON file and modify the format using JS:
const { MongoClient } = require("mongodb");
const exportedData = require("./myFirebaseExport.json");

async function run() {
  const client = await MongoClient.connect("mongodb://localhost:27017"); // <-- your connection string
  const db = client.db("mydb"); // <-- your database name

  // exportedData.Users is keyed by the unique user id, so keep the keys
  const parsedData = Object.entries(exportedData.Users).map(([id, d]) => {
    // modify this mapping as required;
    return {
      id,
      name: d["1UserName"], // replace with your field names
      email: d["3Email"],
      city: d["4City"],
    };
  });

  await db.collection("users").insertMany(parsedData);
  await client.close();
}
run().catch(console.error);
I am able to generate a CSV file with the data below. I am using a Node.js library, csv-writer, which generates the file quite well. My problem is that I need a way to get back a buffer instead of the file itself, because I need to upload the file to a remote server via SFTP.
How do I go about modifying this piece of code to get a buffer? Thanks.
...
const csvWriter = createCsvWriter({
path: 'AuthHistoryReport.csv',
header: [
{id: 'NAME', title: 'msg_datetime_date'},
{id: 'AGE', title: 'msg_datetime'}
]
});
var rows = [
{ NAME: "Paul", AGE:21 },
{ NAME: "Charles", AGE:28 },
{ NAME: "Teresa", AGE:27 },
];
csvWriter
.writeRecords(rows)
.then(() => {
console.log('The CSV file was written successfully');
});
...
Read your own file back with fs.readFile. Note that the callback receives (err, data); if you don't specify an encoding, the returned data is a buffer, not a string.
fs.readFile('AuthHistoryReport.csv', 'utf8', (err, data) => ...); // data is a string
fs.readFile('AuthHistoryReport.csv', (err, data) => ...); // data is a buffer
Node.js file system docs: fs.readFile
You need to read the created file into a buffer using the built-in fs module:
const fs = require('fs');
const buffer = fs.readFileSync('AuthHistoryReport.csv');
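Alternatively, you can skip the temporary file entirely and build the CSV in memory. This hand-rolled sketch replaces csv-writer (the column names are taken from the snippet above), and Buffer.from gives you something you can hand straight to an SFTP client:

```javascript
// Build the CSV by hand instead of going through csv-writer and a temp file.
// columns uses csv-writer's shape: { id, title }.
function toCsvBuffer(rows, columns) {
  const escape = (v) => `"${String(v).replace(/"/g, '""')}"`;
  const lines = [
    columns.map((c) => escape(c.title)).join(","),
    ...rows.map((row) => columns.map((c) => escape(row[c.id])).join(",")),
  ];
  return Buffer.from(lines.join("\n"), "utf8");
}

const buffer = toCsvBuffer(
  [
    { NAME: "Paul", AGE: 21 },
    { NAME: "Charles", AGE: 28 },
  ],
  [
    { id: "NAME", title: "msg_datetime_date" },
    { id: "AGE", title: "msg_datetime" },
  ]
);
// buffer can now be passed to an SFTP client, e.g. sftp.put(buffer, "/remote/report.csv")
// (the sftp.put call is illustrative; check your SFTP library's API)
```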
The backend of my web app, written in Node.js, interacts with a JSON file whose format I thought was not so complex, but apparently it is.
The structure of my JSON file is as such:
{
"data": [
{
"somefield": "ioremipsum",
"somedate" : "2018-08-23T11:48:00Z",
"someotherdate" : "2018-08-23T13:43:00Z",
"somethingelse":"ioremipsum",
"files": [
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
},
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
},
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
}
]
}
]
}
I load this file into a JS object like this:
const file = require('specificJsonFile.json');
let fileList = file;
And I need to loop through my 'files' array for further processing, but unfortunately, my JS object looks like this:
{ data:
[ { somefield: "ioremipsum",
somedate : "2018-08-23T11:48:00Z",
someotherdate : "2018-08-23T13:43:00Z",
somethingelse:"ioremipsum",
files: [Array] } ] }
Please forgive me if this is obvious, for I am still a beginner with JS.
That's only how console.log displays deep objects; the data is all there. To get deeper output, you can use util.inspect:
const util = require('util');
console.log(util.inspect(yourObject, {showHidden: false, depth: null}));
To loop over each data entry's files, simply loop over data, then over its files:
yourObject.data.forEach(d => {
d.files.forEach(file => console.log(file));
});
It looks like there is nothing wrong there; the console is just abbreviating the log.
Try accessing the files list with the following code:
const filesList = file.data[0].files
and then console.log(filesList) to check that it's working.
Hope it helps!
let fileList = file.data[0].files;
This will create an array of only your files array.
You can console.log(fileList)
Or whatever you like with the data.
Based on your comment, try the of keyword instead of the in keyword to get the behaviour you expected:
for (let file of fileList){
console.log(file);
}
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of
You can use for...in:
for (item in fileList.data) {
for (file in fileList.data[item].files) {
let data = fileList.data[item].files[file];
// process the data
}
}
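If you just want one flat array of every file across all data entries, flatMap is a one-liner; a sketch using a trimmed-down version of the object above (the sample values are made up):

```javascript
const file = {
  data: [
    { somefield: "ioremipsum", files: [{ specificfieldinarray: "a" }, { specificfieldinarray: "b" }] },
    { somefield: "ioremipsum", files: [{ specificfieldinarray: "c" }] },
  ],
};

// Collect every entry's files into a single flat array (flatMap needs Node 11+)
const allFiles = file.data.flatMap((d) => d.files);
console.log(allFiles.length); // 3
```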
I have a CSV file and I want to parse it using PapaParse. How do I do this properly?
I have so far:
Papa.parse(fileInput, {
download: true,
complete: function(results) {
console.log(results.data);
console.log(results.errors);
}
});
However, is there a better way to do this? Is this the proper way to get errors? The documentation didn't emphasize download: true or anything, so I was wondering if there are any experts on this subject here.
EDIT: Also, am I supposed to further process the parsed data with Papa Parse or do it in React? For instance, if I have multiple rows in my data file that share a value, should I parse the file so it groups all those rows together, and how would I go about doing this?
For instance,
Date, Name , Win/Lose
I want to group all the winners together. How do I do that?
The way you are using Papa Parse is for a remote CSV: download: true tells it to download the remote file first.
With Papa Parse, the parse result object (data, errors, meta) passed to complete is the only way of getting errors.
// If parsed with header: true, results.data looks like this:
var data = [
{
"date": "8/12/2018",
"name": "foo",
"win/loose": "win"
},
{
"date": "8/12/2018",
"name": "foo",
"win/loose": "loose"
},
{
"date": "8/12/2018",
"name": "foo1",
"win/loose": "win"
},
];
var winners = data.filter(d => d['win/loose'] == 'win');
console.log(winners);
//If you want to group winners and losers then:
var grouped = data.reduce(function(acc, co) {
var key = co['win/loose'];
if(!acc[key]) {
acc[key] = [];
}
acc[key].push(co);
return acc;
}, {});
console.log(grouped);
This'll give you a separate array of winners from the extracted data, and the reduce version groups winners and losers under their own keys.
I am writing to a JSON file in CasperJS and am trying to add new objects to it.
The JSON file looks like:
{ "visited": [
{
"id": "258b5ee8-9538-4480-8109-58afe741dc2f",
"url": "https://................"
},
{
"id": "5304de97-a970-48f2-9d3b-a750bad5416c",
"url": "https://.............."
},
{
"id": "0fc7a072-7e94-46d6-b38c-9c7aedbdaded",
"url": "https://................."
}]}
The code to add to the array is
var data;
if (fs.isFile(FILENAME)) {
data = fs.read(FILENAME);
} else {
data = JSON.stringify({ 'visited': [] });
}
var json = JSON.parse(data);
json.visited.push(visiteddata);
data = JSON.stringify(json, null, '\n');
fs.write(FILENAME, data, "a");
This starts off by adding a new { "visited" : [ ] } array, with the first couple of objects, below the existing { "visited" : [ ] } array, and subsequently the script breaks because the JSON is no longer valid.
Can anybody point me in the right direction? Thank you in advance.
You have a JSON file containing some data.
You:
Read that data
Modify that data
Append the modified version of that data to the original file
This means the file now has the original data and then, immediately after it, a near identical copy with a little bit added.
You don't need the original. You only need the new version.
You need to write to the file instead of appending to it.
Change the 'a' flag to 'w'.