The backend of my webapp, written in Node.js, interacts with a JSON file whose format I thought was not so complex, but apparently it is.
The structure of my JSON file is as follows:
{
"data": [
{
"somefield": "ioremipsum",
"somedate" : "2018-08-23T11:48:00Z",
"someotherdate" : "2018-08-23T13:43:00Z",
"somethingelse":"ioremipsum",
"files": [
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
},
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
},
{
"specificfieldinarray": "ioremipsum",
"specificotherfieldinarray": "ioremipsum"
}
]
}
]
}
I try to load this into a JS object like this:
const file = require('specificJsonFile.json');
let fileList = file;
I need to loop through the 'files' array for further processing, but unfortunately my JS object looks like this:
{ data:
[ { somefield: "ioremipsum",
somedate : "2018-08-23T11:48:00Z",
someotherdate : "2018-08-23T13:43:00Z",
somethingelse:"ioremipsum",
files: [Array] } ] }
Please forgive me if this is obvious, for I am still a beginner with JS.
That's just how console.log displays deeply nested objects. To get deeper output, you can use util.inspect:
const util = require('util');
console.log(util.inspect(yourObject, {showHidden: false, depth: null}));
To loop over each data entry's files, simply loop over data, then over its files:
yourObject.data.forEach(d => {
d.files.forEach(file => console.log(file));
});
It looks like there is nothing wrong there and the console is abbreviating the log.
Try accessing the files list with the following code:
const filesList = file.data[0].files
and then
console.log(filesList) to check that it's actually working.
Hope it helps!
let fileList = file.data[0].files;
This will give you just the files array.
You can console.log(fileList) or do whatever you like with the data.
Based on your comment, try the of keyword instead of the in keyword to get the behaviour you expected.
for (let file of fileList){
console.log(file);
}
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of
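For reference, a quick sketch of the difference between the two keywords on an array (the values here are made up): for...in iterates the indices, while for...of iterates the elements themselves.
const exampleList = ['a.txt', 'b.txt'];
for (const index in exampleList) {
    console.log(index);   // "0", "1" (the indices, as strings)
}
for (const value of exampleList) {
    console.log(value);   // "a.txt", "b.txt" (the elements)
}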
You can use for...in:
for (const item in fileList.data) {
    for (const file in fileList.data[item].files) {
        let data = fileList.data[item].files[file];
        // process the data
    }
}
I am new to the JS ecosystem, and a requirement is to create a JSON object dynamically at run time and write it to a file.
{
"prerequisite": [
{
"url1": "",
"step1": {},
"step2": {}
}
],
"PF": [
{},
{}
]
}
I tried using new Object(), but I'm not sure how to build one this complex (to me, at least, it is complex).
For writing it to a file, I understand we can use fs.
I need help building this JSON object at run time.
I guess you'd like to do something like this. I'm not sure how you want to make it dynamic, but if you have a JSON object, you can always add to it like this: json_obj.something = "anything"
It saves (overwrites) the file every time it runs. Save this code to a file (like filename.js) and run it (node filename.js).
Also note that you must stringify the JSON, since the fs "data" argument must be of type string or an instance of Buffer, TypedArray, or DataView.
const fs = require("fs")
const FILE = "./json_obj.json"
const json_obj = {
prerequisite: [
{
url1: "",
step1: {},
step2: {},
},
],
PF: [{}, {}],
}
fs.writeFile(FILE, JSON.stringify(json_obj), (err) => {
if (err) {
console.error(err)
return
}
})
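If parts of the object only become known at run time, you can also start from an empty skeleton and add to it before writing; a minimal sketch, where the step contents are placeholder values I made up:
const fs = require("fs")
const FILE = "./json_obj.json"

// Start from an empty skeleton and fill it in at run time
const json_obj = { prerequisite: [], PF: [] }

const entry = { url1: "" }
entry["step1"] = { action: "login" }    // placeholder value added dynamically
entry["step2"] = { action: "submit" }   // placeholder value added dynamically
json_obj.prerequisite.push(entry)

json_obj.PF.push({}, {})

fs.writeFileSync(FILE, JSON.stringify(json_obj, null, 2))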
I would like to store accounts in a JSON file.
Something like: accounts{[user: user1, email: email1], [user: user2, email: email2]}
JavaScript file:
Accounts = {
Nickname: form.getElementsByTagName('input')[0].value,
Email: form.getElementsByTagName('input')[3].value
};
var json = JSON.stringify(Accounts);
fs.appendFile('Accounts.json', json, function(err){});
When I add a second user, this code appends a new object, and the file ends up looking like this:
JSON file:
{"NickName": "user1", "Email": "Email1"}{"NickName": "user2", "Email": "Email2"}
Then, when I try to read the file and parse it, I receive an 'unexpected token {' error.
I think the JSON file should look like:
{
{"Name":"value"},
{"Name2":"value2"}
}
I fixed the issue.
Here is my solution, mixing in Michael Brune's solution.
// Read Accounts.json
const Accounts = fs.readFileSync('Accounts.json', 'utf-8');
// Check if the JSON file is empty
let ParsedAccounts;
if (Accounts.length !== 0) {
    ParsedAccounts = JSON.parse(Accounts);
} else {
    ParsedAccounts = [];
}
// Account is the new account object to add
ParsedAccounts.push(Account);
const NewData = JSON.stringify(ParsedAccounts, null, 4);
// Write the new data back to Accounts.json
fs.writeFileSync('Accounts.json', NewData);
Basically, I push the new data into ParsedAccounts, then write it back to the JSON file.
Maybe there is another way, but if your file is pretty small, try it like this:
Accounts.json:
{
"other": "other data"
}
index.js
const fileData = fs.readFileSync('Accounts.json');
let parsedFileData
try {
parsedFileData = JSON.parse(fileData);
} catch (error) {
if (error instanceof SyntaxError) {
// Create empty object to append to since parse was not a success
parsedFileData = {};
// Create backup of old, possible corrupt file
fs.writeFileSync('Accounts.backup.json', fileData);
}
}
const newFileData = JSON.stringify({
...parsedFileData,
...newDataToAppend,
}, null, 4);
fs.writeFileSync('Accounts.json', newFileData);
First you parse the file.
Then you assign the new and old data to one object.
Then you convert that object to a string with JSON.stringify.
The null and 4 arguments are there to write a nice, pretty file.
Then you write it back to the file directly.
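For example, with a hypothetical newDataToAppend shaped like this:
const newDataToAppend = {
    user1: { Nickname: "user1", Email: "email1" }
};
the resulting Accounts.json would contain both the old and the new keys:
{
    "other": "other data",
    "user1": {
        "Nickname": "user1",
        "Email": "email1"
    }
}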
I have a CSV file and I want to parse it using PapaParse. How do I do this properly?
I have so far:
Papa.parse(fileInput, {
download: true,
complete: function(results) {
console.log(results.data);
console.log(results.errors);
}
});
However, is there a better way to do this? Is this the proper way to get errors? The documentation didn't emphasize download: true or anything, so I was wondering if there are any experts on this subject here.
EDIT: Also, am I supposed to further process the parsed file with Papa Parse or do it in React? For instance, if I have multiple rows in my data file that share a similar name reference, should I initially parse the file so it groups all those references together, and how would I go about doing this?
For instance,
Date, Name, Win/Lose
I want to group all the winners together. How do I do that?
The way you are using Papa Parse is for a remote CSV file.
download: true is for downloading the remote file.
With Papa Parse, this is the way to get errors, data, and meta from the parse results object.
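For comparison, here is a minimal sketch of parsing a local file chosen through an <input type="file"> element instead; fileInput here is assumed to be that DOM element rather than a URL, so download: true is not needed, and header: true turns each row into an object keyed by the header row:
Papa.parse(fileInput.files[0], {
    header: true,
    skipEmptyLines: true,
    complete: function(results) {
        console.log(results.data);     // parsed rows as objects
        console.log(results.errors);   // row-level parse errors
        console.log(results.meta);     // delimiter, fields, etc.
    },
    error: function(err) {
        console.error(err);            // file-reading errors
    }
});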
// If header: true, the parsed data looks like this:
var data = [
    {
        "date": "8/12/2018",
        "name": "foo",
        "win/lose": "win"
    },
    {
        "date": "8/12/2018",
        "name": "foo",
        "win/lose": "lose"
    },
    {
        "date": "8/12/2018",
        "name": "foo1",
        "win/lose": "win"
    },
];
var winners = data.filter(d => d['win/lose'] == 'win');
console.log(winners);
//If you want to group winners and losers then:
var grouped = data.reduce(function(acc, co) {
var key = co['win/lose'];
if(!acc[key]) {
acc[key] = [];
}
acc[key].push(co);
return acc;
}, {});
console.log(grouped);
This will give you a separate array of winners from the extracted data.
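With the grouped object above you can then read each bucket directly, for example:
console.log(grouped['win']);    // all winning rows
console.log(grouped['lose']);   // all losing rows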
Using Node.js, I generate the array below (2) by parsing multiple JSON files for a particular value.
My JSON files contain a list of IDs with their status: isAvailable or undefined.
In my code I parse through all my JSON files, but look only for the first ID, and get the availability status shown in the picture below. When it's true, it means that the ID is available. As you can see from the file names, each one is the date and hour the JSON was produced.
What I want to achieve is to write a function, or anything simple, that goes through the array you can see in the picture.
Example:
We can see that the status is available for the first file; I want to recover the first file name with status available:
("{ fileName: '2017-03-17T11:39:36+01:00',
Status: Available }"
When the status stops being available, which in our example would be here ({ fileName: '2017-04-06T11:19:17+02:00', contents: undefined }), get:
{ fileName: '2017-04-06T11:19:17+02:00', Status: unavailable }
(2)
Here is the part of my code where I generate this array:
Promise.mapSeries(filenames, function(fileName) {
    var contents = fs
        .readFileAsync("./availibility/" + fileName, "utf8")
        .catch(function ignore() {});
    return Promise.join(contents, function(contents) {
        return {
            fileName,
            contents
        };
    });
}).each(function(eachfile) {
    if (eachfile.contents) {
        var jsonobject = JSON.parse(eachfile.contents);
        if (jsonobject && jsonobject.hasOwnProperty('messages')) {
            // console.log(jsonobject.messages[0].message.results[2]);
            eachfile.contents = jsonobject.messages[0].message.results[1].isAvailable;
        }
    }
    eachfile.fileName = eachfile.fileName.substring('revision_'.length, (eachfile.fileName.length - 5));
    console.log(eachfile);
});
Can someone help me, please?
Thank you.
Suppose you have an array:
[
{
filename : "2017-03-23 00:00:00",
contents : true
},
...
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : true
}
]
where the ... represents a long stream of objects where the contents value is true.
You want to end up with a list of objects without consecutive objects having the same contents value, meaning the result would look like:
[
{
filename : "2017-03-23 00:00:00",
contents : true
},
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : true
}
]
I'm going to use jQuery because it is a JavaScript framework I am familiar with, but you should be able to translate it with ease to whatever framework you're using.
function doit(dataArray) {
    var resultList = [];
    var currentContent = "";
    $.each(dataArray, function(index, value) {
        // keep the entry only if its contents differ from the previous kept entry
        if (currentContent != value.contents) {
            resultList.push(value);
            currentContent = value.contents;
        }
    });
    console.log(resultList);
}
Note that you need an array whose data looks like the data in your picture; at the moment, however, you print every row. You might need to add those rows to a new array and then pass that array to this function.
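If you also want the output in the { fileName, Status } shape your question asks for, here is a plain-JavaScript sketch of the same idea (it assumes the entries use the fileName/contents keys produced by your Promise code):
function statusChanges(dataArray) {
    var result = [];
    var previous = {};   // sentinel so the first entry is always kept
    dataArray.forEach(function(entry) {
        if (entry.contents !== previous) {
            result.push({
                fileName: entry.fileName,
                Status: entry.contents ? 'Available' : 'unavailable'
            });
            previous = entry.contents;
        }
    });
    return result;
}
console.log(statusChanges(yourArray));   // yourArray is the array you already generate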
I am writing to a JSON file in CasperJS and am trying to add new objects to it.
The JSON file looks like:
{ "visited": [
{
"id": "258b5ee8-9538-4480-8109-58afe741dc2f",
"url": "https://................"
},
{
"id": "5304de97-a970-48f2-9d3b-a750bad5416c",
"url": "https://.............."
},
{
"id": "0fc7a072-7e94-46d6-b38c-9c7aedbdaded",
"url": "https://................."
}]}
The code to add to the array is
var data;
if (fs.isFile(FILENAME)) {
data = fs.read(FILENAME);
} else {
data = JSON.stringify({ 'visited': [] });
}
var json = JSON.parse(data);
json.visited.push(visiteddata);
data = JSON.stringify(json, null, '\n');
fs.write(FILENAME, data, "a");
This starts off by adding a new { "visited": [ ] } array, with the first couple of objects, below the existing { "visited": [ ] } array, and subsequently the script breaks because the JSON is no longer valid.
Can anybody point me in the right direction? Thank you in advance.
You have a JSON file containing some data.
You:
Read that data
Modify that data
Append the modified version of that data to the original file
This means the file now has the original data and then, immediately after it, a near identical copy with a little bit added.
You don't need the original. You only need the new version.
You need to write to the file instead of appending to it.
Change the 'a' flag to 'w'.
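Applied to your snippet, the last line becomes:
fs.write(FILENAME, data, "w");   // "w" replaces the file contents instead of appending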