I am new to the JS ecosystem, and a requirement is to create a JSON object dynamically at run time and write it to a file.
{
    "prerequisite": [
        {
            "url1": "",
            "step1": {},
            "step2": {}
        }
    ],
    "PF": [
        {},
        {}
    ]
}
I tried using new Object(), but I'm not sure how to build a complex structure like this (complex to me, at least).
For writing it to a file, I understand that fs can be used.
I need help building this JSON object at run time.
I guess you'd like to do something like this. I'm not sure how you want to make it dynamic, but if you have a JSON object you can always add properties to it like this: json_obj.something = "anything"
The file is saved (overwritten) every time the script runs. Save this code to a file (e.g. filename.js) and run it with node filename.js.
Also, you must stringify the object, since the fs "data" argument must be of type string or an instance of Buffer, TypedArray, or DataView.
const fs = require("fs")
const FILE = "./json_obj.json"

const json_obj = {
    prerequisite: [
        {
            url1: "",
            step1: {},
            step2: {},
        },
    ],
    PF: [{}, {}],
}

fs.writeFile(FILE, JSON.stringify(json_obj), (err) => {
    if (err) {
        console.error(err)
        return
    }
})
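If the structure really has to be assembled at run time (for example from values you only know while the script runs), you can start from an empty skeleton and push into the arrays as the data arrives. Here is a minimal sketch along those lines; the URL and step values are made-up placeholders, and fs.promises is just one way to avoid the callback:

const fs = require("fs")
const FILE = "./json_obj.json"

// Start from an empty skeleton and fill it in as values become available
const json_obj = { prerequisite: [], PF: [] }

// Hypothetical helper: adds one prerequisite entry built from runtime values
function addPrerequisite(url, steps) {
    // steps is expected to be an object like { step1: {...}, step2: {...} }
    json_obj.prerequisite.push(Object.assign({ url1: url }, steps))
}

addPrerequisite("https://example.com", { step1: { action: "open" }, step2: { action: "click" } })
json_obj.PF.push({}, {})

// Stringify with indentation so the file stays readable
fs.promises
    .writeFile(FILE, JSON.stringify(json_obj, null, 2))
    .catch(console.error)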
The title may be misleading, but I'm not really sure how to ask this question correctly. Here is the problem: I'd like to query my own API (not created yet, so I made placeholder data) for global settings which might change in the future, so that I only need to rebuild the website instead of editing it manually. I want to create a source node called CmsSettings and expose it to GraphQL (with a structure similar to site.siteMetadata), but I don't know how to achieve that. What I have achieved so far is a source node called allCmsSettings which has my data as an object in the nodes array.
exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
    const { createNode } = actions;

    const myData = {
        key: 123,
        app_title: `The foo field of my node`,
        ...
    }

    const nodeContent = JSON.stringify(myData);

    const nodeMeta = {
        id: createNodeId(`my-data${myData.key}`),
        parent: null,
        children: [],
        internal: {
            type: `CmsSettings`,
            mediaType: `text/html`,
            content: nodeContent,
            contentDigest: createContentDigest(myData)
        }
    }

    const node = Object.assign({}, myData, nodeMeta);
    createNode(node);
}
Here is the query used to get the data of the source node
allCmsSettings {
    edges {
        node {
            id
            app_title
            ...
        }
    }
}
Creating a query results in an array of results (which I know is the consequence of creating source nodes), but I'd like to create the source so that I could query it like this instead:
CmsSettings {
app_title
app_keywords
app_descriptions
app_logo_path
brand_name
...
}
You get the point. I was browsing the Gatsby Node APIs but can't find how to achieve this.
Thank you for your help
Never mind, the answer is pretty simple. If you are new to Gatsby just like me: the sourceNodes export creates two GraphQL fields for you, one prefixed with all (allCmsSettings) and one that is simply the camel-cased node type. The thing that I wanted to make is already there and is queryable with:
cmsSettings {
app_title
app_keywords
app_descriptions
app_logo_path
brand_name
...
}
Notice the lowercase first letter, even though the type was declared as CmsSettings. It seems that Gatsby really does some magic under the hood.
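For completeness, here is a rough sketch of how that node could then be consumed from a page component. The field names (app_title, brand_name) are just the ones from the question, so adjust them to whatever your settings node actually contains:

import React from "react";
import { graphql } from "gatsby";

// Page component rendering one field from the single CmsSettings node
const IndexPage = ({ data }) => <h1>{data.cmsSettings.app_title}</h1>;

// Page query against the node created in gatsby-node.js
export const query = graphql`
  query {
    cmsSettings {
      app_title
      brand_name
    }
  }
`;

export default IndexPage;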
I would like to store accounts in a json file.
Something like: accounts: [{ user: user1, email: email1 }, { user: user2, email: email2 }]
JavaScript file:
Accounts = {
Nickname: form.getElementsByTagName('input')[0].value,
Email: form.getElementsByTagName('input')[3].value
};
var json = JSON.stringify(Accounts);
fs.appendFile('Accounts.json', json, function(err){});
When I add a second user, this code creates a new object and the file ends up looking like this.
JSON file:
{"NickName": "user1", "Email": "Email1"}{"NickName": "user2", "Email": "Email2"}
Then, when I try to read the file and parse it, I receive an "Unexpected token {" error.
I think the JSON file should look like this:
{
{"Name":"value"},
{"Name2":"value2"}
}
I fixed the issue.
Here is my solution, mixed with Michael Brune's solution.
// Read Accounts.json
const FileContent = fs.readFileSync('Accounts.json', 'utf-8');

// Check if the JSON file is empty
let ParsedAccounts;
if (FileContent.length !== 0) {
    ParsedAccounts = JSON.parse(FileContent);
}
else {
    ParsedAccounts = [];
}

// Push the new Accounts object (built from the form above)
ParsedAccounts.push(Accounts);

const NewData = JSON.stringify(ParsedAccounts, null, 4);

// Write the updated array back to Accounts.json
fs.writeFileSync('Accounts.json', NewData);
Basically, I push the new data into ParsedAccounts and then write it back to the JSON file.
There may be another way, but if your file is pretty small, then try it like this:
Accounts.json:
{
"other": "other data"
}
index.js
const fileData = fs.readFileSync('Accounts.json');
let parsedFileData;

try {
    parsedFileData = JSON.parse(fileData);
} catch (error) {
    if (error instanceof SyntaxError) {
        // Create empty object to append to since parse was not a success
        parsedFileData = {};
        // Create backup of old, possibly corrupt file
        fs.writeFileSync('Accounts.backup.json', fileData);
    }
}

const newFileData = JSON.stringify({
    ...parsedFileData,
    ...newDataToAppend,
}, null, 4);

fs.writeFileSync('Accounts.json', newFileData);
First, parse the file.
Assign the new and old data to one object.
Then convert that to a string with JSON.stringify.
The null and 4 arguments are there to write a nicely indented, pretty file.
Then write it back to the file directly. A more reusable variant, kept as an array of accounts, is sketched below.
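For what it's worth, the same read-parse-push-write steps can be wrapped in a small helper that keeps the file as an array of account objects (which matches the original goal of a list of users). This is only a sketch, assuming Accounts.json either does not exist yet, is empty, or contains a valid JSON array:

const fs = require('fs');

// Appends one account object to a JSON file holding an array of accounts
function appendAccount(path, account) {
    let accounts = [];
    try {
        const raw = fs.readFileSync(path, 'utf-8');
        if (raw.trim().length > 0) {
            accounts = JSON.parse(raw);
        }
    } catch (error) {
        // Missing file or unparsable content: start from an empty array
    }
    accounts.push(account);
    fs.writeFileSync(path, JSON.stringify(accounts, null, 4));
}

appendAccount('Accounts.json', { Nickname: 'user1', Email: 'email1' });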
The backend of my web app, written in Node.js, interacts with a JSON file in a specific format that I thought was not so complex, but apparently it is.
The structure of my JSON file is as follows:
{
    "data": [
        {
            "somefield": "ioremipsum",
            "somedate": "2018-08-23T11:48:00Z",
            "someotherdate": "2018-08-23T13:43:00Z",
            "somethingelse": "ioremipsum",
            "files": [
                {
                    "specificfieldinarray": "ioremipsum",
                    "specificotherfieldinarray": "ioremipsum"
                },
                {
                    "specificfieldinarray": "ioremipsum",
                    "specificotherfieldinarray": "ioremipsum"
                },
                {
                    "specificfieldinarray": "ioremipsum",
                    "specificotherfieldinarray": "ioremipsum"
                }
            ]
        }
    ]
}
I try to load this into a JS object like this:
const file = require('specificJsonFile.json');
let fileList = file;
I need to loop through my 'files' array for further processing, but unfortunately my JS object looks like this:
{ data:
[ { somefield: "ioremipsum",
somedate : "2018-08-23T11:48:00Z",
someotherdate : "2018-08-23T13:43:00Z",
somethingelse:"ioremipsum",
files: [Array] } ] }
Please forgive me if this is obvious, for I am still a beginner with JS.
That's only how console.log displays deep objects. To get deeper output, you can use util.inspect:
const util = require('util');
console.log(util.inspect(yourObject, {showHidden: false, depth: null}));
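An alternative, if you prefer not to import util, is console.dir, which accepts the same depth option:

// Prints the whole structure instead of collapsing nested arrays to [Array]
console.dir(yourObject, { depth: null });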
To loop over each data entry's files, simply loop over data and then over its files:
yourObject.data.forEach(d => {
d.files.forEach(file => console.log(file));
});
It looks like there is nothing wrong there and the console is abbreviating the log.
Try accessing the files list with the following code:
const filesList = file.data[0].files
and then
console.log(filesList) to check that it's actually working.
Hope it helps!
let fileList = file.data[0].files;
This will give you just your files array.
You can console.log(fileList) or do whatever you like with the data.
Based on your comment, try the of keyword instead of the in keyword to get the behaviour you expected:
for (let file of fileList){
console.log(file);
}
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of
You can use for...in:
for (item in fileList.data) {
    for (file in fileList.data[item].files) {
        let data = fileList.data[item].files[file];
        // process the data
    }
}
I have a CSV file and I want to parse it using PapaParse. How do I do this properly?
I have so far:
Papa.parse(fileInput, {
download: true,
complete: function(results) {
console.log(results.data);
console.log(results.errors);
}
});
However, is there a better way to do this? Is this the proper way to get errors? The documentation didn't emphasize download: true or anything so I was wondering if there are any experts on this subject here.
EDIT: Also, am I supposed to further process the parsed data with Papa Parse or do it in React? For instance, if I have multiple rows in my data file which share a similar name reference, should I initially parse the file so that it groups all those references together, and how would I go about doing this?
For instance,
Date, Name, Win/Lose
I want to group all the winners together. How do I do that?
The way you are using Papa Parse is for a remote CSV file.
download: true is there to download that remote file.
With Papa Parse, the parse result object passed to complete is the way to get errors, data, and meta.
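If instead you are parsing a local file picked through a file input (rather than a remote URL), you can drop download: true and pass the File object straight to Papa Parse. A minimal sketch; the element id is made up, and header: true simply turns each row into an object keyed by the header row:

// Assumes an <input type="file" id="csvInput"> element on the page
const localFile = document.getElementById('csvInput').files[0];

Papa.parse(localFile, {
    header: true,
    skipEmptyLines: true,
    complete: function(results) {
        console.log(results.data);   // parsed rows
        console.log(results.errors); // row-level parse errors
        console.log(results.meta);   // delimiter, fields, etc.
    },
    error: function(err) {
        // Called when the file itself cannot be read
        console.error(err);
    }
});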
// If parsed with { header: true }, results.data looks like this:
var data = [
    {
        "date": "8/12/2018",
        "name": "foo",
        "win/lose": "win"
    },
    {
        "date": "8/12/2018",
        "name": "foo",
        "win/lose": "lose"
    },
    {
        "date": "8/12/2018",
        "name": "foo1",
        "win/lose": "win"
    },
];

var winners = data.filter(d => d['win/lose'] == 'win');
console.log(winners);
// If you want to group winners and losers:
var grouped = data.reduce(function(acc, co) {
    var key = co['win/lose'];
    if (!acc[key]) {
        acc[key] = [];
    }
    acc[key].push(co);
    return acc;
}, {});
console.log(grouped);
This will give you a separate array of winners from the extracted data, and grouped will hold the rows grouped by their win/lose value.
I am writing to a JSON file in CasperJS and am trying to add new objects to it.
The JSON file looks like this:
{ "visited": [
{
"id": "258b5ee8-9538-4480-8109-58afe741dc2f",
"url": "https://................"
},
{
"id": "5304de97-a970-48f2-9d3b-a750bad5416c",
"url": "https://.............."
},
{
"id": "0fc7a072-7e94-46d6-b38c-9c7aedbdaded",
"url": "https://................."
}]}
The code to add to the array is
var data;
if (fs.isFile(FILENAME)) {
data = fs.read(FILENAME);
} else {
data = JSON.stringify({ 'visited': [] });
}
var json = JSON.parse(data);
json.visited.push(visiteddata);
data = JSON.stringify(json, null, '\n');
fs.write(FILENAME, data, "a");
This starts off by adding a new { "visited": [] } array with the first couple of objects below the existing { "visited": [] } array, and subsequently the script breaks because the file is no longer valid JSON.
Can anybody point me in the right direction? Thank you in advance.
You have a JSON file containing some data.
You:
Read that data
Modify that data
Append the modified version of that data to the original file
This means the file now has the original data and then, immediately after it, a near identical copy with a little bit added.
You don't need the original. You only need the new version.
You need to write to the file instead of appending to it.
Change the 'a' flag to 'w'.
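Applied to the snippet from the question, the fix would look roughly like this; same logic, only the mode flag changes (and, optionally, a more conventional indentation argument for JSON.stringify):

var data;
if (fs.isFile(FILENAME)) {
    data = fs.read(FILENAME);
} else {
    data = JSON.stringify({ 'visited': [] });
}
var json = JSON.parse(data);
json.visited.push(visiteddata);
// "w" replaces the file contents instead of appending a second copy
fs.write(FILENAME, JSON.stringify(json, null, 2), "w");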