I have a JSON file that I'm reading with JavaScript. I can print it out in the console, but I have to manually hard-code the number of objects in the JSON file. The file is really simple, with only 3 objects. I would like to create a function that checks how many objects are in the JSON file.
JSON code:
{
  "items": [
    {
      "fname": "Kali",
      "lname": "Flower",
      "age": "19"
    },
    {
      "fname": "JD",
      "lname": "Wyatt",
      "age": "19"
    }
  ]
}
I'm trying to write a JavaScript function that shows how many objects are in it.
Use JSON.parse to convert the file contents to an object, then use .length to get the size of the items array:
JSON.parse(fs.readFileSync(file)).items.length
Explained:
const fs = require("fs");                      // Node's file-system module
const file = "./data.json";                    // path to your JSON file
const content = fs.readFileSync(file, "utf8"); // read the file content as a string
const obj = JSON.parse(content);               // convert the string to an object
const length = obj.items.length;               // number of objects in "items"
I recently exported all my user data from Firebase, and now I want to reformat the JSON file to keep only the relevant fields I need for my data model.
The file I exported from Firebase is currently structured like this:
{
  "Users": {
    "00uniqueuserid3": {
      "1UserName": "Pusername",
      "2Password": "password",
      "3Email": "email#gmail.com",
      "4City": "dubai"
    }
  }
}
The issue is that the JSON file has over 5,000 users, and I can't possibly format them manually. Is there any JavaScript script or tool I can use to reformat all the data in the file? I would like to format each user like this:
{"id": uniqueid , "name": name, "email": email, "city": city}
You can create a new Node.js project (npm init -y), read the JSON file, and reshape it with plain JS. If you also want the result in MongoDB, install mongodb and insert the reshaped array:
const exportedData = require("./myFirebaseExport.json"); // require parses the JSON for you

const data = Object.values(exportedData.Users);
const parsedData = data.map((d) => {
  // modify the mapped object as required
  return {
    name: d["1UserName"], // replace with your field names
    email: d["3Email"],
    city: d["4City"],
  };
});

// Optional: insert into MongoDB (must run inside an async function)
// const db = ... // <-- Mongo client database handle
// await db.collection("users").insertMany(parsedData);
I created a JSON file as follows:
{
"fooditems" : [
{
"name": "pizza",
"type": "fastfood",
"price": 10
},
{
"name": "apple",
"type": "fruit",
"price": 1
}
]
}
Then I created a JS file to read the JSON file:
const data = require("./data.json");
data1 = JSON.parse(data);
data1.foodData.forEach( foodItem => console.log(foodItem));
When I run the JS file, I get an error for the JSON file:
SyntaxError: Unexpected token o in JSON at position 1
    at JSON.parse
You don't need to parse data, since require has already parsed it into an object. The following should work:
const data = require("./data.json");
data.fooditems.forEach( foodItem => console.log(foodItem));
Note that foodData was changed to fooditems based on the contents of the data.json file.
Your initial JSON contains "fooditems", but in the JS file you are trying to access "foodData". Change "foodData" to "fooditems" and it should work.
I think you are trying to access an invalid object key on the last line of your JS file.
Instead of
data1.foodData
put
data1.fooditems
I have an existing service that creates files from strings and zips them up. It currently takes JSON as a string, like this:
data: ${ JSON.stringify(data) }
which works. However, I want to express that data as a CSV in the same way.
Assume the data is:
[
  {
    "test1": "1",
    "test2": "2",
    "test3": "3"
  },
  {
    "test1": "1",
    "test2": "2",
    "test3": "3"
  }
]
I have found many good Node JSON-to-CSV libraries and ended up using json2csv. I have found that I can successfully create a CSV file for one line of data, but not two, as follows:
// works (header row)
const test = "\"test1\",\"test2\",\"test3\"";
const csvStr = `data:text/csv;charset=UTF-8, ${ test }`;
// fails (header row + 1 data row)
const test = "\"test1\",\"test2\",\"test3\"\n\"test1\",\"test2\",\"test3\"";
const csvStr = `data:text/csv;charset=UTF-8, ${ test }`;
Based on these tests, I believe the issue is with how the newline / carriage return is being handled, if this is even possible. Does anyone know what I might be doing wrong here? Is it possible to express a CSV file on a single line with line breaks?
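Not an authoritative answer, but one likely culprit: a raw newline is not a valid character inside a data: URI, so it has to be percent-encoded. A sketch using encodeURIComponent (which turns \n into %0A):

```javascript
// Percent-encode the CSV body so the line break survives inside the data: URI.
const csvBody = "\"test1\",\"test2\",\"test3\"\n\"test1\",\"test2\",\"test3\"";
const csvStr = `data:text/csv;charset=UTF-8,${encodeURIComponent(csvBody)}`;
// The "\n" is now "%0A", so the whole URI stays on a single line.
```

Whatever consumes the URI (for example a browser download link) decodes it back into a real newline.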
I am writing to a JSON file in CasperJS and am trying to add new objects to it.
The JSON file looks like:
{
  "visited": [
    {
      "id": "258b5ee8-9538-4480-8109-58afe741dc2f",
      "url": "https://................"
    },
    {
      "id": "5304de97-a970-48f2-9d3b-a750bad5416c",
      "url": "https://.............."
    },
    {
      "id": "0fc7a072-7e94-46d6-b38c-9c7aedbdaded",
      "url": "https://................."
    }
  ]
}
The code to add to the array is
var data;
if (fs.isFile(FILENAME)) {
data = fs.read(FILENAME);
} else {
data = JSON.stringify({ 'visited': [] });
}
var json = JSON.parse(data);
json.visited.push(visiteddata);
data = JSON.stringify(json, null, '\n');
fs.write(FILENAME, data, "a");
This starts off by adding a new { "visited": [ ] } array, with the first couple of objects, below the existing { "visited": [ ] } array, and subsequently the script breaks because the file is no longer valid JSON.
Can anybody point me in the right direction? Thank you in advance.
You have a JSON file containing some data.
You:
Read that data
Modify that data
Append the modified version of that data to the original file
This means the file now has the original data and then, immediately after it, a near identical copy with a little bit added.
You don't need the original. You only need the new version.
You need to write to the file instead of appending to it.
Change the 'a' flag to 'w'.
I am using Node.js to interpret some JSON data in the format below:
{
"href": "https://localhost/light/0000293D",
"i-object-metadata": [
{
"rel": "temperature",
"val": "244"
}
]
}
I can print the raw data using print(body).
Interpreting the data all works, except for printing the field i-object-metadata:
var obj = JSON.parse(body);
console.log(obj.items); // works well
console.log(obj.i-object-metadata); // error
How can I read a JSON field whose name contains hyphens, like i-object-metadata?
You can't use dot notation here because the key contains hyphens; use bracket notation instead. Note that i-object-metadata is an array, so index into it first:
console.log(obj['i-object-metadata'][0].val); // "244"