Node.js read JSON file - javascript

I am trying to read a JSON file using Node.js.
My code is pretty basic:
var obj = require("./sample.json");
console.log(obj[0]);
The sample.json file contains stringified JSON, like this:
"[{\"sample\":\"good\",\"val\":76159}]"
However, the console.log output is '[', not the first element of the array. I have also tried opening the file the long way, like this:
var fs = require('fs');
var obj;
fs.readFile('sample.json', 'utf8', function (err, data) {
    if (err) throw err;
    obj = JSON.parse(data);
    console.log(obj[0]);
});
But here the output is also '['. Why is the JSON file not being parsed properly? How can I fix it?
Thanks in advance.

Your file should contain:
[{"sample":"good","val":76159}]
If it contains
"[{\"sample\":\"good\",\"val\":76159}]"
Then it is encoded twice. It is still valid JSON, because it is a string, but this JSON does not represent the JavaScript object
[{
    sample: "good",
    val: 76159
}]
but rather a string with the content [{"sample":"good","val":76159}].
If you add a second parse (the first one is implicitly done by the require):
var obj = JSON.parse(require("./sample.json"));
console.log(obj[0]);
then you will see that the correct information is logged.
So your initial problem is that you stored the value the wrong way in ./sample.json.
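For reference, here is a minimal sketch (my own illustration, not from the question) of how stringifying once versus twice produces the two file contents discussed above:
var fs = require('fs');

var data = [{ sample: "good", val: 76159 }];

// Stringify once: the file contains plain JSON.
fs.writeFileSync('sample.json', JSON.stringify(data));
// file content: [{"sample":"good","val":76159}]

// Stringify twice: the file contains a JSON-encoded string.
fs.writeFileSync('sample-double.json', JSON.stringify(JSON.stringify(data)));
// file content: "[{\"sample\":\"good\",\"val\":76159}]"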

[info.json]
[{
    "name": "Young",
    "age": 100,
    "skill": "js"
}]
[main.js]
var jsonObj = require('./info.json');
console.log(jsonObj[0]);
[result]
{ name: 'Young', age: 100, skill: 'js' }

Related

Convert Facebook json file sequences like \u00f0\u009f\u0098\u008a to emoji characters

I have downloaded my Facebook data as json files. The json files for my posts contain emojis, which appear something like this in the json file: \u00f0\u009f\u0098\u008a. I want to parse this json file and extract the posts with the correct emojis.
I can't find a way to load this JSON file into a JavaScript object and then read (and output) the posts with the correct emojis.
(Eventually I will upload these posts to WordPress using its REST API, which I've worked out how to do.)
My program is written in JavaScript and run using Node.js from the command line. I've parsed the file using:
const fs = require('fs')
let filetext = fs.readFileSync(filename, 'utf8')
let jsonObj = JSON.parse(filetext)
However, when I output the data (using something like jsonObj.status_updates.data[0].post), I get strange characters for the emoji, like Happy birthday ├░┬ƒ┬ÿ┬è instead of Happy birthday 😊. This is not a Windows 10 console display issue because I've piped the output to a file also.
I've used the answer Decode or unescape \u00f0\u009f\u0091\u008d to 👍 to change the \uXXXX sequences in the json file to actual emojis before parsing the file. However, then JSON.parse does not work. It gives this message:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
So I'm in a bind: if I convert the \uXXXX sequences before trying to parse the json file, the JavaScript json parser has an error. If I don't convert the \uXXXX sequences then the parsed file in the form of a json object does not provide the correct emojis!
How can I correctly extract data, including emojis, from the json file?
I believe you should be able to do all this in Node.js; here's an example.
I've tested this using Visual Studio Code.
You can try it here: https://repl.it/repls/BrownAromaticGnudebugger
Note: I've updated processMessage as per @JakubASuplicki's very helpful comments to only look at string properties.
index.js
const fs = require('fs')
let filename = "test.json";
let filetext = fs.readFileSync(filename, "utf8");
let jsonObj = JSON.parse(filetext);
console.log(jsonObj);
function decodeFBString(str) {
    let arr = [];
    for (var i = 0; i < str.length; i++) {
        arr.push(str.charCodeAt(i));
    }
    return Buffer.from(arr).toString("utf8");
}

function processMessages(messageArray) {
    return messageArray.map(processMessage);
}

function processMessage(message) {
    return Object.keys(message).reduce((obj, key) => {
        obj[key] = (typeof message[key] === "string") ? decodeFBString(message[key]) : message[key];
        return obj;
    }, {});
}
let messages = processMessages(jsonObj.messages);
console.log("Input: ", jsonObj.messages);
console.log("Output: ", messages);
test.json
{
    "participants": [
        {
            "name": "Philip Marlowe"
        },
        {
            "name": "Terry Lennox"
        }
    ],
    "messages": [
        {
            "sender_name": "Philip Marlowe",
            "timestamp_ms": 1546857175,
            "content": "Meet later? \u00F0\u009F\u0098\u008A",
            "type": "Generic"
        },
        {
            "sender_name": "Terry Lennox",
            "timestamp_ms": 1546857177,
            "content": "Excellent!! \u00f0\u009f\u0092\u009a",
            "type": "Generic"
        }
    ]
}
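Why the decoding works: each \uXXXX escape in the Facebook export is a single UTF-8 byte stored as its own code point, so decodeFBString collects the code points back into a byte buffer and decodes that buffer as UTF-8. A tiny sketch of the idea on one emoji (my own illustration):
// The four code points \u00f0\u009f\u0098\u008a are really the
// four UTF-8 bytes of U+1F60A (😊).
const bytes = Buffer.from([0xf0, 0x9f, 0x98, 0x8a]);
console.log(bytes.toString("utf8")); // 😊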

d3.js parse JSON-like data from file

I am struggling with reading in data from a JSON file.
The JSON file that I want to read has the following structure:
{"name":"Andy Hunt",
"title":"Big Boss",
"age": 68,
"bonus": true
}
{"name":"Charles Mack",
"title":"Jr Dev",
"age":24,
"bonus": false
}
However, as far as I understand, d3.js can only parse JSON arrays that have the following structure:
[
{"name":"Andy Hunt",
"title":"Big Boss",
"age": 68,
"bonus": true
},
{"name":"Charles Mack",
"title":"Jr Dev",
"age":24,
"bonus": false
}
]
I am using this code snippet trying to display the first data object just to see if it works:
d3.json("/data/sample_data.json", function(data) {
console.log(data[0]);
});
This works for the second JSON structure, but not for the first one.
Now my question: is it possible to somehow read in this data with d3.js, or do I have to write a program that reformats the whole JSON file into a JSON array?
EDIT: Turns out the data I wanted to use was formatted as NDJSON. So there is currently no way to read the data natively with d3.js other than parsing it as text and reformatting it on the fly, the way @Mark explained.
As @Shashank points out, your JSON is simply invalid. This has nothing to do with d3 but rather with JavaScript and acceptable JSON formats. So, how can we fix it? You could do it in your d3/JavaScript code itself instead of pre-processing the files.
// how many lines of JSON make an object?
var chunks = 5,
    data = [];

// download file as TEXT
d3.text("data.txt", function(error, text) {
    if (error) throw error;
    // split on new line
    var txt = text.split("\n");
    // iterate it
    for (var i = 0; i < txt.length; i += chunks) {
        var block = "";
        // grab blocks of 5 lines
        for (var j = 0; j < chunks; j++) {
            block += txt[i + j];
        }
        // parse as JSON and push into data ARRAY
        data.push(JSON.parse(block));
    }
    console.log(data);
});
Here it is running on a plunker.
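If the records don't always span exactly five lines, a more forgiving variation (my own sketch, not part of the original answer) is to accumulate lines and attempt a parse after each one, emitting a record whenever the accumulated block becomes valid JSON:
d3.text("data.txt", function(error, text) {
    if (error) throw error;
    var data = [];
    var block = "";
    text.split("\n").forEach(function(line) {
        block += line;
        try {
            // if the accumulated text parses, we have a complete record
            data.push(JSON.parse(block));
            block = "";
        } catch (e) {
            // not a complete JSON object yet; keep accumulating
        }
    });
    console.log(data);
});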
The format of the JSON file you're trying to read is incorrect. JSON (as per the standard) can be a collection of name/value pairs, i.e. an object, OR a list of objects, i.e. an array of objects. Refer to this: http://json.org/
If your JSON file has the following data (notice the format: a single object):
{
    "name": "Andy Hunt",
    "title": "Big Boss",
    "age": 68,
    "bonus": true,
    "value": 500
}
d3.json will provide the following result:
=> {name: "Andy Hunt", title: "Big Boss", age: 68, bonus: true, value: 500}
You've already tried it with the array structure.
I'd suggest you change the JSON file to an array of objects. Hope this clears it up.

How to read the values inside the json array

I have a json like
var obj = {
    "address": {
        "addlin1": "",
        "addlin2": ""
    },
    "name": "sam",
    "score": [{
        "maths": "ten",
        "science": "two",
        "pass": false
    }]
};
Now, when I am trying to modify the JSON, I create a new variable and copy values from the above JSON into it, like this:
var data=JSON.parse(obj);
var json={};
json['name']=data['name'];
json['address']={};
json['address']['addressline1']=data['address']['addlin1'];
json['address']['addressline2']=data['address']['addlin2'];
json['marks']={};
json['maths']=data['score']['maths'];
For name and address I was able to form the JSON as I was expecting, but for marks I was not. Maybe it is because in obj the score values are inside [ ].
So when I console.log the json, it looks like this:
"name":"sam",
"address":{
"addresslin1":"",
"addresslin2":""
},
"score":{}
}
So how can I also read the values inside the [ ] array?
Can someone help me?
Thanks
json['maths'] = data['score'][0]['maths'];
If you're not sure that data['score'] has any elements, you can check before reading the maths key:
if (data['score'].length) {
    json['maths'] = data['score'][0]['maths'];
}
data['score'] is an array, so you can't read it like this:
json['maths'] = data['score']['maths'];
You have to read it like this:
json['maths'] = data['score'][0].maths;
Also, obj is not a JSON, but a JavaScript object. You can use it directly.
json['maths'] = obj['score'][0].maths;
A JSON is a string, like the result of JSON.stringify(obj):
var json = '{"address":{"addlin1":"","addlin2":""},"name":"sam","score":[{"maths":"ten","science":"two","pass":false}]}';
Create another object json2 to contain the score data, then assign it to json.
For example:
var json = {};
var json2 = {};
json2[0] = 1;
json2[1] = 2;
json[0] = json2;

How can I write an array to a file in nodejs and keep the square brackets?

I want to write a matrix to a .js file. When I use console.log(matrix) everything is fine but when I write it to the file it comes out differently.
var fs = require("fs");
var matrix = new Array(10);
for (var i = 0; i < matrix.length; i++) matrix[i] = [];
for (var i = 0; i < 100; i++) {
    var n = i % 10;
    matrix[n].push(i);
}
console.log(matrix);
//write it as a js array and export it (can't get brackets to stay)
fs.writeFile("./matrixtest.js", matrix, function(err) {
if(err) {
console.log(err);
}
else {
console.log("Output saved to /matrixtest.js.");
}
});
So the console.log gives me [[0,10,20,30,...100],...,[1,11,21,31,...91]] and so on. But opening up matrixtest.js it's only this:
0,10,20,30,40,50...
All the numbers separated by commas with no brackets. How do I prevent it from converting to that format? Thank you.
When you write an array to a file, it gets converted to a string, because fs cannot write an array as-is; that is why it loses its format. You can convert an array to a string like this and check:
var array = [1, 2, 3, 4];
console.log(array.toString());
// 1,2,3,4
So, to solve this problem, you might want to convert it to a JSON string like this:
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function(err) {
    ...
});
Stringify it (JSON.stringify) before saving it, then parse it (JSON.parse) when reading it back in.
fs.writeFile("./matrixtest.js", JSON.stringify(matrix), function(err) {
if(err) {
console.log(err);
}
else {
console.log("Output saved to /matrixtest.js.");
}
});
Then, when reading it back in:
var matrix = JSON.parse(contents);
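For completeness, a minimal sketch of the read side (assuming the same ./matrixtest.js path from the question):
fs.readFile("./matrixtest.js", "utf-8", function(err, contents) {
    if (err) throw err;
    var matrix = JSON.parse(contents);
    console.log(matrix); // the nested arrays, brackets intact
});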
The system doesn't know that you want to store the array in the file with its [] brackets.
It just puts the contents of the array into the file.
So, first you need to convert the data you want to write into JSON format.
The JSON.stringify() method converts a JavaScript value to a JSON string.
Then, write the JSON string to the file.
Then, while reading, use JSON.parse(). The JSON.parse() method parses a JSON string, constructing the JavaScript value or object described by the string:
fs.writeFile('./matrix.js', JSON.stringify(matrix), function (err) {
    if (err) {
        console.log(err);
    }
});

fs.readFile('./matrix.js', function (err, data) {
    console.log(JSON.parse(data));
    // Do whatever you want with JSON.parse(data)
});

Write objects into file with Node.js

I've searched all over stackoverflow / google for this, but can't seem to figure it out.
I'm scraping social media links of a given URL page, and the function returns an object with a list of URLs.
When I try to write this data into a different file, it outputs to the file as [object Object] instead of the expected:
[ 'https://twitter.com/#!/101Cookbooks',
'http://www.facebook.com/101cookbooks']
as it does when I console.log() the results.
This is my sad attempt to read and write a file in Node, trying to read each line (the URL) and pass it through a function call request(line, gotHTML):
fs.readFileSync('./urls.txt').toString().split('\n').forEach(function (line) {
    console.log(line);
    var obj = request(line, gotHTML);
    console.log(obj);
    fs.writeFileSync('./data.json', obj, 'utf-8');
});
For reference, the gotHTML function:
function gotHTML(err, resp, html) {
    var social_ids = [];
    if (err) {
        return console.log(err);
    } else if (resp.statusCode === 200) {
        var parsedHTML = $.load(html);
        parsedHTML('a').map(function(i, link) {
            var href = $(link).attr('href');
            for (var i = 0; i < socialurls.length; i++) {
                if (socialurls[i].test(href) && social_ids.indexOf(href) < 0) {
                    social_ids.push(href);
                }
            }
        });
    }
    return social_ids;
}
Building on what deb2fast said, I would also pass a couple of extra parameters to JSON.stringify() to get it to pretty-print:
fs.writeFileSync('./data.json', JSON.stringify(obj, null, 2) , 'utf-8');
The second param is an optional replacer function which you don't need in this case so null works.
The third param is the number of spaces to use for indentation. 2 and 4 seem to be popular choices.
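For illustration, a quick sketch (my own example) of the difference the indentation parameter makes:
const obj = { a: 1, b: [2, 3] };
console.log(JSON.stringify(obj));
// {"a":1,"b":[2,3]}
console.log(JSON.stringify(obj, null, 2));
// {
//   "a": 1,
//   "b": [
//     2,
//     3
//   ]
// }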
obj is an array in your example.
fs.writeFileSync(filename, data, [options]) requires either a String or a Buffer as the data parameter. See the docs.
Try to write the array in a string format:
// writes 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks'
fs.writeFileSync('./data.json', obj.join(',') , 'utf-8');
Or:
// writes ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks']
var util = require('util');
fs.writeFileSync('./data.json', util.inspect(obj) , 'utf-8');
Edit: the reason you see the array in your example is that Node's implementation of console.log doesn't just call toString, it calls util.format; see the console.js source.
If you're getting [object Object] then use JSON.stringify:
fs.writeFile('./data.json', JSON.stringify(obj), 'utf-8');
It worked for me.
In my experience JSON.stringify is slightly faster than util.inspect.
I had to save the result object of a DB2 query as a JSON file. The query returned an object of 92k rows, and the conversion took a very long time to complete with util.inspect, so I did the following test, writing the same 1000-record object to a file with both methods.
JSON.stringify
fs.writeFile('./data.json', JSON.stringify(obj, null, 2), function (err) { if (err) throw err; });
Time: 3:57 (3 min 57 sec)
Result's format:
[
  {
    "PROB": "00001",
    "BO": "AXZ",
    "CNTRY": "649"
  },
  ...
]
util.inspect
var util = require('util');
fs.writeFile('./data.json', util.inspect(obj, false, 2, false), function (err) { if (err) throw err; });
Time: 4:12 (4 min 12 sec)
Result's format:
[ { PROB: '00001',
    BO: 'AXZ',
    CNTRY: '649' },
  ...
]
Could you try doing JSON.stringify(obj)? Like this:
Like this:
var stringify = JSON.stringify(obj);
fs.writeFileSync('./data.json', stringify, 'utf-8');
Just in case anyone else stumbles across this: I use the fs-extra library in Node and write JavaScript objects to a file like this:
const fse = require('fs-extra');
fse.outputJsonSync('path/to/output/file.json', objectToWriteToFile);
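fs-extra also provides a matching reader, so (assuming the same path) the object can be round-tripped:
const objectReadBack = fse.readJsonSync('path/to/output/file.json');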
Further to @Jim Schubert's and @deb2fast's answers:
To write out large objects (on the order of ~100 MB or more), you'll need to use for...of as shown below, and adapt it to your requirements.
const fsPromises = require('fs').promises;

const sampleData = { firstName: "John", lastName: "Doe", age: 50, eyeColor: "blue" };

const writeToFile = async () => {
    for (const dataObject of Object.keys(sampleData)) {
        console.log(sampleData[dataObject]);
        await fsPromises.appendFile("out.json", dataObject + ": " + JSON.stringify(sampleData[dataObject]));
    }
};

writeToFile();
Refer to https://stackoverflow.com/a/67699911/3152654 for the full reference on Node.js limits.
