Below is the approach I usually use to safely parse local JSON data in a Node environment, mostly config files and other related data:
const fs = require('fs')
let localDb
let parsedData
try {
localDb = fs.readFileSync('./file.json', 'utf8')
parsedData = JSON.parse(localDb)
} catch (err) {
throw err
}
exports.data = parsedData
In the end, I export the parsed data from the JavaScript file for usage. While this works perfectly fine, I'm curious to know if there are better ways to do the same thing with a functional approach.
Just wrap your code inside a function and export the return of that function:
const fs = require('fs')
function parseDBData(name, coding) {
  let localDb;
  let parsedData;
  try {
    localDb = fs.readFileSync(name, coding);
    parsedData = JSON.parse(localDb);
  } catch (err) {
    throw err;
  }
  return parsedData;
}
exports.data = parseDBData('./file.json', 'utf8');
P.S. With Node you can get a JSON file's contents directly through require:
exports.data = require('./file.json');
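One caveat worth keeping in mind with the require approach: require() caches the parsed object per process, so later changes to file.json are not picked up by subsequent calls. A rough sketch of forcing a fresh read is below (clearing the cache entry is shown only for illustration; a plain fs read is usually cleaner):
// require() returns the cached object on repeated calls;
// deleting the cache entry forces a re-read on the next require().
delete require.cache[require.resolve('./file.json')];
exports.data = require('./file.json');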
This works and pipes downloaded data into a file:
const fs = require('fs')
await fetch(downloadURL).then(res => {
const dest = fs.createWriteStream('/tmp/output.xlsx');
res.body.pipe(dest);
});
This also works:
const buffer = await page.evaluate(({downloadURL}) =>
{
return fetch(downloadURL, {
method: 'GET'
}).then(r => r.text());
}, {downloadURL});
But to read a binary stream inside page.evaluate(), I need to replace r => r.text() with the res.body.pipe approach from the first code snippet. When I do that:
const fs = require('fs')
const buff = await page.evaluate(({ downloadURL, fs }) => {
return fetch(downloadURL, fs, {
method: 'GET'
}).then(res => {
const dest = fs.createWriteStream('/tmp/output.xlsx');
res.body.pipe(dest);
});
}, { downloadURL, fs });
The error I get is TypeError: fs.createWriteStream is not a function
I don't think it has anything to do with "fs" per se; my bet is that fs is somehow out of scope inside this function structure.
How do I fix this last snippet so the data read is piped to a file?
My gut tells me it's some syntactical fix someone more skilled than I am can do...
thx
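A minimal sketch of one possible workaround (assuming the same Puppeteer page and downloadURL as above): fs only exists in the Node.js process, while the function passed to page.evaluate() runs inside the browser, so the browser side can fetch the bytes and return them base64-encoded, and Node then writes them to disk.
const fs = require('fs');

const base64 = await page.evaluate(async ({ downloadURL }) => {
  // runs in the browser: fetch the binary response and encode it as base64
  const res = await fetch(downloadURL, { method: 'GET' });
  const bytes = new Uint8Array(await res.arrayBuffer());
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}, { downloadURL });

// back in Node.js: decode and write the file
fs.writeFileSync('/tmp/output.xlsx', Buffer.from(base64, 'base64'));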
Since zlib has been added to Node.js, I'd like to ask about unzipping .gz files in async/await style, without using streams, one file at a time.
In the code below I am using fs-extra instead of the standard fs, and TypeScript instead of plain JS, but for the answer it doesn't matter whether the code is JS or TS.
import fs from 'fs-extra';
import path from "path";
import zlib from 'zlib';
(async () => {
try {
//folder which is full of .gz files.
const dir = path.join(__dirname, '..', '..', 'folder');
const files: string[] = await fs.readdir(dir);
for (const file of files) {
//read file one by one
const file_content = fs.createReadStream(`${dir}/${file}`);
const write_stream = fs.createWriteStream(`${dir}/${file.slice(0, -3)}`);
const unzip = zlib.createGunzip();
file_content.pipe(unzip).pipe(write_stream);
}
} catch (e) {
console.error(e)
}
})()
For now I have this stream-based code, which works, but among the various Stack Overflow answers I haven't found an example using async/await, only this one, and it also uses streams as far as I can tell.
So is it even possible?
//inside async function
const read_file = await fs.readFile(`${dir}/${file}`)
const unzip = await zlib.unzip(read_file);
//write output of unzip to file or console
I understand that this task will block the main thread. That's OK for me, since I'm writing a simple daily schedule script.
It seems I have figured it out, though I am still not a hundred percent sure about it. Here is an example of the full IIFE:
(async () => {
try {
//folder which is full of .gz files.
const dir = path.join(__dirname, '..', '..', 'folder');
const files: string[] = await fs.readdir(dir);
//parallel run
await Promise.all(files.map(async (file: string, i: number) => {
//let make sure, that we have only .gz files in our scope
if (file.match(/gz$/g)) {
const buffer = await fs.readFile(`${dir}/${file}`);
// using .toString() is a must if you want readable data instead of a Buffer
const data = zlib.unzipSync(buffer, { finishFlush: zlib.constants.Z_SYNC_FLUSH }).toString();
// from here, you can write data to a new file, or parse it
const json = JSON.parse(data);
console.log(json);
}
}))
} catch (e) {
console.error(e)
} finally {
process.exit(0)
}
})()
If you have many files in one directory, you can use await Promise.all(files.map(file => fn(file))) to run the task in parallel, as shown above. Also, in my case I needed to parse JSON, so keep in mind the usual caveats of JSON.parse.
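If you want the decompression itself to be awaitable without the *Sync variant, one option (a sketch along the same lines, not from the original answer) is to promisify zlib.unzip; the actual work then runs on libuv's threadpool instead of blocking the main thread:
const { promisify } = require('util');
const zlib = require('zlib');
const fs = require('fs-extra');

const unzipAsync = promisify(zlib.unzip);

// Hypothetical helper: reads one .gz file and returns its parsed JSON content.
async function readGzJson(filePath) {
  const buffer = await fs.readFile(filePath);
  // unzipAsync resolves with a Buffer; toString() turns it into text
  const text = (await unzipAsync(buffer)).toString();
  return JSON.parse(text);
}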
I have the following problem: I want some questions to be asked in my console, and when I receive the answers, save them in a JSON file and then use them. I am not very familiar with Node.js; I suppose it is something simple, but it does not work for me.
When I try customer.Token = answers[1];, what happens is that "Token" no longer exists in my JSON. But if I use, for example, customer.Token = "Hi";, my JSON file changes perfectly.
I need the answer the user is giving at that moment to be saved.
I've been trying all morning to make this work but I can't find a solution; if someone knows, <3 it would help me a lot.
Here below I leave my code:
const customer = require('./db.json');
const fs = require("fs");
function jsonReader(filePath, cb) {
fs.readFile(filePath, (err, fileData) => {
if (err) {
return cb && cb(err);
}
try {
const object = JSON.parse(fileData);
return cb && cb(null, object);
} catch (err) {
return cb && cb(err);
}
});
}
jsonReader("./db.json", (err, customer) => {
if (err) {
console.log("Error reading file:", err);
return;
}
customer.Token = "HI";
fs.writeFile("./db.json", JSON.stringify(customer), err => {
if (err) console.log("Error writing file:", err);
});
});
var questions = ['Cual es tu nombre?' ,
'Cual es tu Token?' ,
'Cual es tu numero de orden?'
]
var answers = [];
function question(i){
process.stdout.write(questions[i]);
}
process.stdin.on('data', function(data){
answers.push(data.toString().trim());
if(answers.length < questions.length){
question(answers.length);
}else{
process.exit();
}
})
question(0);
and in my JSON:
{"name":"Jabibi","order_count":103,"Token":"HI"}
Although it's possible to get your script working with traditional callbacks, I think switching to promises and modern async/await syntax would be easier to read and follow.
readline (a built-in Node.js module) can be used to get input from the user.
You can use fs/promises instead of fs to take advantage of promises.
It seems like you were trying to create a new customer object (using the user's input) and then write it as JSON to your file system.
The script below writes to a temporary file path. Once tested that the right data is getting correctly written to a file, you can change the file path (I didn't want to overwrite an existing file on your machine).
const fs = require("fs/promises");
const readline = require("readline").createInterface({
input: process.stdin,
output: process.stdout,
});
function getUserInput(displayText) {
return new Promise((resolve) => {
readline.question(displayText, resolve);
});
}
const questions = [
["name", "Cual es tu nombre?"],
["token", "Cual es tu Token?"],
["order_count", "Cual es tu numero de orden?"],
];
async function main() {
const newCustomer = {};
for (const [property, question] of questions) {
const answer = await getUserInput(question.concat("\n"));
newCustomer[property] = answer;
}
console.log("New customer:", newCustomer);
const oldCustomer = await fs
.readFile("./db.json")
.then((data) => data.toString("utf-8"))
.then(JSON.parse);
// You can do something with old customer here, if you need to.
console.log("Old customer:", oldCustomer);
// I don't want to overwrite any existing file on your machine.
// You can change the file path and data below to whatever they should be
// once you've got your script working.
await fs.writeFile(
`./db_temp_${Date.now()}.json`,
JSON.stringify(newCustomer)
);
}
main()
.catch(console.error)
.finally(() => readline.close());
I've been making a project recently and I basically need to check for new text in a text file.
My code was this:
const fs = require('fs');
fs.watch('./file.txt', (event, filename) => {
  fs.readFile('./file.txt', (err, data) => {
    if (err) throw err;
    data = JSON.parse(data);
    console.log(data);
  });
});
It worked great. However, sometimes I have to delete this file for various reasons, and then my code crashes too!
Any idea on how to handle this? Thank you for your answers
Node's built-in module fs doesn't support file deletion detection very well. There is a workaround using a package called nsfw which is a wrapper around a native library that provides much better support for deletion detection.
The API is a bit odd but it is a solid package nonetheless.
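Note that nsfw is a third-party dependency, so it needs to be installed first (for example with npm install nsfw).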
Here is an example of what you're attempting to do using nsfw.
const nsfw = require("nsfw");
const path = require("path");
const fs = require("fs");
const file = path.join(__dirname, "file.txt");
let watcher;
nsfw(
file,
([event, ...restEvents]) => {
switch (event.action) {
case nsfw.actions.DELETED: {
watcher.stop();
return; // or handle this however you need to..
}
default: {
fs.readFile(file, (err, data) => {
if (err) throw err;
try {
data = JSON.parse(data);
console.log(data);
} catch (error) {
console.error(error)
}
});
}
}
}
)
.then((w) => {
watcher = w;
watcher.start()
});
Background
I am doing some experimentation with Node.js and would like to read a JSON object, either from a text file or a .js file (which is better??) into memory so that I can access that object quickly from code. I realize that there are things like Mongo, Alfred, etc out there, but that is not what I need right now.
Question
How do I read a JSON object out of a text or js file and into server memory using JavaScript/Node?
Sync:
var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));
Async:
var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
if (err) throw err;
obj = JSON.parse(data);
});
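In either variant, JSON.parse will throw if the file contents are not valid JSON, so (a minimal sketch, using the same placeholder file name) you may want to wrap the parse in a try/catch rather than letting it crash the process:
var fs = require('fs');

fs.readFile('file', 'utf8', function (err, data) {
  if (err) throw err;
  var obj;
  try {
    obj = JSON.parse(data);
  } catch (parseErr) {
    // the file exists but does not contain valid JSON
    console.error('Could not parse JSON:', parseErr);
    return;
  }
  console.log(obj);
});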
The easiest way I have found to do this is to just use require and the path to your JSON file.
For example, suppose you have the following JSON file.
test.json
{
"firstName": "Joe",
"lastName": "Smith"
}
You can then easily load this in your Node.js application using require:
var config = require('./test.json');
console.log(config.firstName + ' ' + config.lastName);
Asynchronous is there for a reason! Throws stone at @mihai
That said, here is his code rewritten using the asynchronous version:
// Declare variables
var fs = require('fs'),
obj
// Read the file and send to the callback
fs.readFile('path/to/file', handleFile)
// Write the callback function
function handleFile(err, data) {
if (err) throw err
obj = JSON.parse(data)
// You can now play with your data
}
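Keep in mind that with the asynchronous version, obj is only populated once handleFile has run; code that executes synchronously right after the fs.readFile call will still see it as undefined.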
At least in Node v8.9.1, you can just do
var json_data = require('/path/to/local/file.json');
and access all the elements of the JSON object.
Answer for 2022, using ES6 module syntax and async/await
In modern JavaScript, this can be done as a one-liner, without the need to install additional packages:
import { readFile } from 'fs/promises';
let data = JSON.parse(await readFile("filename.json", "utf8"));
Add a try/catch block to handle exceptions as needed.
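For example, a minimal sketch of the same one-liner with basic error handling (the file name is just a placeholder):
import { readFile } from 'fs/promises';

let data;
try {
  data = JSON.parse(await readFile('filename.json', 'utf8'));
} catch (err) {
  // covers both a missing file and malformed JSON
  console.error('Failed to load filename.json:', err);
}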
In Node 8 you can use the built-in util.promisify() to asynchronously read a file like this
const {promisify} = require('util')
const fs = require('fs')
const readFileAsync = promisify(fs.readFile)
readFileAsync(`${__dirname}/my.json`, {encoding: 'utf8'})
.then(contents => {
const obj = JSON.parse(contents)
console.log(obj)
})
.catch(error => {
throw error
})
Using the fs-extra package is quite simple:
Sync:
const fs = require('fs-extra')
const packageObj = fs.readJsonSync('./package.json')
console.log(packageObj.version)
Async:
const fs = require('fs-extra')
const packageObj = await fs.readJson('./package.json')
console.log(packageObj.version)
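Note that await at the top level like this only works inside an async function or an ES module; in a plain CommonJS script you would wrap the call in an async function (for example an async IIFE).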
Using node-fs-extra (async/await):
const fs = require('fs-extra')

const readJsonFile = async () => {
  const myJsonObject = await fs.readJson('./my_json_file.json');
  console.log(myJsonObject);
}
readJsonFile() // prints your json object
https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfile_file_options_callback
var fs = require('fs');
fs.readFile('/etc/passwd', (err, data) => {
if (err) throw err;
console.log(data);
});
// options
fs.readFile('/etc/passwd', 'utf8', callback);
https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfilesync_file_options
You can find all the fs usage details in the Node.js File System docs.
Hope this helps!
function parseIt() {
  return new Promise(function (resolve, reject) {
    var fs = require('fs');
    const dirPath = 'K:\\merge-xml-junit\\xml-results\\master.json';
    fs.readFile(dirPath, 'utf8', function (err, data) {
      // reject instead of throwing inside the callback, so the caller can handle it
      if (err) return reject(err);
      resolve(data);
    });
  });
}

async function test() {
  const jsonData = await parseIt();
  var parsedJSON = JSON.parse(jsonData);
  var testSuite = parsedJSON['testsuites']['testsuite'];
  console.log(testSuite);
}

test().catch(console.error);
Answer for 2022, using v8 Import assertions
import jsObject from "./test.json" assert { type: "json" };
console.log(jsObject)
Dynamic import
const jsObject = await import("./test.json", {assert: { type: "json" }});
console.log(jsObject);
Read more at:
v8 Import assertions
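Worth noting as an update beyond the original 2022 answer: newer Node.js versions replace import assertions with import attributes, which use the with keyword instead of assert:
import jsObject from "./test.json" with { type: "json" };
console.log(jsObject);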
So many answers, and no one has made a benchmark comparing sync vs. async vs. require. I described the differences in use cases for reading JSON into memory via require, readFileSync, and readFile here.
If you are looking for a complete solution for asynchronously loading a JSON file from a relative path, with error handling:
// Global requires
// path module, for building the relative path
const path = require('path');
// File System module
const fs = require('fs');
// Express router (assumed here; the original snippet only shows router.get)
const express = require('express');
const router = express.Router();
// GET request for the /listUsers page
router.get('/listUsers', function (req, res) {
  console.log("Got a GET request for the list of users");
  // Build a relative path to the JSON file
  let reqPath = path.join(__dirname, '../mock/users.json');
  // Read JSON from the relative path of this file
  fs.readFile(reqPath, 'utf8', function (err, data) {
    if (!err) {
      // Handle success
      console.log("Success: " + data);
      // Parse the data to JSON if you need the object
      var jsonObj = JSON.parse(data);
      // Send the raw file contents back as the response
      res.end(data);
    } else {
      // Handle error
      res.end("Error: " + err);
    }
  });
});
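For completeness, here is a hypothetical way to mount this router in an Express app (the app setup below is an assumption, not part of the original answer):
const express = require('express');
const app = express();

app.use('/', router);
app.listen(3000, () => console.log('Listening on port 3000'));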
Directory Structure: