How to write to a JSON file in Node - javascript

I am currently requiring a JSON file which I am reading data from.
var allUORHours = require('./UORHoursAch.json');
How do I then write to the file? The code below doesn't make any changes to the file:
allUORHours.test = {};

You may use the File System API's writeFile():
https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback

No, of course it doesn't. It just changes the variable's value in memory. To write JSON, you need to convert the object to a JSON string and then write that string to a file:
var fs = require('fs');
fs.writeFile('./UORHoursAch.json', JSON.stringify(allUORHours), function (err) {
    if (err) {
        console.log(err);
    } else {
        console.log("Saved");
    }
});
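If your Node version supports it, the same write can also be done with the promise-based fs API. A minimal sketch (the saveHours wrapper name is only illustrative, not from the original answer):
// Promise-based variant of the same write; "saveHours" is just an illustrative wrapper.
const fsp = require('fs').promises;

async function saveHours(allUORHours) {
    // The third argument (2) pretty-prints the JSON with two-space indentation.
    await fsp.writeFile('./UORHoursAch.json', JSON.stringify(allUORHours, null, 2));
    console.log("Saved");
}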

Related

Read and Write text file using create-react-app from the browser

I am trying to read a text file that is in the source (src) folder of the React project (create-react-app), manipulate the values, and write the new value back to the same text file.
I am unable to read the current values from the file; the code that reads the file logs out old data, and I am not sure where that is coming from, because even if I change the data in the text file directly, it doesn't read the new value.
I am using a package called browserify-fs (https://www.npmjs.com/package/browserify-fs) for reading and writing to a file.
var fs = require('browserify-fs');
var reader = new FileReader();

export const getData = () => {
    let initialString = "abcd";
    fs.readFile('file.txt', function (err, data) {
        if (err) {
            return console.error(err);
        }
        console.log(initialString + data.toString());
    });
};

export const writeData = () => {
    let data = "abcd";
    fs.writeFile("file.txt", data, err => {
        // In case of an error, throw err.
        if (err) throw err;
    });
};
Does it have something to do with a webpack loader for importing this type of file into the build, or is it related specifically to the create-react-app package, which defines the file and folder structure for auto-importing file types?
I am still not sure what is actually causing the issue. Any help would be appreciated.
P.S.: I know using CRUD operations in the browser is not recommended practice; I am just using it for a personal project (learning purposes).
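One thing worth noting (an assumption about the library, not something stated in the question): browserify-fs is backed by an in-browser filesystem built on top of IndexedDB, so it never touches the real file.txt in the src folder. That would explain why old data keeps coming back even after the file on disk is edited. A small sketch of a write followed by a read against that virtual filesystem:
// Sketch only: browserify-fs reads and writes its own in-browser storage, not src/file.txt.
var fs = require('browserify-fs');

export const roundTrip = () => {
    fs.writeFile('file.txt', 'new value', err => {
        if (err) throw err;
        // This read returns what was just written to the virtual filesystem,
        // independent of the real file on disk.
        fs.readFile('file.txt', function (err, data) {
            if (err) return console.error(err);
            console.log(data.toString());
        });
    });
};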

Write json object to .json file in JavaScript

var proj = {
    Name: "abc",
    "Roll no": 123
};
How can I write the proj data in JSON format to a .json file in JavaScript?
You can save the JSON object to the file.json file like this:
const FileSystem = require("fs");

FileSystem.writeFile('file.json', JSON.stringify(proj), (error) => {
    if (error) throw error;
});
JSON.stringify (introduced in ES 5.1) will convert the object into a string of JSON.
var json = JSON.stringify(proj);
JavaScript has no built-in mechanism for writing to a file. You generally need something non-standard provided by the host environment. For example, Node.js has the writeFile method of the File System module.
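For completeness, a small sketch (assuming Node.js) of reading the file back and turning the JSON text into an object again with JSON.parse, the inverse of JSON.stringify:
const fs = require("fs");

// Read the saved file back and parse the JSON text into an object again.
fs.readFile('file.json', 'utf8', (error, text) => {
    if (error) throw error;
    const restored = JSON.parse(text); // inverse of JSON.stringify
    console.log(restored.Name);        // "abc"
});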
Pretty much the same as the top answer, but a slightly simpler version I regularly use to help with debugging (it prints out an object at a given point in the code without using a debugger):
require('fs').writeFile('file.json', JSON.stringify(proj), (error) => {
    if (error) {
        throw error;
    }
});
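When the dump is meant for human inspection while debugging, JSON.stringify's extra arguments produce indented output; the same one-liner, pretty-printed:
// Same write, but pretty-printed with two-space indentation for easier reading.
require('fs').writeFile('file.json', JSON.stringify(proj, null, 2), (error) => {
    if (error) {
        throw error;
    }
});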

fs.write json file to capture and save streaming json file

I want a JSON stream stored in a text file. When running the Node server, the JSON isn't appended to the json.txt file. What am I missing? I am new to Node, so be gentle.
Here is the code chunk I expect to capture the JSON content:
var fs = require('fs');
fs.writeFile("json.txt", {encoding: "utf8"}, function(err) {
    if (err) {
        console.log(err);
    } else {
        console.log("The file was saved!");
    }
});
The issue is that you aren't passing the correct parameters. fs.writeFile expects a string for the filename, a buffer or string for the content, an object for the options, and a callback function. What you're doing is sending the options as the second parameter, where it expects a buffer or a string. Correction below:
var fs = require('fs');
// {some: "object"} is a placeholder; substitute whatever object you want to store.
fs.writeFile("json.txt", JSON.stringify({some: "object"}), {encoding: "utf8"}, function(err) {
    if (err) {
        console.log(err);
    } else {
        console.log("The file was saved!");
    }
});
You can replace the JSON.stringify part with plain text if you want to, but you specified JSON in your question, so I assumed you wanted to store an object in a file.
Source (NodeJS documentation)
EDIT:
The links to other questions in the comments may be more relevant if you want to add new lines to the end of the file rather than completely overwrite the old one. However, I assumed that fs.writeFile was the intended function. If that wasn't the intention, those other questions will help a lot more.
UPDATE:
It seems the issue was that the request body wasn't being parsed, so when the POST request came through, Node didn't have access to it. To fix this, the following code is needed during the Express configuration:
var bodyParser = require('body-parser');
app.use(bodyParser.json());
This uses the npm module body-parser. It converts the JSON body to a JavaScript object, which is then accessible via req.body.
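Putting the two pieces together, a minimal sketch of an Express handler that saves the parsed body to json.txt (the '/save' route and port 3000 are illustrative assumptions, not from the question):
// Sketch: parse JSON request bodies and write each POSTed object to json.txt.
var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');

var app = express();
app.use(bodyParser.json());

app.post('/save', function(req, res) {
    fs.writeFile("json.txt", JSON.stringify(req.body), {encoding: "utf8"}, function(err) {
        if (err) {
            console.log(err);
            return res.status(500).send("Error saving file");
        }
        res.send("The file was saved!");
    });
});

app.listen(3000);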

Node.js traverse HTML files and extract angular elements that needs translating

I have a set of HTML and JS files that need translating. Instead of traditionally copy-pasting each key into a JSON file, I was wondering if there was a faster way to do this by building a Node.js script. I currently have a JS script which traverses the directory recursively and is able to read the current file being traversed. But I only want to extract the Angular elements that need to be translated:
{{"Welcome" | translate}} <-- HTML
$scope.word = $translate.instant('Export Attendance'); <-- JS Controller
Basically, these are the patterns I want my program to look out for, capturing only the strings into a separate JSON file.
Currently I have the program below.
get_translations.js
var read = require('recursive-readdir-sync');
var fs = require('fs');

try {
    root = read('./files/that/has/html/and/js');
} catch (err) {
    if (err) {
        console.log("File does not exist");
    } else {
        throw err;
    }
}

for (var a = 0; a < root.length; a++) {
    console.log(root[a]);
    fs.readFile(root[a], 'utf-8', function(err, data) {
        if (err) {
            throw err;
        } else {
            console.log(data); //need help here.. (noob)
        }
    });
}
I'd like to avoid jQuery as much as possible. Any light shed on the matter will be greatly appreciated.
Thanks.
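No answer is recorded for this question, but as a rough sketch, the readFile callback could match the two patterns with regular expressions and collect the strings into an object before writing it out. The expressions below are assumptions based only on the two examples quoted above and will not cover every variation:
// Sketch: pull strings out of {{ "..." | translate }} and $translate.instant('...') usages.
function extractTranslations(data, translations) {
    var htmlPattern = /\{\{\s*["']([^"']+)["']\s*\|\s*translate\s*\}\}/g; // {{"Welcome" | translate}}
    var jsPattern = /\$translate\.instant\(\s*['"]([^'"]+)['"]\s*\)/g;    // $translate.instant('Export Attendance')
    var match;
    while ((match = htmlPattern.exec(data)) !== null) {
        translations[match[1]] = match[1];
    }
    while ((match = jsPattern.exec(data)) !== null) {
        translations[match[1]] = match[1];
    }
    return translations;
}

// Inside the readFile callback:
//   extractTranslations(data, allTranslations);
// and once every file has been read:
//   fs.writeFile('translations.json', JSON.stringify(allTranslations, null, 2), function (err) { ... });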

Trouble listing files with fs.readdir() in Node.js

I'm working my way through the "Learn You The Node.js For Much Win!" workshop, but I'm having trouble with exercise 5. It asks you to create a program that prints a list of files in a given directory, filtered by the extension of the files.
I passed in the directory, files, that contains an assortment of JavaScript, Ruby, and plain text files. It is supposed to console.log() each file with the .js extension.
var fs = require('fs');

function indexDirectory(directory) {
    fs.readdir(directory, function(err, files) {
        for (var i in files) {
            if (i.indexOf('.js') != -1) {
                console.log(files[i]);
            }
        }
    });
}

indexDirectory('files');
My current code does not output anything when I run it with node program.js. Am I missing some asynchronous principle? Am I using callbacks incorrectly? Any help would be appreciated :)
files is an array, so you should use forEach instead of for..in. (With for..in, i is the index key as a string, so i.indexOf('.js') tests the index rather than the filename, which is why nothing is printed.)
var fs = require('fs');

function indexDirectory(directory) {
    fs.readdir(directory, function(err, files) {
        files.forEach(function (file) {
            if (file.indexOf('.js') != -1) {
                console.log(file);
            }
        });
    });
}

indexDirectory('files');
One more problem with this code is that it will also print files with a '.json' extension. So instead of indexOf you should use, for example, a regular expression. Something like this:
var matches = new RegExp("\\.js$").test(files[i]); // escape the dot so only a literal ".js" suffix matches
if (matches) {
    console.log(files[i]);
}
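A further alternative (an addition here, not part of the original answers) is Node's built-in path.extname, which compares the extension exactly and avoids the regular expression:
var path = require('path');

// path.extname returns the trailing extension, so '.json' files no longer match.
if (path.extname(files[i]) === '.js') {
    console.log(files[i]);
}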
