I recently made a script in which I want to delete an EXACT match from a txt file with Node. It prints out true but doesn't actually remove the line from the txt file. I am using Discord to get the argument.
I'm not sure what I'm doing wrong, but here is my script:
const fs = require('fs')
fs.readFile('notredeemed.txt', function (err, data) {
  if (err) throw err;
  if (data.toString().match(new RegExp(`^${args[0]}$`, "m"))) {
    fs.readFile('notredeemed.txt', 'utf8', function (err, data) {
      if (err) {
        return console.log(err);
      }
      var result = data.replace(/${args[0]/g, '');
      fs.writeFile('notredeemed.txt', result, 'utf8', function (err) {
        if (err) return console.log(err);
      });
    });
    console.log("true")
  }
})
If someone could help me I would appreciate it :)
I think you can do it very similarly as your current solution, just the regex is a little off.
Instead of just using ^ and $ to indicate that the entire string starts and ends with the args[0], I used two capture groups as the delimiters of a line.
This one matches any newline character or the beginning of the string. It works for the first line of the file, and prevents a partial replacement, e.g. llo replacing Hello:
(\n|^)
And this one matches a carriage return or the end of the string. This works for cases where there is no newline at the end of the file, and also prevents Hel replacing Hello:
(\r|$)
That should ensure that you are always taking out an entire line that matches your args.
I also eliminated the second readFile as it wasn't necessary to get it to work.
const fs = require("fs")
fs.readFile("notredeemed.txt", function (err, data) {
  if (err) throw err
  const match = new RegExp(`(\n|^)${args[0]}(\r|$)`)
  const newFile = data.toString().replace(match, ``)
  fs.writeFile("notredeemed.txt", newFile, "utf8", function (err) {
    if (err) return console.log(err)
    console.log("true")
  })
})
Here is my solution.
Algorithm:
Split the file content into individual lines with your desired End of Line Flag (generally "\n" or "\r\n")
Filter out the lines that you want to delete
Join all the filtered lines back together with the EOL flag
const replacingLine = "- C/C++ addons with Node-API";

const fileContent = `- V8
- WASI
- C++ addons
- C/C++ addons with Node-API
- C++ embedder API
- C/C++ addons with Node-API
- Corepack`;

const newFileContent = replace({ fileContent, replacingLine });
console.log(newFileContent);

function replace({ fileContent, replacingLine, endOfLineFlag = "\n" }) {
  return fileContent
    .split(endOfLineFlag)
    .filter((line) => line !== replacingLine)
    .join(endOfLineFlag);
}
I am trying to append a string to a log file. However, writeFile erases the content each time before writing the string.
fs.writeFile('log.txt', 'Hello Node', function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
}); // => log.txt erased, contains only 'Hello Node'
Any idea how to do this the easy way?
For occasional appends, you can use appendFile, which creates a new file handle each time it's called:
Asynchronously:
const fs = require('fs');
fs.appendFile('message.txt', 'data to append', function (err) {
  if (err) throw err;
  console.log('Saved!');
});
Synchronously:
const fs = require('fs');
fs.appendFileSync('message.txt', 'data to append');
But if you append repeatedly to the same file, it's much better to reuse the file handle.
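A minimal sketch of what reusing the handle looks like, with the same message.txt (the stream-based approach is covered in detail in the next answer):
const fs = require('fs');
// One file handle for the whole sequence of appends:
const log = fs.createWriteStream('message.txt', { flags: 'a' });
log.write('data to append\n');
log.write('more data to append\n');
log.end(); // closes the single underlying file descriptor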
When you want to write to a log file, i.e. append data to the end of a file, never use appendFile. appendFile opens a file handle for each piece of data you add to your file; after a while you get a beautiful EMFILE error.
I can add that appendFile is not easier to use than a WriteStream.
Example with appendFile:
console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  fs.appendFile("append.txt", index + "\n", function (err) {
    if (err) console.log(err);
  });
});
console.log(new Date().toISOString());
On my computer, around 8,000 appends succeed; after that, you obtain this:
{ Error: EMFILE: too many open files, open 'C:\mypath\append.txt'
at Error (native)
errno: -4066,
code: 'EMFILE',
syscall: 'open',
path: 'C:\\mypath\\append.txt' }
Moreover, appendFile writes whenever the handle becomes available, so your logs will not be written in timestamp order. You can test it with the example: replace 10,000 with 1,000 and the order will be random, depending on access to the file.
If you want to append to a file, you must use a writable stream like this:
var stream = fs.createWriteStream("append.txt", {flags:'a'});
console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  stream.write(index + "\n");
});
console.log(new Date().toISOString());
stream.end();
You end it when you want. You are not even required to use stream.end(); the default option is autoClose: true, so your file will be closed when your process ends and you avoid opening too many files.
Code that calls createWriteStream for every write creates a new file descriptor each time. logStream.end() is better because it asks Node to close the file immediately after the write.
var fs = require('fs');
var logStream = fs.createWriteStream('log.txt', {flags: 'a'});
// use {flags: 'a'} to append and {flags: 'w'} to erase and write a new file
logStream.write('Initial line...');
logStream.end('this is the end line');
Besides appendFile, you can also pass a flag in writeFile to append data to an existing file.
fs.writeFile('log.txt', 'Hello Node', { 'flag': 'a' }, function (err) {
  if (err) {
    return console.error(err);
  }
});
By passing flag 'a', data will be appended at the end of the file.
Use the a+ flag to append to a file, and to create it if it doesn't exist:
fs.writeFile('log.txt', 'Hello Node', { flag: "a+" }, (err) => {
  if (err) throw err;
  console.log('The file is created if not existing!!');
});
Docs: https://nodejs.org/api/fs.html#fs_file_system_flags
You need to open it, then write to it.
var fs = require('fs'), str = 'string to append to file';

fs.open('filepath', 'a', 0o666, function (e, id) {
  fs.write(id, str, null, 'utf8', function () {
    fs.close(id, function () {
      console.log('file closed');
    });
  });
});
Here's a few links that will help explain the parameters
open
write
close
EDIT: This answer is no longer valid, look into the new fs.appendFile method for appending.
My approach is rather special. I basically use the WriteStream solution but without actually 'closing' the fd by using stream.end(). Instead I use cork/uncork. This has the benefit of low RAM usage (if that matters to anyone) and I believe it's safer to use for logging/recording (my original use case).
Following is a pretty simple example. Notice I just added a pseudo loop for showcase; in production code I am waiting for websocket messages.
var stream = fs.createWriteStream("log.txt", { flags: 'a' });

while (true) {
  stream.cork();
  stream.write("some content to log");
  process.nextTick(() => stream.uncork());
}
uncork will flush the data to the file in the next tick.
In my scenario there are peaks of up to ~200 writes per second in various sizes. During night time, however, only a handful of writes per minute are needed. The code works reliably even during peak times.
Node.js 0.8 has fs.appendFile:
fs.appendFile('message.txt', 'data to append', (err) => {
  if (err) throw err;
  console.log('The "data to append" was appended to file!');
});
Documentation
Using fs.appendFile or fsPromises.appendFile is the fastest and most robust option when you need to append something to a file.
In contrast to what some of the answers suggest, if the file path is supplied to the appendFile function, it actually closes the file by itself. Only when you pass in a file handle that you get from something like fs.open() do you have to take care of closing it.
I tried it with over 50,000 lines in a file.
Examples:
(async () => {
  // Using appendFile.
  const fsp = require('fs').promises;
  await fsp.appendFile(
    '/path/to/file', '\r\nHello world.'
  );

  // Using apickfs; handles error and edge cases better.
  const apickFileStorage = require('apickfs');
  await apickFileStorage.writeLines(
    '/path/to/directory/', 'filename', 'Hello world.'
  );
})();
Ref: https://github.com/nodejs/node/issues/7560
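To illustrate the closing rule above, here is a minimal sketch of the file-handle case, where closing becomes your responsibility (the path is a placeholder):
const fsp = require('fs').promises;

(async () => {
  // A handle you obtained yourself is NOT closed automatically by appendFile.
  const handle = await fsp.open('/path/to/file', 'a');
  try {
    await handle.appendFile('appended via handle\n');
  } finally {
    await handle.close(); // our job, not appendFile's
  }
})();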
If you want an easy and stress-free way to write logs line by line in a file, then I recommend fs-extra:
const os = require('os');
const fs = require('fs-extra');
const file = 'logfile.txt';
const options = {flag: 'a'};
async function writeToFile(text) {
  await fs.outputFile(file, `${text}${os.EOL}`, options);
}
writeToFile('First line');
writeToFile('Second line');
writeToFile('Third line');
writeToFile('Fourth line');
writeToFile('Fifth line');
Tested with Node v8.9.4.
fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'a')
fs.writeSync(fd, 'contents to append')
fs.closeSync(fd)
I offer this suggestion only because control over open flags is sometimes useful; for example, you may want to truncate an existing file first and then append a series of writes to it, in which case use the 'w' flag when opening the file and don't close it until all the writes are done. Of course appendFile may be what you're after :-)
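A minimal sketch of that truncate-then-append sequence on a single descriptor, reusing log.txt from the examples above:
const fs = require('fs');

const fd = fs.openSync('log.txt', 'w'); // 'w' truncates the existing file
fs.writeSync(fd, 'first write\n');
fs.writeSync(fd, 'second write\n');
fs.closeSync(fd); // close only after the whole series is done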
fs.open('log.txt', 'a', function (err, log) {
  if (err) throw err;
  fs.writeFile(log, 'Hello Node', function (err) {
    if (err) throw err;
    fs.close(log, function (err) {
      if (err) throw err;
      console.log('It\'s saved!');
    });
  });
});
Using the jfile package:
myFile.text+='\nThis is new line to be appended'; //myFile=new JFile(path);
Try using flags: 'a' to append data to a file:
var stream = fs.createWriteStream("udp-stream.log", { 'flags': 'a' });

stream.once('open', function (fd) {
  stream.write(msg + "\r\n");
});
Here's a full script. Fill in your file names and run it and it should work!
var fs = require('fs');
function ReadAppend(file, appendFile) {
  fs.readFile(appendFile, function (err, data) {
    if (err) throw err;
    console.log('File was read');

    fs.appendFile(file, data, function (err) {
      if (err) throw err;
      console.log('The "data to append" was appended to file!');
    });
  });
}

// edit this with your file names
file = 'name_of_main_file.csv';
appendFile = 'name_of_second_file_to_combine.csv';
ReadAppend(file, appendFile);
const inovioLogger = (logger = "") => {
  const log_file = fs.createWriteStream(__dirname + `/../../inoviopay-${new Date().toISOString().slice(0, 10)}.log`, { flags: 'a' });
  const log_stdout = process.stdout;
  log_file.write(logger + '\n');
};
In addition to denysonique's answer, sometimes you want the asynchronous appendFile and other async methods in Node.js to return a promise instead of taking a callback. To do that, you can wrap the function with the promisify higher-order function, or import the async functions from the promises namespace:
const { appendFile } = require('fs').promises;
await appendFile('path/to/file/to/append', dataToAppend, optionalOptions);
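The promisify wrapper mentioned above would look like this (a sketch, with the same placeholder path and variables, in the same bare-await style):
const util = require('util');
const fs = require('fs');

// Wrap the callback-style appendFile into a promise-returning function.
const appendFileAsync = util.promisify(fs.appendFile);
await appendFileAsync('path/to/file/to/append', dataToAppend, optionalOptions);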
I hope it'll help!
I wrapped the async fs.appendFile into a Promise-based function. Hope it helps others to see how this would work.
append(path, name, data) {
  return new Promise((resolve, reject) => {
    // No async executor needed: a synchronous throw here rejects the promise automatically.
    fs.appendFile(path + name, data, (err) => {
      if (!err) {
        return resolve(path + name);
      } else {
        return reject(err);
      }
    });
  });
}
I am trying to loop through all the images in my folder convert it into base64 and send to MongoDB.
I started with one image, worked fine.
var filename = '1500.jpg';
var binarydata = fs.readFileSync(filename);
var converted = new Buffer(binarydata).toString("base64");
console.log(converted);
The above code gives me base64 for one file.
I tried changing the code so that it will loop through all the files in my directory and give me base64 for each file.
Here is what I wrote, but it did not work:
var variantfolder = './variantimages';

fs.readdir(variantfolder, function (err, files) {
  if (err) {
    console.log(err);
  }
  else {
    fs.readFileSync(files, function (err, res) {
      if (err) { console.log('err') } else {
        var converted = new Buffer(res).toString("base64");
        var onevariant = {
          "imagename": files,
          "imagebase64": converted
        }
        var newvariant = new Variant(onevariant)
        newvariant.save(err, newvar) {
          if (err) {
            console.log('err');
          }
          else {
            console.log('saved to mongo');
          }
        }
      }
    })
  }
})
I suspect the problem is related to calling functions in the wrong way.
Check the inputs and outputs of the functions you are using.
The fs.readdir() function callback is passed 2 parameters, an error and an array of file names.
The fs.readFileSync() function takes the parameters path and options. It also returns the file contents, it doesn't take a callback. The callback version is fs.readFile().
So in your code you are passing an array of file names into the file path parameter, which will not work.
You can also pass base64 as the encoding when reading the file and you won't have to convert it after.
I expect you will want something more along these lines (add your own error handling as required):
fs.readdir(variantfolder, (err, fileNames) => {
  fileNames.forEach((fileName) => {
    fs.readFile(`${variantfolder}/${fileName}`, 'base64', (err, base64Data) => {
      // Do your thing with the file data.
    });
  });
});
Note that you can use the async, sync or promise (fs.promises) version of the fs functions depending on what is most suitable for your code.
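For instance, a minimal sketch of the fs.promises variant of the same directory walk (the Mongo save step is left out):
const fsp = require('fs').promises;

async function readAllAsBase64(dir) {
  const fileNames = await fsp.readdir(dir);
  // Read every file in the directory directly as base64.
  return Promise.all(fileNames.map(async (fileName) => ({
    imagename: fileName,
    imagebase64: await fsp.readFile(`${dir}/${fileName}`, 'base64'),
  })));
}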
I have two CSV files, one with routing steps and one with a list of ids. I need to add each id to the start of each routing step. I'm using Node.js.
var routeNumFile = '/routing_numbers.csv';
var routeStepFile = '/routing_steps.csv';

const csvToJson = require('csvtojson');
const jsonToCsv = require('json2csv').parse;
const fs = require('fs');

var routeNumArray;
var routeStepArray;

try {
  routeNumArray = await csvToJson().fromFile(routeNumFile);
} catch (err) {
  console.log("error in reading csv file")
}

try {
  routeStepArray = await csvToJson().fromFile(routeStepFile);
} catch (err) {
  console.log("error in reading csv file")
}

var outputArray = new Array;
var outputPath = '/gitlab/BSI_Create_Csv_Import/finalOutput.csv';

if (routeNumArray != null && routeStepArray != null) {
  Object.keys(routeNumArray).forEach(function (key1) {
    Object.keys(routeStepArray).forEach(function (key2) {
      var comboObj = Object.assign(routeNumArray[key1], routeStepArray[key2]);
      console.log(comboObj);
      outputArray.push(comboObj);
    });
  });
}

console.log(outputArray);
var csv = jsonToCsv(outputArray);
fs.writeFileSync(outputPath, csv);
The output from console.log(comboObj) is what I want. However, when I put that into an array, I just get the very last entry in the routing steps CSV over and over. If I write it using a stream directly where comboObj is created, it mostly works, but then I have to convert the object to CSV first and the output leaves the headings duplicated at the end of every line. It writes much more cleanly from an array of JSON objects to a CSV.
So, I'm trying to get all of the combined objects together in an array so I can convert it all to CSV and write it to a file. Can anyone tell me what's going wrong with my method?
You should create a new object and assign the properties to it, like this. Object.assign mutates and returns its first argument, so with routeNumArray[key1] as the target, every iteration overwrites and pushes the very same object:
var comboObj = Object.assign({}, routeNumArray[key1], routeStepArray[key2]);
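A tiny sketch of why the original version repeats the last entry:
const target = { a: 1 };
const first = Object.assign(target, { b: 2 });  // first === target
const second = Object.assign(target, { b: 3 }); // second === target as well
console.log(first === second); // true: pushing both just pushes the same object twice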
I've been trying to find a way to write to a file when using Node.js, but with no success. How can I do that?
There are a lot of details in the File System API. The most common way is:
const fs = require('fs');
fs.writeFile("/tmp/test", "Hey there!", function (err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
// Or
fs.writeFileSync('/tmp/test-sync', 'Hey there!');
Currently there are three ways to write a file:
fs.write(fd, buffer, offset, length, position, callback)
You need to wait for the callback to ensure that the buffer is written to disk. It's not buffered.
fs.writeFile(filename, data, [encoding], callback)
All data must be stored at the same time; you cannot perform sequential writes.
fs.createWriteStream(path, [options])
Creates a WriteStream, which is convenient because you don't need to wait for a callback. But again, it's not buffered.
A WriteStream, as the name says, is a stream. A stream by definition is "a buffer" containing data which moves in one direction (source → destination). But a writable stream is not necessarily "buffered". A stream is "buffered" when you write n times, and at time n+1, the stream sends the buffer to the kernel (because it's full and needs to be flushed).
In other words: "A buffer" is the object. Whether or not it "is buffered" is a property of that object.
If you look at the code, the WriteStream inherits from a writable Stream object. If you pay attention, you'll see how they flush the content; they don't have any buffering system.
If you write a string, it's converted to a buffer, and then sent to the native layer and written to disk. When writing strings, they're not filling up any buffer. So, if you do:
write("a")
write("b")
write("c")
You're doing:
fs.write(new Buffer("a"))
fs.write(new Buffer("b"))
fs.write(new Buffer("c"))
That's three calls to the I/O layer. Although you're using "buffers", the data is not buffered. A buffered stream would do: fs.write(new Buffer("abc")), one call to the I/O layer.
As of Node.js v0.12 (stable version announced 02/06/2015), Node supports two functions:
cork() and
uncork(). It seems that these functions will finally allow you to buffer/flush the write calls.
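A minimal sketch of what that batching looks like:
const fs = require('fs');

const stream = fs.createWriteStream('out.txt');
stream.cork();   // start buffering writes in memory
stream.write('a');
stream.write('b');
stream.write('c');
stream.uncork(); // flush "abc" toward the kernel in one batch
stream.end();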
For example, in Java there are some classes that provide buffered streams (BufferedOutputStream, BufferedWriter...). If you write three bytes, these bytes will be stored in the buffer (memory) instead of doing an I/O call just for three bytes. When the buffer is full the content is flushed and saved to disk. This improves performance.
I'm not discovering anything, just remembering how a disk access should be done.
You can of course make it a little more advanced. Non-blocking, writing bits and pieces, not writing the whole file at once:
var fs = require('fs');
var stream = fs.createWriteStream("my_file.txt");
stream.once('open', function(fd) {
stream.write("My first row\n");
stream.write("My second row\n");
stream.end();
});
Synchronous Write
fs.writeFileSync(file, data[, options])
fs = require('fs');
fs.writeFileSync("foo.txt", "bar");
Asynchronous Write
fs.writeFile(file, data[, options], callback)
fs = require('fs');
fs.writeFile('foo.txt', 'bar', (err) => { if (err) throw err; });
Where
file <string> | <Buffer> | <URL> | <integer> filename or file descriptor
data <string> | <Buffer> | <Uint8Array>
options <Object> | <string>
callback <Function>
Worth reading the official File System (fs) docs.
Update: async/await
fs = require('fs');
util = require('util');
writeFile = util.promisify(fs.writeFile);
fn = async () => { await writeFile('foo.txt', 'bar'); }
fn()
var path = 'public/uploads/file.txt',
buffer = new Buffer("some content\n");
fs.open(path, 'w', function(err, fd) {
if (err) {
throw 'error opening file: ' + err;
}
fs.write(fd, buffer, 0, buffer.length, null, function(err) {
if (err) throw 'error writing file: ' + err;
fs.close(fd, function() {
console.log('file written');
})
});
});
The answers provided are dated and a newer way to do this is:
const fsPromises = require('fs').promises
await fsPromises.writeFile('/path/to/file.txt', 'data to write')
See the documentation for more info.
I liked Index of ./articles/file-system.
It worked for me.
See also How do I write files in node.js?.
fs = require('fs');
fs.writeFile('helloworld.txt', 'Hello World!', function (err) {
  if (err)
    return console.log(err);
  console.log('Wrote Hello World in file helloworld.txt, just check it');
});
Contents of helloworld.txt:
Hello World!
Update:
On Linux, Node writes to the current directory; it seems some other systems don't, so I add this comment just in case:
Use ROOT_APP_PATH = fs.realpathSync('.'); console.log(ROOT_APP_PATH); to see where the file is written.
I know the question asked about "write", but in a more general sense, "append" might be useful in some cases, as it is easy to use in a loop to add text to a file (whether the file exists or not). Use a "\n" if you want to add lines, e.g.:
var fs = require('fs');
for (var i = 0; i < 10; i++) {
  fs.appendFileSync("junk.csv", "Line:" + i + "\n");
}
OK, it's quite simple, as Node has built-in functionality for this: the fs module, which stands for File System, basically the NodeJS File System module...
So first require it in your server.js file like this:
var fs = require('fs');
fs has a few methods for writing to a file, but my preferred way is using appendFile: this appends the stuff to the file, and if the file doesn't exist, it creates one. The code could be like below:
fs.appendFile('myFile.txt', 'Hi Ali!', function (err) {
  if (err) throw err;
  console.log('Thanks, It\'s saved to the file!');
});
You may write to a file using fs (file system) module.
Here is an example of how you may do it:
const fs = require('fs');
const writeToFile = (fileName, callback) => {
  fs.open(fileName, 'wx', (error, fileDescriptor) => {
    if (!error && fileDescriptor) {
      // Do something with the file here ...
      // (newData stands in for whatever content you want to write)
      fs.writeFile(fileDescriptor, newData, (error) => {
        if (!error) {
          fs.close(fileDescriptor, (error) => {
            if (!error) {
              callback(false);
            } else {
              callback('Error closing the file');
            }
          });
        } else {
          callback('Error writing to new file');
        }
      });
    } else {
      callback('Could not create new file, it may already exist');
    }
  });
};
You might also want to get rid of this callback-inside-callback code structure by using Promises and async/await statements. This will make the asynchronous code structure much flatter. For doing that, there is a handy util.promisify(original) function that can be utilized. It allows us to switch from callbacks to promises. Take a look at the example with fs functions below:
// Dependencies.
const util = require('util');
const fs = require('fs');

// Promisify "error-back" functions.
const fsOpen = util.promisify(fs.open);
const fsWrite = util.promisify(fs.writeFile);
const fsClose = util.promisify(fs.close);

// Now we may create an 'async' function with 'await's.
async function doSomethingWithFile(fileName) {
  const fileDescriptor = await fsOpen(fileName, 'wx');

  // Do something with the file here...
  // (newData stands in for whatever content you want to write)
  await fsWrite(fileDescriptor, newData);

  await fsClose(fileDescriptor);
}
You can write to files with streams.
Just do it like this:
const fs = require('fs');
const stream = fs.createWriteStream('./test.txt');
stream.write("Example text");
var fs = require('fs');
var path = process.cwd(); // as in the example below

fs.writeFile(path + "\\message.txt", "Hello", function (err) {
  if (err) throw err;
  console.log("success");
});
For example, read a file and write its contents to another file:
var fs = require('fs');
var path = process.cwd();

fs.readFile(path + "\\from.txt", function (err, data) {
  if (err)
    console.log(err)
  else {
    fs.writeFile(path + "\\to.txt", data, function (erro) {
      if (erro)
        console.log("error : " + erro);
      else
        console.log("success");
    });
  }
});
Here we use w+ for both read and write actions; if the file path is not found, the file will be created automatically.
fs.open(path, 'w+', function (err, fd) {
  if (err) {
    console.log("ERROR !! " + err);
  } else {
    var buffer = Buffer.from('content');
    fs.write(fd, buffer, 0, buffer.length, null, function (err) {
      if (err)
        console.log("ERROR !! " + err);
      fs.close(fd, function () {
        console.log('written success');
      })
    });
  }
});
'content' is what you have to write to the file, and buffer.length is its length.
Here is a sample of how to read a CSV file from local disk and write a CSV file back to local disk.
var csvjson = require('csvjson'),
    fs = require('fs'),
    mongodb = require('mongodb'),
    MongoClient = mongodb.MongoClient,
    mongoDSN = 'mongodb://localhost:27017/test',
    collection;

function uploadcsvModule() {
  var data = fs.readFileSync('/home/limitless/Downloads/orders_sample.csv', { encoding: 'utf8' });
  var importOptions = {
    delimiter: ',', // optional
    quote: '"' // optional
  }, ExportOptions = {
    delimiter: ",",
    wrap: false
  }
  var myobj = csvjson.toSchemaObject(data, importOptions)
  var exportArr = [], importArr = [];
  myobj.forEach(d => {
    if (d.orderId == undefined || d.orderId == '') {
      exportArr.push(d)
    } else {
      importArr.push(d)
    }
  })
  var csv = csvjson.toCSV(exportArr, ExportOptions);
  MongoClient.connect(mongoDSN, function (error, db) {
    collection = db.collection("orders")
    collection.insertMany(importArr, function (err, result) {
      // writeFile needs a callback; close the DB once the write has finished
      fs.writeFile('/home/limitless/Downloads/orders_sample1.csv', csv, { encoding: 'utf8' }, function (err) {
        if (err) console.log(err);
        db.close();
      });
    });
  })
}

uploadcsvModule()
fs.createWriteStream(path[,options])
options may also include a start option to allow writing data at some position past the beginning of the file. Modifying a file rather than replacing it may require a flags mode of r+ rather than the default mode w. The encoding can be any one of those accepted by Buffer.
If autoClose is set to true (default behavior) on 'error' or 'finish' the file descriptor will be closed automatically. If autoClose is false, then the file descriptor won't be closed, even if there's an error. It is the application's responsibility to close it and make sure there's no file descriptor leak.
Like ReadStream, if fd is specified, WriteStream will ignore the path argument and will use the specified file descriptor. This means that no 'open' event will be emitted. fd should be blocking; non-blocking fds should be passed to net.Socket.
If options is a string, then it specifies the encoding.
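For instance, a minimal sketch of the start and flags options quoted above, patching a file in place rather than replacing it:
const fs = require('fs');

// 'r+' avoids truncating the file; start positions the first write at byte 10.
const stream = fs.createWriteStream('data.bin', { flags: 'r+', start: 10 });
stream.end('patched'); // bytes before offset 10 and after the written range stay intact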
After reading this long article, you should understand how it works.
So, here's an example of createWriteStream().
/* fs.createWriteStream() returns a WritableStream ({aka} internal.Writable) and we want the encoding as 'utf-8' */
/* The WritableStream has the method write() */
fs.createWriteStream('out.txt', 'utf-8')
  .write('hello world');
Point 1:
If you want to write something into a file, meaning it will remove anything already saved in the file and write the new content, use fs.promises.writeFile().
Point 2:
If you want to append something to a file, meaning it will not remove anything already saved in the file but add the new item to the file content, then first read the file, add the content to the read value, and write it back to the file. So use fs.promises.readFile and fs.promises.writeFile().
Example 1: I want to write a JSON object to my JSON file.
const fs = require('fs');

const data = { table: [{ id: 1, name: 'my name' }] }
const file_path = './my_data.json'
writeFile(file_path, data)

async function writeFile(filename, writedata) {
  try {
    await fs.promises.writeFile(filename, JSON.stringify(writedata, null, 4), 'utf8');
    console.log('data is written successfully in the file')
  }
  catch (err) {
    console.log('not able to write data in the file ')
  }
}
Example 2: if you want to append data to a JSON file.
You want to add the data {id: 2, name: 'your name'} to the file my_data.json in the same folder root. Just call the append_data(file_path, data) function.
It will append the data to the JSON file if the file exists, or it will create the file and add the data to it.
const fs = require('fs');

const data = { id: 2, name: 'your name' }
const file_path = './my_data.json'
append_data(file_path, data)

async function append_data(filename, data) {
  if (fs.existsSync(filename)) {
    var read_data = await readFile(filename)
    if (read_data == false) {
      console.log('not able to read file')
    } else {
      read_data.table.push(data) // data must have the table array in it, as in example 1
      var dataWrittenStatus = await writeFile(filename, read_data)
      if (dataWrittenStatus == true) {
        console.log('data added successfully')
      } else {
        console.log('data adding failed')
      }
    }
  } else {
    // file does not exist yet: create it with the data in a fresh table
    var dataWrittenStatus = await writeFile(filename, { table: [data] })
    if (dataWrittenStatus == true) {
      console.log('file created and data added successfully')
    } else {
      console.log('data adding failed')
    }
  }
}

async function readFile(filePath) {
  try {
    const data = await fs.promises.readFile(filePath, 'utf8')
    return JSON.parse(data)
  }
  catch (err) {
    return false;
  }
}

async function writeFile(filename, writedata) {
  try {
    await fs.promises.writeFile(filename, JSON.stringify(writedata, null, 4), 'utf8');
    return true
  }
  catch (err) {
    return false
  }
}
You can use the easy-file-manager library.
Install it first from npm:
npm install easy-file-manager
A sample to upload and remove files:
var filemanager = require('easy-file-manager')
var path = "/public"
var filename = "test.jpg"
var data; // buffered image

filemanager.upload(path, filename, data, function (err) {
  if (err) console.log(err);
});

filemanager.remove(path, filename, function (err) {
  if (err) console.log(err);
});
You can write to a file with the following code example:
// datapath and callback are assumed to be defined by the caller
var data = [{ 'test': '123', 'test2': 'Lorem Ipsem ' }];

fs.open(datapath + '/data/topplayers.json', 'wx', function (error, fileDescriptor) {
  if (!error && fileDescriptor) {
    var stringData = JSON.stringify(data);
    fs.writeFile(fileDescriptor, stringData, function (error) {
      if (!error) {
        fs.close(fileDescriptor, function (error) {
          if (!error) {
            callback(false);
          } else {
            callback('Error in close file');
          }
        });
      } else {
        callback('Error in writing file.');
      }
    });
  }
});