I am trying to append a string to a log file. However, writeFile erases the file's contents each time before writing the string.
fs.writeFile('log.txt', 'Hello Node', function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
}); // => log.txt erased, contains only 'Hello Node'
Any idea how to do this the easy way?
For occasional appends, you can use appendFile, which creates a new file handle each time it's called:
Asynchronously:
const fs = require('fs');
fs.appendFile('message.txt', 'data to append', function (err) {
  if (err) throw err;
  console.log('Saved!');
});
Synchronously:
const fs = require('fs');
fs.appendFileSync('message.txt', 'data to append');
But if you append repeatedly to the same file, it's much better to reuse the file handle.
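For example, a minimal sketch of reusing one stream for many appends (the file name is illustrative):
const fs = require('fs');

// Open the file once with the append flag...
const log = fs.createWriteStream('log.txt', { flags: 'a' });

// ...then reuse the same handle for every write.
log.write('first entry\n');
log.write('second entry\n');
log.end('last entry\n'); // closes the underlying file descriptor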
When you want to write to a log file, i.e. append data to the end of a file, never use appendFile. appendFile opens a new file handle for each piece of data you add to the file; after a while you get a beautiful EMFILE error.
I can add that appendFile is not easier to use than a WriteStream.
Example with appendFile:
console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  fs.appendFile("append.txt", index + "\n", function (err) {
    if (err) console.log(err);
  });
});
console.log(new Date().toISOString());
On my computer you can append up to about 8,000 entries to the file this way; beyond that, you obtain this:
{ Error: EMFILE: too many open files, open 'C:\mypath\append.txt'
at Error (native)
errno: -4066,
code: 'EMFILE',
syscall: 'open',
path: 'C:\\mypath\\append.txt' }
Moreover, appendFile writes whenever a file handle becomes available, so your log lines will not appear in timestamp order. You can test this with the example above: set 1000 in place of 10000 and the order will be random, depending on access to the file.
If you want to append to a file, you must use a writable stream like this:
var stream = fs.createWriteStream("append.txt", { flags: 'a' });

console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  stream.write(index + "\n");
});
console.log(new Date().toISOString());

stream.end();
You end it when you want. You are not even required to call stream.end(): the default option is autoClose: true, so your file will be closed when your process ends and you avoid opening too many files.
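For reference, a sketch with that option spelled out explicitly (it defaults to true anyway):
// autoClose: true (the default) closes the descriptor when the stream
// ends or the process exits, so an explicit end() is optional.
var stream = fs.createWriteStream("append.txt", { flags: 'a', autoClose: true });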
Calling createWriteStream for every write creates a new file descriptor each time. logStream.end() is better because it asks Node to close the file immediately after the write.
var fs = require('fs');
var logStream = fs.createWriteStream('log.txt', {flags: 'a'});
// use {flags: 'a'} to append and {flags: 'w'} to erase and write a new file
logStream.write('Initial line...');
logStream.end('this is the end line');
Besides appendFile, you can also pass a flag in writeFile to append data to an existing file.
fs.writeFile('log.txt', 'Hello Node', { 'flag': 'a' }, function (err) {
  if (err) {
    return console.error(err);
  }
});
By passing flag 'a', data will be appended at the end of the file.
Use the a+ flag to append to a file, creating it if it doesn't exist:
fs.writeFile('log.txt', 'Hello Node', { flag: "a+" }, (err) => {
if (err) throw err;
console.log('The file is created if not existing!!');
});
Docs: https://nodejs.org/api/fs.html#fs_file_system_flags
You need to open it, then write to it.
var fs = require('fs'),
    str = 'string to append to file';

fs.open('filepath', 'a', 0o666, function (e, id) {
  fs.write(id, str, null, 'utf8', function () {
    fs.close(id, function () {
      console.log('file closed');
    });
  });
});
Here are a few links that will help explain the parameters:
open
write
close
EDIT: This answer is no longer valid, look into the new fs.appendFile method for appending.
My approach is rather special. I basically use the WriteStream solution but without actually 'closing' the fd via stream.end(). Instead I use cork/uncork. This has the benefit of low RAM usage (if that matters to anyone), and I believe it's safer to use for logging/recording (my original use case).
Following is a pretty simple example. Notice I just added a pseudo loop for showcase; in production code I am waiting for websocket messages.
var stream = fs.createWriteStream("log.txt", { flags: 'a' });

while (true) { // pseudo loop for showcase only
  stream.cork();
  stream.write("some content to log");
  process.nextTick(() => stream.uncork());
}
uncork will flush the data to the file in the next tick.
In my scenario there are peaks of up to ~200 writes per second in various sizes. During night time, however, only a handful of writes per minute are needed. The code works super reliably even during peak times.
Node.js 0.8 has fs.appendFile:
fs.appendFile('message.txt', 'data to append', (err) => {
if (err) throw err;
console.log('The "data to append" was appended to file!');
});
Documentation
fs.appendFile and fsPromises.appendFile are the fastest and most robust options when you need to append something to a file.
In contrast to what some of the answers suggest, if a file path is supplied to the appendFile function, it actually closes the file by itself. Only when you pass in a filehandle obtained by something like fs.open() do you have to take care of closing it (a sketch of that case follows the examples below).
I tried it with over 50,000 lines in a file.
Examples:
(async () => {
  // using appendFile.
  const fsp = require('fs').promises;
  await fsp.appendFile(
    '/path/to/file', '\r\nHello world.'
  );

  // using apickfs; handles error and edge cases better.
  const apickFileStorage = require('apickfs');
  await apickFileStorage.writeLines(
    '/path/to/directory/', 'filename', 'Hello world.'
  );
})();
Ref: https://github.com/nodejs/node/issues/7560
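For completeness, a minimal sketch of the filehandle case mentioned above, where closing is your responsibility (the path is illustrative):
const fsp = require('fs').promises;

(async () => {
  // When you open the handle yourself, appendFile will NOT close it for you.
  const handle = await fsp.open('/path/to/file', 'a');
  try {
    await handle.appendFile('\r\nHello again.');
  } finally {
    await handle.close(); // your responsibility
  }
})();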
If you want an easy and stress-free way to write logs line by line in a file, then I recommend fs-extra:
const os = require('os');
const fs = require('fs-extra');
const file = 'logfile.txt';
const options = {flag: 'a'};
async function writeToFile(text) {
  await fs.outputFile(file, `${text}${os.EOL}`, options);
}
writeToFile('First line');
writeToFile('Second line');
writeToFile('Third line');
writeToFile('Fourth line');
writeToFile('Fifth line');
Tested with Node v8.9.4.
fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'a')
fs.writeSync(fd, 'contents to append')
fs.closeSync(fd)
I offer this suggestion only because control over open flags is sometimes useful; for example, you may want to truncate an existing file first and then append a series of writes to it - in which case use the 'w' flag when opening the file and don't close it until all the writes are done (see the sketch after the example below). Of course appendFile may be what you're after :-)
fs.open('log.txt', 'a', function (err, log) {
  if (err) throw err;
  fs.writeFile(log, 'Hello Node', function (err) {
    if (err) throw err;
    fs.close(log, function (err) {
      if (err) throw err;
      console.log('It\'s saved!');
    });
  });
});
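And here is a sketch of the truncate-then-append idea from the note above (file name and contents are illustrative):
// 'w' truncates the file on open; subsequent writes through the same
// descriptor then accumulate until the file is closed.
fs.open('log.txt', 'w', function (err, fd) {
  if (err) throw err;
  fs.write(fd, 'first line\n', function (err) {
    if (err) throw err;
    fs.write(fd, 'second line\n', function (err) {
      if (err) throw err;
      fs.close(fd, function (err) {
        if (err) throw err;
      });
    });
  });
});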
Using the jfile package:
myFile.text += '\nThis is a new line to be appended'; // myFile = new JFile(path);
Try using flags: 'a' to append data to a file:
var stream = fs.createWriteStream("udp-stream.log", {'flags': 'a'});
stream.once('open', function(fd) {
stream.write(msg+"\r\n");
});
Here's a full script. Fill in your file names and run it and it should work!
Here's a video tutorial on the logic behind the script.
var fs = require('fs');

function ReadAppend(file, appendFile) {
  fs.readFile(appendFile, function (err, data) {
    if (err) throw err;
    console.log('File was read');

    fs.appendFile(file, data, function (err) {
      if (err) throw err;
      console.log('The "data to append" was appended to file!');
    });
  });
}

// edit this with your file names
var file = 'name_of_main_file.csv';
var appendFile = 'name_of_second_file_to_combine.csv';
ReadAppend(file, appendFile);
const inovioLogger = (logger = "") => {
  const log_file = fs.createWriteStream(__dirname + `/../../inoviopay-${new Date().toISOString().slice(0, 10)}.log`, { flags: 'a' });
  log_file.write(logger + '\n');
};
In addition to denysonique's answer, sometimes you want the asynchronous appendFile (and other async methods in Node.js) in a form that returns a promise instead of taking a callback. To do that, wrap the function with the promisify higher-order function (sketched below the snippet), or import the async functions from the promises namespace:
const { appendFile } = require('fs').promises;
await appendFile('path/to/file/to/append', dataToAppend, optionalOptions);
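Alternatively, a sketch of the promisify route mentioned above:
const util = require('util');
const fs = require('fs');

// Wrap the callback-style appendFile into a promise-returning function.
const appendFileAsync = util.promisify(fs.appendFile);

await appendFileAsync('path/to/file/to/append', dataToAppend);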
I hope it'll help 😉
I wrapped the async fs.appendFile into a Promise-based function. Hope it helps others to see how this would work.
append(path, name, data) {
  return new Promise((resolve, reject) => {
    fs.appendFile(path + name, data, (err) => {
      if (!err) {
        return resolve(path + name);
      } else {
        return reject(err);
      }
    });
  });
}
For a project I need to incorporate a backend Python function with JavaScript (the main code for a chatbot). Using child processes, it seems to work (when using 'node script.js'). However, I need to access the data from the called Python function. Right now, all I am getting is the output. I tried to store it in a global variable, but it's showing as 'undefined'. Is there a way to actually access the data so I can use it outside the stdout.on handler?
This is the JavaScript code for running the Python script:
// Give a path to the QR scanner Python file
const qrScannerPath = "python/qrCodeScanner.py"
const base64Arg = "base64_2.txt"
// Provide the '.exe' python file. If python is available as an 'environment variable', then simply refer to it as 'python'
const pythonExe = "python"
// Function to convert a utf-8 array to a string
const utfConverter = function (data) {
  return String.fromCharCode.apply(String, data)
}
// let's us handle python scripts
const spawn = require("child_process").spawn
const scriptExe = spawn(pythonExe, [qrScannerPath, base64Arg])
// If all goes well, the program should execute the Python code
let counterpartyData = {}
scriptExe.stdout.on("data", function (data) {
  console.log("getting the Python script data...")
  let cp = JSON.parse(utfConverter(data))
  counterpartyData = { ...cp } // should store the data to the variable
});
console.log(counterpartyData) // shows 'undefined'
// In case our python script cannot be run, we'll get an error
scriptExe.stderr.on("data", (data) => {
  console.error("error : " + data.toString())
});

// logs error message
scriptExe.on('error', (error) => {
  console.error('error: ', error.message);
});

// Logs a message saying that our code worked perfectly fine
scriptExe.on("exit", (code) => {
  console.log("Process quit with code : " + code)
})
If I run this code with node, the output of 'counterpartyData' is undefined. However, inside the stdout handler, it actually prints out the data I want.
Furthermore, I get python import errors when running the app on Heroku :(.
Thank you in advance!!
Happy New Year and joyful greetings <3
I want to get this measurement out of the terminal and into a file, and only stdout, not stderr. Can you help me with this?
printf("Average measurement = %8.3f %s .\n", averageCalbMeasurement, units);
So here is the point I need help with: I just want it written out as a normal file.
Assuming this is the only line (and not all of stdout) you can use:
writeFile (example from https://www.geeksforgeeks.org/javascript-program-to-write-data-in-a-text-file/ )
// Requiring the fs module in which
// the writeFile function is defined.
const fs = require('fs')

// Data which will be written to the file.
let data = "Learning how to write in a file."

// Write data to 'Output.txt'.
fs.writeFile('Output.txt', data, (err) => {
  // In case of an error, throw err.
  if (err) throw err;
})
I need to pass a text file in via the terminal and then read the data from it. How can I do this?
node server.js file.txt
How do I pass in the path from the terminal, and how do I read it on the other side?
You'll want to use the process.argv array to access the command-line arguments to get the filename and the FileSystem module (fs) to read the file. For example:
// Make sure we got a filename on the command line.
if (process.argv.length < 3) {
  console.log('Usage: node ' + process.argv[1] + ' FILENAME');
  process.exit(1);
}

// Read the file and print its contents.
var fs = require('fs'),
    filename = process.argv[2];

fs.readFile(filename, 'utf8', function (err, data) {
  if (err) throw err;
  console.log('OK: ' + filename);
  console.log(data);
});
To break that down a little for you: process.argv will usually have length two, the zeroth item being the "node" interpreter and the first being the script that node is currently running; items after that were passed on the command line (a quick way to inspect argv is shown after the sample output). Once you've pulled a filename from argv, you can use the filesystem functions to read the file and do whatever you want with its contents. Sample usage would look like this:
$ node ./cat.js file.txt
OK: file.txt
This is file.txt!
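If you want to see exactly what process.argv contains for that invocation, a one-liner will do:
// Prints something like: [ '/usr/local/bin/node', '/path/to/cat.js', 'file.txt' ]
console.log(process.argv);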
[Edit] As #wtfcoder mentions, using the "fs.readFile()" method might not be the best idea because it will buffer the entire contents of the file before yielding it to the callback function. This buffering could potentially use lots of memory but, more importantly, it does not take advantage of one of the core features of node.js - asynchronous, evented I/O.
The "node" way to process a large file (or any file, really) would be to use fs.read() and process each available chunk as it is available from the operating system. However, reading the file as such requires you to do your own (possibly) incremental parsing/processing of the file and some amount of buffering might be inevitable.
Using fs with Node:
var fs = require('fs');

try {
  var data = fs.readFileSync('file.txt', 'utf8');
  console.log(data.toString());
} catch (e) {
  console.log('Error:', e.stack);
}
IMHO, fs.readFile() should be avoided because it loads ALL of the file into memory and won't call the callback until all of the file has been read.
The easiest way to read a text file is to read it line by line. I recommend a BufferedReader:
new BufferedReader("file", { encoding: "utf8" })
  .on("error", function (error) {
    console.log("error: " + error);
  })
  .on("line", function (line) {
    console.log("line: " + line);
  })
  .on("end", function () {
    console.log("EOF");
  })
  .read();
For complex data structures like .properties or json files you need to use a parser (internally it should also use a buffered reader).
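For the JSON case, line-by-line reading doesn't help, so here is a small whole-file sketch (assuming config.json exists and holds valid JSON):
const fs = require('fs');

fs.readFile('config.json', 'utf8', function (err, text) {
  if (err) throw err;
  // Parse the whole document once it has been read.
  const config = JSON.parse(text);
  console.log(config);
});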
You can use a read stream and pipe to read the file line by line, without reading the whole file into memory at once.
var fs = require('fs'),
    es = require('event-stream'),
    os = require('os');

var s = fs.createReadStream(path)
  .pipe(es.split())
  .pipe(es.mapSync(function (line) {
    // pause the readstream
    s.pause();
    console.log("line:", line);
    s.resume();
  })
  .on('error', function (err) {
    console.log('Error:', err);
  })
  .on('end', function () {
    console.log('Finish reading.');
  })
);
I am posting a complete example which I finally got working. Here I am reading in a file rooms/rooms.txt from a script rooms/rooms.js
var fs = require('fs');
var path = require('path');
var readStream = fs.createReadStream(path.join(__dirname, '../rooms') + '/rooms.txt', 'utf8');
let data = '';
readStream.on('data', function (chunk) {
  data += chunk;
}).on('end', function () {
  console.log(data);
});
The async way of life:
#! /usr/bin/node
const fs = require('fs');
function readall (stream)
{
return new Promise ((resolve, reject) => {
const chunks = [];
stream.on ('error', (error) => reject (error));
stream.on ('data', (chunk) => chunk && chunks.push (chunk));
stream.on ('end', () => resolve (Buffer.concat (chunks)));
});
}
function readfile (filename)
{
return readall (fs.createReadStream (filename));
}
(async () => {
let content = await readfile ('/etc/ssh/moduli').catch ((e) => {})
if (content)
console.log ("size:", content.length,
"head:", content.slice (0, 46).toString ());
})();