nodejs prepending to a file - javascript

For Node.js, what is the best way to prepend to a file in a way SIMILAR to
fs.appendFile(path.join(__dirname, 'app.log'), 'appendme', 'utf8')
Ideally, the solution would be asynchronous, so I can build a log that I keep pushing onto from the top of the file.

This solution isn't mine and I don't know where it's from, but it works.
const fs = require('fs');

// Read the existing contents, reopen the file truncated, then write the
// prepended text followed by the original data.
const data = fs.readFileSync('message.txt');
const fd = fs.openSync('message.txt', 'w+');
const insert = Buffer.from("text to prepend \n");
fs.writeSync(fd, insert, 0, insert.length, 0);
fs.writeSync(fd, data, 0, data.length, insert.length);
fs.closeSync(fd);

It is impossible to add to the beginning of a file in place. See this question for the same problem in C, or this one for C#.
I suggest you do your logging the conventional way (that is, append to the end of the file).
Otherwise, there is no way around reading the whole file, adding the text to the start, and writing it all back, which gets costly very quickly as the file grows.
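If you accept that cost, a minimal asynchronous sketch of the read-prepend-rewrite approach could look like the following (the app.log path mirrors the question; the helper name is mine):
const fs = require('fs').promises;
const path = require('path');

// Read the whole file, put the new text in front, and write everything back.
// This rewrites the entire file on every call, so it gets slow for large logs.
async function prependToFile(filePath, text) {
  let existing = '';
  try {
    existing = await fs.readFile(filePath, 'utf8');
  } catch (err) {
    if (err.code !== 'ENOENT') throw err; // a missing file just means "start empty"
  }
  await fs.writeFile(filePath, text + existing, 'utf8');
}

prependToFile(path.join(__dirname, 'app.log'), 'prependme\n')
  .catch(console.error);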

It seems it is indeed possible with https://www.npmjs.com/package/prepend-file (note that such modules typically still read and rewrite the file under the hood).

Here is an example of how to prepend text to files using gulp and a custom-built function.
var through = require('through2');
var gulp = require('gulp');

gulp.src('somefile.js')
  .pipe(insert('text to prepend with'))
  .pipe(gulp.dest('Destination/Path/'));

function insert(text) {
  var prefixText = Buffer.from(text + "\n\n"); // allocate ahead of time

  // creating a stream through which each file will pass
  var stream = through.obj(function (file, enc, cb) {
    if (file.isBuffer()) {
      file.contents = Buffer.concat([prefixText, file.contents]);
    }
    if (file.isStream()) {
      this.emit('error', new Error('stream files are not supported for insertion, they must be buffered'));
      return cb();
    }
    // make sure the file goes through the next gulp plugin
    this.push(file);
    // tell the stream engine that we are done with this file
    cb();
  });

  // returning the file stream
  return stream;
}
Source: cole_gentry_github_dealingWithStreams

It's possible using the prepend-file node module. Do the following:
npm i prepend-file -S
Import the prepend-file module in your code.
Example:
const prependFile = require('prepend-file');

let file = 'first.txt';
let data = 'data to prepend\n';
prependFile(file, data, () => {
  console.log('file prepended successfully');
});


Save a console.log to a text file on local machine

I have a JavaScript file (file.js) that contains the following code:
console.log("Hello World!")
When I run this code in my terminal with node file.js, I get the following output:
Hello World!
If I wanted to save this to a file programmatically (not right-clicking and clicking save), does anyone know how I can do that?
The only solution I could find on the internet was using JSON.stringify("Hello World!"), but as far as I can tell that doesn't do anything here (it doesn't even output an error).
Reference: How to save the output of a console.log(object) to a file?
You'll need to overwrite console.log with your own implementation that writes the values it's called with to a file.
const { appendFileSync } = require('fs');

const origConsole = globalThis.console;
const console = {
  log: (...args) => {
    appendFileSync('./logresults.txt', args.join('\n') + '\n');
    return origConsole.log.apply(origConsole, args);
  }
};

console.log("Hello World!");
console.log("another line", "yet another line");
If you call this frequently enough that the synchronous writes become a problem, you could switch to appendFile or fs.promises.appendFile so the writes don't block, and keep a queue of pending values to write.
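As a rough sketch of that queued approach (the queue and flush logic here are illustrative, not a drop-in library):
const { appendFile } = require('fs').promises;

const LOG_PATH = './logresults.txt';
const origConsole = globalThis.console;
const queue = [];
let flushing = false;

// Drain the queue with one appendFile call per flush so console.log
// itself never blocks on disk I/O.
async function flush() {
  if (flushing || queue.length === 0) return;
  flushing = true;
  const batch = queue.splice(0).join('\n') + '\n';
  try {
    await appendFile(LOG_PATH, batch);
  } finally {
    flushing = false;
    if (queue.length > 0) flush(); // handle lines that arrived while writing
  }
}

const console = {
  log: (...args) => {
    queue.push(args.join(' '));
    flush();
    origConsole.log.apply(origConsole, args);
  }
};

console.log("Hello World!");
console.log("another line", "yet another line");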
I suppose your console.log() output is some kind of text you get from somewhere, so I made a random-string generator function just as an example.
You just need to use the fs module from Node, then write the file to your system.
Like this:
const fs = require('fs');

const randomString = () => {
  return Math.random().toString(36).substring(7);
};

const createFile = (fileName, content) => {
  fs.writeFile(fileName, content, (err) => {
    if (err) throw err;
    console.log('The file has been saved!');
  });
};

createFile('test.txt', randomString());
Just note that if you're writing text in a loop, you may want to append '\n' at the end of each iteration to break the lines in your text file.

Node Js fs.writeFile changes existing object instead of pushing data to it [duplicate]

I am trying to append a string to a log file. However, writeFile erases the content each time before writing the string.
fs.writeFile('log.txt', 'Hello Node', function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
}); // => log.txt erased, contains only 'Hello Node'
Any idea how to do this the easy way?
For occasional appends, you can use appendFile, which creates a new file handle each time it's called:
Asynchronously:
const fs = require('fs');
fs.appendFile('message.txt', 'data to append', function (err) {
  if (err) throw err;
  console.log('Saved!');
});
Synchronously:
const fs = require('fs');
fs.appendFileSync('message.txt', 'data to append');
But if you append repeatedly to the same file, it's much better to reuse the file handle.
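A minimal sketch of reusing one handle via fs.promises (the file name and the loop are only for illustration):
const fsp = require('fs').promises;

async function main() {
  // Open once with the append flag and keep the handle for all writes.
  const handle = await fsp.open('message.txt', 'a');
  try {
    for (let i = 0; i < 1000; i++) {
      await handle.appendFile(`line ${i}\n`);
    }
  } finally {
    await handle.close(); // handles you open yourself must be closed by you
  }
}

main().catch(console.error);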
When you want to write to a log file, i.e. append data to the end of a file repeatedly, never use appendFile. appendFile opens a new file handle for each piece of data you add, and after a while you get a beautiful EMFILE error.
I can add that appendFile is not easier to use than a WriteStream.
Example with appendFile:
console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  fs.appendFile("append.txt", index + "\n", function (err) {
    if (err) console.log(err);
  });
});
console.log(new Date().toISOString());
Up to about 8,000 appends work on my computer; beyond that you obtain this:
{ Error: EMFILE: too many open files, open 'C:\mypath\append.txt'
at Error (native)
errno: -4066,
code: 'EMFILE',
syscall: 'open',
path: 'C:\\mypath\\append.txt' }
Moreover, appendFile writes whenever a handle becomes available, so your log lines will not appear in timestamp order. You can test this with the example above: put 1000 in place of 10000 and the order will be random, depending on access to the file.
If you want to append to a file, you must use a writable stream like this:
var stream = fs.createWriteStream("append.txt", { flags: 'a' });

console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  stream.write(index + "\n");
});
console.log(new Date().toISOString());
stream.end();
You end it when you want. You are not even required to call stream.end(); the default option is autoClose: true, so the file will be closed when your process ends and you avoid opening too many files.
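One more caveat for heavy writes: stream.write returns false when the internal buffer is full. A hedged sketch of respecting that backpressure (the writeLine helper is my own naming):
const fs = require('fs');

const stream = fs.createWriteStream("append.txt", { flags: 'a' });

// Wait for 'drain' when the internal buffer is full instead of
// queueing an unbounded amount of data in memory.
function writeLine(line) {
  return new Promise((resolve) => {
    if (stream.write(line + "\n")) resolve();
    else stream.once('drain', resolve);
  });
}

(async () => {
  for (let i = 0; i < 10000; i++) {
    await writeLine(String(i));
  }
  stream.end();
})();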
Unlike appendFile, which opens a new file descriptor for every write, createWriteStream keeps a single descriptor open for all writes; calling logStream.end() asks Node to close it right after the final write.
var fs = require('fs');
var logStream = fs.createWriteStream('log.txt', {flags: 'a'});
// use {flags: 'a'} to append and {flags: 'w'} to erase and write a new file
logStream.write('Initial line...');
logStream.end('this is the end line');
Besides appendFile, you can also pass a flag in writeFile to append data to an existing file.
fs.writeFile('log.txt', 'Hello Node', { 'flag': 'a' }, function (err) {
  if (err) {
    return console.error(err);
  }
});
By passing flag 'a', data will be appended at the end of the file.
Use the a+ flag to append to and read a file; like a, it creates the file if it doesn't exist:
fs.writeFile('log.txt', 'Hello Node', { flag: "a+" }, (err) => {
  if (err) throw err;
  console.log('The file is created if not existing!!');
});
Docs: https://nodejs.org/api/fs.html#fs_file_system_flags
You need to open it, then write to it.
var fs = require('fs'),
    str = 'string to append to file';

fs.open('filepath', 'a', 0o666, function (e, id) {
  fs.write(id, str, null, 'utf8', function () {
    fs.close(id, function () {
      console.log('file closed');
    });
  });
});
Here are a few links that help explain the parameters:
open
write
close
EDIT: This answer is no longer valid, look into the new fs.appendFile method for appending.
My approach is rather special. I basically use the WriteStream solution but without actually 'closing' the fd with stream.end(). Instead I use cork/uncork. This has the benefit of low RAM usage (if that matters to anyone) and I believe it's safer to use for logging/recording (my original use case).
The following is a pretty simple example. Note that I just added a pseudo loop for demonstration; in production code I am waiting for websocket messages.
var stream = fs.createWriteStream("log.txt", { flags: 'a' });

while (true) { // pseudo loop for demonstration only
  stream.cork();
  stream.write("some content to log");
  process.nextTick(() => stream.uncork());
}
uncork will flush the data to the file in the next tick.
In my scenario there are peaks of up to ~200 writes per second in various sizes. During night time however only a handful writes per minute are needed. The code is working super reliable even during peak times.
Node.js 0.8 has fs.appendFile:
fs.appendFile('message.txt', 'data to append', (err) => {
  if (err) throw err;
  console.log('The "data to append" was appended to file!');
});
Documentation
Using fs.appendFile or fsPromises.appendFile is the fastest and most robust option when you need to append something to a file.
In contrast to what some of the answers suggest, if a file path is supplied to the appendFile function, it closes the file by itself. Only when you pass in a FileHandle that you get from something like fsPromises.open() do you have to take care of closing it.
I tried it with over 50,000 lines in a file.
Examples :
(async () => {
  // using appendFile.
  const fsp = require('fs').promises;
  await fsp.appendFile(
    '/path/to/file', '\r\nHello world.'
  );

  // using apickfs; handles error and edge cases better.
  const apickFileStorage = require('apickfs');
  await apickFileStorage.writeLines(
    '/path/to/directory/', 'filename', 'Hello world.'
  );
})();
Ref: https://github.com/nodejs/node/issues/7560
If you want an easy and stress-free way to write logs line by line in a file, then I recommend fs-extra:
const os = require('os');
const fs = require('fs-extra');

const file = 'logfile.txt';
const options = { flag: 'a' };

async function writeToFile(text) {
  await fs.outputFile(file, `${text}${os.EOL}`, options);
}

writeToFile('First line');
writeToFile('Second line');
writeToFile('Third line');
writeToFile('Fourth line');
writeToFile('Fifth line');
Tested with Node v8.9.4.
fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'a')
fs.writeSync(fd, 'contents to append')
fs.closeSync(fd)
I offer this suggestion only because control over the open flags is sometimes useful; for example, you may want to truncate an existing file first and then append a series of writes to it, in which case use the 'w' flag when opening the file and don't close it until all the writes are done. Of course, appendFile may be what you're after :-)
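For instance, a small sketch of that truncate-then-append pattern (the file name and contents are arbitrary):
const fs = require('fs');
const path = require('path');

// 'w' truncates the file on open; subsequent writeSync calls then append
// to the same descriptor until it is closed.
const fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'w');
fs.writeSync(fd, 'header line\n');
fs.writeSync(fd, 'first entry\n');
fs.writeSync(fd, 'second entry\n');
fs.closeSync(fd);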
fs.open('log.txt', 'a', function (err, log) {
  if (err) throw err;
  fs.writeFile(log, 'Hello Node', function (err) {
    if (err) throw err;
    fs.close(log, function (err) {
      if (err) throw err;
      console.log('It\'s saved!');
    });
  });
});
Using the jfile package:
myFile.text+='\nThis is new line to be appended'; //myFile=new JFile(path);
Try using flags: 'a' to append data to a file:
var stream = fs.createWriteStream("udp-stream.log", { 'flags': 'a' });
stream.once('open', function (fd) {
  stream.write(msg + "\r\n");
});
Here's a full script. Fill in your file names and run it and it should work!
var fs = require('fs');

function ReadAppend(file, appendFile) {
  fs.readFile(appendFile, function (err, data) {
    if (err) throw err;
    console.log('File was read');

    fs.appendFile(file, data, function (err) {
      if (err) throw err;
      console.log('The "data to append" was appended to file!');
    });
  });
}

// edit this with your file names
var file = 'name_of_main_file.csv';
var appendFile = 'name_of_second_file_to_combine.csv';
ReadAppend(file, appendFile);
const inovioLogger = (logger = "") => {
  const log_file = fs.createWriteStream(__dirname + `/../../inoviopay-${new Date().toISOString().slice(0, 10)}.log`, { flags: 'a' });
  const log_stdout = process.stdout;
  log_file.write(logger + '\n');
};
In addition to denysonique's answer, sometimes you want the asynchronous form of appendFile and other async methods in Node.js to return a promise instead of taking a callback. To do that, wrap the function with the promisify higher-order function or import the async functions from the promises namespace:
const { appendFile } = require('fs').promises;
await appendFile('path/to/file/to/append', dataToAppend, optionalOptions);
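Alternatively, a minimal sketch of the promisify route mentioned above (the path and data are placeholders):
const util = require('util');
const fs = require('fs');

// Wrap the callback-based appendFile into a Promise-returning function.
const appendFileAsync = util.promisify(fs.appendFile);

(async () => {
  await appendFileAsync('path/to/file/to/append', 'data to append\n');
})().catch(console.error);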
I hope it'll help 😉
I wrapped the async fs.appendFile into a Promise-based function. Hope it helps others to see how this would work.
append(path, name, data) {
  return new Promise((resolve, reject) => {
    try {
      fs.appendFile(path + name, data, (err) => {
        if (!err) {
          return resolve(path + name);
        } else {
          return reject(err);
        }
      });
    } catch (err) {
      return reject(err);
    }
  });
}

Electron - write file before open save dialog

I'm using Electron to develop an app. After some encryption operations are done, I need to show a dialog asking the user where to save the file. The filename I want to give the file is a random hash, but I'm having no success with that either. I'm trying with this code, but the file is not saved. How can I fix this?
const downloadPath = app.getPath('downloads')

ipcMain.on('encryptFiles', (event, data) => {
  let output = [];
  const password = data.password;

  data.files.forEach((file) => {
    const buffer = fs.readFileSync(file.path);
    const dataURI = dauria.getBase64DataURI(buffer, file.type);
    const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
    output.push(encrypted);
  })

  const filename = hash.createHash('md5').toString('hex');
  console.log(filename)

  const response = output.join(' :: ');

  dialog.showSaveDialog({ title: 'Save encrypted file', defaultPath: downloadPath }, () => {
    fs.writeFile(`${filename}.mfs`, response, (err) => console.log(err))
  })
})
The problem you're experiencing is resulting from the asynchronous nature of Electron's UI functions: They do not take callback functions, but return promises instead. Thus, you do not have to pass in a callback function, but rather handle the promise's resolution. Note that this only applies to Electron >= version 6. If you however run an older version of Electron, your code would be correct -- but then you should really update to a newer version (Electron v6 was released well over a year ago).
Adapting your code as below can be a starting point for solving your problem. However, since you do not state how you generate the hash (where does hash.createHash come from? Did you forget to declare/import hash? Did you forget to pass a message string? Are you using hash as an alias for Node.js' crypto module?), it is impossible at this time to debug why you get no output from console.log(filename) (I assume that is what you mean by "the random filename will not be created"). Once you provide more details on this, I'd be happy to update this answer accordingly.
As for the default filename: As per the Electron documentation, you can pass a file path into dialog.showSaveDialog () to provide the user with a default filename.
The file type extension you're using should also actually be passed with the file extension into the save dialog. Also passing this file extension as a filter into the dialog will prevent users from selecting any other file type, which is ultimately what you're also currently doing by appending it to the filename.
Also, you could utilise CryptoJS for the filename generation: given some arbitrary string, which could really be random bytes, you could do filename = CryptoJS.MD5('some text here') + '.mfs';. However, choose the input string wisely. MD5 has been broken and should no longer be used to store secrets, and using any known information crucial to the encryption of the files you're storing (such as data.password) is inherently insecure. There are some good examples of how to create random strings in JavaScript around the internet, along with this answer here on SO.
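For example, a hedged sketch of generating such a random filename with Node's built-in crypto module instead of an MD5 of known data:
const crypto = require('crypto');

// 16 random bytes -> 32 hex characters; unrelated to the password or file contents.
const filename = crypto.randomBytes(16).toString('hex') + '.mfs';
console.log(filename);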
Taking all these issues into account, one might end up with the following code:
const downloadPath = app.getPath('downloads'),
      path = require('path');

ipcMain.on('encryptFiles', (event, data) => {
  let output = [];
  const password = data.password;

  data.files.forEach((file) => {
    const buffer = fs.readFileSync(file.path);
    const dataURI = dauria.getBase64DataURI(buffer, file.type);
    const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
    output.push(encrypted);
  })

  // not working:
  // const filename = hash.createHash('md5').toString('hex') + '.mfs';

  // alternative requiring more research on your end
  const filename = CryptoJS.MD5('replace me with some random bytes') + '.mfs';
  console.log(filename);

  const response = output.join(' :: ');

  dialog.showSaveDialog(
    {
      title: 'Save encrypted file',
      defaultPath: path.format({ dir: downloadPath, base: filename }), // construct a proper path
      filters: [{ name: 'Encrypted File (*.mfs)', extensions: ['mfs'] }] // filter the possible files
    }
  ).then((result) => {
    if (result.canceled) return; // discard the result altogether; user has clicked "cancel"
    else {
      var filePath = result.filePath;
      if (!filePath.endsWith('.mfs')) {
        // This is an additional safety check which should not actually trigger.
        // However, generally appending a file extension to a filename is not a
        // good idea, as they would be (possibly) doubled without this check.
        filePath += '.mfs';
      }
      fs.writeFile(filePath, response, (err) => console.log(err))
    }
  }).catch((err) => {
    console.log(err);
  });
})

NodeJS stream parse and write json line to line upon Promise result

I have a large json file that looks like that:
[
  {"name": "item1"},
  {"name": "item2"},
  {"name": "item3"}
]
I want to stream this file (pretty easy so far), run an asynchronous function (that returns a promise) for each line, and edit that line when the promise resolves or rejects.
The resulting file could look like this:
[
  {"name": "item1", "response": 200},
  {"name": "item2", "response": 404},
  {"name": "item3"} // not processed yet
]
I do not wish to create another file, I want to edit on the fly the SAME FILE (if possible!).
Thanks :)
I don't really answer the question, but I don't think it can be answered in a satisfactory way anyway, so here are my 2 cents.
I assume that you know how to stream line by line, and run the function, and that the only problem you have is editing the file that you are reading from.
Consequences of inserting
It is not possible to natively insert data into the middle of a file (which is what you want to do by changing the JSON live). A file can only grow at its end.
So inserting 10 bytes of data at the beginning of a 1GB file means that you need to write 1GB to the disk (to move all the data 10 bytes further).
Your filesystem does not understand JSON, and just sees that you are inserting bytes in the middle of a big file so this is going to be very slow.
So, yes it is possible to do.
Write a wrapper over the file API in NodeJS with an insert() method.
Then write some more code that figures out where to insert bytes into a JSON file without loading the whole file, and without producing invalid JSON at the end.
Now I would not recommend it :)
=> Read this question: Is it possible to prepend data to an file without rewriting?
Why do it then?
I assume that you want to either:
Be able to kill your process at any time, and easily resume work by reading the file again.
Retry partially treated files to fill only the missing bits.
First solution: Use a database
Abstracting the work needed to live-edit files at arbitrary positions is the sole reason databases exist.
They all exist only to abstract the magic that is behind UPDATE mytable SET name = 'a_longer_name_that_the_name_that_was_there_before' where name = 'short_name'.
Have a look at LevelUP/Down, sqlite, etc...
They will abstract all the magic that needs to be done in your JSON file!
Second solution: Use multiple files
When you stream your file, write two new files!
One that contains the current position in the input file and the lines that need to be retried
The other one contains the expected result.
You will also be able to kill your process at any time and restart it.
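A rough sketch of that two-file bookkeeping (the file names, the JSON line format, and the stand-in for the real async work are all assumptions; note that for await over readline needs a reasonably recent Node version):
const fs = require('fs');
const readline = require('readline');

// progress.txt records which line we got to; results.ndjson collects output.
// On restart, skip everything up to the recorded line number.
async function processFile(inPath) {
  let startAt = 0;
  try {
    startAt = parseInt(fs.readFileSync('progress.txt', 'utf8'), 10) || 0;
  } catch (e) { /* no progress recorded yet */ }

  const out = fs.createWriteStream('results.ndjson', { flags: 'a' });
  const rl = readline.createInterface({ input: fs.createReadStream(inPath) });

  let lineNo = 0;
  for await (const line of rl) {
    lineNo++;
    if (lineNo <= startAt) continue; // already handled in a previous run
    out.write(JSON.stringify({ line, response: 200 }) + '\n'); // stand-in for the real async work
    fs.writeFileSync('progress.txt', String(lineNo));
  }
  out.end();
}

processFile('./data.json').catch(console.error);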
According to this answer, writing to the same file while reading from it is not reliable. As a commenter there says, it is better to write to a temporary file, then delete the original and rename the temp file over it.
To create a stream of lines you can use byline. Then for each line, apply some operation and pipe it out to the output file.
Something like this:
var fs = require('fs');
var stream = require('stream');
var util = require('util');
var LineStream = require('byline').LineStream;

function Modify(options) {
  stream.Transform.call(this, options);
}
util.inherits(Modify, stream.Transform);

Modify.prototype._transform = function(chunk, encoding, done) {
  var self = this;
  setTimeout(function() {
    // your modifications here, note that the exact regex depends on
    // your json format and is probably the most brittle part of this
    var modifiedChunk = chunk.toString();
    if (modifiedChunk.search('response:[^,}]+') === -1) {
      modifiedChunk = modifiedChunk
        .replace('}', ', response: ' + new Date().getTime() + '}') + '\n';
    }

    self.push(modifiedChunk);
    done();
  }, Math.random() * 2000 + 1000); // to simulate an async modification
};

var inPath = './data.json';
var outPath = './out.txt';

fs.createReadStream(inPath)
  .pipe(new LineStream())
  .pipe(new Modify())
  .pipe(fs.createWriteStream(outPath))
  .on('close', function() {
    // replace input with output
    fs.unlink(inPath, function() {
      fs.rename(outPath, inPath);
    });
  });
Note that the above results in only one async operation happening at a time. You could also save the modifications to an array and once all of them are done write the lines from the array to a file, like this:
var fs = require('fs');
var stream = require('stream');
var LineStream = require('byline').LineStream;

var modifiedLines = [];
var modifiedCount = 0;
var inPath = './data.json';

var allModified = new Promise(function(resolve, reject) {
  fs.createReadStream(inPath).pipe(new LineStream()).on('data', function(chunk) {
    modifiedLines.length++;
    var index = modifiedLines.length - 1;
    setTimeout(function() {
      // your modifications here
      var modifiedChunk = chunk.toString();
      if (modifiedChunk.search('response:[^,}]+') === -1) {
        modifiedChunk = modifiedChunk
          .replace('}', ', response: ' + new Date().getTime() + '}');
      }

      modifiedLines[index] = modifiedChunk;
      modifiedCount++;
      if (modifiedCount === modifiedLines.length) {
        resolve();
      }
    }, Math.random() * 2000 + 1000);
  });
}).then(function() {
  fs.writeFile(inPath, modifiedLines.join('\n'));
}).catch(function(reason) {
  console.error(reason);
});
If, instead of lines, you wish to stream chunks of valid JSON, which would be a more robust approach, take a look at JSONStream.
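A minimal sketch with JSONStream (assuming the npm JSONStream package, and that each array element fits in memory):
const fs = require('fs');
const { Transform } = require('stream');
const JSONStream = require('JSONStream');

// Tag each parsed element; a stand-in for the real async work.
const addResponse = new Transform({
  objectMode: true,
  transform(obj, _enc, done) {
    obj.response = 200;
    done(null, obj);
  },
});

fs.createReadStream('./data.json')
  .pipe(JSONStream.parse('*'))     // emits one object per top-level array element
  .pipe(addResponse)
  .pipe(JSONStream.stringify())    // re-wraps the objects into a valid JSON array
  .pipe(fs.createWriteStream('./out.json'));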
As mentioned in the comment, the file you have is not proper JSON, although it is valid in JavaScript. In order to generate proper JSON, JSON.stringify() could be used. I think nonstandard JSON would make life difficult for others to parse as well, so I would recommend producing a new output file instead of keeping the original one.
However, it is still possible to parse the original file line by line. This works via eval('(' + procline + ')');, though it is not secure to feed external data into Node.js like this.
const fs = require('fs');
const readline = require('readline');

const fr = fs.createReadStream('file1');
const rl = readline.createInterface({
  input: fr
});

rl.on('line', function (line) {
  if (line.match(new RegExp("\{name"))) {
    var procline = "";
    if (line.trim().split('').pop() === ',') {
      procline = line.trim().substring(0, line.trim().length - 1);
    }
    else {
      procline = line.trim();
    }
    var lineObj = eval('(' + procline + ')');
    lineObj.response = 200;
    console.log(JSON.stringify(lineObj));
  }
});
The output would be like this:
{"name":"item1","response":200}
{"name":"item2","response":200}
{"name":"item3","response":200}
This is line-delimited JSON (LDJSON), which can be useful for streaming, since it does not need the leading and trailing [ and ] or the separating commas. There is an ldjson-stream package for it as well.
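A hedged sketch of consuming such LDJSON output with nothing but core modules (the file name is assumed):
const fs = require('fs');
const readline = require('readline');

// Each line is a complete JSON document, so it can be parsed independently.
const rl = readline.createInterface({
  input: fs.createReadStream('items.ldjson'),
});

rl.on('line', (line) => {
  if (!line.trim()) return; // skip blank lines
  const item = JSON.parse(line);
  console.log(item.name, item.response);
});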

Create a gulp plugin with a stream

I created a plugin that writes JSON data to a JSON file.
But I don't understand why I should send my JSON object down the pipe instead of writing the file directly in my plugin.
I want to use my plugin with this syntax:
gulp.task('js-hash', function () {
  // Get all js in redis
  gulp.src('./build/js/**/*.js')
    .pipe(getHashFile('/build/js/'))
    .pipe(gulp.dest('./build/js/hash.json'));
});
And not this:
gulp.task('js-hash', function () {
  // Get all js in redis
  gulp.src('./build/js/**/*.js')
    .pipe(getHashFile('./build/js/hash.json', '/build/js/'));
});
This is my plugin:
var through = require('through2');
var gutil = require('gulp-util');
var crypto = require('crypto');
var fs = require('fs');

var PluginError = gutil.PluginError;

// Consts
const PLUGIN_NAME = 'get-hash-file';

var json = {};

function getHashFile(filename, basename) {
  if (!filename) {
    throw PluginError(PLUGIN_NAME, "Missing filename !");
  }

  // Creating a stream through which each file will pass
  var stream = through.obj(function (file, enc, callback) {
    if (file.isNull()) {
      this.push(file); // Do nothing if no contents
      return callback();
    }

    if (file.isBuffer()) {
      var hash = crypto.createHash('sha256').update(String(file.contents)).digest('hex');
      json[file.path.replace(file.cwd + basename, '')] = hash;
      return callback();
    }

    if (file.isStream()) {
      this.emit('error', new PluginError(PLUGIN_NAME, 'Stream not supported!'));
      return callback();
    }
  }).on('finish', function () {
    fs.writeFile(filename, JSON.stringify(json), function (err) {
      if (err) {
        throw err;
      }
    });
  });

  // returning the file stream
  return stream;
}

// Exporting the plugin main function
module.exports = getHashFile;
Do you have an idea?
Nothing prevents you from doing this... besides not respecting the plugin guidelines!
Users actually assume a plugin will stream files and that they can pipe them to other plugins.
If I get your code right, you're trying to generate a file that contains all sha hashes of inbound files. Why not let users take this file and pipe it to other plugins? You'd be surprised what people could do.
While this question looks a bit opinion-based, you could definitely put the focus on how to deal with files that may not belong to the main stream of files. Issues like this can be found in many plugins; for example, gulp-uglify authors are wondering how they can add source-maps without mixing js and source map downstream.
