NodeJS - Wait Until Streaming Multiple Files Is Complete Before Continuing Code - javascript

I'm new to JavaScript and NodeJS. I'm trying to read multiple CSV files before doing some processing on them. My current issue is that when I run the code, it tries to execute the processing before the reading of the files is complete. I would like to load both CSVs before I start doing any processing on them.
Could someone explain why this happens and how I can solve the problem in JavaScript/NodeJS?
function readCSV(path) {
    var events = [];
    fs.createReadStream(path)
        .pipe(csv())
        .on('data', (row) => {
            events.push(row);
        })
        .on('end', () => {
            console.log('CSV file successfully processed. Length: ' + events.length);
        });
    return events;
}
function app() {
    var file_list = listFiles(folder_path);
    for (let i = 0; i < file_list.length; i++) {
        const file = file_list[i];
        var events = readCSV(file);
    }
    processCSV(events); // Some processing
}
app();
Any help would be great and any explanation on how I can control when code is executed would be much appreciated.

Sorry, your code cannot be run as posted, so I can only answer with untested code.
My current issue is when I run the code it tries to execute the processing before the reading of the file is complete.
The main problem is that fs.createReadStream doesn't read the file; it asks the file system to start reading and calls your callbacks as chunks arrive, so the 'end' event fires much later, after readCSV has already completed and returned an empty result.
Your code was written as if you expected a synchronous answer. You can make it work that way with synchronous methods like fs.readFileSync.
How do you fix it the asynchronous way? Either write the CSV processing in the 'end' callback, or use promises.
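For illustration, a minimal untested sketch of the callback option (the onDone parameter is hypothetical, not part of the original code):

function readCSV(path, onDone) {
    var events = [];
    fs.createReadStream(path)
        .pipe(csv())
        .on('data', (row) => events.push(row))
        .on('end', () => onDone(events)); // hand the result over only once reading ends
}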
Promises are much simpler and more linear.
First, make readCSV return a Promise.
function readCSV(path) { // returns Promise<any[]>
    return new Promise((resolve) => {
        var events = [];
        fs.createReadStream(path)
            .pipe(csv())
            .on('data', (row) => {
                // this code is called in the future
                events.push(row);
            })
            .on('end', () => {
                // this code is called in the future too
                console.log('CSV file successfully processed. Length: ' + events.length);
                resolve(events); // return the parsed CSV result
            });
    });
}
Then, in the main app, use Promise.all to wait for all the file-reading promises.
function app() {
    // I don't know what listFiles is;
    // I assume it returns a synchronous result
    var file_list = listFiles(folder_path);
    const dataPromises = [];
    for (let i = 0; i < file_list.length; i++) {
        const file = file_list[i];
        // launch reading
        dataPromises.push(readCSV(file));
    }
    Promise.all(dataPromises).then(result => {
        // this code will be called in the future, after all readCSV promises have called resolve(..)
        for (const events of result) {
            processCSV(events);
        }
    });
}
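As a side note, the same flow reads even more linearly with async/await; here is a minimal sketch, assuming the same listFiles helper and csv() parser as above:

async function app() {
    const file_list = listFiles(folder_path);
    // start all reads at once, then wait until every file has been parsed
    const results = await Promise.all(file_list.map(readCSV));
    for (const events of results) {
        processCSV(events);
    }
}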

Related

How can I insert objects in SQL Server running a function in a loop? ConnectionError: .connect can not be called on a Connection in `Connecting` state

I'm working on a NodeJS project, and I decided to change the way I was doing one part of it because it wasn't working; let me try to explain.
I need to insert data into a SQL Server DB, so I wrote a function insertOffice(). This function opens a connection using Tedious, then fetches data from a URL using values from an array data2 to load coords, builds an object from those coords, and then inserts that object into the DB. When inserting only one part of my data2 array it works; by only sending data2[0] it adds:
{
    latjson: 1,
    lonjson: 1,
    idoficina: "1",
}
But I want to insert both parts of my array, changing data2[0] to data2[index], to be able to insert the whole array. So I tried creating another function functionLooper() that loops insertOffice() to insert the data from my array data2.
I built this little snippet to learn how to loop a function; it prints index, which is the value I use for bringing in idoficina.
As you can see, functionLooper() runs the code twice so it can read the full data2 array. I built my full code using this same logic:
function insertOffice(index) {
    console.log(index);
}

function functionLooper() {
    for (let i = 0; i < 5; i++) {
        let response = insertOffice(i);
    }
}

functionLooper();
This prints:
0
1
2
3
4
So my code is supposed to send index.
I'm expecting my code to loop insertOffice() and be able to insert my objects; the issue is that this doesn't seem to work, as I am getting this error:
C:\...\node_modules\tedious\lib\connection.js:993
throw new _errors.ConnectionError('`.connect` can not be called on a Connection in `' + this.state.name + '` state.');
^
ConnectionError: `.connect` can not be called on a Connection in `Connecting` state.
This is my code:
var config = {
    ....
};

const data2 = [
    ...
];

var connection = new Connection(config);

function insertOffice(index) {
    console.log(index);
    connection.on("connect", function (err) {
        console.log("Successful connection");
    });
    connection.connect();

    const request = new Request(
        "EXEC SPInsert #Data1, ... ",
        function (err) {
            if (err) {
                console.log("Couldn't insert, " + err);
            } else {
                console.log("Inserted");
            }
        }
    );

    console.log(myObject.Id_Oficina);
    request.addParameter("Data1", TYPES.SmallInt, myObject.Id_Oficina);

    request.on("row", function (columns) {
        columns.forEach(function (column) {
            if (column.value === null) {
                console.log("NULL");
            } else {
                console.log("Product id of inserted item is " + column.value);
            }
        });
    });

    request.on("requestCompleted", function () {
        connection.close();
    });

    connection.execSql(request);
}

function functionLooper() {
    for (let i = 0; i < 2; i++) {
        let response = insertOffice(i);
    }
}

functionLooper();
I do not know if this is the right way to do it (looping the inserting function insertOffice() twice). If you know a better way to do it and could show me an example using code similar to mine, I would really appreciate it.
You're approaching an asynchronous problem as if it were a synchronous one. You're also making your life a bit harder by mixing event-based async tasks with promise-based ones.
For example, connection.connect() is asynchronous (meaning it doesn't finish all its work before the next line of code executes); it is only done when the connection emits the connect event. So the processing of your data should not start until that event has fired.
The requests in your loop are not running one at a time but all at the same time, because fetch() returns a promise (asynchronous) and the loop doesn't wait for it to complete before the next iteration. In some cases a fetch may even finish before the database connection is ready, meaning execution moves on to DB requests before the connection to the database is established.
To keep your code as manageable as possible, you should aim to "promisify" the connection / requests so that you can write an entirely promise-based program, rather than mixing promises and events (which is pretty tricky to manage, though possible).
For example:
const connection = new Connection(config);

// turn the connect event into a promise
function connect() {
    return new Promise((resolve, reject) => {
        connection.once('connect', (err) => err ? reject(err) : resolve(connection));
        connection.connect();
    });
}

// insert your data once the connection is ready and then close it when all the work is done
function insertOffices() {
    connect().then((conn) => {
        // connection is ready; I can do what I want
        // NB: Make sure you return a promise here, otherwise the connection.close() call will fire before it's done
    }).then(() => {
        connection.close();
    });
}
The same approach can be taken to "promisify" the inserts.
// turn a DB request into a promise
function request(conn) {
return new Promise((resolve, reject) => {
const request = new Request(...);
request.once('error', reject);
request.once('requestCompleted', resolve);
conn.execSql(request);
});
}
This can then be combined to perform a loop where it's executed one at a time:
function doInserts() {
    return connect().then((conn) => {
        // create a "chain" of promises that execute one after the other
        let inserts = Promise.resolve();
        for (let i = 0; i < limit; i++) {
            inserts = inserts.then(() => request(conn));
        }
        return inserts;
    }).then(() => connection.close());
}
or in parallel:
function doInserts() {
    return connect().then((conn) => {
        // create an array of promises that all execute independently
        // NB - this probably won't work currently because it would need
        // multiple connections to work (rather than one)
        let inserts = [];
        for (let i = 0; i < limit; i++) {
            inserts.push(request(conn));
        }
        return Promise.all(inserts);
    }).then(() => connection.close());
}
Finally, I was able to fix it. I'm sharing my code so everyone can use it to do multiple inserts. Thanks to Dan Hensby (I didn't do it his way, but I used part of what he said), and thanks to RbarryYoung and MichaelSun90, who told me how. All I did was change my
var connection = new Connection(config);
to run inside my
function insertOffice(index) { ... }
Looking like this:
function insertOffice(index) {
    var connection = new Connection(config);
    ....
}
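Putting that fix together with the promisified pattern from the earlier answer, a minimal untested sketch might look like this (insertRow is a hypothetical helper wrapping the Request setup shown above):

function insertOffice(index) {
    // a fresh connection per insert avoids calling .connect() on a
    // connection that is still in the "Connecting" state
    var connection = new Connection(config);
    return new Promise((resolve, reject) => {
        connection.once('connect', (err) => {
            if (err) return reject(err);
            // insertRow(connection, index) would build and execSql the Request,
            // resolving on requestCompleted as in the promisified example above
            insertRow(connection, index).then(resolve, reject);
        });
        connection.connect();
    }).finally(() => connection.close()); // close whether the insert succeeded or failed
}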

Wait all promises in a map function

I want to wait for all my pictures to be read inside a map function. I tried this:
let buffer = [];
// Folder of the dataset.
const rootFolder = './dataset';
console.log("Entering in folder dataset");
fs.readdirSync(rootFolder);
// For each folder
const files = fs.readdirSync(rootFolder).map(dirName => {
    if (fs.lstatSync(path.join(rootFolder, dirName)).isDirectory()) {
        console.log(`Entering in folder ${path.join(rootFolder, dirName)}`);
        // For each file
        fs.readdirSync(path.join(rootFolder, dirName)).map(picture => {
            if (fs.lstatSync(path.join(rootFolder, dirName, picture)).isFile()) {
                if (picture.startsWith("norm")) {
                    return fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
                        buffer.push(img);
                    }).catch((error) => { console.log(error); });
                }
            }
        });
    }
});
Promise.all(files);
console.log(buffer);

async function fileToTensor(path) {
    return await sharp(path)
        .removeAlpha()
        .raw()
        .toBuffer({ resolveWithObject: true });
}
But my buffer is still empty...
I know promises exist, but I don't know how I can include them in nested map() calls.
Thank you :)
I would refactor the above code to this:
let files = [];
// loop over each dir.
fs.readdirSync(rootFolder).forEach(dirName => {
    // if it's a directory, proceed.
    if (fs.lstatSync(path.join(rootFolder, dirName)).isDirectory()) {
        console.log(`Entering in folder ${path.join(rootFolder, dirName)}`);
        fs.readdirSync(path.join(rootFolder, dirName)).forEach(picture => {
            if (fs.lstatSync(path.join(rootFolder, dirName, picture)).isFile()) {
                // If lstatSync says it's a file and it starts with "norm"
                if (picture.startsWith("norm")) {
                    // push a new promise to the array.
                    files.push(new Promise((resolve, reject) => {
                        fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
                            buffer.push(img);
                            resolve();
                        }).catch((error) => { console.log(error); reject(error); });
                    }));
                }
            }
        });
    }
});
// Resolve all promises.
Promise.all(files).then(() => {
    // Then do whatever you need to do.
    console.log(buffer);
}).catch((errors) => {
    console.log('one or more errors occurred', errors);
});
Basically, here is what I did:
Removed .map, since it's not necessary in this context; in your case, not all code paths returned a result, hence not every callback returned a promise.
Pushed each needed item to the files array, which is a Promise[].
Called Promise.all on the files array. Each resolved promise pushes its result to the buffer array. I would've handled it in a different way, but still, this is the fastest I could think of.
Registered a callback on Promise.all, so that buffer will be populated by the time it runs.
As a side note, there are a lot of third-party libraries that help you avoid nested loops and promises while walking the file system. I've just posted this to give something that could actually work from the existing code, although an entire refactor would be wise here, and a preliminary analysis of available node libraries would also help make the code easier to read and maintain.
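For what it's worth, that "different way" could be a minimal sketch like the following, where each promise resolves with its image and Promise.all collects the results, so no shared buffer array is needed (pictures here stands for the flat list of matching file paths gathered by the loops above):

// each promise resolves with its img; Promise.all gathers them in order
const files = pictures.map((picturePath) => fileToTensor(picturePath));
Promise.all(files).then((buffer) => {
    console.log(buffer);
});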
First of all, a few pieces of advice:
Don't use arrow functions for anything you cannot put on a single line (they aren't intended for that, and it wrecks readability).
Check that each callback you pass to .map() actually returns something (the first one doesn't; it seems you missed a return before the inner fs.readdirSync(...)).
Try to name all functions (except arrow functions in the cases where they're a good choice). That way, not only could I name them to better identify them in the previous point, but stack traces would also be much more readable and useful (traceable).
That being said, you are reading directories (and subdirectories) synchronously only to finally return promises (I understand fileToTensor() is expected to return a promise). It may not have a major impact on the overall execution time, because the actual file processing is presumably much more expensive, BUT this is a bad pattern, since you are blocking the event loop during the tree scan (so if your code is for a server that needs to serve other requests, you are dragging performance down a bit...).
Finally, as others have already said, there are libraries, such as glob, that ease this task.
On the other hand, if you want to do it yourself (as a learning exercise), I implemented my own library for the same task before knowing about glob, which could serve as a simpler example.
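For illustration, a minimal untested sketch using the classic glob package (the pattern assumes the norm* files sit one directory level below ./dataset, as in the question):

const glob = require('glob');

// match files starting with "norm" in every immediate subfolder of ./dataset
const paths = glob.sync('./dataset/*/norm*');
Promise.all(paths.map(fileToTensor)).then((buffer) => {
    console.log(`loaded ${buffer.length} images`);
});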
Hi, I've updated your code a bit; please go through it once. It might be helpful :)
// assumes this snippet runs inside an async function, since it uses await
let fsReadDir = Util.promisify(Fs.readdir);
let fsStat = Util.promisify(Fs.stat);
let picturePromises = [];
let directories = await fsReadDir(rootFolder);
for (let dirIndex = 0; dirIndex < directories.length; dirIndex++) {
    let dirName = directories[dirIndex];
    let stat = await fsStat(path.join(rootFolder, dirName));
    if (stat.isDirectory()) {
        let pictures = await fsReadDir(path.join(rootFolder, dirName));
        for (let picIndex = 0; picIndex < pictures.length; picIndex++) {
            let picture = pictures[picIndex];
            let stat = await fsStat(path.join(rootFolder, dirName, picture));
            if (stat.isFile()) {
                if (picture.startsWith("norm")) {
                    let pTensor = fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
                        buffer.push(img);
                    }).catch((error) => { console.log(error); });
                    picturePromises.push(pTensor);
                }
            }
        }
    }
}
// wait for every picture promise before reading the buffer
await Promise.all(picturePromises);
console.log(buffer);
async function fileToTensor(path) {
    return await sharp(path)
        .removeAlpha()
        .raw()
        .toBuffer({ resolveWithObject: true });
}

How to use Promise.all in react js ES6

What I want to do is upload files to a server, then get the URL of each uploaded file and preview it. There can be more than one file. For that purpose I have written the following code:
let filesURL = [];
let promises = [];
if (this.state.files_to_upload.length > 0) {
    for (let i = 0; i < this.state.files_to_upload.length; i++) {
        promises.push(this.uploadFilesOnServer(this.state.files_to_upload[i]));
    }
    Promise.all(promises).then(function (result) {
        console.log(result);
        result.map((file) => {
            filesURL.push(file);
        });
    });
    console.log(filesURL);
}
const uploadedFilesURL = filesURL;
console.log(uploadedFilesURL);
console.log(filesURL); gives me the values returned by Promise.all.
I want to use these values only after Promise.all completes properly. But I am facing the problem that the line console.log(uploadedFilesURL); executes first, irrespective of Promise.all, and gives me undefined values. I think I am not using promises correctly; can anyone please help me?
The uploadFilesOnServer code is:
uploadFilesOnServer(file) {
    let files = [];
    let file_id = '';
    const image = file;
    getImageUrl().then((response) => {
        const data = new FormData();
        data.append('file-0', image);
        const { upload_url } = JSON.parse(response);
        console.log(upload_url);
        updateProfileImage(upload_url, data).then((response2) => {
            const data2 = JSON.parse(response2);
            file_id = data2;
            console.log(file_id);
            files.push(file_id);
            console.log(files);
        });
    });
    return files;
}
No, promises are asynchronous and, as such, don't work the way you think. If you want to execute something after a promise has completed, you must put it in the promise's then callback. Here is an example based on your code:
uploadFilesOnServer(file) {
    let files = [];
    let file_id = '';
    const promise = getImageUrl()
        .then((imageUrlResponse) => {
            const data = new FormData();
            data.append('file-0', file);
            const { upload_url } = JSON.parse(imageUrlResponse);
            console.log(upload_url);
            return updateProfileImage(upload_url, data);
        })
        .then((updateImageResponse) => {
            file_id = JSON.parse(updateImageResponse);
            console.log(file_id);
            files.push(file_id);
            console.log(files);
            return files;
        });
    return promise;
}

let filesPromise = Promise.resolve([]);
if (this.state.files_to_upload.length > 0) {
    const promises = this.state.files_to_upload.map((file) => {
        return this.uploadFilesOnServer(file);
    });
    filesPromise = Promise.all(promises).then((results) => {
        console.log(results);
        return [].concat(...results);
    });
}
// This is your final console.log (console.log(uploadedFilesURL);)
filesPromise.then((filesUrl) => console.log(filesUrl));
A good book to read about ES6 in general, and Promises in particular, is Understanding ECMAScript 6 by Nicholas C. Zakas.
Edit:
Here is a simple explanation of the example code:
uploadFilesOnServer is a function that takes a file, uploads it, and returns the file URL when the upload completes in the future, in the form of a promise. The promise calls its then callback when it gets the URL.
Using the map function, we create a list of URL promises: the results of executing uploadFilesOnServer on each file in the list.
The Promise.all method waits for all the promises in the list to complete, joins the list of URL results, and creates a promise that resolves with that list of URLs. We need this because there is no guarantee that all the promises complete at the same time, and we want to gather all the results in one callback for convenience.
We get the URLs from the then callback.
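As a quick illustration of that last point, a minimal sketch showing that Promise.all preserves the input order in its results:

// the results array matches the order of the input promises,
// regardless of which promise settles first
Promise.all([Promise.resolve('a'), Promise.resolve('b')])
    .then((results) => console.log(results)); // ['a', 'b']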
You have to do this in the .then part of your Promise.all():
Promise.all(promises)
    .then(function (result) {
        console.log(result);
        result.map((file) => {
            filesURL.push(file);
        });
        return true; // return from here to go to the next promise down
    })
    .then(() => {
        console.log(filesURL);
        const uploadedFilesURL = filesURL;
        console.log(uploadedFilesURL);
    });
This is the way async code works. You cannot expect your console.log(filesURL); to work correctly when it is called synchronously after the async call that fetches files from the server.
Regarding your code, there are several problems:
1. uploadFilesOnServer must return a Promise, as it is async. Therefore:
uploadFilesOnServer(file) {
    let files = [];
    let file_id = '';
    const image = file;
    return getImageUrl().then((response) => {
        const data = new FormData();
        data.append('file-0', image);
        const { upload_url } = JSON.parse(response);
        console.log(upload_url);
        return updateProfileImage(upload_url, data).then((response2) => {
            const data2 = JSON.parse(response2);
            file_id = data2;
            console.log(file_id);
            files.push(file_id);
            console.log(files);
            return files;
        });
    });
}
2. Inside your main function body you can access the results of the Promise.all execution only in its respective then handler.
As a side note, I would recommend using ES2017 async/await with a transpiler such as Babel or TypeScript. This greatly reduces the nesting and complications of writing such async code.
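For example, a minimal untested async/await sketch of the same upload flow, assuming the getImageUrl and updateProfileImage helpers from the question:

async uploadFilesOnServer(file) {
    const data = new FormData();
    data.append('file-0', file);
    // each step simply awaits the previous one; no nested then callbacks
    const { upload_url } = JSON.parse(await getImageUrl());
    const file_id = JSON.parse(await updateProfileImage(upload_url, data));
    return file_id;
}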

How to determine that all the files have been read and resolve a promise

The following code is responsible for reading files. My requirement is to find out when all the files have been read, so that I can return or resolve a promise from the parent function (readmultifiles).
$.when(readmultifiles(files))
    .then(function () { /* all files uploaded */ });
The above code initiates the file read. What can be done so that a callback is made, or a return can happen, once all files have been read?
function readmultifiles(files) {
    // Read first file
    setup_reader(files, 0);
}

function setup_reader(files, i) {
    var file = files[i];
    var name = file.name;
    var reader = new FileReader();
    reader.onload = function (e) {
        readerLoaded(e, files, i, name);
    };
    reader.readAsBinaryString(file);
    // After reading, read the next file.
}

function readerLoaded(e, files, i, name) {
    // get file content
    var bin = e.target.result;
    // do sth with text
    // If there's a file left to load
    if (i < files.length - 1) {
        // Load the next file
        setup_reader(files, i + 1);
    }
}
There are several things to consider in a good design using promises that your implementation could learn from:
Create a promise (called "promisifying") from the lowest-level async operation you have. Then you can use promise features to control the logic flow and propagate errors, and your code will be consistently implemented with promises. In this case, it means you should promisify readFile(). It also makes readFile() more useful elsewhere in your project or in future projects.
Make sure you are always propagating errors properly. With async code that doesn't use promises, it can be hard to get errors back to the original caller properly, particularly if the async logic ends up complicated (with nested or sequenced operations).
Consider carefully whether your async operations must be sequenced or whether they can run in parallel. If one operation does not depend upon another, and you aren't likely to overload some service with multiple requests, then running things in parallel will often achieve a result quicker.
Return promises from async functions so callers can know when things are done and can access async results.
Don't create another promise around an existing promise unnecessarily (considered one of the promise anti-patterns).
If using jQuery promises, try to stick to jQuery features that are compatible with the promise standard, so you don't run into interoperability issues going forward or confuse future readers of your code, who are more likely to know how standard promises work.
Given all that, here are five ways to implement your code: using standard promises, using jQuery promises, with your operations sequenced or run in parallel, and using Bluebird promises. In all cases, you get an array of results, in order, at the end.
Promisify readFile() using standard promises
First, let's "promisify" your readFile operation so you can then use promise logic to control things.
function readFile(file) {
    return new Promise(function (resolve, reject) {
        var reader = new FileReader();
        reader.onload = function (e) {
            resolve(e.target.result);
        };
        reader.onerror = reader.onabort = reject;
        reader.readAsBinaryString(file);
    });
}
With standard promises, all operations in parallel
To run all your file operations in parallel, return all the results in order, and use standard promises, you can do this:
function readmultifiles(files) {
    return Promise.all(files.map(readFile));
}

// sample usage
readmultifiles(arrayOfFiles).then(function (results) {
    // all results in the results array here
});
With standard promises, all operations in sequence
To run all your file operations in sequence (which it does not look like you need to do here, because the operations are independent, even though your original code sequenced them), return all the results in order, and use standard promises, you can do this.
This somewhat standard design pattern for sequencing uses .reduce() to walk the array and chain all the operations together so they run one at a time down the chain:
function readmultifiles(files) {
    var results = [];
    return files.reduce(function (p, file) {
        return p.then(function () {
            return readFile(file).then(function (data) {
                // put this result into the results array
                results.push(data);
            });
        });
    }, Promise.resolve()).then(function () {
        // make the final resolved value be the results array
        return results;
    });
}

// sample usage
readmultifiles(arrayOfFiles).then(function (results) {
    // all results in the results array here
});
And, here's how it would look using jQuery promises
Promisify readFile() using jQuery promises:
function readFile(file) {
    return new $.Deferred(function (def) {
        var reader = new FileReader();
        reader.onload = function (e) {
            def.resolve(e.target.result);
        };
        reader.onerror = reader.onabort = def.reject;
        reader.readAsBinaryString(file);
    }).promise();
}
Run in parallel with jQuery:
function readmultifiles(files) {
    return $.when.apply($, files.map(readFile));
}

// sample usage
readmultifiles(arrayOfFiles).then(function () {
    var results = Array.prototype.slice.call(arguments);
    // all results in the results array here
});
And, to run in sequence with jQuery
function readmultifiles(files) {
    var results = [];
    return files.reduce(function (p, file) {
        return p.then(function () {
            return readFile(file).then(function (data) {
                // put this result into the results array
                results.push(data);
            });
        });
    }, $.Deferred().resolve()).then(function () {
        // make the final resolved value be the results array
        return results;
    });
}

// sample usage
readmultifiles(arrayOfFiles).then(function (results) {
    // all results in the results array here
});
Bluebird implementation
And, for completeness, I'll show you what it looks like using a somewhat more advanced promise library like Bluebird, which has additional capabilities that are useful here. The parallel code and the implementation of readFile() are the same as for standard promises, but the sequential implementation can take advantage of Bluebird's built-in helpers for sequencing async operations, and would consist of just:
function readmultifiles(files) {
    return Promise.mapSeries(files, readFile);
}

// sample usage
readmultifiles(arrayOfFiles).then(function (results) {
    // all results in the results array here
});
What if I change the code structure to this?
$.when(readmultifiles(files)).then(
    function (status) {
        alert(status + ", things are going well");
    },
    function (status) {
        alert(status + ", you fail this time");
    },
    function (status) {
        $("body").append(status);
    }
);

function readmultifiles(files) {
    var dfrd = $.Deferred();
    var namee = [];
    // Read first file
    setup_reader(files, 0);

    function setup_reader(files, i) {
        var file = files[i];
        var name = file.name;
        var reader = new FileReader();
        reader.onload = function (e) {
            readerLoaded(e, files, i, name);
        };
        reader.readAsBinaryString(file);
        // After reading, read the next file.
    }

    function readerLoaded(e, files, i, name) {
        // get file content
        var bin = e.target.result;
        // do sth with text
        namee.push(name);
        // If there's a file left to load
        if (i < files.length - 1) {
            // Load the next file
            setup_reader(files, i + 1);
        } else {
            dfrd.resolve(namee.join(','));
        }
    }

    return dfrd.promise();
}
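One gap worth noting in this version: nothing rejects the deferred when a read fails, so the failure handler passed above would never fire. A minimal untested addition inside setup_reader would be:

// reject the deferred if any individual read errors out
reader.onerror = function () {
    dfrd.reject(reader.error);
};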

Can multiple fs.write calls that append to the same file guarantee the order of execution?

Assume we have a program like this:
// imagine string1 to string1000 are very long strings, which will take a while to be written to the file system
var arr = ["string1", ..., "string1000"];
for (let i = 0; i < arr.length; i++) {
    fs.write("./same/path/file.txt", arr[i], { flag: "a" });
}
My question is: will string1 to string1000 be guaranteed to be appended to the same file in order?
Since fs.write is an async function, I am not sure how each call to fs.write() is really executed. I assume the call for each string is put somewhere (like a call stack?) and once the previous call is done, the next call can be executed.
I'm not really sure if my understanding is accurate.
Edit 1
As pointed out in the comments and answers, fs.write is not safe for multiple writes to the same file without waiting for the callback. But what about a write stream?
If I use the following code, would it guarantee the order of writing?
// imagine string1 to string1000 are very long strings, which will take a while to be written to the file system
var arr = ["string1", ..., "string1000"];
var fileStream = fs.createWriteStream("./same/path/file.txt", { "flags": "a+" });
for (let i = 0; i < arr.length; i++) {
    fileStream.write(arr[i]);
}
fileStream.on("error", () => { /* do something */ });
fileStream.on("finish", () => { /* do something */ });
fileStream.end();
Any comments or corrections will be helpful! Thanks!
The docs say that
Note that it is unsafe to use fs.write multiple times on the same file without waiting for the callback. For this scenario, fs.createWriteStream is strongly recommended.
Using a stream works because streams inherently guarantee that the order of strings being written to them is the same order that is read out of them.
var stream = fs.createWriteStream("./same/path/file.txt");
stream.on('error', console.error);
arr.forEach((str) => {
    stream.write(str + '\n');
});
stream.end();
Another way to keep individual write calls but still make sure things happen in order is to use promises to maintain the sequential logic. (Since fs.write itself takes a file descriptor rather than a path, fs.writeFile with the append flag is used below.)
function writeToFilePromise(str) {
    return new Promise((resolve, reject) => {
        fs.writeFile("./same/path/file.txt", str, { flag: "a" }, (err) => {
            if (err) return reject(err);
            resolve();
        });
    });
}

// for every string,
// write it to the file,
// then write the next one once that one is finished, and so on
arr.reduce((chain, str) => {
    return chain
        .then(() => writeToFilePromise(str));
}, Promise.resolve());
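For comparison, here is a minimal sketch of the same sequential logic using the fs.promises API (available since Node 10), which tends to read more clearly:

const fsp = require('fs').promises;

// appends the strings strictly one after another; each await completes
// before the next write begins
async function appendAll(arr) {
    for (const str of arr) {
        await fsp.appendFile("./same/path/file.txt", str);
    }
}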
You can synchronize access to the file using read/write locking for node; see the following example, and you can read the documentation:
var ReadWriteLock = require('rwlock');
var lock = new ReadWriteLock();

lock.writeLock(function (release) {
    fs.appendFile(fileName, addToFile, function (err, data) {
        if (err)
            console.log("write error"); // logging error message
        else
            console.log("write ok");
        release(); // unlock
    });
});
I had the same problem and wrote an NPM package to solve it for my project. It works by buffering the data in an array, and waiting until the event loop turns over, to concatenate and write the data in a single call to fs.appendFile:
const SeqAppend = require('seqappend');
const writeLog = SeqAppend('log1.txt');
writeLog('Several...');
writeLog('...logged...');
writeLog('.......events');
