Node- Append file in Async function - javascript

I'm using mongoose to query a database for objects and wish to write each object to a file. The console.log shows me that the data I want is being returned from the query, but the file created by fs.appendFile (./returned.json) is always empty. Is it not possible to do this within an async function?
async function findReturned(){
  try {
    const returned = await Data.find({});
    returned.forEach(function(file) {
      returnedfiles = file.BACSDocument;
      console.log(returnedfiles);
      fs.appendFile('./returned.json', returnedfiles, 'utf-8', (err) => {
        if (err) throw err;
      });
    });
    process.exit();
  } catch(e) {
    console.log('Oops...😯😯😯😯😯😯😯');
    console.error(e);
    process.exit();
  }
};

You should probably use mz/fs for async functions:
const fs = require('mz/fs');
...
returned.forEach(async (file) => {
  returnedfiles = file.BACSDocument;
  await fs.writeFile('./returned.json', returnedfiles, 'utf-8');
});
You don't need to throw the err, as it will hit the catch block.

You can't mix async functions with callback-style APIs.
Also, you can't call multiple async functions inside forEach; you have to create a promise chain that does what you need (see the reduce call in the example below).
Another way is to use .map, return an array of promises, and await them with
await Promise.all(arrayOfPromises)
but in that case the order of the new data in the file will not match the order of the initial array (a sketch of this variant appears after the code below).
In order to do that you need to promisify fs.appendFile and call the promisified version of the function:
// add new import
const {promisify} = require('util');
// create new function with Promise API on existing Callback API function
const appendFile = promisify(fs.appendFile);

async function findReturned(){
  try {
    const returned = await Data.find({});
    // Create a single promise chain that appends each record to the file, one by one, in the initial order.
    const promise = returned.reduce(function(acc, file){
      const returnedfiles = file.BACSDocument;
      console.log(returnedfiles);
      return acc.then(() => appendFile('./returned.json', returnedfiles, 'utf-8'));
    }, Promise.resolve());
    // Wait until the whole chain resolves.
    await promise;
    process.exit();
  } catch(e) {
    console.log('Oops...😯😯😯😯😯😯😯');
    console.error(e);
    process.exit();
  }
};
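For completeness, here is a minimal sketch of the .map / Promise.all variant mentioned above. It reuses the promisified appendFile and the Data model from the snippet above (the function name is just illustrative), and it does not preserve the original order of records in the file:
async function findReturnedUnordered() {
  const returned = await Data.find({});
  // Start all appends at once; completion order (and therefore file order) is not guaranteed.
  const arrayOfPromises = returned.map(file =>
    appendFile('./returned.json', file.BACSDocument, 'utf-8')
  );
  await Promise.all(arrayOfPromises);
}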
Instead of using the built-in fs module, you can use the fs-extra module, which already has promisified versions of the native methods and several extra methods.
What you can improve in your function: do not append to the file multiple times; collect all the data you need into a single variable and append it once.
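As a minimal sketch of that idea, assuming the returned array and the promisified appendFile from above (the newline separator is just illustrative):
// Build the whole payload in memory, then write it with a single call.
const payload = returned.map(file => file.BACSDocument).join('\n');
await appendFile('./returned.json', payload, 'utf-8');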
If for some reason you have to append multiple times, obtain a file descriptor using fs.open and pass it instead of the file name to fs.appendFile (or the promisified version). In that case, do not forget to close the file descriptor (fs.close).
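A sketch of that file-descriptor route, using the built-in fs.promises API (the appendAll name and the newline separator are just illustrative):
const fsp = require('fs').promises;

async function appendAll(records) {
  // Open once in append mode, reuse the handle for every write, then close it.
  const handle = await fsp.open('./returned.json', 'a');
  try {
    for (const record of records) {
      await handle.appendFile(record + '\n', 'utf-8');
    }
  } finally {
    await handle.close();
  }
}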

Related

How to retrieve object from JSON in nodejs?

Having this code:
const fs = require('fs')
const file = 'books.json';

class Book {
  constructor(code) {
    this._code = code;
  }
  get code() {
    return this._code;
  }
  set code(value) {
    this._code = value;
  }
}

async function writeBooks(){
  const data = JSON.stringify([new Book('c1'), new Book('c2')]);
  await fs.promises.writeFile(file, data, 'utf8');
}

async function getBook(code){
  try {
    const data = await fs.promises.readFile(file);
    const array = JSON.parse(data);
    return array.find(b => b.code === code);
  } catch (err){
    console.log(err)
  }
}
writeBooks();
getBook('c1').then(b => console.log(b));
I am getting undefined (instead of the expected book object).
1. How do I get the object (the above problem)?
2. If an async function always returns a promise, how can I return the object itself to the client, instead of the client having to call then() on getBook(code)?
3. Do I need to await fs.promises.writeFile(), as I do in writeBooks()? As far as I understand async/await, the value of an await expression is the data (or an error is thrown). But since writeFile() does not return anything (an error at most, as opposed to readFile()), why would I want to await for no data?
Actually the root of the problem is not async/await or promises. The problem is writing an array to the JSON file. If you write your JSON data as a key-value pair, like the snippet below, your problem is solved.
{"1": [new Book('c1').code, new Book('c2').code]} // as a key-value pair
const fs = require('fs')
const file = 'books.json';

class Book {
  constructor(code) {
    this._code = code;
  }
  get code() {
    return this._code;
  }
  set code(value) {
    this._code = value;
  }
}

async function writeBooks(){
  const data = JSON.stringify({"1": [new Book('c1').code, new Book('c2').code]});
  await fs.promises.writeFile(file, data, 'utf8');
}

async function getBook(code){
  try {
    const data = await fs.promises.readFile(file);
    const dataOnJsonFile = JSON.parse(data);
    return dataOnJsonFile["1"];
  } catch (err){
    console.log(err)
  }
}
writeBooks();
getBook('c1').then(b => console.log(b));
1. The above problem is that the Books returned from JSON.parse have only data, not methods, so I cannot get the code via get code(){}, only as a public property of class Book (book._code). That, however, breaks encapsulation (the convention is that _[property] is private and there should be appropriate getters/setters). So I made the properties public (and broke encapsulation), because I still don't know how to attach methods to an object created from JSON.
2. No, the result of an async function is always a Promise. You cannot unwrap it inside the async function; the client will always have to unwrap it. (So await fs.promises.writeFile() would unwrap it, but the value is immediately wrapped again before the async function returns.)
3. As explained above.
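On the point above about methods being lost after JSON.parse, one common workaround (just a sketch, not part of the original answer; it assumes the original array-based writeBooks, and the toBook helper is hypothetical) is to rebuild real Book instances from the parsed plain objects:
// Hypothetical helper: rebuild a Book instance, with its getters/setters,
// from a plain object produced by JSON.parse.
const toBook = (obj) => Object.assign(new Book(), obj);

async function getBookWithMethods(code) {
  const data = await fs.promises.readFile(file);
  const books = JSON.parse(data).map(toBook);
  return books.find(b => b.code === code); // the getter works again
}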

Cannot print object through res.json in express JS

I am trying to build an API through which I can get whois details as JSON output like below.
For this, I installed the whois package from npm (https://www.npmjs.com/package/whois). I tried to convert the string to an object and print it in JSON format, but I am not getting any output on the web, although in the console I can get the data easily. Can you please fix my error?
function whoisFunction() {
  var whois = require('whois')
  whois.lookup(url, async function(err, data) {
    try {
      a = await data.split('\n')
    }
    catch (e) {
      console.log(e)
      return e
    }
    c = []
    for (i = 0; i < a.length; i++) {
      c.push(a[i])
    }
    await console.log(typeof (c))
    console.log(c)
    return a
  })
}
// res.json({'Headers':+whoisFunction()})
res.status(200).json(whoisFunction())
There are async and awaits sprinkled seemingly randomly all over your function.
You should realize that the only thing that is asynchronous here is whois.lookup().
console.log is not asynchronous. String.prototype.split is not asynchronous. The callback (err, data) => {...} is not asynchronous.
If you want to use the callback pattern, then you need to use res.send() inside of the callback
(err, data) => {
  res.send(data)
}
But we got fed up with callback-patterns because of how messy it is to nest them. So instead we went over to using promises. If you have callbacks but want use promises, then you wrap the callback in a promise. You do it once, and you do it as tight to the offending callback as you can:
function promisifiedLookup(url){
  return new Promise( (resolve, reject) => {
    whois.lookup(url, function(err, data) {
      if(err) reject(err)
      else resolve(data)
    })
  })
}
So, to use async/await we need that:
- the calling function is declared async
- the called function returns a promise (or else there is nothing to wait for)
async function whoisFunction() {
  let data = await promisifiedLookup(url) // _NOW_ we can use await
  data = data.split('\n')
  // ...
  return data; // Because this function is declared async, it will automatically return a promise.
}
If your express handler is defined as async, then you can now use await here as well:
res.status(200).json(await whoisFunction())
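Putting it together, a minimal sketch of such an async Express handler (assuming whoisFunction is adjusted to take the url as a parameter, and that the domain arrives as a query parameter):
app.get('/whois', async (req, res) => {
  try {
    const url = req.query.url; // assumed source of the domain to look up
    const result = await whoisFunction(url);
    res.status(200).json(result);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});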

async/await is not working for mongo DB queries

Working case:
async/await works fine when we call an asynchronous function that returns a resolved promise.
Not working case:
async/await is not working for MongoDB queries.
I tried then() and async/await.
I have 2 JS files.
In one.js I am importing a function that is defined in functionone.js.
WORKING CASE:
When one.js looks like
var functiononestatus = transactions.functionone(req.session.email).then((came) => {
  console.log(came); // getting `need to be done at first` message
  console.log("exec next")
});
When functionone.js looks like
module.exports.functionone = functionone;
async function functionone(email) {
  return await new Promise((resolve, reject) => {
    resolve('need to be done at first')
  });
}
NOT WORKING CASE (when a MongoDB query needs to be executed):
When one.js looks like
var functiononestatus = transactions.functionone(req.session.email).then((came) => {
  console.log(came); // getting undefined
  console.log("exec next")
});
When functionone.js looks like
module.exports.functionone = functionone;
async function functionone(email) {
  // mongo starts
  var collection = await connection.get().collection('allinonestores');
  await collection.find({
    "email": email
  }).toArray(async function(err, wallcheck) {
    return await new Promise((resolve, reject) => {
      resolve(wallcheck[0])
    });
  });
}
Quick clarification:
.collection('name') returns a Collection instance, not a Promise, so there is no need to await it.
toArray() operates in two modes: with a callback when a function is provided, or returning a Promise when no callback function is provided.
You're essentially expecting a Promise result out of toArray() while supplying a callback function, resulting in undefined, because the callback takes priority and no promise is returned, due to the dual operation mode of toArray().
Also, toArray(callback) does not take an async function as its callback.
Here's how your code should look, for retrieving a collection:
const client = await MongoClient.connect('your mongodb url');
const db = client.db('your database name'); // No await here, because it returns a Db instance.
const collection = db.collection('allinonestores'); // No await here, because it returns a Collection instance.
and then, code for fetching results:
const db = <get db somehow>;

// You could even ditch the "async" keyword here,
// because you do not do/need any awaits inside the function.
// toArray() without a callback function argument already returns a promise.
async function functionOne(email) {
  // Returns a Collection instance, not a Promise, so no need for await.
  const collection = db.collection('allinonestore');
  // Without a callback, toArray() returns a Promise.
  // Because our functionOne is an "async" function, you do not need "await" for the return value.
  return collection.find({"email": email}).toArray();
}
and the code alternative, using a callback:
const db = <get db somehow>;

// You could even ditch the "async" keyword here,
// because you do not do/need any awaits inside the function.
// You're forced to return a new Promise, in order to wrap the callback
// handling inside it.
async function functionOne(email) {
  // Returns a Collection instance, not a Promise, so no need for await.
  const collection = db.collection('allinonestore');
  // We need to create the promise outside the callback here.
  return new Promise((resolve, reject) => {
    collection.find({"email": email}).toArray(function toArrayCallback(err, documents) {
      if (!err) {
        // No error occurred, so we can resolve the promise now.
        resolve(documents);
      } else {
        // Failed to execute the find query OR fetching the results as an array failed.
        // Reject the promise.
        reject(err);
      }
    });
  });
}
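Either variant is consumed the same way; a short usage sketch (the email address is only an example):
// Inside an async function or route handler:
const documents = await functionOne('user@example.com');
console.log(documents);

// Or with .then():
functionOne('user@example.com').then(documents => console.log(documents));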
Note: First of all, I really need to thank @Mihai Potra for the best answer.
Here we go.
Case 1:
If it is a function that needs to find documents and return them from MongoDB, as Mihai mentioned, the answer below is all you need:
const db = <get db somehow>;

async function functionOne(email) {
  const collection = db.collection('allinonestore');
  return collection.find({"email": email}).toArray();
}
Case 2:
If there are nested functions that need to return values every time, the answer below is the best to my knowledge: no need for async/await keywords on every function, and no need for then().
function one(<parameter>) {
  return new Promise(function(resolve, reject) {
    const collection = connection.get().collection('<collection_name>');
    collection.find({"Key": Value_fromFunction}).toArray(function (err, result) {
      resolve(result[0]);
    });
  });
}
That's it. Use the resolve callback whenever it is needed.
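A short usage sketch for the Case 2 helper above (someValue is a placeholder for whatever you pass in):
one(someValue).then(function(firstDoc) {
  // firstDoc is the result[0] resolved inside one()
  console.log(firstDoc);
});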

JavaScript async in a loop with callbacks

I have a tricky situation that needs to collect keys that belong to certain types (the types are given in an array), then filter the collected keys and pass them to a deletion function.
The collection process calls shell commands and processes the results in a callback within a loop. I need to wait until the whole loop of callbacks finishes, then pass the result to the deletion function.
I am using shelljs in the Node code; it basically looks like this:
var del_arr = [];
for (var i in types) {
  shell.exec(somecode with var [i], {
    silent: true
  },
  function(code, stdout, stderr) {
    somecode-processing/filtering stdout and pushes the results to del_arr;
  });
  // loop through array types[] and need to wait for all shell codes' callbacks to finish;
}
// then pass del_arr to deletion function
I wasn't able to build an async function in this format because of the shelljs callback. I also don't know how to use a promise in this situation.
Can you tell me how to achieve this non-blocking process?
Thanks
Turn shell.exec into a promise:
function execWrapper(command, options) {
  return new Promise((resolve, reject) => {
    shell.exec(command, options, (error, out, err) => {
      if (error) return reject(error);
      resolve({out: out, err: err});
    })
  })
}
Then you can iterate over types and map each one to a promise:
const promises = types.map(type => execWrapper(type, {silent: true}));
Now wait for each promise to resolve, or for one to reject:
Promise.all(promises).then((del_arr) => {
  // del_arr is now an array of objects with the stdout and stderr of each type.
})
A good implementation of this case:
async function myAsyncFunction() {
  const promises = types.map((type) => myAsyncRequest(type));
  let del_arr = await Promise.all(promises);
}
A good article that explains this:
https://medium.freecodecamp.org/avoiding-the-async-await-hell-c77a0fb71c4c
Try to convert shell.exec to a promise, like:
function shellPromise(command, option) {
  return new Promise((resolve, reject) => {
    shell.exec(command, option, (code, stdout, stderr) =>
      resolve({code: code, stdout: stdout, stderr: stderr})
    );
  });
}
Then, inside an async function, you can use something like:
for (var i in types) {
  var result = await shellPromise(somecode with var[i], {silent: true});
  // somecode-processing/filtering stdout and pushes the results to del_arr;
}
You can also use the async package from npm. It provides a function eachSeries that might come in handy in your situation, without using promises, dealing with callbacks only.
async.eachSeries(hugeArray, function iteratee(item, callback) {
  if (inCache(item)) {
    callback(null, cache[item]); // if many items are cached, you'll overflow
  } else {
    doSomeIO(item, callback);
  }
}, function done() {
  //...
});
For more details on how to use this function: https://caolan.github.io/async/
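Applied to the question's scenario, a rough sketch could look like the following (buildCommand and deleteKeys are hypothetical placeholders for the questioner's shell command and deletion function):
var del_arr = [];

async.eachSeries(types, function iteratee(type, callback) {
  shell.exec(buildCommand(type), { silent: true }, function(code, stdout, stderr) {
    // process/filter stdout and push the results to del_arr here
    del_arr.push(stdout.trim());
    callback(null); // tell eachSeries this item is done
  });
}, function done(err) {
  if (err) return console.error(err);
  deleteKeys(del_arr); // then pass del_arr to the deletion function
});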

Better way to deal with Promise flow when functions need to access outer scope variable

I'm creating a Node.js module to interact with my API, and I use the superagent module to do the requests. How it works:
module.exports = data => {
  return getUploadUrl()
    .then(uploadFiles)
    .then(initializeSwam)

  function getUploadUrl() {
    const request = superagent.get(....)
    return request
  }

  function uploadFiles(responseFromGetUploadUrl) {
    const request = superagent.post(responseFromGetUploadUrl.body.url)
    // attach files that are in data.files
    return request
  }

  function initializeSwam(responseFromUploadFilesRequest) {
    // Same thing here. I need access to data and responseFromUploadFilesRequest.body
  }
}
I feel like I'm doing something wrong here, but I can't think of a better way to achieve the same result.
Two simple ways:
write your function to take all parameters it needs
const doStuff = data =>
  getUploadUrl()
    .then(uploadFiles)
    .then(initializeSwam)
might become
const doStuff = data =>
  getUploadUrl()
    .then(parseResponseUrl) // (response) => response.body.url
    .then(url => uploadFiles(data, url))
    .then(parseResponseUrl) // same function as above
    .then(url => initializeSwam(data, url))
That should work just fine (or fine-ish, depending on what hand-waving you're doing in those functions).
partially apply your functions
const uploadFiles = (data) => (url) => {
  return doOtherStuff(url, data);
};
// same deal with the other

const doStuff = data =>
  getUploadUrl()
    .then(parseResponseUrl)
    .then(uploadFiles(data)) // returns (url) => { ... }
    .then(parseResponseUrl)
    .then(initializeSwam(data));
A mix of all of these techniques (when and where sensible) should be more than sufficient to solve a lot of your needs.
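For completeness, a minimal sketch of the parseResponseUrl helper assumed in the snippets above, and how the resulting doStuff might be consumed:
// Pull the upload URL out of a superagent response, as hinted in the comments above.
const parseResponseUrl = (response) => response.body.url;

doStuff(data)
  .then(result => console.log('done', result))
  .catch(err => console.error(err));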
The way you have your code structured in the above snippet results in the getUploadUrl(), uploadFiles(), and initializeSwam() functions not being declared until the final .then(initializeSwam) call is made. What you have in this final .then() block is three function declarations, which simply register the functions in the namespace in which they are declared. A declaration doesn't fire-off a function.
I believe what you want is something like:
async function getUploadUrl() { // <-- notice the flow control for Promises
  const request = await superagent.get(....);
  return request;
}

async function uploadFiles(responseFromGetUploadUrl) {
  const request = await superagent.post(responseFromGetUploadUrl.body.url)
  // attach files that are in data.files
  return request;
}

async function initializeSwam(responseFromUploadFilesRequest) {
  // Same thing here. I need access to data and responseFromUploadFilesRequest.body
  const request = await ...;
}

module.exports = data => {
  return getUploadUrl(data) // <-- I'm guessing you wanted to pass this here
    .then(uploadFiles)
    .then(initializeSwam);
}
This approach uses ES2017's async/await feature; you can alternatively achieve the same flow control using the bluebird Promise library's coroutines paired with generator functions.
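As a rough illustration of that alternative (a sketch assuming bluebird's Promise.coroutine and the same three helpers; not part of the original answer):
const Promise = require('bluebird');

// Generator-based equivalent of the async/await version above.
const run = Promise.coroutine(function* (data) {
  const urlResponse = yield getUploadUrl(data);
  const uploadResponse = yield uploadFiles(urlResponse);
  return yield initializeSwam(uploadResponse);
});

module.exports = data => run(data);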
