UPDATE
I have read over a dozen articles on this topic and not one of them addresses this fundamental question. I am going to start a resources section at the end of this post.
ORIGINAL POST
My understanding of an async function is it returns a promise.
MDN docs: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
Inside my program I could write something like:
function testPromise() {
return new Promise((resolve, reject) => {
// DO WORK
reject() // IF WORK FAILS
resolve() // IF WORK IS SUCCESSFUL
})
}
async function mainFunction() {
let variable
try {
variable = await testPromise()
} catch(e) {
throw e
}
return variable
}
I could also write testPromise as an async function and await that in the same context.
async function testAsyncFunction() {
//DO WORK AND THROW ERROR IF THEY OCCUR
}
async function mainFunction() {
let variable
try {
variable = await testAsyncFunction()
} catch(e) {
throw e
}
return variable
}
Which would be considered best practice? If I wish to create an asynchronous operation, should the function return a new Promise and be awaited in an async function, or is awaiting an async function from another async function effectively the same thing?
RESOURCES
JavaScript ES 2017: Learn Async/Await by Example
https://codeburst.io/javascript-es-2017-learn-async-await-by-example-48acc58bad65
Javascript — ES8 Introducing async/await Functions
https://medium.com/@reasoncode/javascript-es8-introducing-async-await-functions-7a471ec7de8a
6 Reasons Why JavaScript’s Async/Await Blows Promises Away (Tutorial)
https://hackernoon.com/6-reasons-why-javascripts-async-await-blows-promises-away-tutorial-c7ec10518dd9
----------------------CURRENT----------------------
export default function time_neo4jUpdate({ store, action, change, args, }) {
return new Promise(async (resolve, reject) => {
try {
const {
thing: { type },
nonValidatedArgs: { leapYear, root, start, end },
_neo4j,
_cypherReducers,
_neo4jCreateReduce,
_timetreeSteps: { update }
} = store.getState()
let results = []
for (let i = 0; i < _neo4jCreateReduce.length; i++) {
const result = await _neo4j.session(
_neo4jCreateReduce[i],
_cypherReducers.runQuery(update, i, root, start, end))
results = [...results, result]
}
resolve({
store,
action: 'NEO4J_UPDATE',
change: results,
args
})
} catch (e) {
const m = `NEO4J TIMETREE UPDATE: Unable to complete the UPDATE step for the timetree. WHAT: ${e}`
reject(m)
}
})
}
----------------------AS ASYNC FUNCTION----------------------
export default async function time_neo4jUpdate({ store, action, change, args, }) {
try {
const {
thing: { type },
nonValidatedArgs: { leapYear, root, start, end },
_neo4j,
_cypherReducers,
_neo4jCreateReduce,
_timetreeSteps: { update }
} = store.getState()
let results = []
for (let i = 0; i < _neo4jCreateReduce.length; i++) {
const result = await _neo4j.session(
_neo4jCreateReduce[i],
_cypherReducers.runQuery(update, i, root, start, end))
results = [...results, result]
}
return {
store,
action: 'NEO4J_UPDATE',
change: results,
args
}
} catch (e) {
const m = `NEO4J TIMETREE UPDATE: Unable to complete the UPDATE step for the timetree. WHAT: ${e}`
throw m
}
}
Even without the availability of async/await, you should very rarely need to use new Promise(). If you're using it a lot, it's typically a code smell.
The whole point of async/await is that it allows you to avoid a lot of the situations where you would otherwise need to work with promises explicitly.
So if it's supported in the environment you're targeting (Internet Explorer does not support async/await) or you're using a transpiler, go ahead and use it anywhere you can. That's what it's for.
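The one place new Promise() still earns its keep is at the boundary with a callback-style API: wrap it once, then consume it with plain async/await everywhere above. A minimal sketch, using Node's callback-based fs.readFile purely as an illustration:
const fs = require('fs')

// Wrap the callback API exactly once, at the boundary.
function readFilePromise(path) {
  return new Promise((resolve, reject) => {
    fs.readFile(path, 'utf8', (err, data) => err ? reject(err) : resolve(data))
  })
}

// Everything above the boundary is plain async/await; no new Promise() needed.
async function loadConfig() {
  const text = await readFilePromise('./config.json')
  return JSON.parse(text)
}
(In practice util.promisify or fs.promises already does this wrapping for you; the point is only that new Promise() belongs at that layer and nowhere else.)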
Bear in mind that this is pointless:
catch(e) {
throw e;
}
There's no point in catching an error just to rethrow it. So if you're not actually doing anything with the caught error, don't catch it:
async function testAsyncFunction() {
//DO WORK AND THROW ERROR IF THEY OCCUR
return value
}
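Applied to the question's mainFunction, dropping the catch-and-rethrow leaves a sketch as small as this:
async function mainFunction() {
  // No try/catch needed: if testAsyncFunction() rejects, the rejection
  // simply propagates to whatever awaits mainFunction().
  return testAsyncFunction()
}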
Edit: Now that you've provided an example of your code, I can answer with more certainty: If your function is based upon existing promises, then by all means, it is good to use async/await and you usually should not use new Promise():
export default async function time_neo4jUpdate({
store,
action,
change,
args,
}) {
try {
const {
thing: {
type
},
nonValidatedArgs: {
leapYear,
root,
start,
end
},
_neo4j,
_cypherReducers,
_neo4jCreateReduce,
_timetreeSteps: {
update
}
} = store.getState()
const results = await _neo4jCreateReduce.reduce(async function (acc, el, i) {
    // Wait for the rows accumulated so far before starting the next
    // session call, so the queries still run one at a time.
    const soFar = await acc
    const result = await _neo4j.session(
        el,
        _cypherReducers.runQuery(
            update,
            i,
            root,
            start,
            end
        )
    )
    return [...soFar, result]
}, []);
return {
store,
action: 'NEO4J_UPDATE',
change: results,
args
};
} catch (e) {
const m = `NEO4J TIMETREE UPDATE: Unable to complete the UPDATE step for the timetree. WHAT: ${e}`
throw m;
}
}
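Note that because the reducer callback is async, the accumulator each iteration receives is itself a promise, which is why it has to be awaited before spreading; awaiting it before the next session call also keeps the queries sequential, like the original for loop.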
Related
I am trying to understand how promises work in JS by playing with swapi.dev. I would like to create a dynamic chain of promises (not using async/await), but it does not produce any result. The idea is to get the names of all vehicles of a given person (for instance Luke Skywalker) and dump them to the console.
Could anyone help me? What am I missing?
Thanks in advance.
"use strict";
const request = require("request-promise");
const BASE_URL = "http://swapi.dev/api";
var currentPromise = Promise.resolve();
callApiPromise(`${BASE_URL}/people/1`).then((data) => {
console.log("Getting vehicles' URLs");
const vehicles_URL = data["vehicles"];
console.log("Starting looping through URLs");
for (let i = 0; i < vehicles_URL.length; i++) {
console.log(`i=${i}, vehicle_URL=${vehicles_URL[i]}`);
currentPromise = currentPromise.then(function () {
console.log(".. getting vehicle name");
return getVehicleName[vehicles_URL[i]];
});
}
});
function getVehicleName(url) {
callApiPromise(url).then((vehicle_data) => {
var arrVehicleData = new Array();
arrVehicleData.push(vehicle_data);
console.log(arrVehicleData.map((vehicle) => vehicle.name));
});
}
function callApiPromise(url) {
return new Promise((resolve, reject) => {
callApi(url, (err, data) => {
if (err) {
reject(err);
return;
}
resolve(data);
});
});
}
function callApi(url, callback) {
request
.get(url)
.then((response) => {
const json = JSON.parse(response);
callback(null, json);
})
.catch((err) => {
callback(err, null);
});
}
Some issues:
A missing return statement in getVehicleName
A syntax issue in getVehicleName[vehicles_URL[i]] (should be parentheses)
As the promises for getting the vehicle names are independent, you would not chain them, but use Promise.all
arrVehicleData will always only have one element. There is no reason for an array there where it is used.
You are also taking the wrong approach in using request.get. The bottom function turns that API from a Promise-API to a callback API, only to do the reverse (from callback to promise) in the function just above it. You should just skip the callback layer and stick to promises:
"use strict";
const request = require("request-promise");
const BASE_URL = "http://swapi.dev/api";
getJson(`${BASE_URL}/people/1`).then(data => {
return Promise.all(data.vehicles.map(getVehicleName));
}).then(vehicleNames => {
console.log(vehicleNames);
// Continue here...
});
function getVehicleName(url) {
return getJson(url).then(vehicle => vehicle.name);
}
function getJson(url) {
return request.get(url).then(JSON.parse);
}
Finally, you should not use request-promise anymore, since the request module, on which request-promise depends, has been deprecated.
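For example, with a maintained HTTP client such as node-fetch (or the global fetch built into newer Node.js versions), the getJson helper above reduces to a sketch like this:
// Assumes node-fetch v2 is installed, or that a global fetch is available (Node 18+).
const fetch = require('node-fetch')

function getJson(url) {
  return fetch(url).then(response => {
    if (!response.ok) {
      throw new Error(`Request failed with status ${response.status}`)
    }
    return response.json()
  })
}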
getVehicleName doesn't return a promise. Instead, it starts a promise that resolves only after the for loop that invoked it has already left the call stack.
This is a sample of promise chaining:
const promise = new Promise(resolve => resolve(1))
const promise1 = Promise.resolve(2)
const methodReturnPromise = () => new Promise(resolve => resolve(3))
promise.then(firstPromiseData => {
// do something with firstPromiseData
console.log(firstPromiseData)
return promise1
}).then(secondPromiseData => {
// do something with secondPromiseData
console.log(secondPromiseData)
return methodReturnPromise()
}).then(thirdPromiseData => {
// do something with thirdPromiseData
console.log(thirdPromiseData)
})
So I have these user service functions in service.ts, which handle the database work.
export const service = {
async getAll(): Promise<User[]> {
try {
const result = await query
return result
} catch (e) {
report.error(e)
throw new Error(e)
}
},
...
}
And a query.ts file, for reasons such as caching, business logic, etc.
export const query = {
async index(): Promise<User[]> {
try {
const result = await service.getAll()
return result
} catch (e) {
report.error(e)
throw new Error(e)
}
},
...
}
And another upper layer for routers and resolvers, because I want to see all routes in one file.
export const resolver = {
Query: {
users: (): Promise<User[]> => query.index(),
},
...
}
Do I need to wrap try...catch in all functions? Or can't I just add .catch at the very top layer like this:
export const resolver = {
Query: {
users: (): Promise<User[]> => query.index().catch(e => e),
},
...
}
Do I need to wrap try...catch in all functions?
No, you don't, not unless you want to log it at every level for some reason. Just handle it at the top level.
In an async function, promise rejections are exceptions (as you know, since you're using try/catch with them), and exceptions propagate through the async call tree until/unless they're caught. Under the covers, async functions return promises and reject those promises when a synchronous exception occurs or when a promise the async function is awaiting rejects.
Here's a simple example:
function delay(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
async function outer() {
// ...
await delay(10);
console.log("before calling inner");
await inner();
console.log("after calling inner (we never get here)");
}
async function inner() {
// ...
await delay(10);
console.log("inside inner");
// Something goes wrong
null.foo();
}
outer()
.catch(e => {
console.log("Caught error: " + e.message, e.stack);
});
Just as a side note: If you do catch an error because you want to do X before the error propagates and you're going to re-throw the error after you do X, best practice is to re-throw the error you caught, rather than creating a new one. So:
} catch (e) {
// ...do X...
throw e; // <== Not `throw new Error(e);`
}
But only do that if you really need to do X when an error occurs. Most of the time, just leave the try/catch off entirely.
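Applied to the layers in the question, that might look like the following sketch (TypeScript annotations dropped for brevity; report.error and the resolver shape are taken from the question):
// query.ts: no try/catch, a rejection from the database layer just bubbles up.
export const query = {
  index() {
    return service.getAll()
  },
}

// resolvers: the one place errors are logged and handled.
export const resolver = {
  Query: {
    users: () =>
      query.index().catch(e => {
        report.error(e)
        throw e // re-throw so the caller still sees a rejected promise
      }),
  },
}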
I'm working on converting a legacy callback-based API into an async library. But I just can't wrap my head around getting a "resultset" to work as a generator (Node 10.x).
The original API works like this:
api.prepare((err, rs) => {
rs.fetchRows(
(err, row) => {
// this callback is called as many times as rows exist
console.log("here's a row:", row);
},
() => {
console.log("we're done, data exausted");
}
);
});
But here is how I want to use it:
const wrapped = new ApiWrapper(api);
const rs = await wrapped.prepare({});
for (let row of rs.rows()) {
console.log("here's a row:", row);
}
let row;
while(row = await rs.next()) {
console.log("here's a row:", row);
}
I thought I had it under control with generators, but it looks like you cannot use yield inside a callback. It actually seems logical if you think about it.
class ApiWrapper {
constructor(api) {
this.api = api;
}
prepare() {
return new Promise((resolve, reject) => {
this.api.prepare((err, rs) => {
if (err) {
reject(err);
} else {
resolve(rs);
}
});
});
}
*rows() {
this.api.fetchRows((err, row) => {
if (err) {
throw err;
} else {
yield row; // nope, not allowed here
}
});
}
next() { ... }
}
So what alternatives do I have?
Important: I don't want to store anything in an array then iterate that, we're talking giga-loads of row data here.
Edit
I'm able to simulate the behavior I want using stream.Readable but it warns me that it's an experimental feature. Here's a simplified array-based version of the issue I'm trying to solve using stream:
const stream = require('stream');
function gen(){
const s = new stream.Readable({
objectMode: true,
read(){
[11, 22, 33].forEach(row => {
this.push({ value: row });
});
this.push(null)
}
});
return s;
}
for await (let row of gen()) {
console.log(row);
}
// { value: 11 }
// { value: 22 }
// { value: 33 }
(node:97157) ExperimentalWarning: Readable[Symbol.asyncIterator] is an experimental feature. This feature could change at any time
I finally realized I needed something similar to Go's channels that were async/await compatible. Basically the answer is to synchronize an async iterator and a callback, making them wait for each other as next() iterations are consumed.
The best (Node) native solution I found was to use a stream as an iterator, which is supported in Node 10.x but tagged experimental. I also tried to implement it with the p-defer NPM module, but that turned out to be more involved than I expected. Finally I ran across the https://www.npmjs.com/package/@nodeguy/channel module, which was exactly what I needed:
const Channel = require('@nodeguy/channel');
class ApiWrapper {
// ...
rows() {
const channel = new Channel();
const iter = {
[Symbol.asyncIterator]() {
return this;
},
async next() {
const val = await channel.shift();
if (val === undefined) {
return { done: true };
} else {
return { done: false, value: val };
}
}
};
this.api.fetchRows(async (err, row) => {
await channel.push(row);
}).then(() => channel.close());
return iter;
}
}
// then later
for await (let row of rs.rows()) {
console.log(row)
}
Note how the core of each iterating function, next() and the callback in rows(), has an await that throttles how much data can be pushed across the channel; otherwise the producing callback could end up pushing data into the channel queue uncontrollably. The idea is that the callback should wait for data to be consumed by the iterator's next() before pushing more.
Here's a more self-contained example:
const Channel = require('@nodeguy/channel');
function iterating() {
const channel = Channel();
const iter = {
[Symbol.asyncIterator]() {
return this;
},
async next() {
console.log('next');
const val = await channel.shift();
if (val === undefined) {
return { done: true };
} else {
return { done: false, value: val };
}
}
};
[11, 22, 33].forEach(async it => {
await channel.push(it);
console.log('pushed', it);
});
console.log('returned');
return iter;
}
(async function main() {
for await (let it of iterating()) {
console.log('got', it);
}
})();
/*
returned
next
pushed 11
got 11
next
pushed 22
got 22
next
pushed 33
got 33
next
*/
Like I said, Streams and/or Promises can be used to implement this, but the Channel module hides some of the complexity, which makes it more intuitive.
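For the curious, here is roughly what such a channel abstracts: a minimal, illustrative push/pull queue. Unlike @nodeguy/channel, push() here is not awaited, so there is no back-pressure; a fast producer would simply buffer rows in memory:
class MiniChannel {
  constructor() {
    this.buffer = []   // values pushed but not yet consumed
    this.waiting = []  // pending next() resolvers waiting for a value
    this.closed = false
  }

  push(value) {
    const resolve = this.waiting.shift()
    if (resolve) {
      resolve({ value, done: false }) // a consumer is already waiting
    } else {
      this.buffer.push(value)         // otherwise buffer the value
    }
  }

  close() {
    this.closed = true
    // Tell any waiting consumers that no more values will arrive.
    while (this.waiting.length) {
      this.waiting.shift()({ value: undefined, done: true })
    }
  }

  [Symbol.asyncIterator]() {
    return {
      next: () => {
        if (this.buffer.length) {
          return Promise.resolve({ value: this.buffer.shift(), done: false })
        }
        if (this.closed) {
          return Promise.resolve({ value: undefined, done: true })
        }
        return new Promise(resolve => this.waiting.push(resolve))
      }
    }
  }
}
A producer callback would call channel.push(row) for each row and channel.close() when the API signals it is done, and the consumer would use for await (const row of channel).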
The original question has two nested callback-taking async functions:
api.prepare((err,res) => ...)
rs.fetchRows((err,res) => ...)
The first one runs its callback only once, so just promisifying it as follows is sufficient.
function promisified(f){
  // f is expected to call its node-style callback as (err, result)
  return new Promise((resolve, reject) =>
    f((err, result) => err ? reject(err) : resolve(result)));
}
However, the second function will invoke its callback multiple times, and we wish to generate an async iterable from it so that we can consume it in a for await...of loop.
This is also possible by employing async generators as follows;
async function* rowEmitterGenerator(rs){
let _v, // previous resolve
_x, // previous reject
_row = new Promise((v,x) => (_v = v, _x = x));
rs.fetchRows((err, row) => ( err ? _x(err) : _v(row)
, _row = new Promise((v,x) => (_v = v, _x = x))
));
while(true){
try {
yield _row;
}
catch(e){
console.log(e);
}
}
}
Then, putting it all together in a top-level await context:
const rows = await promisified(api.prepare),
rowEmitter = rowEmitterGenerator(rows);
for await (let row of rowEmitter){
console.log(`Received row: ${row}`);
// do something with the row
}
I want to iterate over a MongoDB collection that has no numeric key (_id); the collection only has random strings as _id values. The collection is massive, so loading all the documents into RAM with .toArray() is not a viable option. I also want to perform an asynchronous task on each element, which limits the usefulness of .map(), .each(), and .forEach(). When I tried to run the task with those methods, it of course conflicted with the asynchronous work and returned pending promises instead of proper results.
example
async function dbanalyze(){
let cursor = db.collection('randomcollection').find()
for(;;){
const el = cursor.hasNext() ? loaded.next() : null;
if(!cursor) break
await performAnalyze(cursor) // <---- this doesn't return a document but just a cursor object
}
}
How can I iterate over a MongoDB collection using only for()?
The Cursor.hasNext() method is also "asynchronous", so you need to await that as well. Same goes for Cursor.next(). Therefore the actual "loop" usage really should be a while:
async function dbanalyze(){
let cursor = db.collection('randomcollection').find()
while ( await cursor.hasNext() ) { // will return false when there are no more results
let doc = await cursor.next(); // actually gets the document
// do something, possibly async with the current document
}
}
As noted in the comments, Cursor.hasNext() will eventually return false when the cursor is actually depleted, and Cursor.next() is what actually retrieves each value from the cursor. You could use other structures and break the loop when hasNext() is false, but it lends itself more naturally to a while.
These are still "async", so you need to await the promise resolution of each, and that was the main thing you were missing.
As for Cursor.map(), you are probably missing the point that the function you provide can be marked async as well:
cursor.map( async doc => { // We can mark as async
let newDoc = await someAsyncMethod(doc); // so you can then await inside
return newDoc;
})
But you still actually want to "iterate" that somewhere, unless you can get away with using .pipe() to some other output destination.
The async/await syntax also makes Cursor.forEach() "more practical again", since its one common flaw was not being able to simply handle an "inner" asynchronous call. With an async callback you can now do so with ease, though admittedly, since you must use a callback, you probably want to wrap it in a Promise:
await new Promise((resolve, reject) =>
cursor.forEach(
async doc => { // marked as async
let newDoc = await someAsyncMethod(doc); // so you can then await inside
// do other things
},
err => {
// await was respected, so we get here when done.
if (err) reject(err);
resolve();
}
)
);
Of course there have always been ways to do this with either callbacks or plain Promise implementations, but it's the "sugar" of async/await that actually makes it look much cleaner.
NodeJS v10.x and MongoDB Node driver 3.1.x and up
And my favorite version uses an AsyncIterator, which is enabled in NodeJS v10 and upwards. It's a much cleaner way to iterate:
async function dbanalyze(){
let cursor = db.collection('randomcollection').find()
for await ( let doc of cursor ) {
// do something with the current document
}
}
Which "in a way" comes back to what the question originally asked as to using a for loop since we can do the for-await-of syntax here fore supporting iterable which supports the correct interface. And the Cursor does support this interface.
If you're curious, here's a listing I cooked up some time ago to demonstrate various cursor iteration techniques. It even includes a case for Async Iterators from a generator function:
const Async = require('async'),
{ MongoClient, Cursor } = require('mongodb');
const testLen = 3;
(async function() {
let db;
try {
let client = await MongoClient.connect('mongodb://localhost/');
let db = client.db('test');
let collection = db.collection('cursortest');
await collection.remove();
await collection.insertMany(
Array(testLen).fill(1).map((e,i) => ({ i }))
);
// Cursor.forEach
console.log('Cursor.forEach');
await new Promise((resolve,reject) => {
collection.find().forEach(
console.log,
err => {
if (err) reject(err);
resolve();
}
);
});
// Async.during awaits cursor.hasNext()
console.log('Async.during');
await new Promise((resolve,reject) => {
let cursor = collection.find();
Async.during(
(callback) => Async.nextTick(() => cursor.hasNext(callback)),
(callback) => {
cursor.next((err,doc) => {
if (err) callback(err);
console.log(doc);
callback();
})
},
(err) => {
if (err) reject(err);
resolve();
}
);
});
// async/await allows while loop
console.log('async/await while');
await (async function() {
let cursor = collection.find();
while( await cursor.hasNext() ) {
let doc = await cursor.next();
console.log(doc);
}
})();
// await event stream
console.log('Event Stream');
await new Promise((end,error) => {
let cursor = collection.find();
for ( let [k,v] of Object.entries({ end, error, data: console.log }) )
cursor.on(k,v);
});
// Promise recursion
console.log('Promise recursion');
await (async function() {
let cursor = collection.find();
function iterate(cursor) {
return cursor.hasNext().then( bool =>
(bool) ? cursor.next().then( doc => {
console.log(doc);
return iterate(cursor);
}) : Promise.resolve()
)
}
await iterate(cursor);
})();
// Uncomment if node is run with async iteration enabled
// --harmony_async_iteration
console.log('Generator Async Iterator');
await (async function() {
async function* cursorAsyncIterator() {
let cursor = collection.find();
while (await cursor.hasNext() ) {
yield cursor.next();
}
}
for await (let doc of cursorAsyncIterator()) {
console.log(doc);
}
})();
// This is supported with Node v10.x and the 3.1 Series Driver
await (async function() {
for await (let doc of collection.find()) {
console.log(doc);
}
})();
client.close();
} catch(e) {
console.error(e);
} finally {
process.exit();
}
})();
I want to check if an async function throws using assert.throws from the native assert module.
I tried with
const test = async () => await aPromise();
assert.throws(test); // AssertionError: Missing expected exception..
It (obviously?) doesn't work because the function exits before the Promise is resolved.
Yet I found this question where the same thing is attained using callbacks.
Any suggestion?
(I'm transpiling to Node.js native generators using Babel.)
node 10 and newer
Since Node.js v10.0, there is assert.rejects which does just that.
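A minimal usage sketch inside a mocha-style async test case (aPromise here stands for the rejecting operation from the question):
const assert = require('assert');

it('rejects', async () => {
  // assert.rejects awaits the promise (or the async function's result)
  // and fails if it does not reject; the second argument optionally
  // checks the error type or message.
  await assert.rejects(() => aPromise(), Error);
});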
Older versions of node
async functions never throw - they return promises that might be rejected.
You cannot use assert.throws with them. You need to write your own asynchronous assertion:
const { AssertionError } = require("assert");

async function assertThrowsAsynchronously(test, error) {
try {
await test();
} catch(e) {
if (!error || e instanceof error)
return "everything is fine";
}
throw new AssertionError({ message: "Missing rejection" + (error ? " with " + error.name : "") });
}
and use it like
return assertThrowsAsynchronously(aPromise);
in an asynchronous test case.
Based on Bergi's answer, I suggest a more universal solution that utilizes the original assert.throws for error messages:
import assert from 'assert';
async function assertThrowsAsync(fn, regExp) {
let f = () => {};
try {
await fn();
} catch(e) {
f = () => {throw e};
} finally {
assert.throws(f, regExp);
}
}
Usage:
it('should throw', async function () {
await assertThrowsAsync(async () => await asyncTask(), /Error/);
});
The answers given work, but I came across this issue today and came up with another solution that I think is a little simpler.
// Code being tested
async function thisFunctionThrows() {
throw new Error('Bad response')
}
// In your test.
try {
await thisFunctionThrows()
assert.fail('Expected thisFunctionThrows() to throw') // Never gets run. But if it does, you know it didn't throw.
} catch (e) {
assert(e.message.includes('Bad response'))
}
Since the question is still getting attention, I'd like to sum up the two best solutions, especially to highlight the new standard method.
Node v10+
There's a dedicated method in the assert library, assert.rejects.
For older versions of Node
A polyfill from vitalets' answer:
import assert from 'assert';
async function assertThrowsAsync(fn, regExp) {
let f = () => {};
try {
await fn();
} catch(e) {
f = () => {throw e};
} finally {
assert.throws(f, regExp);
}
}
You are going to want to use assert.rejects(), which is new in Node.js version 10.
At a high level, instead of assert.throws we want something like assert.rejects; hopefully you can take this and run with it:
const assertRejects = (fn, options) => {
return Promise.resolve(fn()).catch(e => {
return {
exception: e,
result: 'OK'
}
})
.then(v => {
if (!(v && v.result === 'OK')) {
return Promise.reject('Missing exception.');
}
if (!options) {
return;
}
if (options.message) {
// check options
}
console.log('here we check options');
});
};
it('should save with error', async () => {
// should be an error because of duplication of unique document (see indexes in the model)
return await assertRejects(async () => {
patientSubscriber = await PatientSubscriber.create({
isSubscribed: true,
patient: patient._id,
subscriber: user._id
});
}, {
message: /PatientSubscriber validation failed/
});
});
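For comparison, on Node 10+ the same test can lean on the built-in assert.rejects directly; a sketch using the same PatientSubscriber model, assuming the rejection message matches the regular expression:
it('should save with error', async () => {
  // A duplicate of a unique document should make create() reject.
  await assert.rejects(
    () => PatientSubscriber.create({
      isSubscribed: true,
      patient: patient._id,
      subscriber: user._id
    }),
    /PatientSubscriber validation failed/
  );
});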