I'm trying to write a simple EventEmitter wrapper for amqplib/callback_api. I'm having trouble handling the situation when RabbitMQ is not available or gets disconnected.
I have a getConnect method which returns a Promise that resolves when the connection is established. But if the connection is refused, the Promise obviously rejects. How can I make this method keep reconnecting until a connection is established?
/**
 * Async method getConnect for connection
 * @returns {Promise<*>}
 */
getConnect = async () => {
  return new Promise((resolve, reject) => {
    amqp.connect(this.config.url, function(err, conn) {
      if (err) {
        return reject(err);
      }
      resolve(conn);
    });
  });
};
The whole code is here: https://github.com/kimonniez/rabbitEE
Maybe I'm just very sleepy, but I'm completely confused :) Thanks in advance!
Wrap your Promise inside an Observable
Promises are not built to handle "retry" logic. If you want to do that, you should look into Observables using the rxjs library. They will allow you to retry at an arbitrary time interval while catching errors.
const { from, interval, of } = rxjs;
const { catchError, mergeMap, tap, skipWhile, take } = rxjs.operators;
const THRESHOLD = 3;
const RETRY_INTERVAL = 1000;
// Equivalent to 'amqp.connect'
const functionThatThrows = number =>
number < THRESHOLD
? Promise.reject(new Error("ERROR"))
: Promise.resolve("OK");
// Equivalent to `getConnect`
const getConnect = () =>
interval(RETRY_INTERVAL)
.pipe(
mergeMap(x => from(functionThatThrows(x)).pipe(catchError(e => of(e)))),
skipWhile(x => {
const isError = x instanceof Error;
if (isError) console.log('Found error. Retrying...');
return isError;
}),
take(1)
).toPromise();
// Resolve only if the inner Promise is resolved
getConnect().then(console.log);
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.4.0/rxjs.umd.min.js"></script>
Explanation
Create a source with an interval of 1000 ms, meaning that it will retry every second
Call your amqp.connect, which is equivalent to functionThatThrows in my example
Catch the error using the catchError operator and return it
Skip while the returned object is an error. This will allow you to resolve only if your Promise has been resolved and not rejected
Take the first resolved result using take(1)
Convert your observable into a promise using the toPromise utility function
Call your function and attach then like you do with a standard Promise
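As a rough sketch of how the real connection could be plugged in, assuming RxJS 7.4 or newer (where retry accepts a config object) and a hypothetical promise-returning connect wrapper around amqp.connect (config.url is assumed from the question):
// Sketch only: requires RxJS >= 7.4 for retry({ delay })
const { defer, firstValueFrom } = rxjs;
const { retry } = rxjs.operators;

// Hypothetical wrapper around amqp.connect that returns a promise
const connect = () =>
  new Promise((resolve, reject) =>
    amqp.connect(config.url, (err, conn) => (err ? reject(err) : resolve(conn)))
  );

// Retry indefinitely, waiting 1 second between failed attempts
const getConnect = () =>
  firstValueFrom(defer(connect).pipe(retry({ delay: 1000 })));

getConnect().then(conn => console.log('connected', conn));
defer makes sure a fresh connection attempt (a fresh promise) is made on every retry subscription.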
If you just want to keep trying to connect until a connection is made, you can wrap the getConnect method into a new keepConnect method:
keepConnect = async () => {
while (true) {
try {
let conn = await getConnect()
return conn
} catch (e) {}
}
}
But I think it would be better to implement something like "try to connect n times" by changing the while condition. In general, a "while true" solution is not clean and could perform badly, with the risk of slowing down the event loop (imagine if the connect method always returns an error within a few milliseconds).
You could also implement a system of progressive delays between connection attempts, using the keepConnect wrapper as a starting point (see the sketch below).
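A minimal sketch of that idea, assuming the getConnect method from the question; maxAttempts and baseDelay are made-up parameters:
// Progressive (exponential) backoff between attempts: 1s, 2s, 4s, ...
const keepConnect = async (maxAttempts = 10, baseDelay = 1000) => {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await getConnect();
    } catch (e) {
      const delay = baseDelay * 2 ** attempt;
      console.warn(`Connection failed, retrying in ${delay} ms`);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw new Error(`Could not connect after ${maxAttempts} attempts`);
};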
If you instead want to reconnect when the connection is lost, then this is related to RabbitMQ (which I don't know) and its events.
Related
Currently I have many concurrent identical calls to my backend, differing only on an ID field:
getData(1).then(...) // Each from a React component in a UI framework, so difficult to aggregate here
getData(2).then(...)
getData(3).then(...)
// creates n HTTP requests... inefficient
function getData(id: number): Promise<Data> {
return backend.getData(id);
}
This is wasteful as I make more calls. I'd like to keep my getData() calls, but aggregate them into a single getDatas() call to my backend, and then return all the results to the callers. I have more control over my backend than over the UI framework, so I can easily add a getDatas() call to it. The question is how to "mux" the JS calls into one backend call, then "demux" the result into the callers' promises.
const cache = new Map<number, Promise<Data>>()
let requestedIds = []
let timeout = null;
// creates just 1 http request (per 100ms)... efficient!
function getData(id: number): Promise<Data> {
  if (cache.has(id)) {
    return cache.get(id);
  }
  requestedIds.push(id)
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas(requestedIds).then((datas: Data[]) => {
        // TODO: somehow populate many different promises in cache??? but how?
        requestedIds = []
        timeout = null
      })
    }, 100)
  }
  return ???
}
In Java I would create a Map<int, CompletableFuture> and, upon finishing my backend request, look up the CompletableFuture and call complete(data) on it. But I think in JS a Promise can't be created without an explicit result being passed in.
Can I do this in JS with Promises?
I'm a little unclear on what your end goal looks like. I imagine you could loop through your calls as needed; perhaps something like:
for (let x in cache){
if (x.has(id))
return x;
}
//OR
for (let x=0; x<id.length;x++){
getData(id[x])
}
Might work. You may be able to add a timing method into the mix if needed.
Not sure what your backend consists of, but I do know GraphQL is a good system for making multiple calls.
It may be ultimately better to handle them all in one request, rather than multiple calls.
The cache can be a regular object mapping ids to promise resolution functions and the promise to which they belong.
// cache maps ids to { resolve, reject, promise, requested }
// resolve and reject belong to the promise, requested is a bool for bookkeeping
const cache = {};
You might need to fire only once, but here I suggest setInterval to regularly check the cache for unresolved requests:
// keep the return value, and stop polling with clearInterval()
// if you really only need one batch, change setInterval to setTimeout
function startGetBatch() {
return setInterval(getBatch, 100);
}
The business logic calls only getData() which just hands out (and caches) promises, like this:
function getData(id) {
if (cache[id]) return cache[id].promise;
cache[id] = {};
const promise = new Promise((resolve, reject) => {
Object.assign(cache[id], { resolve, reject });
});
cache[id].promise = promise;
cache[id].requested = false;
return cache[id].promise;
}
By saving the promise along with the resolver and rejecter, we're also implementing the cache, since the resolved promise will provide the thing it resolved to via its then() method.
getBatch() asks the server in a batch for the not-yet-requested getData() ids, and invokes the corresponding resolve/reject functions:
function getBatch() {
  // collect the ids that haven't been requested yet
  const ids = [];
  Object.keys(cache).forEach(id => {
    if (!cache[id].requested) {
      cache[id].requested = true;
      ids.push(id);
    }
  });
  return backend.getDatas(ids).then(datas => {
    Object.keys(datas).forEach(id => {
      cache[id].resolve(datas[id]);
    });
  }).catch(error => {
    ids.forEach(id => {
      cache[id].reject(error);
      delete cache[id]; // so we can retry
    });
  });
}
The caller side looks like this:
// start polling
const interval = startGetBatch();
// in the business logic
getData(5).then(result => console.log('the result of 5 is:', result));
getData(6).then(result => console.log('the result of 6 is:', result));
// sometime later...
getData(5).then(result => {
// if the promise for an id has resolved, then-ing it still works, resolving again to the -- now cached -- result
console.log('the result of 5 is:', result)
});
// later, whenever we're done
// (no need for this if you change setInterval to setTimeout)
clearInterval(interval);
I think I've found a solution:
interface PromiseContainer<T> {
  resolve: (value: T) => void;
  reject: (reason?: any) => void;
}

const requests: Map<number, PromiseContainer<Data>> = new Map();
let timeout: number | null = null;

function getData(id: number) {
  const promise = new Promise<Data>((resolve, reject) => requests.set(id, { resolve, reject }))
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas([...requests.keys()]).then(datas => {
        for (let [id, data] of Object.entries(datas)) {
          requests.get(Number(id)).resolve(data)
          requests.delete(Number(id))
        }
      }).catch(e => {
        for (const container of requests.values()) container.reject(e)
      })
      timeout = null
    }, 100)
  }
  return promise;
}
The key was figuring out I could extract the (resolve, reject) from a promise, store them, then retrieve and call them later.
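That pattern is often packaged as a small "deferred" helper; here is a minimal sketch (the helper name is illustrative, not from the code above):
// Hypothetical helper: expose a promise together with its resolve/reject
// callbacks, much like Java's CompletableFuture
function createDeferred<T>() {
  let resolve!: (value: T) => void;
  let reject!: (reason?: unknown) => void;
  const promise = new Promise<T>((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

// Usage: settle the promise from outside, after it has been handed out
const deferred = createDeferred<string>();
deferred.promise.then(value => console.log('got', value));
deferred.resolve('hello');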
I am looking at https://www.promisejs.org/patterns/ and it mentions it can be used if you need a value in the form of a promise like:
var value = 10;
var promiseForValue = Promise.resolve(value);
What would be the use of a value in promise form though since it would run synchronously anyway?
If I had:
var value = 10;
var promiseForValue = Promise.resolve(value);
promiseForValue.then(resp => {
myFunction(resp)
})
wouldn't just using value without it being a Promise achieve the same thing:
var value = 10;
myFunction(10);
Say you write a function that sometimes fetches something from a server but other times returns immediately; you will probably want that function to always return a promise:
function myThingy() {
if (someCondition) {
return fetch('https://foo');
} else {
return Promise.resolve(true);
}
}
It's also useful if you receive some value that may or may not be a promise. You can wrap it in another promise, and now you are sure it's a promise:
const myValue = someStrangeFunction();
// Guarantee that myValue is a promise
Promise.resolve(myValue).then( ... );
In your examples, yes, there's no point in calling Promise.resolve(value). The use case is when you do want to wrap your already existing value in a Promise, for example to maintain the same API from a function. Let's say I have a function that conditionally does something that would return a promise — the caller of that function shouldn't be the one figuring out what the function returned, the function itself should just make that uniform. For example:
const conditionallyDoAsyncWork = (something) => {
if (something == somethingElse) {
return Promise.resolve(false)
}
return fetch(`/foo/${something}`)
.then((res) => res.json())
}
Then users of this function don't need to check if what they got back was a Promise or not:
const doSomethingWithData = () => {
conditionallyDoAsyncWork(someValue)
.then((result) => result && processData(result))
}
As a side note, using async/await syntax both hides that and makes it a bit easier to read, because any value you return from an async function is automatically wrapped in a Promise:
const conditionallyDoAsyncWork = async (something) => {
if (something == somethingElse) {
return false
}
const res = await fetch(`/foo/${something}`)
return res.json()
}
const doSomethingWithData = async () => {
const result = await conditionallyDoAsyncWork(someValue)
if (result) processData(result)
}
Another use case: a dead simple async queue using Promise.resolve() as the starting point.
let current = Promise.resolve();
function enqueue(fn) {
current = current.then(fn);
}
enqueue(async () => { console.log("async task") });
Edit, in response to OP's question.
Explanation
Let me break it down for you step by step.
enqueue(task) adds the task function as a callback to promise.then, and replaces the original current promise reference with the newly returned thenPromise.
current = Promise.resolve()
thenPromise = current.then(task)
current = thenPromise
As per the promise spec, if the task function in turn returns yet another promise (let's call it task() -> taskPromise), then thenPromise will only resolve when taskPromise resolves. thenPromise is practically equivalent to taskPromise; it's just a wrapper. Let's rewrite the above code as:
current = Promise.resolve()
taskPromise = current.then(task)
current = taskPromise
So if you go like:
enqueue(task_1)
enqueue(task_2)
enqueue(task_3)
it expands into
current = Promise.resolve()
task_1_promise = current.then(task_1)
task_2_promise = task_1_promise.then(task_2)
task_3_promise = task_2_promise.then(task_3)
current = task_3_promise
This effectively forms a linked-list-like structure of promises that executes the task callbacks in sequential order.
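A quick runnable illustration of that ordering (the delays are arbitrary; the point is that a slow earlier task still finishes before a fast later one):
let current = Promise.resolve();
function enqueue(fn) {
  current = current.then(fn);
}

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

enqueue(async () => { await sleep(300); console.log('task 1 (slow)'); });
enqueue(async () => { await sleep(10);  console.log('task 2 (fast)'); });
enqueue(async () => { console.log('task 3'); });
// Always prints: task 1 (slow), task 2 (fast), task 3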
Usage
Let's study a concrete scenario. Imagine you need to handle websocket messages in sequential order.
Let's say you need to do some heavy computation upon receiving messages, so you decide to send it off to a worker thread pool. Then you write the processed result to another message queue (MQ).
But here's the requirement: the MQ expects the write order of messages to match the order they come in from the websocket stream. What do you do?
Suppose you cannot pause the websocket stream; you can only handle the messages locally as soon as possible.
Take One:
websocket.on('message', (msg) => {
sendToWorkerThreadPool(msg).then(result => {
writeToMessageQueue(result)
})
})
This may violate the requirement, because sendToWorkerThreadPool may not return results in the original order: it's a pool, and some threads may return faster if their workload is lighter.
Take Two:
websocket.on('message', (msg) => {
const task = () => sendToWorkerThreadPool(msg).then(result => {
writeToMessageQueue(result)
})
enqueue(task)
})
This time we enqueue (defer) the whole process, so we can ensure the task execution order stays sequential. But there's a drawback: we lose the benefit of using a thread pool, because each sendToWorkerThreadPool call only fires after the previous one completes. This model is equivalent to using a single worker thread.
Take Three:
websocket.on('message', (msg) => {
const promise = sendToWorkerThreadPool(msg)
const task = () => promise.then(result => {
writeToMessageQueue(result)
})
enqueue(task)
})
The improvement over take two is that we call sendToWorkerThreadPool immediately, without deferring, but we still enqueue/defer the writeToMessageQueue part. This way we make full use of the thread pool for computation while still ensuring the sequential write order to the MQ.
I rest my case.
I'm very new to promises. I need my server to wait for socket connections from an external API and from browser client connections. The external API sends a number of objects (4 in this example) to the server; this is received via a promise, which then calls the function that waits on both promises. For each object received by the promise, a browser client can make a connection (promise) and join the game.
I have a function which should wait for variables to be populated by these two promises. It is successful in waiting for the external API objects, but it never receives the promise indicating that the correct number of clients have made connections.
I wrapped the socket listening for the external API objects in a promise, as they will only be sent once. I also call the function which handles the two promises here, as it didn't seem to work anywhere else.
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise ((resolve) => {
socket.on("dictData", async (data)=>{
try {
let {songName, level, imageArr} = data;
let [imageObj] = imageArr;
gameVars.songName = songName;
gameVars.level = level;
let gameObject = {};
for (let obj in imageArr) {
let objectId = imageArr[obj].name;
gameObject.objectId = objectId;
gameObject.path = imageObj.path;
// gameObject.files = imageObj.imagePath;
gameState.totalServerCount ++;
gameState.serverList.push({gameObject});
}
resolve(gameState.serverList) //resolve the promise with the array of objects
sendData()
}
catch (e) {
console.error(e)
}
});
});
I also wrapped the client req listener in a promise because after countless tries to nest the promise inside, this was the only solution which didn't return the actual socket as the promise, so I feel this is probably the closest solution for me.
This promise should only resolve when there are the same number of client connections as there are server objects received in the first promise. I am testing by simply connecting from 4 open tabs to localhost:3000.
//HANDLER FOR CLIENT REQUEST TO JOIN GAME
const playerPromise = new Promise ((resolve, reject) => {
socket.on('joinGame', async () => {
try {
gameState.totalPlayerCount++;
gameState.playerList.push(socket.id)
switch (true) {
case gameState.totalPlayerCount < gameState.totalServerCount :
console.log("Not enough players", gameState.totalPlayerCount)
break;
case gameState.totalPlayerCount <= gameState.totalServerCount :
console.log("Counts are equal", gameState)
readyPlayers = true;
resolve(gameState.playerList)
break;
case gameState.totalServerCount == 0 :
console.log("Server not ready", gameState)
break;
default :
console.log("Too many players", gameState.totalPlayerCount)
reject("Too many players", gameState.playerList)
}
}
catch(e) {
console.error(e);
}
})
})
The sendData() function logs the 1st and 2nd tests to the console, but never the 3rd.
async function sendData() {
try {
console.log("TEST1")
const dataMax = await maxPromise;
console.log("TEST2", dataMax)
const dataPlay = await playerPromise;
console.log("TEST3", dataPlay)
for (var key in await dataPlay) {
io.to(dataPlay[key]).emit('gameData', dataPlay[key]);
}
}
catch(e) {
console.error(e)
}
};
I've looked at every other similar post on Stack Overflow and online but cannot find any solution to this or where I'm going wrong. I have also devised the above solution with minimal knowledge of socket.io and promises, so if there is a better/cleaner way to do the above, please let me know.
EDIT:
This is my current solution using only one promise, but now the promise is not being populated at all in the send function:
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise ((resolve) => {
socket.on("dictData", async (data)=>{
try {
let {songName, level, imageArr} = data;
let [imageObj] = imageArr;
gameVars.songName = songName;
gameVars.level = level;
let gameObject = {};
for (let obj in imageArr) {
let objectId = imageArr[obj].name;
gameObject.objectId = objectId;
gameObject.path = imageObj.path;
gameState.totalServerCount ++;
gameState.serverList.push({gameObject});
}
resolve(gameState.serverList)
}
catch (e) {
console.error(e)
}
});
});
async function sendData(playerData) {
try {
console.log("TEST1")
const dataMax = await maxPromise;
console.log("TEST2")
for (var key in await playerData) {
io.to(playerData[key]).emit('gameData', dataMax);
}
}
catch(e) {
console.error(e)
}
};
sendData() is called in the client socket handler, which just passes the array of connections as playerData. "TEST2" is never logged.
Seeing as the promise maxPromise is global, shouldn't it be able to access its value?
You've probably figured this out by now. It's an interesting problem, and I'd like to know how you solved it. As I understand it, you need to wait for data to arrive in order to select players. That seems like a good use of promises. You could use socket.once instead of socket.on for dictData if it's a one-time event.
At the same time you don't want to block players, yet you still need to wait for enough players to join. Awaiting another promise is again a good gating technique.
If you haven't solved all the issues, I suggest first removing socket.io from the equation while developing the gating logic. You can do this in Node with custom event emitters, simulating players and data events occurring at random times (see the sketch after these notes). You can also do this in the browser with custom events or broadcast channels. You'll find this more convenient than manually connecting to a port.
I'd put in a lot of logging with millisecond timestamps to easily understand the sequence of events - when they occur and when they're handled.
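A rough sketch of that approach, not a drop-in solution: the two event sources are simulated with Node's EventEmitter so the gating logic can be developed without socket.io. Event names follow the question (dictData, joinGame); the payloads and timings are made up.
const EventEmitter = require('events');
const fake = new EventEmitter();

const log = (...args) => console.log(new Date().toISOString(), ...args);

// the data arrives once, at a random time
setTimeout(() => fake.emit('dictData', { imageArr: [1, 2, 3, 4] }), Math.random() * 2000);

// four "players" join at random times, possibly before the data arrives
for (let i = 0; i < 4; i++) {
  setTimeout(() => fake.emit('joinGame', `player-${i}`), Math.random() * 3000);
}

const dataPromise = new Promise(resolve => fake.once('dictData', resolve));

// gate: resolve once we know how many players are required AND that many have joined
const playersPromise = new Promise(resolve => {
  const players = [];
  let required = Infinity;
  const check = () => { if (players.length >= required) resolve(players.slice(0, required)); };
  fake.on('joinGame', id => { players.push(id); log('joined', id); check(); });
  dataPromise.then(({ imageArr }) => { required = imageArr.length; check(); });
});

(async () => {
  const data = await dataPromise;
  log('data arrived', data);
  const players = await playersPromise;
  log('all players ready', players);
})();
Note that the joinGame listener is registered immediately and the count is buffered, so players who join before the data arrives are not lost.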
Is there any typescript config option or some workaround to check if there is no resolve called in new Promise callback?
Assume I have a promise
new Promise(async (resolve, reject) => {
try {
// do some stuff
// but not calling resolve()
} catch (e) {
reject(e)
}
})
I want typescript to warn me that I did not call resolve(). Is it possible?
I know that I can use noUnusedParameters, but there are a couple of situations where I still need unused parameters (e.g. request inside of express.Handler, where I only use response, etc.)
No, that is not possible. Knowing whether code calls a certain function (resolve in this case) is just as hard as the halting problem. There is proof that no algorithm exists that can always determine this.
To illustrate, let's assume that the algorithm for determining whether a function calls resolve exists, and is made available via the function callsResolve(func). So callsResolve(func) will return true when it determines that func will call resolve (without actually running func), and false when it determines that func will not call resolve.
Now imagine this func:
function func() {
if (!callsResolve(func)) resolve();
}
... now we have a paradox: whatever this call to callsResolve returned, it was wrong. For instance, if the implementation of callsResolve simulated an execution of func (synchronously) and determined that after a predefined timeout it should return false, the above is a demonstration of a function that calls resolve just after that timeout expires.
The closest you can get to a compile time check is to use async / await syntax.
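A brief illustration of why async / await helps here (a sketch, not from the original answer): the promise returned by an async function settles when the function returns or throws, so there is no resolve callback to forget, and TypeScript checks the declared return type.
// With a Promise executor, forgetting resolve() compiles fine and hangs forever:
const p = new Promise<number>((resolve) => {
  // oops - resolve(42) is never called
});

// With async/await, the returned promise resolves when the function returns.
// Deleting this return statement is a compile-time error because the declared
// return type requires a value to be returned.
async function getAnswer(): Promise<number> {
  return 42;
}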
If you don't want to use that, you could add a timeout to your promises, though you would have to do that for each promise when you create it.
A solution could look like this:
export const resolveAfterDelay = (timeout: number) => new Promise((r) => setTimeout(r, timeout));
export const rejectAfterDelay = async (timeout: number) => {
return new Promise((resolve, reject) => setTimeout(() => reject(`Promise timed out as resolve was not called within ${timeout}ms`), timeout));
};
export const timeoutPromise = <T>(timeout: number) => async (p: Promise<T>): Promise<T> => {
return Promise.race([p, rejectAfterDelay(timeout)]);
};
const timeoutAfter1s = timeoutPromise(1e3);
const timeoutAfter10s = timeoutPromise(10e3);
timeoutAfter10s(resolveAfterDelay(3e3)).then(success => console.log("SUCCESS IS PRINTED")).catch(console.error); // works
timeoutAfter1s(resolveAfterDelay(3e3)).then(success => console.log("NEVER REACHED")).catch(console.error); // aborts
const neverResolvingPromise = new Promise(() => {
});
timeoutAfter1s(neverResolvingPromise).catch(console.error); // promise never resolves but it will be rejected by the timeout
It makes use of Promise.race. Basically, whatever first resolves or rejects will be returned. We want to always reject a Promise if it does not resolve in time.
You would always have to wrap your Promise on creation like
timeoutAfter10s(new Promise(...));
And you would have to adapt the timeout according to your use case.
All four functions called below in update return promises.
async function update() {
var urls = await getCdnUrls();
var metadata = await fetchMetaData(urls);
var content = await fetchContent(metadata);
await render(content);
return;
}
What if we want to abort the sequence from outside, at any given time?
For example, while fetchMetaData is being executed, we realize we no longer need to render the component and we want to cancel the remaining operations (fetchContent and render). Is there a way to abort/cancel these operations from outside the update function?
We could check against a condition after each await, but that seems like an inelegant solution, and even then we will have to wait for the current operation to finish.
The standard way to do this now is through AbortSignals
async function update({ signal } = {}) {
  // pass the signal on to each method so it can cancel its own work in turn;
  // AbortSignal is implemented throughout Node.js and most of the web platform
  try {
    var urls = await getCdnUrls({ signal });
    var metadata = await fetchMetaData(urls, { signal });
    var content = await fetchContent(metadata, { signal });
    await render(content, { signal });
  } catch (e) {
    if (e.name !== 'AbortError') throw e;
  }
  return;
}
// usage
const ac = new AbortController();
update({ signal: ac.signal });
ac.abort(); // cancel the update
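For the first step, a signal-aware implementation might look like the sketch below; fetch natively accepts an AbortSignal and rejects with an AbortError when the signal fires. The URL is made up.
// Sketch: pass the signal straight through to fetch()
async function getCdnUrls({ signal } = {}) {
  const res = await fetch('https://example.com/cdn-urls', { signal });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}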
OLD 2016 content below, beware dragons
I just gave a talk about this - this is a lovely topic but sadly you're not really going to like the solutions I'm going to propose as they're gateway-solutions.
What the spec does for you
Getting cancellation "just right" is actually very hard. People have been working on just that for a while and it was decided not to block async functions on it.
There are two proposals attempting to solve this in ECMAScript core:
Cancellation tokens - which adds cancellation tokens that aim to solve this issue.
Cancelable promise - which adds catch cancel (e) { syntax and throw.cancel syntax which aims to address this issue.
Both proposals changed substantially over the last week so I wouldn't count on either to arrive in the next year or so. The proposals are somewhat complementary and are not at odds.
What you can do to solve this from your side
Cancellation tokens are easy to implement. Sadly the sort of cancellation you'd really want (aka "third state" cancellation, where cancellation is not an exception) is impossible with async functions at the moment since you don't control how they're run. You can do two things:
Use coroutines instead - bluebird ships with sound cancellation using generators and promises which you can use.
Implement tokens with abortive semantics - this is actually pretty easy so let's do it here
CancellationTokens
Well, a token signals cancellation:
class Token {
  constructor(fn) {
    this.isCancellationRequested = false;
    this.onCancelled = []; // actions to execute when cancelled
    this.onCancelled.push(() => this.isCancellationRequested = true);
    // expose a promise to the outside
    this.promise = new Promise(resolve => this.onCancelled.push(resolve));
    // let the user add handlers
    if (fn) fn(f => this.onCancelled.push(f));
  }
  cancel() { this.onCancelled.forEach(x => x()); }
}
This would let you do something like:
async function update(token) {
if(token.isCancellationRequested) return;
var urls = await getCdnUrls();
if(token.isCancellationRequested) return;
var metadata = await fetchMetaData(urls);
if(token.isCancellationRequested) return;
var content = await fetchContent(metadata);
if(token.isCancellationRequested) return;
await render(content);
return;
}
var token = new Token(); // don't need any special handling here
update(token);
// ...
if(updateNotNeeded) token.cancel(); // will abort asynchronous actions
This is a really ugly way that works; optimally you'd want async functions to be aware of this, but they're not (yet).
Optimally, all your interim functions would be aware and would throw on cancellation (again, only because we can't have third-state), which would look like:
async function update(token) {
var urls = await getCdnUrls(token);
var metadata = await fetchMetaData(urls, token);
var content = await fetchContent(metadata, token);
await render(content, token);
return;
}
Since each of our functions is cancellation-aware, they can perform actual logical cancellation: getCdnUrls can abort the request and throw, fetchMetaData can abort the underlying request and throw, and so on.
Here is how one might write getCdnUrl (note the singular) using the XMLHttpRequest API in browsers:
function getCdnUrl(url, token) {
var xhr = new XMLHttpRequest();
xhr.open("GET", url);
var p = new Promise((resolve, reject) => {
xhr.onload = () => resolve(xhr);
xhr.onerror = e => reject(new Error(e));
token.promise.then(x => {
try { xhr.abort(); } catch(e) {}; // ignore abort errors
reject(new Error("cancelled"));
});
});
xhr.send();
return p;
}
This is as close as we can get with async functions without coroutines. It's not very pretty but it's certainly usable.
Note that you'd want to avoid cancellations being treated as exceptions. This means that if your functions throw on cancellation, you need to filter those errors in your global error handlers, e.g. process.on("unhandledRejection", e => ...) and such.
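A minimal sketch of such a filter, assuming the cancellation rejections carry the "cancelled" message used in getCdnUrl above:
process.on('unhandledRejection', (reason) => {
  // ignore rejections that merely signal cancellation
  if (reason instanceof Error && reason.message === 'cancelled') return;
  console.error('Unhandled rejection:', reason);
});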
You can get what you want using Typescript + Bluebird + cancelable-awaiter.
Now that all evidence points to cancellation tokens not making it into ECMAScript, I think the best solution for cancellation is the bluebird implementation mentioned by @BenjaminGruenbaum; however, I find the usage of coroutines and generators a bit clumsy and hard on the eyes.
Since I'm using Typescript, which now support async/await syntax for es5 and es3 targets, I've created a simple module which replaces the default __awaiter helper with one that supports bluebird cancellations: https://www.npmjs.com/package/cancelable-awaiter
Unfortunately, there is no built-in support for cancellable promises so far. There are some custom implementations, e.g.:
Extends/wraps a promise to be cancellable and resolvable
function promisify(promise) {
let _resolve, _reject
let wrap = new Promise(async (resolve, reject) => {
_resolve = resolve
_reject = reject
let result = await promise
resolve(result)
})
wrap.resolve = _resolve
wrap.reject = _reject
return wrap
}
Usage: Cancel promise and stop further execution immediately after it
async function test() {
// Create promise that should be resolved in 3 seconds
let promise = new Promise(resolve => setTimeout(() => resolve('our resolved value'), 3000))
// extend our promise to be cancellable
let cancellablePromise = promisify(promise)
// Cancel promise in 2 seconds.
// if you comment this line out, then promise will be resolved.
setTimeout(() => cancellablePromise.reject('error code'), 2000)
// wait promise to be resolved
let result = await cancellablePromise
// this line will never be executed!
console.log(result)
}
In this approach, the promise itself still runs to completion, but the caller code that awaits its result can be 'cancelled'.
Unfortunately, no, you can't control the execution flow of default async/await behaviour. That does not mean the problem itself is unsolvable; it means you need to change your approach a bit.
First of all, your proposal about wrapping every async line in a check is a working solution, and if you have just a couple of places with such functionality, there is nothing wrong with it.
If you want to use this pattern often, the best solution is probably to switch to generators: while not as widespread, they allow you to define each step's behaviour, and adding cancellation is easiest there. Generators are pretty powerful, but, as I've mentioned, they require a runner function and are not as straightforward as async/await.
Another approach is the cancellable token pattern: you create an object, which will be filled in by the function that wants to implement this functionality:
async function updateUser(token) {
let cancelled = false;
// we don't reject, since we don't have access to
// the returned promise
// so we just don't call other functions, and reject
// in the end
token.cancel = () => {
cancelled = true;
};
const data = await wrapWithCancel(fetchData)();
const userData = await wrapWithCancel(updateUserData)(data);
const userAddress = await wrapWithCancel(updateUserAddress)(userData);
const marketingData = await wrapWithCancel(updateMarketingData)(userAddress);
// because we've wrapped all functions, in case of cancellations
// we'll just fall through to this point, without calling any of
// actual functions. We also can't reject by ourselves, since
// we don't have control over returned promise
if (cancelled) {
throw { reason: 'cancelled' };
}
return marketingData;
function wrapWithCancel(fn) {
return data => {
if (!cancelled) {
return fn(data);
}
}
}
}
const token = {};
const promise = updateUser(token);
// wait some time...
token.cancel(); // user will be updated any way
I've written articles, both on cancellation and generators:
promise cancellation
generators usage
To summarize: you have to do some additional work in order to support cancellation, and if you want to have it as a first-class citizen in your application, you have to use generators.
Here is a simple example with a promise:
let resp = await new Promise(function(resolve, reject) {
// simulating time consuming process
setTimeout(() => resolve('Promise RESOLVED !'), 3000);
// hit a button to cancel the promise
$('#btn').click(() => resolve('Promise CANCELED !'));
});
Please see this codepen for a demo
Using CPromise (c-promise2 package) this can be easily done in the following way
(Demo):
import CPromise from "c-promise2";
async function getCdnUrls() {
console.log(`task1:start`);
await CPromise.delay(1000);
console.log(`task1:end`);
}
async function fetchMetaData() {
console.log(`task2:start`);
await CPromise.delay(1000);
console.log(`task2:end`);
}
function* fetchContent() {
// using generators is the recommended way to write asynchronous code with CPromise
console.log(`task3:start`);
yield CPromise.delay(1000);
console.log(`task3:end`);
}
function* render() {
console.log(`task4:start`);
yield CPromise.delay(1000);
console.log(`task4:end`);
}
const update = CPromise.promisify(function* () {
var urls = yield getCdnUrls();
var metadata = yield fetchMetaData(urls);
var content = yield* fetchContent(metadata);
yield* render(content);
return 123;
});
const promise = update().then(
(v) => console.log(`Done: ${v}`),
(e) => console.warn(`Fail: ${e}`)
);
setTimeout(() => promise.cancel(), 2500);
Console output:
task1:start
task1:end
task2:start
task2:end
task3:start
Fail: CanceledError: canceled
Just like in regular code you should throw an exception from the first function (or each of the next functions) and have a try block around the whole set of calls. No need to have extra if-elses. That's one of the nice bits about async/await, that you get to keep error handling the way we're used to from regular code.
As for cancelling the other operations, there is no need to. They will not actually start until their expressions are encountered by the interpreter. So the second async call will only start after the first one finishes without errors. Other tasks might get the chance to execute in the meantime, but for all intents and purposes, this section of code is serial and will execute in the desired order.
This answer I posted may help you to rewrite your function as:
async function update() {
  var get_urls = comPromise.race([getCdnUrls()]);
  var get_metadata = get_urls.then(urls => fetchMetaData(urls));
  var get_content = get_metadata.then(metadata => fetchContent(metadata));
  var get_render = get_content.then(content => render(content));
  await get_render;
  return;
}
// this is the cancel command so that later steps will never proceed:
get_urls.abort();
But I am yet to implement the "class-preserving" then function so currently you have to wrap every part you want to be able to cancel with comPromise.race.
I created a library called #kaisukez/cancellation-token
The idea is to pass a CancellationToken to every async function, then wrap every promise in an AsyncCheckpoint, so that when the token is cancelled, your async function will be cancelled at the next checkpoint.
This idea came from tc39/proposal-cancelable-promises
and conradreuter/cancellationtoken.
How to use my library
Refactor your code
// from this
async function yourFunction(param1, param2) {
const result1 = await someAsyncFunction1(param1)
const result2 = await someAsyncFunction2(param2)
return [result1, result2]
}
// to this
import { AsyncCheckpoint } from '#kaisukez/cancellation-token'
async function yourFunction(token, param1, param2) {
const result1 = await AsyncCheckpoint.after(token, () => someAsyncFunction1(param1))
const result2 = await AsyncCheckpoint.after(token, () => someAsyncFunction2(param2))
return [result1, result2]
}
Create a token then call your function with that token
import { CancellationToken, CancellationError } from '#kaisukez/cancellation-token'
const [token, cancel] = CancellationToken.source()
// spawn background task (run async function without using `await`)
CancellationError.ignoreAsync(() => yourAsyncFunction(token, param1, param2))
// ... do something ...
// then cancel the background task
await cancel()
So this is the solution of the OP's question.
import { CancellationToken, CancellationError, AsyncCheckpoint } from '#kaisukez/cancellation-token'
async function update(token) {
var urls = await AsyncCheckpoint.after(token, () => getCdnUrls());
var metadata = await AsyncCheckpoint.after(token, () => fetchMetaData(urls));
var content = await AsyncCheckpoint.after(token, () => fetchContent(metadata));
await AsyncCheckpoint.after(token, () => render(content));
return;
}
const [token, cancel] = CancellationToken.source();
// spawn background task (run async function without using `await`)
CancellationError.ignoreAsync(() => update(token))
// ... do something ...
// then cancel the background task
await cancel()
Example written in Node with Typescript of a call which can be aborted from outside:
import { EventEmitter } from 'events';

function cancelable(asyncFunc: Promise<void>): [Promise<void>, () => boolean] {
class CancelEmitter extends EventEmitter { }
const cancelEmitter = new CancelEmitter();
const promise = new Promise<void>(async (resolve, reject) => {
cancelEmitter.on('cancel', () => {
resolve();
});
try {
await asyncFunc;
resolve();
} catch (err) {
reject(err);
}
});
return [promise, () => cancelEmitter.emit('cancel')];
}
Usage:
const asyncFunction = async () => {
// doSomething
}
const [promise, cancel] = cancelable(asyncFunction());
setTimeout(() => {
cancel();
}, 2000);
(async () => await promise)();