slowness in promise chain - javascript

My code transfers a huge number of files from one cloud storage to another (same region). The workflow is: download the source file stream, then upload the stream to the target storage.
If I run the transfers in a non-promise loop, they are fast (say, 100 MB/s), but the process hits the memory limit and eventually the server crashes.
If I run them in a promise chain, i.e. start the next job only after the last job completes, the crash problem is solved, but the transfer speed is very slow (say, 10 MB/s).
My question: why would the promise chain affect the download and upload speed? Or is there anything I missed?
Code snippet:
transferArray.forEach(function (eachTransfer) {
  queue = queue.then(function (result) {
    // put result somewhere
    return eachFileTransfer(jobId, userid, eachTransfer);
  });
});

queue.then(function () {
  console.log('done');
});
I am thinking of using a promise pool with concurrency, but I'm not sure how much the speed would improve, or what a reasonable concurrency level would be. The reference post:
Execute promises concurrently with a buffer pool size in Javascript

Because I guess with your loop you run a lot of requests in parallel, which will be faster (but will also exceed all kinds of limits). Therefore, just as you said, use multiple promise chains:
const inParallel = 10;
const promises = new Array(inParallel).fill(Promise.resolve());
for (const [index, transfer] of transferArray.entries())
  promises[index % inParallel] = promises[index % inParallel]
    .then(() => eachFileTransfer(jobId, userid, transfer));
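Note the modulo striping: transfers are dealt round-robin across ten independent chains, so at most ten transfers are in flight at once, and a slow file only delays the other files in its own chain rather than the whole batch.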

Finally, I made it work with PromisePool. The code snippet below is based on the other post: Execute promises concurrently with a buffer pool size in Javascript.
Since my code is more complex than the one in that post, I thought it might be helpful for others:
var PromisePool = require('es6-promise-pool');

// single promises
function p1() {
  return new Promise(function (resolve, reject) {
    console.log('p1');
    setTimeout(resolve, 2000, 'foo');
  });
}

function p2() {
  return new Promise(function (resolve, reject) {
    console.log('p2');
    setTimeout(resolve, 2000, 'foo');
  });
}

function p3() {
  return new Promise(function (resolve, reject) {
    console.log('p3');
    setTimeout(resolve, 2000, 'foo');
  });
}

var tasks = [];
var loopIndex = 0;

[1, 2, 3, 4, 5, 6, 7, 8].forEach(function (v, i) {
  console.log(v, i);

  // build promise chain
  var x = (v) => new Promise(function (resolve, reject) {
    console.log(v);
    p1().then(function (r) {
      return p2();
    })
    .then(function (r) {
      return p3();
    })
    .then(function (r) {
      resolve('embedded chain done');
    })
    .catch(function (e) {
      reject('embedded chain failed');
    });
  });

  // build one promise task
  tasks.push({ fun: x, param: i });

  if (++loopIndex == 8) {
    // once the array is done
    const promiseProducer = () => {
      while (tasks.length) {
        console.log('processing');
        const task = tasks.shift();
        return task.fun(task.param);
      }
      return null;
    };

    // concurrent promises set to 3
    const pool = new PromisePool(promiseProducer, 3);

    // start processing, with at most 3 concurrent tasks
    const poolPromise = pool.start();
    poolPromise.then(() => { console.log('done!'); });
  }
});
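For comparison, here is a dependency-free sketch of the same bounded-concurrency idea, using async workers that drain a shared task list (it assumes eachFileTransfer, jobId, userid and transferArray from the question):

// Minimal pool sketch: N workers pull from a shared queue until it is empty.
function transferAll(transfers, concurrency) {
  const queue = transfers.slice(); // copy so shift() doesn't mutate the caller's array
  async function worker() {
    while (queue.length) {
      const transfer = queue.shift();
      await eachFileTransfer(jobId, userid, transfer); // assumed to return a promise
    }
  }
  // start `concurrency` workers and wait for all of them to finish
  return Promise.all(Array.from({ length: concurrency }, () => worker()));
}

transferAll(transferArray, 3).then(() => console.log('done!'));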

Related

Aggregate multiple calls then separate result with Promise

Currently I have many concurrent identical calls to my backend, differing only on an ID field:
getData(1).then(...) // Each from a React component in a UI framework, so difficult to aggregate here
getData(2).then(...)
getData(3).then(...)

// creates n HTTP requests... inefficient
function getData(id: number): Promise<Data> {
  return backend.getData(id);
}
This is wasteful as I make more calls. I'd like to keep my getData() calls, but aggregate them into a single getDatas() call to my backend, then return all the results to the callers. I have more control over my backend than over the UI framework, so I can easily add a getDatas() call to it. The question is how to "mux" the JS calls into one backend call, then "demux" the result into the callers' promises.
const cache = new Map<number, Promise<Data>>()
let requestedIds = []
let timeout = null;

// creates just 1 http request (per 100ms)... efficient!
function getData(id: number): Promise<Data> {
  if (cache.has(id)) {
    return cache.get(id);
  }
  requestedIds.push(id)
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas(requestedIds).then((datas: Data[]) => {
        // TODO: somehow populate many different promises in cache??? but how?
        requestedIds = []
        timeout = null
      })
    }, 100)
  }
  return ???
}
In Java I would create a Map<int, CompletableFuture> and upon finishing my backend request, I would look up the CompletableFuture and call complete(data) on it. But I think in JS Promises can't be created without an explicit result being passed in.
Can I do this in JS with Promises?
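For the record, the resolver functions can be captured outside the executor, which gives exactly a CompletableFuture-like handle. A minimal sketch of this "deferred" pattern, which the answers below build on:

// A promise whose resolve/reject are stored for later use.
function deferred() {
  let resolve, reject;
  const promise = new Promise((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

const d = deferred();
d.promise.then(data => console.log('completed with', data));
d.resolve(42); // analogous to CompletableFuture#complete(data)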
A little unclear on what your end goal looks like. I imagine you could loop through your calls as needed; perhaps something like:
for (let x in cache) {
  if (x.has(id))
    return x;
}

// OR

for (let x = 0; x < id.length; x++) {
  getData(id[x])
}
Might work. You may be able to add a timing method into the mix if needed.
Not sure what your backend consists of, but I do know GraphQL is a good system for making multiple calls.
It may be ultimately better to handle them all in one request, rather than multiple calls.
The cache can be a regular object mapping ids to promise resolution functions and the promise to which they belong.
// cache maps ids to { resolve, reject, promise, requested }
// resolve and reject belong to the promise, requested is a bool for bookkeeping
const cache = {};
You might need to fire only once, but here I suggest setInterval to regularly check the cache for unresolved requests:
// keep the return value, and stop polling with clearInterval()
// if you really only need one batch, change setInterval to setTimeout
function startGetBatch() {
return setInterval(getBatch, 100);
}
The business logic calls only getData() which just hands out (and caches) promises, like this:
function getData(id) {
  if (cache[id]) return cache[id].promise;
  cache[id] = {};
  const promise = new Promise((resolve, reject) => {
    Object.assign(cache[id], { resolve, reject });
  });
  cache[id].promise = promise;
  cache[id].requested = false;
  return cache[id].promise;
}
By saving the promise along with the resolver and rejecter, we're also implementing the cache, since the resolved promise will provide the thing it resolved to via its then() method.
getBatch() asks the server in a batch for the not-yet-requested getData() ids, and invokes the corresponding resolve/reject functions:
function getBatch() {
  // collect the ids that haven't been requested yet
  const ids = [];
  Object.keys(cache).forEach(id => {
    if (!cache[id].requested) {
      cache[id].requested = true;
      ids.push(id);
    }
  });
  return backend.getDatas(ids).then(datas => {
    Object.keys(datas).forEach(id => {
      cache[id].resolve(datas[id]);
    });
  }).catch(error => {
    // note: `datas` is not in scope here; reject the ids from this batch
    ids.forEach(id => {
      cache[id].reject(error);
      delete cache[id]; // so we can retry
    });
  });
}
The caller side looks like this:
// start polling
const interval = startGetBatch();

// in the business logic
getData(5).then(result => console.log('the result of 5 is:', result));
getData(6).then(result => console.log('the result of 6 is:', result));

// sometime later...
getData(5).then(result => {
  // if the promise for an id has resolved, then-ing it still works,
  // resolving again to the -- now cached -- result
  console.log('the result of 5 is:', result);
});

// later, whenever we're done
// (no need for this if you change setInterval to setTimeout)
clearInterval(interval);
I think I've found a solution:
interface PromiseContainer<T> {
  resolve: (value: T) => void;
  reject: (reason?: any) => void;
}

const requests: Map<number, PromiseContainer<Data>> = new Map();
let timeout: number | null = null;

function getData(id: number) {
  const promise = new Promise<Data>((resolve, reject) => requests.set(id, { resolve, reject }))
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas([...requests.keys()]).then(datas => {
        for (let [id, data] of Object.entries(datas)) {
          requests.get(Number(id)).resolve(data)
          requests.delete(Number(id))
        }
      }).catch(e => {
        [...requests.values()].map(container => container.reject(e))
      })
      timeout = null
    }, 100)
  }
  return promise;
}
The key was figuring out I could extract the (resolve, reject) from a promise, store them, then retrieve and call them later.
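As a side note, newer runtimes (ES2024) expose this pattern directly as Promise.withResolvers(), so the first line of getData could be written without a hand-rolled executor:

// ES2024+: same bookkeeping as above, no manual executor
const { promise, resolve, reject } = Promise.withResolvers<Data>();
requests.set(id, { resolve, reject });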

Promise callback queue with priority for some callbacks

My code needs to perform a multitude of async actions simultaneously and handle their promises in a specific sequential way.
What I mean by "specific sequential way": assume you start promise1, promise2, promise3 in that order, and promise3 actually resolves first and promise2 second; what I want is to process promise3, promise2 and promise1 sequentially, in that order.
Let's consider a timeout async function that times out after X seconds, and fetchMyItem1, fetchMyItem2, ..., which return promises that, when fulfilled, should execute different code depending on whether timeout has resolved or not.
For a concrete scenario, imagine a customer waiting at the counter for items to be delivered. Either the customer stays and we serve him directly at the counter, bringing one item at a time, or the customer goes away (timeout) and we have to ask a waiter to come so that when the items arrive, he/she can bring them to him. Note that even while one item is being delivered, the other items should still be undergoing delivery (their promises are pending), and might even arrive (be fulfilled) while the customer is being served one item or the waiter arrives.
Here is some code to start with
const allItemsToBeDeliveredPromises = [fetchMyItem1(), fetchMyItem2(), ...]
const customerLeavesCounterPromise = timeout()
let waiter = undefined
let allPromisesToBeFulfilled = [...allItemsToBeDeliveredPromises, customerLeavesCounterPromise]

// LOOP
const itemDeliveredOrCustomerLeft = await Promise.race(allPromisesToBeFulfilled)
if (hasCustomerLeft(itemDeliveredOrCustomerLeft)) {
  // hasCustomerLeft allows us to detect if the promise that resolved first is `customerLeavesCounterPromise` or not
  waiter = await callWaiter()
} else {
  // An item has arrived
  if (waiter) {
    deliverItemViaWaiter(itemDeliveredOrCustomerLeft)
  } else {
    deliverItemAtCounter(itemDeliveredOrCustomerLeft)
  }
}
// remove itemDeliveredOrCustomerLeft from allPromisesToBeFulfilled
// END LOOP
I am not sure how to implement a loop for this scenario. Promises must be accumulated into a queue as they resolve, but one specific promise in the queue has priority: the timeout promise should be processed as soon as possible once it resolves, though only after the promise fulfillment currently being processed finishes.
My code needs to perform a multitude of async actions simultaneously and handle their promises in a specific sequential way.
You could use streams to consume the promises, as streams are essentially queues that process one message at a time.
The idea (i.e. not tested):
import { Readable, Writable } from 'stream';

let customerHasLeft = false;
/*const customerLeavesCounterPromise = */timeout() // your code...
  .then(() => { customerHasLeft = true; }); // ... made boolean

// let's push all our promises in a readable stream
// (they are supposedly sorted in the array)
const input = new Readable({
  objectMode: true,
  read: function () { // don't use arrow function: we need `this`
    const allItemsToBeDeliveredPromises = [fetchMyItem1(), fetchMyItem2(), ...]; // your code
    // put everything, in the same order, in the output queue
    allItemsToBeDeliveredPromises.forEach(p => this.push(p));
    this.push(null); // terminate the stream after all these
  }
});

// let's declare the logic to process each promise
// (depending on `timeout()` being done)
const consumer = new Writable({
  objectMode: true, // needed to receive promises instead of strings/buffers
  write: async function (promise, uselessEncoding, callback) {
    let order;
    try {
      order = await promise; // wait for the current promise to be completed
    } catch (e) {
      /* delivery error, do something cool like a $5 coupon */
      return callback(e); // or return callback() without e if you don't want to crash the pipe
    }
    if (customerHasLeft) { /* find a waiter (you can still `await`) and deliver `order` */ }
    else { /* deliver `order` at the counter */ }
    callback(); // tell the underlying queue we can process the next promise now
  }
});

// launch the whole pipe
input.pipe(consumer);
// you can add listeners on all events you'd like:
// 'error', 'close', 'data', whatever...
EDIT: actually we want to process promises as they resolve, but sequentially (i.e. a single post-process for all promises)
let customerHasLeft = false;
timeout() // your code...
  .then(() => { customerHasLeft = true; }); // ... made boolean

const allItemsToBeDeliveredPromises = [fetchMyItem1(), fetchMyItem2(), ...];
let postProcessChain = Promise.resolve(); // start with a promise ready to be then-ed

// add a next step to each promise so that as soon as one resolves, it registers
// as a next step on the post-process chain (note the reassignment: without it,
// all post-processes would attach to the same resolved promise and run concurrently)
allItemsToBeDeliveredPromises.forEach(p => p.then(order => {
  postProcessChain = postProcessChain.then(async () => {
    // do something potentially async with the resulting order, like this:
    if (customerHasLeft) { /* find a waiter (you can still `await`) and deliver `order` */ }
    else { /* deliver `order` at the counter */ }
  });
}));
one of them must have the effect that, whenever it fulfils, it should alter the processing of other fulfilled promises that have not yet been processed or are not yet being processed.
Do you run them all together? Something like:
promise1 = asyncFunc1()
promise2 = asyncFunc2()
promise3 = asyncFunc3()
Promise.all([promise1, promise2, promise3]).then(/* something */)
If yes, it can't be done, unless the promise2 and promise3 functions are waiting for an event, a lock, or something else handled by the promise1 function.
If this is the case, it's better to organize the promises like:
asyncFunc1()
.then(() => asyncFunc2(paramIfOk))
.catch(() => asyncFunc2(paramIfFail))
In your example:
const allItemsToBeDelivered = [myPromise1(), myPromise2(), ...]
myPromise1() code should wait for the item to be delivered and check if someone is waiting for it. It's a model/code design problem, not a promise one.
Another way to do it is to consider some event-driven logic: an entity Customer has a listener for the delivered event, which will be fired by the waitItemDelivered() promise just before resolving itself.
EDIT: as requested here a little more elaboration on the event-driven solution.
Short answer: it highly depend on your software design.
Is it already released and running in production? Be careful with changes like this. If it's a service you are still developing, you're still in time to consider some logic changes. A solution that doesn't radically change how it works but uses events is not that great; mixing patterns never pays off in the long term.
Example:
const events = require('events');

class Customer {
  constructor(shop) {
    this.emitter = new events.EventEmitter()
    this.shopEventListener = shop.emitter
    this.setupEventListening() // for keeping things clean in the constructor
    // other properties here
  }

  setupEventListening() {
    this.shopEventListener.on('product-delivered', (eventData) => {
      // some logic
    })
  }

  buyProduct() {
    // some logic
    this.emitter.emit('waiting-delivery', eventData)
  }
}

class Shop {
  constructor() {
    this.emitter = new events.EventEmitter()
    // other properties here
  }

  addCustomer(customer) {
    customer.emitter.on('waiting-delivery', (eventData) => {
      // some logic
      this.waitDelivery().then(() => this.emitter.emit('product-delivered'))
    })
  }

  waitDelivery() {
    return new Promise((resolve, reject) => {
      // some logic
      resolve()
    })
  }
}

// setup
const shop = new Shop()
const customer = new Customer(shop)
shop.addCustomer(customer)
This is a new way of seeing the logic, but a similar approach can be used inside a promise:
const waitDelivery = () => new Promise((resolve, reject) => {
  logisticWaitDelivery().then(() => {
    someEmitter.emit('item-delivered')
    resolve()
  })
})

const customerPromise = () => new Promise((resolve, reject) => {
  someListener.on('item-delivered', () => {
    resolve()
  })
})

Promise.all([waitDelivery(), customerPromise()])
Make 2 queues, the second starting after the first ends:
var highPriority = [ph1, ph2, ...] // promises that have to be executed first
var lowPriority = [pl1, pl2, ...]  // promises that have to wait for the high-priority ones
Promise.all(highPriority).then(() => {
  return Promise.all(lowPriority)
})
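One caveat with this: promises start running the moment they are created, so Promise.all here only controls when you observe the results, not when the work starts. For true prioritization, the arrays should hold promise-returning functions that are invoked only when their turn comes; a sketch of that variant (assuming ph1, pl1, etc. are functions that return promises):

var highPriorityFns = [ph1, ph2]; // functions returning promises
var lowPriorityFns = [pl1, pl2];

var runAll = function (fns) {
  return Promise.all(fns.map(function (fn) { return fn(); }));
};

// low-priority work is not even started until the high-priority batch settles
runAll(highPriorityFns).then(function () {
  return runAll(lowPriorityFns);
});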

JavaScript Promises with AJAX

I am trying to write a series of AJAX requests into a dictionary.
I am attempting to use promises for this; however, either I am writing the promise syntax incorrectly, or what I think may be happening is that the function actually completes (the for loop is done and the AJAX requests are sent) while the AJAX responses have still not returned. Therefore this still returns an empty dictionary.
let dict = {};
let activeMachines = ["41", "42", "43"];

let dataPromise = new Promise(function (resolve, reject) {
  for (let i = 0; i < activeMachines.length; i++) {
    let machineID = activeMachines[i];
    let getAPIData = new XMLHttpRequest();
    let url = 'http://127.0.0.1:8000/processes/apidata/' + machineID + '/';
    getAPIData.open('GET', url);
    getAPIData.send();
    getAPIData.onload = function () {
      let APIData = JSON.parse(getAPIData.responseText);
      dict['machine_' + machineID] = APIData[0].author_id;
      dict['temp' + machineID] = APIData[0].tempData; // get value
      dict['humid' + machineID] = APIData[0].humidData;
      timeValue = String(APIData[0].dateTime);
      dict['time' + machineID] = new Date(timeValue);
      console.log("done");
    };
  }
  resolve();
});

dataPromise.then(function () { console.log(dict); });
Is there a way to "sense" when all of the XMLHTTPRequests have returned?
@Rafael's answer will work, but it doesn't illuminate much about what's going wrong, since you're trying to grok the concept of Promises and write one yourself.
Fundamentally I think your approach has two missteps: 1. creating a single Promise that handles calls to all of your arbitrary list of "activeMachines", and 2. putting your resolve() call in the wrong place.
Usually a Promise looks like this:
const myPromise = new Promise(function(resolve, reject) {
  doSomeAsyncWork(function(result) {
    // Some kind of async call with a callback function or somesuch...
    resolve(result);
  });
}).then(data => {
  // Do something with the final result
  console.log(data);
});
You can simulate some kind of arbitrary asynchronous work with setTimeout():
const myPromise = new Promise(function(resolve, reject) {
  // Resolve with "Done!" after 5 seconds
  setTimeout(() => {
    resolve("Done!");
  }, 5000);
}).then(data => {
  console.log(data); // "Done!"
});
However your original code puts the resolve() call in a weird place, and doesn't even pass it any data. It looks sorta equivalent to this:
const myPromise = new Promise(function(resolve, reject) {
  // Work happens after 5 seconds, but nothing resolves from within it...
  setTimeout(() => {
    // Doing some work here instead of resolving...
  }, 5000);
  resolve(); // ...so this resolves immediately, with no data
}).then(data => {
  console.log(data); // This would be "undefined"
});
Where you're doing a console.log("done"); in your original code is actually where you should be doing a resolve(someData);!
You're also trying to do side effect work inside of your Promise's async function stuff, which is really weird and contrary to how a Promise is supposed to work. The promise is supposed to go off and do its async work, and then resolve with the resulting data -- literally with the .then() chain.
Also, instead of doing multiple asynchronous calls inside of your Promise, you should generalize it so it is reusable and encapsulates only a single network request. That way you can fire off multiple asynchronous Promises, wait for them all to resolve, and then do something.
const activeMachines = ["41", "42", "43"];

// Make a reusable function that returns a single Promise
function fetchAPI(num) {
  return new Promise(function(resolve, reject) {
    const getAPIData = new XMLHttpRequest();
    const url = "http://127.0.0.1:8000/processes/apidata/" + num + "/";
    getAPIData.open("GET", url);
    getAPIData.send();
    getAPIData.onload = function() {
      const APIData = JSON.parse(getAPIData.responseText);
      const resolveData = {};
      resolveData["machine_" + num] = APIData[0].author_id;
      resolveData["temp" + num] = APIData[0].tempData; // get value
      resolveData["humid" + num] = APIData[0].humidData;
      const timeValue = String(APIData[0].dateTime);
      resolveData["time" + num] = new Date(timeValue);
      resolve(resolveData);
    };
  });
}

// Promise.all() will resolve once all Promises in its array have also resolved
Promise.all(
  activeMachines.map(ea => {
    return fetchAPI(ea);
  })
).then(data => {
  // All of your network Promises have completed!
  // The value of "data" here will be an array of all your network results
});
The fetch() API is great and you should learn to use that also -- but only once you understand the theory and practice behind how Promises actually operate. :)
Here's an example of the Fetch API which uses Promises by default:
let m_ids = [1, 2, 3, 4];
let forks = m_ids.map(m => fetch(`http://127.0.0.1:8000/processes/apidata/${m}`));
let joined = Promise.all(forks);
joined
  .then(files => console.log('all done', files))
  .catch(error => console.error(error));
I hope this helps!

Promise not returning properly

I have the following function. It calls itself repeatedly and iterates through FTP servers checking for new files. I'm trying to make it a promise so that I can call operate().then(function(newFilesObject){...}), but I can't get the .then on operate to activate. It does attempt to resolve but doesn't send the result through. By the way, newFiles is a global variable that gets the files per server appended to it. If more code is wanted I can post it or put it on GitHub.
function operate() {
  return new Promise(function (resolve, reject) {
    if (servers[i]) {
      if (i !== 0) ftp = new JSFtp(servers[i].server)
      local = servers[i].local
      remote = servers[i].remote
      localFiles = fs.readdirSync(local)
    } else {
      console.log('trying to resolve')
      console.log(newFiles)
      resolve(newFiles)
    }
    gatherFiles(remote).then(function (files) {
      if (files.length > 0) {
        downloadNew(files).then(function () {
          console.log('Done: ' + servers[i].server.host)
          i++
          operate()
        })
      } else {
        console.log('No updates: ' + servers[i].server.host)
        i++
        operate()
      }
    })
  })
}

operate().then(function (files) {
  console.log('files: ' + files)
})
The promises in the code sample do not return as their resolvers or rejectors are not always invoked. In fact, resolve is only invoked when i === 0. According to the Promises/A+ specification, promises may only be transitioned to a fulfilled state by invoking resolve. It also can only be transitioned to a rejected state by invoking reject or throwing an exception from within the executor. Therefore, reaching the end of the executor without invoking either or passing one as a callback ensures the promise remains in pending state indefinitely.
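A minimal illustration of that rule: if the executor can finish without calling either function, the promise stays pending and its then() callbacks never fire.

const shouldResolve = false; // stand-in for a branch that is never taken
const p = new Promise(function (resolve, reject) {
  if (shouldResolve) resolve('done');
  // falling out of the executor without resolving or rejecting
  // leaves p pending forever
});
p.then(function () { console.log('never reached'); });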
The goal you seek may be achieved with a little refactoring. Consider the following as your goal:
Sequentially, for each FTP server:
Read a given directory for a list of files
Compare the list of files to one stored locally to determine the new ones
If there are new ones, download them sequentially
Return a list of all newly downloaded files
Data
var knownFTPServers = [{
  'localDirectory': 'sub/',
  'localFilepaths': ['docA.json', 'docB.json'],
  'remoteDirectory': 'remsub/',
  'remoteFilepaths': [],
  'jsftpHandle': undefined,
  'host': 'example.com'
},
{
  'localDirectory': 'root/',
  'localFilepaths': ['file1.txt', 'file2.txt'],
  'remoteDirectory': 'remroot/',
  'remoteFilepaths': [],
  'jsftpHandle': undefined,
  'host': 'geocities.com'
}];
Logic
function pullNewFilesFromFTPServer(ftpServer) {
  return new Promise(function (resolve, reject) {
    var handle = new JSFtp(ftpServer);
    ftpServer.jsftpHandle = handle;

    // Returns a promise for reading a directory from a JSFtp server
    // resolves with file list
    // rejects with FTP error
    function readdir(directory) {
      return new Promise(function (resolve, reject) {
        handle.ls(directory, function (err, res) {
          if (err) return reject(err);
          resolve(res);
        });
      });
    }

    // Returns a promise for downloading a file from a remote JSFtp server
    // resolves with the filepath of the downloaded file
    // rejects with FTP error
    function downloadFile(path) {
      return new Promise(function (resolve, reject) {
        handle.get(path, path, function (err) {
          if (err) return reject(err);
          resolve(path);
        });
      });
    }

    // get all remote filepaths on the server
    readdir(ftpServer.remoteDirectory)
      // filter out filepaths already present locally
      .then(function (remoteFilepaths) {
        return remoteFilepaths.filter(function (path) {
          return ftpServer.localFilepaths.indexOf(path) < 0;
        });
      })
      // download new filepaths sequentially
      // reduce turns the array of new filepaths into a promise chain
      // return new filepaths after completing the promise chain
      .then(function (newFilepaths) {
        return newFilepaths.reduce(function (previousDownloadPromise, newPath) {
          return previousDownloadPromise.then(function () {
            return downloadFile(newPath);
          });
        }, Promise.resolve())
          .then(function () { return newFilepaths; });
      })
      // resolve server promise with new filepaths or reject with errors
      .then(resolve, reject);
  });
}

var allFilesDownloaded = [];
knownFTPServers.reduce(function (previousServerPromise, server) {
  return previousServerPromise.then(function (filesDownloaded) {
    allFilesDownloaded = allFilesDownloaded.concat(filesDownloaded);
    return pullNewFilesFromFTPServer(server);
  });
}, Promise.resolve([]))
  .then(function (filesDownloaded) {
    // don't forget the last server's files
    allFilesDownloaded = allFilesDownloaded.concat(filesDownloaded);
    console.log(allFilesDownloaded);
  }, function (err) {
    console.error(err);
  });
Though it may seem a little more complicated in some places, the actions of each function are more modular. The somewhat unintuitive idea is using Array.prototype.reduce to turn an array of data into a chain of promises executed sequentially.
Since creating a promise to download a file starts the download immediately, one can't create all the promises at once if one intends to download the files one at a time. Otherwise, the sequence might look somewhat simpler.
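In isolation, the reduce-to-a-chain trick looks like this (a generic sketch with placeholder task functions that return promises):

// Run promise-returning functions one at a time, collecting results in order.
function sequence(taskFns) {
  return taskFns.reduce(function (chain, fn) {
    return chain.then(function (results) {
      return fn().then(function (r) { return results.concat(r); });
    });
  }, Promise.resolve([]));
}

var delay = function (ms, v) {
  return function () {
    return new Promise(function (resolve) { setTimeout(resolve, ms, v); });
  };
};

sequence([delay(100, 'a'), delay(100, 'b'), delay(100, 'c')])
  .then(function (results) { console.log(results); }); // ['a', 'b', 'c'] after ~300ms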

Promise - is it possible to force cancel a promise

I use ES6 Promises to manage all of my network data retrieval and there are some situations where I need to force cancel them.
Basically the scenario is such that I have a type-ahead search on the UI where the request is delegated to the backend, which has to carry out the search based on the partial input. While this network request (#1) may take a little bit of time, the user continues to type, which eventually triggers another backend call (#2).
Here #2 naturally takes precedence over #1, so I would like to cancel the Promise wrapping request #1. I already have a cache of all Promises in the data layer, so I can theoretically retrieve it as I am attempting to submit a Promise for #2.
But how do I cancel Promise #1 once I retrieve it from the cache?
Could anyone suggest an approach?
In modern JavaScript - no
Promises have settled (hah) and it appears like it will never be possible to cancel a (pending) promise.
Instead, there is a cross-platform (Node, Browsers etc) cancellation primitive as part of WHATWG (a standards body that also builds HTML) called AbortController. You can use it to cancel functions that return promises rather than promises themselves:
// Take a signal parameter in the function that needs cancellation
async function somethingIWantToCancel({ signal } = {}) {
  // either pass it directly to APIs that support it
  // (fetch and most Node APIs do)
  const response = await fetch('.../', { signal });
  // return response.json();
  // or if the API does not already support it -
  // manually adapt your code to support signals:
  const onAbort = (e) => {
    // run any code relating to aborting here
  };
  signal.addEventListener('abort', onAbort, { once: true });
  // and be sure to clean it up when the action you are performing
  // is finished to avoid a leak
  // ... sometime later ...
  signal.removeEventListener('abort', onAbort);
}

// Usage
const ac = new AbortController();
setTimeout(() => ac.abort(), 1000); // give it a 1s timeout
try {
  await somethingIWantToCancel({ signal: ac.signal });
} catch (e) {
  if (e.name === 'AbortError') {
    // deal with cancellation in caller, or ignore
  } else {
    throw e; // don't swallow errors :)
  }
}
No. We can't do that yet.
ES6 promises do not support cancellation yet. It's on its way, and its design is something a lot of people worked really hard on. Sound cancellation semantics are hard to get right and this is work in progress. There are interesting debates on the "fetch" repo, on esdiscuss and on several other repos on GH, but I'd just be patient if I were you.
But, but, but... cancellation is really important!
It is. The reality of the matter is that cancellation is really an important scenario in client-side programming. The cases you describe, like aborting web requests, are important and they're everywhere.
So... the language screwed me!
Yeah, sorry about that. Promises had to get in first before further things were specified; they went in without some useful stuff like .finally and .cancel. It's on its way to the spec, though, through the DOM. Cancellation is not an afterthought, just a time constraint and a more iterative approach to API design.
So what can I do?
You have several alternatives:
Use a third-party library like bluebird, which can move a lot faster than the spec and thus has cancellation as well as a bunch of other goodies; this is what large companies like WhatsApp do.
Pass a cancellation token.
Using a third party library is pretty obvious. As for a token, you can make your method take a function in and then call it, as such:
function getWithCancel(url, token) { // the token is for cancellation
  var xhr = new XMLHttpRequest;
  xhr.open("GET", url);
  return new Promise(function (resolve, reject) {
    xhr.onload = function () { resolve(xhr.responseText); };
    token.cancel = function () { // SPECIFY CANCELLATION
      xhr.abort(); // abort request
      reject(new Error("Cancelled")); // reject the promise
    };
    xhr.onerror = reject;
    xhr.send(); // actually fire the request
  });
}
Which would let you do:
var token = {};
var promise = getWithCancel("/someUrl", token);
// later we want to abort the promise:
token.cancel();
Your actual use case - last
This isn't too hard with the token approach:
function last(fn) {
  var lastToken = { cancel: function () {} }; // start with a no-op
  return function () {
    lastToken.cancel();
    var args = Array.prototype.slice.call(arguments);
    args.push(lastToken);
    return fn.apply(this, args);
  };
}
Which would let you do:
var synced = last(getWithCancel);
synced("/url1?q=a"); // this will get canceled
synced("/url1?q=ab"); // this will get canceled too
synced("/url1?q=abc"); // this will get canceled too
synced("/url1?q=abcd").then(function() {
// only this will run
});
And no, libraries like Bacon and Rx don't "shine" here because they're observable libraries; they just have the same advantage user-level promise libraries have by not being spec-bound. I guess we'll wait and see in ES2016 whether observables go native. They are nifty for typeahead though.
With AbortController
It is possible to use an AbortController to reject or resolve a promise on demand:
let controller = new AbortController();

let task = new Promise((resolve, reject) => {
  // some logic ...
  const abortListener = ({ target }) => {
    controller.signal.removeEventListener('abort', abortListener);
    reject(target.reason);
  };
  controller.signal.addEventListener('abort', abortListener);
});

controller.abort('cancelled reason'); // task is now in rejected state
Also, it's better to remove the event listener on abort to prevent memory leaks.
You can later check whether the error was thrown by an abort via the controller.signal.aborted boolean property:
const res = task.catch((err) => (
controller.signal.aborted
? { value: err }
: { value: 'fallback' }
));
If you just check whether the task was aborted and return, the promise stays in pending status forever. In that case .catch will also never fire with any error, if that's your intention:
controller.abort();
new Promise((resolve, reject) => {
  if (controller.signal.aborted) return;
});
Same works for cancelling fetch:
let controller = new AbortController();
fetch(url, {
signal: controller.signal
});
or just pass controller:
let controller = new AbortController();
fetch(url, controller);
And call the abort method to cancel one, or any number of, fetches to which you passed this controller:
controller.abort();
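For example, several requests can share one controller's signal, and a single abort() cancels whatever is still in flight (a sketch; the URLs are placeholders):

const controller = new AbortController();
const urls = ['/api/a', '/api/b', '/api/c']; // placeholder URLs
const requests = urls.map(url => fetch(url, { signal: controller.signal }));

controller.abort(); // every request still in flight now rejects with an AbortError

Promise.allSettled(requests).then(results =>
  results.forEach(r => console.log(r.status))); // 'rejected' for the aborted ones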
Standard proposals for cancellable promises have failed.
A promise is not a control surface for the async action fulfilling it; that confuses owner with consumer. Instead, create asynchronous functions that can be cancelled through a passed-in token.
Another promise makes a fine token, making cancel easy to implement with Promise.race:
Example: Use Promise.race to cancel the effect of a previous chain:
let cancel = () => {};

input.oninput = function(ev) {
  let term = ev.target.value;
  console.log(`searching for "${term}"`);
  cancel();
  let p = new Promise(resolve => cancel = resolve);
  Promise.race([p, getSearchResults(term)]).then(results => {
    if (results) {
      console.log(`results for "${term}"`, results);
    }
  });
};

function getSearchResults(term) {
  return new Promise(resolve => {
    let timeout = 100 + Math.floor(Math.random() * 1900);
    setTimeout(() => resolve([term.toLowerCase(), term.toUpperCase()]), timeout);
  });
}
Search: <input id="input">
Here we're "cancelling" previous searches by injecting an undefined result and testing for it, but we could easily imagine rejecting with "CancelledError" instead.
Of course this doesn't actually cancel the network search, but that's a limitation of fetch. If fetch were to take a cancel promise as argument, then it could cancel the network activity.
I've proposed this "Cancel promise pattern" on es-discuss, exactly to suggest that fetch do this.
I have checked out Mozilla JS reference and found this:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/race
Let's check it out:
var p1 = new Promise(function(resolve, reject) {
setTimeout(resolve, 500, "one");
});
var p2 = new Promise(function(resolve, reject) {
setTimeout(resolve, 100, "two");
});
Promise.race([p1, p2]).then(function(value) {
console.log(value); // "two"
// Both resolve, but p2 is faster
});
We have here p1 and p2 put into Promise.race(...) as arguments; this actually creates a new promise that settles as soon as the faster of the two settles, which is what you require.
For Node.js and Electron, I'd highly recommend using Promise Extensions for JavaScript (Prex). Its author Ron Buckton is one of the key TypeScript engineers and also the guy behind the current TC39 ECMAScript Cancellation proposal. The library is well documented and chances are some of Prex will make it into the standard.
On a personal note and coming from C# background, I like very much the fact that Prex is modelled upon the existing Cancellation in Managed Threads framework, i.e. based on the approach taken with CancellationTokenSource/CancellationToken .NET APIs. In my experience, those have been very handy to implement robust cancellation logic in managed apps.
I also verified it to work within a browser by bundling Prex using Browserify.
Here is an example of a delay with cancellation (Gist and RunKit, using Prex for its CancellationToken and Deferred):
// by @noseratio
// https://gist.github.com/noseratio/141a2df292b108ec4c147db4530379d2
// https://runkit.com/noseratio/cancellablepromise

const prex = require('prex');

/**
 * A cancellable promise.
 * @extends Promise
 */
class CancellablePromise extends Promise {
  static get [Symbol.species]() {
    // tinyurl.com/promise-constructor
    return Promise;
  }

  constructor(executor, token) {
    const withCancellation = async () => {
      // create a new linked token source
      const linkedSource = new prex.CancellationTokenSource(token ? [token] : []);
      try {
        const linkedToken = linkedSource.token;
        const deferred = new prex.Deferred();

        linkedToken.register(() => deferred.reject(new prex.CancelError()));

        executor({
          resolve: value => deferred.resolve(value),
          reject: error => deferred.reject(error),
          token: linkedToken
        });

        return await deferred.promise;
      }
      finally {
        // this will also free all linkedToken registrations,
        // so the executor doesn't have to worry about it
        linkedSource.close();
      }
    };

    super((resolve, reject) => withCancellation().then(resolve, reject));
  }
}

/**
 * A cancellable delay.
 * @extends Promise
 */
class Delay extends CancellablePromise {
  static get [Symbol.species]() { return Promise; }

  constructor(delayMs, token) {
    super(r => {
      const id = setTimeout(r.resolve, delayMs);
      r.token.register(() => clearTimeout(id));
    }, token);
  }
}

// main
async function main() {
  const tokenSource = new prex.CancellationTokenSource();
  const token = tokenSource.token;
  setTimeout(() => tokenSource.cancel(), 2000); // cancel after 2000ms

  let delay = 1000;
  console.log(`delaying by ${delay}ms`);
  await new Delay(delay, token);
  console.log("successfully delayed."); // we should reach here

  delay = 2000;
  console.log(`delaying by ${delay}ms`);
  await new Delay(delay, token);
  console.log("successfully delayed."); // we should not reach here
}

main().catch(error => console.error(`Error caught, ${error}`));
Note that cancellation is a race. I.e., a promise may have been resolved successfully, but by the time you observe it (with await or then), the cancellation may have been triggered as well. It's up to you how you handle this race, but it doesn't hurt to call token.throwIfCancellationRequested() an extra time, like I do above.
I faced a similar problem recently.
I had a promise-based client (not a network one) and I wanted to always give the latest requested data to the user to keep the UI smooth.
After struggling with the cancellation idea, Promise.race(...) and Promise.all(...), I just started remembering my last request id, and when a promise was fulfilled I would only render my data when it matched the id of the last request.
Hope it helps someone.
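A sketch of that last-request-id idea (client.query and render are placeholders for the promise-based client and the UI update):

let latestRequestId = 0;

async function search(term) {
  const requestId = ++latestRequestId; // remember which call this is
  const data = await client.query(term); // any promise-based client
  if (requestId !== latestRequestId) return; // a newer request superseded this one
  render(data); // only the latest request's data reaches the UI
}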
See https://www.npmjs.com/package/promise-abortable
$ npm install promise-abortable
You can make the promise reject before finishing:
// Our function to cancel promises receives a promise and returns the same one plus a cancel function
const cancellablePromise = (promiseToCancel) => {
  let cancel
  const promise = new Promise((resolve, reject) => {
    cancel = reject
    promiseToCancel
      .then(resolve)
      .catch(reject)
  })
  return { promise, cancel }
}

// A simple promise to execute a function with a delay
const waitAndExecute = (time, functionToExecute) => new Promise((resolve, reject) => {
  const timeInMs = time * 1000
  setTimeout(() => {
    console.log(`Waited ${time} secs`)
    resolve(functionToExecute())
  }, timeInMs)
})
// The promise that we will cancel
const fetchURL = () => fetch('https://pokeapi.co/api/v2/pokemon/ditto/')

// Create a promise that resolves in 1 second. (We will cancel it in 0.5 secs)
const { promise, cancel } = cancellablePromise(waitAndExecute(1, fetchURL))

promise
  .then((res) => {
    console.log('then', res) // This will be executed in 1 second
  })
  .catch(() => {
    console.log('catch') // We will force the promise to reject in 0.5 seconds
  })

waitAndExecute(0.5, cancel) // Cancel the previous promise in 0.5 seconds, so it will be rejected before finishing. Commenting this line will let the promise resolve
Unfortunately the fetch call has already been made, so you will see the call resolving in the Network tab. Your code will just ignore it.
Using the Promise subclass provided by the external c-promise2 package, this can be done as follows (live demo):
import CPromise from "c-promise2";

function fetchWithTimeout(url, { timeout, ...fetchOptions } = {}) {
  return new CPromise((resolve, reject, { signal }) => {
    fetch(url, { ...fetchOptions, signal }).then(resolve, reject)
  }, timeout)
}

const chain = fetchWithTimeout('http://localhost/')
  .then(response => response.json())
  .then(console.log, console.warn);

// chain.cancel(); call this to abort the promise and related request
Using AbortController
I've been researching this for a few days and I still feel that rejecting the promise inside an abort event handler is only part of the approach.
The thing is that, as you may know, rejecting a promise only makes the code awaiting it resume execution; if any code runs after the rejection or resolution of the promise, or outside of its execution scope (e.g. inside an event listener or an async call), it will keep running, wasting cycles and maybe even memory on something that isn't really needed anymore.
Lacking approach
When executing the snippet below, after 2 seconds the console will contain the output derived from the promise rejection, plus all the output derived from the pending work. The promise will be rejected and the code awaiting it can continue, but the work itself will not stop, which in my opinion is the main point of this exercise.
let abortController = new AbortController();

new Promise( ( resolve, reject ) => {
  if ( abortController.signal.aborted ) return;

  let abortHandler = () => {
    reject( 'Aborted' );
  };
  abortController.signal.addEventListener( 'abort', abortHandler );

  setTimeout( () => {
    console.log( 'Work' );
    console.log( 'More work' );
    resolve( 'Work result' );
    abortController.signal.removeEventListener( 'abort', abortHandler );
  }, 2000 );
} )
.then( result => console.log( 'then:', result ) )
.catch( reason => console.error( 'catch:', reason ) );

setTimeout( () => abortController.abort(), 1000 );
Which leads me to think that, after defining the abort event handler, there must be calls to
if ( abortController.signal.aborted ) return;
at sensible points in the code performing the work, so that the work stops gracefully when necessary (adding more statements before the return in the if block above).
Proposal
This approach reminds me a little of the cancellable token proposal from a few years back, but it will in fact prevent work from being performed in vain. The console output should now be only the abort error and nothing more, and even when the work is in progress and then cancelled in the middle, it can stop at a sensible step of the processing, like at the beginning of a loop's body.
let abortController = new AbortController();

new Promise( ( resolve, reject ) => {
  if ( abortController.signal.aborted ) return;

  let abortHandler = () => {
    reject( 'Aborted' );
  };
  abortController.signal.addEventListener( 'abort', abortHandler );

  setTimeout( () => {
    if ( abortController.signal.aborted ) return;
    console.log( 'Work' );
    if ( abortController.signal.aborted ) return;
    console.log( 'More work' );
    resolve( 'Work result' );
    abortController.signal.removeEventListener( 'abort', abortHandler );
  }, 2000 );
} )
.then( result => console.log( 'then:', result ) )
.catch( reason => console.error( 'catch:', reason ) );

setTimeout( () => abortController.abort(), 1000 );
I found the solutions posted here a little hard to read, so I created a helper function that is, in my opinion, easier to use.
The helper function gives access to the information whether the current call is already obsolete or not. With this information, the function itself has to take care of things accordingly (usually by simply returning).
// TypeScript
export function obsoletableFn<Res, Args extends unknown[]>(
  fn: (isObsolete: () => boolean, ...args: Args) => Promise<Res>,
): (...args: Args) => Promise<Res> {
  let lastCaller = null;
  return (...args: Args) => {
    const me = Symbol();
    lastCaller = me;
    const isObsolete = () => lastCaller !== me;
    return fn(isObsolete, ...args);
  };
}

// plain JavaScript helper function
function obsoletableFn(fn) {
  let lastCaller = null;
  return (...args) => {
    const me = Symbol();
    lastCaller = me;
    const isObsolete = () => lastCaller !== me;
    return fn(isObsolete, ...args);
  };
}
const simulateRequest = () => new Promise(resolve => setTimeout(resolve, Math.random() * 2000 + 1000));

// usage
const myFireAndForgetFn = obsoletableFn(async (isObsolete, x) => {
  console.log(x, 'starting');
  await simulateRequest();
  if (isObsolete()) {
    console.log(x, 'is obsolete');
    // return, as there is already a more recent call running
    return;
  }
  console.log(x, 'is not obsolete');
  document.querySelector('div').innerHTML = `Response ${x}`;
});

myFireAndForgetFn('A');
myFireAndForgetFn('B');
<div>Waiting for response...</div>
So I had an async function that I needed to cancel on user input, but it's a long-running one that involves mouse control.
I used p-queue, added each step of my function to it, and have an observable that I feed the cancellation signal. Anything the queue has started processing will run no matter what, but you should be able to cancel anything after that by clearing the queue. The shorter the tasks you add to the queue, the sooner you can quit after getting the cancel signal. You can be lazy and throw whole chunks of code into the queue instead of the one-liners I have in the example.
Note on versions: p-queue version 6 works with CommonJS; version 7+ switches to ESM and could break your app (it broke my electron/typescript/webpack one).
// assumed imports: const { default: PQueue } = require('p-queue'); // v6, CommonJS
//                  const { take } = require('rxjs/operators');
const cancellable_function = async () => {
  const queue = new PQueue({ concurrency: 1 });
  queue.pause();

  queue.addAll([
    async () => await move_mouse({...}),
    async () => await mouse_click({...}),
  ]);

  for await (const item of items) {
    queue.addAll([
      async () => await do_something({...}),
      async () => await do_something_else({...}),
    ]);
  }

  const { information } = await get_information();
  queue.addAll([
    async () => await move_mouse({...}),
    async () => await mouse_click({...}),
  ]);

  cancel_signal$.pipe(take(1)).subscribe(() => {
    queue.clear();
  });

  queue.start();
  await queue.onEmpty();
};
Because @jib rejected my edit, I'm posting my answer here. It's just a modification of @jib's answer, with some comments and more understandable variable names.
Below I show examples of two different methods: one uses resolve(), the other uses reject().
let cancelCallback = () => {};

input.oninput = function(ev) {
  let term = ev.target.value;
  console.log(`searching for "${term}"`);
  cancelCallback(); // cancel the previous promise by calling cancelCallback()

  let setCancelCallbackPromise = () => {
    return new Promise((resolve, reject) => {
      // set cancelCallback when running this promise
      cancelCallback = () => {
        // pass cancel messages by resolve()
        return resolve('Canceled');
      };
    });
  };

  Promise.race([setCancelCallbackPromise(), getSearchResults(term)]).then(results => {
    // check if the call to resolve() came from cancelCallback() or getSearchResults()
    if (results == 'Canceled') {
      console.log("error(by resolve): ", results);
    } else {
      console.log(`results for "${term}"`, results);
    }
  });
};

input2.oninput = function(ev) {
  let term = ev.target.value;
  console.log(`searching for "${term}"`);
  cancelCallback(); // cancel the previous promise by calling cancelCallback()

  let setCancelCallbackPromise = () => {
    return new Promise((resolve, reject) => {
      // set cancelCallback when running this promise
      cancelCallback = () => {
        // pass cancel messages by reject()
        return reject('Canceled');
      };
    });
  };

  Promise.race([setCancelCallbackPromise(), getSearchResults(term)]).then(results => {
    if (results !== 'Canceled') {
      console.log(`results for "${term}"`, results);
    }
  }).catch(error => {
    console.log("error(by reject): ", error);
  });
};

function getSearchResults(term) {
  return new Promise(resolve => {
    let timeout = 100 + Math.floor(Math.random() * 1900);
    setTimeout(() => resolve([term.toLowerCase(), term.toUpperCase()]), timeout);
  });
}
Search(use resolve): <input id="input">
<br> Search2(use reject and catch error): <input id="input2">
