Using exponential backoff on chained requests - javascript

I have chained requests to the GA API, meaning the second request uses the result of the first request. I wanted to use an exponentialBackoff() function to retry if there is an error (status 429, for example).
My question is: what would be the best way to check that both requests were successful and only repeat the one that wasn't? In my example, both requests get called again even if only the second one fails.
async function exponentialBackoff(toTry, max, delay) {
  console.log('max', max, 'next delay', delay);
  try {
    let response = await toTry()
    console.log(response);
  } catch (e) {
    if (max > 0) {
      setTimeout(function () {
        exponentialBackoff(toTry, --max, delay * 2);
      }, delay);
    } else {
      console.log('we give up');
    }
  }
}
propertyArr.forEach(async function (e, i, a) {
  await exponentialBackoff(async function () {
    let response = await gapi.client.request({
      path: 'https://analyticsadmin.googleapis.com/v1beta/properties',
      method: 'POST',
      body: e
    })
    return await gapi.client.request({
      path: 'https://analyticsadmin.googleapis.com/v1beta/' + response.result.name + '/dataStreams',
      body: dataStreamArr[i],
      method: 'POST'
    })
  }, 10, 30)
})
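One way to retry only the failed call (a sketch, not from the original thread; it assumes an exponentialBackoff variant that resolves with the response, like the fixed version further down this page) is to give each request its own backoff wrapper and pass the first result into the second:
// Sketch only: assumes exponentialBackoff resolves with the response
// (the awaited variant shown later on this page).
propertyArr.forEach(async function (e, i) {
  // The first request retries independently of the second.
  const propertyResponse = await exponentialBackoff(function () {
    return gapi.client.request({
      path: 'https://analyticsadmin.googleapis.com/v1beta/properties',
      method: 'POST',
      body: e
    });
  }, 10, 30);
  // The second request retries only itself; the property is not created again.
  await exponentialBackoff(function () {
    return gapi.client.request({
      path: 'https://analyticsadmin.googleapis.com/v1beta/' + propertyResponse.result.name + '/dataStreams',
      method: 'POST',
      body: dataStreamArr[i]
    });
  }, 10, 30);
});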

Related

Is it possible to get the browser's network request timeout value programmatically? [duplicate]


Async Function Return recursive call with setTimeout

I am calling a smart contract function to get some status, which is essentially the same as calling an API, and I need to check that status.fulfilled === true before returning it to the front-end. To do this I need to call the API every second so I can return the result as soon as possible; it usually takes 5-20 seconds for it to be fulfilled.
Here is how I tried to do it:
async function getStatus(requestId) {
  try {
    await Moralis.enableWeb3({ provider: 'metamask' });
    const options = {
      contractAddress: coinFlipAddress,
      functionName: 'getStatus',
      abi: coinFlipABI,
      params: { requestId },
    };
    var status = await Moralis.executeFunction(options);
    console.log(status);
    if (status.fulfilled) {
      console.log('fulfilled');
      return status;
    } else {
      setTimeout(async () => {
        return await getStatus(requestId);
      }, 1000);
    }
  } catch (err) {
    console.log(err);
    return { error: err };
  }
}
This keeps calling the getStatus function recursively until status.fulfilled === true, and console.log('fulfilled') logs when it is fulfilled, but the status is never returned to where getStatus was first called.
const handleFlip = async (choice) => {
  setCurrentChoice(null);
  setMetamaskInProgress(true);
  const transaction = await flip(choice, amount);
  setMetamaskInProgress(false);
  setCurrentChoice(choices[choice]);
  setFlipping(true);
  setResult('Spinning');
  const requestId = waitForConfirmation(transaction);
  const result = await getStatus(requestId); // This is the initial call to getStatus()
  console.log('RESULT ' + result);
  if (result) {
    setFlipping(false);
    setSide(result.hasWon ? (choice === '0' ? 'Heads' : 'Tails') : choice === '0' ? 'Tails' : 'Heads');
    setResult(result.hasWon ? 'You have won!' : 'You have lost :(');
  }
};
What am I doing wrong? Also, could these recursive calls create any problems with memory? If so, do you have any suggestions for handling this case differently?
You cannot return a value to the caller from inside a setTimeout callback. You'll need to promisify the delay, await it, and return afterwards:
async function getStatus(requestId) {
  try {
    await Moralis.enableWeb3({ provider: 'metamask' });
    const options = {
      contractAddress: coinFlipAddress,
      functionName: 'getStatus',
      abi: coinFlipABI,
      params: { requestId },
    };
    var status = await Moralis.executeFunction(options);
    console.log(status);
    if (status.fulfilled) {
      console.log('fulfilled');
      return status;
    } else {
      await new Promise(resolve => {
        setTimeout(resolve, 1000);
      });
      return getStatus(requestId);
    }
  } catch (err) {
    console.log(err);
    return { error: err };
  }
}
I would have done something like this:
const MAX_RETRIES = 10; // maximum retries
const REQUEST_DELAY = 1000; // delay between requests (milliseconds)
// JS implementation of the widely known `sleep` function
const sleep = (time) => new Promise(res => setTimeout(res, time, "done."));
/**
 * Retrieve the status
 * @param {string} requestId
 */
const getStatus = async (requestId) => {
  try {
    await Moralis.enableWeb3({ provider: 'metamask' }); // Guessing you need to call this only once
    const options = {
      contractAddress: coinFlipAddress,
      functionName: 'getStatus',
      abi: coinFlipABI,
      params: { requestId },
    };
    let retries = 0;
    while (retries < MAX_RETRIES) {
      let status = await Moralis.executeFunction(options); // check status
      console.log('attempt %d | status %s', retries, status);
      if (status.fulfilled) {
        return status;
      }
      await sleep(REQUEST_DELAY);
      retries++;
    }
    throw new Error('Unable to retrieve status in time');
  } catch (error) {
    console.error('Error while fetching status', error);
    throw error;
  }
}
A few notes here:
I took the constants out of the function for more clarity.
I used a while loop to bound the number of retries.
I used the widely known 'sleep' helper to create a delay between requests, and I throw an error whenever something happens that isn't supposed to (adjust this to your needs).
Finally, I used arrow functions for brevity, but that's up to you ;)
You could try this; change maxRetry and spaceBetweenRetry as you wish:
async function getStatus(requestId) {
  return new Promise(async (resolve, reject) => {
    try {
      let maxRetry = 10; // how many times you want to retry
      let spaceBetweenRetry = 1000; // sleep between retries in ms
      await Moralis.enableWeb3({ provider: 'metamask' }); // Guessing you need to call this only once
      const options = {
        contractAddress: coinFlipAddress,
        functionName: 'getStatus',
        abi: coinFlipABI,
        params: { requestId },
      };
      for (let index = 0; index < maxRetry; index++) {
        var status = await Moralis.executeFunction(options); // check status
        console.log(status);
        if (status.fulfilled) {
          resolve(status); // settle the promise once fulfilled
          return; // stop retrying after resolving
        }
        await new Promise((r) => setTimeout(r, spaceBetweenRetry)); // sleep for spaceBetweenRetry ms
      }
      reject(new Error('Not fulfilled after ' + maxRetry + ' retries')); // otherwise the promise would never settle
    } catch (err) {
      reject(err);
    }
  });
}

async/await doesn't wait before executing next line in JS

I found an exponential backoff function which is supposed to retry an API fetch with exponentially increasing delays in case of an error:
async function exponentialBackoff(toTry, max, delay) {
  console.log('max', max, 'next delay', delay);
  try {
    let response = await toTry()
    return response;
  } catch (e) {
    if (max > 0) {
      setTimeout(function () {
        exponentialBackoff(toTry, --max, delay * 2);
      }, delay);
    } else {
      console.log('maximum amount of tries exceeded', e);
      return e;
    }
  }
}
I then use the function, together with the GA API client library, to make a request. On purpose, I am sending an invalid body to make the request fail:
response = await exponentialBackoff(async function () {
  return await gapi.client.request({
    path: 'https://analyticsadmin.googleapis.com/v1beta/properties',
    method: 'POST',
    body: property
  })
}, 10, 30)
console.log("NEXT LINE");
What I would expect is that the exponential backoff would run through all of its attempts (10) and only after that execute the console log.
What actually happens is that the console log gets executed right after the first try. What am I doing wrong here?
Thank you
Promisify setTimeout and await it. A promisified variant is
function sleep(ms) { return new Promise(res => { setTimeout(res, ms); }); }
Then, in the catch block, await the sleep and return the recursive exponentialBackoff call so its result propagates back to the caller.
The fixed code is:
function sleep(ms) { return new Promise(res => { setTimeout(res, ms); }); }
async function exponentialBackoff(toTry, max, delay) {
  console.log('max', max, 'next delay', delay);
  try {
    let response = await toTry()
    return response;
  } catch (e) {
    if (max > 0) {
      await sleep(delay);
      return await exponentialBackoff(toTry, --max, delay * 2);
    } else {
      console.log('maximum amount of tries exceeded', e);
      return e;
    }
  }
}
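As an illustration only (a sketch reusing the question's call site, not part of the original answer), awaiting the fixed version now holds up the next line until the retries are done:
// Sketch: with the fixed exponentialBackoff, this logs only after the request
// succeeds or all 10 attempts have failed.
let response = await exponentialBackoff(async function () {
  return await gapi.client.request({
    path: 'https://analyticsadmin.googleapis.com/v1beta/properties',
    method: 'POST',
    body: property
  });
}, 10, 30);
console.log("NEXT LINE", response);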

Proper way to use setInterval on multiple HTTP requests with Node

Say you have a list which contains 3-5 URLs. Requests should be made every 5 seconds, one by one. I wrote code like this:
setInterval(() => {
  array.forEach(element => {
    request
      .get(element.url)
      .on('response', function (response) {
        console.log(response);
      });
  });
}, 5000)
But because setInterval keeps enqueueing new rounds faster than the requests can complete, the .on callbacks never get a chance to respond. I think that, before the next round is enqueued, I must make sure the previous .on callback has finished.
How can I do that?
I recommend the Async.js module.
If you want to solve it with an async loop, you can:
var i = 0;
async.whilst(
  function () { return i < array.length; },
  function (next) {
    var element = array[i++];
    request
      .get(element.url)
      .on('response', function (response) {
        console.log(response);
        // Move on to the next iteration once 5s have passed
        setTimeout(next, 5000);
      });
  },
  function (err, n) {
    // All tasks are over
  }
);
Or, if you want to try some concurrency:
// create a queue object with concurrency 2
var q = async.queue(function (task, next) {
  console.log('request ' + task.url);
  request
    .get(task.url)
    .on('response', function (response) {
      console.log(response);
      next();
    });
}, 2);
// assign a callback
q.drain = function () {
  console.log('all tasks have been processed');
};
// Push a request task to the queue every 5s
setInterval(function () {
  let task = array.shift();
  if (task) {
    q.push(task, function (err) {
      console.log('finished processing', task.url);
    });
  }
}, 5000);
Can you please try this?
var requestArray = [
  {
    url: 'some URL number 1'
  },
  {
    url: 'some URL number 2'
  },
  {
    url: 'some URL number 2'
  }
]
function makeRequests() {
  // let (not var) so each response callback sees its own value of i
  for (let i = 0; i < requestArray.length; i++) {
    request.get(requestArray[i].url)
      .on('response', function (response) {
        console.log(response);
        // You can check any condition here if you want to stop executing GET requests
        if (i == requestArray.length - 1) {
          makeRequests();
        }
      })
  }
}
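For comparison, a minimal sketch without Async.js, using plain async/await instead of setInterval (it assumes a promise-returning HTTP call such as the global fetch in Node 18+; the request library from the question could be promisified the same way):
// Sketch only: make the requests strictly one by one, waiting 5s between them.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
async function pollUrls(urls, delayMs = 5000) {
  for (const { url } of urls) {
    const response = await fetch(url); // wait for this request to finish
    console.log(url, response.status);
    await sleep(delayMs); // then wait before the next one
  }
}
pollUrls(array).catch(console.error);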

Fetch API request timeout?

I have a fetch-api POST request:
fetch(url, {
  method: 'POST',
  body: formData,
  credentials: 'include'
})
I want to know what the default timeout for this is, and how we can set it to a particular value like 3 seconds or indefinitely.
Using a Promise.race solution will leave the request hanging, still consuming bandwidth in the background, and it lowers the maximum number of concurrent requests allowed while it's still in flight.
Instead, use the AbortController to actually abort the request. Here is an example:
const controller = new AbortController()
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000)
fetch(url, { signal: controller.signal }).then(response => {
  // completed request before timeout fired
  // If you only wanted to timeout the request, not the response, add:
  // clearTimeout(timeoutId)
})
Alternatively, you can use the newly added AbortSignal.timeout(5000)... All evergreen environments have it now. You will lose control over manually aborting the request, and both the upload and the download will have to finish within the total time of 5s.
// a polyfill for it would be:
AbortSignal.timeout ??= function timeout(ms) {
  const ctrl = new AbortController()
  setTimeout(() => ctrl.abort(), ms)
  return ctrl.signal
}
fetch(url, { signal: AbortSignal.timeout(5000) })
AbortController can be used for other things as well, not only fetch: it also works with readable/writable streams, and more new (especially promise-based) functions will use it more and more. NodeJS has implemented AbortController in its streams and filesystem APIs as well, and I know Web Bluetooth is looking into it too. It can also be used with the addEventListener options object to stop listening when the signal is aborted.
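For instance, a small sketch of the addEventListener signal option mentioned above (not part of the original answer):
// Sketch: an AbortSignal can remove an event listener when it is aborted.
const controller = new AbortController();
window.addEventListener('resize', () => {
  console.log('resized');
}, { signal: controller.signal });
// Aborting later also removes the listener registered above.
controller.abort();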
Update: since my original answer is a bit outdated, I recommend using an abort controller as implemented here: https://stackoverflow.com/a/57888548/1059828, or take a look at this really good post explaining the abort controller with fetch: How do I cancel an HTTP fetch() request?
Outdated original answer:
I really like the clean approach from this gist using Promise.race.
fetchWithTimeout.js
export default function (url, options, timeout = 7000) {
  return Promise.race([
    fetch(url, options),
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('timeout')), timeout)
    )
  ]);
}
main.js
import fetch from './fetchWithTimeout'
// call as usual or with a timeout as the 3rd argument
// throws a timeout error after at most 5 seconds
fetch('http://google.com', options, 5000)
  .then((result) => {
    // handle result
  })
  .catch((e) => {
    // handle errors and the timeout error
  })
Edit 1
As pointed out in comments, the code in the original answer keeps running the timer even after the promise is resolved/rejected.
The code below fixes that issue.
function timeout(ms, promise) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      reject(new Error('TIMEOUT'))
    }, ms)
    promise
      .then(value => {
        clearTimeout(timer)
        resolve(value)
      })
      .catch(reason => {
        clearTimeout(timer)
        reject(reason)
      })
  })
}
Original answer
It doesn't have a specified default; the specification doesn't discuss timeouts at all.
You can implement your own timeout wrapper for promises in general:
// Rough implementation. Untested.
function timeout(ms, promise) {
  return new Promise(function (resolve, reject) {
    setTimeout(function () {
      reject(new Error("timeout"))
    }, ms)
    promise.then(resolve, reject)
  })
}
timeout(1000, fetch('/hello')).then(function (response) {
  // process response
}).catch(function (error) {
  // might be a timeout error
})
As described in https://github.com/github/fetch/issues/175
Comment by https://github.com/mislav
Building on Endless' excellent answer, I created a helpful utility function.
const fetchTimeout = (url, ms, { signal, ...options } = {}) => {
  const controller = new AbortController();
  const promise = fetch(url, { signal: controller.signal, ...options });
  if (signal) signal.addEventListener("abort", () => controller.abort());
  const timeout = setTimeout(() => controller.abort(), ms);
  return promise.finally(() => clearTimeout(timeout));
};
If the timeout is reached before the resource is fetched then the fetch is aborted.
If the resource is fetched before the timeout is reached then the timeout is cleared.
If the input signal is aborted then the fetch is aborted and the timeout is cleared.
const controller = new AbortController();
document.querySelector("button.cancel").addEventListener("click", () => controller.abort());
fetchTimeout("example.json", 5000, { signal: controller.signal })
  .then(response => response.json())
  .then(console.log)
  .catch(error => {
    if (error.name === "AbortError") {
      // fetch aborted either due to timeout or due to user clicking the cancel button
    } else {
      // network error or json parsing error
    }
  });
There's no built-in timeout support in the fetch API, but it can be achieved by wrapping the call in a promise. For example:
function fetchWrapper(url, options, timeout) {
  return new Promise((resolve, reject) => {
    fetch(url, options).then(resolve, reject);
    if (timeout) {
      const e = new Error("Connection timed out");
      setTimeout(reject, timeout, e);
    }
  });
}
If you haven't configured a timeout in your code, it will be the default request timeout of your browser.
1) Firefox - 90 seconds
Type about:config in the Firefox URL field and find the value corresponding to the key network.http.connection-timeout.
2) Chrome - 300 seconds
Source
EDIT: The fetch request will still be running in the background and will most likely log an error in your console.
Indeed, the Promise.race approach is better.
See Promise.race() for reference.
A race runs all the promises at the same time and settles as soon as the first of them settles (resolves or rejects); only that first value (or error) is used, while the other promises keep running in the background.
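A quick sketch of that behaviour (not from the original answers), using two plain timers:
// Sketch: Promise.race settles with whichever promise settles first.
Promise.race([
  new Promise(resolve => setTimeout(resolve, 100, 'fast')),
  new Promise(resolve => setTimeout(resolve, 500, 'slow')),
]).then(winner => console.log(winner)); // logs 'fast'; the 'slow' timer still fires later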
You could also pass a function to call if the fetch times out.
fetchWithTimeout(url, {
  method: 'POST',
  body: formData,
  credentials: 'include',
}, 5000, () => { /* do stuff here */ });
If this piques your interest, a possible implementation would be:
function fetchWithTimeout(url, options, delay, onTimeout) {
  const timer = new Promise((resolve) => {
    setTimeout(resolve, delay, {
      timeout: true,
    });
  });
  return Promise.race([
    fetch(url, options),
    timer
  ]).then(response => {
    if (response.timeout) {
      onTimeout();
    }
    return response;
  });
}
A cleaner way to do it is actually shown on MDN: https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal#aborting_a_fetch_operation_with_a_timeout
try {
  await fetch(url, { signal: AbortSignal.timeout(5000) });
} catch (e) {
  if (e.name === "TimeoutError") {
    console.log('5000 ms timeout');
  }
}
Here's an SSCCE using NodeJS which will time out after 1000ms:
import fetch from 'node-fetch';
const controller = new AbortController();
const timeout = setTimeout(() => {
  controller.abort();
}, 1000); // will time out after 1000ms
fetch('https://www.yourexample.com', {
  signal: controller.signal,
  method: 'POST',
  body: formData,
  credentials: 'include'
})
  .then(response => response.json())
  .then(json => console.log(json))
  .catch(err => {
    if (err.name === 'AbortError') {
      console.log('Timed out');
    }
  })
  .finally(() => {
    clearTimeout(timeout);
  });
Using AbortController and setTimeout:
const abortController = new AbortController();
let timer: number | null = null;
fetch('/get', {
  signal: abortController.signal, // connect the request to the abortController
})
  .then(res => {
    // response success
    console.log(res);
    if (timer) {
      clearTimeout(timer); // clear the timer
    }
  })
  .catch(err => {
    if (err instanceof DOMException && err.name === 'AbortError') {
      // the abort surfaces as a DOMException
      return;
    }
    // other errors
  });
timer = setTimeout(() => {
  abortController.abort();
}, 1000 * 10); // Abort the request after 10s.
This is a fragment of @fatcherjs/middleware-aborter.
Using fatcher, it is easy to abort a fetch request.
import { aborter } from '@fatcherjs/middleware-aborter';
import { fatcher, isAbortError } from 'fatcher';
fatcher({
  url: '/bar/foo',
  middlewares: [
    aborter({
      timeout: 10 * 1000, // 10s
      onAbort: () => {
        console.log('Request is Aborted.');
      },
    }),
  ],
})
  .then(res => {
    // Request succeeded within 10s
    console.log(res);
  })
  .catch(err => {
    if (isAbortError(err)) {
      // Runs when the request was aborted.
      console.error(err);
    }
    // Other errors.
  });
function fetchTimeout(url, options, timeout = 3000) {
  return new Promise((resolve, reject) => {
    fetch(url, options)
      .then(resolve, reject);
    setTimeout(reject, timeout);
  });
}
You can create a timeoutPromise wrapper
function timeoutPromise(timeout, err, promise) {
  return new Promise(function (resolve, reject) {
    promise.then(resolve, reject);
    setTimeout(reject.bind(null, err), timeout);
  });
}
You can then wrap any promise
timeoutPromise(100, new Error('Timed Out!'), fetch(...))
  .then(...)
  .catch(...)
It won't actually cancel an underlying connection but will allow you to timeout a promise.
Reference
Proper error handling tips
Normal practice:
To add timeout support, it is usually suggested to introduce a Promise utility function like this:
function fetchWithTimeout(resource, { signal, timeout, ...options } = {}) {
  const controller = new AbortController();
  if (signal != null) signal.addEventListener("abort", () => controller.abort());
  const id = timeout != null ? setTimeout(() => controller.abort(), timeout) : undefined;
  return fetch(resource, {
    ...options,
    signal: controller.signal
  }).finally(() => {
    if (id != null) clearTimeout(id);
  });
}
Calling controller.abort, or rejecting the promise inside the setTimeout callback, distorts the stack trace.
This is suboptimal, since one would have to add boilerplate error handlers with log messages in the functions calling the fetch method if post-error log analysis is required.
Better practice:
To preserve the error along with its stack trace, one can apply the following technique:
function sleep(ms = 0, signal) {
  return new Promise((resolve, reject) => {
    const id = setTimeout(() => resolve(), ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(id);
      reject();
    });
  });
}
// Wrapper around node-fetch; assumes `import nodeFetch from "node-fetch";`
async function fetchWithTimeout(resource, options) {
  const { timeout, signal, ...ropts } = options ?? {};
  const controller = new AbortController();
  let sleepController;
  try {
    signal?.addEventListener("abort", () => controller.abort());
    const request = nodeFetch(resource, {
      ...ropts,
      signal: controller.signal,
    });
    if (timeout != null) {
      sleepController = new AbortController();
      const aborter = sleep(timeout, sleepController.signal);
      // The race resolves with undefined if the sleep wins, i.e. the timeout hit first.
      const race = await Promise.race([aborter, request]);
      if (race == null) controller.abort();
    }
    return request;
  } finally {
    sleepController?.abort();
  }
}
(async () => {
  try {
    await fetchWithTimeout(new URL(window.location.href), { timeout: 5 });
  } catch (error) {
    console.error("Error in test", error);
  }
})();
Using the c-promise2 lib, a cancellable fetch with a timeout might look like this (live jsfiddle demo):
import CPromise from "c-promise2"; // npm package
function fetchWithTimeout(url, { timeout, ...fetchOptions } = {}) {
  return new CPromise((resolve, reject, { signal }) => {
    fetch(url, { ...fetchOptions, signal }).then(resolve, reject)
  }, timeout)
}
const chain = fetchWithTimeout("https://run.mocky.io/v3/753aa609-65ae-4109-8f83-9cfe365290f0?mocky-delay=10s", { timeout: 5000 })
  .then(request => console.log('done'));
// chain.cancel(); - to abort the request before the timeout
This code is also available as an npm package, cp-fetch.
