Wait X time, then continue with the loop - JavaScript / React

I have a loop that cycles around and creates REST queries, building and running a different one each time.
In the future I imagine the number of REST queries could grow far beyond what the browser can handle at once (over 20,000).
I want to do something along the lines of counting how many loops have been done and, after every 500 or so, pausing for a few seconds to let the browser catch up with the REST responses before continuing.
How is this done in React/JavaScript?
example code:
for (var i = 0; i < array.length; i++) {
  query += i;
  axios.get(query).then(() => { /* ...does stuff here... */ });
  // want something along the lines of: if multiple of 500, wait(1000)
}

The simplest approach is to make a waitIf function that returns a Promise and accepts a condition.
If the condition is true, it waits for the given duration and then executes the callback; otherwise, it executes the callback directly.
A simple implementation would be:
function waitIf(condition, duration, callback) {
  if (condition) {
    // return a Promise that waits `duration` ms using setTimeout
    return new Promise((resolve) => {
      setTimeout(() => {
        resolve(callback());
      }, duration);
    });
  }
  return callback();
}
for (let i = 0; i < array.length; i++) {
  query += i;
  // unify the call here
  waitIf(i % 500 === 0, 1000, () => axios.get(query)).then(/* ...does stuff here... */);
}

Related

Promise not running asynchronously [duplicate]

How can I make a simple, non-blocking JavaScript function call? For example:
//begin the program
console.log('begin');
nonBlockingIncrement(10000000);
console.log('do more stuff');

//define the slow function; this would normally be a server call
function nonBlockingIncrement(n){
  var i = 0;
  while (i < n) {
    i++;
  }
  console.log('0 incremented to ' + i);
}
outputs
"begin"
"0 incremented to 10000000"
"do more stuff"
How can I form this simple loop to execute asynchronously and output the results via a callback function? The idea is to not block "do more stuff":
"begin"
"do more stuff"
"0 incremented to 10000000"
I've tried following tutorials on callbacks and continuations, but they all seem to rely on external libraries or functions. None of them answer the question in a vacuum: how does one write JavaScript code to be non-blocking?
I have searched very hard for this answer before asking; please don't assume I didn't look. Everything I found is Node.js-specific ([1], [2], [3], [4], [5]) or otherwise specific to other functions or libraries ([6], [7], [8], [9], [10], [11]), notably jQuery and setTimeout(). Please help me write non-blocking code using plain JavaScript, not JavaScript-based tools like jQuery and Node. Kindly reread the question before marking it as a duplicate.
To make your loop non-blocking, you must break it into sections and allow the JS event processing loop to consume user events before carrying on to the next section.
The easiest way to achieve this is to do a certain amount of work, and then use setTimeout(..., 0) to queue the next chunk of work. Crucially, that queueing allows the JS event loop to process any events that have been queued in the meantime before going on to the next piece of work:
function yieldingLoop(count, chunksize, callback, finished) {
  var i = 0;
  (function chunk() {
    var end = Math.min(i + chunksize, count);
    for ( ; i < end; ++i) {
      callback.call(null, i);
    }
    if (i < count) {
      setTimeout(chunk, 0);
    } else {
      finished.call(null);
    }
  })();
}
with usage:
yieldingLoop(1000000, 1000, function(i) {
  // use i here
}, function() {
  // loop done here
});
See http://jsfiddle.net/alnitak/x3bwjjo6/ for a demo where the callback function just sets a variable to the current iteration count, and a separate setTimeout based loop polls the current value of that variable and updates the page with its value.
setTimeout with callbacks is the way to go. Understand, though, that your function scopes are not the same as in C# or another multi-threaded environment.
JavaScript does not wait for your function's callback to finish.
If you say:
function doThisThing(theseArgs) {
  setTimeout(function (theseArgs) { doThatOtherThing(theseArgs); }, 1000);
  alert('hello world');
}
Your alert will fire before the function you passed does.
The difference is that alert blocked the thread, but your callback did not.
There are in general two ways to do this as far as I know. One is to use setTimeout (or requestAnimationFrame if you are doing this in a supporting environment); @Alnitak showed how to do this in another answer. Another way is to use a web worker to run your blocking logic in a separate thread, so that the main UI thread is not blocked.
Using requestAnimationFrame or setTimeout:
//begin the program
console.log('begin');
nonBlockingIncrement(100, function (currentI, done) {
  if (done) {
    console.log('0 incremented to ' + currentI);
  }
});
console.log('do more stuff');

//define the slow function; this would normally be a server call
function nonBlockingIncrement(n, callback){
  var i = 0;
  function loop () {
    if (i < n) {
      i++;
      callback(i, false);
      (window.requestAnimationFrame || window.setTimeout)(loop);
    } else {
      callback(i, true);
    }
  }
  loop();
}
Using web worker:
/***** Your worker.js *****/
this.addEventListener('message', function (e) {
  var i = 0;
  while (i < e.data.target) {
    i++;
  }
  this.postMessage({
    done: true,
    currentI: i,
    caller: e.data.caller
  });
});

/***** Your main program *****/
//begin the program
console.log('begin');
nonBlockingIncrement(100, function (currentI, done) {
  if (done) {
    console.log('0 incremented to ' + currentI);
  }
});
console.log('do more stuff');

// Create web worker and callback register
var worker = new Worker('./worker.js'),
    callbacks = {};

worker.addEventListener('message', function (e) {
  callbacks[e.data.caller](e.data.currentI, e.data.done);
});

//define the slow function; this would normally be a server call
function nonBlockingIncrement(n, callback){
  const caller = 'nonBlockingIncrement';
  callbacks[caller] = callback;
  worker.postMessage({
    target: n,
    caller: caller
  });
}
Note that you cannot run the web worker solution inline, as it requires a separate worker.js file to host the worker logic.
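If a single file is required, a worker can also be created from an inline script via a Blob URL; the next answer develops this idea into a full helper class. A minimal sketch:

// Build a worker from an inline string instead of a separate worker.js
const workerSource = `
  self.addEventListener('message', function (e) {
    var i = 0;
    while (i < e.data.target) { i++; }
    self.postMessage({ done: true, currentI: i });
  });
`;
const blobUrl = URL.createObjectURL(new Blob([workerSource], { type: 'application/javascript' }));
const inlineWorker = new Worker(blobUrl);
inlineWorker.addEventListener('message', (e) => console.log('0 incremented to ' + e.data.currentI));
inlineWorker.postMessage({ target: 10000000 });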
You cannot execute two loops at the same time; remember that JS is single-threaded.
So, doing this will never work:
function loopTest() {
  var test = 0
  for (var i = 0; i <= 100000000000; i++) {
    test += 1
  }
  return test
}

setTimeout(() => {
  //This will block everything, so the second won't start until this loop ends
  console.log(loopTest())
}, 1)

setTimeout(() => {
  console.log(loopTest())
}, 1)
If you want to achieve multi-threading you have to use Web Workers, but they require a separate js file and you can only pass objects to them.
However, I've managed to use Web Workers without separate files by generating Blob URLs, and I can pass callback functions to them too.
//A fileless Web Worker
class ChildProcess {
  //@param {any} ags - any kind of arguments that will be used in the callback, functions too
  constructor(...ags) {
    this.args = ags.map(a => (typeof a == 'function') ? { type: 'fn', fn: a.toString() } : a)
  }

  //@param {function} cb - to be executed; its params must match the arguments passed to the constructor
  async exec(cb) {
    var wk_string = this.worker.toString();
    wk_string = wk_string.substring(wk_string.indexOf('{') + 1, wk_string.lastIndexOf('}'));
    var wk_link = window.URL.createObjectURL(new Blob([wk_string]));
    var wk = new Worker(wk_link);

    wk.postMessage({ callback: cb.toString(), args: this.args });
    var resultado = await new Promise((next, error) => {
      wk.onmessage = e => (e.data && e.data.error) ? error(e.data.error) : next(e.data);
      wk.onerror = e => error(e.message);
    })

    wk.terminate(); window.URL.revokeObjectURL(wk_link);
    return resultado
  }

  worker() {
    onmessage = async function (e) {
      try {
        var cb = new Function(`return ${e.data.callback}`)();
        var args = e.data.args.map(p => (p.type == 'fn') ? new Function(`return ${p.fn}`)() : p);
        try {
          var result = await cb.apply(this, args); //If it is a promise or async function
          return postMessage(result)
        } catch (e) { throw new Error(`CallbackError: ${e}`) }
      } catch (e) { postMessage({ error: e.message }) }
    }
  }
}
setInterval(() => { console.log('Not blocked code ' + Math.random()) }, 1000)

console.log("starting blocking synchronous code in Worker")
console.time("\nblocked");

var proc = new ChildProcess(blockCpu, 43434234);
proc.exec(function(block, num) {
  //This will block for 10 sec, but only inside the worker
  block(10000) //This blockCpu function is defined below
  return `\n\nbla bla ${num}\n` //Captured in the resolved promise
}).then(function (result) {
  console.timeEnd("\nblocked")
  console.log("End of blocking code", result)
})
.catch(function(error) { console.log(error) })

//random blocking function
function blockCpu(ms) {
  var now = new Date().getTime();
  var result = 0
  while (true) {
    result += Math.random() * Math.random();
    if (new Date().getTime() > now + ms)
      return;
  }
}
For very long tasks a Web Worker should be preferred. However, for small enough tasks (less than a couple of seconds), or for when you can't move the task to a worker (e.g. because you need to access the DOM or whatnot), Alnitak's solution of splitting the code into chunks is the way to go.
Nowadays, this can be rewritten in a cleaner way thanks to async/await syntax.
Also, instead of waiting for setTimeout() (which is delayed to at least 1 ms in Node.js and to 4 ms everywhere after the 5th recursive call), it's better to use a MessageChannel.
So this gives us:
const waitForNextTask = () => {
  const { port1, port2 } = waitForNextTask.channel ??= new MessageChannel();
  return new Promise((res) => {
    port1.addEventListener("message", () => res(), { once: true });
    port1.start();
    port2.postMessage("");
  });
};
async function doSomethingSlow() {
  const chunk_size = 10000;
  // do something slow, like counting from 0 to Infinity
  for (let i = 0; i < Infinity; i++) {
    // we've done a full chunk, let the event-loop loop
    if (i % chunk_size === 0) {
      log.textContent = i; // just for demo, to check we're really doing something
      await waitForNextTask();
    }
  }
  console.log("Ah! Did it!");
}
console.log("starting my slow computation");
doSomethingSlow();
console.log("started my slow computation");
setTimeout(() => console.log("my slow computation is probably still running"), 5000);
<pre id="log"></pre>
Using an ECMA async function it's very easy to write non-blocking async code, even if it performs CPU-bound operations. Let's do this on a typical academic task: Fibonacci calculation for an incredibly huge value.
All you need is to insert an operation that allows the event loop to be reached from time to time. Using this approach, you will never freeze the user interface or I/O.
Basic implementation:
const fibAsync = async (n) => {
  let lastTimeCalled = Date.now();
  let a = 1n,
      b = 1n,
      sum,
      i = n - 2;
  while (i-- > 0) {
    sum = a + b;
    a = b;
    b = sum;
    if (Date.now() - lastTimeCalled > 15) { // Do we need to poll the event loop?
      lastTimeCalled = Date.now();
      await new Promise((resolve) => setTimeout(resolve, 0)); // do that
    }
  }
  return b;
};
And now we can use it (Live Demo):
let ticks = 0;
console.warn("Calculation started");
fibAsync(100000)
  .then((v) => console.log(`Ticks: ${ticks}\nResult: ${v}`), console.warn)
  .finally(() => {
    clearInterval(timer);
  });
const timer = setInterval(
  () => console.log("timer tick - event loop is not frozen", ticks++),
  0
);
As we can see, the timer is running normally, which indicates the event loop is not blocked.
I published an improved implementation of these helpers as the antifreeze2 npm package. It uses setImmediate internally, so to get the maximum performance you need to import a setImmediate polyfill for environments without native support.
Live Demo
import { antifreeze, isNeeded } from "antifreeze2";

const fibAsync = async (n) => {
  let a = 1n,
      b = 1n,
      sum,
      i = n - 2;
  while (i-- > 0) {
    sum = a + b;
    a = b;
    b = sum;
    if (isNeeded()) {
      await antifreeze();
    }
  }
  return b;
};
If you are using jQuery, I created a deferred implementation of Alnitak's answer:
function deferredEach (arr, batchSize) {
  var deferred = $.Deferred();
  var index = 0;
  function chunk () {
    var lastIndex = Math.min(index + batchSize, arr.length);
    for (; index < lastIndex; index++) {
      deferred.notify(index, arr[index]);
    }
    if (index >= arr.length) {
      deferred.resolve();
    } else {
      setTimeout(chunk, 0);
    }
  }
  setTimeout(chunk, 0);
  return deferred.promise();
}
Then you'll be able to use the returned promise to manage the progress and done callback:
var testArray = ["Banana", "Orange", "Apple", "Mango"];
deferredEach(testArray, 2).progress(function(index, item){
  alert(item);
}).done(function(){
  alert("Done!");
})
I managed to get an extremely short algorithm using functions. Here is an example:
let l=($,a,f,r)=>{f(r||0),$((r=a(r||0))||0)&&l($,a,f,r)};
l(i => i < 4, i => i+1, console.log)
/*
output:
0
1
2
3
*/
I know this looks very complicated, so let me explain what is really going on here.
Here is a slightly simplified version of the l function:
let l_smpl = (a,b,c,d) => {c(d||0);d=b(d||0),a(d||0)&&l_smpl(a,b,c,d)||0}
In the first step of the loop, l_smpl calls your callback and passes in d, the index. If d is undefined, as it would be on the first call, it changes it to 0.
Next, it updates d by calling your updater function and setting d to the result. In our case, the updater function adds 1 to the index.
The last step checks whether your condition is met by calling the condition function: a true result means the loop is not done, so l_smpl calls itself again; otherwise it returns 0 to end the loop.
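For readability, here is an equivalent version of the loop with descriptive names (a sketch of the same logic, not the golfed original):

// condition: keep looping while true; update: compute the next index;
// body: the callback run on every iteration.
const loop = (condition, update, body, index) => {
  body(index || 0);            // run the body with the current index
  index = update(index || 0);  // compute the next index
  if (condition(index || 0)) { // recurse while the condition holds
    loop(condition, update, body, index);
  }
};

loop(i => i < 4, i => i + 1, console.log); // logs 0 1 2 3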

Callback is being called only after big for loop ends

I'm receiving data in the browser through websockets (paho-mqtt), but the problem is that the receiving callback only gets fired when another task (a big for loop) ends, and then it fires with all the stacked data. I'm not losing data, it's just delayed. Shouldn't the callback fire even while a loop is running? What is happening here? And otherwise, how can I keep receiving while inside a loop?
What I'm trying to say is equivalent to the following:
If I do this in Chrome
setTimeout(() => {
  console.log('hello!');
}, 10);

for (var i = 0; i < 50000; i++) {
  console.log('for array');
}
I get
50000 VM15292:5 for array
VM15292:2 hello!
Shouldn't I get something like this?
1000 VM15292:5 for array
VM15292:2 hello!
49000 VM15292:5 for array
When you run JavaScript code in the browser (unless you're using Web Workers or other special technologies), it is executed on a single thread. That might not sound too important, but it is.
Your code consists of a for-loop (synchronous) and a call to setTimeout (asynchronous). Since only one piece of JavaScript can be running at once, your for-loop will never be interrupted by setTimeout.
In fact, if your for-loop contained extremely intensive operations that required more than 10 ms to complete, your setTimeout callback might actually be delayed past that mark, because the browser always waits for the currently executing code to finish before continuing to run the event loop.
setTimeout(() => {
  console.log('hello!');
}, 10);

for (var i = 0; i < /* 50000 */ 5; i++) {
  console.log('for array');
}
The others have diagnosed the problem well: the single-threaded nature of the browser. I will offer a possible solution: generators.
Here's a codepen which demonstrates the problem:
http://codepen.io/anon/pen/zZwXem?editors=1111
window.CP.PenTimer.MAX_TIME_IN_LOOP_WO_EXIT = 60000;

function log(message) {
  const output = document.getElementById('output');
  output.value = output.value + '\n' + message;
}

function asyncTask() {
  log('Simulated websocket message')
}

function doWork() {
  const timer = setInterval(asyncTask, 1000);
  let total = 0;
  for (let i = 1; i < 100000000; i++) {
    const foo = Math.log(i) * Math.sin(i);
    total += foo;
  }
  log('The total is: ' + total);
  clearInterval(timer);
}
When doWork() is called by clicking the 'Do Work' button, asyncTask never runs and the UI locks up. Horrible UX.
The following example uses a generator to run the long running task.
http://codepen.io/anon/pen/jBmoPZ?editors=1111
//Basically disable codepen infinite loop detection, which is faulty for generators
window.CP.PenTimer.MAX_TIME_IN_LOOP_WO_EXIT = 120000;

let workTimer;

function log(message) {
  const output = document.getElementById('output');
  output.value = output.value + '\n' + message;
}

function asyncTask() {
  log('Simulated websocket message')
}

let workGenerator = null;

function runWork() {
  if (workGenerator === null) {
    workGenerator = doWork();
  }
  const work = workGenerator.next();
  if (work.done) {
    log('The total is: ' + work.value);
    workGenerator = null;
  } else {
    workTimer = setTimeout(runWork, 0);
  }
}

function* doWork() {
  const timer = setInterval(asyncTask, 1000);
  let total = 0;
  for (let i = 1; i < 100000000; i++) {
    if (i % 100000 === 0) {
      yield;
    }
    if (i % 1000000 == 0) {
      log((i / 100000000 * 100).toFixed(1) + '% complete');
    }
    const foo = Math.log(i) * Math.sin(i);
    total += foo;
  }
  clearInterval(timer);
  return total;
}
Here we do the work in a generator and create a generator runner to call from the 'Do Work' button in the UI. This runs on the latest version of Chrome; I can't speak for other browsers. Typically you'd use something like Babel to compile the generators down to ES5 syntax for a production build.
The generator yields every 100,000 iterations of the calculation and emits a status update every 1,000,000. The generator runner runWork creates an instance of the generator and repeatedly calls next(). The generator then runs until it hits the next yield or return statement. After the generator yields, the runner gives up the UI thread by calling setTimeout with 0 milliseconds, using itself as the handler function. This typically means it gets called roughly once per animation frame (ideally). This continues until the generator returns with the done flag set, at which point the runner can read the returned value and clean up.
Here is the HTML for the example in case you need to recreate the codepen:
<input type='button' value='Do Work' onclick='runWork()' />
<textarea id='output' style='width:200px;height:200px'></textarea>
JavaScript engines tend to be single-threaded.
So if you are in a long-running tight loop that doesn't yield (e.g. to do some I/O), the callback will never get a chance to run until the loop finishes.
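In practice that means the loop itself has to yield periodically. A minimal sketch of the idea, with a hypothetical processItem standing in for the per-iteration work; any await of a timer gives the queued message callbacks a chance to run between chunks:

const yieldToEventLoop = () => new Promise((resolve) => setTimeout(resolve, 0));

async function processAll(items, chunkSize = 500) {
  for (let i = 0; i < items.length; i++) {
    processItem(items[i]); // hypothetical per-item work
    if (i % chunkSize === 0) {
      await yieldToEventLoop(); // queued websocket callbacks fire here
    }
  }
}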

Can't stop running promises

I have a complicated JavaScript function with promises inside it.
Here is my code:
var chunkProjectList = function(list, project, accounts) {
  return new Promise((resolve, reject) => {
    // init delta
    let delta = 5
    if ((accounts.length * delta) > list.length) {
      delta = (list.length / accounts.length)
    }
    // init chunked list
    let chunkedList = []
    for (let i = 0; i < accounts.length; i++) {
      chunkedList.push([])
    }
    // loop on all users
    for (let i = 0; i < list.length; i++) {
      isInBlacklist(list[i], project.id)
        .then(() => { // current user is in the blacklist
          ProjectModel.deleteElementFromList(project.id, list[i])
          if (i === list.length - 1) {
            // no screen_names available, cu lan
            return resolve(chunkedList)
          }
        })
        .catch(() => { // we continue
          let promises = []
          for (let j = 0; j < accounts.length; j++) {
            promises[j] = FollowedUserModel.getFollowedUserByScreenName(list[i], accounts[j].id)
          }
          // we checked a screen_name for all accounts
          Promise.all(promises)
            .then((rows) => {
              for (let k = 0; k < rows.length; k++) {
                if (rows[k][0].length === 0 && chunkedList[k].length < delta && list[i] !== '') {
                  console.log('we push')
                  chunkedList[k].push(list[i])
                  break
                }
                console.log('we cut (already followed or filled chunklist)')
                if (k === rows.length - 1) {
                  // determine if it is a cancer screen_name or just filled chunklists
                  for (let l = 0; l < chunkedList.length; l++) {
                    if (chunkedList[l].length < delta) {
                      console.log('we cut because nobody wants ya')
                      ProjectModel.deleteElementFromList(project.id, list[i])
                      ProjectModel.addElementToBlackList(project.id, list[i])
                    }
                    if (l === chunkedList.length - 1) {
                      // you are all filled, cu lan
                      return resolve(chunkedList)
                    }
                  }
                }
              }
              if (i === list.length - 1) {
                // no screen_names available, cu lan
                over = 1
                return resolve(chunkedList)
              }
            })
        })
    }
  })
}
My program loops over a list of usernames and tries to share it between the accounts, with a limit called 'delta' for each account.
Example: my list contains 1000 usernames, the delta is 100, and I have 4 accounts.
The expected result is an array of arrays, each containing 100 usernames:
chunkedList (array) => ( array 1: length 100, array 2: length 100 ... )
The problem I have is the following: when the delta is reached for all the chunkedList arrays, I just want to exit this function and stop every piece of work inside, even the running promises, right when I 'return resolve(chunkedList)'.
But the program continues to run, and it's a real problem for performance.
I know the code is difficult to understand. Thanks
I am having difficulty understanding what your code is trying to accomplish. However, it looks like you could split your code up into a few separate promise functions. This would help clarity and should allow you to stop execution. Without being able to test your code, the main problem seems to be caused by your main loop. In your code you have the following:
.then(() => {
  ....
  return resolve(chunkedList);
}
The "return" statement here is only going to return from the inner .then() call and not from your main Promise. Thus, your main loop will continue to execute all of its code on the whole array. Essentially, it seems that your loop will continue to modify your main Promise's resolved value and never return until the whole loop has completed.
My recommendation is that you split out your main loop contents to be a separate method that takes a list as a parameter. This method will return a Promise. With that method, you can create a Promise.all() using your main loop and then add a .catch() method. The catch() method will occur when you reject one of the method calls (therefore not completing the rest of the promises in the rest of the Promise.all array).
I apologize if my above suggestion does not make any sense. I am trying to write this quickly. In summary, if you can split out the different steps of your method into their own methods that return their own promises, you can use promises very effectively allowing you to chain, execute multiple tasks in parallel, and/or execute tasks sequentially.
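As a rough illustration of that suggestion (the helper names here are hypothetical; only isInBlacklist and the model objects come from the question), each user could be handled by its own promise-returning function, with a rejection used to short-circuit the Promise.all once every chunk is full:

// Hypothetical sketch: one promise per user; reject to stop the chain early.
function processUser(user, project, accounts, chunkedList, delta) {
  return isInBlacklist(user, project.id)
    .then(() => ProjectModel.deleteElementFromList(project.id, user))
    .catch(() => assignToChunk(user, accounts, chunkedList, delta)); // hypothetical helper
}

function chunkProjectList(list, project, accounts) {
  const delta = Math.min(5, list.length / accounts.length);
  const chunkedList = accounts.map(() => []);
  return Promise.all(
      list.map((user) => processUser(user, project, accounts, chunkedList, delta))
    )
    .then(() => chunkedList)
    .catch(() => chunkedList); // a rejection from a full chunk ends the chain early
}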

determining the end of an asynchronous loop

In the code attached I am looking to run the function returnFile after all database queries have run. The problem is that I am unable to tell which response will be the last from inside the query callback. What I was thinking was to separate the loops and just have the last callback run the returnFile function, but that would dramatically slow things down.
for (var i = 0, len = articleRevisionData.length; i < len; i++) {
  tagNames = []
  console.log("step 1, " + articleRevisionData.length + " i:" + i);
  if (articleRevisionData[i]["tags"]) {
    for (var x = 0, len2 = articleRevisionData[i]["tags"].length; x < len2; x++) {
      console.log("step 2, I: " + i + " x: " + x + articleRevisionData[i]["articleID"])
      tagData.find({"tagID": articleRevisionData[i]["tags"][x]}).toArray(function(iteration, len3, iterationA, error, resultC) {
        console.log("step 3, I: " + i + " x: " + x + " iteration: " + iteration + " len3: " + len3)
        if (resultC.length > 0) {
          tagNames.push(resultC[0]["tagName"]);
        }
        //console.log("iteration: " + iteration + " len: " + len3)
        if (iteration + 1 == len3) {
          console.log("step 4, iterationA: " + iterationA + " I: " + iteration)
          articleRevisionData[iterationA]["tags"] = tagNames.join(",");
        }
      }.bind(tagData, x, len2, i));
    }
  }
  if (i == len - 1) {
    templateData = {
      name: userData["firstName"] + " " + userData["lastName"],
      articleData: articleData,
      articleRevisionData: articleRevisionData
    }
    returnFile(res, "/usr/share/node/Admin/anonymousAttempt2/Admin/Articles/home.html", templateData);
  }
}
It is rarely a good idea to call an asynchronous function from within a loop since, as you've discovered, you cannot know when all the calls complete (which is the nature of asynchrony).
In your example, it's important to note that all of your async calls run concurrently, which can consume more system resources than you might wish.
I've found that the best solution to these kinds of problems is to use events to manage execution flow, as in:
const EventEmitter = require('events');
const emitter = new EventEmitter();
let iterations = articleRevisionData.length;

// start up state
emitter.on('start', () => {
  // do setup here
  emitter.emit('next_iteration');
});

// loop state
emitter.on('next_iteration', () => {
  if (iterations--) {
    asyncFunc(args, (err, result) => {
      if (err) {
        emitter.emit('error', err);
        return;
      }
      // do something with result
      emitter.emit('next_iteration');
    });
    return;
  }
  // no more iterations
  emitter.emit('complete');
});

// error state
emitter.on('error', (e) => {
  console.error(`processing failed on iteration ${iterations + 1}: ${e.toString()}`);
});

// processing complete state
emitter.on('complete', () => {
  // do something with all results
  console.log('all iterations complete');
});

// start processing
emitter.emit('start');
Note how simple and clean this code is, lacking any "callback hell", and how easy it is to visualize program flow.
It is also worth noting that you can express every kind of execution control (doWhile, doUntil, map/reduce, queue workers, etc.) using events and since event handling is at the very core of Node, you'll find using them in this manner will outperform most, if not all, other solutions.
See Node Events for more information on event handling in Node.

Make several requests to an API that can only handle 20 requests a minute

I've got a method that returns a promise, and internally that method makes a call to an API which can only handle 20 requests every minute. The problem is that I have a large array of objects (around 300) and I would like to make a call to the API for each one of them.
At the moment I have the following code:
const bigArray = [.....];

Promise.all(bigArray.map(apiFetch)).then((data) => {
  ...
});
But it doesn't handle the timing constraint. I was hoping I could use something like _.chunk and _.debounce from lodash, but I can't wrap my mind around it. Could anyone help me out?
If you can use the Bluebird promise library, it has a concurrency feature built in that lets you manage a group of async operations so that at most N are in flight at a time.
var Promise = require('bluebird');

const bigArray = [....];

Promise.map(bigArray, apiFetch, {concurrency: 20}).then(function(data) {
  // all done here
});
The nice thing about this interface is that it will keep 20 requests in flight. It will start up 20, then each time one finishes, it will start another. So, this is potentially more efficient than sending 20, waiting for all to finish, sending 20 more, etc.
This also provides the results in the exact same order as bigArray so you can identify which result goes with which request.
You could, of course, code this yourself with generic promises using a counter, but since it is already built into the Bluebird library, I thought I'd recommend that way.
The Async library also has a similar concurrency control, though it is obviously not promise-based.
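For reference, the Async library equivalent looks roughly like this (node-style callbacks rather than promises; the sketch assumes apiFetch returns a promise as in the question):

const async = require('async');

// Runs at most 20 apiFetch calls at once, like Bluebird's {concurrency: 20}.
async.mapLimit(bigArray, 20, function (item, callback) {
  apiFetch(item)
    .then(function (result) { callback(null, result); })
    .catch(callback);
}, function (err, results) {
  if (err) return console.error(err);
  // results are in the same order as bigArray
});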
Here's a hand-coded version using only ES6 promises that maintains result order and keeps 20 requests in flight at all times (until there aren't 20 left) for maximum throughput:
function pMap(array, fn, limit) {
  return new Promise(function(resolve, reject) {
    var index = 0, cnt = 0, stop = false, results = new Array(array.length);
    function run() {
      while (!stop && index < array.length && cnt < limit) {
        (function(i) {
          ++cnt;
          ++index;
          fn(array[i]).then(function(data) {
            results[i] = data;
            --cnt;
            // see if we are done or should run more requests
            if (cnt === 0 && index === array.length) {
              resolve(results);
            } else {
              run();
            }
          }, function(err) {
            // set stop flag so no more requests will be sent
            stop = true;
            --cnt;
            reject(err);
          });
        })(index);
      }
    }
    run();
  });
}
pMap(bigArray, apiFetch, 20).then(function(data) {
  // all done here
}, function(err) {
  // error here
});
Working demo here: http://jsfiddle.net/jfriend00/v98735uu/
You could send 1 block of 20 requests every minute or space them out to 1 request every 3 seconds (the latter is probably preferred by the API owners).
function rateLimitedRequests(array, chunkSize) {
  var delay = 3000 * chunkSize;
  var remaining = array.length;
  var promises = [];
  var addPromises = function(newPromises) {
    Array.prototype.push.apply(promises, newPromises);
    if ((remaining -= newPromises.length) == 0) {
      Promise.all(promises).then((data) => {
        ... // do your thing
      });
    }
  };
  (function request() {
    addPromises(array.splice(0, chunkSize).map(apiFetch));
    if (array.length) {
      setTimeout(request, delay);
    }
  })();
}
To call 1 every 3 seconds:
rateLimitedRequests(bigArray, 1);
Or 20 every minute:
rateLimitedRequests(bigArray, 20);
If you prefer to use _.chunk and _.throttle1:
function rateLimitedRequests(array, chunkSize) {
  var delay = 3000 * chunkSize;
  var remaining = array.length;
  var promises = [];
  var addPromises = function(newPromises) {
    Array.prototype.push.apply(promises, newPromises);
    if ((remaining -= newPromises.length) == 0) {
      Promise.all(promises).then((data) => {
        ... // do your thing
      });
    }
  };
  var chunks = _.chunk(array, chunkSize);
  var throttledFn = _.throttle(function() {
    addPromises(chunks.pop().map(apiFetch));
  }, delay, {leading: true});
  for (var i = 0; i < chunks.length; i++) {
    throttledFn();
  }
}
1 You probably want _.throttle rather than _.debounce, since _.throttle executes each function call after a delay whereas _.debounce groups multiple calls into one call. See this article linked from the docs.
Debounce: Think of it as "grouping multiple events in one". Imagine that you go home, enter in the elevator, doors are closing... and suddenly your neighbor appears in the hall and tries to jump on the elevator. Be polite! and open the doors for him: you are debouncing the elevator departure. Consider that the same situation can happen again with a third person, and so on... probably delaying the departure several minutes.
Throttle: Think of it as a valve, it regulates the flow of the executions. We can determine the maximum number of times a function can be called in certain time. So in the elevator analogy.. you are polite enough to let people in for 10 secs, but once that delay passes, you must go!
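In code, the difference looks roughly like this (simplified sketches, not the actual lodash implementations):

// Debounce: restart the timer on every call; fn fires once, after the calls stop.
function debounce(fn, delay) {
  let timer;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delay);
  };
}

// Throttle: fn fires at most once per `delay`, however often the wrapper is called.
function throttle(fn, delay) {
  let last = 0;
  return function (...args) {
    const now = Date.now();
    if (now - last >= delay) {
      last = now;
      fn.apply(this, args);
    }
  };
}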
