Rate Limit Issue, looping through dynamic Promise requests - javascript

I'm having an issue with rate limiting (HTTP 429) when sending too many requests to an API. I'm using the API's Node.js library to make the requests with JavaScript ES6 Promises.
Each Promise takes two arguments, and the arguments change on each request.
I have solved the rate limit issue by chaining promises with .then() and including a delay function that returns a resolved promise after a given number of milliseconds:
let delay = (time = delay_ms) => (result) => new Promise(resolve => setTimeout(() => resolve(result), time));
Something like this:
request(arg1, arg2)
.then(delay(300))
.then(request(arg1, arg2))
.then(delay(300))...
This solved the rate limit issue, BUT it has created a real headache: with that solution I have to write an awful amount of repetitive code.
I would like arg1 and arg2 to live in separate arrays so I can iterate through them to dynamically build the promise request and include a delay between each request.
I attempted to iterate with a forEach and a for loop, but the requests all fire within milliseconds of each other, creating the rate limit 429 issue again.
Is there a solution where:
ALL arg1 can be stored in an array let arr1 = ['USD', 'EUR' ...]
ALL arg2 can be stored in an array let arr2 = [60, 300, 600 ...]
and where I can dynamically create the Promise requests using arr1 & arr2 with a delay() between each request?
my code looks something like this:
requestPromise(arg1_a, arg2_a).then(res => delay(ms)).then(requestPromise(arg1_b, arg2_b)).then(res => delay(ms))...
Could async/await help here? I've tried, but I can't seem to get it to work for this problem, maybe because of the Promises and dynamic arguments. I'm not sure how to incorporate async/await with a dynamically built Promise and iteration.
Any help appreciated.

If I properly understand what you're looking for, you want to iterate through your arrays with a delay between each request, calling requestPromise(x, y) where x and y come from each of the arrays.
You can do that like this:
const delay_ms = 100;

function delay(t = delay_ms) {
    return new Promise(resolve => {
        setTimeout(resolve, t);
    });
}

function iterate(a1, a2, t) {
    let index = 0;
    const len = Math.min(a1.length, a2.length);
    if (len === 0) {
        return Promise.reject(new Error("iterate(a1, a2, t) needs two non-zero length arrays"));
    }
    function run() {
        return requestPromise(a1[index], a2[index]).then(() => {
            index++;
            // if still more to process, insert delay before next request
            if (index < len) {
                return delay(t).then(run);
            }
        });
    }
    return run();
}
// sample usage
let arr1 = ['USD', 'EUR' ...];
let arr2 = [60, 300, 600 ...];
iterate(arr1, arr2, 500).then(() => {
    // all done here
}).catch(err => {
    // got error here
});
This works by creating a promise chain where each new request is chained onto the previous promise chain and executed only after the delay. The arrays are accessed via an index variable that is initialized to 0 and then incremented after each iteration.
This function requires two non-zero length arrays and will iterate to the length of the shorter of the two arrays (if for some reason they aren't equal length).
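If you want to try the pattern without hitting the real API, you can stub requestPromise with a mock. The mock below is hypothetical (your actual client library's call replaces it), but it lets the chaining and delay logic be test-run end to end:

// hypothetical stand-in for the real API client, only for exercising the chaining/delay logic
function requestPromise(currency, interval) {
    return new Promise(resolve => {
        setTimeout(() => {
            console.log("fetched " + currency + " @ " + interval + "s");
            resolve({ currency, interval });
        }, 50);
    });
}

iterate(['USD', 'EUR'], [60, 300], 500)
    .then(() => console.log("all requests finished"))
    .catch(err => console.error(err));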

Promise.all() can be used to handle the results of promises created by asynchronous calls; however, to control how many requests run concurrently, the http.request code needs to limit the number of open connections. Setting https.globalAgent.maxSockets = 20 has worked for me as a simple workaround. This obviously applies when the requests are made from a Node client.
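For example, with Node's built-in agent (this assumes your requests go through https.globalAgent; libraries that create their own agent need the option set on that agent instead):

const https = require('https');

// cap concurrent sockets per origin; additional requests queue until a socket frees up
https.globalAgent.maxSockets = 20;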

This should be pretty simple using async/await and a standard for loop.
async function iterate(arr1, arr2) {
    for (let i = 0; i < arr1.length; i++) {
        await requestPromise(arr1[i], arr2[i]);
        await delay(300);
    }
}
let arr1 = ['USD', 'EUR'];
let arr2 = [60, 300, 600];
iterate(arr1, arr2);
This implementation assumes that the arrays are of the same length.
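For completeness, here is a self-contained sketch of that approach with the delay helper included and some optional error handling (the try/catch is an addition, not part of the original answer; requestPromise is still the question's API call):

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function iterate(arr1, arr2) {
    for (let i = 0; i < arr1.length; i++) {
        try {
            const res = await requestPromise(arr1[i], arr2[i]);
            console.log("result for " + arr1[i] + "/" + arr2[i], res);
        } catch (err) {
            console.error("request " + arr1[i] + "/" + arr2[i] + " failed", err);
        }
        await delay(300); // spacing between requests keeps you under the rate limit
    }
}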

Related

How to use promise with for loop and $.get request?

This is my first time using promises, so I'm sure there are some pretty dumb mistakes here. What I'm trying to do is send an HTTP request inside a for loop.
When doing this without the promise, I can run the loop fine and everything works properly. However, when I do it with the promise, it only returns one object (it should be multiple, since it's sending multiple requests).
This is my code
function run(o, initialvalue){
    const test = new Promise( (resolve, reject) => {
        const collectionCountUrl = 'https://x.io/rpc/Query?q=%7B%22%24match%22%3A%7B%22collectionSymbol%22%3A%22'+o.name+'%22%7D%2C%22%24sort%22%3A%7B%22takerAmount%22%3A1%2C%22createdAt%22%3A-1%7D%2C%22%24skip%22%3A'+initialvalue+'%2C%22%24limit%22%3A500%7D'
        promises = []
        for(i=0; i < o.count(/*in this case 60, so 3 requests*/); i+= 20){
            $.get(collectionCountUrl).success(resolve).fail(reject) // This should be sending multiple of the requests above, correct? ^
        }
    })
    test.then(function(data){
        console.log(data) // this should return the data from each request? im not sure
    })
}
I've tried to check out this post to see if I was setting up the for loop wrong but I dont think I am.
Promise for-loop with Ajax requests
One promise is for a single result. You have to wrap each request in a separate promise (move the constructor call into the loop), and store the promises in an array.
Then, you can use aggregator methods like Promise.all (others being Promise.allSettled, Promise.race and Promise.any) to get a single promise of an array of results, that you can wait for.
Like this:
function run(o, initialvalue){
    const collectionCountUrl = 'https://x.io/rpc/Query?q=%7B%22%24match%22%3A%7B%22collectionSymbol%22%3A%22'+o.name+'%22%7D%2C%22%24sort%22%3A%7B%22takerAmount%22%3A1%2C%22createdAt%22%3A-1%7D%2C%22%24skip%22%3A'+initialvalue+'%2C%22%24limit%22%3A500%7D'
    const promises = []   // <-- don't forget to declare all your variables
    for(let i = 0; i < o.count(/*in this case 60, so 3 requests*/); i += 20){   // <-- same goes for `i`
        promises.push(new Promise( (resolve, reject) => {
            $.get(collectionCountUrl).success(resolve).fail(reject)
        }))
    }
    // You've got an array of promises, let's convert it to a single one that resolves when everything's done:
    const test = Promise.all(promises)
    test.then(function(data){
        // This will run when all the requests have succeeded
        // `data` is an array of results
        console.log(data)
    })
}
This will work, but there's an even better way!
jQuery's $.Deferred objects (like the one you're getting from $.get) also have a .then() method like promises do, and while it's not a real promise, you can use it as if it were one. Such objects are called thenables.
So, instead of this thing...
promises.push(new Promise( (resolve, reject) => {
    $.get(collectionCountUrl).success(resolve).fail(reject)
}))
...you can just do this:
promises.push( $.get(collectionCountUrl) )
So, here's the final code:
function run(o, initialvalue){
    const collectionCountUrl = 'https://x.io/rpc/Query?q=%7B%22%24match%22%3A%7B%22collectionSymbol%22%3A%22'+o.name+'%22%7D%2C%22%24sort%22%3A%7B%22takerAmount%22%3A1%2C%22createdAt%22%3A-1%7D%2C%22%24skip%22%3A'+initialvalue+'%2C%22%24limit%22%3A500%7D'
    const promises = []
    for(let i = 0; i < o.count(/*in this case 60, so 3 requests*/); i += 20){
        promises.push( $.get(collectionCountUrl) )
    }
    // You've got an array of promises, let's convert it to a single one that resolves when everything's done:
    const test = Promise.all(promises)
    test.then(function(data){
        // This will run when all the requests have succeeded
        // `data` is an array of results
        console.log(data)
    })
}
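As a side note, if you ever need a native Promise instead of jQuery's thenable (for .finally(), async/await interop, and so on), Promise.resolve() will adopt it. A minimal sketch reusing the same loop from above (this is an addition, not part of the original answer):

// Promise.resolve() adopts a thenable and returns a real native Promise
const nativePromises = []
for (let i = 0; i < o.count(); i += 20) {
    nativePromises.push(Promise.resolve($.get(collectionCountUrl)))
}
Promise.all(nativePromises).then(data => console.log(data))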

How can I make the promises to stop after getting x results in Firestore

I have this code that is checking if my userContacts ids exist in another collection, and I'm returning all the matches.
async function fetchCommonNumbers() {
    var commonNumbers = [];
    let contactsReference = admin.firestore().collection("user_contacts").doc("iNaYVsDCg3PWsDu67h75xZ9v2vh1").collection("contacts");
    const dbContactReference = admin.firestore().collection('db_contacts_meta');

    userContacts = await contactsReference.get();
    userContacts = userContacts.docs;

    await Promise.all(userContacts.map(userContact => {
        const DocumentID = userContact.ref.id;
        // Check if Document exists
        return dbContactReference.doc(DocumentID).get().then(dbContact => {
            if (dbContact.exists) {
                console.log(DocumentID);
                commonNumbers.push(dbContact.data());
            }
        });
    }));

    return Promise.resolve(commonNumbers);
}
I need to return only X matches, not all of them, since later I'll have millions of records and I want to reduce processing time.
How can I make the Promise.all to stop when commonNumbers has X items in it?
Currently there is no implementation of cancelable promises.
If you want, you can define your own "cancelable promise" by wrapping a normal promise.
Reduce the processing time without "stopping" promises
You can't really make the promises stop. But since you're looking to reduce the number of database calls, what you can do is to selectively resolve your promises.
For example, you can include a conditional statement in your map function. Like this:
if commonNumbers.length < maxLength, then return a Promise containing the database call;
else, just resolve a dummy value (like false in my example).
Your promises will still be there, but you will have limited the number of DB calls to what's necessary. It will look something like this:
const arr = [1, 2, 3, 4, 5, 6];
const buffer = [];
const maxLenBuffer = 3;

const p = Promise.all(arr.map(n => {
    if (buffer.length < maxLenBuffer) {
        buffer.push(n);
        return Promise.resolve(n);
    } else {
        // There's still a promise to be resolved, but it's not a HTTP call
        // This gives you the gain of performance you're looking for
        return Promise.resolve(false);
    }
}));

p.then(() => console.log(buffer));
Note
While this can reduce your database calls, the actual number of calls can be a little higher than the maximum you specified. This is due to the asynchronous nature of the calls.
Instead of breaking the promise chain partway through, I would suggest you use Firestore's limit method.
You can query for only X records, and this X can either be hardcoded or come from user input. Something like:
documentRef.orderBy("name").limit(3).get()
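Applied to the code in the question, a hypothetical adaptation might look like the sketch below (the maxMatches parameter is invented for illustration; note that this caps the number of contact documents fetched and looked up, which is not exactly the same as guaranteeing maxMatches actual matches):

async function fetchCommonNumbers(maxMatches = 10) {
    const commonNumbers = [];
    // limit() caps how many contact documents are fetched, and therefore how many lookups follow
    const contactsReference = admin.firestore()
        .collection("user_contacts").doc("iNaYVsDCg3PWsDu67h75xZ9v2vh1")
        .collection("contacts").limit(maxMatches);
    const dbContactReference = admin.firestore().collection('db_contacts_meta');

    const userContacts = (await contactsReference.get()).docs;
    await Promise.all(userContacts.map(async userContact => {
        const dbContact = await dbContactReference.doc(userContact.ref.id).get();
        if (dbContact.exists) {
            commonNumbers.push(dbContact.data());
        }
    }));
    return commonNumbers;
}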

Function similar to Promise.some/any for an unknown amount of promises

I am creating a script in node.js (V8.1.3) which looks at similar JSON data from multiple API's and compares the values. To be more exact I am looking at different market prices of different stocks (actually cryptocurrencies).
Currently, I am using promise.all to wait for all responses from the respective APIs.
let fetchedJSON =
await Promise.all([getJSON(settings1), getJSON(settings2), getJSON(settings3) ... ]);
However, Promise.all rejects if even just one promise rejects with an error. In the Bluebird docs there is a function called Promise.some which is almost what I want. As I understand it, it takes an array of promises and resolves with the two fastest promises to resolve, or otherwise (if fewer than 2 promises resolve) throws an error.
The problem with this is that, firstly, I don't want the fastest two promises to be all that it returns; I want any successful promises to be returned, as long as there are at least 2. This seems to be what Promise.any does, except with a minimum count of 1 (I require a minimum count of 2).
Secondly, I don't know how many Promises I will be awaiting on (In other words, I don't know how many API's I will be requesting data from). It may only be 2 or it may be 30. This depends on user input.
Writing this now, it seems to me there is probably just a way to have a Promise.any with a count of 2, and that would be the easiest solution. Is this possible?
Btw, not sure if the title really summarizes the question. Please suggest an edit for the title :)
EDIT: Another way I may be writing the script is that the first two APIs to get loaded in start getting computed and pushed to the browser and then every next JSON that gets loaded and computed after it. This way I am not waiting for all Promises to be fulfilled before I start computing the data and passing results to the front end. Would this be possible with a function which also works for the other circumstances?
What I mean kind of looks like this:
Requesting JSON in parallel...
|-----JSON1------|
|---JSON-FAILS---| > catch error > do something with error. Doesn't affect next results.
|-------JSON2-------| > Meets minimum of 2 results > computes JSON > to browser.
|-------JSON3---------| > computes JSON > to browser.
How about .then()-ing all the promises so none of them reject, passing that to Promise.all, and filtering the successful results in a final .then?
Something like this:
function some( promises, count = 1 ){
    const wrapped = promises.map( promise =>
        promise.then(value => ({ success: true, value }), () => ({ success: false }))
    );
    return Promise.all( wrapped ).then(function(results){
        const successful = results.filter(result => result.success);
        if( successful.length < count )
            throw new Error("Only " + successful.length + " resolved.");
        return successful.map(result => result.value);
    });
}
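Usage against the question's scenario might look like this (getJSON and the settings objects come from the question; the count of 2 matches the stated minimum):

// resolves with every successful result, provided at least 2 of the requests succeeded
some([getJSON(settings1), getJSON(settings2), getJSON(settings3)], 2)
    .then(results => {
        // results is an array containing only the successful JSON responses
        console.log(results);
    })
    .catch(err => console.error(err));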
This might be somewhat clunky, considering you're asking to implement an anti-pattern, but you can force each promise to resolve:
async function fetchAllJSON(settingsArray) {
    let fetchedJSON = await Promise.all(
        settingsArray.map((settings) => {
            // force rejected ajax to always resolve
            return getJSON(settings).then((data) => {
                // initial processing
                return { success: true, data }
            }).catch((error) => {
                // error handling
                return { success: false, error }
            })
        })
    ).then((unfilteredArray) => {
        // only keep successful promises
        return unfilteredArray.filter(({ success }) => success)
    })
    // do the rest of your processing here
    // with fetchedJSON containing array of data
}
You can use Promise.allSettled([]). The difference is that allSettled returns an array of objects after all the promises are settled, regardless of whether they succeeded or failed. Then just find the successful ones, or whatever you need.
let resArr = await Promise.allSettled(userNamesArr.map(user=>this.authenticateUserPassword(user,password)));
return resArr.find(e=>e.status!="rejected");
OR return resArr.find(e=>e.status=="fulfilled").
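Applied to the question's scenario, a small sketch could filter the fulfilled results and enforce the minimum of 2 (getJSON and the settings objects are the question's; run this inside an async function):

// inside an async function
const results = await Promise.allSettled([getJSON(settings1), getJSON(settings2), getJSON(settings3)]);
const fulfilled = results.filter(r => r.status === "fulfilled").map(r => r.value);
if (fulfilled.length < 2) {
    throw new Error("Only " + fulfilled.length + " request(s) succeeded.");
}
// fulfilled now holds every successful JSON response
console.log(fulfilled);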
The other answers have the downside of having to wait for all the promises to resolve, whereas ideally .some would return as soon as any (N) promise(s) passes the predicate.
let anyNPromises = (promises, predicate = a => a, n = 1) => new Promise(async resolve => {
    promises.forEach(async p => predicate(await p) && !--n && resolve(true));
    await Promise.all(promises);
    resolve(false);
});
let atLeast2NumbersGreaterThan5 = promises => anyNPromises(promises, a => a > 5, 2);
atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    Promise.resolve(10),
    Promise.resolve(11)
]).then(a => console.log('5, 3, 10, 11', a)); // true

atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    Promise.resolve(10),
    Promise.resolve(-43)
]).then(a => console.log('5, 3, 10, -43', a)); // false

atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    new Promise(() => 'never resolved'),
    Promise.resolve(10),
    Promise.resolve(11)
]).then(a => console.log('5, 3, unresolved, 10, 11', a)); // true

Am I using Promises well?

Problem
My problem is that I want my code to do the following:
Make an initial request
Once I get that request's answer, I process it and make a batch of requests again
Once I am done with the batch, and have all its responses, I write a file
Once the file is done, I print a message
I got the first two steps right, but my program is not doing the last two as I expected.
Code
This code tries to exemplify what I am trying to achieve. Using only the promise and jsonfile libraries, it is a simple application that represents the architecture of my code in detail, and it works out of the box, provided you install both libraries.
let jsonfile = require("jsonfile");
let Promise = require("promise");

let write = Promise.denodeify(jsonfile.writeFile);

let writeOutput = function(filename, content) {
    return write(filename, content, {spaces: 4});
};

//Returns a random number each time it is invoked
//after a random period of time between 1s and 6s
let requestSimulator = function() {
    return new Promise((fulfil, reject) => {
        let randomNum = Math.random();
        let wait = Math.floor((Math.random() * 6000) + 2000);
        setTimeout(() => fulfil(randomNum), wait, randomNum);
    });
};

//Returns an array of rounded numbers
let roundNumbers = function(someNumbers) {
    let numbersArr = [];
    let tmpNum;
    for (let number of someNumbers) {
        tmpNum = Math.floor(number);
        console.log("Rounded " + number + " to " + tmpNum);
        numbersArr.push(tmpNum);
    }
    return numbersArr;
};

//Receives an array of rounded numbers, and for each number
//makes a new request.
//It then sums the response with the given number.
let sumNumbersBatch = function(numbersArr) {
    let promisesArray = [];
    for (let number of numbersArr) {
        let promise = new Promise((fulfil, reject) => {
            requestSimulator()
                .then(result => {
                    let newNum = number + result;
                    console.log("Summing " + number + " with " + result + "resultint in " + newNum);
                    fulfil(newNum);
                });
        });
        promisesArray.push(promise);
    }
    return new Promise.all(promisesArray);
};

//Starts the process
let getData = function() {
    return new Promise((fulfil, reject) => {
        requestSimulator()
            .then(number => fulfil([number, number * 2, number * 3]));
    });
};

console.log("Starting program");

getData()
    .then(roundNumbers)
    .then(sumNumbersBatch)
    .then(newNumbers => writeOutput("testFile.txt", newNumbers))
    .then(console.log("Program finished"))
    .catch(console.log);
After running this, you can see that the output will be something like:
Starting program
Program finished
Rounded 0.20890058801647582 to 0
Rounded 0.41780117603295164 to 0
Rounded 0.6267017640494275 to 0
Summing 0 with 0.05537663551196226resultint in 0.05537663551196226
Summing 0 with 0.34853429001859215resultint in 0.34853429001859215
Summing 0 with 0.988336787994851resultint in 0.988336787994851
Which is wrong !!!!
Program finished should appear last, not second!
Questions:
So now I have doubts about my code:
Am I making a correct use of Promise.all ?
Am I promisifying the write function well?
Also, I am open to suggestions regarding code quality !!!!
Any help and explanation would be very appreciated.
I've rewritten the whole answer to match the modified code. My first attempt at an "answer" was no more than an extended comment of what seemed wrong with the first provided code; so nothing lost.
Most of your code is correct, this line is actually "wrong":
.then(console.log("Program finished"))
and it confuses you because it calls console.log("Program finished") immediately and returns undefined, so that the then translates to .then(undefined).
it should be
.then(() => console.log("Program finished"))
And there should be no new in front of Promise.all()
A few things can be improved, though, especially your use of the Deferred antipattern. That's the manual creation of a Deferred object when it is not necessary, i.e. when you're already dealing with promises at that place. Like this one:
//Starts the process
let getData = function() {
    return new Promise((fulfil, reject) => {
        requestSimulator()
            .then(number => fulfil([number, number * 2, number * 3]));
    });
};
better would be
//Starts the process
let getData = function() {
    return requestSimulator().then(number => [number, number * 2, number * 3]);
};
whereas in requestSimulator you need to create a new Promise() in order to use Promises with setTimeout(). There it is appropriate.
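For instance, a reusable promisified delay is a legitimate use of the constructor (just an illustration, not part of the rewrite below):

// wrapping a callback API (setTimeout) is exactly the case where new Promise() belongs
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

delay(1000).then(() => console.log("one second later"));

With that in mind, here is the full rewritten version: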
let jsonfile = require("jsonfile");
let Promise = require("promise");

let write = Promise.denodeify(jsonfile.writeFile);

//OK, now this function has a purpose/additional value (formatting)
//and is not just forwarding the arguments
let writeOutput = function(filename, content) {
    return write(filename, content, {spaces: 4});
};

//fine, a mock
let requestSimulator = function() {
    return new Promise((fulfil, reject) => {
        let randomNum = Math.random();
        let wait = Math.floor((Math.random() * 6000) + 2000);
        //there's no need/reason to pass `randomNum` to setTimeout as a third argument
        setTimeout(() => fulfil(randomNum), wait);
    });
};

//this can be shortened to, although it doesn't log anymore
let roundNumbers = function(someNumbers) {
    return someNumbers.map(Math.floor);
};

//Receives an array of rounded numbers, and for each number
//makes a new request.
//It then sums the response with the given number.
let sumNumbersBatch = function(numbersArr) {
    //this again can be achieved simpler by using `Array#map` instead of `for..of`
    let promisesArray = numbersArr.map(number => {
        return requestSimulator()
            .then(result => result + number);
    });

    //no `new` here! Promise.all() is just a utility-function, no constructor.
    return Promise.all(promisesArray);
};

//Starts the process
let getData = function() {
    //removed the wrapping Promise.
    return requestSimulator()
        .then(number => [ number, number * 2, number * 3 ]);
};

console.log("Starting program");

getData()
    .then(roundNumbers)
    .then(sumNumbersBatch)
    .then(newNumbers => writeOutput("testFile.txt", newNumbers))
    //this executes `console.log()` immediately and passes the result (`undefined`) to `then()`
    //.then(console.log("Program finished"))
    //but `then()` needs a function:
    .then(() => console.log("Program finished"))
    //here you pass a reference to the function `console.log`, that's fine
    .catch(console.log);
Am I making a correct use of Promise.all?
Yes and no. The way you use Promise.all is odd: you always provide an empty array. Promise.all expects an array of promises as input; it waits for all of the input promises to resolve OR for any of them to fail.
It returns a promise that is either resolved (if all input promises are OK) or rejected (if any of them fails). In your original case Promise.all always resolves because the input list is empty.
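A small illustration of that behaviour (not from the original answer):

Promise.all([Promise.resolve(1), Promise.resolve(2)])
    .then(values => console.log(values)); // [1, 2]

Promise.all([Promise.resolve(1), Promise.reject(new Error("boom"))])
    .catch(err => console.log(err.message)); // "boom"

Promise.all([])
    .then(values => console.log(values)); // [] - resolves immediately when given an empty array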
Is my file really being written after ALL the requests finished, or is it being written as they finish?
The method writeOutput is invoked after the Promise returned from makeBatchRequests is resolved. It is invoked with two arguments: fileName, which is undefined because it was never defined in your code, and result, which is an array whose members are the resolved results of promisesArray, which is always empty. So technically, YES, the function is invoked after all the requests have finished, but no data would be written to the file (oh, actually, the empty string "", which is [].toString(), will be printed to the file :] )
Am I promisifying the write function well?
YES, you're doing it right.
Other than that, try to rewrite your code step by step, testing it as you go and comparing the expected results with what you actually get.
As mentioned in the other answers, there's a lot of stuff to fix. Good luck! :]
Well then, I think the problem might be in writeOutput(fileName, result); are you sure it returns a promise?
This is more of a suggestion than an actual answer, but try doing like so:
scrapy.getStanceMods()
    .then(() => ['www.google.com', 'www.reddit.com'])
    .then(makeBatchRequest)
    .then(result => {
        return writeOutput(fileName, result)
            .then(() => console.log("Completed."))
            .catch(error => console.log('will catch inner errors'))
    })
    .catch(error => console.error(error));

generators + promises to parallelize N number of items

the challenge:
We want to make N parallel ajax requests for an item's children.
Upon returning, we want to process them in sequential order (1...N)
We do NOT want to wait for all promises to return, but we want to process them IN ORDER as they come back.
For example:
Even if 2,3,5 come back before 1, we should hold onto the results of 2,3,5, and upon 1's return, process 1,2,3 in order (and wait for 4 to come back before 5)
Tools: Q + ES6 generators
Create array of N-1 length with placeholder variables
EG when N = 3:
let [N1,N2,N3] = yield [ Promise1, Promise2, Promise3 ]
//process items sequentially:
console.log(N1)
console.log(N2)
console.log(N3)
However, populating an array of empty placeholder variables doesn't seem to work, of course, because the reference doesn't know where to find the variable declaration:
for (var i = 0; i < 3; i++) {
    res.push("some empty var")
}
Given the constraints of sticking to the tools provided, how could we parallelize calls, but process their returns sequentially?
You can use Promise.all() and .then().
The JavaScript in this answer returns the exact results described in the question.
We want to make N parallel ajax requests for an item's children.
Upon returning, we want to process them in sequential order (1...N)
We do NOT want to wait for all promises to return, but we want to process them IN ORDER as they come back.
how could we parallelize calls, but process their returns
sequentially?
You can chain .then() to the original function that returns a promise (or to the Promise object itself) to process each promise before its value is returned; the values are then handled in the sequential order of the parameters passed to Promise.all(), inside the .then() chained to Promise.all().
var n = 0;

var fn = function() {
    return new Promise(function(resolve, reject) {
        console.log("promise " + ++n + " called");
        setTimeout(function(i) {
            resolve(i)
        }, Math.random() * 2500, n)
    })
    // handle requirement 3. here
    .then(function(res) {
        console.log(res);
        return res
    })
}

Promise.all([fn(), fn(), fn()]) // handle requirement 1. here
    // handle requirement 2. here
    .then(function(data) {
        let [N1, N2, N3] = data;
        console.log(N1, N2, N3);
    })
You can do that by waiting for the next promise inside the loop:
const promises = […]; // or created programmatically

for (const promise of promises) {
    const result = yield promise; // await them sequentially
    console.log(result);
}
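In context, that loop would sit inside a generator driven by Q, matching the question's tools. A sketch, assuming Q's Q.spawn and a hypothetical getChild(i) request function:

const Q = require('q');

// hypothetical request function: returns a promise for one item's child
const getChild = i => Q.delay(Math.random() * 1000).then(() => "child " + i);

Q.spawn(function* () {
    // fire all N requests in parallel up front
    const promises = [1, 2, 3, 4, 5].map(getChild);
    // then consume the results strictly in order 1..N as each becomes available
    for (const promise of promises) {
        const result = yield promise;
        console.log(result);
    }
});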
