I'm new to JS and async/await and I want to build a distance matrix using the Yandex Maps API and then optimize the route using an ACO algorithm.
Everything works, but generating my distance matrix takes very long because of the await on every request in the loop. I know I should avoid that, but I have no idea how.
My distance matrix needs to be ready before the ACO algorithm function is called.
async function buildDistanceMatrix(ymaps) {
  const n = routeCoords.length;
  let distanceMatrix = [];
  console.log(routeCoordsRef.current);
  for (let i = 0; i < n; i++) {
    let newArr = [];
    for (let j = 0; j < n; j++) {
      await ymaps.route([routeCoordsRef.current[i], routeCoords[j]]).then((route) => {
        newArr.push(route.getLength());
        console.log(i, j, routeCoordsRef.current[i], routeCoords[j], route.getLength());
      });
    }
    distanceMatrix.push(newArr);
  }
  return distanceMatrix;
}
let distanceMatrix = await buildDistanceMatrix(ymaps);
// and here my distance matrix is done and I'm calling another function that uses distanceMatrix
I think that you need to take into consideration the following two notes...
Note 1: Typically when dealing with promises, either use "async" / "await" or stick with the ".then()" syntax, but don't mix them. For example...
async function buildDistanceMatrix(ymaps) {...
  let route = await ymaps.route( [ routeCoordsRef.current[ i ], routeCoords[ j ] ] );
  newArr.push( route.getLength() );
  console.log( i, j, routeCoordsRef.current[ i ], routeCoords[ j ], route.getLength() );
...}
...or...
function buildDistanceMatrix(ymaps) {...
  ymaps.route( [ routeCoordsRef.current[ i ], routeCoords[ j ] ] ).then( ( route ) => {
    newArr.push( route.getLength() );
    console.log( i, j, routeCoordsRef.current[ i ], routeCoords[ j ], route.getLength() );
  } );
...}
Generally speaking, async/await is more readable, as it provides syntax that reads procedurally and helps avoid the nesting hell of chained then() callbacks.
Note 2: It appears that the Yandex router function makes use of a web service...
https://yandex.com/dev/maps/jsapi/
https://yandex.com/dev/maps/jsbox/2.1/router
...and hence the need for the promise: the JavaScript code essentially suspends while awaiting the promise, which resolves once the Yandex server responds.
Suggest looking into the Promise.all() function...
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
...as this permits your function to initiate a number of promises without waiting for any of them to complete; only after creating all the promises does your function await for them all to resolve.
The primary benefit is that the remote server is working on all the requests concurrently.
Before running off down this path, check the Yandex terms of service to determine whether there are restrictions on the number of concurrent outstanding calls. Additionally, well-designed web services will throttle the number of concurrent requests, so there might be large gains in performance for a small number of web calls, but after a certain limit the server throttle kicks in and the calls are essentially queued (or ignored!) and handled synchronously again...
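If throttling does turn out to be a concern, one option is to fire the requests in fixed-size batches rather than all at once. A minimal sketch of the idea (runBatched and the batch size of 10 are illustrative placeholders, not part of the Yandex API):

async function runBatched(tasks, batchSize = 10) {
  // `tasks` is an array of functions that each return a promise when called.
  // Each batch is started, awaited in full, and only then is the next batch started.
  const results = [];
  for (let start = 0; start < tasks.length; start += batchSize) {
    const batch = tasks.slice(start, start + batchSize).map(task => task());
    results.push(...await Promise.all(batch));
  }
  return results; // results stay in the same order as `tasks`
}

Each element of tasks would then be a small wrapper such as () => ymaps.route([a, b]), so no request is actually started before its batch comes up.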
EDIT: Use of Promise.all with async / await syntax.
In short, your function should not await when creating the promises, but instead, capture the promises into an array. Then, once all the promises have been created (in this case, initiating a series of web service calls), then await on the Promise.all() function, passing all the unresolved promises. The resulting array will correspond directly with the order that the promises were passed in Promise.all(). Here is generally how your function will look. Note the addition of parallelArray, which is a means of capturing any ancillary data that you need when later making use of arrayOfResults...
async function buildDistanceMatrix(ymaps) {...
  let arrayOfPromises = [];
  let parallelArray = [];
  for ( let i = 0; i < n; i++ ) {
    for ( let j = 0; j < n; j++ ) {
      arrayOfPromises.push( ymaps.route( [ routeCoordsRef.current[ i ], routeCoords[ j ] ] ) );
      parallelArray.push( { i: i, j: j, rci: routeCoordsRef.current[ i ], rcj: routeCoords[ j ] } );
    }
  }
  let arrayOfResults = await Promise.all( arrayOfPromises );
...}
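One way to finish the sketch off (not part of the original answer) is to fold the flat arrayOfResults back into the n × n matrix. Since the results come back in the same i/j order the promises were created in, simple index arithmetic works instead of parallelArray, assuming each resolved value is the same route object the original .then() callback received:

  // Rebuild the n x n matrix from the flat results array; result k corresponds
  // to the promise created at i = Math.floor(k / n), j = k % n.
  let distanceMatrix = [];
  for ( let i = 0; i < n; i++ ) {
    let newArr = [];
    for ( let j = 0; j < n; j++ ) {
      newArr.push( arrayOfResults[ i * n + j ].getLength() );
    }
    distanceMatrix.push( newArr );
  }
  return distanceMatrix;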
Hope this helps...
Related
I have the below code in JavaScript in which some asynchronous task is being performed:
async function fetchData(id){
  for(let i = 1; ; ++i){
    let res = await fetch(`https://some-api/v1/products/${id}/data?page=${i}`, { /* headers here */ });
    res = await res.json();
    if(res.length == 0) break;
    else { /* do some work here and continue for next iteration */ }
  }
}
async function callApi(){
  var arr = [ /* list of id's here to pass to the api one by one, almost 100 id's */ ];
  await Promise.all(arr.map(async (e) => {
    await fetchData(e);
  }));
}
callApi();
The above code looks fine to me, except that it doesn't work as expected. Ideally, what should happen is that until one id's call is completed (until the break condition is satisfied for that id), the loop should not proceed to the next id. Instead, I am getting totally different results: the API calls happen in random order because the loop is not waiting for an iteration to complete. My hard requirement is that until one iteration is complete, it should not move on to the next one.
await seems to have no effect here. Please guide me on how I can achieve this. I am running out of ideas.
Thank You!
Your arr.map(...) is not awaiting the different fetchData calls before the next map call, so I'd turn this into an explicit for loop to be sure it waits:
async function callApi(){
  const arr = [...];
  for(let i = 0; i < arr.length; i++){
    await fetchData(arr[i]);
  }
}
or alternatively use a for...of:
async function callApi(){
  const arr = [...];
  for(let a of arr){
    await fetchData(a);
  }
}
The fetchData function also looks like it could use some error handling, but since you shortened your code quite a bit, I'm assuming there is something like that going on there too, and your issue is actually with the callApi() code; the fetch and await look fine to me there.
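For completeness, the error handling mentioned above could look roughly like this; it is just a sketch of the idea, and the retry/abort policy is up to you:

async function fetchData(id){
  for(let i = 1; ; ++i){
    try {
      const res = await fetch(`https://some-api/v1/products/${id}/data?page=${i}`, { /* headers here */ });
      if (!res.ok) throw new Error(`HTTP ${res.status} for id ${id}, page ${i}`);
      const data = await res.json();
      if (data.length == 0) break;
      // do some work with data here and continue with the next page
    } catch (err) {
      console.error(err);
      break; // or retry, depending on your needs
    }
  }
}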
You should decide whether to use promises or async/await; don't mix them.
With promises you can always use funky abstractions, but with a simple recursive approach you can do something like:
function fetchData(hds, id, page = 1, pages = []){
  return fetch(`https://some-api/v1/products/${id}/data?page=${page}`, hds)
    .then(r => r.ok ? r.json() : Promise.reject({status: r.status, pages}))
    .then(j => {
      pages.push(doSomethingWith(j));
      return fetchData(hds, id, page + 1, pages);
    })
    .catch(e => (console.log(e.status), e.pages));
}
So we use recursion to fetch indefinitely until the API says enough and r.ok is false.
On the callApi side you can use reduce, since we have an ids array.
const ids = [ /* ids array */ ],
      hds = { /* headers object */ };

function callApi(ids){
  return ids.reduce( (p, id) => p.then(_ => fetchData(hds, id))
                                 .then(pages => process(pages))
                    , Promise.resolve(null)
                    )
            .catch(e => console.log(e));
}
So now both the id and the page requests run asynchronously, but each one is fired only once the previous one finishes. Such as:
(id=1,page=1) then (id=1,page=2) then (id=1,page=3) then (process 3 pages of id=1) then
(id=2,page=1) then (id=2,page=2) then (process 2 pages of id=2) etc...
While I love promises, you can also implement the same functionality with the async/await abstraction. I believe the idea behind the invention of async/await is to mimic synchronous imperative code. But keep in mind that it's an abstraction over an abstraction, and I urge you to learn promises by heart before even attempting to use async/await. The general rule is to never mix both in the same code.
Accordingly, the above code could be written as follows using async/await.
async function fetchData(hds, id){
  let page = 1,
      pages = [],
      res;
  while(true){
    res = await fetch(`https://some-api/v1/products/${id}/data?page=${page++}`, hds);
    if (res.ok) pages.push(await res.json());
    else return pages;
  }
}
Then the callApi function can be implemented in a similar fashion
const ids = [ /* ids array */ ],
      hds = { /* headers object */ };

async function callApi(ids){
  let pages;
  for(let i = 0; i < ids.length; i++){
    try {
      pages = await fetchData(hds, ids[i]);
      await process(pages); // no need for await if the process function is sync
    }
    catch(e){
      console.log(e);
    }
  }
}
I have N workspaces; the number N is dynamic.
Every workspace has to execute a few pre-defined queries.
What I am currently doing is looping through an array of workspaces (synchronously) and executing all the queries using Promise.all() (which is asynchronous).
Goal: What I need is to run all the queries for all the workspaces asynchronously, so I want to get rid of the loop that goes through each workspace. The ideal result would be an array of arrays. For example, if there are 3 workspaces and 2 queries, the result would be [[q1, q2], [q1, q2], [q1, q2]], where each q1 and q2 are the results for that workspace.
Below is the sample code:
async function fetchingWorkspaceLogs (workspaceId) {
  // Defining q1QueryString, q2QueryString so on...
  // for azure "loganalytics" reader.
  const [q1Result, q2Result, q3Result] = await Promise.all([
    logAnalyticsReader.query(
      q1QueryString,
      workspaceId
    ),
    logAnalyticsReader.query(
      q2QueryString,
      workspaceId
    ),
  ])
  // return some promises
}
// Parse all workspaces for query
for (let j = 0; j < workspaceIdList.length; j++) {
  workspaceId = workspaceIdList[j]
  const queryResults = await fetchingWorkspaceLogs(workspaceId)
  q1QueryResults = queryResults[0]
  q2QueryResults = queryResults[1]
}
How can I create another promise object to make it async?
Feel free to ask if you need anything else to get more clarity.
If I understand you correctly, you can map() the workspaces array into an array of Promises and wrap it with Promise.all().
The code below is "pseudo" to make the point.
If it doesn't reflect your situation, I'll probably need more information.
async function fetchingWorkspaceLogs (workspaceId) {
  const [q1Result, q2Result] = await Promise.all([
    Promise.resolve(`param1: ${workspaceId}`),
    Promise.resolve(`param2: ${workspaceId}`),
  ]);
  // In this example, the function returns the result to make the point of returning a Promise with the information
  return Promise.resolve([q1Result, q2Result]);
}

const workspaceIdList = ['workspace1', 'workspace2', 'workspace3'];

(async () => {
  const result = await Promise.all(
    workspaceIdList.map(workspace => fetchingWorkspaceLogs(workspace))
  );
  console.log(result);
})();
You need to use the .map() function.
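In other words, a minimal sketch of the same idea as the answer above, assuming fetchingWorkspaceLogs returns a promise of [q1, q2]:

// Maps every workspace id to a pending promise, then waits for all of them;
// `results` ends up as [[q1, q2], [q1, q2], ...] in the same order as the ids.
const results = await Promise.all(workspaceIdList.map(id => fetchingWorkspaceLogs(id)));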
I was fetching data asynchronously using async/await, and then I am running two for loops one after another. My question is: will the for loops overlap each other for big data sets, since JS is asynchronous, and if yes, how do I solve this?
And under what conditions can the loops overlap?
Actually, I am trying to make a dropdown and it's working, but I had this doubt.
const createDropdown = async language => {
  let i = 0;
  let listPromise = new Promise((resolve, reject) => {
    db.ref("Products/" + language).on('value', snapshot => {
      resolve(snapshot.val())
    }) // Fetching Data
  })
  let list = await listPromise;
  for(i in list)
    dropdown.remove(0)
  for(i in list)
    dropdown.options[dropdown.options.length] = new Option(list[i].name, list[i].name)
}
I am running this code and the for loops are not overlapping, but is there a condition under which they will?
Loops that are placed in the code one after the other will never overlap, whether the code inside the loops is synchronous or asynchronous.
for (var i = 0; i < 10; i++) {
  doSomethingSync()
}

for (var j = 0; j < 10; j++) {
  createPromise().then((res) => { console.log(res) })
}

for (var k = 0; k < 10; k++) {
  var res = await createPromise();
  console.log(res);
}
Above, the "I" loop completes all its operations and then the "J" loop, and then the "K" loop.
Here is the order and the details of each operation
The "I" loop serializes the 10 synchronous operations.
The "J" loop create 10 different promises. They maybe resolved in a different order.
The "K" loop creates 10 serialized promises. Each iteration waits until the promise is resolved before going for the net one.
1, 2, and 3 always happen one after the other.
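If you did want to wait for all of the "J"-style promises before moving on, one way (a sketch, assuming this runs inside an async function) is to collect them and await Promise.all:

// Collect the pending promises so that later code can wait for all of them.
const pending = [];
for (var j = 0; j < 10; j++) {
  pending.push(createPromise().then((res) => console.log(res)));
}
await Promise.all(pending); // nothing below this line runs until all 10 have resolved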
I used this solution working with axios, with about 1000 requests to a server, and it works fine; maybe this can be your solution too. In this code you make a promise out of your db.ref call, wait for the response, and then use that to manipulate the dropdown.
const createDropdown = async (language) => {
  // Wrap the db.ref callback in a promise and wait for the response
  const list = await new Promise(resolve =>
    db.ref("Products/" + language).on('value', snapshot => resolve(snapshot.val()))
  )
  for(let i in list)
    dropdown.remove(0)
  for(let i in list)
    dropdown.options[dropdown.options.length] = new Option(list[i].name, list[i].name)
}
I'm having an issue with rate limit 429 when sending too many requests to an API. I'm using the API's Node.js library to make the requests with JavaScript ES6 Promises.
Each Promise takes two arguments, and the arguments change on each request.
I have solved the rate limit issue by chaining promises with .then() and including a delay function that returns a resolved promise after a given number of ms:
let delay = (time = delay_ms) => (result) => new Promise(resolve => setTimeout(() => resolve(result), time));
Something like this:
request(arg1, arg2)
  .then(delay(300))
  .then(() => request(arg1, arg2))
  .then(delay(300))...
This solved the rate limit issue, BUT it's created a real headache: with that solution I have to write an awful amount of repetitive code.
I would like arg1 and arg2 to live in separate arrays so I can iterate through them to dynamically build the promise request and include a delay between each request.
I attempted to iterate with a forEach and a for loop, but the requests all fire within milliseconds of each other, creating the rate limit 429 issue again.
Is there a solution where:
ALL arg1 can be stored in an array let arr1 = ['USD', 'EUR' ...]
ALL arg2 can be stored in an array let arr2 = [60, 300, 600 ...]
Where I can dynamically create the Promise request using arr1 & arr2 with a delay() in between each request?
my code looks something like this:
requestPromise(arg1_a, arg2_a).then(delay(ms)).then(() => requestPromise(arg1_b, arg2_b)).then(delay(ms))...
Any help with maybe async/await? I've tried, but I can't seem to get it to work for this problem, maybe due to the Promises and dynamic arguments? I'm not sure I understand how to incorporate async/await with a dynamic Promise and iteration, etc...
Any help appreciated.
If I properly understand what you're looking for, you want to iterate through your arrays with a delay between each request, calling requestPromise(x, y) where x and y come from each of the arrays.
You can do that like this:
const delay_ms = 100;

function delay(t = delay_ms) {
  return new Promise(resolve => {
    setTimeout(resolve, t);
  });
}

function iterate(a1, a2, t) {
  let index = 0;
  const len = Math.min(a1.length, a2.length);
  if (len === 0) {
    return Promise.reject(new Error("iterate(a1, a2, t) needs two non-zero length arrays"));
  }

  function run() {
    return requestPromise(a1[index], a2[index]).then(() => {
      index++;
      // if still more to process, insert delay before next request
      if (index < len) {
        return delay(t).then(run);
      }
    });
  }

  return run();
}
// sample usage
let arr1 = ['USD', 'EUR' ...];
let arr2 = [60, 300, 600 ...];

iterate(arr1, arr2, 500).then(() => {
  // all done here
}).catch(err => {
  // got error here
});
This works by creating a promise chain where each new request is chained onto the previous promise chain and executed only after the delay. The arrays are accessed via an index variable that is initialized to 0 and then incremented after each iteration.
This function requires two non-zero length arrays and will iterate to the length of the shorter of the two arrays (if for some reason they aren't equal length).
Promise.all() can be used for handling the results of promises made by asynchronous calls; however, in order to control the number of requests made concurrently, the http.request code needs to limit the number of calls. Setting https.globalAgent.maxSockets = 20 has worked for me as a simple workaround. This obviously only applies when the requests are made from a Node client.
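For example, a minimal sketch of that workaround in Node (the limit of 20 is arbitrary):

const https = require('https');

// Cap the number of sockets Node will open per host; extra requests queue up
// on the agent instead of all hitting the server at once.
https.globalAgent.maxSockets = 20;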
This should be pretty simple using async/await and a standard for loop.
async function iterate(arr1, arr2) {
  for (let i = 0; i < arr1.length; i++) {
    await requestPromise(arr1[i], arr2[i]);
    await delay(300);
  }
}

let arr1 = ['USD', 'EUR'];
let arr2 = [60, 300, 600];

iterate(arr1, arr2);
This implementation assumes that the arrays are of the same length.
I have this JavaScript pseudocode:
async function main() {
  for(let i = 0; i < array.length; i++) {
    for(let j = 0; j < 3; j++) {
      let value = await thisFunctionReturnsAPromise();
      if(value) {
        // do_something
      }
    }
  }
}
Now I need to run 2 promises at a time (to speed up the job) and to test each one of them 3 times, so I can't use a simple for loop anymore.
I need to replace the outer for loop.
I was thinking of using async.eachLimit to solve the problem, but I can't use await within eachLimit. How can I put an await within async.eachLimit to call more than one "test cycle"?
I need something like async.eachLimit because my job needs to move on to the next element to be tested only after one of the previous two has been tested up to a maximum of 3 times.
Sorry for my bad English.
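Not an answer from the original thread, but as a sketch of the general shape being asked about: a small pool of two workers pulling from a shared index gives the "2 at a time, 3 attempts each" behaviour without any library. thisFunctionReturnsAPromise is the placeholder from the pseudocode above, here assumed to accept the element under test; when to stop retrying is up to you.

async function main(array) {
  let next = 0; // shared cursor into the array

  // Each worker repeatedly grabs the next untested element and tries it up to 3 times.
  async function worker() {
    while (next < array.length) {
      const item = array[next++];
      for (let attempt = 0; attempt < 3; attempt++) {
        const value = await thisFunctionReturnsAPromise(item);
        if (value) {
          // do_something
          break; // stop re-testing this element once it passes
        }
      }
    }
  }

  // Two workers run concurrently, so at most 2 elements are in flight at a time.
  await Promise.all([worker(), worker()]);
}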