Create an order for JSON to be loaded - javascript

I have multiple external JSON files that need to be loaded on my page. Is there a way to set an order for them to be loaded?
const func1 = () => {
    $.getJSON(json1, result => {});
    $.getJSON(json2, result => {});
}
...
const func2 = () => {
    $.getJSON(json3, result => {});
    $.getJSON(json4, result => {});
}
Right now I have four different getJSON calls, and the observed loading order is json1 => json3 => json4 => json2.
Is there a way to control the order in which the JSON files load? I want json1 => json2 => json3 => json4.

The issue you're seeing arises because the requests are all asynchronous. You are entirely at the mercy of the network between the client's machine and the receiving server, the load the server is under at that given moment, and how expensive each requested task is, as to which of those requests are received and responded to first.
There are ways to mitigate this in your client-side logic, such as request chaining or synchronous requests; however, the former has a negative effect on UI performance and the latter is terrible practice, as it locks the browser until all requests complete.
The best approach, if you need to rely on the order of the responses, is to aggregate all of these requests into a single one and send that alone. Your client-side logic can then continue processing, as it has access to all the data you require. This will also scale much better in terms of server performance. It may require some changes to your server-side logic, but this is a small price for better client-side logic and server-side performance.
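A minimal sketch of that aggregation idea (the combined /api/all-data endpoint and its response shape are hypothetical; your server would need to expose something like it):
// one request returns every payload, so response ordering is no longer a concern
$.getJSON('/api/all-data', ({ json1, json2, json3, json4 }) => {
    // all four datasets arrive together; process them in any order you like
});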

You can queue them: initiate a new API call once the previous one has succeeded.
const jsonData = [];

$.getJSON(json1, result => {
    jsonData.push(result);
    $.getJSON(json2, result => {
        jsonData.push(result);
        $.getJSON(json3, result => {
            jsonData.push(result);
            $.getJSON(json4, result => {
                jsonData.push(result);
                processJsonData(jsonData);
            });
        });
    });
});

function processJsonData(jsonData) {
    const [
        json1,
        json2,
        json3,
        json4
    ] = jsonData;
    // Further processing
}

While theoretically possible, it is not advised.
I would rather make one request out of those four and handle that single response, or send all four asynchronously and continue work when all four have resolved.
Here is the simple solution for sending them one after another:
$.getJSON(json1, result => {
    $.getJSON(json2, result => {
        $.getJSON(json3, result => {
            $.getJSON(json4, result => {});
        });
    });
});
Here is a solution written with promises and async/await (it's a bit cleaner):
function sendJsonRequest(jsonData) {
    return new Promise(resolve => {
        $.getJSON(jsonData, result => resolve(result));
    });
}

async function getData() {
    let data1 = await sendJsonRequest(json1);
    let data2 = await sendJsonRequest(json2);
    let data3 = await sendJsonRequest(json3);
    let data4 = await sendJsonRequest(json4);
}
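For the other option mentioned above (send all four asynchronously and continue when all have resolved), a minimal sketch reusing the same sendJsonRequest helper: Promise.all fires the requests in parallel but resolves with the results in argument order:
async function getAllData() {
    // requests fire in parallel; the resolved array preserves the input order
    const [data1, data2, data3, data4] = await Promise.all([
        sendJsonRequest(json1),
        sendJsonRequest(json2),
        sendJsonRequest(json3),
        sendJsonRequest(json4)
    ]);
}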

All of your requests are sent simultaneously; thus, the order you observe depends on the response time of your backend.
If you absolutely want to wait for a request before the next one starts, then just chain them like this:
$.getJSON(json1, result => {
    $.getJSON(json2, result => {
        $.getJSON(json3, result => {
            $.getJSON(json4, result => {})
        })
    })
})
And if you want to avoid the indentation:
const func1 = async () => {
    const result1 = await $.getJSON(json1);
    const result2 = await $.getJSON(json2);
    const result3 = await $.getJSON(json3);
    //...
}

Related

NodeJS: Chain functions automatically in a promise?

I'm currently fetching data from an API and I need to do multiple GET requests (using axios). After all those GET requests are completed, I return a resolved promise.
However, I need to do these GET requests automatically based on an array list:
function do_api_get_requests() {
    return new Promise(function(resolve, reject) {
        const API_IDs = [0, 1, 2];
        axios.get('https://my.api.com/' + API_IDs[0])
            .then(data => {
                // Do something with data
                axios.get('https://my.api.com/' + API_IDs[1])
                    .then(data => {
                        // Do something with data
                        axios.get('https://my.api.com/' + API_IDs[2])
                            .then(data => {
                                // Do something with data
                                // Finished, resolve
                                resolve("success");
                            });
                    });
            });
    });
}
This works, but the problem is that API_IDs isn't always going to be the same array; it will change. So I'm not sure how to chain these requests automatically.
Since you said it may be a variable-length array and you show the requests being sequenced, you can just loop through the array using async/await:
async function do_api_get_requests(API_IDS) {
    for (let id of API_IDS) {
        const data = await axios.get(`https://my.api.com/${id}`);
        // do something with data here
    }
    return "success";
}
And, since you said the list of API ids would be variable, I made it a parameter that you can pass into the function.
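A call might then look like this (using the IDs from the question):
do_api_get_requests([0, 1, 2])
    .then(result => console.log(result)) // "success" once all requests have finished
    .catch(err => console.error(err));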
If you wanted to run all the API requests in parallel (which might be OK for a small array, but might be trouble for a large array) and you don't need to run them in a specific order, you can do this:
function do_api_get_requests(API_IDS) {
    return Promise.all(API_IDS.map(async (id) => {
        const data = await axios.get(`https://my.api.com/${id}`);
        // do something with data here for this request
    })).then(() => {
        // make resolved value be "success"
        return "success";
    });
}
Depending upon your circumstances, you could also use Promise.allSettled(). Since you don't show getting results back, it's not clear whether that would be useful or not.
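For completeness, a minimal sketch of the Promise.allSettled() variant, assuming you want to keep whichever responses succeed even if some requests fail:
function do_api_get_requests(API_IDS) {
    return Promise.allSettled(
        API_IDS.map(id => axios.get(`https://my.api.com/${id}`))
    ).then(outcomes =>
        // each outcome is { status: "fulfilled", value } or { status: "rejected", reason }
        outcomes
            .filter(o => o.status === "fulfilled")
            .map(o => o.value.data)
    );
}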
You can use the Promise.all() method to fire all API requests at the same time, and resolve when all of them resolve.
function do_api_get_requests() {
    const API_IDs = [0, 1, 2];
    let promises = [];
    for (const id of API_IDs) {
        promises.push(axios.get(`https://my.api.com/${id}`));
    }
    return Promise.all(promises);
}
If you use Bluebird.js (a better promise library, and faster than the built-in Promise), you can use Promise.each(), Promise.mapSeries(), or Promise.reduce() to do what you want.
http://bluebirdjs.com
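For example, a sketch using Bluebird's Promise.mapSeries(), which runs the mapper over the array one item at a time, in order, and resolves with the ordered results:
const Promise = require("bluebird");
const axios = require("axios");

function do_api_get_requests(API_IDs) {
    // each request starts only after the previous one has resolved
    return Promise.mapSeries(API_IDs, id =>
        axios.get(`https://my.api.com/${id}`).then(res => res.data)
    );
}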

React - what is the proper way to reload after multiple axios requests are done

I am wondering what is the best way to make multiple axios requests and reload the page after they have all completed.
With the approach below, the page reloads before all the axios requests have completed:
const handleDeletePost = async (postId) => {
    // delete multiple posts
    let postToBeDeleted = selectedPostId; // it's an array containing the id of selected posts
    if (postToBeDeleted.length > 0) {
        postToBeDeleted.forEach(async (id) => {
            const response = await PostService.deletePost(id);
            console.log(response);
        });
        window.location.reload(); // if I put it here, the page will reload before the delete completes
    }
};
You can use Promise.all to fire the requests in parallel without waiting for each one to finish. You then need to wait for all of them to complete before reloading the page:
const handleDeletePost = async (postId) => {
    const postToBeDeleted = selectedPostId;
    if (postToBeDeleted.length > 0) {
        const promises = [];
        postToBeDeleted.forEach((id) => {
            promises.push(PostService.deletePost(id));
        });
        const values = await Promise.all(promises);
        console.log(values);
        window.location.reload();
    }
};
The problem is that you're expecting forEach to wait on every iteration. However, forEach and async functions do not play well together: forEach ignores the promises returned by its callback. You need to use something like Promise.all (or a plain for...of loop, shown below) instead.
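If the deletes should instead happen one at a time, here is a sketch of that for...of variant (slower than Promise.all, but strictly sequential):
const handleDeletePost = async (postId) => {
    const postToBeDeleted = selectedPostId;
    if (postToBeDeleted.length > 0) {
        // unlike forEach, a plain for...of loop respects await on each iteration
        for (const id of postToBeDeleted) {
            const response = await PostService.deletePost(id);
            console.log(response);
        }
        window.location.reload();
    }
};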

Array of filtered axios results from paginated API is empty

In my code below, I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once for testing. The API is paginated: if the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me in the right direction?
const axios = require("axios");

function getToken() {
    // Get the token
}

function getIds(jwt) {
    return new Promise((resolve) => {
        let pageNumber = 1;
        const filterdIds = [];
        const config = {
            // Config stuff
        };
        do {
            axios(config)
                .then((response) => {
                    response.forEach(element => {
                        // Some logic, if true then:
                        filterdIds.push(element.id);
                        console.log(filterdIds);
                    });
                })
                .catch(error => {
                    console.log(error);
                });
        } while (pageNumber != 1)
        resolve(filterdIds);
    });
}

getToken()
    .then(token => {
        return token;
    })
    .then(jwt => {
        return getIds(jwt);
    })
    .then(response => {
        console.log(response);
    })
    .catch(error => {
        console.log(error);
    });
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously, before any of the responses have arrived, so the array is guaranteed to be empty.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests; more on that later). These methods run the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop

axios
    .get("https://httpbin.org") // some initial request like getToken
    .then(response => // response has the token, ignored for simplicity
        Promise.all(
            Array(pages).fill().map((_, i) => // make an array of request promises
                axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
            )
        )
    )
    .then(responses => {
        // perform your filter/reduce on the response data
        const results = responses.flatMap(response =>
            response.data
                .filter(e => e.id % 2 === 0) // some silly filter
                .map(({id, name}) => ({id, name}))
        );
        // use the results
        console.log(results);
    })
    .catch(err => console.error(err));
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel.
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
    // like getToken; should handle err
    const tokenStub = await axios.get("https://httpbin.org");
    const results = [];
    // page += 10 to make the snippet run faster; you'd probably use page++
    for (let page = 1;; page += 10) {
        try {
            const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
            const response = await axios.get(url);
            // check whatever condition your API sends to tell you no more pages
            if (response.data.length === 0) {
                break;
            }
            for (const comment of response.data) {
                if (comment.id % 2 === 0) { // some silly filter
                    const {name, id} = comment;
                    results.push({name, id});
                }
            }
        }
        catch (err) { // hit the end of the pages or some other error
            break;
        }
    }
    // use the results
    console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall.
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
    const results = [];
    const chunk = 5;
    for (let page = 1;; page += chunk) {
        try {
            const responses = await Promise.all(
                Array(chunk).fill().map((_, i) =>
                    axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
                )
            );
            for (const response of responses) {
                for (const comment of response.data) {
                    const {name, id} = comment;
                    results.push({name, id});
                }
            }
            // check end condition
            if (responses.some(e => e.data.length === 0)) {
                break;
            }
        }
        catch (err) {
            break;
        }
    }
    // use the results
    console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(The request waterfall is an excerpt of the 100 requests; the chunk size of 5 at once is visible.)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller; don't try to turn it into synchronous code.
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous JavaScript code work together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios function invocation is returning a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code is telling it to do some stuff when that pending value is returned (which is the callback in the .then() method).
But your main getIds function isn't going to wait around... it invokes the axios function, gives the Promise that is returned something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which fulfills the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve method for the promise you're creating into the callback that the axios resolved Promise will invoke, it started working because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
    const pagePromises = Array(pages).fill(null).map((_, index) => {
        const page = startingPage + index;
        // set the page however you do it with axios query params
        config.page = page;
        return axios(config);
    });
    // get the last page you attempted, and if it doesn't meet whatever
    // criteria you have to finish the query, submit another batch query
    const lastPage = await pagePromises[pagePromises.length - 1];
    // the result from getIds is an array of ids, so we recursively get the rest of the pages here
    // and have a single-level array of ids (or an empty array if there were no more pages to fetch)
    const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);
    // now we wait for all page queries to resolve and extract the ids
    const resolvedPages = await Promise.all(pagePromises);
    const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);
    // and finally merge the ids fetched in this method's invocation with any fetched recursively
    return [...resolvedIds, ...additionalIds];
}
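Assuming the API marks its final page with a done flag, as the sketch above expects, a call fetching five pages per batch might look like:
getIds(1, 5)
    .then(ids => console.log(ids)) // flat array of ids from every fetched page
    .catch(err => console.error(err));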

Parallel HTTP requests in batches with async for loop for each request

I am trying to run parallel requests in batches to an API, using a bunch of keywords in an array, following an article by Denis Fatkhudinov.
The problem I am having is that for each keyword, I need to run the request again with a different page argument for as many times as the number in the pages variable.
I keep getting Cannot read property 'then' of undefined for the return of the chainNext function.
The parallel request in batches on its own, without the for loop, works great; I am struggling to incorporate the for loop into the process.
// Parallel requests in batches
async function runBatches() {
    // The keywords to request with
    const keywords = ['many keyword strings here...'];
    // Set max concurrent requests
    const concurrent = 5;
    // Clone keywords array
    const keywordsClone = keywords.slice()
    // Array for future resolved promises for each batch
    const promises = new Array(concurrent).fill(Promise.resolve());
    // Async for loop
    const asyncForEach = async (pages, callback) => {
        for (let page = 1; page <= pages; page++) {
            await callback(page);
        }
    };
    // Number of pages to loop for
    const pages = 2;
    // Recursively run batches
    const chainNext = (pro) => {
        // Runs itself as long as there are entries left on the array
        if (keywordsClone.length) {
            // Store the first entry and conveniently also remove it from the array
            const keyword = keywordsClone.shift();
            // Run 'the promise to be' request
            return pro.then(async () => {
                // ---> Here was my problem, I am declaring the constant before running the for loop
                const promiseOperation = await asyncForEach(pages, async (page) => {
                    await request(keyword, page)
                });
                // ---> The recursive invocation should also be inside the for loop
                return chainNext(promiseOperation);
            });
        }
        return pro;
    }
    return await Promise.all(promises.map(chainNext));
}

// HTTP request
async function request(keyword, page) {
    try {
        // request API
        const res = await apiservice(keyword, page);
        // Send data to an outer async function to process the data
        await append(res.data);
    } catch (error) {
        throw new Error(error)
    }
}

runBatches()
The problem is simply that pro is undefined, because you haven't initialized it.
You basically execute this code:
Promise.all(new Array(concurrent).fill(Promise.resolve()).map(pro => {
    // pro is undefined here because the Promise.resolve had no parameter
    return pro.then(async () => {})
}));
I'm not completely sure about your idea behind that, but this is your problem in a more condensed version.
I got it working by moving the actual request, promiseOperation, inside the for loop and returning the recursive function there too:
// Recursively run batches
const chainNext = async (pro) => {
    if (keywordsClone.length) {
        const keyword = keywordsClone.shift()
        return pro.then(async () => {
            await asyncForEach(pages, (page) => {
                const promiseOperation = request(keyword, page)
                return chainNext(promiseOperation)
            })
        })
    }
    return pro
}
Credit for the parallel requests in batches goes to https://itnext.io/node-js-handling-asynchronous-operations-in-parallel-69679dfae3fc

How execute promises in order?

I can't make my code run in order. I need the connection test to come first, and then the functions to resolve in order, so as to form a text string that will be sent in a tweet with an NPM package. (This is not my actual code; it is a simplified example.)
I've tried many things and my brain is on fire
// Test DB connection
db.authenticate()
    .then(() => {
        const server = http.createServer(app)
        server.listen(config.port, () => {
            console.log(`http://localhost:${config.port}`)
        })
        reload(app)
    })
    .catch(err => {
        console.log(`Error: ${err}`)
    })

// Functions
resumen.man = (numRoom) => {
    const registries = Registries.findOne({})
        .then((registries) => {
            return registries.name + ' is good.'
        })
}

resumen.man1 = (numRoom) => {
    const registries = Registries.findOne({})
        .then((registries) => {
            return registries.name + ' is bad.'
        })
}

resumen.man2 = (numRoom) => {
    const registries = Registries.findOne({})
        .then((registries) => {
            return registries.name + ' is big.'
        })
}

// Execute resumen.man(1) first and save text in $varStringMultiLine ?
// Execute resumen.man1(1) later and save text in the same $varStringMultiLine ?
// Execute resumen.man2(1) last and save text in the same $varStringMultiLine ?
sendTweet($varStringMultiLine)
Thanx.
As commented by @Barmar and @some, you could chain the promises with .then or use async/await. I would recommend the latter, since .then chaining gets unwieldy fast.
This is a really good explanation for async / await: https://javascript.info/async-await
Basically, you can use
await db.authenticate();
to halt the code and not execute the next line before the promise is resolved. However, so as not to freeze the whole execution, this itself needs to happen inside an async function (which in turn returns a promise).
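As a minimal sketch of how the example could be restructured, assuming each resumen helper is changed to return its promise (as written in the question, they return nothing):
resumen.man = (numRoom) => {
    // returning the promise lets the caller await the resulting string
    return Registries.findOne({})
        .then((registries) => registries.name + ' is good.')
}
// resumen.man1 and resumen.man2 changed the same way

async function run() {
    await db.authenticate()
    const varStringMultiLine = [
        await resumen.man(1),
        await resumen.man1(1),
        await resumen.man2(1)
    ].join('\n')
    sendTweet(varStringMultiLine)
}

run().catch(err => console.log(`Error: ${err}`))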
