I have an API route that needs to take data from two sources, merge the data into one object, and return it. The issue is that I'm basically stuck in async/await hell: when pushing to a second array within the .then() block, the second array, named clone, logs as []. How can I make an API request, merge the data, and return it to the requester as needed?
Fetch code:
export default async function getProduct(product_id) {
  const product = await fetch(
    `${process.env.PRIVATE_APP_URL}/products/${product_id}.json`,
    {
      method: "GET",
      headers: {
        "Content-Type": "application/json",
      },
    }
  ).then((result) => {
    return result.json();
  });
  return product.product;
}
API Handler:
const recharge_subscription_res = await rechargeAPI(
  "GET",
  `https://api.rechargeapps.com/subscriptions?customer_id=${recharge_customer.id}`
);
const closest_array = recharge_subscription_res.subscriptions.filter(
  (e) => e.next_charge_scheduled_at == closest_date
);
let clone = [];
closest_array.forEach((element) => {
  getProduct(element.shopify_product_id).then((product) => {
    element.shopify_product_handle = product.handle;
    element.shopify_product_image_url = product.image.src;
    clone.push(element);
  });
});
console.log(clone);
clone should log as an array of objects like closest_array, but instead logs as an empty array. This isn't quite like the other seemingly duplicate questions, because those typically don't need to send the promise's data back to an external caller; most of them are about front-end code, while my situation is an Express.js API. Any help would be appreciated.
The original promise spec used .then(), and the newer syntax hides the .then() calls behind await. Style-wise, it makes sense to pick just one style and go with it.
In either style, there's a little challenge in creating many promises in a loop. The JS iteration functions (like map and forEach) take synchronous functions. The most common design is to create a collection of promises in a synchronous loop, then run them concurrently with Promise.all(). Taking both ideas into account...
You could (but don't have to) rewrite your network request like this...
// since we decorated "async" let's use await...
export default async function getProduct(product_id) {
  const url = `${process.env.PRIVATE_APP_URL}/products/${product_id}.json`;
  const options = { method: "GET", headers: { "Content-Type": "application/json" } };
  const result = await fetch(url, options);
  const product = await result.json();
  return product.product;
}
await is not permitted at the top level of a script (it's allowed in ES modules, but otherwise it may be used only within an async function). Here, I'll make up a name and guess at the parameter:
async function rechargeAndLookupProduct(recharge_customer) {
  const base = 'https://api.rechargeapps.com/subscriptions';
  const query = `customer_id=${recharge_customer.id}`;
  const recharge_subscription_res = await rechargeAPI("GET", `${base}?${query}`);
  const closest_array = recharge_subscription_res.subscriptions.filter(e =>
    e.next_charge_scheduled_at == closest_date
  );
  // here's the important part: collect promises synchronously
  // execute them together with Promise.all()
  const promises = closest_array.map(element => {
    return getProduct(element.shopify_product_id)
  });
  const allProducts = await Promise.all(promises);
  // allProducts will be an array of objects that the promises resolved to
  const clones = allProducts.map((product, i) => {
    // use Object.assign so we'll really have a "clone"
    let closest = Object.assign({}, closest_array[i]);
    closest.shopify_product_handle = product.handle;
    closest.shopify_product_image_url = product.image.src;
    return closest;
  });
  // if I didn't make any typos (which I probably did), then
  // clones ought to contain the result you expect
  console.log(clones);
}
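If this is ultimately feeding an Express route (the question mentions an Express.js API), the wiring might look like the sketch below. The route path is a made-up placeholder, recharge_customer would come from your own lookup, and the function above would need to return clones instead of just logging it:

app.get("/api/upcoming-subscription-products", async (req, res) => {
  try {
    // recharge_customer would come from your auth/session/db layer
    const clones = await rechargeAndLookupProduct(recharge_customer);
    res.json(clones);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});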
Your code has a flaw (in the section shown below). You have a dangling promise that you forgot to await or return.
When you log clone, none of the async getProduct operations have completed yet, and none of the elements have been pushed.
let clone = [];
closest_array.forEach((element) => {
  getProduct(element.shopify_product_id).then((product) => {
    element.shopify_product_handle = product.handle;
    element.shopify_product_image_url = product.image.src;
    clone.push(element);
  }); // FLAW: dangling .then
});
console.log(clone); // FLAW: clone is not ready yet.
I would set it up more like this:
let clone = await Promise.all(closest_array.map((element) =>
  getProduct(element.shopify_product_id).then((product) => {
    element.shopify_product_handle = product.handle;
    element.shopify_product_image_url = product.image.src;
    return element;
  })
));
console.log(clone);
It's a little sketchy to modify element the way you are (I wouldn't), but this way the getProduct calls are all in flight together for maximum efficiency. Promise.all handles awaiting all the promises and collecting each promise's result into an array of results, which you can then await as a single promise since the calling function is async.
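If you'd rather not mutate element at all, the same Promise.all pattern works with object spread to build true copies -- a sketch, otherwise behaviorally identical:

let clone = await Promise.all(closest_array.map((element) =>
  getProduct(element.shopify_product_id).then((product) => ({
    ...element, // copy the element instead of mutating the original
    shopify_product_handle: product.handle,
    shopify_product_image_url: product.image.src,
  }))
));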
Related
In my code below I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right. Note that I run the do..while once for testing. The API is paged: if the records are from yesterday it keeps going; if not, the do..while stops. Can somebody point me in the right direction?
const axios = require("axios");

function getToken() {
  // Get the token
}

function getIds(jwt) {
  return new Promise((resolve) => {
    let pageNumber = 1;
    const filterdIds = [];
    const config = {
      // Config stuff
    };
    do {
      axios(config)
        .then((response) => {
          response.forEach(element => {
            // Some logic, if true then:
            filterdIds.push(element.id);
            console.log(filterdIds);
          });
        })
        .catch(error => {
          console.log(error);
        });
    } while (pageNumber != 1)
    resolve(filterdIds);
  });
}
getToken()
  .then(token => {
    return token;
  })
  .then(jwt => {
    return getIds(jwt);
  })
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously before the requests fire, so it's guaranteed to be empty.
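You can see the problem in miniature with setTimeout standing in for axios (a contrived sketch):

function getIdsBroken() {
  return new Promise((resolve) => {
    const filterdIds = [];
    // stands in for the axios(config).then(...) calls
    setTimeout(() => filterdIds.push(1, 2, 3), 100);
    resolve(filterdIds); // runs immediately, while the array is still empty
  });
}

getIdsBroken().then((ids) => console.log(ids)); // logs []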
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests--more on that later). These methods run the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop

axios
  .get("https://httpbin.org") // some initial request like getToken
  .then(response => // response has the token, ignored for simplicity
    Promise.all(
      Array(pages).fill().map((_, i) => // make an array of request promises
        axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
      )
    )
  )
  .then(responses => {
    // perform your filter/reduce on the response data
    const results = responses.flatMap(response =>
      response.data
        .filter(e => e.id % 2 === 0) // some silly filter
        .map(({id, name}) => ({id, name}))
    );
    // use the results
    console.log(results);
  })
  .catch(err => console.error(err));
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel (screenshot omitted).
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
  // like getToken; should handle err
  const tokenStub = await axios.get("https://httpbin.org");
  const results = [];
  // page += 10 to make the snippet run faster; you'd probably use page++
  for (let page = 1;; page += 10) {
    try {
      const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
      const response = await axios.get(url);
      // check whatever condition your API sends to tell you no more pages
      if (response.data.length === 0) {
        break;
      }
      for (const comment of response.data) {
        if (comment.id % 2 === 0) { // some silly filter
          const {name, id} = comment;
          results.push({name, id});
        }
      }
    }
    catch (err) { // hit the end of the pages or some other error
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall (screenshot omitted).
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
  const results = [];
  const chunk = 5;
  for (let page = 1;; page += chunk) {
    try {
      const responses = await Promise.all(
        Array(chunk).fill().map((_, i) =>
          axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
        )
      );
      for (const response of responses) {
        for (const comment of response.data) {
          const {name, id} = comment;
          results.push({name, id});
        }
      }
      // check end condition
      if (responses.some(e => e.data.length === 0)) {
        break;
      }
    }
    catch (err) {
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(The omitted screenshot is an excerpt of the 100 requests, but the chunk size of 5 at a time is visible.)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller--don't try to turn it into synchronous code.
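For instance, if the pagination loop moves into a helper, the caller still has to await it rather than calling and forgetting -- a minimal sketch, reusing the jsonplaceholder endpoint from above:

async function getPage(page) {
  const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
  const response = await axios.get(url);
  return response.data;
}

(async () => {
  const data = await getPage(1); // await the helper; don't fire and forget
  console.log(data.length);
})();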
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to consider how synchronous and asynchronous JavaScript code work together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios invocation returns a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code tells it to do some stuff once that pending value arrives (the callback in the .then() method).
But your main getIds function isn't going to wait around... it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which settles the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, so filterdIds hasn't been populated yet.
When you moved the resolve call for the Promise you're creating into the callback that the fulfilled axios Promise invokes, it started working, because now your Promise waits for axios to resolve before resolving itself.
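In miniature, the difference looks like this (fakeRequest stands in for the axios call; the names are made up for illustration):

const fakeRequest = () => Promise.resolve([1, 2, 3]);

function getIdsFixed() {
  return new Promise((resolve) => {
    fakeRequest().then((data) => {
      resolve(data); // resolve only after the async work has finished
    });
  });
}

getIdsFixed().then((ids) => console.log(ids)); // [1, 2, 3]

(Although, as the antipattern link above warns, wrapping an existing promise in new Promise is unnecessary; returning fakeRequest() directly would be cleaner.)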
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
  const pagePromises = Array(pages).fill(null).map((_, index) => {
    const page = startingPage + index;
    // set the page however you do it with axios query params
    config.page = page;
    return axios(config);
  });
  // get the last page you attempted, and if it doesn't meet whatever
  // criteria you have to finish the query, submit another batch query
  const lastPage = await pagePromises[pagePromises.length - 1];
  // the result from getIds is an array of ids, so we recursively get the rest of the pages here
  // and have a single level array of ids (or an empty array if there were no more pages to fetch)
  const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);
  // now we wait for all page queries to resolve and extract the ids
  const resolvedPages = await Promise.all(pagePromises);
  const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);
  // and finally merge the ids fetched in this method's invocation with any fetched recursively
  return [...resolvedIds, ...additionalIds];
}
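Hypothetical usage, fetching batches of five pages starting at page 1 (assuming config and the done flag match your API's paging scheme):

getIds(1, 5).then((ids) => console.log(ids.length));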
I'm playing with the Rick and Morty API and I want to get all of the universe's characters into an array so I don't have to make more API calls for the rest of my code. The endpoint https://rickandmortyapi.com/api/character/ returns the results in pages, so I have to use recursion to gather all the data in one go. I can get it to spit out results into HTML, but I can't seem to get a complete array of JSON objects. I'm using some ideas from Axios recursion for paginating an api with a cursor. I translated the concept to my problem, and I have it posted on my Codepen.
This is the code:
async function populatePeople(info, universePeople){ // Retrieve the data from the API
  let allPeople = []
  let check = ''
  try {
    return await axios.get(info)
      .then((res)=>{
        // here the current page results is in res.data.results
        for (let i=0; i < res.data.results.length; i++){
          item.textContent = JSON.stringify(res.data.results[i])
          allPeople.push(res.data.results[i])
        }
        if (res.data.info.next){
          check = res.data.info.next
          return allPeople.push(populatePeople(res.data.info.next, allPeople))
        }
      })
  } catch (error) {
    console.log(`Error: ${error}`)
  } finally {
    return allPeople
  }
}

populatePeople(allCharacters)
  .then(data => console.log(`Final data length: ${data.length}`))
Some sharp eyes and brains would be helpful.
It's probably something really simple and I'm just missing it.
The following line has problems:
return allPeople.push(populatePeople(res.data.info.next, allPeople))
Here you push a promise object into allPeople, and as .push() returns a number, you are returning a number, not allPeople.
Using a for loop to push individual items from one array to another is really a verbose way of copying an array. The loop is only needed for the HTML part.
Also, you are mixing .then() with await, which is making things complex. Just use await only. When using await, there is no need for recursion any more. Just replace the if with a loop:
while (info) {
  // ...
  info = res.data.info.next;
}
You never assign anything to universePeople. You can drop this parameter.
Instead of the plain for loop, you can use the for...of syntax.
Since you only use the data property of res, destructure that property into its own variable.
So taking all that together, you get this:
async function populatePeople(info) {
  let allPeople = [];
  try {
    while (info) {
      let {data} = await axios.get(info);
      for (let content of data.results) {
        const item = document.createElement('li');
        item.textContent = JSON.stringify(content);
        denizens.append(item);
      }
      allPeople.push(...data.results);
      info = data.info.next;
    }
  } catch (error) {
    console.log(`Error: ${error}`)
  } finally {
    section.append(denizens);
    return allPeople;
  }
}
Here is a working example of a recursive function:
async function getAllCharectersRecursively(URL, results) {
  try {
    const {data} = await axios.get(URL);
    // concat current page results
    results = results.concat(data.results)
    if (data.info.next) {
      // if there is a next page, call recursively
      return await getAllCharectersRecursively(data.info.next, results)
    }
    else {
      // at the last page there is no next page, so return the collected results
      return results
    }
  }
  catch (e) {
    console.log(e)
  }
}

async function main() {
  let results = await getAllCharectersRecursively("https://rickandmortyapi.com/api/character/", [])
  console.log(results.length)
}

main()
I hesitate to offer another answer because Trincot's analysis and answer are spot-on. But I think a recursive answer here can be quite elegant, and as the question was tagged with "recursion", it seems worth presenting.
const populatePeople = async (url) => {
  const {info: {next}, results} = await axios .get (url)
  return [...results, ...(next ? await populatePeople (next) : [])]
}

populatePeople ('https://rickandmortyapi.com/api/character/')
  // or wrap in an `async` main, or wait for global async...
  .then (people => console .log (people .map (p => p .name)))
  .catch (console .warn)
<script>/* dummy */ const axios = {get: (url) => fetch (url) .then (r => r .json ())} </script>
This is only concerned with fetching the data. Adding it to your DOM should be a separate step, and it shouldn't be difficult.
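For instance, the DOM step might look something like this, assuming the denizens list element from your pen:

populatePeople ('https://rickandmortyapi.com/api/character/')
  .then (people => people .forEach (person => {
    const item = document .createElement ('li')
    item .textContent = JSON .stringify (person)
    denizens .append (item)
  }))
  .catch (console .warn)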
Update: Explanation
A comment indicated that this is hard to parse. There are two things that I imagine might be tricky here:
First is the object destructuring in {info: {next}, results} = <...>. This is just a nice way to avoid using intermediate variables to calculate the ones we actually want to use.
The second is the spread syntax in return [...results, ...<more>]. This is a simpler way to build an array than using .concat or .push. (There's a similar feature for objects.)
Here's another version, with some intermediate variables and an array concatenation instead. It does the same thing:
const populatePeople = async (url) => {
  const response = await axios .get (url)
  const next = response .info && response .info .next
  const results = response .results || []
  const subsequents = next ? await populatePeople (next) : []
  return results .concat (subsequents)
}
I prefer the original version. But perhaps you would find this one more clear.
First of all, I know there are reports of console.log in Google Chrome not behaving as expected; that is not the case here, as I am working in VSCode.
We begin with two async calls to the server.
promise_a = fetch(url)
promise_b = fetch(url)
Since fetch results are also promises, .json() will need to be called on each item. The helper function process will be used, as suggested by a Stack Overflow user -- sorry, lost the link.
let promiseResults = []

let process = prom => {
  prom.then(data => {
    promiseResults.push(data);
  });
};
Promise.all is called. The resulting array is passed to .then, where forEach calls process on item.json() each iteration, and fulfilled promises are pushed to promiseResults.
Promise.all([promise_a, promise_b])
  .then(responseArr => {
    responseArr.forEach(item => {
      process(item.json());
    });
  })
No argument is given to the final .then block because promiseResults is in the outer scope. console.log shows confusing results:
  .then(() => {
    console.log(promiseResults); // correct results
    console.log(promiseResults[0]); // undefined ?!?
  })
Any help will be greatly appreciated.
If you are familiar with async/await syntax, I would suggest not using an external variable promiseResults, but returning the results on the fly with this function:
async function getJsonResults(promisesArr) {
  // Get fetch promises' responses
  const results = await Promise.all(promisesArr);
  // Get JSON from each response promise
  const jsonResults = await Promise.all(results.map(r => r.json()));
  return jsonResults;
}
Here is a usage example:
promise_a = fetch(url1)
promise_b = fetch(url2)

getJsonResults([promise_a, promise_b])
  .then(theResults => console.log('All results:', theResults))
Use theResults variable to extract necessary results.
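For example, since Promise.all preserves the order of its inputs, you can destructure the results directly:

getJsonResults([promise_a, promise_b])
  .then(([resultA, resultB]) => {
    console.log('First result:', resultA)
    console.log('Second result:', resultB)
  })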
You can try this; it looks like the array loop is not being handled properly in the promise environment. Specifically, promiseResults is filled after you log it.
var resultAll = Promise.all([promise_a, promise_b])
  .then(responseArr => {
    return Promise.all(responseArr.map(item => item.json()));
  });

resultAll.then(promiseResults => {
  console.log(promiseResults);
});
When I try to return a promise from an async function, it's impossible to distinguish the status of the returned Promise from that of the async function's own promise. I think the simplest solution is to put the promise to be returned in an array. The following is a contrived example, but I hope it demonstrates the problem:
function loadMetadata(id) {/*...*/} // Returns Promise<MetaData>
function loadSingleData(name) {/*...*/} // Returns Promise<SingleData>

async function startLoadingSingleData(id, object) {
  const metaData = object.metaData = await loadMetadata(id);
  const singleDataPromise = loadSingleData(metaData.dataToLoad);
  singleDataPromise.then(singleData => object.singleData = singleData);
  return [singleDataPromise];
}
async function logData(id) {
  const object = new SomeUsefulClassWhatTakesAdvantageOfMetadataProp();
  somePromise = (await startLoadingSingleData(id, object))[0];
  // Now metadata surely loaded, do something with it
  console.log(object.metaData);
  // But somedata will be loaded only in future
  somePromise.then(singleData => console.log(singleData));
  // And maybe (depends on the use-case):
  await somePromise;
}
When executing logData(/*...*/), the metaData for the given ID is expected after a short period, and the full singleData only after a little more waiting.
But this is kinda hackish.
What is the intended way to handle this situation?
PS.: This problem also occurs when I try to return a Promise which resolves with another promise.
Yes, unfortunately JS promises are not algebraic: you cannot fulfill a promise with another promise. There's no way around that (other than not using native promises, and not using async/await).
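A quick sketch of that assimilation behavior:

const inner = Promise.resolve(42);
// resolving with a promise does not produce Promise<Promise<42>>;
// the outer promise assimilates the inner one
const outer = Promise.resolve(inner);
outer.then(value => console.log(value)); // logs 42, not a promise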
The easiest and most common solution is indeed using a wrapper object. It comes naturally to your problem:
// takes an id, returns a Promise<{metaData: Data, singleDataPromise: Promise<Data>}>
async function startLoadingSingleData(id) {
  const object = new SomeUsefulClassWhatTakesAdvantageOfMetadataProp();
  object.metaData = await loadMetadata(id);
  //                ^^^^^
  object.singleDataPromise = loadSingleData(object.metaData.dataToLoad);
  //                         ^ no await here
  return object;
}
async function logData(id) {
  const object = await startLoadingSingleData(id);
  // Now metadata surely loaded, do something with it
  console.log(object.metaData);
  // But some singleData will be loaded only in future
  const singleData = await object.singleDataPromise;
  console.log(singleData);
}
Notice that this potentially leads to problems with unhandled rejections if there is an exception in your code and you never get to await the singleDataPromise.
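One common mitigation is to attach a no-op rejection handler where the promise is created (a sketch; it marks the rejection as handled without affecting later awaits of the original promise):

object.singleDataPromise = loadSingleData(object.metaData.dataToLoad);
// prevents an unhandled-rejection warning if the caller never awaits it;
// `await object.singleDataPromise` later still throws on rejection
object.singleDataPromise.catch(() => {});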
The (probably much better) alternative is to restructure your functions so that you don't create any promises before using (i.e. awaiting) them, like @Paulpro also suggested. So you'd just write a single strictly sequential function:
async function logData(id) {
  const object = new SomeUsefulClassWhatTakesAdvantageOfMetadataProp();
  object.metaData = await loadMetadata(id);
  // Now metadata surely loaded, do something with it
  console.log(object.metaData);
  // But some singleData will be loaded only in future
  object.singleData = await loadSingleData(object.metaData.dataToLoad);
  console.log(object.singleData);
}
Your problem is that startLoadingSingleData has too many responsibilities: it is responsible for both loading the metadata and triggering the loading of singleData.
Your logData function uses await startLoadingSingleData(id) as a way to make sure that metadata is available, which does not seem very intuitive. It is not obvious that startLoadingSingleData(id) returns a Promise that resolves when the metadata has loaded, and it would be quite confusing for a coder looking at it for the first time (or for yourself after a few months). Code should be self-documenting as much as possible, so that you don't need to explain every line with comments.
My recommendation is to completely remove the startLoadingSingleData function and just do this inside logData:
async function logData(id) {
  const metaData = await loadMetadata(id);
  console.log(metaData);
  const singleData = await loadSingleData(metaData.name);
  console.log(singleData);
}
or if you don't want logData to await the SingleData Promise:
async function logData(id) {
  const metaData = await loadMetadata(id);
  console.log(metaData);
  loadSingleData(metaData.name).then( singleData => {
    console.log(singleData);
  } );
}
If you really want to keep using the function startLoadingSingleData instead then I think you should make it return an array or an object containing two Promises:
function startLoadingSingleData(id) {
  const metaDataLoaded = loadMetadata(id);
  const singleDataLoaded = metaDataLoaded.then(
    metaData => loadSingleData(metaData.dataToLoad)
  );
  return { metaDataLoaded, singleDataLoaded };
}
Then your usage would look something like:
async function logData(id) {
  const { metaDataLoaded, singleDataLoaded } = startLoadingSingleData(id);
  const metaData = await metaDataLoaded;
  console.log(metaData);
  const singleData = await singleDataLoaded;
  console.log(singleData);
}
I am creating an API where, on a GET request, a series of calls to the News API are made, news article titles are extracted into a giant string, and that string is processed into an object to be delivered to a word cloud on the front-end. So far I've been able to use underscore's _.after and request-promise to make my app wait until all API calls have completed before calling processWordBank(), which takes the giant string and cleans it up into an object. However, once processWordBank() is called, I don't understand where the flow of the program goes. Ideally, processWordBank() returns obj to cloudObj in the router, so that obj can be passed to res.json() and spit out as the response. I believe my use of _.after has put me in a weird situation, but it's the only way I've been able to get async calls to finish before proceeding to the next desired action. Any suggestions?
(I've tried to leave out all unnecessary code but let me know if this is insufficient)
// includes...

var sourceString = ""

// router
export default ({ config }) => {
  let news = Router()
  news.get('/', function(req, res){
    var cloudObj = getSources()
    res.json({ cloudObj })
  })
  return news
}

// create list of words (sourceString) by pulling news data from various sources
function getSources() {
  return getNewsApi()
}

// NEWS API
// GET top 10 news article titles from News API (news sources are determined by the values of the newsApiSource array)
function getNewsApi() {
  var finished = _.after(newsApiSource.length, processWordBank)
  for(var i = 0; i < newsApiSource.length; i++) {
    let options = {
      uri: 'https://newsapi.org/v1/articles?source=' + newsApiSource[i] + '&sortBy=' + rank + '&apiKey=' + apiKey,
      json: true
    }
    rp(options)
      .then(function (res) {
        let articles = res.articles // grab article objects from the response
        let articleTitles = " " + _.pluck(articles, 'title') // extract title of each news article
        sourceString += " " + articleTitles // add all titles to the word bank
        finished() // this async task has finished
      })
      .catch(function (err) {
        console.log(err)
      })
  }
}

// analyse word bank for patterns/trends
function processWordBank(){
  var sourceArray = refineSource(sourceString)
  sourceArray = combineCommon(sourceArray)
  sourceArray = getWordFreq(sourceArray)
  var obj = sortToObject(sourceArray[0], sourceArray[1])
  console.log(obj)
  return obj
}
A big issue in your asynchronous flow is that you use a shared variable sourceString to handle the results. When you have multiple calls to getNewsApi() your result is not predictable and will not always be the same, because there is no predefined order in which the asynchronous calls are executed. Not only that, but you never reset it, so all subsequent calls will also include the results of the previous calls. Avoid modifying shared variables in asynchronous calls and instead use the results directly.
I've been able to use underscore's _.after and request-promise to make my app wait till all API calls have completed before calling processWordBank()
Although it would possible to use _.after, this can be done very nicely with promises, and since you're already using promises for your requests, it's just a matter of collecting the results from them. So because you want to wait until all API calls are completed you can use Promise.all which returns a promise that resolves with an array of the values of all the promises, once all of them are fulfilled. Let's have a look at a very simple example to see how Promise.all works:
// Promise.resolve() creates a promise that is fulfilled with the given value
const p1 = Promise.resolve('a promise')
// A promise that completes after 1 second
const p2 = new Promise(resolve => setTimeout(() => resolve('after 1 second'), 1000))
const p3 = Promise.resolve('hello').then(s => s + ' world')
const promises = [p1, p2, p3]
console.log('Waiting for all promises')
Promise.all(promises).then(results => console.log('All promises finished', results))
console.log('Promise.all does not block execution')
Now we can modify getNewsApi() to use Promise.all. The array of promises that is given to Promise.all are all the API request you're doing in your loop. This will be created with Array.protoype.map. And also instead of creating a string out of the array returned from _.pluck, we can just use the array directly, so you don't need to parse the string back to an array at the end.
function getNewsApi() {
  // Each element is a request promise
  const apiCalls = newsApiSource.map(function (source) {
    let options = {
      uri: 'https://newsapi.org/v1/articles?source=' + source + '&sortBy=' + rank + '&apiKey=' + apiKey,
      json: true
    }
    return rp(options)
      .then(function (res) {
        let articles = res.articles
        let articleTitles = _.pluck(articles, 'title')
        // The promise is fulfilled with the articleTitles
        return articleTitles
      })
      .catch(function (err) {
        console.log(err)
      })
  })
  // Return the promise that is fulfilled with all request values
  return Promise.all(apiCalls)
}
Then we need to use the values in the router. We know that the promise returned from getNewsApi() fulfils with an array of all the requests, which by themselves return an array of articles. That is a 2d array, but presumably you would want a 1d array with all the articles for your processWordBank() function, so we can flatten it first.
export default ({ config }) => {
  let news = Router()
  news.get('/', (req, res) => {
    const cloudObj = getSources()
    cloudObj.then(function (apiResponses) {
      // Flatten the array
      // From: [['source1article1', 'source1article2'], ['source2article1'], ...]
      // To: ['source1article1', 'source1article2', 'source2article1', ...]
      const articles = [].concat.apply([], apiResponses)
      // Pass the articles as parameter
      const processedArticles = processWordBank(articles)
      // Respond with the processed object
      res.json({ processedArticles })
    })
  })
  return news
}
And finally processWordBank() needs to be changed to use an input parameter instead of using the shared variable. refineSource is no longer needed, because you're already passing an array (unless you do some other modifications in it).
function processWordBank(articles) {
  let sourceArray = combineCommon(articles)
  sourceArray = getWordFreq(sourceArray)
  var obj = sortToObject(sourceArray[0], sourceArray[1])
  console.log(obj)
  return obj
}
As a bonus the router and getNewsApi() can be cleaned up with some ES6 features (without the comments from the snippets above):
export default ({ config }) => {
  const news = Router()
  news.get('/', (req, res) => {
    getSources().then(apiResponses => {
      const articles = [].concat(...apiResponses)
      const processedArticles = processWordBank(articles)
      res.json({ processedArticles })
    })
  })
  return news
}

function getNewsApi() {
  const apiCalls = newsApiSource.map(source => {
    const options = {
      uri: `https://newsapi.org/v1/articles?source=${source}&sortBy=${rank}&apiKey=${apiKey}`,
      json: true
    }
    return rp(options)
      .then(res => _.pluck(res.articles, 'title'))
      .catch(err => console.log(err))
  })
  return Promise.all(apiCalls)
}