At the moment I have a webpage in which a long list of Axios POST calls is being made. The requests seem to be sent in parallel (JavaScript continues sending the next request before the result is received).
However, the results seem to come back one by one, not simultaneously. Say one POST call to the PHP script takes 4 seconds and I need to make 10 calls: it currently takes 4 seconds per call, which is 40 seconds in total. I'm hoping for a solution where all results arrive at approximately the same time (~4 seconds) instead of ~40 seconds.
I've read about threads and multithreading in Node.js using Workers, and that JavaScript itself is single-threaded, so it may not allow this by itself.
But I'm not sure where to go from here. All I have are some ideas. I'm not sure whether I'm heading in the right direction, and if I am, I'm not sure how to use Workers in Node.js and apply them in my code. Which road should I take? Any guidance would be highly appreciated!
Here is a small piece of example code:
for (let i = 0; i < 10; i++) {
    window.axios.post(`/my-url`, {
        myVar: 'myValue'
    })
    .then((response) => {
        // Takes 4 seconds, 4 more seconds, 4 more seconds, etc
        // Ideally: takes 4 seconds, returns in the same ~4 seconds, returns in the same ~4 seconds, etc
        console.log('Succeeded!');
    })
    .catch((error) => {
        console.log('Error');
    });
    // Takes < 1 second, < 1 more second, < 1 more second, etc
    console.log('Request sent!');
}
There are three ways you can achieve your goal.
For simultaneous requests with Axios you can use axios.all():
axios.all([
    axios.post(`/my-url`, {
        myVar: 'myValue'
    }),
    axios.post(`/my-url2`, {
        myVar: 'myValue'
    })
])
.then(axios.spread((data1, data2) => {
    // output of the requests
    console.log('data1', data1, 'data2', data2)
}));
Alternatively, you can use Promise.allSettled(). The Promise.allSettled() method returns a promise that resolves after all of the given promises have either resolved or rejected, with an array of objects describing the outcome of each (see the sketch after this list).
You can also try Promise.all(), but it has the drawback that if any one request fails, the whole call rejects and the output is an error (handled in the catch block).
But the best fit here is usually the first one.
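For instance, here is a minimal Promise.allSettled() sketch applied to the question's loop (assuming the same hypothetical /my-url endpoint):
const promises = [];
for (let i = 0; i < 10; i++) {
    promises.push(window.axios.post(`/my-url`, { myVar: 'myValue' }));
}
Promise.allSettled(promises).then((results) => {
    // each entry is { status: 'fulfilled', value } or { status: 'rejected', reason }
    results.forEach((result) => console.log(result.status));
});
Unlike Promise.all(), a failed POST shows up as a 'rejected' entry instead of short-circuiting the whole batch.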
For simultaneous requests with Axios you can use Axios.all().
axios.all([
    axios.get('https://api.github.com/users/MaksymRudnyi'),
    axios.get('https://api.github.com/users/taylorotwell')
])
.then(axios.spread((obj1, obj2) => {
    // Both requests are now complete
    console.log(obj1.data.login + ' has ' + obj1.data.public_repos + ' public repos on GitHub');
    console.log(obj2.data.login + ' has ' + obj2.data.public_repos + ' public repos on GitHub');
}));
Also, you can use Promise.all(). It works similarly:
Promise.all([
    fetch('https://api.github.com/users/MaksymRudnyi'),
    fetch('https://api.github.com/users/taylorotwell')
])
.then(async ([res1, res2]) => {
    const a = await res1.json();
    const b = await res2.json();
    console.log(a.login + ' has ' + a.public_repos + ' public repos on GitHub');
    console.log(b.login + ' has ' + b.public_repos + ' public repos on GitHub');
})
.catch(error => {
    console.log(error);
});
But Promise.all() has a specific behavior: if at least one request is rejected, the whole call is rejected and the code goes to the .catch() section. That's fine when you need to be sure that all requests resolved.
If it's acceptable for some of your requests to be rejected, consider using Promise.allSettled(). The Promise.allSettled() method returns a promise that resolves after all of the given promises have either resolved or rejected, with an array of objects that each describes the outcome of each promise.
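For instance, each settled result carries a status field you can branch on; here is a minimal sketch reusing the GitHub requests from above:
Promise.allSettled([
    fetch('https://api.github.com/users/MaksymRudnyi'),
    fetch('https://api.github.com/users/taylorotwell')
])
.then(results => {
    results.forEach(result => {
        if (result.status === 'fulfilled') {
            console.log('OK:', result.value.url);
        } else {
            console.log('Failed:', result.reason);
        }
    });
});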
Try it this way:
window.axios.all([requestOne, requestTwo, requestThree])
    .then(axios.spread((...responses) => {
        const responseOne = responses[0]
        const responseTwo = responses[1]
        const responseThree = responses[2]
        // use/access the results
    }))
    .catch(errors => {
        // react to errors
    })
If you want to do it within a loop, you can use a slightly modified version of #deelink's answer, as below:
let promises = [];
for (let i = 0; i < 10; i++) {
    promises.push(
        window.axios.post(`/my-url`, {
            myVar: 'myValue'
        })
        .then(response => {
            // do something with response
        })
    );
}
Promise.all(promises).then(() => console.log('all done'));
Try this with axios.all(). It builds on Promise.all(), which returns a single Promise that fulfills when all of the promises passed as an iterable have been fulfilled (see the Promise reference on MDN).
import axios from 'axios';

let one = "https://api1"
let two = "https://api2"
let three = "https://api3"

const requestOne = axios.get(one);
const requestTwo = axios.get(two);
const requestThree = axios.get(three);

axios.all([requestOne, requestTwo, requestThree])
    .then(axios.spread((...responses) => {
        const responseOne = responses[0]
        const responseTwo = responses[1]
        const responseThree = responses[2]
        // use/access the results
        console.log("responseOne", responseOne);
        console.log("responseTwo", responseTwo);
        console.log("responseThree", responseThree);
    }))
    .catch(errors => {
        console.log(errors);
    })
You can do it this way:
// Note: await must appear inside an async function (e.g. an async route handler)
const token_config = {
    headers: {
        'Authorization': `Bearer ${process.env.JWD_TOKEN}`
    }
}

const [res1, res2] = await Axios.all([
    Axios.get(`https://api-1`, token_config),
    Axios.get(`https://api-2`, token_config)
]);

res.json({
    info: {
        "res_1": res1,
        "res_2": res2
    }
});
This is weird and shouldn't happen. JavaScript engines are single-threaded, but the Web APIs (which are used internally when making AJAX requests) are not. So the requests should be made at approximately the same time, and the response times should depend on server processing times and network delays.
Web browsers have a limit on the number of connections per server (6 in Chrome: https://bugs.chromium.org/p/chromium/issues/detail?id=12066), which would explain some serialization, but not this.
Since each request takes 4 seconds, which is long, my guess is that the server is the problem. It may only be able to handle one connection at a time. Do you have control over it?
Related
In my code below I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once, for testing. The API is paged: if the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me in the right direction?
const axios = require("axios");

function getToken() {
    // Get the token
}

function getIds(jwt) {
    return new Promise((resolve) => {
        let pageNumber = 1;
        const filterdIds = [];
        const config = {
            // Config stuff
        };
        do {
            axios(config)
                .then((response) => {
                    response.forEach(element => {
                        // Some logic, if true then:
                        filterdIds.push(element.id);
                        console.log(filterdIds);
                    });
                })
                .catch(error => {
                    console.log(error);
                });
        } while (pageNumber != 1)
        resolve(filterdIds);
    });
}

getToken()
    .then(token => {
        return token;
    })
    .then(jwt => {
        return getIds(jwt);
    })
    .then(response => {
        console.log(response);
    })
    .catch(error => {
        console.log(error);
    });
getToken()
.then(token => {
return token;
})
.then(jwt => {
return getIds(jwt);
})
.then(response => {
console.log(response);
})
.catch(error => {
console.log(error);
});
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously before the requests fire, so it's guaranteed to be empty.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests--more on that later). These methods run the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop

axios
    .get("https://httpbin.org") // some initial request like getToken
    .then(response => // response has the token, ignored for simplicity
        Promise.all(
            Array(pages).fill().map((_, i) => // make an array of request promises
                axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
            )
        )
    )
    .then(responses => {
        // perform your filter/reduce on the response data
        const results = responses.flatMap(response =>
            response.data
                .filter(e => e.id % 2 === 0) // some silly filter
                .map(({id, name}) => ({id, name}))
        );
        // use the results
        console.log(results);
    })
    .catch(err => console.error(err));
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel:
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
    // like getToken; should handle err
    const tokenStub = await axios.get("https://httpbin.org");
    const results = [];

    // page += 10 to make the snippet run faster; you'd probably use page++
    for (let page = 1;; page += 10) {
        try {
            const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
            const response = await axios.get(url);

            // check whatever condition your API sends to tell you no more pages
            if (response.data.length === 0) {
                break;
            }

            for (const comment of response.data) {
                if (comment.id % 2 === 0) { // some silly filter
                    const {name, id} = comment;
                    results.push({name, id});
                }
            }
        }
        catch (err) { // hit the end of the pages or some other error
            break;
        }
    }

    // use the results
    console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall:
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
    const results = [];
    const chunk = 5;

    for (let page = 1;; page += chunk) {
        try {
            const responses = await Promise.all(
                Array(chunk).fill().map((_, i) =>
                    axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
                )
            );

            for (const response of responses) {
                for (const comment of response.data) {
                    const {name, id} = comment;
                    results.push({name, id});
                }
            }

            // check end condition
            if (responses.some(e => e.data.length === 0)) {
                break;
            }
        }
        catch (err) {
            break;
        }
    }

    // use the results
    console.log(results);
})();
(the image above is an excerpt of the 100 requests, but the chunk size of 5 at a time is visible)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller--don't try to turn it into synchronous code.
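As a minimal sketch of that last point (getAllPages is a hypothetical stand-in for any of the loops above):
async function getAllPages() {
    // placeholder body; imagine one of the request loops above collecting results
    return ["result1", "result2"];
}

(async () => {
    // await the returned promise in the caller; don't drop it
    const results = await getAllPages();
    console.log(results);
})();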
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous JavaScript code work together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios invocation returns a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code tells it what to do once that pending value arrives (the callback passed to the .then() method).
But your main getIds function isn't going to wait around: it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which returns a value from the Promise you created at the beginning of the function. But the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve call for the Promise you're creating into the callback that the fulfilled axios Promise invokes, it started working, because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
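For a single page, a minimal sketch of that fix, keeping the original names (the explicit new Promise wrapper could also be dropped in favor of returning the axios promise directly):
function getIds(jwt) {
    return new Promise((resolve, reject) => {
        const filterdIds = [];
        axios(config) // config as in the original code
            .then((response) => {
                // assumes the response is iterable, as in the original code
                response.forEach(element => filterdIds.push(element.id));
                resolve(filterdIds); // resolve only once the data has arrived
            })
            .catch(reject);
    });
}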
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
    const pagePromises = Array(pages).fill(null).map((_, index) => {
        const page = startingPage + index;
        // set the page however you do it with axios query params
        config.page = page;
        return axios(config);
    });

    // get the last page you attempted, and if it doesn't meet whatever
    // criteria you have to finish the query, submit another batch query
    const lastPage = await pagePromises[pagePromises.length - 1];

    // the result from getIds is an array of ids, so we recursively get the rest of the pages here
    // and have a single-level array of ids (or an empty array if there were no more pages to fetch)
    const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);

    // now we wait for all page queries to resolve and extract the ids
    // (assumes each resolved page is an array of elements with an id)
    const resolvedPages = await Promise.all(pagePromises);
    const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);

    // and finally merge the ids fetched in this method's invocation with any fetched recursively
    return [...resolvedIds, ...additionalIds];
}
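Usage might then look like this (hypothetical batch size of 5, starting at page 1):
getIds(1, 5).then(ids => {
    console.log(ids); // a flat array of ids across all fetched pages
});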
I have an array of elements to insert in a database. For each of them, I have to check their integrity (I send "Bad request" if I don't find an element):
let ret = []
const { idElement, type, description, name } = req.body
let promises = []

req.body.pjs.forEach((pj) => {
    promises.push(new Promise(async function(resolve, reject) {
        const { rows } = await db.query(`SELECT * FROM files WHERE uuid = '${pj.uuid}' AND name = '${pj.name}'`)
        if (rows.length == 0) { res.status(400).send("Bad request!") }
        const idFile = rows[0].id
        await db.query(`UPDATE elements
            SET base = base || '{"type":"file","valeur":"${idFile}","description":"${description}","name":"${pj.name}"}'::json
            WHERE id = ${idElement};`)
        resolve({id: idElement, name: pj.name, val: idFile, description: description})
    }))
});

(async function() {
    const asyncFunctions = promises
    await asyncFunctions.reduce(async (previousPromise, nextAsyncFunction) => {
        await previousPromise;
        const r = await nextAsyncFunction();
        ret.push(r)
    }, Promise.resolve());
})();

res.send(ret)
I took the example from the paragraph "3) one-by-one" here: https://dev.to/afifsohaili/dealing-with-promises-in-an-array-with-async-await-5d7g
This trick works for a lot of use cases in other parts of my code, but not for this particular one. I get this error:
const r = await nextAsyncFunction();
TypeError: nextAsyncFunction is not a function
And I don't know why. If anybody could give me a hand, it would be very kind :)
The error message is correct: the second parameter of the reduce callback is the next entry of the array being reduced, which in this case is the promises array, so each entry is a promise, not a function.
So the immediate solution is to await the promise without trying to call it:
const r = await nextAsyncFunction; // no () on the end
Why the name nextAsyncFunction was used instead of nextPromise (or some variation thereof) is not self-evident; it's certainly confusing and led to the error.
Aside from that, there seem to be some bugs waiting to happen:
If the "Bad request" message is sent, the code continues to execute, tries to update the database, and resolves the promise pushed by the forEach function. Subsequently, res.send(ret) is likely to error as an attempt to send a second set of response headers. Try throwing a Bad Request error and catching it in a promise catch handler to send the 400 response (see the sketch after this list).
There is no attempt to wait for the asynchronous processing to finish before executing res.send(ret), which would send an empty array if it succeeded.
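A sketch of the first fix, keeping the question's structure (queries elided; a single catch handler wherever the promises are awaited decides the response):
promises.push(new Promise(async function(resolve, reject) {
    const { rows } = await db.query(/* SELECT ... as above */)
    if (rows.length == 0) {
        // reject this promise instead of sending a response from inside it
        return reject(new Error("Bad request"))
    }
    /* UPDATE ... as above */
    resolve({ id: idElement, name: pj.name, val: rows[0].id, description: description })
}))

// and wherever the promises are awaited:
// .catch(() => res.status(400).send("Bad request!"))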
The reduce(async (previousPromise, nextPromise) => ...) construct is a rather complicated way of waiting for the promises to resolve in turn; the same can be done more simply with for ... of:
(async function() {
    for (const promise of promises) {
        ret.push(await promise);
    }
})()
.then(() => res.send(ret))
.catch(() => { /* server error response? */ });
Handling requests that are a mixture of valid and invalid pj request values may require further attention.
I want to wait for three HTTP requests to complete before calling another function to work on the data returned by the three HTTP requests.
I tried looping the number of rooms (equivalent to the number of HTTP requests required) and then pushing the devices into an array. I need all three HTTP requests to be completed before passing the array of devices to the next function to work on.
getDeviceListByRoom(rooms_id_array: string[]) {
    console.log('The rooms_id_array is: ', rooms_id_array);
    this.room_device_map = new Map();
    let count = 0;

    for (const id of rooms_id_array) {
        const devicesByRoom = this.Services.getDeviceByRoom(id);
        devicesByRoom.subscribe((res: any) => {
            if (res.code === 200) {
                // this statement will map the list of devices to the room.
                this.room_device_map.set(id, res.data);
                // this statement will just push the list of devices from different rooms into one array.
                this.all_devices_list.push(res.data);
                console.log('count is: ', count++);
            }
        }, err => {
            console.error('ERROR', err);
        });
        console.log('In all_devices_list is: ', this.all_devices_list);
    }
    console.log('Lalalalal');
}
The code above prints 'Lalalalal' first, followed by the console printout of the count variable. Understandably, the for...of loop is non-blocking...?
forkJoin is your friend.
(I'm on mobile, sorry for the brevity.)
The easiest way is forkJoin; in such situations you can use zip or combineLatest too.
Try this:
import { forkJoin } from 'rxjs';
...
let req1 = this.http.get('xxx');
let req2 = this.http.get('yyy');

forkJoin([req1, req2]).subscribe(res => {
    // res[0] is from req1
    // res[1] is from req2
});
combineLatest will emit once all 3 requests have emitted:
combineLatest(
    this.http.get('request1'),
    this.http.get('request2'),
    this.http.get('request3')
).subscribe(([response1, response2, response3]) => {
    // Here you can do stuff with response1, response2 and response3
});
I am creating a script in Node.js (v8.1.3) which looks at similar JSON data from multiple APIs and compares the values. To be more exact, I am looking at different market prices of different stocks (actually cryptocurrencies).
Currently, I am using promise.all to wait for all responses from the respective APIs.
let fetchedJSON =
await Promise.all([getJSON(settings1), getJSON(settings2), getJSON(settings3) ... ]);
However, Promise.all throws an error if even just one promise rejects. In the bluebird docs there is a function called Promise.some, which is almost what I want. As I understand it, it takes an array of promises and resolves with the two fastest promises to resolve, or otherwise (if fewer than 2 promises resolve) rejects with an error.
The problem with this is that, firstly, I don't want the two fastest promises to be what it returns; I want every successful promise to be returned, as long as there are at least 2. This seems to be what Promise.any does, except with a minimum count of 1 (I require a minimum count of 2).
Secondly, I don't know how many promises I will be awaiting (in other words, I don't know how many APIs I will be requesting data from). It may be only 2, or it may be 30. This depends on user input.
Writing this, it seems to me there is probably just a way to have a Promise.any with a count of 2, and that would be the easiest solution. Is this possible?
Btw, not sure if the title really summarizes the question. Please suggest an edit for the title :)
EDIT: Another way I might write the script is to start computing the first two API responses as soon as they load and push them to the browser, then do the same for every subsequent JSON as it loads and is computed. That way I am not waiting for all promises to be fulfilled before I start computing the data and passing results to the front end. Would this be possible with a function that also works for the other circumstances?
What I mean kind of looks like this:
Requesting JSON in parallel...
|-----JSON1------|
|---JSON-FAILS---| > catch error > do something with error. Doesn't affect next results.
|-------JSON2-------| > Meets minimum of 2 results > computes JSON > to browser.
|-------JSON3---------| > computes JSON > to browser.
How about chaining .then() to all of the promises so that none fail, passing those to Promise.all, and filtering the successful results in a final .then?
Something like this:
function some(promises, count = 1) {
    const wrapped = promises.map(promise => promise.then(value => ({ success: true, value }), () => ({ success: false })));
    return Promise.all(wrapped).then(function (results) {
        const successful = results.filter(result => result.success);
        if (successful.length < count)
            throw new Error("Only " + successful.length + " resolved.")
        return successful.map(result => result.value);
    });
}
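Usage might then look like this (assuming getJSON returns a promise, as in the question):
some([getJSON(settings1), getJSON(settings2), getJSON(settings3)], 2)
    .then(results => {
        // values of every promise that resolved (at least 2 of them)
        console.log(results);
    })
    .catch(err => console.log(err.message)); // fewer than 2 resolved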
This might be somewhat clunky, considering you're asking to implement an anti-pattern, but you can force each promise to resolve:
async function fetchAllJSON(settingsArray) {
    let fetchedJSON = await Promise.all(
        settingsArray.map((settings) => {
            // force rejected ajax to always resolve
            return getJSON(settings).then((data) => {
                // initial processing
                return { success: true, data }
            }).catch((error) => {
                // error handling
                return { success: false, error }
            })
        })
    ).then((unfilteredArray) => {
        // only keep successful promises
        return unfilteredArray.filter(({ success }) => success)
    })
    // do the rest of your processing here
    // with fetchedJSON containing array of data
}
You can use Promise.allSettled([]). The difference is that allSettled will return an array of objects after all the promises have settled, regardless of whether they succeeded or failed. Then just find the successful ones, or whatever you need:
let resArr = await Promise.allSettled(userNamesArr.map(user => this.authenticateUserPassword(user, password)));
return resArr.find(e => e.status != "rejected");
or, equivalently:
return resArr.find(e => e.status == "fulfilled");
The other answers have the downside of having to wait for all the promises to resolve, whereas ideally .some would return as soon as any (N) promise(s) passes the predicate.
let anyNPromises = (promises, predicate = a => a, n = 1) => new Promise(async resolve => {
    promises.forEach(async p => predicate(await p) && !--n && resolve(true));
    await Promise.all(promises);
    resolve(false);
});
let atLeast2NumbersGreaterThan5 = promises => anyNPromises(promises, a => a > 5, 2);

atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    Promise.resolve(10),
    Promise.resolve(11)
]).then(a => console.log('5, 3, 10, 11', a)); // true

atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    Promise.resolve(10),
    Promise.resolve(-43)
]).then(a => console.log('5, 3, 10, -43', a)); // false

atLeast2NumbersGreaterThan5([
    Promise.resolve(5),
    Promise.resolve(3),
    new Promise(() => 'never resolved'),
    Promise.resolve(10),
    Promise.resolve(11)
]).then(a => console.log('5, 3, unresolved, 10, 11', a)); // true
I am creating an API where, on GET, a series of calls to the News API are made, news article titles are extracted into a giant string, and that string is processed into an object to be delivered to a wordcloud on the front-end.
So far, I've been able to use underscore's _.after and request-promise to make my app wait till all API calls have completed before calling processWordBank(), which takes the giant string and cleans it up into an object. However, once processWordBank() is called, I don't understand where the flow of the program is. Ideally, processWordBank() returns obj to cloudObj in the router, so that the obj can be passed to res.json() and spit out as the response.
I believe my use of _.after has put me in a weird situation, but it's the only way I've been able to get async calls to finish before proceeding to the next desired action. Any suggestions?
(I've tried to leave out all unnecessary code but let me know if this is insufficient)
// includes...

var sourceString = ""

// router
export default ({ config }) => {
    let news = Router()
    news.get('/', function(req, res){
        var cloudObj = getSources()
        res.json({ cloudObj })
    })
    return news
}

// create list of words (sourceString) by pulling news data from various sources
function getSources() {
    return getNewsApi()
}

// NEWS API
// GET top 10 news article titles from News API (news sources are determined by the values of newsApiSource array)
function getNewsApi() {
    var finished = _.after(newsApiSource.length, processWordBank)
    for (var i = 0; i < newsApiSource.length; i++) {
        let options = {
            uri: 'https://newsapi.org/v1/articles?source=' + newsApiSource[i] + '&sortBy=' + rank + '&apiKey=' + apiKey,
            json: true
        }
        rp(options)
            .then(function (res) {
                let articles = res.articles // grab article objects from the response
                let articleTitles = " " + _.pluck(articles, 'title') // extract title of each news article
                sourceString += " " + articleTitles // add all titles to the word bank
                finished() // this async task has finished
            })
            .catch(function (err) {
                console.log(err)
            })
    }
}

// analyse word bank for patterns/trends
function processWordBank() {
    var sourceArray = refineSource(sourceString)
    sourceArray = combineCommon(sourceArray)
    sourceArray = getWordFreq(sourceArray)
    var obj = sortToObject(sourceArray[0], sourceArray[1])
    console.log(obj)
    return obj
}
A big issue in your asynchronous flow is that you use a shared variable sourceString to handle the results. When you have multiple calls to getNewsApi() your result is not predictable and will not always be the same, because there is no predefined order in which the asynchronous calls are executed. Not only that, but you never reset it, so all subsequent calls will also include the results of the previous calls. Avoid modifying shared variables in asynchronous calls and instead use the results directly.
I've been able to use underscore's _.after and request-promise to make my app wait till all API calls have completed before calling processWordBank()
Although it would possible to use _.after, this can be done very nicely with promises, and since you're already using promises for your requests, it's just a matter of collecting the results from them. So because you want to wait until all API calls are completed you can use Promise.all which returns a promise that resolves with an array of the values of all the promises, once all of them are fulfilled. Let's have a look at a very simple example to see how Promise.all works:
// Promise.resolve() creates a promise that is fulfilled with the given value
const p1 = Promise.resolve('a promise')
// A promise that completes after 1 second
const p2 = new Promise(resolve => setTimeout(() => resolve('after 1 second'), 1000))
const p3 = Promise.resolve('hello').then(s => s + ' world')
const promises = [p1, p2, p3]
console.log('Waiting for all promises')
Promise.all(promises).then(results => console.log('All promises finished', results))
console.log('Promise.all does not block execution')
Now we can modify getNewsApi() to use Promise.all. The array of promises given to Promise.all contains all the API requests you're doing in your loop, and will be created with Array.prototype.map. Also, instead of creating a string out of the array returned from _.pluck, we can just use the array directly, so you don't need to parse the string back into an array at the end.
function getNewsApi() {
    // Each element is a request promise
    const apiCalls = newsApiSource.map(function (source) {
        let options = {
            uri: 'https://newsapi.org/v1/articles?source=' + source + '&sortBy=' + rank + '&apiKey=' + apiKey,
            json: true
        }
        return rp(options)
            .then(function (res) {
                let articles = res.articles
                let articleTitles = _.pluck(articles, 'title')
                // The promise is fulfilled with the articleTitles
                return articleTitles
            })
            .catch(function (err) {
                console.log(err)
            })
    })
    // Return the promise that is fulfilled with all request values
    return Promise.all(apiCalls)
}
Then we need to use the values in the router. We know that the promise returned from getNewsApi() fulfils with an array of all the requests, which by themselves return an array of articles. That is a 2d array, but presumably you would want a 1d array with all the articles for your processWordBank() function, so we can flatten it first.
export default ({ config }) => {
    let news = Router()
    news.get('/', (req, res) => {
        const cloudObj = getSources()
        cloudObj.then(function (apiResponses) {
            // Flatten the array
            // From: [['source1article1', 'source1article2'], ['source2article1'], ...]
            // To: ['source1article1', 'source1article2', 'source2article1', ...]
            const articles = [].concat.apply([], apiResponses)
            // Pass the articles as parameter
            const processedArticles = processWordBank(articles)
            // Respond with the processed object
            res.json({ processedArticles })
        })
    })
    return news
}
And finally processWordBank() needs to be changed to use an input parameter instead of using the shared variable. refineSource is no longer needed, because you're already passing an array (unless you do some other modifications in it).
function processWordBank(articles) {
    let sourceArray = combineCommon(articles)
    sourceArray = getWordFreq(sourceArray)
    var obj = sortToObject(sourceArray[0], sourceArray[1])
    console.log(obj)
    return obj
}
As a bonus the router and getNewsApi() can be cleaned up with some ES6 features (without the comments from the snippets above):
export default ({ config }) => {
    const news = Router()
    news.get('/', (req, res) => {
        getSources().then(apiResponses => {
            const articles = [].concat(...apiResponses)
            const processedArticles = processWordBank(articles)
            res.json({ processedArticles })
        })
    })
    return news
}

function getNewsApi() {
    const apiCalls = newsApiSource.map(source => {
        const options = {
            uri: `https://newsapi.org/v1/articles?source=${source}&sortBy=${rank}&apiKey=${apiKey}`,
            json: true
        }
        return rp(options)
            .then(res => _.pluck(res.articles, 'title'))
            .catch(err => console.log(err))
    })
    return Promise.all(apiCalls)
}