How to execute promises in order? - javascript

I can't make my code run in order. I need the connection test to come first, and then the functions to resolve in order so they build a text string that is sent as a tweet with an NPM package. (This is not my real code, just a summarized example.)
I've tried many things and my brain is on fire.
// Test DB connection
db.authenticate()
  .then(() => {
    const server = http.createServer(app)
    server.listen(config.port, () => {
      console.log(`http://localhost:${config.port}`)
    })
    reload(app)
  })
  .catch(err => {
    console.log(`Error: ${err}`)
  })
// Functions
resumen.man = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is good.'
    })
}
resumen.man1 = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is bad.'
    })
}
resumen.man2 = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is big.'
    })
}
// Execute resumen.man(1) first and save text in $varStringMultiLine ?
// Execute resumen.man1(1) later and save text in the same $varStringMultiLine ?
// Execute resumen.man2(1) last and save text in the same $varStringMultiLine ?
sendTweet($varStringMultiLine)
Thanx.

As commented by @Barmar and @some, you could chain the promises with .then or use async / await. I would recommend the latter, since .then chaining gets unwieldy fast.
This is a really good explanation for async / await: https://javascript.info/async-await
Basically, you can use
await db.authenticate();
to halt the code and not execute the next line before the promise is resolved. However, await only works inside an async function, so this itself runs asynchronously and doesn't freeze the whole program.
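For example, here is a minimal sketch in async / await form based on the question's code (Registries, db, sendTweet and the other names come from the question; getText is a hypothetical helper that returns the promise instead of discarding it):
const getText = (suffix) =>
  Registries.findOne({}).then((registries) => `${registries.name} ${suffix}`)

const start = async () => {
  try {
    // 1. Test the DB connection first
    await db.authenticate()
    const server = http.createServer(app)
    server.listen(config.port, () => console.log(`http://localhost:${config.port}`))
    reload(app)

    // 2. Resolve the texts strictly in order and build one string
    let varStringMultiLine = await getText('is good.')
    varStringMultiLine += '\n' + await getText('is bad.')
    varStringMultiLine += '\n' + await getText('is big.')

    // 3. Finally send the tweet
    await sendTweet(varStringMultiLine)
  } catch (err) {
    console.log(`Error: ${err}`)
  }
}

start()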

Related

How to test calling to an async not-awaited function

I want to test the execution of both functionUT and an inner async, un-awaited function externalCall passed by injection. The following code is a simple working example of my functions and their usage:
const sleep = async (ms) => new Promise((accept) => setTimeout(() => accept(), ms))
const callToExternaService = async () => sleep(1000)
const backgroundJob = async (externalCall) => {
  await sleep(500) // Simulate in-app work
  await externalCall() // Simulate external call
  console.log('bk job done')
  return 'job done'
}
const appDeps = {
  externalService: callToExternaService
}
const functionUT = async (deps) => {
  await sleep(30) // Simulate func work
  // await backgroundJob(deps.externalService) // This makes the test work but slows down functionUT execution
  backgroundJob(deps.externalService) // I don't want to wait, for performance reasons
    .then(() => console.log('bk job ok'))
    .catch(() => console.log('bk job error'))
  return 'done'
}
functionUT(appDeps)
  .then((result) => console.log(result))
  .catch(err => console.log(err))
module.exports = {
  functionUT
}
Here is a simple Jest test case that fails, but only for timing reasons:
const { functionUT } = require('./index')
describe('test', () => {
  it('should pass', async () => {
    const externaServiceMock = jest.fn()
    const fakeDeps = {
      externalService: externaServiceMock
    }
    const result = await functionUT(fakeDeps)
    expect(result).toBe('done')
    expect(externaServiceMock).toBeCalledTimes(1) // Fails here, but only for timing reasons
  })
})
What is the correct way to test that externaServiceMock gets called (so the test passes) without slowing down functionUT?
I have already found similar questions, but they treat only a simplified version of the problem.
how to test an embedded async call
Indeed, you can't test that callToExternaService is called "somewhen later".
You can however mock backgroundJob and test that it was called with the expected arguments (before functionUT completes), as well as unit test backgroundJob on its own.
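For instance, here is a rough sketch of that approach, assuming backgroundJob has been extracted into its own module (say './backgroundJob') and functionUT requires it from there; it cannot be mocked while it is declared in the same file as functionUT:
jest.mock('./backgroundJob')
const backgroundJob = require('./backgroundJob')
const { functionUT } = require('./index')

it('kicks off the background job without awaiting it', async () => {
  backgroundJob.mockResolvedValue('job done')
  const externaServiceMock = jest.fn()

  const result = await functionUT({ externalService: externaServiceMock })

  expect(result).toBe('done')
  // functionUT only has to hand the dependency to backgroundJob;
  // backgroundJob itself is unit tested on its own.
  expect(backgroundJob).toHaveBeenCalledWith(externaServiceMock)
})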
If a promise exists but cannot be reached from a place that relies on its settlement, this is a potential design problem. A module that performs asynchronous side effects on import is another problem. Both concerns affect testability, and they can also affect the application if the way it works changes.
Considering there's a promise, you have the option to chain or not chain it in a specific place. This doesn't mean it should be thrown away. In this specific case it can be returned from a function that doesn't chain it.
A common way to do this is to preserve a promise at every point in case it's needed later, at least for testing purposes, but possibly also for clean shutdown, extending, etc.
const functionUT = async (deps) => {
  await sleep(30) // Simulate func work
  return {
    status: 'done',
    backgroundJob: backgroundJob(deps.externalService)...
  };
}
const initialization = functionUT( appDeps )...
module.exports = {
  functionUT,
  initialization
}
In this form it's supposed to be tested like:
beforeAll(async () => {
  let result = await initialization;
  await result.backgroundJob;
});
...
let result = await functionUT(fakeDeps);
expect(result.status).toBe('done')
await result.backgroundJob;
expect(externaServiceMock).toBeCalledTimes(1);
Not waiting for initialization can result in an open handle if the test suite is short enough, and cause a corresponding warning from Jest.
The test can be made faster by using Jest fake timers in the right places together with flush-promises.
The functionUT( appDeps ) call can be extracted from the module so that the side effect happens only where it's needed, e.g. in the entry point. This way it won't interfere with the rest of the tests that use this module. Also, at least some functions can be extracted to their own modules to be mockable and improve testability (backgroundJob, as another answer suggests), because they cannot be mocked separately while they are declared in the same module the way they are.

Array of filtered axios results from paginated API is empty

In my code below I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once for testing. The API is paged. If the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me to the right direction?
const axios = require("axios");
function getToken() {
  // Get the token
}
function getIds(jwt) {
  return new Promise((resolve) => {
    let pageNumber = 1;
    const filterdIds = [];
    const config = {
      // Config stuff
    };
    do {
      axios(config)
        .then((response) => {
          response.forEach(element => {
            // Some logic, if true then:
            filterdIds.push(element.id);
            console.log(filterdIds);
          });
        })
        .catch(error => {
          console.log(error);
        });
    } while (pageNumber != 1)
    resolve(filterdIds);
  });
}
getToken()
  .then(token => {
    return token;
  })
  .then(jwt => {
    return getIds(jwt);
  })
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously before the requests fire, so it's guaranteed to be empty.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests--more on that later). These methods run the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop
axios
  .get("https://httpbin.org") // some initial request like getToken
  .then(response => // response has the token, ignored for simplicity
    Promise.all(
      Array(pages).fill().map((_, i) => // make an array of request promises
        axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
      )
    )
  )
  .then(responses => {
    // perform your filter/reduce on the response data
    const results = responses.flatMap(response =>
      response.data
        .filter(e => e.id % 2 === 0) // some silly filter
        .map(({id, name}) => ({id, name}))
    );
    // use the results
    console.log(results);
  })
  .catch(err => console.error(err));
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel:
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
  // like getToken; should handle err
  const tokenStub = await axios.get("https://httpbin.org");
  const results = [];
  // page += 10 to make the snippet run faster; you'd probably use page++
  for (let page = 1;; page += 10) {
    try {
      const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
      const response = await axios.get(url);
      // check whatever condition your API sends to tell you no more pages
      if (response.data.length === 0) {
        break;
      }
      for (const comment of response.data) {
        if (comment.id % 2 === 0) { // some silly filter
          const {name, id} = comment;
          results.push({name, id});
        }
      }
    }
    catch (err) { // hit the end of the pages or some other error
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall:
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
  const results = [];
  const chunk = 5;
  for (let page = 1;; page += chunk) {
    try {
      const responses = await Promise.all(
        Array(chunk).fill().map((_, i) =>
          axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
        )
      );
      for (const response of responses) {
        for (const comment of response.data) {
          const {name, id} = comment;
          results.push({name, id});
        }
      }
      // check end condition
      if (responses.some(e => e.data.length === 0)) {
        break;
      }
    }
    catch (err) {
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(the above image is an excerpt of the 100 requests, but the chunk size of 5 at a time is visible)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller--don't try to turn it into synchronous code.
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous JavaScript code work together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios call returns a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code is telling it to do some stuff when that pending value arrives (the callback in the .then() method).
But your main getIds function isn't going to wait around: it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which returns a value from the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve call for the promise you're creating into the callback that the resolved axios Promise invokes, it started working, because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
  const pagePromises = Array(pages).fill(null).map((_, index) => {
    const page = startingPage + index;
    // set the page however you do it with axios query params
    config.page = page;
    return axios(config);
  });
  // get the last page you attempted, and if it doesn't meet whatever
  // criteria you have to finish the query, submit another batch query
  const lastPage = await pagePromises[pagePromises.length - 1];
  // the result from getIds is an array of ids, so we recursively get the rest of the pages here
  // and have a single-level array of ids (or an empty array if there were no more pages to fetch)
  const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);
  // now we wait for all page queries to resolve and extract the ids
  const resolvedPages = await Promise.all(pagePromises);
  const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);
  // and finally merge the ids fetched in this method's invocation with any fetched recursively
  return [...resolvedIds, ...additionalIds];
}
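A hypothetical usage sketch (the batch size of 5 is arbitrary; config and the done flag are placeholders from the snippet above):
// fetch pages 5 at a time starting from page 1, then use the flattened ids
getIds(1, 5)
  .then(ids => console.log('all ids:', ids))
  .catch(err => console.error(err))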

JS: what's a use-case of Promise.resolve()

I am looking at https://www.promisejs.org/patterns/ and it mentions it can be used if you need a value in the form of a promise like:
var value = 10;
var promiseForValue = Promise.resolve(value);
What would be the use of a value in promise form though since it would run synchronously anyway?
If I had:
var value = 10;
var promiseForValue = Promise.resolve(value);
promiseForValue.then(resp => {
  myFunction(resp)
})
wouldn't just using value without it being a Promise achieve the same thing:
var value = 10;
myFunction(10);
Say you write a function that sometimes fetches something from a server, but other times returns immediately; you will probably want that function to always return a promise:
function myThingy() {
  if (someCondition) {
    return fetch('https://foo');
  } else {
    return Promise.resolve(true);
  }
}
It's also useful if you receive some value that may or may not be a promise. You can pass it through Promise.resolve, and now you are sure you have a promise:
const myValue = someStrangeFunction();
// Guarantee that myValue is a promise
Promise.resolve(myValue).then( ... );
In your examples, yes, there's no point in calling Promise.resolve(value). The use case is when you do want to wrap your already existing value in a Promise, for example to maintain the same API from a function. Let's say I have a function that conditionally does something that would return a promise — the caller of that function shouldn't be the one figuring out what the function returned, the function itself should just make that uniform. For example:
const conditionallyDoAsyncWork = (something) => {
  if (something == somethingElse) {
    return Promise.resolve(false)
  }
  return fetch(`/foo/${something}`)
    .then((res) => res.json())
}
Then users of this function don't need to check if what they got back was a Promise or not:
const doSomethingWithData = () => {
  conditionallyDoAsyncWork(someValue)
    .then((result) => result && processData(result))
}
As a side note, using async/await syntax both hides that and makes it a bit easier to read, because any value you return from an async function is automatically wrapped in a Promise:
const conditionallyDoAsyncWork = async (something) => {
  if (something == somethingElse) {
    return false
  }
  const res = await fetch(`/foo/${something}`)
  return res.json()
}
const doSomethingWithData = async () => {
  const result = await conditionallyDoAsyncWork(someValue)
  if (result) processData(result)
}
Another use case: a dead simple async queue using Promise.resolve() as the starting point.
let current = Promise.resolve();
function enqueue(fn) {
  current = current.then(fn);
}
enqueue(async () => { console.log("async task") });
Edit, in response to OP's question.
Explanation
Let me break it down for you step by step.
enqueue(task) adds the task function as a callback to promise.then, and replaces the original current promise reference with the newly returned thenPromise.
current = Promise.resolve()
thenPromise = current.then(task)
current = thenPromise
As per the promise spec, if the task function in turn returns yet another promise (call it task() -> taskPromise), then thenPromise will only resolve when taskPromise resolves. thenPromise is practically equivalent to taskPromise; it's just a wrapper. Let's rewrite the above code into:
current = Promise.resolve()
taskPromise = current.then(task)
current = taskPromise
So if you go like:
enqueue(task_1)
enqueue(task_2)
enqueue(task_3)
it expands into
current = Promise.resolve()
task_1_promise = current.then(task_1)
task_2_promise = task_1_promise.then(task_2)
task_3_promise = task_2_promise.then(task_3)
current = task_3_promise
This effectively forms a linked-list-like structure of promises that executes the task callbacks in sequential order.
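Here's a small self-contained sketch (the delays are made up) showing that tasks finish in enqueue order even when a later task is faster:
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

let current = Promise.resolve();
function enqueue(fn) {
  current = current.then(fn);
}

// task 2 is faster, but still logs after task 1
enqueue(async () => { await sleep(200); console.log('task 1 done'); });
enqueue(async () => { await sleep(50); console.log('task 2 done'); });
enqueue(async () => { console.log('task 3 done'); });
// logs: task 1 done, task 2 done, task 3 done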
Usage
Let's study a concrete scenario. Imagine you need to handle websocket messages in sequential order.
Say you need to do some heavy computation upon receiving messages, so you decide to send it off to a worker thread pool. Then you write the processed result to another message queue (MQ).
But here's the requirement: the MQ expects the messages to be written in the same order they came in from the websocket stream. What do you do?
Suppose you cannot pause the websocket stream; you can only handle the messages locally ASAP.
Take One:
websocket.on('message', (msg) => {
  sendToWorkerThreadPool(msg).then(result => {
    writeToMessageQueue(result)
  })
})
This may violate the requirement, because sendToWorkerThreadPool may not return results in the original order: since it's a pool, some threads may return faster if their workload is lighter.
Take Two:
websocket.on('message', (msg) => {
  const task = () => sendToWorkerThreadPool(msg).then(result => {
    writeToMessageQueue(result)
  })
  enqueue(task)
})
This time we enqueue (defer) the whole process, so we can ensure the task execution order stays sequential. But there's a drawback: we lose the benefit of the thread pool, because each sendToWorkerThreadPool call only fires after the previous one completes. This model is equivalent to using a single worker thread.
Take Three:
websocket.on('message', (msg) => {
  const promise = sendToWorkerThreadPool(msg)
  const task = () => promise.then(result => {
    writeToMessageQueue(result)
  })
  enqueue(task)
})
The improvement over take two is that we call sendToWorkerThreadPool ASAP, without deferring, but we still enqueue/defer the writeToMessageQueue part. This way we make full use of the thread pool for computation, while still ensuring a sequential writing order to the MQ.
I rest my case.

Promise.all results are as expected, but individual items showing undefined

First of all, there are known issues with console.log in Google Chrome not functioning as expected; that is not the case here, as I am working in VSCode.
We begin with two async calls to the server.
promise_a = fetch(url)
promise_b = fetch(url)
Since the fetch results are also promises, .json() will need to be called on each item. The helper function process will be used, as suggested by a Stack Overflow user (sorry, lost the link).
let promiseResults = []
let process = prom => {
  prom.then(data => {
    promiseResults.push(data);
  });
};
Promise.all is called. The resulting array is passed to .then, where forEach calls process on item.json() for each item, and fulfilled promises are pushed to promiseResults.
Promise.all([promise_a, promise_b])
  .then(responseArr => {
    responseArr.forEach(item => {
      process(item.json());
    });
  })
No argument is given to the final .then block because promiseResults is in the outer scope. console.log shows confusing results.
.then(() => {
  console.log(promiseResults); // correct results
  console.log(promiseResults[0]); // undefined ?!?
})
Any help will be greatly appreciated.
If you are familiar with async/await syntax, I would suggest not using an external variable promiseResults, but returning the results on the fly with this function:
async function getJsonResults(promisesArr) {
  // Get fetch promises response
  const results = await Promise.all(promisesArr);
  // Get JSON from each response promise
  const jsonResults = await Promise.all(results.map(r => r.json()));
  return jsonResults
}
This is a usage example:
promise_a = fetch(url1)
promise_b = fetch(url2)
getJsonResults([promise_a, promise_b])
.then(theResults => console.log('All results:', theResults))
Use the theResults variable to extract the results you need.
You can try this; it looks like the array loop is not executing properly in the promise environment.
Specifically, promiseResults is filled after you log it.
var resultAll = Promise.all([promise_a, promise_b])
  .then(responseArr => {
    return Promise.all(responseArr.map(item => item.json()));
  });
resultAll.then(promiseResults => {
  console.log(promiseResults);
});

Firestore cloud functions summing subcollections values

Sup guys, I'm working on this Firebase project and I need to iterate through a subcollection of all sales of all stores in the root collection and sum their values... the problem is that the sum gets printed before the iteration. I'm new to TS and Firebase... this is what I've got so far:
export const newBilling = functions.firestore.document('billings/{billId}').onCreate(event =>
{
  const valueArray = []
  const feeArray = []
  const storesCollection = afs.collection('stores').where('active', '==', true).get().then(stores => {
    stores.forEach(store => {
      const salesCollection = afs.collection('stores').doc(store.id).collection('sales').get().then(sales => {
        sales.forEach(sale => {
          return valueArray.push(sale.data().value) + feeArray.push(sale.data().fee)
          // other approach
          // valueArray.push(sale.data().value)
          // feeArray.push(sale.data().fee)
        })
      })
    })
  }).catch(error => {console.log(error)})
  let cashbackSum, feeSum : number
  cashbackArray.forEach(value => {
    cashbackSum += value
  })
  feeArray.forEach(value => {
    feeSum += value
  })
  console.log(cashbackSum, feeSum)
  return 0
})
TKS =)
You're not using promises correctly. You've got a lot of get() method calls, each of which is asynchronous and returns a promise, but you're never using them to make the entire function wait for all the work to complete. Calling then() doesn't actually make your code wait; it just registers the next bit of code to run later and returns another promise. Your final console.log executes first because none of the work you kicked off ahead of it has completed yet.
Your code actually needs to be substantially different in order to work correctly, and you need to return a promise from the entire function that resolves only after all the work is complete.
You can learn better how to use promises in Cloud Functions by going through the video tutorials.
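As a rough sketch only (collection and field names come from the question, and the aggregation is simplified), the function could wait on all the nested queries before summing, and return the whole chain so Cloud Functions waits for it:
export const newBilling = functions.firestore.document('billings/{billId}').onCreate(event => {
  // Return the promise chain so the function only finishes when all the work is done
  return afs.collection('stores').where('active', '==', true).get()
    .then(stores =>
      // Wait for every store's sales query before continuing
      Promise.all(stores.docs.map(store =>
        afs.collection('stores').doc(store.id).collection('sales').get()
      ))
    )
    .then(salesSnapshots => {
      let valueSum = 0
      let feeSum = 0
      salesSnapshots.forEach(sales => {
        sales.forEach(sale => {
          valueSum += sale.data().value
          feeSum += sale.data().fee
        })
      })
      console.log(valueSum, feeSum)
      return 0
    })
    .catch(error => console.log(error))
})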
