I want to make a series of ajax requests to a server and then make a final request that uses data received from the earlier ones. Obviously, I need to wait for the earlier requests to finish before making the final one. I'm having trouble implementing this in JavaScript.
I don't want to overwhelm the server, so ideally all requests would be done sequentially.
My simple test code is as follows (replacing web requests with a sleep):
const sleep = (delay) => new Promise((resolve) => setTimeout(resolve, delay));
var urls = ['1', '2', '3'];
const slowFunc = () => {
  urls.forEach(async (url) => {
    //Don't change this section!
    console.log("a"+url);
    await sleep(5000);
    console.log("b"+url); //I want this to run before c
  });
};
slowFunc();
console.log("c");
This prints "c" before by sleep is finished, which is wrong. How can I get the output to be as follows?
a1
b1
a2
b2
a3
b3
c
Out of interest, how would I get this output? (The exact ordering within the a and b sections is unimportant.)
a1
a2
a3
b1
b2
b3
c
I tried reading ES2018: asynchronous iteration but it blew my mind.
Update: I quickly had second thoughts about my example, so here is a better one (that still doesn't work):
var urls = ['https://cdnjs.cloudflare.com/ajax/libs/dompurify/2.2.0/purify.min.js', 'https://cdnjs.cloudflare.com/ajax/libs/systemjs/6.8.3/system.min.js', 'https://cdnjs.cloudflare.com/ajax/libs/slim-select/1.18.6/slimselect.min.js'];
var results = {};
const webRequest = (url) => {
  $.ajax({
    type: "GET",
    url: url,
  }).then(data => {
    results[url] = data;
    console.log("b"+url+","+results[url]); //I want this to run before c
  });
}
const slowFunc = () => {
  urls.forEach((url) => {
    console.log("a"+url);
    webRequest(url);
  });
};
slowFunc();
console.log("c");
Thanks for the comments so far.
Update 2: Solution to the web request problem, based on Antonio Della Fortuna's advice:
var urls = ['https://cdnjs.cloudflare.com/ajax/libs/dompurify/2.2.0/purify.min.js', 'https://cdnjs.cloudflare.com/ajax/libs/systemjs/6.8.3/system.min.js', 'https://cdnjs.cloudflare.com/ajax/libs/slim-select/1.18.6/slimselect.min.js'];
var results = {};
const webRequest = (url) => {
  return new Promise((resolve, reject) => {
    $.ajax({
      type: "GET",
      url: url,
      error: function (data, status, er) {
        console.log("b,"+url+",failed");
        resolve();
      },
    }).then(data => {
      results[url] = data;
      console.log("b,"+url+","+results[url]); //I want this to run before c
      resolve();
    });
  });
}
const slowFunc = async () => {
  for (let i = 0; i < urls.length; i++) {
    var url = urls[i];
    console.log("a,"+url);
    await webRequest(url);
  }
};
slowFunc().then(() => {
  console.log("c");
  console.log(results);
});
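As an aside, the explicit new Promise wrapper isn't strictly required here: $.ajax already returns a promise-like jqXHR object (Promise-compatible in jQuery 3+), so webRequest can be slimmed down. A minimal sketch, assuming jQuery 3+:
const webRequest = async (url) => {
  try {
    // the jqXHR returned by $.ajax is thenable, so it can be awaited directly
    const data = await $.ajax({ type: "GET", url: url });
    results[url] = data;
    console.log("b," + url + "," + results[url]);
  } catch (err) {
    console.log("b," + url + ",failed"); // swallow the error so the loop continues
  }
};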
There are two ways to do this, depending on your use case, and you can find the working example here -> https://codesandbox.io/s/zen-carson-ksgzf?file=/src/index.js:569-624:
Parallel Solution: You could run the requests inside the function in parallel and then print "c" like so:
const sleep = (delay) => new Promise((resolve) => setTimeout(resolve, delay));
var urls = ["1", "2", "3"];
const slowFunc = async () => {
  await Promise.all(
    urls.map(async (url) => {
      //Don't change this section!
      console.log("a" + url);
      await sleep(5000);
      console.log("b" + url); //I want this to run before c
    })
  );
};
slowFunc().then(() => {
  console.log("c");
});
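One caveat: Promise.all rejects as soon as any one of its promises rejects. If some requests may fail and you still want to wait for the rest, Promise.allSettled waits for every promise regardless of outcome. A sketch of that variant, reusing the same sleep helper:
const slowFuncTolerant = async () => {
  const outcomes = await Promise.allSettled(
    urls.map(async (url) => {
      console.log("a" + url);
      await sleep(5000);
      console.log("b" + url);
    })
  );
  // each outcome is { status: "fulfilled", value } or { status: "rejected", reason }
  outcomes.forEach((o, i) => console.log(urls[i], o.status));
};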
Sequential Solution: You could run the requests one at a time, waiting for each to finish before starting the next:
const slowFuncSeq = async () => {
  for (const url of urls) {
    //Don't change this section!
    console.log("a" + url);
    await sleep(5000);
    console.log("b" + url); //I want this to run before c
  }
};
slowFuncSeq().then(() => {
  console.log("c");
});
Performing async operations while iterating does not work the way you might expect.
forEach iterates over the elements synchronously: it invokes the callback for each element without waiting for the promise an async callback returns, which is why you see the 'a' log for every element first.
The exception is a for...of loop inside an async function, where await suspends the loop itself; with the array iteration methods, the await only blocks inside the callback.
If you are attempting to limit the number of requests to an API over time, you could implement a leaky bucket algorithm. Alternatively, you can refactor your iteration to a for...of loop with your delay function; this blocks between requests and maintains the sequence, but is less optimal, since the pace of requests becomes your delay time plus the time the other async work takes to finish. A sketch of both options follows below.
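Here is that sketch; the interval length is arbitrary, and sleep/webRequest stand in for your real delay and request functions:
// Option 1: sequential for...of, at most one request in flight, paced by the delay.
const sequential = async (urls) => {
  for (const url of urls) {
    await sleep(1000);     // pacing delay
    await webRequest(url); // the actual request
  }
};

// Option 2: crude leaky bucket, draining one queued request per tick
// regardless of how fast requests arrive. (Sketch only: the interval is never cleared.)
const makeLeakyBucket = (intervalMs) => {
  const queue = [];
  setInterval(() => {
    const job = queue.shift();
    if (job) job(); // run one queued request per interval
  }, intervalMs);
  return (fn) => new Promise((resolve) => queue.push(() => resolve(fn())));
};

// Usage: const enqueue = makeLeakyBucket(1000);
// urls.forEach((url) => enqueue(() => webRequest(url)));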
Related
There is a function:
export function getImage(requestParameters: TRequestParameters): TRequest<TResponse<ImageBitmap | HTMLImageElement>> {
    const request = helper.getArrayBuffer(requestParameters);
    return {
        response: (async () => {
            const response = await request.response;
            const image = await arrayBufferToCanvasImageSource(response.data);
            return {
                data: image,
                cacheControl: response.cacheControl,
                expires: response.expires
            };
        })(),
        cancel: request.cancel
    };
}
It is synchronous, but returns an object consisting of two fields: response, a Promise which resolves to an object (3 fields: data, cacheControl, and expires, but that's not interesting for us), and cancel, a method that cancels the request.
The function works as expected and everything about it is just fine. However, I need to implement an additional constraint: the number of parallel (simultaneous) network requests at any given point in time must not exceed n.
Thus, if n === 0, no request should be made. If n === 1, only one image can be loaded at a time (that is, all images are loaded sequentially). For n === m with m > 1, no more than m images can be loaded simultaneously.
My solution
Based on the fact that the getImage function is synchronous, the line
const request = helper.getArrayBuffer(requestParameters);
is executed immediately when getImage is called. That's not what we want, though; we need to postpone the execution of the request itself. Therefore, we replace the request variable with a requestMaker function, which we call only when we need it:
export function getImage(requestParameters: TRequestParameters): TRequest<TResponse<ImageBitmap | HTMLImageElement>> {
    if (webpSupported.supported) {
        if (!requestParameters.headers) requestParameters.headers = {};
        requestParameters.headers['Accept'] = 'image/webp,*/*';
    }
    function requestMaker() {
        const request = helper.getArrayBuffer(requestParameters);
        return request;
    }
    return {
        response: (async () => {
            const response = await requestMaker().response;
            const image = await arrayBufferToCanvasImageSource(response.data);
            return {
                data: image,
                cacheControl: response.cacheControl,
                expires: response.expires
            };
        })(),
        cancel() {
            //
        }
    };
}
(Let's omit cancel for now, for the sake of simplicity.)
Now the execution of this requestMaker function, which makes the request itself, needs to be postponed until some point.
Suppose now we are trying to solve the problem only for n === 1.
Let's create an array in which we will store all requests that are currently running:
const ongoingImageRequests = [];
Now, inside requestMaker, we add each request to this array as soon as it starts, and remove it as soon as we receive a response:
const ongoingImageRequests = [];
export function getImage(requestParameters: TRequestParameters): TRequest<TResponse<ImageBitmap | HTMLImageElement>> {
    if (webpSupported.supported) {
        if (!requestParameters.headers) requestParameters.headers = {};
        requestParameters.headers['Accept'] = 'image/webp,*/*';
    }
    function requestMaker() {
        const request = helper.getArrayBuffer(requestParameters);
        ongoingImageRequests.push(request);
        request.response.finally(() => ongoingImageRequests.splice(ongoingImageRequests.indexOf(request), 1));
        return request;
    }
    return {
        response: (async () => {
            const response = await requestMaker().response;
            const image = await arrayBufferToCanvasImageSource(response.data);
            return {
                data: image,
                cacheControl: response.cacheControl,
                expires: response.expires
            };
        })(),
        cancel() {
            //
        }
    };
}
All that's left is to add a restriction on launching requestMaker: before starting it, we wait until all the requests already in the array have finished:
const ongoingImageRequests = [];
export function getImage(requestParameters: TRequestParameters): TRequest<TResponse<ImageBitmap | HTMLImageElement>> {
    if (webpSupported.supported) {
        if (!requestParameters.headers) requestParameters.headers = {};
        requestParameters.headers['Accept'] = 'image/webp,*/*';
    }
    function requestMaker() {
        const request = helper.getArrayBuffer(requestParameters);
        ongoingImageRequests.push(request);
        request.response.finally(() => ongoingImageRequests.splice(ongoingImageRequests.indexOf(request), 1));
        return request;
    }
    return {
        response: (async () => {
            await Promise.allSettled(ongoingImageRequests.map(ongoingImageRequest => ongoingImageRequest.response));
            const response = await requestMaker().response;
            const image = await arrayBufferToCanvasImageSource(response.data);
            return {
                data: image,
                cacheControl: response.cacheControl,
                expires: response.expires
            };
        })(),
        cancel() {
            //
        }
    };
}
I understand it this way: when getImage is called (from somewhere outside), it immediately returns an object whose response field is a Promise, which will resolve no earlier than the moment when all the other requests in the queue have completed.
But, as it turns out, this solution for some reason does not work. The question is why? And how to make it work? At least for n === 1.
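For what it's worth, one plausible explanation: if several getImage calls start in the same tick, ongoingImageRequests is still empty when each of them snapshots it via Promise.allSettled, so none of them waits; allSettled also only observes the requests present at call time, not ones registered later. A common workaround for n === 1 is to chain every call onto one shared promise that is updated synchronously. A minimal sketch with hypothetical names (ignoring cancel and the TypeScript types):
let lastRequest = Promise.resolve();

function getImageLimited(requestParameters) {
  const response = lastRequest
    .catch(() => {}) // one failed request must not block the whole chain
    .then(() => requestMaker(requestParameters).response);
  // lastRequest is reassigned synchronously, before any await runs,
  // so two calls made in the same tick still queue behind each other.
  lastRequest = response;
  return { response };
}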
I know the title is quite generic, but I am inserting 1 million records into an AWS DynamoDB table, and it currently takes ~30 minutes to load.
I have the 1 million records in memory; I just need to improve the speed of inserting the items. AWS only allows sending batches of 25 records, and all my code is synchronous.
Usually each object holds a very small amount of data (e.g. 3-5 properties with numeric ids).
I read the 1 million entries from a CSV and basically store them in a data array.
Then I do this:
await DatabaseHandler.batchWriteItems('myTable', data); // data length is 1 Million
Which calls my insert function
const documentClient = new DynamoDB.DocumentClient();

export class DatabaseHandler {
  static batchWriteItems = async (tableName: string, data: {}[]) => {
    // AWS only allows batches of max 25 items
    while (data.length) {
      const batch = data.splice(0, 25);
      const putRequests = batch.map((elem) => {
        return {
          PutRequest: {
            Item: elem
          }
        };
      });
      const params = {
        RequestItems: {
          [tableName]: putRequests,
        },
      };
      await documentClient.batchWrite(params).promise();
    }
  }
}
I believe I am making 40,000 HTTP requests, each creating 25 records in the database.
Is there any way to improve this? Even some ideas would be great.
Your code is "blocking", in the sense that you're waiting for the previous batch to execute before executing the next one. This is not the nature of JavaScript, and you're not taking advantage of promises. Instead, you can send all your requests at once, and JS' asynchronism will kick in and do all the work for you, which will be significantly faster:
// in your class method:
const proms = []; // <-- create a promise array
while (data.length) {
  const batch = data.splice(0, 25);
  const putRequests = batch.map((elem) => {
    return {
      PutRequest: {
        Item: elem
      }
    };
  });
  const params = {
    RequestItems: {
      [tableName]: putRequests,
    },
  };
  proms.push(documentClient.batchWrite(params).promise()); // <-- add the promise to our array
}
await Promise.all(proms); // <-- wait for everything to be resolved asynchronously, then be done
This will speed up your inserts monumentally, as long as AWS lets you send that many concurrent requests.
I'm not sure how exactly you implemented the code, but to prove that it works, here's a dummy implementation (expect to wait about a minute):
const request = (_, t = 5) => new Promise(res => setTimeout(res, t)); // implement a dummy request API

// with your approach
async function a(data) {
  while (data.length) {
    const batch = data.splice(0, 25);
    await request(batch);
  }
}

// non-blocking
async function b(data) {
  const proms = [];
  while (data.length) {
    const batch = data.splice(0, 25);
    proms.push(request(batch));
  }
  await Promise.all(proms);
}

(async function time(a, b) {
  const data = Array(10000).fill(); // create some dummy data (10,000 instead of a million or you'll be staring at this demo for a while)
  console.time("original");
  await a([...data]); // give each run its own copy, since splice empties the array
  console.timeEnd("original");
  console.time("optimized");
  await b([...data]);
  console.timeEnd("optimized");
})(a, b);
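A caveat worth hedging on: in practice, firing tens of thousands of concurrent batchWrite calls may run into connection limits or DynamoDB write throttling, so a middle ground can help, waiting on a chunk of concurrent batches before launching the next. A sketch reusing documentClient, tableName, and data from the question (the cap of 100 concurrent batches is an arbitrary choice):
const CHUNK = 100; // arbitrary concurrency cap
while (data.length) {
  const proms = [];
  for (let i = 0; i < CHUNK && data.length; i++) {
    const batch = data.splice(0, 25);
    const putRequests = batch.map((Item) => ({ PutRequest: { Item } }));
    proms.push(documentClient.batchWrite({ RequestItems: { [tableName]: putRequests } }).promise());
  }
  await Promise.all(proms); // settle this chunk before starting the next
}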
I am trying to run parallel requests in batches to an API, using a bunch of keywords in an array, based on an article by Denis Fatkhudinov.
The problem I am having is that for each keyword, I need to run the request again with a different page argument, as many times as the number in the pages variable.
I keep getting Cannot read property 'then' of undefined for the return value of the chainNext function.
The parallel requests in batches on their own, without the for loop, work great; I am struggling to incorporate the for loop into the process.
// Parallel requests in batches
async function runBatches() {
  // The keywords to request with
  const keywords = ['many keyword strings here...'];
  // Set max concurrent requests
  const concurrent = 5;
  // Clone keywords array
  const keywordsClone = keywords.slice();
  // Array for future resolved promises for each batch
  const promises = new Array(concurrent).fill(Promise.resolve());
  // Async for loop
  const asyncForEach = async (pages, callback) => {
    for (let page = 1; page <= pages; page++) {
      await callback(page);
    }
  };
  // Number of pages to loop for
  const pages = 2;
  // Recursively run batches
  const chainNext = (pro) => {
    // Runs itself as long as there are entries left on the array
    if (keywordsClone.length) {
      // Store the first entry and conveniently also remove it from the array
      const keyword = keywordsClone.shift();
      // Run 'the promise to be' request
      return pro.then(async () => {
        // ---> Here was my problem, I am declaring the constant before running the for loop
        const promiseOperation = await asyncForEach(pages, async (page) => {
          await request(keyword, page);
        });
        // ---> The recursive invocation should also be inside the for loop
        return chainNext(promiseOperation);
      });
    }
    return pro;
  };
  return await Promise.all(promises.map(chainNext));
}
// HTTP request
async function request(keyword, page) {
  try {
    // request API
    const res = await apiservice(keyword, page);
    // Send data to an outer async function to process the data
    await append(res.data);
  } catch (error) {
    throw new Error(error);
  }
}

runBatches();
The problem is simply that pro is undefined on the recursive call, because asyncForEach resolves with undefined, so promiseOperation never holds a promise.
You basically execute this code:
Promise.all(new Array(concurrent).fill(Promise.resolve()).map(pro => {
  return pro.then(async () => {
    // asyncForEach resolves with no value, so promiseOperation is undefined...
    const promiseOperation = await asyncForEach(pages, async (page) => { /* ... */ });
    // ...and the recursive chainNext(promiseOperation) ends up calling undefined.then
    return chainNext(promiseOperation);
  });
}));
I'm not completely sure about your idea behind that, but this is your problem in a more condensed version.
I got it working by moving the actual request, promiseOperation, inside the for loop and returning the recursive invocation there too:
// Recursively run batches
const chainNext = async (pro) => {
  if (keywordsClone.length) {
    const keyword = keywordsClone.shift();
    return pro.then(async () => {
      await asyncForEach(pages, (page) => {
        const promiseOperation = request(keyword, page);
        return chainNext(promiseOperation);
      });
    });
  }
  return pro;
};
Credit for the parallel requests in batches goes to https://itnext.io/node-js-handling-asynchronous-operations-in-parallel-69679dfae3fc
I'd like to reuse the same code in a loop. This code contains promises. However, when iterating, it results in an error.
I've tried using for and while loops. There seems to be no issue when the loop runs for a single iteration.
Here is a minimal version of my code:
var search_url = /* Some initial URL */;
var glued = "";
for (var i = 0; i < 2; i++) {
  const prom = request(search_url)
    .then(function success(response /* An array from a XMLHTTPRequest */) {
      if (/* Some condition */) {
        search_url = /* Gets next URL */;
        glued += processQuery(response[0]);
      } else {
        console.log("Done.");
      }
    })
    .catch(function failure(err) {
      console.error(err.message); // TODO: do something w error
    });
}
document.getElementById('api-content').textContent = glued;
I expect the results to append to the variable glued, but instead I get an error, failure Promise.catch (async) (anonymous), after the first iteration of the loop.
Answer:
You can use Symbol.iterator together with for await...of to execute your promises asynchronously, one by one. This can be packaged up into a constructor; in the example case it's called Serial (because we're going through the promises in order).
function Serial(promises = []) {
  return {
    promises,
    resolved: [],
    addPromise: function(fn) {
      promises.push(fn);
    },
    resolve: async function(cb = i => i, err = (e) => console.log("trace: Serial.resolve " + e)) {
      try {
        for await (let p of this[Symbol.iterator]()) {}
        return this.resolved.map(cb);
      } catch (e) {
        err(e);
      }
    },
    [Symbol.iterator]: async function*() {
      this.resolved = [];
      for (let promise of this.promises) {
        let p = await promise().catch(e => console.log("trace: Serial[Symbol.iterator] ::" + e));
        this.resolved.push(p);
        yield p;
      }
    }
  }
}
What is the above?
It's a constructor called Serial.
It takes as an argument an array of Functions that return Promises.
The functions are stored in Serial.promises
It has an empty array stored in Serial.resolved - this will store the resolved promise requests.
It has two methods:
addPromise: Takes a Function that returns a Promise and adds it to Serial.promises
resolve: Asynchronously drives a custom Symbol.iterator. The iterator goes through every single promise, waits for it to complete, and adds the result to Serial.resolved. Once this is done, resolve maps a callback over the populated Serial.resolved array and returns the result. This lets you simply call resolve and then work with the array of responses, e.g. .resolve().then((resolved_requests) => /* do something with resolved_requests */)
Why does it work?
Although many people don't realize it, Symbol.iterator is much more powerful than standard for loops, for two big reasons.
The first reason, and the one that applies in this situation, is that it allows asynchronous calls that can affect the state of the object it is attached to.
The second reason is that it can be used to provide two different views of the data in the same object. E.g. you may have an array whose contents you would like to read:
let arr = [1,2,3,4];
You can use a for loop or forEach to get the data:
arr.forEach(v => console.log(v));
// 1, 2, 3, 4
But if you adjust the iterator:
arr[Symbol.iterator] = function* () {
  yield* this.map(v => v+1);
};
You get this:
arr.forEach(v => console.log(v));
// 1, 2, 3, 4
for(let v of arr) console.log(v);
// 2, 3, 4, 5
This is useful for many different reasons, including timestamping requests, mapping references, etc. If you'd like to know more, please take a look at the ECMAScript documentation: For in and For Of Statements.
Use:
It can be used by calling the constructor with an array of functions that return promises. You can also add a promise-returning function to the object using
new Serial([])
.addPromise(() => fetch(url))
It doesn't run the promise-returning functions until you use the .resolve method.
This means that you can keep adding promises ad hoc before you do anything with the asynchronous calls. E.g. these two are the same:
With addPromise:
let promises = new Serial([() => fetch(url), () => fetch(url2), () => fetch(url3)]);
promises.addPromise(() => fetch(url4));
promises.resolve().then((responses) => responses)
Without addPromise:
let promises = new Serial([() => fetch(url), () => fetch(url2), () => fetch(url3), () => fetch(url4)])
.resolve().then((responses) => responses)
Data:
Since I can't really replicate your data calls, I opted for JSONPlaceholder (a fake online REST API) to show the promise requests in action.
The data looks like this:
let searchURLs = ["https://jsonplaceholder.typicode.com/todos/1",
                  "https://jsonplaceholder.typicode.com/todos/2",
                  "https://jsonplaceholder.typicode.com/todos/3"]
  //since our constructor takes functions that return promises, I map over the URLs:
  .map(url => () => fetch(url));
To get the responses, we can run the above data through our constructor:
let promises = new Serial(searchURLs)
  .resolve()
  .then((resolved_array) => console.log(resolved_array));
Our resolved_array gives us an array of fetch Response objects, as you can see by combining the Serial constructor defined above with the data:
let searchURLs = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2", "https://jsonplaceholder.typicode.com/todos/3"].map(url => () => fetch(url));
let promises = new Serial(searchURLs).resolve().then((resolved_array) => console.log(resolved_array));
Getting Results to Screen:
I opted to use a closure function to simply add text to an output HTMLElement.
This is added like this:
HTML:
<output></output>
JS:
let output = ((selector) => (text) => document.querySelector(selector).textContent += text)("output");
Putting it together:
If we use the output snippet along with our Serial object, the final functional code looks like this:
let promises = new Serial(searchURLs).resolve()
  .then((resolved) => resolved.map(response =>
    response.json()
      .then(obj => output(obj.title))));
What's happening above is this:
we input all our functions that return promises: new Serial(searchURLs)
we tell it to resolve all the requests .resolve()
after it resolves all the requests, we tell it to take the requests and map the array .then(resolved => resolved.map
we turn each response into an object using its .json method. This is necessary for JSON, but may not be necessary for you
after this is done, we use .then(obj => ...) to tell it to do something with each computed response
we output the title to the screen using output(obj.title)
Result:
Running the output helper, the Serial constructor, and the searchURLs snippet above together (with an <output></output> element in the HTML) prints each todo's title to the page.
Why go this route?
It's reusable, functional, and if you import the Serial Constructor you can keep your code slim and comprehensible. If this is a cornerstone of your code, it'll be easy to maintain and use.
Using it with your code:
Here is how to use this with your code specifically, to fully answer your question and aid understanding.
NOTE: glued will be populated with the requested data, but it's unnecessary. I left it in because you may want it stored for a reason outside the scope of your question, and I don't want to make assumptions.
//setup urls:
var search_urls = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2"];
var request = (url) => () => fetch(url);
let my_requests = new Serial(search_urls.map(request));
//setup glued (you don't really need to, but if for some reason you want the info stored...)
var glued = "";
//setup helper function to grab the title (this is necessary for my specific data)
var addTitle = (req) => req.json().then(obj => (glued += obj.title, document.getElementById('api-content').textContent = glued));
// put it all together:
my_requests.resolve().then(requests => requests.map(addTitle));
Using it with your code - Working Example:
The working example is simply the Serial constructor and the snippet above combined, with a <div id="api-content"></div> in the HTML to receive the results.
Final Note
It's likely that we will see a prototypal change to the Promise object in the future that allows for easy serialization of promises. Currently (7/15/19) there is a TC39 proposal that adds a lot of functionality to the Promise object, but it hasn't been fully vetted yet, and as with many ideas trapped in the proposal stage, it's almost impossible to tell when (or whether) it will land in browsers.
Until then, workarounds like this are necessary and useful (the reason I even went through the motions of constructing this Serializer object was for a transpiler I wrote in Node, but it's been very helpful beyond that!), but do keep an eye out for changes, because you never know!
Hope this helps! Happy Coding!
Your best bet is probably going to be building up that glued variable with recursion.
Here's an example using recursion with a callback function:
var glued = "";
requestRecursively(/* Some initial URL string */, function() {
document.getElementById('api-content').textContent = glued;
});
function requestRecursively(url, cb) {
request(url).then(function (response) {
if (/* Some condition */) {
glued += processQuery(response[0]);
var next = /* Gets next URL string */;
if (next) {
// There's another URL. Make another request.
requestRecursively(next, cb);
} else {
// We're done. Invoke the callback;
cb();
}
} else {
console.log("Done.");
}
}).catch(function (err) {
console.error(err.message);
});
}
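If you'd rather avoid the completion callback, the same recursion can return a promise that resolves with the accumulated string; a sketch under the same assumptions about request and processQuery:
function requestRecursively(url) {
  return request(url).then(function (response) {
    if (/* Some condition */) {
      var piece = processQuery(response[0]);
      var next = /* Gets next URL string */;
      // Recurse while there's another URL, gluing the pieces together.
      return next ? requestRecursively(next).then(rest => piece + rest) : piece;
    }
    return "";
  });
}

// requestRecursively(/* Some initial URL string */).then(glued => {
//   document.getElementById('api-content').textContent = glued;
// });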
I'm assuming that I'm lacking some fundamentals on promises. I have a process within an AWS Lambda that downloads three files, then produces an output that is sent via email.
module.exports.test = async (event) => {
  var p = download1();
  var c = download2();
  var h = download3();
  await Promise.all([p, c, h]).then(() => {
    // ... bunch of logic manipulating the data
    customers.forEach(i => {
      buildFile().then(data => {
        sendEmail(data).then(response => {
          console.log('Email sent successfully');
        });
      });
    });
  }, errHandler);
};
Both the buildFile and sendEmail functions return a Promise, but I never get to the 'Email sent successfully' message. It runs the code, but never actually returns before the Lambda completes (at least that's what I think is happening).
My understanding was that the Promise would fulfill the callback, but now I'm thinking I need to do something similar to how I did the downloads within the original Promise.all(). Is that the right direction?
The process should get the files, then it loops through customers to create each file and send via SES.
You're looking for
module.exports.test = async (event) => {
  var p = download1();
  var c = download2();
  var h = download3();
  try {
    await Promise.all([p, c, h]);
    // ... bunch of logic manipulating the data
    var promises = customers.map(async (i) => {
      var data = await buildFile();
      var response = await sendEmail(data);
      console.log('Email sent successfully');
    });
    await Promise.all(promises);
  } catch (e) {
    errHandler(e);
  }
};
Your test function didn't wait for the promises that you created in the forEach loop, so the lambda completes before everything is done.
@Bergi's answer is correct; however, I would like to expand on it a little and give you some resources to strengthen your knowledge of promises. I'll use the following snippet of code. It's a bit cumbersome because I wrote it in Google Chrome's snippets tool, so feel free to paste it there and play around with it:
(function() {
  const promise1 = new Promise(function(resolve, reject) {
    setTimeout(function() {
      resolve('Replicant');
    }, 300);
  });

  const promise2 = new Promise(function(resolve, reject) {
    setTimeout(function() {
      resolve('Human?');
    }, 300);
  });

  function buildFile(type) {
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        resolve(`${type}`);
      }, 300);
    });
  }

  function sendMail(customer, answer) {
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        resolve(`Sent test to: ${customer}, he/she is a ${answer}`);
      }, 300);
    });
  }

  let customers = ['Rob Batty', 'Rachel', 'Deckard'];

  async function myFunc() {
    const [a, b, c] = await Promise.all([promise1, promise1, promise2]);
    const answers = [a, b, c];
    // const promises = customers.map(async (name, index) => {
    //   const file = await buildFile(answers[index]);
    //   const mail = await sendMail(name, file);
    //   console.log(mail);
    // });
    const promises = customers.map((name, index) =>
      // returning the chain is essential: otherwise Promise.all gets undefineds
      buildFile(answers[index])
        .then(file => sendMail(name, file))
        .then(sent => console.log(sent))
        // If you're going to use Promises this is important! :D
        .catch(err => console.log(err))
    );
    const [m1, m2, m3] = await Promise.all(promises);
    console.log(m1, m2, m3);
  }

  myFunc();
})()
As pointed out in the answer, the problem is related to the use of forEach. Why? Simply because you're running asynchronous code through a synchronous method, and they don't get along very well :). The solution is to create an array of promises, like a factory. After the map call, the promises are pending, neither fulfilled nor rejected; it's when you call the Promise.all() method that you await their results, and they give you the values, or in your use case generate the file and then send the email to the user. I hope this helps you get a better understanding of how promises work. I'll leave two links at the end that were very important, at least for me; they helped me out at some point with promises. Cheers, sigfried.
Nolan Lawson's article
MDN Promises