I am trying to write code that does the following:
When the user uploads images, return previews of those images as canvas elements. (This is handled by the loadImage API.)
Attach div elements next to each of those canvases.
To do the above in order, I need to invoke the document.querySelectorAll method only after the canvas elements are fully drawn.
Here is the code I wrote. I tried to use a promise here, but apparently the promise is resolved after the loadImage functions are invoked, not after the canvas elements are fully drawn. When I run the function, the previews are populated as canvas elements, but the 'photo_order' div elements are not attached.
function rpu() {
let promise = new Promise(async function(resolve, reject) {
let filez = e.target.files;
let files = Array.from(filez);
files.forEach(function(file) {
loadImage(
file,
function(img) {
document.querySelector('#r_im_preview').appendChild(img);
document.querySelector('#r_im_preview').classList.remove('hide');
}, {
maxWidth: 150,
orientation: true,
contain: true
}
)
});
resolve();
});
promise.then(function() {
let k1 = document.querySelectorAll('canvas');
let k2 = Array.from(k1);
let k3 = '<div class="photo_order" style="cursor: pointer;"></div>'
k2.forEach(k => {
k.insertAdjacentHTML('afterEnd', k3);
})
});
}
How do I fix the code to achieve what I am trying to do? Any advice will be very much welcome.
I think you need to refactor your code to correctly use promises. Something like this:
const appendPreview = (img) => {
document.querySelector('#r_im_preview').appendChild(img)
document.querySelector('#r_im_preview').classList.remove('hide');
}
const loadImageP = (file, cb, settings) => new Promise((resolve, reject) => {
try {
const cbP = (...args) => (cb(...args), resolve()) // call resolve() so the promise actually settles
loadImage(file, cbP, settings)
} catch {
reject()
}
})
const settings = {
maxWidth: 150,
orientation: true,
contain: true
}
async function rpu(e) {
  const promises = [...e.target.files].reduce((acc, file) =>
    [...acc, loadImageP(file, appendPreview, settings)], [])
  await Promise.all(promises)
  let canvasNodes = document.querySelectorAll('canvas');
  let orderNode = '<div class="photo_order" style="cursor: pointer;"></div>';
  [...canvasNodes].forEach((n) => {
    n.insertAdjacentHTML('afterEnd', orderNode);
  });
}
I want to create a task that only builds the entire project if something changed. For that, I am comparing hashes (irrelevant to the question).
const buildIfChanged = async () => {
const hash = await getHash();
const newHash = await getNewHash();
if (hash !== newHash) {
console.log("START");
const task = series(build, cleanup)();
console.log("END", task);
}
};
In this example, task is undefined, so I cannot add a .on("end", ...) and resolve the promise after that. I also cannot await it.
The problem is, because I am not waiting for it to complete, the buildIfChanged task completes before build even has a chance to run.
Is there any way to do this with modern gulpfiles?
I have found a solution for doing this in modern gulpfiles. The function returned by parallel and series actually takes a done callback as a parameter.
So to solve this, you can do:
const buildIfChanged = async () => {
  return new Promise(async (res, rej) => {
    try {
      const hash = await getHash();
      const newHash = await getNewHash();
      if (hash !== newHash) {
        console.log("START");
        const task = series(build, cleanup)(() => res()); // add the done function here
        console.log("END", task);
      } else {
        res(); // nothing changed: resolve immediately so the task can complete
      }
    } catch (e) {
      rej(e);
    }
  });
};
I'd like to reuse the same promise-based code in a loop. However, when iterating, this code results in an error.
I've tried using for and while loops. There seems to be no issue when the for loop runs a single iteration.
Here is a minimal version of my code:
var search_url = /* Some initial URL */
var glued = "";
for(var i = 0; i < 2; i++)
{
const prom = request(search_url)
.then(function success(response /* An array from a XMLHTTPRequest*/) {
if (/* Some condition */)
{
search_url = /* Gets next URL */
glued += processQuery(response[0]);
} else {
console.log("Done.")
}
})
.catch(function failure(err) {
console.error(err.message); // TODO: do something w error
})
}
document.getElementById('api-content').textContent = glued;
I expect the results to be appended to the variable glued, but instead I get an error (failure Promise.catch (async) (anonymous)) after the first iteration of the loop.
Answer:
You can use Symbol.iterator in conjunction with for await to execute your promises asynchronously, one at a time. This can be packaged up into a constructor; in the example below it's called Serial (because we're going through the promises one by one, in order).
function Serial(promises = []) {
return {
promises,
resolved: [],
addPromise: function(fn) {
promises.push(fn);
},
resolve: async function(cb = i => i, err = (e) => console.log("trace: Serial.resolve " + e)) {
try {
for await (let p of this[Symbol.iterator]()) {}
return this.resolved.map(cb);
} catch (e) {
err(e);
}
},
[Symbol.iterator]: async function*() {
this.resolved = [];
for (let promise of this.promises) {
let p = await promise().catch(e => console.log("trace: Serial[Symbol.iterator] ::" + e));
this.resolved.push(p);
yield p;
}
}
}
}
What is the above?
It's a constructor called Serial.
It takes as an argument an array of Functions that return Promises.
The functions are stored in Serial.promises
It has an empty array stored in Serial.resolved - this will store the resolved promise requests.
It has two methods:
addPromise: Takes a Function that returns a Promise and adds it to Serial.promises
resolve: Asynchronously calls a custom Symbol.iterator. This iterator goes through every single promise, waits for it to be completed, and adds it to Serial.resolved. Once this is completed, it returns the populated Serial.resolved array, mapped over an optional callback. This lets you simply call resolve and then provide a callback for what to do with the array of responses, e.g. .resolve().then((resolved_requests) => /* do something with resolved_requests */)
Why does it work?
Although many people don't realize it, Symbol.iterator is much more powerful than standard for loops. This is for two big reasons.
The first reason, and the one that is applicable in this situation, is because it allows for asynchronous calls that can affect the state of the applied object.
The second reason is that it can be used to provide two different types of data from the same object. E.g. you may have an array whose contents you would like to read:
let arr = [1,2,3,4];
You can use a for loop or forEach to get the data:
arr.forEach(v => console.log(v));
// 1, 2, 3, 4
But if you adjust the iterator:
arr[Symbol.iterator] = function* () {
yield* this.map(v => v+1);
};
You get this:
arr.forEach(v => console.log(v));
// 1, 2, 3, 4
for(let v of arr) console.log(v);
// 2, 3, 4, 5
This is useful for many different reasons, including timestamping requests/mapping references, etc. If you'd like to know more please take a look at the ECMAScript Documentation: For in and For Of Statements
Use:
It can be used by calling the constructor with an Array of functions that return Promises. You can also add Function Promises to the Object by using
new Serial([])
.addPromise(() => fetch(url))
It doesn't run the Function Promises until you use the .resolve method.
This means that you can add promises ad hoc if you'd like before you do anything with the asynchronous calls. E.g. these two are the same:
With addPromise:
let promises = new Serial([() => fetch(url), () => fetch(url2), () => fetch(url3)]);
promises.addPromise(() => fetch(url4));
promises.resolve().then((responses) => responses)
Without addPromise:
let promises = new Serial([() => fetch(url), () => fetch(url2), () => fetch(url3), () => fetch(url4)])
.resolve().then((responses) => responses)
Data:
Since I can't really replicate your data calls, I opted for JSONPlaceholder (a fake online REST API) to show the promise requests in action.
The data looks like this:
let searchURLs = ["https://jsonplaceholder.typicode.com/todos/1",
"https://jsonplaceholder.typicode.com/todos/2",
"https://jsonplaceholder.typicode.com/todos/3"]
//since our constructor takes functions that return promises, I map over the URLs:
.map(url => () => fetch(url));
To get the responses we can call the above data using our constructor:
let promises = new Serial(searchURLs)
.resolve()
.then((resolved_array) => console.log(resolved_array));
Our resolved_array gives us an array of fetch Response objects. You can see that here:
function Serial(promises = []) {
return {
promises,
resolved: [],
addPromise: function(fn) {
promises.push(fn);
},
resolve: async function(cb = i => i, err = (e) => console.log("trace: Serial.resolve " + e)) {
try {
for await (let p of this[Symbol.iterator]()) {}
return this.resolved.map(cb);
} catch (e) {
err(e);
}
},
[Symbol.iterator]: async function*() {
this.resolved = [];
for (let promise of this.promises) {
let p = await promise().catch(e => console.log("trace: Serial[Symbol.iterator] ::" + e));
this.resolved.push(p);
yield p;
}
}
}
}
let searchURLs = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2", "https://jsonplaceholder.typicode.com/todos/3"].map(url => () => fetch(url));
let promises = new Serial(searchURLs).resolve().then((resolved_array) => console.log(resolved_array));
Getting Results to Screen:
I opted to use a closure function to simply add text to an output HTMLElement.
This is added like this:
HTML:
<output></output>
JS:
let output = ((selector) => (text) => document.querySelector(selector).textContent += text)("output");
Putting it together:
If we use the output snippet along with our Serial object the final functional code looks like this:
let promises = new Serial(searchURLs).resolve()
.then((resolved) => resolved.map(response =>
response.json()
.then(obj => output(obj.title))));
What's happening above is this:
we input all our functions that return promises. new Serial(searchURLs)
we tell it to resolve all the requests .resolve()
after it resolves all the requests, we tell it to take the requests and map the array .then(resolved => resolved.map
we turn the responses into objects using the .json() method. This is necessary for JSON, but may not be necessary for you
after this is done, we use .then(obj => to tell it to do something with each computed response
we output the title to the screen using output(obj.title)
Result:
let output = ((selector) => (text) => document.querySelector(selector).textContent += text)("output");
function Serial(promises = []) {
return {
promises,
resolved: [],
addPromise: function(fn) {
promises.push(fn);
},
resolve: async function(cb = i => i, err = (e) => console.log("trace: Serial.resolve " + e)) {
try {
for await (let p of this[Symbol.iterator]()) {}
return this.resolved.map(cb);
} catch (e) {
err(e);
}
},
[Symbol.iterator]: async function*() {
this.resolved = [];
for (let promise of this.promises) {
let p = await promise().catch(e => console.log("trace: Serial[Symbol.iterator] ::" + e));
this.resolved.push(p);
yield p;
}
}
}
}
let searchURLs = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2", "https://jsonplaceholder.typicode.com/todos/3"].map(url => () => fetch(url));
let promises = new Serial(searchURLs).resolve()
.then((resolved) => resolved.map(response =>
response.json()
.then(obj => output(obj.title))));
<output></output>
Why go this route?
It's reusable, functional, and if you import the Serial Constructor you can keep your code slim and comprehensible. If this is a cornerstone of your code, it'll be easy to maintain and use.
Using it with your code:
I will add how to specifically use this with your code to fully answer your question and so that you may understand further.
NOTE: glued will be populated with the requested data, but it's unnecessary. I left it in because you may have wanted it stored for a reason outside the scope of your question, and I don't want to make assumptions.
//setup urls:
var search_urls = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2"];
var request = (url) => () => fetch(url);
let my_requests = new Serial(search_urls.map(request));
//setup glued (you don't really need to, but if for some reason you want the info stored...)
var glued = "";
//setup helper function to grab the title (this is necessary for my specific data)
var addTitle = (req) => req.json().then(obj => (glued += obj.title, document.getElementById('api-content').textContent = glued));
// put it all together:
my_requests.resolve().then(requests => requests.map(addTitle));
Using it with your code - Working Example:
function Serial(promises = []) {
return {
promises,
resolved: [],
addPromise: function(fn) {
promises.push(fn);
},
resolve: async function(cb = i => i, err = (e) => console.log("trace: Serial.resolve " + e)) {
try {
for await (let p of this[Symbol.iterator]()) {}
return this.resolved.map(cb);
} catch (e) {
err(e);
}
},
[Symbol.iterator]: async function*() {
this.resolved = [];
for (let promise of this.promises) {
let p = await promise().catch(e => console.log("trace: Serial[Symbol.iterator] ::" + e));
this.resolved.push(p);
yield p;
}
}
}
}
//setup urls:
var search_urls = ["https://jsonplaceholder.typicode.com/todos/1", "https://jsonplaceholder.typicode.com/todos/2"];
var request = (url) => () => fetch(url);
let my_requests = new Serial(search_urls.map(request));
//setup glued (you don't really need to, but if for some reason you want the info stored...)
var glued = "";
//setup helper function to grab the title (this is necessary for my specific data)
var addTitle = (req) => req.json().then(obj => (glued += obj.title, document.getElementById('api-content').textContent = glued));
// put it all together:
my_requests.resolve().then(requests => requests.map(addTitle));
<div id="api-content"></div>
Final Note
It's likely that we will be seeing a prototypal change to the Promise object in the future that allows for easy serialization of Promises. Currently (7/15/19) there is a TC39 Proposal that does add a lot of functionality to the Promise object but it hasn't been fully vetted yet, and as with many ideas trapped within the Proposal stage, it's almost impossible to tell when they will be implemented into Browsers, or even if the idea will stagnate and fall off the radar.
Until then, workarounds like this are necessary and useful (the reason I even went through the motions of constructing this Serializer object was for a transpiler I wrote in Node, but it's been very helpful beyond that!), but do keep an eye out for any changes, because you never know!
Hope this helps! Happy Coding!
Your best bet is probably going to be building up that glued variable with recursion.
Here's an example using recursion with a callback function:
var glued = "";
requestRecursively(/* Some initial URL string */, function() {
document.getElementById('api-content').textContent = glued;
});
function requestRecursively(url, cb) {
request(url).then(function (response) {
if (/* Some condition */) {
glued += processQuery(response[0]);
var next = /* Gets next URL string */;
if (next) {
// There's another URL. Make another request.
requestRecursively(next, cb);
} else {
// We're done. Invoke the callback;
cb();
}
} else {
console.log("Done.");
}
}).catch(function (err) {
console.error(err.message);
});
}
I tried to prevent async problems with promises in the following code. By using a .then function, everything within that function gets called after the promise has resolved. But now I have a problem: I can neither extend the scope of the .then function enough to include the bits after the second loop, nor can I (to my knowledge) easily pause the code until the function has been properly resolved and THEN continue with the loop iteration.
Here's my main code(simplified):
let total = []
$.each(element, function(data) {
  //Some other code
  let out;
  $.each(element2, function(data2) {
    getZip(data2).then(function(txt){ //after everything has finished this gets called
      out = someFunction(txt,data2);
      total.push(out);
    });
  });
  console.log(total) //this gets called first
  //some other code that does some stuff with total
});
Here's the getZip code which is asynchronous:
function getZip(zipFile) {
  return new Promise(function (resolve, reject){
    let zip = new JSZip()
    JSZipUtils.getBinaryContent("someURL/" + zipFile, function (err, data) {
      if (err) {
        reject(err)
        return // don't fall through to loadAsync with bad data
      }
      JSZip.loadAsync(data).then(function (zip) {
        return zip.file(zipFile.replace(".zip", "")).async("text"); //gets the file within the zip and outputs it as text
      }).then(function (txt) {
        resolve(txt)
      });
    });
  });
}
I'd be happy if either the getZip code could be made synchronous or if the before mentioned could be done.
I do not think I fully understand the code you have written. However, I recommend you use Promise.all. Here is an example I have written that I hope helps guide you:
let total = [];
$.each([1,2,3,4], function (data) {
// Some other code.
let out;
// Create a new promise so that we can wait on the getZip method.
new Promise(function (resolve, reject) {
// Create a holder variable. This variable will hold all the promises that are output from the getZip method you have.
let gZipPromises = [];
$.each([5,6,7,8], function (data2) {
// Your getZip method would go here. Wrap the call to getZip in gZipPromises.push to push all the returned promises onto the holding variable.
gZipPromises.push(new Promise(function (resolve2, reject2) {
// Sample Code
setTimeout(function () {
total.push(data2);
resolve2("");
}, 10);
// End Sample Code.
}));
});
// Pass the holding variable to Promise.all so that all promises in the holding variable are executed before resolving.
Promise.all(gZipPromises).then(function() {
resolve()
});
}).then(function () {
// This will be called only when all getZip promises are completed in the second loop.
console.log(total);
});
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
With that said, I could not test your code. But I think this would work:
(Please note that, based on the code you provided, the variable total would be logged for each iteration of the top-most $.each.)
let total = []
$.each(element, function(data) {
//Some other code
let out;
// Define a new promise.
new Promise(function (resolve, reject) {
let gZipPromises = [];
$.each(element2, function(data2) {
  gZipPromises.push(
    getZip(data2).then(function(txt){ //after everything has finished this gets called
      out = someFunction(txt,data2);
      total.push(out);
    })
  );
});
Promise.all(gZipPromises).then(function() {
  resolve()
});
}).then(function () {
  console.log(total)
});
});
const elements = [["foo.zip"],["bar.zip"],["baz.zip"]];
const totalOut = getAllZips(elements)
.then(text => console.info(text))
.catch(error => console.error(error))
function someFunction(text, data) {
return `${text}\nLength: ${data.length}`;
}
async function getAllZips(elements) {
let promises = [];
for(const element of elements) {
for(const data of element) {
promises.push(getZip(data).then(text => {
return someFunction(text, data);
}));
}
}
return Promise.all(promises);
}
async function getZip(file) {
return new Promise((resolve, reject) => {
JSZipUtils.getBinaryContent(`someURL/${file}`, async (err, data) => {
try {
if (err) throw err;
const zip = await JSZip.loadAsync(data);
const name = file.replace(".zip", "");
resolve(await zip.file(name).async('text'));
} catch(error) {
reject(error);
}
});
});
}
<script>/*IGNORE*/const JSZipUtils = {getBinaryContent:(p,c)=>errs.gbc?c(new Error('gbc'),null):c(null,{foo:true})};const JSZip = {loadAsync:(d)=>errs.la?Promise.reject(new Error('la')):({file:n=>({async:a=>errs.a?Promise.reject(new Error('a')):Promise.resolve('Hello World')})})};const errs = {gbc:false,la:false,a:false};/*IGNORE*/</script>
This kind of sounds like a use case for async iterator generators, but maybe I'm just over-engineering. You have a bunch of resources that you want to iterate over and their contents are asynchronous. You want it to "look" synchronous, so you can leverage async/await:
function getZip(zipFile) {
/*
* There's no point in simplifying this function since it looks like
* the JSZip API deals with callbacks and not Promises.
*/
return Promise.resolve(zipFile);
}
function someFn(a, b) {
return `${a}: ${b.length}`;
}
async function* zipper(elements) {
for (const element of elements) {
for (const data of element) {
const txt = await getZip(data);
yield someFn(txt, data);
}
}
}
(async() => {
const elements = [
["hello"],
["world"],
["foo"],
["bar"]
];
let total = [];
for await (const out of zipper(elements)) {
total.push(out);
}
console.log(total);
})();
I have a stream and I need to convert it to a generator, so an uploader can consume the generic generator.
This means turning:
stream.on('data', chunk => ...);
to:
generator = streamGenerator(stream);
chunk = await generator.next()
...
better yet:
chunk = yield streamGenerator;
Overall my best attempt requires leaking the resolve from a promise and I'd like to avoid that:
function streamToIterable(chunkSize, stream) {
let collector = [];
let value = [];
let done = false;
let _resolve;
let promise = new Promise(resolve => _resolve = resolve);
stream.on('data', chunk => {
collector = collector.concat(chunk);
if (collector.length >= chunkSize) {
value = collector.splice(0, chunkSize);
_resolve(value);
stream.pause();
}
});
stream.on('end', () => {
_resolve(collector);
// With done set to true, the next iteration will ignore 'value' and end the loop
done = true;
});
stream.resume();
return {
next: () => ({
value: promise.then(() => {
stream.resume();
promise = new Promise(resolve => _resolve = resolve);
}),
done,
}),
};
}
function* streamToGenerator(chunkSize, stream) {
  const iterator = streamToIterable(chunkSize, stream);
  let next = iterator.next();
  while (!next.done) {
    yield next.value;
    next = iterator.next();
  }
};
Usage in a generator for uploading chunks:
for (const chunkData of generator()) {
let result = yield uploadPost(url, formData, onChunkProgress(chunkIndex));
This is in a redux-saga, so "next()" isn't called on the generator until the returned promise is resolved.
You cannot avoid storing the resolve function in a mutable variable if you want to use a single event listener that resolves different promises. You could simplify the promise creation by using the once method similar to the following:
function streamToIterator(stream) {
let done = false;
const end = new Promise(resolve => {
stream.once('end', resolve);
}).then(e => {
done = true;
});
return {
[Symbol.iterator]() { return this; },
next() {
const promise = new Promise(resolve => {
stream.once('data', value => {
resolve(value);
stream.pause();
});
stream.resume();
});
return {
value: Promise.race([promise, end]),
done,
};
},
};
}
Of course, this differs from your code: you are doing the racing between end and data yourself, you resume the stream before next is called the first time, and most importantly you do the chunking yourself, so this might not be directly applicable to your situation.
Apart from that, I'd recommend checking out the buffering internals of Node.js streams; it might be easier to read chunks of certain sizes using a lower-level API than data events.
Also, you should definitely have a look at the asynchronous iteration proposal for ES Next. The iterable interface you're trying to implement is very similar, and surely they either already have, or would really welcome, an example of making a Node readable stream iterable.
EDIT: this answer is only required if you have a volatile stream that doesn't pause right away, and therefore also doesn't have an event system that supports "once". It also allows asynchronous yielding.
I greatly changed my previous answer and this one works.
This one uses two arrays; one of promises and another of resolves, which allows a queue of data that is bi-directional.
So if you iterate faster than the stream, all promises will resolve when they receive data, and if you stream faster than you iterate, you'll have resolved promises queued up for the iterator.
function streamToAsyncIterator(chunkSize, stream) {
let done = false;
let endPromise = new Promise(resolve => {
  // signal completion (the collector that used to flush leftover data here has been removed)
  stream.on('end', () => {
    resolve({ value: undefined, done: true });
  });
});
//two-track queue for expecting and sending data with promises
let dataPromises = [];
let dataResolves = [];
stream.on('data', value => {
const dataResolve = dataResolves.shift();
if (dataResolve) {
dataResolve({ value, done: false });
} else {
dataPromises.push(Promise.resolve({ value, done: false }));
}
stream.pause();
});
return {
[Symbol.asyncIterator]() {
return this;
},
//TODO handle return() to close the stream
next() {
if (done) return Promise.resolve({ done });
stream.resume();
let dataPromise = dataPromises.shift();
if (!dataPromise) {
dataPromise = new Promise(resolve => dataResolves.push(resolve));
}
return Promise.race([dataPromise, endPromise])
// done must be set in the resolution of the race, or done could complete the generator before the last iteration of data.
.then(next => {
if (next.done) {
done = true;
}
return next;
});
},
};
}
async function* streamToAsyncGenerator(chunkSize, stream) {
const iterator = streamToAsyncIterator(chunkSize, stream);
let next = await iterator.next();
while (!next.done) {
yield next.value;
// Delete is needed to release resources.
// Without delete, you'll get a memory error at 2GB.
delete next.value;
next = await iterator.next();
}
};
EDIT: I removed the collector, which has nothing to do with the question and I added the delete which is necessary, because GC doesn't appear to run with an array of iterators. This should be the final answer as it works swell for me.
I am making a simple HTML5 game.
Object.keys(gameConfig.playerElems).map((e) =>{
let img = gameConfig.playerElems[e];
let name = e;
let imgObj;
imgObj = new Image();
imgObj.src = img;
imgObj.onload = () => {
playerElemsCounter++;
drawPlayer(imgObj);
}
});
Is it possible to pause the .map() iteration until imgObj has loaded?
Is it possible to pause the .map() iteration until imgObj has loaded?
No. So instead, you use an asynchronous loop. Here's one example, see the comments:
// A named IIFE
(function iteration(keys, index) {
// Get info for this iteration
let name = keys[index];
let img = gameConfig.playerElems[name];
let imgObj = new Image();
// Set event callbacks BEFORE setting src
imgObj.onload = () => {
playerElemsCounter++;
drawPlayer(imgObj);
next();
};
imgObj.onerror = next;
// Now set src
imgObj.src = img;
// Handles triggering the next iteration on load or error
function next() {
++index;
if (index < keys.length) {
iteration(keys, index);
}
}
})(Object.keys(gameConfig.playerElems), 0);
But, as Haroldo_OK points out, this will wait for one image to load before requesting the next, which is not only unnecessary, but harmful. Instead, request them all, draw them as you receive them, and then continue. You might do that by giving yourself a loading function returning a promise:
const loadImage = src => new Promise((resolve, reject) => {
const imgObj = new Image();
// Set event callbacks BEFORE setting src
imgObj.onload = () => { resolve(imgObj); };
imgObj.onerror = reject;
// Now set src
imgObj.src = src;
});
Then:
// Load in parallel, draw as we receive them
Promise.all(Object.keys(gameConfig.playerElems).map(
key => loadImage(gameConfig.playerElems[key])
.then(drawPlayer)
.catch(() => drawPlayer(/*...placeholder image URL...*/))
)
.then(() => {
// All done, if you want to do something here
});
// No need for `.catch`, we handled errors inline
If you wanted (for some reason) to hold up loading the next image while waiting for the previous, that loadImage function could be used differently to do so, for instance with the classic promise reduce pattern:
// Sequential (probably not a good idea)
Object.keys(gameConfig.playerElems).reduce(
(p, key) => p.then(() =>
loadImage(gameConfig.playerElems[key])
.then(drawPlayer)
.catch(() => drawPlayer(/*...placeholder image URL...*/))
)
,
Promise.resolve()
)
.then(() => {
// All done, if you want to do something here
});
// No need for `.catch`, we handled errors inline
...or with ES2017 async/await:
// Sequential (probably not a good idea)
(async function() {
for (const key of Object.keys(gameConfig.playerElems)) {
try {
const imgObj = await loadImage(gameConfig.playerElems[key]);
playerElemsCounter++;
drawPlayer(imgObj);
} catch (err) {
// use placeholder
drawPlayer(/*...placeholder image URL...*/);
}
}
})().then(() => {
// All done
});
// No need for `.catch`, we handled errors inline
Side note: There's no point in using map if you're not A) returning a value from the callback to fill the new array map creates, and B) using the array map returns. When you're not doing that, just use forEach (or a for or for-of loop).