When I call the function saveDamage() and an exception occurs in the Promise.all() at the end of the function, I expected all preceding operations to be rolled back, but they are not. What am I doing wrong, please? Thank you!
saveDamage(damage) {
return this.transaction('rw', [
this.DAMAGE_STORE_NAME,
this.DAMAGE_IMAGE_DATA_STORE_NAME,
this.DAMAGE_IMAGE_DATA_TEMP_STORE_NAME
], (tx) => {
let deletePromise = (damage.id)
? this._deleteDamage(tx, damage.id)
: Promise.resolve();
return deletePromise.then(() => {
// Copy temporary image data of damage into permanent location and update their IDs
let copyPromises = this._getDamageImageFields(damage).map(field => {
return this._copyObject(tx, this.DAMAGE_IMAGE_DATA_TEMP_STORE_NAME, this.DAMAGE_IMAGE_DATA_STORE_NAME, field.value.imageDataId)
.then(id => field.value.imageDataId = id)
});
return Promise.all(copyPromises)
// Save damage
/* The exception occurs inside _saveObject(). Previous DB operations are not rolled back. */
.then(() => this._saveObject(tx, this.DAMAGE_STORE_NAME, damage, damage.id))
.then(id => damage.id = id)
// Delete temporary image data
.then(() => this._clearStore(tx, this.DAMAGE_IMAGE_DATA_TEMP_STORE_NAME));
});
});
}
transaction(mode, storeNames, executorFnc) {
let tx = this._db.transaction(this._transactionStoreNames(storeNames), this._transactionMode(mode));
tx.onabort = (event) => reject("Transaction failed: " + event.target.errorCode);
return executorFnc(tx);
}
_deleteDamage(tx, id) {
// Fetch damage from DB
return this._getObject(tx, this.DAMAGE_STORE_NAME, id).then(damage => {
// Delete all big image data
let promises = this._getIdsOfDamageImageData(damage)
.map(imageDataId => this._deleteObject(tx, this.DAMAGE_IMAGE_DATA_STORE_NAME, imageDataId));
// Delete damage itself
promises.push(this._deleteObject(tx, this.DAMAGE_STORE_NAME, id));
return Promise.all(promises);
});
}
_deleteObject(tx, storeName, id) {
return new Promise((resolve, reject) => {
let result = tx.objectStore(storeName).delete(id);
result.onsuccess = (event) => resolve();
result.onerror = (event) => reject(event);
});
}
_copyObject(tx, sourceStoreName, targetStoreName, id) {
return this._getObject(tx, sourceStoreName, id)
.then(obj => this._saveObject(tx, targetStoreName, obj));
}
_saveObject(tx, storeName, object, id) {
return new Promise((resolve, reject) => {
let store = tx.objectStore(storeName);
let result = store.put(object, id);
result.onsuccess = (event) => resolve(event.target.result /* object id */);
result.onerror = (event) => reject(event);
});
}
The exception is (I know how to fix it):
Uncaught (in promise) DOMException: Failed to execute 'put' on 'IDBObjectStore': The object store uses in-line keys and the key parameter was provided.
EDIT: After some experimenting I understood that as soon as I exit a success handler and there are no more pending success handlers, the transaction commits. It looks like operations issued in a 'then' callback are executed in another transaction, because 'then' only runs after the success handler has returned and the promise has resolved. The only way I found to execute dependent operations in a single transaction is to use plain success handlers. But using plain success handlers becomes difficult when the next operation depends on the result of the current one. For example, when I need to delete an object 'A' stored in store 'a' and its many sub-objects 'B' stored in store 'b', I need to fetch object 'A' first, then delete the sub-objects 'B' one by one (which requires generating the handler code dynamically), and then delete 'A'. I can't believe it's so difficult to achieve such a basic thing, so I suspect I'm still missing something.
The error means an explicit key was passed to put() even though the store uses in-line keys; I think that happens in _saveObject when it is called with an id.
If your promises represent individual transactions, then every transaction in the Promise.all array that committed before the error will stay committed, regardless of the fact that the Promise.all promise rejected. Run all database operations in a single transaction to get a complete rollback if any one operation fails.
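For the dependent-operations case described in the edit (delete object 'A' and its sub-objects 'B'), the classic way to keep everything in one IndexedDB transaction is to issue every follow-up request from inside the previous request's onsuccess handler, so the transaction always has a pending request and never auto-commits. A minimal sketch with raw IndexedDB, where db, the store names and the record shape are placeholders rather than anything from the code above:
const tx = db.transaction(['a', 'b'], 'readwrite');
tx.oncomplete = () => console.log('all deletes committed together');
tx.onabort = (event) => console.log('rolled back:', event.target.error);

const getA = tx.objectStore('a').get(idOfA);
getA.onsuccess = () => {
  const a = getA.result;
  // issue every dependent request synchronously, while the transaction is still active
  for (const bId of a.subObjectIds) {
    tx.objectStore('b').delete(bId);
  }
  tx.objectStore('a').delete(idOfA);
  // if any of these requests fails, the whole transaction aborts and rolls back
};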
Currently I have many concurrent identical calls to my backend, differing only on an ID field:
getData(1).then(...) // Each from a React component in a UI framework, so difficult to aggregate here
getData(2).then(...)
getData(3).then(...)
// creates n HTTP requests... inefficient
function getData(id: number): Promise<Data> {
return backend.getData(id);
}
This is wasteful as I make more calls. I'd like to keep my getData() calls but aggregate them into a single getDatas() call to my backend, then return all the results to the callers. I have more control over my backend than over the UI framework, so I can easily add a getDatas() call to it. The question is how to "mux" the JS calls into one backend call, then "demux" the result into the callers' promises.
const cache = new Map<number, Promise<Data>>()
let requestedIds = []
let timeout = null;
// creates just 1 http request (per 100ms)... efficient!
function getData(id: number): Promise<Data> {
if (cache.has(id)) {
return cache.get(id);
}
requestedIds.push(id)
if (timeout == null) {
timeout = setTimeout(() => {
backend.getDatas(requestedIds).then((datas: Data[]) => {
// TODO: somehow populate many different promises in cache??? but how?
requestedIds = []
timeout = null
})
}, 100)
}
return ???
}
In Java I would create a Map<int, CompletableFuture> and upon finishing my backend request, I would look up the CompletableFuture and call complete(data) on it. But I think in JS Promises can't be created without an explicit result being passed in.
Can I do this in JS with Promises?
A little unclear on what your end goal looks like. I imagine you could loop through your calls as needed; perhaps something like:
for (const [key, promise] of cache) {
  if (key === id)
    return promise;
}
// OR
for (let x = 0; x < id.length; x++) {
  getData(id[x])
}
Might work. You may be able to add a timing method into the mix if needed.
Not sure what your backend consists of, but I do know GraphQL is a good system for making multiple calls.
It may be ultimately better to handle them all in one request, rather than multiple calls.
The cache can be a regular object mapping ids to promise resolution functions and the promise to which they belong.
// cache maps ids to { resolve, reject, promise, requested }
// resolve and reject belong to the promise, requested is a bool for bookkeeping
const cache = {};
You might need to fire only once, but here I suggest setInterval to regularly check the cache for unresolved requests:
// keep the return value, and stop polling with clearInterval()
// if you really only need one batch, change setInterval to setTimeout
function startGetBatch() {
return setInterval(getBatch, 100);
}
The business logic calls only getData() which just hands out (and caches) promises, like this:
function getData(id) {
if (cache[id]) return cache[id].promise;
cache[id] = {};
const promise = new Promise((resolve, reject) => {
Object.assign(cache[id], { resolve, reject });
});
cache[id].promise = promise;
cache[id].requested = false;
return cache[id].promise;
}
By saving the promise along with the resolver and rejecter, we're also implementing the cache, since the resolved promise will provide the thing it resolved to via its then() method.
getBatch() asks the server in a batch for the not-yet-requested getData() ids, and invokes the corresponding resolve/reject functions:
function getBatch() {
// collect the ids that haven't been requested yet
const ids = [];
Object.keys(cache).forEach(id => {
if (!cache[id].requested) {
cache[id].requested = true;
ids.push(id);
}
});
return backend.getDatas(ids).then(datas => {
Object.keys(datas).forEach(id => {
cache[id].resolve(datas[id]);
})
}).catch(error => {
ids.forEach(id => {
cache[id].reject(error);
delete cache[id]; // so we can retry
})
})
}
The caller side looks like this:
// start polling
const interval = startGetBatch();
// in the business logic
getData(5).then(result => console.log('the result of 5 is:', result));
getData(6).then(result => console.log('the result of 6 is:', result));
// sometime later...
getData(5).then(result => {
// if the promise for an id has resolved, then-ing it still works, resolving again to the (now cached) result
console.log('the result of 5 is:', result)
});
// later, whenever we're done
// (no need for this if you change setInterval to setTimeout)
clearInterval(interval);
I think I've found a solution:
interface PromiseContainer<T> {
  resolve: (value: T) => void;
  reject: (reason?: any) => void;
}
const requests: Map<number, PromiseContainer<Data>> = new Map();
let timeout: number | null = null;
function getData(id: number) {
const promise = new Promise<Data>((resolve, reject) => requests.set(id, { resolve, reject }))
if (timeout == null) {
timeout = setTimeout(() => {
backend.getDatas([...requests.keys()]).then(datas => {
for (let [id, data] of Object.entries(datas)) {
requests.get(Number(id)).resolve(data)
requests.delete(Number(id))
}
}).catch(e => {
requests.forEach(container => container.reject(e))
})
timeout = null
}, 100)
}
return promise;
}
The key was figuring out I could extract the (resolve, reject) from a promise, store them, then retrieve and call them later.
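For illustration, a usage sketch (assuming, as in the code above, that backend.getDatas resolves to an object keyed by id): calls made within the same 100 ms window share one batched request, and each caller's promise is resolved from the shared response.
// both calls land in the same 100 ms window, so only one
// backend.getDatas([1, 2]) request is actually fired
getData(1).then(data => console.log('data for 1:', data));
getData(2).then(data => console.log('data for 2:', data));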
In my code below I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once for testing. The API is paged: if the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me to the right direction?
const axios = require("axios");
function getToken() {
// Get the token
}
function getIds(jwt) {
return new Promise((resolve) => {
let pageNumber = 1;
const filterdIds = [];
const config = {
//Config stuff
};
do {
axios(config)
.then((response) => {
response.forEach(element => {
//Some logic, if true then:
filterdIds.push(element.id);
console.log(filterdIds);
});
})
.catch(error => {
console.log(error);
});
} while (pageNumber != 1)
resolve(filterdIds);
});
}
getToken()
.then(token => {
return token;
})
.then(jwt => {
return getIds(jwt);
})
.then(response => {
console.log(response);
})
.catch(error => {
console.log(error);
});
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously, before any of the responses have arrived, so the array is guaranteed to be empty at that point.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests; more on that later). These methods run the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop
axios
.get("https://httpbin.org") // some initial request like getToken
.then(response => // response has the token, ignored for simplicity
Promise.all(
Array(pages).fill().map((_, i) => // make an array of request promises
axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
)
)
)
.then(responses => {
// perform your filter/reduce on the response data
const results = responses.flatMap(response =>
response.data
.filter(e => e.id % 2 === 0) // some silly filter
.map(({id, name}) => ({id, name}))
);
// use the results
console.log(results);
})
.catch(err => console.error(err))
;
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel:
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
// like getToken; should handle err
const tokenStub = await axios.get("https://httpbin.org");
const results = [];
// page += 10 to make the snippet run faster; you'd probably use page++
for (let page = 1;; page += 10) {
try {
const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
const response = await axios.get(url);
// check whatever condition your API sends to tell you no more pages
if (response.data.length === 0) {
break;
}
for (const comment of response.data) {
if (comment.id % 2 === 0) { // some silly filter
const {name, id} = comment;
results.push({name, id});
}
}
}
catch (err) { // hit the end of the pages or some other error
break;
}
}
// use the results
console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall:
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
const results = [];
const chunk = 5;
for (let page = 1;; page += chunk) {
try {
const responses = await Promise.all(
Array(chunk).fill().map((_, i) =>
axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
)
);
for (const response of responses) {
for (const comment of response.data) {
const {name, id} = comment;
results.push({name, id});
}
}
// check end condition
if (responses.some(e => e.data.length === 0)) {
break;
}
}
catch (err) {
break;
}
}
// use the results
console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(the above image is an excerpt of the 100 requests, but the chunk size of 5 at a time is visible)
Note that these snippets are proofs of concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking this into sub-functions, make sure to .then and await all promises in the caller; don't try to turn it into synchronous code.
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous javascript code works together. Your
synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios function invocation returns a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code is telling it to do some stuff when that pending value arrives (that's the callback passed to the .then() method).
But your main getIds function isn't going to wait around: it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which settles the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve method for the promise you're creating into the callback that the axios resolved Promise will invoke, it started working because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
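For a single page, the minimal fix is to drop the wrapping Promise entirely and return the axios chain itself, so whatever the last callback returns becomes the resolved value. A sketch that keeps the original names and omits the config details:
function getIds(jwt) {
  const config = {
    // config stuff
  };
  // return the axios promise chain directly; no new Promise needed
  return axios(config).then(response => {
    const filterdIds = [];
    response.forEach(element => {
      // some logic, if true then:
      filterdIds.push(element.id);
    });
    return filterdIds; // this becomes the resolved value of getIds()
  });
}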
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
const pagePromises = Array(pages).fill(null).map((_, index) => {
const page = startingPage + index;
// set the page however you do it with axios query params
config.page = page;
return axios(config);
});
// get the last page you attempted, and if it doesn't meet whatever
// criteria you have to finish the query, submit another batch query
const lastPage = await pagePromises[pagePromises.length - 1];
// the result from getIds is an array of ids, so we recursively get the rest of the pages here
// and have a single level array of ids (or an empty array if there were no more pages to fetch)
const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);
// now we wait for all page queries to resolve and extract the ids
const resolvedPages = await Promise.all(pagePromises);
const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);
// and finally merge the ids fetched in this method's invocation with any fetched recursively
return [...resolvedIds, ...additionalIds];
}
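Usage would then look something like this, assuming the hypothetical config object and page-based API above:
// fetch pages 1-5, recursing in batches of 5 until the API reports no more pages
getIds(1, 5).then(ids => console.log('all ids:', ids));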
I am looking at https://www.promisejs.org/patterns/ and it mentions that Promise.resolve(value) can be used if you need a value in the form of a promise, like:
var value = 10;
var promiseForValue = Promise.resolve(value);
What would be the use of a value in promise form though since it would run synchronously anyway?
If I had:
var value = 10;
var promiseForValue = Promise.resolve(value);
promiseForValue.then(resp => {
myFunction(resp)
})
wouldn't just using the value directly, without it being a Promise, achieve the same thing:
var value = 10;
myFunction(10);
Say you write a function that sometimes fetches something from a server but at other times returns immediately; you will probably want that function to always return a promise:
function myThingy() {
if (someCondition) {
return fetch('https://foo');
} else {
return Promise.resolve(true);
}
}
It's also useful if you receive some value that may or may not be a promise. You can pass it through Promise.resolve(), and now you are sure you have a promise:
const myValue = someStrangeFunction();
// Guarantee that myValue is a promise
Promise.resolve(myValue).then( ... );
In your examples, yes, there's no point in calling Promise.resolve(value). The use case is when you do want to wrap an already existing value in a Promise, for example to keep a function's API uniform. Say I have a function that conditionally does something that would return a promise: the caller of that function shouldn't have to figure out what kind of value the function returned; the function itself should make that uniform. For example:
const conditionallyDoAsyncWork = (something) => {
if (something == somethingElse) {
return Promise.resolve(false)
}
return fetch(`/foo/${something}`)
.then((res) => res.json())
}
Then users of this function don't need to check if what they got back was a Promise or not:
const doSomethingWithData = () => {
conditionallyDoAsyncWork(someValue)
.then((result) => result && processData(result))
}
As a side note, using async/await syntax both hides that and makes it a bit easier to read, because any value you return from an async function is automatically wrapped in a Promise:
const conditionallyDoAsyncWork = async (something) => {
if (something == somethingElse) {
return false
}
const res = await fetch(`/foo/${something}`)
return res.json()
}
const doSomethingWithData = async () => {
const result = await conditionallyDoAsyncWork(someValue)
if (result) processData(result)
}
Another use case: a dead-simple async queue, using Promise.resolve() as the starting point.
let current = Promise.resolve();
function enqueue(fn) {
current = current.then(fn);
}
enqueue(async () => { console.log("async task") });
Edit, in response to OP's question.
Explanation
Let me break it down for you step by step.
enqueue(task) adds the task function as a callback via current.then, and replaces the original current promise reference with the newly returned thenPromise.
current = Promise.resolve()
thenPromise = current.then(task)
current = thenPromise
As per the promise spec, if the task function in turn returns yet another promise (call it task() -> taskPromise), then thenPromise will only resolve when taskPromise resolves. thenPromise is practically equivalent to taskPromise; it's just a wrapper. Let's rewrite the above code as:
current = Promise.resolve()
taskPromise = current.then(task)
current = taskPromise
So if you go like:
enqueue(task_1)
enqueue(task_2)
enqueue(task_3)
it expands into
current = Promise.resolve()
task_1_promise = current.then(task_1)
task_2_promise = task_1_promise.then(task_2)
task_3_promise = task_2_promise.then(task_3)
current = task_3_promise
This effectively forms a linked-list-like structure of promises that executes the task callbacks in sequential order.
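A quick demo of that ordering, assuming the enqueue function above: even though the first task takes longer, the second one still waits for it.
const delay = (ms) => new Promise(resolve => setTimeout(resolve, ms));

enqueue(async () => { await delay(200); console.log('task A done'); });
enqueue(async () => { await delay(50); console.log('task B done'); });
// always logs "task A done" before "task B done"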
Usage
Let's study a concrete scenario. Imagine you need to handle websocket messages in sequential order.
Let's say you need to do some heavy computation upon receiving messages, so you decide to send it off to a worker thread pool. Then you write the processed result to another message queue (MQ).
But here's the requirement: the MQ expects the write order of messages to match the order in which they arrive from the websocket stream. What do you do?
Suppose you cannot pause the websocket stream; you can only handle the messages locally, as soon as possible.
Take One:
websocket.on('message', (msg) => {
sendToWorkerThreadPool(msg).then(result => {
writeToMessageQueue(result)
})
})
This may violate the requirement, because sendToWorkerThreadPool may not return results in the original order: since it's a pool, some threads may return faster if their workload is lighter.
Take Two:
websocket.on('message', (msg) => {
const task = () => sendToWorkerThreadPool(msg).then(result => {
writeToMessageQueue(result)
})
enqueue(task)
})
This time we enqueue (defer) the whole process, so we can ensure the task execution order stays sequential. But there's a drawback: we lose the benefit of the thread pool, because each sendToWorkerThreadPool call only fires after the previous one completes. This model is equivalent to using a single worker thread.
Take Three:
websocket.on('message', (msg) => {
const promise = sendToWorkerThreadPool(msg)
const task = () => promise.then(result => {
writeToMessageQueue(result)
})
enqueue(task)
})
The improvement over take two is that we call sendToWorkerThreadPool immediately, without deferring, but we still enqueue/defer the writeToMessageQueue part. This way we make full use of the thread pool for computation while still ensuring a sequential write order to the MQ.
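One caveat worth adding (not part of the original pattern): if a task rejects, current becomes a rejected promise and every later task is skipped. Catching inside enqueue keeps the queue alive; a sketch:
function enqueue(fn) {
  // log failures instead of letting one rejected task stall the whole queue
  current = current.then(fn).catch(err => {
    console.error('queued task failed:', err);
  });
}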
I rest my case.
I'm writing a service that makes an API call to get all bookings, then from those bookings gets a list of all the coworkerIds who made those bookings. I then take the unique ids and use them to make another set of calls to get an array of all the coworker records. Once this step is done I'll do the same for line item calls; however, for some reason I can't get my list of coworker objects.
Here's my code:
apiService.getBookingsList(`?size=1000&from_Booking_FromTime=${weekAgo.toISOString()}`).then(bookings => {
const bookingCoworkerIds = bookings.map(booking => booking.CoworkerId);
const bookingCoworkerIdsUnique = bookingCoworkerIds.filter(recordParser.onlyUnique);
const getCoworker = function getCoworkerInfo(coworkerId) {
return apiService.getSingleCoworker(coworkerId);
}
const bookingCoworkerCalls = bookingCoworkerIdsUnique.map(coworkerId => getCoworker(coworkerId));
const bookingCoworkers = Promise.all(bookingCoworkerCalls);
bookingCoworkers.then(coworkers => {
console.log(coworkers.length);
console.log(coworkers[0]);
});
});
And here's the code for apiService.getSingleCoworker:
getSingleCoworker(coworkerId) {
// returns a single coworker object given their ID
return httpsRequest.createRequest(this.URL.coworkerList + `?Coworker_Id=${coworkerId}`, {}, this.requestHeaders, 'GET')
.then(result => JSON.parse(result).Records[0]);
}
I can also post my https code but I don't think that's the issue.
I suspect I'm doing something wrong with my promise pattern, as I'm still new to promises and async code in general. Neither of these console logs are reached, and instead the only output is:
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): SyntaxError: Unexpected end of JSON input
So what am I doing wrong here?
The idea of working with promises is chaining "then" calls (or commands).
In order to do that, each "then" callback must return another promise or a value.
Like this:
apiService.getBookingsList(`?size=1000&from_Booking_FromTime=${weekAgo.toISOString()}`)
.then(bookings => {
const bookingCoworkerIds = bookings.map(booking => booking.CoworkerId);
const bookingCoworkerIdsUnique = bookingCoworkerIds.filter(recordParser.onlyUnique);
const bookingPromises = bookingCoworkerIdsUnique.map(coworkerId => apiService.getSingleCoworker(coworkerId));
return Promise.all(bookingPromises);
}).then(coworkers => {
console.log(coworkers.length);
console.log(coworkers[0]);
}).catch(err => {
// do something with error
});
inside the first "then" callback I return Promise.all which generates a promise.
The next "then" receives all the resolved values by the "all" promise.
Bear in mind that promises should always contain a "catch" call at the end in order to capture promise errors.
EDIT:
getSingleCoworker(coworkerId) {
const self = this;
return new Promise(function(resolve, reject) {
httpsRequest.createRequest(self.URL.coworkerList + `?Coworker_Id=${coworkerId}`, {}, self.requestHeaders, 'GET')
.then(result => {
const jsonVal = JSON.parse(result).Records[0];
resolve(jsonVal);
}).catch(reject);
});
}
Hope this helps
Currently, I am trying to get the md5 hash of every value in an array. Essentially, I loop over every value and hash it, like so:
var crypto = require('crypto');
function userHash(userIDstring) {
return crypto.createHash('md5').update(userIDstring).digest('hex');
}
for (var userID in watching) {
refPromises.push(admin.database().ref('notifications/'+ userID).once('value', (snapshot) => {
if (snapshot.exists()) {
const userHashString = userHash(userID)
console.log(userHashString.toUpperCase() + "this is the hashed string")
if (userHashString.toUpperCase() === poster){
return console.log("this is the poster")
}
else {
..
}
}
else {
return null
}
})
)}
However, this leads to two problems. The first is that I am getting the warning "Don't make functions within a loop". The second is that the hashes all come back the same: even though every userID is unique, userHashString prints the same value for every user in the console log, as if it were just using the first userID, hashing it, and printing that out every time.
Latest update:
exports.sendNotificationForPost = functions.firestore
.document('posts/{posts}').onCreate((snap, context) => {
const value = snap.data()
const watching = value.watchedBy
const poster = value.poster
const postContentNotification = value.post
const refPromises = []
var crypto = require('crypto');
function userHash(userIDstring) {
return crypto.createHash('md5').update(userIDstring).digest('hex');
}
for (let userID in watching) {
refPromises.push(admin.database().ref('notifications/'+ userID).once('value', (snapshot) => {
if (snapshot.exists()) {
const userHashString = userHash(userID)
if (userHashString.toUpperCase() === poster){
return null
}
else {
const payload = {
notification: {
title: "Someone posted something!",
body: postContentNotification,
sound: 'default'
}
};
return admin.messaging().sendToDevice(snapshot.val(), payload)
}
}
else {
return null
}
})
)}
return Promise.all(refPromises);
});
You have a couple of issues going on here. First, you have a non-blocking asynchronous operation inside a loop. You need to fully understand what that means: your loop runs to completion, starting a bunch of non-blocking, asynchronous operations. Then, after the loop has finished, your asynchronous operations finish one by one. That is why your loop variable userID is sitting on the wrong value: it's at its terminal value by the time your async callbacks get called.
You can see a discussion of the loop variable issue here with several options for addressing that:
Asynchronous Process inside a javascript for loop
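As a quick illustration of the loop-variable issue (a generic sketch, not your code): with var every callback sees the final value, while let gives each iteration its own binding.
for (var i = 0; i < 3; i++) {
  setTimeout(() => console.log('var:', i)); // logs 3, 3, 3
}
for (let j = 0; j < 3; j++) {
  setTimeout(() => console.log('let:', j)); // logs 0, 1, 2
}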
Second, you also need a way to know when all your asynchronous operations are done. It's kind of like you sent off 20 carrier pigeons with no idea when they will all bring you back some message (in any random order), so you need a way to know when all of them have come back.
To know when all your async operations are done, there are a bunch of different approaches. The "modern design" and the future of the Javascript language would be to use promises to represent your asynchronous operations and to use Promise.all() to track them, keep the results in order, notify you when they are all done and propagate any error that might occur.
Here's a cleaned-up version of your code:
const crypto = require('crypto');
exports.sendNotificationForPost = functions.firestore.document('posts/{posts}').onCreate((snap, context) => {
const value = snap.data();
const watching = value.watchedBy;
const poster = value.poster;
const postContentNotification = value.post;
function userHash(userIDstring) {
return crypto.createHash('md5').update(userIDstring).digest('hex');
}
return Promise.all(Object.keys(watching).map(userID => {
return admin.database().ref('notifications/' + userID).once('value').then(snapshot => {
if (snapshot.exists()) {
const userHashString = userHash(userID);
if (userHashString.toUpperCase() === poster) {
// user is same as poster, don't send to them
return {response: null, user: userID, poster: true};
} else {
const payload = {
notification: {
title: "Someone posted something!",
body: postContentNotification,
sound: 'default'
}
};
return admin.messaging().sendToDevice(snapshot.val(), payload).then(response => {
return {response, user: userID};
}).catch(err => {
console.log("err in sendToDevice", err);
// if you want further processing to stop if there's a sendToDevice error, then
// uncomment the throw err line and remove the lines after it.
// Otherwise, the error is logged and returned, but then ignored
// so other processing continues
// throw err
// when return value is an object with err property, caller can see
// that that particular sendToDevice failed, can see the userID and the error
return {err, user: userID};
});
}
} else {
return {response: null, user: userID};
}
});
}));
});
Changes:
Move require() out of the loop. No reason to call it multiple times.
Use .map() to collect the array of promises for Promise.all().
Use Object.keys() to get an array of userIDs from the object keys so we can then use .map() on it.
Use .then() with .once().
Log sendToDevice() error.
Use Promise.all() to track when all the promises are done.
Make sure all promise return paths return an object with some common properties so the caller can get a full look at what happened for each user.
These are not two problems: the warning you get is trying to help you solve the second problem you noticed.
And the problem is this: in JavaScript, only functions create separate scopes for var variables, so the callbacks you define inside the loop all close over the same variable. That means they don't get their own copies of the loop variable; they share a single reference which, by the time the first promise resolves, will be equal to the last key iterated over.
Just replace for with .forEach.
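A sketch of that change against the loop in the question, assuming watching is an object keyed by user id (the same assumption the other answer makes):
Object.keys(watching).forEach(userID => {
  refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
    // userID is a parameter of the forEach callback, so each iteration
    // keeps its own value by the time the snapshot arrives
    const userHashString = userHash(userID);
    // ...same logic as before
  }));
});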