Using JavaScript Promise.all - is this syntax valid?

I have two tables with users, where the id for each user is the same in both tables (don't ask why I have two user tables).
At some point, I need to filter users from table 1, and if a certain condition is true, I store a promise (a delete request) for each user into (let's call it) tableOnePromises. I do the same for table 2.
In order to empty table 2, I MUST first empty table one due to some requirements.
This is what I did:
let tableOnePromises = [];
let tableTwoPromises = [];

tableOne.forEach(item => {
    if (item.deactivated) {
        const tableOneDeleted = supabase
            .from("table-one")
            .delete()
            .match({id: item.id});
        tableOnePromises.push(tableOneDeleted);

        const tableTwoDeleted = supabase
            .from("table-two")
            .delete()
            .match({id: item.id});
        tableTwoPromises.push(tableTwoDeleted);
    }
});

await Promise.all(tableOnePromises).then(() => {
    return Promise.all(tableTwoPromises);
}).catch(err => console.log(err));

Assuming the code using await is inside an async function (or at the top level of a module), the syntax is correct, but it's probably not what I'd use (in general, avoid mixing async/await with explicit callbacks via .then and .catch), and separately it's probably not working quite as you expect (this is borne out by your saying that your code was failing to delete from table-two).
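(Purely as a style note, the unmixed equivalent of that final statement would be:

try {
    await Promise.all(tableOnePromises);
    await Promise.all(tableTwoPromises);
} catch (err) {
    console.log(err);
}

...though as explained below, even this won't give you the ordering you need.)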
For any particular id value, your code starts deleting from table-one and then immediately starts deleting from table-two without waiting for the deletion in table-one to complete:
// STARTS the deletion but doesn't wait for it to finish
const tableOneDeleted = supabase
    .from("table-one")
    .delete()
    .match({id: item.id});
// ...
// Starts deleting from `table-two`, even though the item may still be in `table-one`
const tableTwoDeleted = supabase
    .from("table-two")
    .delete()
    .match({id: item.id});
Remember that a promise is just a way of observing an asynchronous process; by the time you have the promise, the process it's observing is already underway.¹ So even though you don't wait for the table-two promises until later, you start the table-two deletions immediately.
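For instance (a minimal illustration, not from your code):

const promise = fetch("/some/endpoint"); // the request is already in flight here
// ...other work...
const response = await promise;          // this only waits for it; it didn't start it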
...I MUST first empty table one due to some requirements...
If by "empty" you mean just that you have to ensure you've done the delete for a particular id on table-one before doing it on table-two, you need to wait for the table-one deletion to be completed before starting the table-two deletion. I'd put that in a function:
async function deleteItem(id) {
    await supabase
        .from("table-one")
        .delete()
        .match({id});
    await supabase
        .from("table-two")
        .delete()
        .match({id});
}
Then the code becomes:
const promises = [];
for (const {deactivated, id} of tableOne) {
    if (deactivated) {
        promises.push(deleteItem(id));
    }
}
await Promise.all(promises); // With the `try`/`catch` if desired
...or if it's okay to make two passes through the array:
await Promise.all( // With the `try`/`catch` if desired
    tableOne.filter(({deactivated}) => deactivated)
            .map(({id}) => deleteItem(id))
);
¹ "...by the time you have the promise, the process it's observing is already underway." That's the normal case. There is unfortunately a popular document DB library that doesn't start its work on something until/unless you call then on the promise for it. But that's an exception, and an anti-pattern.

Related

Promise Chain Within a for-each Loop Inside a .then() of Another Promise

In JavaScript, I'm calling a promise which returns an id; we'll call this promise1. promise1 has a following .then() which, given the id returned from promise1, runs a for-each loop. On each iteration of this loop, promise2 is carried out.
promise2 also has a .then(), so the data returned from promise2 can be utilized.
mycode.js
exports.my_function = (req, res) => {
    var data = req.body;
    var name_array = ["Jeff", "Sophie", "Kristen"];
    var personal_ID_arr = [];
    promise1.getID(data) // promise1 gets an int 'id'
        .then(id => {
            name_array.forEach(name => { // for each name
                promise2.getPersonalID(name, id) // promise2 creates a personal id for each person using their name and the id generated from promise1
                    .then(personalID => {
                        personal_ID_arr.push(personalID) // add the generated personal id to an arr
                    })
            })
        })
    // will need to operate on 'personal_ID_arr' here after the above finishes, or possibly still within promise1's .then, as long as the arr is filled fully
}
However, I'm running into the issue that asynchronicity doesn't allow for this type of logic, as things begin happening out of order once the program reaches the loop. I also need to carry out another function with the filled-out personal_ID_arr after these promises are done.
I've looked at other questions regarding promise chaining but this is a different case due to the fact the for loop needs to work with the data from the first promise and use .then() inside a .then(). Surely, I don't need to keep this structure though if a better one exists.
Any help would be greatly appreciated
Keeping an array of IDs that an asynchronous operation will fill is not something that will help you carry out more asynchronous operations on the array later. Instead, consider building an array of Promises, one for each name, and waiting for them all to resolve using Promise.all. Also, I removed the personal_ID_arr variable from my_function and moved it into the appropriate asynchronous operation. I did a simple console log of the array; you should fill in your next step there.
Although maybe you need to write this logic with Promises, I would suggest using async/await for this task; it leads to more readable code (see the sketch after the code below).
exports.my_function = (req, res) => {
    var data = req.body;
    var name_array = ["Jeff", "Sophie", "Kristen"];
    promise1.getID(data) // promise1 gets an int 'id'
        .then(id =>
            // map the array of Strings to an array of Promises instead, and wait for them all to resolve
            Promise.all(name_array.map(name =>
                promise2.getPersonalID(name, id)
            ))
        )
        .then(personal_ID_arr => console.log(personal_ID_arr))
        // operate on 'personal_ID_arr' here, once the array is fully filled
}
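For reference, here's a sketch of the same logic with async/await (assuming the same promise1.getID and promise2.getPersonalID as above):

exports.my_function = async (req, res) => {
    var data = req.body;
    var name_array = ["Jeff", "Sophie", "Kristen"];
    const id = await promise1.getID(data); // promise1 gets an int 'id'
    // start all the getPersonalID calls, then wait for them in parallel
    const personal_ID_arr = await Promise.all(name_array.map(name =>
        promise2.getPersonalID(name, id)
    ));
    console.log(personal_ID_arr); // operate on the filled array here
}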

Multiple fetch requests in a row

For a project, I have to generate a PDF that lists the vehicles in a DB. To do this, I use the jsPDF library.
1st step: I get data from a DB (via asynchronous requests to an API) and images from a server, which I store in an array.
2nd step: I call a function that generates the PDF with JSPDF.
The problem is that I need to have all my data retrieved before calling my generatePDF function, otherwise the fields and images are empty because they have not yet been retrieved from the DB or the server.
A solution I found is to use the setTimeout function to put a delay between each call. However, this makes the code very slow and inflexible, because you have to change the timeout manually depending on the amount of data and the number of images to retrieve. Moreover, it is impossible to determine exactly how long it will take to retrieve the data, especially since this can vary with the state of the network, so you have to allow a margin which is often unnecessary.
Another solution is to use callbacks or to interweave fetch / ajax calls with .then / .done calls, but this becomes very complicated when it comes to retrieving the images since they are retrieved one by one and there are more than a hundred of them.
What would be the easiest way to do this in a clean and flexible way?
Thanks for your help and sorry for the long text, I tried to be as clear as possible :)
To do a series of asynchronous things in order, you start the next operation in the fulfillment handler of the previous operation.
An async function is the easiest way:
async function buildPDF() {
    const response = await fetch("/path/to/the/data");
    if (!response.ok) {
        throw new Error(`HTTP error ${response.status}`);
    }
    const data = await response.json(); // Or `.text()` or whatever
    const pdf = await createPDF(data); // Assuming it returns a promise
    // ...
}
If you can't use async functions in your environment and don't want to transpile, you can write your fulfillment handlers as callbacks:
function buildPDF() {
    return fetch("/path/to/the/data")
        .then(response => {
            if (!response.ok) {
                throw new Error(`HTTP error ${response.status}`);
            }
            return response.json(); // Or `.text()` or whatever
        })
        .then(data => createPDF(data))
        .then(pdf => {
            // ...
        });
}
Note that I'm returning the result of the chain, so that the caller can handle errors.
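Since you mention retrieving more than a hundred images one by one, note that you can start all of those requests at once and wait for them together with Promise.all. For instance (a sketch, where imageUrls is a hypothetical array of your image URLs):

async function fetchImages(imageUrls) {
    // Start every request immediately; Promise.all waits for them all in parallel
    return Promise.all(imageUrls.map(async url => {
        const response = await fetch(url);
        if (!response.ok) {
            throw new Error(`HTTP error ${response.status} for ${url}`);
        }
        return response.blob(); // or `.arrayBuffer()`, depending on what jsPDF expects
    }));
}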

Problem with reading data from Cache API (still in promise?)

This is my first post here. I am not well versed in asynchronous code, so I cannot resolve the problem by myself.
In a React/Redux app I have added a cache. The idea behind it is to have something like 'Favorites' functionality, but on the client's computer. So I would like to store some data about books there. While writing to the cache works, I cannot successfully dispatch the data to the store. Now my code looks like this:
export function fetchFromFavorites() {
    return async (dispatch, getState) => {
        const URL = getState().books.currentURL;
        const Cache = await caches.open(URL);
        Cache.matchAll()
            .then(function (response) {
                const ar = [];
                response.forEach(async item => ar.push(await item.json()));
                return ar;
            })
            .then(response => dispatch(test(response)));
    };
}
In the code above, test is an action that only sets a state field with the payload. While the payload can be console-logged from the reducer, I cannot perform any further action on it with another external function that works fine on that kind of data. Besides, DevTools marks it with a blue 'i', which indicates that it has been evaluated only just now. What is wrong with this code? BTW - it has nothing to do with service workers; it is just inside regular React.
The function you are passing to response.forEach is returning a promise. You'd need to wait for all of those promises to resolve before returning ar.
For example, you may use something like:
// Await all of the promises to be fulfilled before proceeding.
// Note that we use response.map instead of forEach, as
// we want to retain a reference to the promise returned
// by the callback. Additionally, we can just return the
// promise returned by `item.json()`.
const ar = await Promise.all(response.map(item => item.json()));
Remember, any function marked as async will return a promise wrapping the function's return type.
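For instance:

async function f() {
    return 42; // callers don't see 42 directly...
}
f().then(x => console.log(x)); // ...they get a promise that fulfills with 42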
Note that you're mixing async/await with older style then/catch promises here. For consistency and ease of reading, you may want to use one style consistently.
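Putting it together, the whole thunk might look like this (a sketch that sticks to async/await throughout, assuming test is the action creator from your code):

export function fetchFromFavorites() {
    return async (dispatch, getState) => {
        const URL = getState().books.currentURL;
        const cache = await caches.open(URL);
        const responses = await cache.matchAll();
        // Parse every cached response body in parallel before dispatching
        const ar = await Promise.all(responses.map(item => item.json()));
        dispatch(test(ar));
    };
}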

Knex bulk insert not waiting for it to finish before passing to the next async operation

I am having a problem where I make a bulk insert of multiple elements into a table and then immediately fetch the last X elements that were just inserted, but when I do, it seems that the elements have not yet been fully inserted, even though I am using async/await to wait for the async operations.
I am making the bulk insert like this:
const createElements = elementsArray => {
    return knex
        .insert(elementsArray)
        .into('elements');
};
Then I have a method to immediately access those X elements that were inserted:
const getLastXInsertedElements = (userId, length, columns = ['*']) => {
    return knex.select(...columns)
        .from('elements')
        .where('userId', userId)
        .orderBy('createdAt', 'desc')
        .limit(length);
}
And finally, after getting those elements, I take their ids and save them into another table that uses the element_id of those recently added elements.
So I have something like:
// A simple helper function that handles promises easily
const handleResponse = (promise, message) => {
    return promise
        .then(data => ([data, undefined]))
        .catch(error => {
            if (message) {
                throw new Error(`${message}: ${error}`);
            } else {
                return Promise.resolve([undefined, `${message}: ${error}`]);
            }
        });
};

async function service() {
    await handleResponse(createElements(list), 'error text'); // insert x elements from the list
    const [elements] = await handleResponse(getLastXInsertedElements(userId, list.length), 'error text'); // get the last x elements that were recently added
    await handleResponse(useElementsIdAsForeignKey(listMakingUseOfElementsIds), 'error text'); // here we use the ids of the elements we got from the last query, but we are not getting them properly for some reason
}
So the problem:
Sometimes when I execute getLastXInsertedElements, it seems that the elements have not yet finished inserting, even though I am waiting for them with async/await. Any ideas why this is? Maybe something related to bulk inserts that I don't know of? An important note: all the elements are always properly inserted into the table at some point; it just seems like this point is not respected by the promise (the async operation that reports success for the knex.insert).
Update 1:
I have tried putting the select after the insert inside a setTimeout of 5 seconds for testing purposes, but the problem seems to persist. That is really weird; one would think 5 seconds is enough between the insert and the select to get all the data.
I would like to have all X elements that were just inserted accessible in the select query from getLastXInsertedElements consistently.
Which DB are you using, and how big is the list of data you are inserting? You could also test whether running the insert and getLastXInsertedElements in a transaction hides your problem.
Doing those operations in a transaction also forces knex to use the same connection for both queries, so it might help track down where this is coming from.
Another trick to force the queries to use the same connection would be to set the pool's min and max configuration to 1 (just for testing whether parallelism is indeed the problem here).
Also, since you have not provided complete reproduction code for this, I suspect there is something else in the mix that causes this odd behavior. Usually (but not always) these kinds of weird cases that shouldn't happen are caused by user error elsewhere in the use of the library.
I'll update the answer if more information is provided. Complete reproduction code would be the most important piece of information.
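For example, a sketch of running both queries in one transaction, so they are guaranteed to share a connection (variable names taken from your code):

await knex.transaction(async trx => {
    await trx.insert(elementsArray).into('elements');
    const lastElements = await trx.select('*')
        .from('elements')
        .where('userId', userId)
        .orderBy('createdAt', 'desc')
        .limit(length);
    // ...use lastElements here...
});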
I am not 100% sure, but I guess the knex functions do not return a promise by default (but rather a builder object for the query). That builder has a function called then that transforms the builder into a promise. So you may try adding a call to that:
...
    .limit(length)
    .then(x => x); // required to transform to a promise
Maybe try debugging the actual type of the returned value. It might turn out that this still is not a promise; in that case you may not be able to use async/await and will need to use the then syntax, because these might not be real JS promises but knex's own implementation.
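For instance, a quick (hypothetical) check you could drop in:

const result = getLastXInsertedElements(userId, list.length);
console.log(result instanceof Promise); // false => not a native promise
console.log(typeof result.then);        // 'function' => it is at least a thenable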
Also see this issue about standard JS promises in knex: https://github.com/knex/knex/issues/1588
In theory, it should work.
You say "it seems"... a clearer problem description could be helpful.
I would argue the problem is that you have elements.length = list.length - n where n > 0; your code gives no details about the userId property in your list, so a possible source of the problem could be that some elements in your list do not have a properly set userId property.

Why does my JavaScript take so long to run?

I'm working with Cloud Functions for Firebase, and I get a timeout with some of my functions. I'm pretty new to JavaScript. It looks like I need to put a for loop inside a promise, and I'm running into some problems. The code actually exits the loop too early, and I think it takes a long time. Do you have some way to improve this code and make it faster?
exports.firebaseFunctions = functions.database.ref("mess/{pushId}").onUpdate(event => {
    // First I get the event val and an object inside Firebase
    const original = event.data.val();
    const users = original.uids; // THESE ARE ALL THE USERS' UIDS!!
    // So first I get all the users' uids and put them inside an array
    let usersUids = [];
    for (let key in users) {
        usersUids.push(users[key]);
    }
    // Now I make a promise to use all these uids to get each device token
    // and save them inside another child in Firebase!!
    return new Promise((resolve) => {
        let userTokens = [];
        usersUids.forEach(element => {
            admin.database().ref('users/' + element).child('token').once('value', snapShot => {
                if (snapShot.val()) { // if the token exists, put it inside the array
                    userTokens.push(snapShot.val());
                }
            })
        })
        resolve({
            userTokens
        })
    }) // now I call then here, to get userTokens and save them in another child inside the Firebase database
        .then((res) => {
            return admin.database().ref("USERS/TOKENS").push({
                userTokens: res,
            })
        })
})
You are making network requests with Firebase, so maybe that's why it's slow. You are making one request per user, so if you have 100 uids there, it may well take a while.
But there's another problem I notice: you are just resolving to an empty list. To wait for several promises, create an array of promises, and then use Promise.all to create a promise that waits for all of them in parallel.
When you call resolve, you have already run the forEach and started every request, but none of the tokens have been added to the list yet. To fix it, change the forEach to a map, collect all the returned promises, and then return Promise.all (see the sketch below).
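For example, a sketch of the fixed return block (same Firebase calls as in your code; note that once('value') returns a promise when you don't pass a callback):

return Promise.all(usersUids.map(uid =>
    admin.database().ref('users/' + uid).child('token').once('value')
)).then(snapShots => {
    // Keep only the tokens that exist
    const userTokens = snapShots
        .map(snapShot => snapShot.val())
        .filter(token => token);
    return admin.database().ref("USERS/TOKENS").push({ userTokens });
});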
