I'm using the IndexedDB Promised library to convert the IndexedDB API to promises.
It looks like by the time my fetch completes, my IndexedDB transaction is no longer active. I'm guessing it is timing out?
The error I get is:
DOMException: Failed to execute 'delete' on 'IDBCursor': The transaction has finished.
What I'm trying to accomplish is to delete the item from IndexedDB if and only if the fetch completes successfully. I understand that I can create a second transaction after the fetch to get the item and remove it. But I'm wondering if there's a better way that avoids a new transaction? Am I missing something?
Can anyone explain to me why I'm seeing this problem?
DBHelper.DBPromised.then(db => {
  const store = db.transaction('deferredReviews', 'readwrite').objectStore('deferredReviews');
  const submittedRes = {};
  store.openCursor()
    .then(function submitAndDelete(cursor) {
      if (!cursor) return;
      console.log(cursor.value);
      fetch(`${DBHelper.DATABASE_URL}/reviews`, {
        method: 'POST',
        body: JSON.stringify({
          restaurant_id: cursor.value.restaurant_id,
          name: cursor.value.name,
          createdAt: cursor.value.deferredAt,
          rating: cursor.value.rating,
          comments: cursor.value.comments
        })
      })
        .then(response => {
          if (!response.ok) {
            throw Error(response.statusText);
          }
          return response.json();
        })
        // If response is ok then delete from indexedDB.
        .then(review => {
          if (!review) return new Error('Could not submit');
          if (cursor.value.restaurant_name in submittedRes) {
            submittedRes[cursor.value.restaurant_name] += 1;
          } else {
            submittedRes[cursor.value.restaurant_name] = 1;
          }
          cursor.delete();
          return cursor.continue().then(submitAndDelete);
        });
    })
    .then(() => {
      if (Object.keys(submittedRes).length === 0 && submittedRes.constructor === Object) return;
      DBHelper.showDeferredSubmitModal(submittedRes);
    });
});
You cannot perform async operations in the middle of IndexedDB operations. An IDBTransaction automatically commits when it detects no pending requests at the end of the current iteration of the JavaScript event loop. An async operation such as fetch introduces a pause, so requests issued after the pause are bound too late: by that time the transaction has already committed and finished. Transactions are intended to be short-lived, because a readwrite transaction can potentially lock up an object store for its entire duration, leading to serious blocking issues.
To work around this, do all of your async work either before or after the transaction, not in the middle of it. Or use two separate transactions, if data integrity is not an issue.
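For illustration, here is a minimal sketch of that shape: read everything in one transaction, do the fetches while no transaction is open, then delete in a fresh transaction. The three helpers are hypothetical stand-ins (injected so the flow is visible and testable); with the Promised wrapper, getAllDeferred would be a read in one transaction and deleteDeferred a delete in a second one.

```javascript
async function syncDeferredReviews({ getAllDeferred, postReview, deleteDeferred }) {
  // Transaction 1 (inside getAllDeferred): read all deferred reviews,
  // then let that transaction commit before any async work starts.
  const deferred = await getAllDeferred();
  const submittedRes = {};
  for (const review of deferred) {
    // Async work happens *between* transactions, so nothing auto-commits early.
    const ok = await postReview(review);
    if (!ok) continue;
    submittedRes[review.restaurant_name] = (submittedRes[review.restaurant_name] || 0) + 1;
    // Transaction 2 (inside deleteDeferred): delete only after a successful POST.
    await deleteDeferred(review.id);
  }
  return submittedRes;
}
```

The trade-off is a second transaction, but since the first one has already committed before the fetch starts, there is nothing left to time out.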
Related
I have the following function that synchronizes some locally cached data with my online database.
In the process, existing values are retrieved from the database, added to the locally cached values, and then sent as an update to the database. Because of this summing process, this procedure must not run multiple times in parallel because it could result in invalid values (adding more than it should).
I implemented a locking mechanism using local storage. It seems to work well because JavaScript is single-threaded and runs all the locking code synchronously, even if this function is called rapidly in succession.
The problem is, how do I also make this thread-safe across browser tabs/windows? The way I see it, multiple windows behave like separate threads, taking away the benefit of JavaScript's single-threading approach.
So 1) Is my assumption about multiple browser windows behaving like multiple threads correct and 2) How do I make my particular code safe across these different threads/processes?
export async function mergeCachedGameStats(supabaseClient: SupabaseClient) {
  const lockKey = "merge_cached_game_stats_lock";
  const lock = localStorage.getItem(lockKey);
  if (lock && (Date.now() - parseInt(lock) > 15_000)) localStorage.removeItem(lockKey);
  if (lock) return;
  localStorage.setItem(lockKey, Date.now().toString());
  try {
    const { data, error } = await supabaseClient?.auth.getSession();
    if (error) throw error;
    const loggedInUser = data.session?.user;
    for (const key of Object.values(gameStatsKeys)) {
      if (!localStorage.getItem(key + "_cached")) continue;
      const gameStats = await loadCombinedGameStats(key, supabaseClient);
      if (loggedInUser) {
        const { error } = await supabaseClient
          .from("user_profiles")
          .update({ [key]: JSON.stringify(gameStats) })
          .eq("id", loggedInUser.id);
        if (error) throw error;
      } else {
        localStorage.setItem(key, JSON.stringify(gameStats));
      }
      localStorage.removeItem(key + "_cached");
    }
  } catch (error) {
    console.error(error);
  } finally {
    localStorage.removeItem(lockKey);
  }
}
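On 1): yes, each tab is a separate JavaScript realm, and a localStorage check-then-set is not atomic across tabs, so two tabs can both pass the `if (lock) return;` check before either writes. For 2), browsers ship a primitive for exactly this: the Web Locks API, which serializes callbacks across all tabs of an origin. A hedged sketch (the lock name and `doMerge` callback are placeholders, and the `locks` parameter exists only so the flow can be exercised outside a browser):

```javascript
async function mergeCachedGameStatsSafely(doMerge, locks = navigator.locks) {
  // ifAvailable: if another tab already holds the lock, skip instead of
  // queueing, matching the "just return" behaviour of the localStorage version.
  return locks.request("merge_cached_game_stats", { ifAvailable: true }, async (lock) => {
    if (!lock) return "skipped"; // another tab is already merging
    await doMerge();             // the browser releases the lock when this resolves
    return "merged";
  });
}
```

Unlike the timestamp scheme, the browser releases the lock automatically even if the tab crashes mid-merge, so no stale-lock expiry logic is needed.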
In my code below I get an empty array from my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once for testing. The API is paged: if the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me to the right direction?
const axios = require("axios");

function getToken() {
  // Get the token
}

function getIds(jwt) {
  return new Promise((resolve) => {
    let pageNumber = 1;
    const filterdIds = [];
    const config = {
      // Config stuff
    };
    do {
      axios(config)
        .then((response) => {
          response.forEach(element => {
            // Some logic, if true then:
            filterdIds.push(element.id);
            console.log(filterdIds);
          });
        })
        .catch(error => {
          console.log(error);
        });
    } while (pageNumber != 1)
    resolve(filterdIds);
  });
}

getToken()
  .then(token => {
    return token;
  })
  .then(jwt => {
    return getIds(jwt);
  })
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
getToken()
.then(token => {
return token;
})
.then(jwt => {
return getIds(jwt);
})
.then(response => {
console.log(response);
})
.catch(error => {
console.log(error);
});
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously before the requests fire, so it's guaranteed to be empty.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests--more on that later). These methods fire the requests in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop

axios
  .get("https://httpbin.org") // some initial request like getToken
  .then(response => // response has the token, ignored for simplicity
    Promise.all(
      Array(pages).fill().map((_, i) => // make an array of request promises
        axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
      )
    )
  )
  .then(responses => {
    // perform your filter/reduce on the response data
    const results = responses.flatMap(response =>
      response.data
        .filter(e => e.id % 2 === 0) // some silly filter
        .map(({id, name}) => ({id, name}))
    );
    // use the results
    console.log(results);
  })
  .catch(err => console.error(err));
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel:
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
  // like getToken; should handle err
  const tokenStub = await axios.get("https://httpbin.org");
  const results = [];
  // page += 10 to make the snippet run faster; you'd probably use page++
  for (let page = 1;; page += 10) {
    try {
      const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
      const response = await axios.get(url);
      // check whatever condition your API sends to tell you no more pages
      if (response.data.length === 0) {
        break;
      }
      for (const comment of response.data) {
        if (comment.id % 2 === 0) { // some silly filter
          const {name, id} = comment;
          results.push({name, id});
        }
      }
    }
    catch (err) { // hit the end of the pages or some other error
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
Here's the sequential request waterfall:
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
  const results = [];
  const chunk = 5;
  for (let page = 1;; page += chunk) {
    try {
      const responses = await Promise.all(
        Array(chunk).fill().map((_, i) =>
          axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
        )
      );
      for (const response of responses) {
        for (const comment of response.data) {
          const {name, id} = comment;
          results.push({name, id});
        }
      }
      // check end condition
      if (responses.some(e => e.data.length === 0)) {
        break;
      }
    }
    catch (err) {
      break;
    }
  }
  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(the above image is an excerpt of the 100 requests, but the chunk size of 5 at a time is visible)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller--don't try to turn it into synchronous code.
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous JavaScript code work together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios function invocation is returning a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code is telling it to do some stuff when that pending value is returned (which is the callback in the .then() method).
But your main getIds function isn't going to wait around... it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and on to the resolve call, which settles the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve method for the promise you're creating into the callback that the axios resolved Promise will invoke, it started working because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
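To make that concrete, here is one way getIds could look with async/await and the wrapper Promise removed (axios already returns one). The page fetcher is injected as a hypothetical `fetchPage(pageNumber)` so the loop itself is easy to see and test; the real version would call axios with the jwt and page number in its config, and the stop condition would be the "records are from yesterday" check:

```javascript
async function getIds(fetchPage) {
  const filteredIds = [];
  let pageNumber = 1;
  let records;
  do {
    // Each page is awaited before deciding whether to fetch the next one,
    // so the function only resolves after all the data has arrived.
    records = await fetchPage(pageNumber);
    for (const element of records) {
      // some logic; if it matches, keep the id
      filteredIds.push(element.id);
    }
    pageNumber++;
    // placeholder stop check: keep going while the page returned records
  } while (records.length > 0);
  return filteredIds;
}
```

Because the function is async, callers get a promise that resolves with the fully populated array, with no need for the explicit `new Promise` wrapper.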
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
  const pagePromises = Array(pages).fill(null).map((_, index) => {
    const page = startingPage + index;
    // set the page however you do it with axios query params
    config.page = page;
    return axios(config);
  });
  // get the last page you attempted, and if it doesn't meet whatever
  // criteria you have to finish the query, submit another batch query
  const lastPage = await pagePromises[pagePromises.length - 1];
  // the result from getIds is an array of ids, so we recursively get the rest of the pages here
  // and have a single level array of ids (or an empty array if there were no more pages to fetch)
  const additionalIds = lastPage.done ? [] : await getIds(startingPage + pages, pages);
  // now we wait for all page queries to resolve and extract the ids
  const resolvedPages = await Promise.all(pagePromises);
  const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);
  // and finally merge the ids fetched in this method's invocation with any fetched recursively
  return [...resolvedIds, ...additionalIds];
}
I'm very new to promises. I need my server to wait for socket connections from an external API and from browser clients. The external API sends a number of objects (4 in this example) to the server, which is received via a promise and calls the function which waits for the promise. For each object received by the promise, a browser client can make a connection (promise) and join the game.
I have a function which should wait for variables to be populated by these two promises. It successfully waits for the external API objects, but it never receives the promise indicating that the correct number of clients have made connections.
I wrapped the socket listener for the external API objects in a promise as it will only be sent once. I also call the function which handles the two promises there, as it didn't seem to work anywhere else.
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        // gameObject.files = imageObj.imagePath;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList); // resolve the promise with the array of objects
      sendData();
    }
    catch (e) {
      console.error(e);
    }
  });
});
I also wrapped the client request listener in a promise because, after countless tries to nest the promise inside, this was the only solution which didn't return the actual socket as the promise value, so I feel this is probably the closest solution for me.
This promise should only resolve when there are the same number of client connections as there are server objects received in the first promise. I am testing by simply connecting from 4 open tabs to localhost:3000.
//HANDLER FOR CLIENT REQUEST TO JOIN GAME
const playerPromise = new Promise((resolve, reject) => {
  socket.on('joinGame', async () => {
    try {
      gameState.totalPlayerCount++;
      gameState.playerList.push(socket.id);
      switch (true) {
        case gameState.totalPlayerCount < gameState.totalServerCount:
          console.log("Not enough players", gameState.totalPlayerCount);
          break;
        case gameState.totalPlayerCount <= gameState.totalServerCount:
          console.log("Counts are equal", gameState);
          readyPlayers = true;
          resolve(gameState.playerList);
          break;
        case gameState.totalServerCount == 0:
          console.log("Server not ready", gameState);
          break;
        default:
          console.log("Too many players", gameState.totalPlayerCount);
          reject("Too many players", gameState.playerList);
      }
    }
    catch (e) {
      console.error(e);
    }
  });
});
sendData() function logs the 1st and 2nd tests to the console, but never the 3rd.
async function sendData() {
  try {
    console.log("TEST1");
    const dataMax = await maxPromise;
    console.log("TEST2", dataMax);
    const dataPlay = await playerPromise;
    console.log("TEST3", dataPlay);
    for (var key in await dataPlay) {
      io.to(dataPlay[key]).emit('gameData', dataPlay[key]);
    }
  }
  catch (e) {
    console.error(e);
  }
}
I've looked at every other similar post on Stack Overflow and online but cannot find any solution to this or where I'm going wrong. I have also devised the above solution with minimal knowledge of socket.io and promises, so if there is a better/cleaner way to do the above, please let me know.
EDIT:
This is my current solution using only one promise, but now the promise is not being populated at all in the send function:
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList);
    }
    catch (e) {
      console.error(e);
    }
  });
});

async function sendData(playerData) {
  try {
    console.log("TEST1");
    const dataMax = await maxPromise;
    console.log("TEST2");
    for (var key in await playerData) {
      io.to(playerData[key]).emit('gameData', dataMax);
    }
  }
  catch (e) {
    console.error(e);
  }
}
sendData() is called in the client socket handler, which just passes the array of connections as playerData. "TEST2" is never logged.
Seeing as the promise maxPromise is global, shouldn't it be able to access its value?
You've probably figured this out by now. It's an interesting problem, and I'd like to know how you solved it. As I understand it, you need to wait for data to arrive in order to select players, which seems like a good use of promises. You could use socket.once instead of socket.on for dictData, since it's a one-time event.
At the same time, you don't want to block players, yet you still need to wait for enough players to join. Awaiting another promise is again a good gating technique.
If you haven't solved all the issues, I suggest first removing socket.io from the equation while developing the gating logic. You can do this in Node with custom event emitters. I'd simulate players and data events occurring at random times. You can also do this in the browser with custom events or broadcast channels. You'll find this more convenient than manually connecting to a port.
I'd also put in a lot of logging with millisecond timestamps to easily understand the sequence of events: when they occur and when they're handled.
Currently, I am trying to get the md5 of every value in an array. Essentially, I loop over every value and then hash it, as shown below.
var crypto = require('crypto');

function userHash(userIDstring) {
  return crypto.createHash('md5').update(userIDstring).digest('hex');
}

for (var userID in watching) {
  refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
    if (snapshot.exists()) {
      const userHashString = userHash(userID)
      console.log(userHashString.toUpperCase() + "this is the hashed string")
      if (userHashString.toUpperCase() === poster) {
        return console.log("this is the poster")
      }
      else {
        ..
      }
    }
    else {
      return null
    }
  }))
}
However, this leads to two problems. The first is that I am receiving the linter warning "Don't make functions within a loop". The second problem is that the hashes all come back the same. Even though every userID is unique, userHashString prints the same value for every user in the console log, as if it just takes the first userID, hashes it, and prints that every time.
Update LATEST :
exports.sendNotificationForPost = functions.firestore
  .document('posts/{posts}').onCreate((snap, context) => {
    const value = snap.data()
    const watching = value.watchedBy
    const poster = value.poster
    const postContentNotification = value.post
    const refPromises = []
    var crypto = require('crypto');

    function userHash(userIDstring) {
      return crypto.createHash('md5').update(userIDstring).digest('hex');
    }

    for (let userID in watching) {
      refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
        if (snapshot.exists()) {
          const userHashString = userHash(userID)
          if (userHashString.toUpperCase() === poster) {
            return null
          }
          else {
            const payload = {
              notification: {
                title: "Someone posted something!",
                body: postContentNotification,
                sound: 'default'
              }
            };
            return admin.messaging().sendToDevice(snapshot.val(), payload)
          }
        }
        else {
          return null
        }
      }))
    }
    return Promise.all(refPromises);
  });
You have a couple issues going on here. First, you have a non-blocking asynchronous operation inside a loop. You need to fully understand what that means. Your loop runs to completion starting a bunch of non-blocking, asynchronous operations. Then, when the loop finished, one by one your asynchronous operations finish. That is why your loop variable userID is sitting on the wrong value. It's on the terminal value when all your async callbacks get called.
You can see a discussion of the loop variable issue here with several options for addressing that:
Asynchronous Process inside a javascript for loop
Second, you also need a way to know when all your asynchronous operations are done. It's kind of like you sent off 20 carrier pigeons with no idea when they will all bring you back some message (in any random order), so you need a way to know when all of them have come back.
To know when all your async operations are done, there are a bunch of different approaches. The "modern design" and the future of the Javascript language would be to use promises to represent your asynchronous operations and to use Promise.all() to track them, keep the results in order, notify you when they are all done and propagate any error that might occur.
Here's a cleaned-up version of your code:
const crypto = require('crypto');

exports.sendNotificationForPost = functions.firestore.document('posts/{posts}').onCreate((snap, context) => {
  const value = snap.data();
  const watching = value.watchedBy;
  const poster = value.poster;
  const postContentNotification = value.post;

  function userHash(userIDstring) {
    return crypto.createHash('md5').update(userIDstring).digest('hex');
  }

  return Promise.all(Object.keys(watching).map(userID => {
    return admin.database().ref('notifications/' + userID).once('value').then(snapshot => {
      if (snapshot.exists()) {
        const userHashString = userHash(userID);
        if (userHashString.toUpperCase() === poster) {
          // user is same as poster, don't send to them
          return {response: null, user: userID, poster: true};
        } else {
          const payload = {
            notification: {
              title: "Someone posted something!",
              body: postContentNotification,
              sound: 'default'
            }
          };
          return admin.messaging().sendToDevice(snapshot.val(), payload).then(response => {
            return {response, user: userID};
          }).catch(err => {
            console.log("err in sendToDevice", err);
            // if you want further processing to stop if there's a sendToDevice error,
            // uncomment the throw err line and remove the lines after it.
            // Otherwise, the error is logged and returned, but then ignored
            // so other processing continues.
            // throw err
            // when the return value is an object with an err property, the caller
            // can see that that particular sendToDevice failed, along with the
            // userID and the error
            return {err, user: userID};
          });
        }
      } else {
        return {response: null, user: userID};
      }
    });
  }));
});
Changes:
Move require() out of the loop. No reason to call it multiple times.
Use .map() to collect the array of promises for Promise.all().
Use Object.keys() to get an array of userIDs from the object keys so we can then use .map() on it.
Use .then() with .once().
Log sendToDevice() error.
Use Promise.all() to track when all the promises are done.
Make sure all promise return paths return an object with some common properties so the caller can get a full look at what happened for each user.
These are not two problems: the warning you get is trying to help you solve the second problem you noticed.
And the problem is this: in JavaScript, `var` declarations are scoped to the enclosing function, not to the loop body, so every function you define inside the loop shares a single `userID` binding rather than getting its own copy. By the time the first promise resolves, that shared binding already holds the last key of the object.
Declare the loop variable with `let`, which creates a fresh binding per iteration (as your updated code does), or replace the `for` loop with `.forEach`, which gives each callback its own parameter.
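The difference is easy to see in isolation:

```javascript
// var: one binding shared by the whole function, so every callback
// sees the value the variable ended up with after the loop finished.
const withVar = [];
for (var i = 0; i < 3; i++) withVar.push(() => i);
console.log(withVar.map(f => f())); // [ 3, 3, 3 ]

// let: a fresh binding per iteration, so each callback keeps its own value.
const withLet = [];
for (let j = 0; j < 3; j++) withLet.push(() => j);
console.log(withLet.map(f => f())); // [ 0, 1, 2 ]
```

The same sharing happens with any deferred callback, whether it's a timer, a promise `.then`, or the database listener in the question.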
The situation is simple:
I have a Node.js server (call it API-A) that uses Bluebird as its Promise library.
A client (browser) asks for data through API-A, which gets the data from another API (API-B). API-B could be a weather service, for example, and API-A aggregates its data with other data before sending it back to the client.
The catch: API-B needs a token with a TTL of 1800 seconds.
So for each request made by the client, I check whether my token has expired.
I have this kind of code:
function getActivities() {
  return this.requestAuthorisation()
    .then(() => {
      // some other code that is not interesting
    });
}
Everything works fine, promises in all their glory.
requestAuthorisation() checks whether the token is still valid and, if not, makes a request to API-B to refresh it.
The problem is here:
Between the moment the token expires and the moment a fresh one arrives, some time passes. If 1000 clients make requests during that window, I will send 1000 token requests to API-B, which is not cool.
How can I avoid that? I want to avoid a cron-style refresh, since it makes unnecessary calls and has the same problem anyway.
I tried a sort of global boolean that tracks the refreshing status of the token, but there is no Promise.waitFor(variableChange) to block on.
Promise.all cannot be used because the callers are in different event scopes.
Is there a way to queue requests until the token is refreshed?
Please help!
If I understand this right, we need to do two things:
Do not call our refreshToken several times while one call is in progress.
Once completed, let all the waiting requests know that the token request has completed so that they can continue their work.
If you combine the observer pattern with a flag for the in-progress state, this can be done like below:
// MyObservable.js:
var util = require('util');
var EventEmitter = require('events').EventEmitter;

let inProgress = false;
let resultPromise;

function MyObservable() {
  EventEmitter.call(this);
}

// This is the function responsible for getting a refresh token
MyObservable.prototype.getToken = function(token) {
  console.log('Inside getToken');
  if (!inProgress) {
    console.log('calling fetchToken');
    resultPromise = this.fetchToken();
    inProgress = true;
    resultPromise.then((result) => {
      console.log('Resolved fetch token');
      inProgress = false;
      this.emit('done', 'token refreshed');
    });
  }
};

// This is a mock function to simulate the promise-based API.
MyObservable.prototype.fetchToken = function(token) {
  console.log('Inside fetchToken');
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log('resolving');
      resolve("Completed");
    }, 2000);
  });
};

util.inherits(MyObservable, EventEmitter);
module.exports = MyObservable;
Now we can use this and listen for the call to complete:
const MyObservable = require('./MyObservable');
const observable = new MyObservable();

const A = () => {
  console.log('Inside A');
  observable.on('done', (message) => {
    console.log('Completed A');
  });
  observable.getToken('test');
};

for (let i = 0; i < 5; i++) {
  setTimeout(A, 1000);
}
If we run this code, the output shows that fetchToken is called only once, even though our method A is called 5 times during the same duration.
Hope this helps!
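An alternative worth noting: instead of an emitter plus a flag, cache the in-flight promise itself, so every concurrent caller awaits the same refresh. This is a hedged sketch; `fetchToken` is a stand-in for the real API-B call, and a production version would also expire the cache after the 1800-second TTL:

```javascript
let tokenPromise = null; // the one in-flight (or completed) refresh

function requestAuthorisation(fetchToken) {
  if (!tokenPromise) {
    tokenPromise = fetchToken().catch((err) => {
      tokenPromise = null; // forget failures so the next caller retries
      throw err;
    });
  }
  return tokenPromise; // concurrent callers all receive the same promise
}
```

The first caller triggers the fetch; the other 999 simply receive the same pending promise, so API-B sees exactly one request per refresh, and no explicit queue or "waitFor" primitive is needed.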