Can two browser windows cause race conditions in synchronous JavaScript code?

I have the following function that synchronizes some locally cached data with my online database.
In the process, existing values are retrieved from the database, added to the locally cached values, and then sent as an update to the database. Because of this summing process, this procedure must not run multiple times in parallel because it could result in invalid values (adding more than it should).
I implemented a locking mechanism using local storage. It seems to work well because JavaScript is single-threaded and runs all the locking code synchronously, even if this function is called rapidly in succession.
The problem is, how do I also make this thread-safe across browser tabs/windows? The way I see it, multiple windows behave like separate threads, taking away the benefit of JavaScript's single-threading approach.
So: 1) is my assumption about multiple browser windows behaving like multiple threads correct, and 2) how do I make my particular code safe across these different threads/processes?
export async function mergeCachedGameStats(supabaseClient: SupabaseClient) {
  const lockKey = "merge_cached_game_stats_lock";
  const lock = localStorage.getItem(lockKey);
  // Clear a stale lock (older than 15 seconds) left behind by a crashed tab.
  if (lock && Date.now() - parseInt(lock, 10) > 15_000) localStorage.removeItem(lockKey);
  // Re-read the key so a just-cleared stale lock doesn't still block us.
  if (localStorage.getItem(lockKey)) return;
  localStorage.setItem(lockKey, Date.now().toString());

  try {
    const { data, error } = await supabaseClient.auth.getSession();
    if (error) throw error;
    const loggedInUser = data.session?.user;

    for (const key of Object.values(gameStatsKeys)) {
      if (!localStorage.getItem(key + "_cached")) continue;
      const gameStats = await loadCombinedGameStats(key, supabaseClient);

      if (loggedInUser) {
        const { error } = await supabaseClient
          .from("user_profiles")
          .update({ [key]: JSON.stringify(gameStats) })
          .eq("id", loggedInUser.id);
        if (error) throw error;
      } else {
        localStorage.setItem(key, JSON.stringify(gameStats));
      }

      localStorage.removeItem(key + "_cached");
    }
  } catch (error) {
    console.error(error);
  } finally {
    localStorage.removeItem(lockKey);
  }
}
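On the cross-tab part: each tab runs its own event loop, so two tabs can interleave between the localStorage read and write above, and both can acquire the lock. For same-origin tabs, browsers that support the Web Locks API can serialize the critical section without any expiry bookkeeping. A minimal sketch, not from the original post (doMerge is a hypothetical helper holding the merge body above):

export async function mergeCachedGameStatsLocked(supabaseClient: SupabaseClient) {
  // navigator.locks.request queues or skips callers across ALL tabs of this origin.
  await navigator.locks.request(
    "merge_cached_game_stats_lock",
    { ifAvailable: true }, // skip instead of queueing if another tab holds the lock
    async (lock) => {
      if (!lock) return; // another tab is already merging
      await doMerge(supabaseClient); // hypothetical helper containing the body above
    }
  );
}

The browser releases the lock automatically when the callback's promise settles, even if the tab closes, which removes the need for the 15-second staleness heuristic.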


Is this the correct way to write a non-blocking while loop using try-catch, setTimeout and recursion?

I have an endpoint on my server named "/order".
When it gets triggered, I need to send an order over to an API. Sometimes, because some data over IPFS takes too long to download, the order fails within the try-catch block, and I need to resend it.
Since a while loop that retries the order until it succeeds would block other requests, I thought I could build one myself using recursion and setTimeout: try to send the same request, say, 3 times every 5 minutes, and if it fails the third time, don't try again.
I wanted to know if this was the right way to achieve the non-blocking functionality and if there are some vulnerabilities/things I'm not taking into account:
async function makeOrder(body, tries) {
  if (tries > 0) {
    let order;
    try {
      order = getOrderTemplate(body.shipping_address)
      for (const index in body.line_items) {
        order.items.push(await getItemTemplate(body.line_items[index]))
      }
      await sendOrderToAPI(order)
    } catch (err) {
      setTimeout(function() {
        makeOrder(body, tries - 1)
      }, 180000)
    }
  } else {
    console.log("order n " + body.order_number + " failed")
    // write failed order number to database to deal with it later on
  }
}
One significant problem is that, if an attempt fails, the function returns a Promise that resolves when the initial (failed) API call finishes, not when the final retry finishes. This:
setTimeout(function() {
  makeOrder(body, tries - 1)
}, 180000)
is not chained properly with the async function outside.
As a result, the following logic will fail:
makeOrder(body, 3)
  .then(() => {
    // All orders are made
  })
Promisify it instead, so that the recursive call can be chained with the outer function easily.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function makeOrder(body, tries) {
  if (tries > 0) {
    let order;
    try {
      order = getOrderTemplate(body.shipping_address)
      for (const index in body.line_items) {
        order.items.push(await getItemTemplate(body.line_items[index]))
      }
      await sendOrderToAPI(order)
    } catch (err) {
      await delay(180_000);
      return makeOrder(body, tries - 1);
    }
  } else {
    console.log("order n " + body.order_number + " failed")
    // write failed order number to database to deal with it later on
  }
}
Another possible improvement: instead of awaiting inside a loop here:
for (const index in body.line_items) {
  order.items.push(await getItemTemplate(body.line_items[index]))
}
use Promise.all, or a different mechanism that fetches multiple templates at once, instead of waiting for each one in serial.
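For example, a sketch (assuming the getItemTemplate calls are independent of one another; Promise.all preserves input order):

// Fetch all item templates concurrently instead of one at a time.
order.items = await Promise.all(
  body.line_items.map(item => getItemTemplate(item))
);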
Another potential issue is that makeOrder does not reject when it exceeds the number of tries allowed. That is, the consumer wouldn't be able to do:
makeOrder(body, 3)
  .catch((error) => {
    // implement logic here to handle the error
  })
If you wanted to permit the above, at the very end of makeOrder, throw at the same time you're logging:
} else {
  console.log("order n " + body.order_number + " failed")
  // write failed order number to database to deal with it later on
  throw new Error('Failed');
}

Multiple arguments in Gio.Subprocess

I'm developing my first gnome-shell-extension currently. In the extension, I want to execute a simple shell command and use the output afterwards, for which I use Gio.Subprocess like it is used in this wiki: https://wiki.gnome.org/AndyHolmes/Sandbox/SpawningProcesses
Currently, I have a command like this one with some parameters: "ProgramXYZ -a -bc", which I pass as the argument vector argv as ['ProgramXYZ','-a','-bc']. This case works fine.
So let's say I would like to call two programs and combine the output with your approach, like: "ProgramXYZ -a -bc && ProgramB". The combined command works in a normal terminal, but I'm not sure how to pass it to Gio.Subprocess. Something like ['ProgramXYZ','-a','-bc','&&','ProgramB'] does not work. Is there a way to achieve that, or do I have to make two separate calls?
Sorry, I haven't managed to finish that page (that's why it's in my sandbox 😉).
Here is our Promise wrapper for running a subprocess:
function execCommand(argv, input = null, cancellable = null) {
  let flags = Gio.SubprocessFlags.STDOUT_PIPE;

  if (input !== null)
    flags |= Gio.SubprocessFlags.STDIN_PIPE;

  let proc = new Gio.Subprocess({
    argv: argv,
    flags: flags
  });
  proc.init(cancellable);

  return new Promise((resolve, reject) => {
    proc.communicate_utf8_async(input, cancellable, (proc, res) => {
      try {
        resolve(proc.communicate_utf8_finish(res)[1]);
      } catch (e) {
        reject(e);
      }
    });
  });
}
Now you have two reasonable choices, since you have a nice wrapper.
I would prefer this option myself, because if I'm launching sequential processes I probably want to know which failed, what the error was and so on. I really wouldn't worry about extra overhead, since the second process only executes if the first succeeds, at which point the first will have been garbage collected.
async function dualCall() {
  try {
    let stdout1 = await execCommand(['ProgramXYZ', '-a', '-bc']);
    let stdout2 = await execCommand(['ProgramB']);
  } catch (e) {
    logError(e);
  }
}
On the other hand, there is nothing preventing you from launching a sub-shell if you really want to do shell stuff. Ultimately you're just offloading the same behaviour to a shell, though:
async function shellCall() {
  try {
    let stdout = await execCommand([
      '/bin/sh',
      '-c',
      'ProgramXYZ -a -bc && ProgramB'
    ]);
  } catch (e) {
    logError(e);
  }
}

Using promises and async with socket.io doesn't return expected results

I'm very new to promises. I need my server to wait for socket connections from an external API and from browser clients. The external API sends a number of objects (4 in this example) to the server; this is received as a promise, which calls the function that waits on both promises. For each object received, a browser client can make a connection (another promise) and join the game.
I have a function which should wait for variables to be populated by these two promises. It successfully waits for the external API objects, but it never receives the promise indicating that the correct number of clients have made connections.
I wrapped the socket listener for the external API objects in a promise, as the data will only be sent once. I also call the function which handles the two promises here, as it didn't seem to work anywhere else.
// HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        // gameObject.files = imageObj.imagePath;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList) // resolve the promise with the array of objects
      sendData()
    }
    catch (e) {
      console.error(e)
    }
  });
});
I also wrapped the client request listener in a promise because, after countless tries to nest the promise inside, this was the only solution which didn't return the actual socket as the promise, so I feel this is probably the closest solution for me.
This promise should only resolve when there are as many client connections as there are server objects received in the first promise. I am testing by simply connecting from 4 open tabs to localhost:3000.
// HANDLER FOR CLIENT REQUEST TO JOIN GAME
const playerPromise = new Promise((resolve, reject) => {
  socket.on('joinGame', async () => {
    try {
      gameState.totalPlayerCount++;
      gameState.playerList.push(socket.id)
      switch (true) {
        case gameState.totalPlayerCount < gameState.totalServerCount:
          console.log("Not enough players", gameState.totalPlayerCount)
          break;
        case gameState.totalPlayerCount <= gameState.totalServerCount:
          console.log("Counts are equal", gameState)
          readyPlayers = true;
          resolve(gameState.playerList)
          break;
        case gameState.totalServerCount == 0:
          console.log("Server not ready", gameState)
          break;
        default:
          console.log("Too many players", gameState.totalPlayerCount)
          reject("Too many players", gameState.playerList)
      }
    }
    catch (e) {
      console.error(e);
    }
  })
})
The sendData() function logs the 1st and 2nd tests to the console, but never the 3rd.
async function sendData() {
  try {
    console.log("TEST1")
    const dataMax = await maxPromise;
    console.log("TEST2", dataMax)
    const dataPlay = await playerPromise;
    console.log("TEST3", dataPlay)
    for (var key in await dataPlay) {
      io.to(dataPlay[key]).emit('gameData', dataPlay[key]);
    }
  }
  catch (e) {
    console.error(e)
  }
};
I've looked at every other similar post on stackoverflow and online but cannot find any solution to this or where I'm going wrong. I have also devised the above solution with minimal knowledge of socket.io and promises, so is there is a better/cleaner way to do the above please let me know.
EDIT:
This is my current solution using only one promise, but now the promise is not being populated at all in the send function:
// HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList)
    }
    catch (e) {
      console.error(e)
    }
  });
});
async function sendData(playerData) {
  try {
    console.log("TEST1")
    const dataMax = await maxPromise;
    console.log("TEST2")
    for (var key in await playerData) {
      io.to(playerData[key]).emit('gameData', dataMax);
    }
  }
  catch (e) {
    console.error(e)
  }
};
sendData() is called in the client socket handler, which simply passes the array of connections as playerData. "TEST2" is never logged.
Seeing as the promise maxPromise is global, shouldn't it be able to access its value?
You've probably figured this out by now; it's an interesting problem, and I'd like to know how you solved it. As I understand it, you need to wait for data to arrive in order to select players. That seems like a good use of promises. You could use socket.once instead of socket.on for dictData, since it's a one-time event.
At the same time, you don't want to block players, yet you still need to wait for enough players to join. Awaiting another promise is again a good gating technique.
If you haven't solved all the issues, I suggest first removing socket.io from the equation while developing the gating logic. You can do this in Node with custom event emitters, simulating players and data events occurring at random times; you can also do it in the browser with custom events or broadcast channels. You'll find this more convenient than manually connecting to a port.
I'd also put in a lot of logging with millisecond timestamps, to easily understand the sequence of events: when they occur and when they're handled.
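A minimal sketch of that simulation (assumptions: plain Node with no socket.io; the event names and counts are invented for illustration):

const { EventEmitter } = require('events');
const bus = new EventEmitter();

// Two gates, mirroring maxPromise and playerPromise.
const dataReady = new Promise(resolve => bus.once('dictData', resolve));
const playersReady = new Promise(resolve => bus.once('enoughPlayers', resolve));

async function sendData() {
  console.log(Date.now(), 'waiting on both gates');
  const [serverList, playerList] = await Promise.all([dataReady, playersReady]);
  console.log(Date.now(), 'both gates open:', serverList.length, 'objects,', playerList.length, 'players');
}
sendData();

// Simulate the two events arriving at random times, in either order.
setTimeout(() => bus.emit('dictData', [{}, {}, {}, {}]), Math.random() * 2000);
setTimeout(() => bus.emit('enoughPlayers', ['p1', 'p2', 'p3', 'p4']), Math.random() * 2000);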

Async operations inside indexeddb cursor

I'm using the indexedDB Promised library to convert the indexedDB API to promises.
It looks like by the time my fetch completes, my indexedDB transaction is no longer active. I'm guessing it is timing out?
The error I get is:
DOMException: Failed to execute 'delete' on 'IDBCursor': The transaction has finished.
What I'm trying to accomplish is to delete the item from indexedDB if and only if the fetch completes successfully. I understand that I can create a second transaction after the fetch to get the item and remove it. But I'm wondering if there's a better way that avoids a new transaction? Am I missing something?
Can anyone explain to me why I'm seeing this problem?
DBHelper.DBPromised.then(db => {
  const store = db.transaction('deferredReviews', 'readwrite').objectStore('deferredReviews');
  const submittedRes = {};
  store.openCursor()
    .then(function submitAndDelete(cursor) {
      if (!cursor) return;
      console.log(cursor.value);
      fetch(`${DBHelper.DATABASE_URL}/reviews`, {
        method: 'POST',
        body: JSON.stringify({
          restaurant_id: cursor.value.restaurant_id,
          name: cursor.value.name,
          createdAt: cursor.value.deferredAt,
          rating: cursor.value.rating,
          comments: cursor.value.comments
        })
      })
      .then(response => {
        if (!response.ok) {
          throw Error(response.statusText);
        }
        return response.json();
      })
      // If response is ok then delete from indexedDB.
      .then(review => {
        if (!review) return new Error('Could not submit');
        if (cursor.value.restaurant_name in submittedRes) {
          submittedRes[cursor.value.restaurant_name] = submittedRes[cursor.value.restaurant_name] + 1;
        } else {
          submittedRes[cursor.value.restaurant_name] = 1;
        }
        cursor.delete();
        return cursor.continue().then(submitAndDelete);
      });
    })
    .then(() => {
      if (Object.keys(submittedRes).length === 0 && submittedRes.constructor === Object) return;
      DBHelper.showDeferredSubmitModal(submittedRes);
    });
});
You cannot do async operations in the middle of indexedDB operations. An IDBTransaction automatically commits and finishes if it has no pending requests when control returns to the JavaScript event loop. An async operation introduces such a pause, so requests issued after the pause bind too late: by that time the transaction has already ended. Transactions are intended to be short-lived, because a readwrite transaction can lock up an object store for its entire duration, leading to serious blocking issues.
To work around this, do all of your async work either before or after the transaction, not in the middle of it. Or use two separate transactions, if data integrity is not an issue.
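A minimal sketch of the two-transaction approach (assumptions: the same idb Promised wrapper, where a transaction exposes a .complete promise; records keyed by an id property; store and endpoint names taken from the question):

DBHelper.DBPromised.then(async db => {
  // Transaction 1: read everything, then let the transaction auto-commit.
  const reviews = await db.transaction('deferredReviews')
    .objectStore('deferredReviews')
    .getAll();

  // Do all async work with no transaction open.
  const submittedIds = [];
  for (const review of reviews) {
    const response = await fetch(`${DBHelper.DATABASE_URL}/reviews`, {
      method: 'POST',
      body: JSON.stringify(review)
    });
    if (response.ok) submittedIds.push(review.id);
  }

  // Transaction 2: delete only the records that were submitted successfully.
  const tx = db.transaction('deferredReviews', 'readwrite');
  const store = tx.objectStore('deferredReviews');
  submittedIds.forEach(id => store.delete(id));
  return tx.complete;
});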

Node.js for loop using previous values?

Currently, I am trying to get the MD5 of every value in an array. Essentially, I loop over every value and hash it, like so.
var crypto = require('crypto');

function userHash(userIDstring) {
  return crypto.createHash('md5').update(userIDstring).digest('hex');
}

for (var userID in watching) {
  refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
    if (snapshot.exists()) {
      const userHashString = userHash(userID)
      console.log(userHashString.toUpperCase() + "this is the hashed string")
      if (userHashString.toUpperCase() === poster) {
        return console.log("this is the poster")
      }
      else {
        ..
      }
    }
    else {
      return null
    }
  }));
}
However, this leads to two problems. The first is that I am receiving the linter warning "Don't make functions within a loop". The second problem is that the hashes all come out the same: even though every userID is unique, userHashString prints the same value for every user in the console log, as if it just takes the first userID, hashes it, and prints that every time.
Update (latest):
exports.sendNotificationForPost = functions.firestore
  .document('posts/{posts}').onCreate((snap, context) => {
    const value = snap.data()
    const watching = value.watchedBy
    const poster = value.poster
    const postContentNotification = value.post
    const refPromises = []

    var crypto = require('crypto');
    function userHash(userIDstring) {
      return crypto.createHash('md5').update(userIDstring).digest('hex');
    }

    for (let userID in watching) {
      refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
        if (snapshot.exists()) {
          const userHashString = userHash(userID)
          if (userHashString.toUpperCase() === poster) {
            return null
          }
          else {
            const payload = {
              notification: {
                title: "Someone posted something!",
                body: postContentNotification,
                sound: 'default'
              }
            };
            return admin.messaging().sendToDevice(snapshot.val(), payload)
          }
        }
        else {
          return null
        }
      }));
    }
    return Promise.all(refPromises);
  });
You have a couple of issues going on here. First, you have a non-blocking asynchronous operation inside a loop. You need to fully understand what that means: your loop runs to completion, starting a bunch of non-blocking, asynchronous operations, and then, after the loop has finished, your asynchronous operations complete one by one. That is why your loop variable userID is sitting on the wrong value: it is on its terminal value by the time your async callbacks get called.
You can see a discussion of the loop variable issue here with several options for addressing that:
Asynchronous Process inside a javascript for loop
Second, you also need a way to know when all your asynchronous operations are done. It's kind of like you sent off 20 carrier pigeons with no idea when they will all bring you back some message (in any random order), so you need a way to know when all of them have come back.
To know when all your async operations are done, there are a bunch of different approaches. The "modern design" and the future of the Javascript language would be to use promises to represent your asynchronous operations and to use Promise.all() to track them, keep the results in order, notify you when they are all done and propagate any error that might occur.
Here's a cleaned-up version of your code:
const crypto = require('crypto');

exports.sendNotificationForPost = functions.firestore.document('posts/{posts}').onCreate((snap, context) => {
  const value = snap.data();
  const watching = value.watchedBy;
  const poster = value.poster;
  const postContentNotification = value.post;

  function userHash(userIDstring) {
    return crypto.createHash('md5').update(userIDstring).digest('hex');
  }

  return Promise.all(Object.keys(watching).map(userID => {
    return admin.database().ref('notifications/' + userID).once('value').then(snapshot => {
      if (snapshot.exists()) {
        const userHashString = userHash(userID);
        if (userHashString.toUpperCase() === poster) {
          // user is same as poster, don't send to them
          return {response: null, user: userID, poster: true};
        } else {
          const payload = {
            notification: {
              title: "Someone posted something!",
              body: postContentNotification,
              sound: 'default'
            }
          };
          return admin.messaging().sendToDevice(snapshot.val(), payload).then(response => {
            return {response, user: userID};
          }).catch(err => {
            console.log("err in sendToDevice", err);
            // if you want further processing to stop if there's a sendToDevice error, then
            // uncomment the throw err line and remove the lines after it.
            // Otherwise, the error is logged and returned, but then ignored
            // so other processing continues
            // throw err
            // when return value is an object with err property, caller can see
            // that that particular sendToDevice failed, can see the userID and the error
            return {err, user: userID};
          });
        }
      } else {
        return {response: null, user: userID};
      }
    });
  }));
});
Changes:
Move require() out of the loop. No reason to call it multiple times.
Use .map() to collect the array of promises for Promise.all().
Use Object.keys() to get an array of userIDs from the object keys so we can then use .map() on it.
Use .then() with .once().
Log sendToDevice() error.
Use Promise.all() to track when all the promises are done.
Make sure all promise return paths return an object with some common properties, so the caller gets a full picture of what happened for each user.
These are not two problems: the warning you get is trying to help you solve the second problem you noticed.
And the problem is this: in JavaScript, var declarations are scoped to functions, not blocks, so every function you define inside the loop shares the same scope. That means the callbacks don't get their own copies of the loop variable; they share a single reference, which, by the time the first promise resolves, will be equal to the last key of the object.
Just replace the for loop with .forEach, as sketched below.
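A sketch of that change on the original loop (assuming watching is an object keyed by user ID, as in the question):

// Each callback invocation gets its own userID binding, so the async
// snapshot callback sees the value it was created with.
Object.keys(watching).forEach(userID => {
  refPromises.push(admin.database().ref('notifications/' + userID).once('value', (snapshot) => {
    // ... same body as before; userID is now stable in here
  }));
});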
