How to update an IndexedDB item with autoincrement key? [duplicate] - javascript

This question already has an answer here:
indexeddb put not update a value, it creates a new (id)
(1 answer)
Closed 2 years ago.
I created an object store with autoIncrement: db.createObjectStore("items", {autoIncrement:true});
Now I want to be able to update an item given its key and a new value, so I wrote this function:
let updateItem = (key, newData) => {
  let objectStore = db.transaction(["items"], "readwrite").objectStore("items");
  let request = objectStore.get(key);
  request.onsuccess = (e) => {
    let data = e.target.result;
    Object.assign(data, newData);
    let requestUpdate = objectStore.put(data);
  };
};
However, instead of updating the value, it creates a new item with the new data. I think it makes sense since e.target.result does not contain any information about its key. So how do I update an element in such an object store?

You need to add a key as a second parameter, like objectStore.put(data, key).
key
The primary key of the record you want to update (e.g. from IDBCursor.primaryKey). This is only needed for object stores that have an autoIncrement primary key, therefore the key is not in a field on the record object. In such cases, calling put(item) will always insert a new record, because it doesn't know what existing record you might want to modify.
-- IDBObjectStore.put() - Web APIs | MDN
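Applied to the function from the question, a minimal sketch (assuming db is the same open connection used above and a record for key already exists):
let updateItem = (key, newData) => {
  let objectStore = db.transaction(["items"], "readwrite").objectStore("items");
  let request = objectStore.get(key);
  request.onsuccess = (e) => {
    let data = e.target.result;
    Object.assign(data, newData);
    // Passing the key explicitly makes put() overwrite the existing record
    // instead of inserting a new one under a freshly generated key.
    objectStore.put(data, key);
  };
};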

I found another solution using cursor.update():
let updateItem = (key, newData) => {
  let objectStore = db.transaction("items", "readwrite").objectStore("items");
  objectStore.openCursor().onsuccess = (e) => {
    let cursor = e.target.result;
    if (cursor) {
      if (cursor.key === key) {
        cursor.update(Object.assign(cursor.value, newData));
      } else {
        // Only keep iterating while we haven't found the matching key.
        cursor.continue();
      }
    }
  };
};
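If the store grows large, a variation on this (a sketch, assuming the same db handle) is to open the cursor directly on the key so only the matching record is visited:
let updateItem = (key, newData) => {
  let objectStore = db.transaction("items", "readwrite").objectStore("items");
  // Restrict the cursor to the single record whose primary key equals `key`.
  objectStore.openCursor(IDBKeyRange.only(key)).onsuccess = (e) => {
    let cursor = e.target.result;
    if (cursor) {
      cursor.update(Object.assign(cursor.value, newData));
    }
  };
};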

Related

IndexedDB synchronism issue when creating parallel stores in the same db

Whenever I try to create different stores in the same database at the same time, only one of them is created. Is there a way to resolve this synchronism issue?
I've already been able to solve this issue, but I'll clarify what I meant in case it helps someone else. I have a GridView component which is mapped multiple times. This component saves its columns, how they are arranged, and their specific behaviors to IndexedDB. The issue I had was that I used a function that created a DB named after the page, with one store per GridView inside the same DB. In order to create all the stores at the same time (in this case I had to create 9 of them), I had to trigger a version change for each new store to be able to persist the information. Inside the function I looked up the DB's current version and added 1 to trigger the version change event.
The problem was that, because every call read the version synchronously inside this function, all of the iterations got the same version, so only the first store was created: only the first iteration of the map triggered a version change. To resolve this I used the index inside the map function and passed it as an order prop to my GridView component. Then, instead of triggering the version change with version + 1 inside the function, I triggered it with version + order; this way all the stores were created, because it guarantees that every requested version is higher than the previous one.
I'll give some code to maybe help the explanation.
This is the map:
{status.map((status, index) => {
  return (
    <GridView
      pageName={"Page"} // DB Name
      gridName={status} // Store Name
      order={index + 1} // index + 1 so that the order is never 0
      // all the other props...
    />
  );
})}
Inside the GridView component I have a function that runs on first render to look for the store inside the DB and, if there is no information in it, creates and then fills the store with the information needed. This is the function:
/**
 * @param {String} DB_NAME
 * @param {String} STORE_NAME
 * @param {String} keyPath
 * @param {Int} order
 * @param {Object|Array} info
 * @returns {Object} resolve: {success, message, storeInfo}, reject: {error, message}
 */
const createAndPopulateStore = (
  DB_NAME,
  STORE_NAME,
  keyPath,
  order = 1,
  info
) => {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(DB_NAME);
    request.onsuccess = function (e) {
      let database = e.target.result;
      let version = parseInt(database.version);
      database.close();
      // This was the critical part in order to create multiple stores at the same time.
      let secondRequest = indexedDB.open(DB_NAME, version + order);
      secondRequest.onupgradeneeded = (e) => {
        let database = e.target.result;
        // Early return if the store already exists.
        if (database.objectStoreNames.contains(STORE_NAME)) {
          reject({
            success: false,
            message: `There is already a store named: ${STORE_NAME} created in the database: ${DB_NAME}`,
          });
          return;
        }
        let objectStore = database.createObjectStore(STORE_NAME, { keyPath });
        if (info) {
          // Populates the store within the db named after the gridName prop with the indexedColumns array.
          if (Array.isArray(info)) {
            info.map((item) => objectStore.put(item));
          } else {
            Object.entries(info).map((item) => objectStore.put(item));
          }
        }
      };
      secondRequest.onsuccess = function (e) {
        resolve({
          success: true,
          message: `Store: ${STORE_NAME}, created successfully.`,
          storeInfo: info ?? {},
        });
        let database = e.target.result;
        database.close();
      };
    };
  });
};
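For context, a hypothetical call from inside the GridView's first-render effect might look like the following (the "field" keyPath and the columnsConfig object are placeholder names, not part of the original code):
// Names are illustrative only; pageName, gridName and order are the props shown above.
createAndPopulateStore(pageName, gridName, "field", order, columnsConfig)
  .then((result) => console.log(result.message))
  .catch((err) => console.log(err.message));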
I hope this will help! Feel free to ask any questions regarding this issue and I'll try to answer them as soon as I can.

INDEXEDDB update/put not returning updated object in the expected state

I have an IndexedDB database that I've created with two object stores: 'games' and 'plays' (in reference to football). I am letting IDB create the keys for each store via 'autoincrement'. The 'games' store can have multiple games and, likewise, there will be multiple plays for each game. Later, I export these stores via JSON to PHP and attempt to correlate the plays that took place in game 1 (for example) to that game, and so on. I am using a 'foreign key'-like value (a gameID attribute) in the plays store to indicate that the play goes with a certain game. However, upon JSON export of the two stores, I have found that the 'games' store does not have its key value exported, and therefore I cannot reliably connect a play (which has a reference to 'gameID') to a particular game (which does not contain the reference within its structure).
So, I thought the answer would be simple: create a value called 'gameID' within the 'games' store and, once I have that id, update the record in the store with the gameID value.
The problem is that I've written IDB 'update' or 'put' code which seems to be 'successful', yet when I go to get the game in question later, the value is not correct. I'm finding that my updates are not updating the data structures as I would expect to see them in Chrome Developer Tools. Below is an example of what I am talking about:
Object in question in Chrome Developer tools
Above you can see the issue graphically, and I'm not sure what is happening. You'll see that in the areas marked "A" and "C" the updated values are listed (I do a similar update later to mark a game 'complete' at the end of a game). However, the actual data structure in IDB (indicated with "B") shows the old values that I "thought" I'd updated successfully. So I'm not at all sure how to read this structure in Chrome Developer Tools, which seems to report the updates separately from the object itself.
I've tried doing this update both by passing the gameID in question and via a cursor.
function putGameID (conn, thisGameID) {
  return new Promise((resolve, reject) => {
    const tx = conn.transaction(['gamesList'], 'readwrite');
    const gameStore = tx.objectStore(['gamesList']);
    const gameRequest = gameStore.get(thisGameID);
    gameRequest.onsuccess = () => {
      const game = gameRequest.result;
      game.gameID = thisGameID;
      console.log({game});
      const updateGameRequest = gameStore.put(game);
      updateGameRequest.onsuccess = () => {
        console.log("Successfully updated this game ID.");
        resolve(updateGameRequest.onsuccess);
      }
etc....
It appears the record was updated, just not in the manner I would expect.
I've also attempted this using a cursor update to similar effect:
function putGameID (conn, thisGameID) {
  return new Promise((resolve, reject) => {
    const tx = conn.transaction(['gamesList'], 'readwrite');
    const gameStore = tx.objectStore(['gamesList']);
    gameStore.openCursor().onsuccess = function(event) {
      const cursor = event.target.result;
      if (cursor) {
        if (!cursor.value.gameID) {
          const updatedGame = cursor.value;
          updatedGame.gameID = thisGameID;
          const request = cursor.update(updatedGame);
          cursor.continue;
        }
      }
    }
etc.....
Can someone help me to understand:
(1) how to read the structure in the CDT? Why are the updated values not part of the object's structure?
and ...
(2) how can I modify my approach to get the results that I wish to achieve?
As requested, this is the code that originally creates the two object stores; it is called upon entry into the form:
async function idbConnect(name, version) {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(DBName, DBVersion);
    request.onupgradeneeded = function(event) {
      //if (!request.objectStoreNames.contains('gamesList')) {
      console.log('Doing indexeddb upgrade');
      db = request.result;
      /* Create the two stores - plays and games. */
      playObjectStore = db.createObjectStore('playsList', {keyPath: "id", autoIncrement: true});
      gameObjectStore = db.createObjectStore('gamesList', {keyPath: "gameID", autoIncrement: true});
      /* Create indexes */
      playObjectStore.createIndex("playIDIdx", "id", {unique: false});
      playObjectStore.createIndex("gamePlayIDIdx", "gameID", {unique: false});
      playObjectStore.createIndex("playCreatedDateIdx", "createdDate", {unique: false});
      gameObjectStore.createIndex("gameIDIdx", "gameID", {unique: true});
      gameObjectStore.createIndex("gameCreatedDateIdx", "createdDate", {unique: false});
      //return db;
      //}
    }
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
    request.onblocked = () => { console.log('blocked'); };
  });
}
This code makes the call to add the game:
try {
  conn = await idbConnect(DBName, DBVersion);
  game = await addGameIDB(conn);
  // Understand what is going on in the line below.
  // Saving the game ID to populate.
  globalGameID = game.gameID;
  // Here is where i'm attempting to update the gameID....
  await putGameID(conn, globalGameID);
  console.log({globalGameID});
Once the stores are created, the following code adds a game:
function addGameIDB(conn) {
  return new Promise((resolve, reject) => {
    // some irrelevant stuff to format dates, etc....
    let newGame = [
      {
        gameID: null, // What i'd like to populate....
        gameDate: thisGameDate,
        gameTime: thisGameTime,
        team1Name: thisTeamOne,
        team2Name: thisTeamTwo,
        gameCompleted: false,
        createdDate: d
      }
    ];
    db = conn.transaction('gamesList', 'readwrite');
    let gameStore = db.objectStore('gamesList');
    let gameRequest = gameStore.add(newGame);
    gameRequest.onsuccess = (ev) => {
      console.log('Successfully inserted an game object');
      const newGameRequest = gameStore.get(gameRequest.result);
      newGameRequest.onsuccess = () => {
        resolve(newGameRequest.result);
      }
    };
    gameRequest.onerror = (err) => {
      console.log('error attempting to insert game object' + err);
      reject(gameRequest.error);
    }
  });
}

Deep copy of the Object to add a key : value

I am pre-fetching a product from a database using mongoose with next.js and react-query. I was wondering why I need to do a deep copy of a nested object in order to add a key:value to it. Otherwise it does not work. Let me know what I am not understanding.
await queryClient.prefetchQuery(['productSlug', slug], async () => {
  const product = await read(slug);
  const existingRatingObject = product.ratings.find(
    (item) => item.postedBy.toString() === user._id.toString()
  );
  const copyProduct = JSON.parse(JSON.stringify(product));
  if (existingRatingObject) {
    copyProduct.star = existingRatingObject.star;
  } else {
    copyProduct.star = 0;
  }
  console.log({ copyProduct });
  return JSON.stringify(copyProduct);
});
The reason is that the fetched product is a Mongoose document, not a plain old JavaScript object.
Once you convert it to a plain JavaScript object, you will be able to add any key to it.
You can add .lean() to your query, or call toObject()/toJSON() on the fetched document.
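For example, assuming read(slug) wraps an ordinary Mongoose query on a Product model (the model name is assumed here), either option avoids the JSON deep copy:
// Option 1: make the query return a plain object directly.
const read = (slug) => Product.findOne({ slug }).lean();

// Option 2: convert the fetched document before adding the key.
const product = await read(slug);
const plainProduct = product.toObject(); // or product.toJSON()
plainProduct.star = existingRatingObject ? existingRatingObject.star : 0;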

Edit a set of objects using localStorage

Not sure where to start with this one... I'm creating a basic todo app, using localStorage. (Specifically, I am not using a backend database.)
So far, I can update and display the objects I have stored locally, and I am displaying them on my page.
form.addEventListener('submit', function(e) {
  e.preventDefault();
  // Set object
  let data = {
    name: nameInput.value,
    url: urlInput.value
  };
  bookMarksArray.push(data);
  console.log("Added bookmark #" + data);
  // Saving
  localStorage.setItem("bookMarksArray", JSON.stringify(bookMarksArray));
});
However, I also want to be able to edit each item in my DOM. Once edited, I want the specific object it corresponds to, to be updated in localStorage.
How do I do this?
I'm not sure where to start. Here's a CodePen of my code so far:
https://codepen.io/ReenaVerma1981/pen/LYEPbjL
E.g.
- if I want to update a URL value to www.google.co.uk
- and have this updated, in the correct object, in localStorage
Here's some pseudo-code; is this a good approach?
// List each object as an individual form in DOM
// So I can easily update the input.value, (with a new value)
// The **edit** button, would be a submit button
// Or there's an eventHandler on this button
// Which onClick, takes the input.value, (either name.value or url.value)
// Identifies whether these values, match values in the localStorage object
// And if so, get the object index
// Then update these object values, based on the object index?
// Update localStorage via localStorage.setItem
Here's some example code I'm writing to try to do this:
// UPDATE/EDIT EXISTING
const list = document.querySelector('.url-list');
list.addEventListener('click', event => {
  if (event.target.classList.contains('js-edit-url')) {
    console.log('edit');
    const editName = event.target.parentElement.name.value;
    const editURL = event.target.parentElement.url.value;
    let data = {
      name: editName,
      url: editURL
    };
    Object.keys(bookMarksArray).map(function (old_key, index) {
      // console.log('old_key', old_key);
      let new_key = data;
      console.log('data', data);
      if (old_key !== new_key) {
        Object.defineProperty(bookMarksArray, new_key,
          Object.getOwnPropertyDescriptor(bookMarksArray, old_key));
        // console.log('bookMarksArray', bookMarksArray);
        // localStorage.setItem("bookMarksArray", JSON.stringify(bookMarksArray));
        delete bookMarksArray[old_key];
      }
    });
  }
});
I figured it out!
I found a really good example here, using the ES6 way, without mutating original data:
// UPDATE/EDIT EXISTING
const list = document.querySelector('.url-list');
list.addEventListener('click', event => {
  if (event.target.classList.contains('js-edit-url')) {
    console.log('edit');
    const editName = event.target.parentElement.name.value;
    const editURL = event.target.parentElement.url.value;
    // Find the object index, by looping through and matching name.value
    const objIndex = bookMarksArray.findIndex(obj => obj.name === editName);
    // make new object of updated object.
    const updatedObj = { ...bookMarksArray[objIndex], url: editURL };
    // make final new array of objects by combining updated object.
    const updatedProjects = [
      ...bookMarksArray.slice(0, objIndex),
      updatedObj,
      ...bookMarksArray.slice(objIndex + 1),
    ];
    localStorage.setItem("bookMarksArray", JSON.stringify(updatedProjects));
  }
});

Return all data back in new node in Firebase Cloud Functions

This is my inputted data:
This is my function:
exports.updateAll = functions.database.ref('/update/{uid}/{values}').onCreate(event => {
  const data = event.data.val()
  console.log(data)
  return db.ref(`somewhereElse/somepath/`).update(data)
})
This is the error I am receiving:
Is it possible to update all the created values back into another path? I thought creating a const with the event.data.val() would work, but then I get the error.
Your reference for the onCreate trigger is /update/{uid}/{values} and so the event data will be that of the values. For example, the event data for /update/{uid}/email would be a string, and this is not an object containing children.
If you have a reason for needing to trigger this way, there is a solution to the error you are getting.
functions.database.ref('/update/{uid}/{values}').onCreate(event => {
  const key = event.params.values
  const value = event.data.val()
  console.log(key, value)
  return db.ref(`somewhereElse/somepath/${key}`).set(value)
})
The key and the value that were created can be retrieved from the event params and the event data. Then we can interpolate the key into the somewhereElse ref and set the value instead of updating it.
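If triggering once per child value isn't actually required, a sketch of the alternative (using the same legacy event-based API as above) is to listen on the parent path, so the whole object arrives in one event and update() behaves as originally intended:
exports.updateAll = functions.database.ref('/update/{uid}').onCreate(event => {
  // event.data now holds the entire object written under /update/{uid}.
  const data = event.data.val()
  return db.ref(`somewhereElse/somepath/`).update(data)
})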
