How to use localStorage in a to-do list project - JavaScript

Is this the correct way to store projects and tasks with localStorage? I also need to read from localStorage every time the page refreshes. How do I do that?
export function newProject(name) {
  allProjects.push({
    projectTitle: name,
    id: crypto.randomUUID(),
    tasks: []
  })
  getProjectId(name)
  save(name, project)
}

export function save(title, task) {
  localStorage.setItem(title, JSON.stringify(task))
}

project is undefined, so you need to define it first:
const project = {
  projectTitle: name,
  id: crypto.randomUUID(),
  tasks: []
}
allProjects.push(project)
getProjectId(name)
save(name, project)
To get all projects back on refresh, you need to either maintain an array of keys in localStorage or save all projects as a single array under one key (a minimal sketch of the single-key option follows below).
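For the single-key option, a sketch could look like this; the key name "todoProjects" and the function names are only examples, not part of your existing code:

const STORAGE_KEY = "todoProjects";

export function saveAllProjects(projects) {
  // overwrite the whole array under one key after every change
  localStorage.setItem(STORAGE_KEY, JSON.stringify(projects));
}

export function loadAllProjects() {
  // returns [] on the very first visit, when nothing has been stored yet
  return JSON.parse(localStorage.getItem(STORAGE_KEY)) ?? [];
}

Call loadAllProjects() once on page load to rehydrate allProjects, and saveAllProjects(allProjects) whenever a project or task changes.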
EDIT:
The question is not entirely well described, but I did my best. Add exports to the functions you need and use your getProjectId if needed. I used an approach with a separate array of IDs to maintain the list of projects.
function createNewProject(name) {
  // create and return the project object
  return {
    title: name,
    id: crypto.randomUUID(),
    tasks: []
  };
}

function saveProject(storageKey, projectObject) {
  // get the current list of project keys, or create a new list
  const allProjectKeys = JSON.parse(localStorage.getItem("allProjectKeys")) ?? [];
  // add the new key to the list
  allProjectKeys.push(storageKey);
  // save the updated list of project keys
  localStorage.setItem("allProjectKeys", JSON.stringify(allProjectKeys));
  // save the project data itself
  localStorage.setItem(storageKey, JSON.stringify(projectObject));
}

function getProjectByKey(storageKey) {
  // get a single project by its key
  return JSON.parse(localStorage.getItem(storageKey));
}

function getAllProjects() {
  // get the list of all project keys and map each key to its stored project
  return JSON.parse(localStorage.getItem("allProjectKeys")).map(getProjectByKey);
}

const testProject = createNewProject("test");
saveProject(testProject.id, testProject);
console.log(getProjectByKey(testProject.id));
console.log(getAllProjects());
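One caveat with the snippet above: on a completely fresh browser, localStorage.getItem("allProjectKeys") returns null, so getAllProjects() would throw when it calls .map. A small guard on page load handles the refresh case; the DOMContentLoaded listener here is just one way to wire it up:

// restore everything that was saved, falling back to an empty list on first visit
document.addEventListener("DOMContentLoaded", () => {
  const keys = JSON.parse(localStorage.getItem("allProjectKeys")) ?? [];
  const allProjects = keys.map(getProjectByKey);
  console.log(allProjects); // render your project list from here
});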

Apollo fetchMore updates data globally

I have two TaskList components that use the same query GET_TASKS.
Both use a different filter query variable which is passed down to them in props as queryVars.
I defined a standard merge function in type policies to merge the incoming and existing data together.
The TaskList component uses
const { data, fetchMore } = useQuery<Response, Variables>(GET_TASKS, { variables: queryVars })
to retrieve the data.
A Fetch more button has () => fetchMore({ variables: queryVars }) in the onClick attribute.
When I click the Fetch more button on the left, the tasks on the right get updated as well, but without their filter applied, so the data that comes back for the Assigned to me filter is also put into the Assigned by me task list and vice versa.
The merge function basically rewrites every data object that uses the given query.
How do I tell Apollo to only update the data that is bound to the component where fetchMore is defined?
You should be able to add filter to the keyArgs array. This creates separate cache entries per filter value.
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        tasks: {
          keyArgs: ["filter"],
          merge(existing, incoming, { args: { offset = 0 } }) {
            // custom merge
          },
        },
      },
    },
  },
});
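For completeness, one possible merge for simple offset-based pagination; this is only a sketch, assuming tasks resolves to a plain array and you paginate with an offset variable:

import { InMemoryCache } from "@apollo/client";

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        tasks: {
          // separate cache entries per filter value
          keyArgs: ["filter"],
          merge(existing = [], incoming, { args }) {
            const offset = args?.offset ?? 0;
            // copy the existing items and overlay the incoming page at its offset
            const merged = existing.slice(0);
            for (let i = 0; i < incoming.length; i++) {
              merged[offset + i] = incoming[i];
            }
            return merged;
          },
        },
      },
    },
  },
});

With keyArgs: ["filter"], each filter value gets its own list in the cache, so fetchMore on the left list no longer touches the right one.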

Realtime database update or create functionality

I have a user object in Firebase's Realtime Database. I am querying the Realtime Database and, when the query succeeds, I want to write some new data to the user object under the users node.
My desired outcome: when a user has not fetched the information before, a new field called 'lastViewed' is created under the user object; if the field already exists, we update the timeViewed key's value. A user can have multiple objects in the array, corresponding to the uuid of the fetched data.
Please see the user object below
This may not need to be an array if using .push()
-N0X8VLHTw3xgvD2vJs- : { // this is the user's unique key
  name: 'myName',
  lastViewed: [
    {
      timeViewed: 1651558791, // this is the field to update if it exists
      datasUniqueKey: 'N17ZmwIsbqaVSGh93Q0' // if this value exists, update timeViewed, else create the entry
    },
    {
      timeViewed: 1651558952,
      datasUniqueKey: 'N17ZmwIsbqaVSad3gad'
    },
  ]
}
Please see my attempt below.
const getData = database()
  .ref(`data/${uniqueKeyFromData}`)
  .on('value', snapshot => {
    if (snapshot.exists()) {
      database()
        .ref(`users/${currentFirebaseUserKey}/lastViewed`) // currentFirebaseUserKey = N0X8VLHTw3xgvD2vJs
        .once('value', childSnapshot => {
          if (childSnapshot.exists()) {
            // update
            database()
              .ref(`users/${currentFirebaseUserKey}/lastViewed`)
              .update({
                timeViewed: new Date(), // new Date() will not give the date format shown in the user object above, but don't worry about that
                fetchedDatasUniqueKey: uniqueKeyFromData,
              });
          } else {
            // create
            database()
              .ref(`users/${currentFirebaseUserKey}/lastViewed`)
              // push creates a unique key, which might not be required, so maybe set?
              .push({
                timeViewed: new Date(),
                fetchedDatasUniqueKey: uniqueKeyFromData,
              });
          }
        });
    }
  });
Where I think I am going wrong
Above I am not creating an array; if I use push I would get a unique key generated by Firebase, but I would then have to use that key when updating, something like
`users/${currentFirebaseUserKey}/lastViewed/${lastViewedUniqueKey}`
So the user object would look like this:
-N0X8VLHTw3xgvD2vJs- : { // this is the user's unique key
  name: 'myName',
  lastViewed: {
    -N17i2X2-rKYXywbJGmQ: { // this is lastViewedUniqueKey
      timeViewed: 1651558791,
      datasUniqueKey: 'N17ZmwIsbqaVSGh93Q0'
    },
  }
}
and then check for snapshot.key in the if? Any help would be appreciated.
Since you don't want a list of data, but a single set of properties for lastViewed, you should just call set instead of push:
database()
  .ref(`users/${currentFirebaseUserKey}/lastViewed`)
  .set({ // 👈
    timeViewed: new Date(),
    fetchedDatasUniqueKey: uniqueKeyFromData,
  });
I also don't think you need the two different cases here for create vs update, as both seem to do the exact same thing. If you do need both cases though, consider using a transaction.
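If you do end up needing read-then-write behaviour, a transaction sketch could look like this; it uses the same ref as above and a plain millisecond timestamp instead of new Date():

database()
  .ref(`users/${currentFirebaseUserKey}/lastViewed`)
  .transaction(current => {
    // current is null when the node doesn't exist yet (the "create" case)
    return {
      timeViewed: Date.now(), // millisecond timestamp
      fetchedDatasUniqueKey: uniqueKeyFromData,
    };
  });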

How do you create edgeless graphql element in Gatsby?

The title may be misleading, but I'm not really sure how to ask this question correctly. Here is the problem: I'd like to query my own API (not created yet, so I made placeholder data) for global settings that might change in the future, so that I only need to rebuild the website instead of editing it manually. I want to create a source node called CmsSettings and pass it to GraphQL (a structure similar to site.siteMetadata), but I don't know how to achieve that. What I have achieved so far is a source node called allCmsSettings, which has my data as an object in a nodes array.
exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
  const { createNode } = actions;

  const myData = {
    key: 123,
    app_title: `The foo field of my node`,
    ...
  }

  const nodeContent = JSON.stringify(myData);

  const nodeMeta = {
    id: createNodeId(`my-data${myData.key}`),
    parent: null,
    children: [],
    internal: {
      type: `CmsSettings`,
      mediaType: `text/html`,
      content: nodeContent,
      contentDigest: createContentDigest(myData)
    }
  }

  const node = Object.assign({}, myData, nodeMeta);
  createNode(node);
}
Here is the query used to get the data of the source node
allCmsSettings {
  edges {
    node {
      id
      app_title
      ...
    }
  }
}
Querying it returns an array of results (which I know is a consequence of how source nodes are created), but I'd like to create the source so that I can query it like this:
CmsSettings {
  app_title
  app_keywords
  app_descriptions
  app_logo_path
  brand_name
  ...
}
You get the point. I have been browsing the Gatsby Node APIs, but I can't find how to achieve this.
Thank you for your help.
Never mind, the answer is pretty simple. If you are new to Gatsby just like me: the sourceNodes export creates two GraphQL fields for you, one with an all prefix (allCmsSettings) and one with the camel-cased node type name. The thing I wanted to make is already there and is queryable with
cmsSettings {
  app_title
  app_keywords
  app_descriptions
  app_logo_path
  brand_name
  ...
}
Notice the lowercase first letter, even though the type was declared as CmsSettings. It seems that Gatsby really does some magic under the hood.
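For reference, this is roughly how that node could be consumed in a component with useStaticQuery; this is only a sketch, the SiteHeader component name is made up, and only app_title from the placeholder data is queried:

import React from "react";
import { useStaticQuery, graphql } from "gatsby";

const SiteHeader = () => {
  // cmsSettings is the singular field Gatsby generates for the CmsSettings node type
  const { cmsSettings } = useStaticQuery(graphql`
    query {
      cmsSettings {
        app_title
      }
    }
  `);
  return <h1>{cmsSettings.app_title}</h1>;
};

export default SiteHeader;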

firebase $add() .push() .set()

I am using Firebase and AngularFire.
There are so many ways to do CRUD with the Firebase API. I still don't get what the specific difference is between using
$add with $firebaseArray
the .push() method
the .set() method
I think they are technically the same. I prefer to use the .set() method without knowing the exact reason why. Is there any specific reason not to use it? What exactly does $firebaseArray do, if we could just declare a basic reference variable?
In this case:
var usersRef = Ref.child('users');

$scope.createUser = function() {
  $scope.userRef.child($id).set({
    name: name
  });
};
or
$scope.data = $firebaseArray(Ref.child('users'));

$scope.createUser = function() {
  $scope.data.child($id).$add({
    name: name
  });
};
Thank you.
If I have the following data tree in Firebase:
{
  users: {
    key: { name: "bob" }
  }
}
When I do an $add, I will create a new item in the tree
$scope.data.child('users').$add({
  name: name
});
Since $add uses the push method in Firebase, a new random key will be generated when pushing data to the child.
{
  users: {
    key: { name: "bob" },
    key2: { name: "name" }
  }
}
If I do a set on the same Users object, I will overwrite the data that is already there. So, in your example, without specifying a key, you will overwrite the entire user object.
$scope.userRef.child('users').set({
  name: name
});
This will result in this data:
{
  users: {
    name: "name"
  }
}
This happens because set() replaces whatever data is at that location; likewise, any null values you pass to the set method will delete the data that was originally there.
Passing null to set() will remove the data at the specified location.
https://www.firebase.com/docs/web/api/firebase/set.html
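To make the difference concrete, a small illustrative sketch using a plain reference like the usersRef from the question (the child key 'someUserId' and the example output keys are made up):

var usersRef = Ref.child('users');

// push / $add: generates a new unique key under users/, existing children are kept
usersRef.push({ name: 'bob' }); // e.g. users/<generated-key>/name = 'bob'

// set on a specific child: replaces only that child
usersRef.child('someUserId').set({ name: 'bob' });

// set on users/ itself: replaces the entire users node with exactly this object
usersRef.set({ name: 'bob' }); // every other user is gone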

Migrate simple List Array key to another key with an extra attribute in MongoDB

Sorry if I'm not getting the terminology right. Here's what I currently have in my MongoDB user docs (db.users):
"liked" : [
  "EBMKgrD4DjZxkxvfY",
  "WJzAEF5EKB5aaHWC7",
  "beNdpXhYLnKygD3yd",
  "RHP3hngma9bhXJQ2g",
  "vN7uZ2d6FSfzYJLmm",
  "NaqAsFmMmnhqNbqbG",
  "EqWEY3qkeJYQscuZJ",
  "6wsrFW5pFdnQfoWMs",
  "W4NmGXyha8kpnJ2bD",
  "8x5NWZiwGq5NWDRZX",
  "Qu8CSXveQxdYbyoTa",
  "yLLccTvcnZ3D3phAs",
  "Kk36iXMHwxXNmgufj",
  "dRzdeFAK28aKg3gEX",
  "27etCj4zbrKhFWzGS",
  "Hk2YpqgwRM4QCgsLv",
  "BJwYWumwkc8XhMMYn",
  "5CeN95hYZNK5uzR9o"
],
And I am trying to migrate them to a new key that also captures the time that the user liked the post:
"liked_time" : [
  {
    "postId" : "5CeN95hYZNK5uzR9o",
    "likedAt" : ISODate("2015-09-23T08:05:51.957Z")
  }
],
I am wondering whether it is possible to do this directly in the MongoDB shell, with a command that iterates over each user doc, then iterates over the liked array and $pushes the new postId and time.
Or would it be better to do this in JavaScript? I am using Meteor.
I almost got it working for individual users, but I want to know if I can do all users at once.
var user = Meteor.users.findOne({ username: "atestuser" });
var userLiked = user.liked;

userLiked.forEach(function(entry) {
  Meteor.users.update(
    { username: "atestuser" },
    { $push: { liked_times: { postId: entry, likedAt: new Date() } } }
  );
  console.log(entry);
});
Still a bit of a newbie to MongoDB, obviously...
Here is something I made real quick. You should run this on the server side: just put it into a file, e.g. "migrate.js", in the Meteor root and run the Meteor app.
if (Meteor.isServer) {
  Meteor.startup(function () {
    var users = Meteor.users.find().fetch();

    users.forEach(function (doc) {
      // iterate this user's liked post IDs (doc.liked, not a bare liked)
      doc.liked.forEach(function (postId) {
        Meteor.users.update(doc._id, { $push: { liked_times: { postId: postId, likedAt: new Date() } } });
      });
    });

    console.log('finished migrating');
  });
}
P.S. I didn't test it.
If this is a one-time migration, I would do something like this in a one-time JS script:
Get all users
Iterate over each user
Get all likes
Iterate over them, get likedAt
var liked_times = _.collect(likes, function (likeId) {
  return {
    'postId': likeId,
    'likedAt': // get post liked time from like id
  };
});
Insert the above in the collection of choice.
Note:
The above example makes use of lodash
I would rather just save likedAt as a timestamp.
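Since the question also asked whether this could be done straight from the MongoDB shell, here is a hedged sketch of a one-off shell script; the collection and field names come from the question, and likedAt is simply the migration time because the original like time was never stored:

// one-off migration, run once in the mongo shell against the Meteor database
db.users.find({ liked: { $exists: true } }).forEach(function (user) {
  // build the new array of { postId, likedAt } objects from the existing liked IDs
  var likedTimes = user.liked.map(function (postId) {
    return { postId: postId, likedAt: new Date() };
  });
  db.users.update({ _id: user._id }, { $set: { liked_times: likedTimes } });
});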
