Firebase Function With Converter Wipes Document Reference on Set() - javascript

I have a scheduled function that resets an integer value back to zero in my Firestore database. The problem I'm running into is that, while the merge set succeeds for the specified properties, it somehow resets my Organization document reference to null.
So far I've tried the following:
Using the Update() function (instead of Set()) without a converter. While this works, it is untyped, and I have to get rid of the converter, which encapsulates the moment()-to-Date conversion.
Using Set() and simply passing the entire object:
user.reference?.withConverter(userConverter).set(user)
This also works, but it overwrites the entire user object and can lead to concurrency issues if a user updates their own object while the scheduled function is running.
I'm looking for a solution that allows me to use the converter class along with a merge Set().
The User interface looks like this
export interface User extends Document {
  email?: string
  name?: string
  organization?: Organization | null
  numberOfForwards?: number
  lastForwardReset?: moment.Moment
}
with its converter like so
export class UserConverter implements firestore.FirestoreDataConverter<User> {
  toFirestore(user: User): firestore.DocumentData {
    return {
      email: user.email,
      name: user.name,
      organization: user.organization ? user.organization.reference : null,
      number_of_forwards: user.numberOfForwards,
      last_forward_reset: user.lastForwardReset?.toDate()
    }
  }
  fromFirestore(snapshot: firestore.QueryDocumentSnapshot): User {
    const data = snapshot.data()!
    return {
      reference: snapshot.ref,
      email: data.email,
      name: data.name,
      organization: data.organization ? { reference: data.organization } : null,
      numberOfForwards: data.number_of_forwards,
      lastForwardReset: moment(data.last_forward_reset.toDate())
    }
  }
}
export const resetNumberOfForwards = functions.pubsub
  .schedule('every 15 minutes')
  .onRun(async () => {
    const reset = (user: User) => {
      console.log(`Resetting ${user.email} from [${user.numberOfForwards}] to [0]`)
      // Claim user reference
      user.reference
        ?.withConverter(userConverter)
        .set({ numberOfForwards: 0, lastForwardReset: moment() }, { merge: true })
    }
For the partial set to work, I've included the following snippet at the top of my file:
firebase.firestore().settings({
  ignoreUndefinedProperties: true
})

I think there are two issues going on here. For a partial set() you should use the merge option or else it will overwrite the document.
ref.set(data, {merge: true})
In addition, in your toFirestore method, either set the organization field to undefined and let the ignoreUndefinedProperties: true setting remove it, or don't include it at all when organization was not given. Something like this:
toFirestore({ numberOfForwards, lastForwardReset, organization, ...user }: User): firestore.DocumentData {
  const data: firestore.DocumentData = {
    ...user,
    number_of_forwards: numberOfForwards,
    last_forward_reset: lastForwardReset?.toDate()
  }
  // only include organization when it was actually given
  if (organization) {
    data.organization = organization.reference
  }
  return data
}
I pulled the numberOfForwards, lastForwardReset, and organization fields out of the user object here and used the spread operator to copy the remaining fields into the return value, but you could also build a temporary object, modify it, and return that.
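With that change, the partial merge set from the question should leave the organization field untouched, because toFirestore no longer emits an organization key when it isn't present. A minimal sketch (assuming userConverter and user.reference as in the question):
// only these two fields end up in the written data, so { merge: true } leaves everything else alone
user.reference
  ?.withConverter(userConverter)
  .set({ numberOfForwards: 0, lastForwardReset: moment() }, { merge: true })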
PS: I know this is old, but it came up in my search so thought I might add an answer still.

Related

Prisma/React Query Dependent undefined type challenges

I would like to take the output of one query (a tRPC query on Prisma) and use it as the dependent input to a subsequent query.
I followed the dependent queries documentation for React Query, but I'm running into type errors because the return of the first query may be undefined (e.g. product is possibly 'undefined'):
const { data: product } = api.product.getUnique.useQuery({ id: pid });
const options = api.option.getAll.useQuery(
  {
    product: product.productSize,
    region: product.productRegion,
  },
  { enabled: !!product }
);
Does the inclusion of enabled not already handle this? If not, what is the correct way to adapt this for TypeScript?
Just coercing the product value to a boolean returns true for any truthy value (e.g. if product were equal to {} it would still evaluate to true), which means product won't necessarily have the productSize or productRegion properties. I would change it first to:
{ enabled: !!product && !!product.productSize && !!product.productRegion }
If that doesn't fix the TypeScript error, you as the developer know for sure that the values are actually there, so you can use the as keyword in TypeScript to tell it that the type is what you expect it to be.
(In this example I assumed the values are strings, but you can change it to number or whatever their true type is.)
const options = api.option.getAll.useQuery(
  {
    product: product?.productSize as string,
    region: product?.productRegion as string,
  },
  { enabled: !!product && !!product.productSize && !!product.productRegion }
);
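If you'd rather avoid the as casts entirely, another option (a sketch, assuming both fields are strings) is to pass a harmless fallback value and rely on enabled to keep the query from ever running with it:
const options = api.option.getAll.useQuery(
  {
    // the fallbacks are never actually sent, because enabled stays false until both fields exist
    product: product?.productSize ?? "",
    region: product?.productRegion ?? "",
  },
  { enabled: !!product?.productSize && !!product?.productRegion }
);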

Is using GraphQL input for every mutation a problem?

I am developing an application that has quite a sizeable number of queries and mutations. The data structures are often not complex, but there are plenty of them, so I have made myself a snippet that generates the most common things repeated throughout them. This snippet also generates an input for mutations, so it can be used for both simple and complex data structures. In quite a few instances the input is just for adding a name. The API is meant to be used mainly by my frontend, but once the app gets mature enough it should be publicly available. Is doing this a problem in terms of conventions?
Sample of what I mean
/*=============================================
Types
=============================================*/
interface AddSampleSchemaInput {
  input: AddSampleSchema
}
interface AddSampleSchema {
  name: string
}
/*=============================================
Main
=============================================*/
export const SampleSchemaModule = {
  typeDefs: gql`
    type Mutation {
      addSampleSchema(input: AddSampleSchemaInput): SampleSchema!
    }
    type SampleSchema {
      _id: ID!
      name: String!
    }
    input AddSampleSchemaInput {
      name: String!
    }
  `,
  resolvers: {
    Mutation: {
      addSampleSchema: async (parents: any, args: AddSampleSchemaInput, context: GraphqlContext) => {
      }
    }
  }
}
Sample of what I assume it should be.
/*=============================================
Main
=============================================*/
export const SampleSchemaModule = {
  typeDefs: gql`
    type Mutation {
      addSampleSchema(name: String): SampleSchema!
    }
    type SampleSchema {
      _id: ID!
      name: String!
    }
  `,
  resolvers: {
    Mutation: {
      addSampleSchema: async (parents: any, args: { name: string }, context: GraphqlContext) => {
      }
    }
  }
}
export default SampleSchemaModule
Would usage of the first code example be a problem? That is, using an input (input AddSampleSchemaInput) even if it contains just a single value (in this case name).
Or, in other words, is using an input for every mutation a problem, no matter the complexity?
Or consider the impact on the frontend:
addDogBreed({
  variables: {
    input: {
      name: "Retriever",
      averageHeight: 0.65
    }
  }
})
addDog({
  variables: {
    input: {
      name: "Charlie"
    }
  }
})
// ======= VS =======
addDogBreed({
  variables: {
    input: {
      name: "Retriever",
      averageHeight: 0.65
    }
  }
})
addDog({
  variables: {
    name: "Charlie"
  }
})
In this case, is having the first one instead of the second one a problem?
Is an input that only contains one key something problematic?
No, on the contrary, it is something desirable in GraphQL. While the nesting may sometimes seem superfluous, it is key to the forward compatibility and extensibility of your schema. You should not have different conventions for designing your mutation arguments depending on the number of inputs. If you always use an input object, you can easily deprecate existing fields or add new optional fields and stay compatible with all existing clients. If you were to completely change the shape of the mutation arguments just because you have an object with a single key, it would break compatibility.
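For example, here is a hypothetical sketch of how a single-key input can grow without breaking clients (AddDogInput, Dog, nickname, and breedId are illustrative names, not from the question):
const typeDefs = gql`
  input AddDogInput {
    name: String!
    # fields added later are optional, so existing clients that only send { name } keep working
    nickname: String
    breedId: ID
  }
  type Dog {
    _id: ID!
    name: String!
  }
  type Mutation {
    addDog(input: AddDogInput!): Dog!
  }
`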
I'm not seeing a problem that would drive you to
"only use GraphQL when dealing with Fetching / Get Data, and normal REST API Request for mutating data (create, update, delete)."
like #Bergi said. Plus, you can provide your entity with multiple mutators, some of which can work like a PATCH or a PUT request.

Is it possible to store a user id as the key of a field in a Firestore document?

So I saw this in the "Get to know Firestore" YouTube series from the official Firebase channel, where they used a userId as the key of a field. However, I can't seem to recreate this in my project using "firebase": "^9.6.6" and Angular 13.0.4.
private async addMemberToGroupDetail(groupId: string) {
  const groupRef = doc(this.firestore, 'groupDetail', groupId);
  const userId = this.authService.userId;
  updateDoc(groupRef, {
    roles: {
      `${userId}`: 'member',
    },
  });
}
Error: Property assignment expected.
Give this syntax a shot:
updateDoc(groupRef, {
  roles: {
    [`${userId}`]: 'member',
  },
});
You might just need those square brackets, since you're assigning the key dynamically (a computed property name).
As #Frank added in the comments, if you don't need to convert to a string, you can just do:
[userId]: 'member'
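Note that passing roles as a nested object to updateDoc replaces the entire roles map. If the goal is to add a single member without touching existing entries, a dot-notation field path should do it (a minimal sketch based on the same groupRef and userId):
// updates only roles.<userId>, leaving other entries of the roles map intact
updateDoc(groupRef, {
  [`roles.${userId}`]: 'member',
});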

What is the best way to keep track of changes of a document's property in MongoDB?

I would like to know how to keep track of the values of a document in MongoDB.
It's a MongoDB Database with a Node and Express backend.
Say I have a document, which is part of the Patients collection.
{
  "_id": "4k2lK49938d82kL",
  "firstName": "John",
  "objective": "Burn fat"
}
Then I edit the "objective" property, so the document ends up like this:
{
  "_id": "4k2lK49938d82kL",
  "firstName": "John",
  "objective": "Gain muscle"
}
What's the best/most efficient way to keep track of that change? In other words, I would like to know that the "objective" property had the value "Burn fat" in the past, and access it in the future.
Thanks a lot!
Maintaining/tracking history in the same document is not at all recommended, as the document size will keep increasing, leading to:
possibly hitting the 16 MB document size limit if there are too many updates
degraded performance
Instead, you should maintain a separate collection for history. You might have used Javers or Hibernate Envers for auditing your relational databases; if not, you can check how they work. A separate table (xyz_AUD) is maintained for each table (xyz). For each row (with primary key abc) in the xyz table, there exist multiple rows in the xyz_AUD table, where each row is a version of that row.
Moreover, Javers also supports MongoDB auditing. If you are using Java you can use it directly; there is no need to write your own logic.
Refer - https://nullbeans.com/auditing-using-spring-boot-mongodb-and-javers/
One more thing: Javers, Envers, and Hibernate are Java libraries, but I'm sure similar libraries exist for other programming languages as well.
There are Mongoose plugins as well:
https://www.npmjs.com/package/mongoose-audit (quite outdated, about 4 years old)
https://github.com/nassor/mongoose-history#readme (better)
Maybe you can change the type of "objective" to an array and track the changes in it; the last element of the array is the latest value.
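A minimal sketch of that idea in the shell (assuming a patients collection and that objective has already been converted to an array):
// push each new objective; the last element is always the current one
db.patients.updateOne(
  { _id: "4k2lK49938d82kL" },
  { $push: { objective: "Gain muscle" } }
)
// read only the latest value with $slice
db.patients.findOne(
  { _id: "4k2lK49938d82kL" },
  { objective: { $slice: -1 } }
)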
Maintain it as a sub-document like below
{
  "_id": "4k2lK49938d82kL",
  "firstName": "John",
  "objective": {
    obj1: "Gain muscle",
    obj2: "Burn fat"
  }
}
You can also maintain it as an array field, but remember that MongoDB doesn't allow you to enforce uniqueness on the elements of an array field, and if you plan to index the "objective" field you'll have to create a multikey index.
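For reference, a sketch of that index (assuming the collection is called patients); MongoDB builds a multikey index automatically once the indexed field contains arrays:
db.patients.createIndex({ objective: 1 })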
I think the simplest solution would be to use and update an array:
const patientSchema = new Schema({
  firstName: { type: String, required: true },
  lastName: { type: String, required: true },
  objective: { type: String, required: true },
  notes: [{
    // pass the function reference (not Date.now()) so the default is evaluated per document
    date: { type: Date, default: Date.now },
    note: { type: String, required: true }
  }],
});
Then when you want to update the objective...
const updatePatientObjective = async (req, res) => {
  try {
    // check if _id and new objective exist in req.body
    const { _id, objective, date } = req.body;
    if (!_id || !objective) throw "Unable to update patient's objective.";
    // make sure provided _id is valid
    const existingPatient = await Patient.findOne({ _id });
    if (!existingPatient) throw "Unable to locate that patient.";
    // pull out objective as previousObjective
    const { objective: previousObjective } = existingPatient;
    // update patient's objective while pushing
    // the previous objective into the notes sub document
    await existingPatient.updateOne({
      // update current objective
      $set: { objective },
      // push an object with a date and note (previousObjective)
      // into the notes array
      $push: {
        notes: {
          date,
          note: previousObjective
        },
      },
    });
    // send back response
    res
      .status(201)
      .json({ message: "Successfully updated your objective!" });
  } catch (err) {
    return res.status(400).json({ err: err.toString() });
  }
};
Document will look like:
firstName: "John",
lastName: "Smith",
objective: "Lose body fat.",
notes: [
{
date: 2019-07-19T17:45:43-07:00,
note: "Gain muscle".
},
{
date: 2019-08-09T12:00:38-07:00,
note: "Work on cardio."
}
{
date: 2019-08-29T19:00:38-07:00,
note: "Become a fullstack web developer."
}
...etc
]
Alternatively, if you're worried about document size, then create a separate schema for patient history and reference the user's id (or just store the patient's _id as a string instead of referencing an ObjectId, whichever you prefer):
const patientHistorySchema = new Schema({
  _id: { type: Schema.Types.ObjectId, ref: "Patient", required: true },
  objective: { type: String, required: true }
});
Then create a new patient history document when the objective is updated...
PatientHistory.create({ _id, objective: previousObjective });
And if you need to access to the patient history documents...
PatientHistory.find({ _id });

Modify collection value in Meteor

I want to write a query that would change the type of a project field from string to object.
So, if the project field has the value abcd now, I want it to have an object like this:
{id: 'abcd'}
So:
project: 'abcd'
Turns to:
project: {id: 'abcd'}
I have no problems doing it in mongo:
db.hello.find({}).forEach((x) => {
  x.project = {
    id: x.project
  }
  db.hello.save(x)
})
But I don't know how to do it in Meteor. So far I have:
Projects.update({}, { $set: { client: ??? } }, { multi: true });
My 2 main problems are:
I don't know how to get the current value of client
I don't know how to change the type
First of all, if you already ran that query, then you are aware that the db has already been adjusted, yes? Because if you did run it, it would have updated all of the documents in that collection!
Please note that this should be run server-side; I don't think $type is supported by all versions of Minimongo.
// grab a cursor over all documents whose `project` field is still a string
const cursor = Projects.find({ project: { $type: "string" } });
// grab the data from the cursor
const projects = cursor.fetch();
// loop over each project and update it
projects.forEach(project => Projects.update(project._id, {
  $set: {
    project: { id: project.project }
  }
}));
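If this is a one-off migration, one way to run it server-side (a sketch, assuming a server-only file and the Projects collection from the question) is to wrap it in Meteor.startup:
// server-only code, e.g. in server/migrations.js (hypothetical file name)
Meteor.startup(() => {
  Projects.find({ project: { $type: "string" } })
    .fetch()
    .forEach(project => {
      Projects.update(project._id, {
        $set: { project: { id: project.project } }
      });
    });
});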
