How to clone a Mongoose query object in JavaScript

I am facing the problem of cloning a Mongoose query object. JavaScript copies one object into another by reference, but in my project there is a scenario where I need to copy one object into another by value.
var query = domain.User.find({
  deleted: false,
  role: role
})
var query1 = query;
In my scenario, a change to the query object must not be reflected in query1. I googled and tried many ways to clone the object, but it doesn't work. The query object is used in another function for pagination, and the query1 object is used for the count query.
1. I tried Object.clone(query1) and got the error "Object.clone is not a function".
2. I tried Object.assign(query1), but it doesn't work correctly either.
3. I tried many other ways. Can anybody help me sort out this problem?

An alternative solution using the merge() method:
const query = domain.User.find({
  deleted: false,
  role: role
}).skip(10).limit(10)

const countQuery = query.model.find().merge(query).skip(0).limit(0)
const [users, count] = await Promise.all([query, countQuery.count()])

You are trying to clone a cursor, but that is not the right approach; you probably just need to create another one, like this:
var buildQuery = function() {
  return domain.User.find({
    deleted: false,
    role: role
  });
};

var query = buildQuery();
var query1 = buildQuery();
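For example, the two independent query objects could then drive the results query and the count query side by side (a minimal sketch; the page and pageSize values are hypothetical placeholders, and countDocuments() assumes Mongoose 5 or later, older versions use count()):

// Hypothetical usage: one query fetches the page of results, the other counts all matches.
var page = 1;
var pageSize = 10;
Promise.all([
  buildQuery().skip((page - 1) * pageSize).limit(pageSize).exec(),
  buildQuery().countDocuments().exec()
]).then(function (results) {
  var users = results[0];
  var total = results[1];
  console.log(users.length + " of " + total);
});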

This works for me:
const qc = sourceQuery.toConstructor();
const clonedQuery = new qc();
This code works in a pagination function where sourceQuery is passed as a parameter and I don't know which models are used. It also works with aggregations and complex queries.
public async paging(
  query: mongoose.DocumentQuery<mongoose.Document[], mongoose.Document>,
  params,
  transformer: any = null
) {
  let page = Number(params.page);
  if (!page) page = 1;
  let page_size = Number(params.count);
  if (!page_size) page_size = 100;

  // Rebuild an identical query from the source query's constructor and use it for the count.
  const qc = query.toConstructor();
  const cq = new qc();

  return cq.countDocuments().exec()
    .then(async (total) => {
      const s = params.sort;
      if (s) {
        query.sort(s);
      }
      query.limit(page_size);
      query.skip(page_size * (page - 1));
      let results = await query.exec();
      if (transformer) {
        results = await Promise.all(results.map((i) => transformer(i)));
      }
      const r = new DtoCollection();
      r.pages = Math.ceil(total / page_size);
      r.total = total;
      (r.results as any) = results;
      return r;
    });
}
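For context, a call to such a helper might look roughly like this (a sketch only; the User model, the pagingService instance and the Express-style req.query are assumptions, not part of the answer above):

// Hypothetical usage: hand the un-executed query plus the request's paging params to the helper.
const sourceQuery = User.find({ deleted: false, role: "admin" });
const result = await pagingService.paging(sourceQuery, {
  page: req.query.page,   // which page to return, defaults to 1
  count: req.query.count, // page size, defaults to 100
  sort: "-createdAt"
});
// result.total, result.pages and result.results are filled in by the helper.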

Sergii Stotskyi's answer works just fine and is very elegant, except that count() is deprecated.
countDocuments() or estimatedDocumentCount() should be used instead.
However, this causes the error "the limit must be positive". We can work around this by setting the limit to a large integer.
const query = domain.User.find({
  deleted: false,
  role: role
}).skip(10).limit(10)

const countQuery = query.model.find().merge(query).skip(0).limit(Number.MAX_SAFE_INTEGER)
const [users, count] = await Promise.all([query, countQuery.countDocuments()])

Since mongoose v6 you can use Query.prototype.clone
E.g. for your code snippet:
const query = domain.User.find({
  deleted: false,
  role: role
})
const query1 = query.clone();
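The clone can then act as the count query while the original is paginated, which matches the scenario in the question (a minimal sketch; the skip/limit values are placeholders):

// Hypothetical usage: paginate the original query and count with the clone.
const [users, total] = await Promise.all([
  query.skip(10).limit(10).exec(),
  query1.countDocuments().exec()
]);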

Related

Can't update firebase collection field - Expected type 'ya', but it was: a custom Ia object

I am trying to make a barbershop web app where a customer can see a list of free appointments, and when they reserve a free appointment I want to delete that field from Firebase.
I have a collection which represents one barber.
This is how it looks in Firebase.
As you can see, radno_vrijeme is an object (a map in Firebase) which contains 6 arrays, and each array holds a list of free working hours.
In my function I am able to do everything except the last line, where I need to update the Firebase collection.
const finishReservation = async () => {
  try {
    const freeTimeRef = collection(db, `${barber}`);
    const q = query(freeTimeRef);
    const querySnap = await getDoc(q);
    querySnap.forEach(async (doc) => {
      const radnoVrijeme = doc.data().radno_vrijeme;
      // Find the index of the hour you want to delete
      const index = radnoVrijeme["Mon"].indexOf(hour);
      // Remove the hour from the array
      radnoVrijeme["Mon"].splice(index, 1);
      // Update the document in the collection
      console.log(radnoVrijeme);
      const radnoVrijemeMap = new Map(Object.entries(radnoVrijeme));
      await freeTimeRef.update({ radno_vrijeme: radnoVrijemeMap });
    });
  } catch (error) {
    console.log(error);
  }
};
I tried to pass it as a JSON-stringified object, but it didn't work. I always get this error:
"FirebaseError: Expected type 'ya', but it was: a custom Ia object"
When you are trying to fetch multiple documents using a collection reference or query, then you must use getDocs():
const finishReservation = async () => {
  try {
    const freeTimeRef = collection(db, `${barber}`);
    const q = query(freeTimeRef);
    const querySnap = await getDocs(q);
    const updates = [];
    querySnap.forEach((d) => {
      const radnoVrijeme = d.data().radno_vrijeme;
      const index = radnoVrijeme["Mon"].indexOf(hour);
      radnoVrijeme["Mon"].splice(index, 1);
      const radnoVrijemeMap = new Map(Object.entries(radnoVrijeme));
      updates.push(updateDoc(d.ref, { radno_vrijeme: radnoVrijemeMap }));
    });
    await Promise.all(updates);
    console.log("Documents updated");
  } catch (error) {
    console.log(error);
  }
};
getDoc() is used to fetch a single document using a document reference.
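For comparison, a minimal sketch of the single-document case with the modular SDK (the document id here is hypothetical):

import { doc, getDoc } from "firebase/firestore";

// getDoc() takes a document reference, not a collection reference or query.
const barberDocRef = doc(db, `${barber}`, "someDocumentId");
const snap = await getDoc(barberDocRef);
if (snap.exists()) {
  console.log(snap.data().radno_vrijeme);
}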

react-query dynamic number of parameters

I will have a product filter on an ecommerce page. I'm not sure how I can pass a dynamic number of parameters in my query, because sometimes the query will use 1 parameter and sometimes 6 or more.
Example
const fetchProducts = async (size, color, weight, deep) => {
  const parsed = await ky(
    `http://localhost:3000/api/products/get?size=${size}&color=${color}`
  ).json();
  return parsed;
};

const { data, status } = useQuery([size, color], () => fetchProducts(size, color));
Sometimes there will be 10 different parameters from the product filter, sometimes just 1.
How can I handle this dynamically? I will then need to apply the filter on the Prisma backend.
You can use the qs package to serialize all the params.
Example:
const fetchProducts = async (params) => {
  const query = qs.stringify(params);
  const baseUrl = "http://localhost:3000/api/products/get?";
  const parsed = await ky(baseUrl + query);
  // ...
}

const params = { color: "red", size: "big" }
const query = qs.stringify(params);
// output: "color=red&size=big"

How to retrieve entities by passing array of IDs as input in Google datastore?

I am trying to implement the following SQL logic in the datastore,
SELECT * from table where id in [1,2,3,4,5]
Implementing this in datastore, I want to retrieve all the corresponding entities with these IDs as an array.
let employees = []
try {
  for (let id of idArray) {
    const employee = await employeeRepo.getOneById(workspace, id)
    employees.push(employee)
  }
} catch (e) {
  throw e;
}
This is the naive logic of the function, and I am trying to reduce it to a single query.
Are you using the Node.js library referenced here: https://cloud.google.com/datastore/docs/reference/libraries
There is a function get where you can pass in an array of keys and it will return the array of entities.
https://googleapis.dev/nodejs/datastore/latest/Datastore.html#get
It's possible to do this by using get, as mentioned in the documentation.
Here's an example of how to do it based on your code:
const employee1 = this.datastore.key(['Employee', 1]);
const employee2 = this.datastore.key(['Employee', 2]);
const employee3 = this.datastore.key(['Employee', 3]);
const keys = [employee1, employee2, employee3];

try {
  const [employees] = await datastore.get(keys);
} catch (e) {
  throw e;
}
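If the ids already arrive as an array (as in the question's idArray), the keys can be built in one pass and fetched with a single call (a sketch assuming the entity kind is Employee, as in the example above):

// Map each id to a Datastore key, then fetch all entities in one round trip.
const keys = idArray.map((id) => datastore.key(['Employee', id]));
const [employees] = await datastore.get(keys);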

Javascript object retaining "old" properties, can't override?

I have the following code:
const readDataFromSql = () => {
  // going to have to iterate through all known activities + load them here
  let sql = "[...]"
  return new Promise((resolve, reject) => {
    executeSqlQuery(sql).then((dict) => {
      let loadedData = [];
      for (let key in dict) {
        let newItemVal = new ItemVal("reading hw", 7121, progress.DONE);
        loadedData.push(newItemVal);
      }
      resolve(loadedData);
    });
  });
}
ItemVal implementation:
class ItemVal {
  constructor(name, time, type) {
    this.name = name
    this.time = time
    this.type = type
  }
}
Let's assume that newItemVal = "reading hwj", 5081, progress.PAUSED when readDataFromSql() first runs.
readDataFromSql() is then again called after some state changes -- where it repulls some information from a database and generates new values. What is perplexing, however, is that when it is called the second time, newItemVal still retains its old properties (attaching screenshot below).
Am I misusing the new keyword?
From what I can see in your example code, you are not mutating existing properties but creating a new object with the ItemVal constructor function and adding it to an array, which you then return as a resolved promise. Are you sure the examples you give are a correct representation of what you are actually doing?
Given that, I'm not sure what could be causing the issue you are having, but I would at least recommend a different structure for your code, using a simpler factory function for the itemVal.
Perhaps with this setup you might get an error returned that helps you debug your issue.
const itemVal = (name, time, type) => ({ name, time, type })

const readDataFromSql = async () => {
  try {
    const sql = "[...]"
    const dict = await executeSqlQuery(sql)
    const loadedData = dict.map((key) =>
      itemVal("reading hw", 7121, progress.DONE)
    )
    return loadedData
  } catch (error) {
    return error
  }
};
If the issue is not in the function, then I would assume that the way you handle the data, returned from the readDataFromSql function, is where the issue lies. You need to then share more details about your implementation.
const readDataFromSql = async () => {
  let sql = "[...]"
  ------> await executeSqlQuery(sql).then((dict) => {
Use the await keyword instead of creating a new promise.
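Spelled out, that suggestion amounts to something like this (a sketch of the same function with the wrapper promise removed; the loop body is unchanged from the question):

const readDataFromSql = async () => {
  let sql = "[...]";
  const dict = await executeSqlQuery(sql);
  const loadedData = [];
  for (let key in dict) {
    loadedData.push(new ItemVal("reading hw", 7121, progress.DONE));
  }
  return loadedData;
};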
I did some modification and found that the code below works correctly, updating to new values on each call.
const readDataFromSql = () => {
  return new Promise((resolve, reject) => {
    let loadedData = [];
    let randomVal = Math.random();
    let newItemVal = new ItemVal(randomVal * 10, randomVal * 100, randomVal * 1000);
    loadedData.push(newItemVal);
    resolve(loadedData);
  });
}
Could you recheck whether you are using the line below in your code, as it will instantiate an object with the same properties again and again.
let newItemVal = new ItemVal("reading hw", 7121, progress.DONE);
You can modify your code as below to simplify the problem.
const readDataFromSql = async () => {
  // going to have to iterate through all known activities + load them here
  let sql = "[...]" // define sql properly
  let result = await executeSqlQuery(sql);
  let loadedData = [];
  // iterate over the result rows ("of", not "in", so each row object is used directly)
  for (let row of result) {
    let newItemVal = new ItemVal(row.name, row.time, row.type);
    loadedData.push(newItemVal);
  }
  return loadedData;
}

class ItemVal {
  constructor(name, time, type) {
    this.name = name
    this.time = time
    this.type = type
  }
}
What you are talking about is an issue related to object mutation in Redux; however, you didn't add any Redux code. Anyway, you might be making a mistake while recreating (not mutating) the array.
The general solution is to use the spread operator:
loadedData = [...loadedData.slice(0), ...newloadedData]
In Dropdown.js line 188, instead of console.log-ing your variable, write debugger;
This will function as a breakpoint. It will halt your code and you can inspect the value by hovering your mouse over the code BEFORE newItemVal is changed again.
I can see in your screenshot that newItemVal is modified again after you log it.

What's the most efficient way to store data to multiple refs in firebase?

Consider the following situation:
If we create a user, we have to create a new client, a new user, and a new initial project for that user.
db = {
  users: {},
  clients: {},
  projects: {}
};

const usersRef = firebase.database().ref("/users");
const clientsRef = firebase.database().ref("/clients");
const projectsRef = firebase.database().ref("/projects");
To keep the code clean and separated, we can create three functions:
const newUserToDb = name => {
  const newUser = usersRef.push();
  newUser.set({ name });
};

const newClientToDb = name => {
  const newClient = clientsRef.push();
  newClient.set({ name });
};

const newProjectToDb = name => {
  const newProject = projectsRef.push();
  newProject.set({ name });
};

const createUserToDb = (userName, clientName, projectName) => {
  newUserToDb(userName);
  newClientToDb(clientName);
  newProjectToDb(projectName);
};
To make all the changes in one place, but make the code less separated:
const createUserToDb = (userName, clientName, projectName) => {
  const userId = usersRef.push().key;
  const clientId = clientsRef.push().key;
  const projectId = projectsRef.push().key;

  const updates = {};
  updates[`/users/${userId}`] = userName;
  updates[`/clients/${clientId}`] = clientName;
  updates[`/projects/${projectId}`] = projectName;

  firebase.database().ref().update(updates);
};
Is there any important difference between the two solutions above? Which is more efficient?
The important difference between the two approaches is atomicity. In the first scenario, each individual update will succeed or fail without affecting the other updates. In the second scenario, either all the updates succeed or none of them do.
I don't think efficiency is the right term for comparing the two scenarios; it's more that your business/use case defines which one you need.
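In code terms, the atomic variant reports success or failure through a single promise (a sketch; the logging is illustrative):

firebase.database().ref().update(updates)
  .then(() => console.log("user, client and project all written"))
  .catch((error) => console.log("nothing was written:", error));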
The first way seems more separated and explicit, which would probably be easier for other developers to understand.
