I would like to build a search function with several different filters. I have a RangeSlider component and a function that gives me the respective min/max values. I save these filter values as objects and send them to the backend immediately each time a filter is changed.
Here I work with if/else conditions, which is certainly not the right way, but I didn't know what else to do and wanted to have at least a working prototype.
With one or two filters this can still work, but not with many different ones. Furthermore, I wonder how to optimise the whole filtering process: with every request, the entire collection is searched. It would be great if each new filter were applied on top of the previous search result instead of searching through the entire collection again.
How can this be achieved?
Frontend
Every time a filter is updated, activeFilters is sent to the backend:
const activeFilters = reactive({ salePrice: '', space: '' })
async function updateFilter(minmax, property) {
activeFilters[property] = minmax
const filteredObjects = await $fetch('/api/properties/filtered', {
method: 'POST',
body: activeFilters,
})
return filteredObjects
}
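As a side note on the "send immediately on every change" part: a slider can fire many updates while the handle is being dragged, so one optional frontend-side tweak is to debounce the request and only query the backend once the user stops moving the slider. This is just a hedged sketch built on the same $fetch call, not part of the original code; the 300 ms delay is an arbitrary example value.
let debounceTimer
function updateFilterDebounced(minmax, property) {
  activeFilters[property] = minmax
  // Reset the timer on every change; the request only fires after 300 ms of inactivity.
  clearTimeout(debounceTimer)
  debounceTimer = setTimeout(async () => {
    const filteredObjects = await $fetch('/api/properties/filtered', {
      method: 'POST',
      body: activeFilters,
    })
    // The result arrives asynchronously, so store it (e.g. in a ref) instead of returning it.
    console.log(filteredObjects)
  }, 300)
}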
Backend
body = { "salePrice": { "min": 630000, "max": 850948 }, "space": { "min": 53, "max": 167 } }
export default defineEventHandler(async (event) => {
const body = await readBody(event)
try {
if (body.salePrice !== '' && body.space !== '') {
const properties = await Property.find({
salePrice: { $gte: body.salePrice.min, $lte: body.salePrice.max },
livableSurface: { $gte: body.space.min, $lte: body.space.max },
})
return properties
}
if (body.salePrice !== '') {
const properties = await Property.find({
salePrice: { $gte: body.salePrice.min, $lte: body.salePrice.max },
})
return properties
}
if (body.space !== '') {
const properties = await Property.find({
livableSurface: { $gte: body.space.min, $lte: body.space.max },
})
return properties
}
const properties = await Property.find()
return properties
} catch (err) {
console.dir(err)
event.res.statusCode = 500
return {
code: 'ERROR',
message: 'Something went wrong.',
}
}
})
HTML
<InputsRangeSlider
:config="salePriceSliderConfig"
@updated-min-max="updateFilter($event, 'salePrice')"
/>
<InputsRangeSlider
:config="spaceSliderConfig"
@updated-min-max="updateFilter($event, 'space')"
/>
Maybe something like this?
const {salePrice, space} = body;
const conditions = {};
if (salePrice) conditions.salePrice = {$gte: salePrice.min, $lte: salePrice.max};
if (space) conditions.livableSurface = {$gte: space.min, $lte: space.max};
return Property.find(conditions);
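To make this scale to many filters, the same idea can be driven by a small map from frontend filter keys to database fields, so adding a new range filter only means adding one entry to the map. Below is a hedged sketch of the backend handler under that assumption; the space → livableSurface mapping is taken from the snippets above, everything else keeps the Nuxt/Mongoose calls already used in the question.
// Maps frontend filter keys to the corresponding document fields.
const filterFieldMap = {
  salePrice: 'salePrice',
  space: 'livableSurface',
}

export default defineEventHandler(async (event) => {
  const body = await readBody(event)
  const conditions = {}
  for (const [key, field] of Object.entries(filterFieldMap)) {
    const range = body[key]
    // Skip filters that are still empty ('' in the frontend's initial state).
    if (range && range.min != null && range.max != null) {
      conditions[field] = { $gte: range.min, $lte: range.max }
    }
  }
  return Property.find(conditions)
})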
Related
I want to do the following: get a random name with fetch from https://swapi.dev/api/people/, which I did, and I can see it on my HTML page. Then I also want to get a random planet; for that I need to access the homeworld key and return the link. Before returning the link I format it to get a random URL, and from that URL I also have to show the name of the planet on my page. The first fetch works fine, at least I think, but the third .then() is not working, or at least I don't know how to access the information from the homeworld URL. This is my first time trying fetch(), and it would be nice if you could tell me where I went wrong in the code, and maybe suggest different solutions that are not too complicated. Thanks!
let randomNumber = Math.floor(Math.random()*9)
const fetchPromise = fetch("https://swapi.dev/api/people/");
let test
let test2
let planets = document.querySelector('#age')
fetchPromise
.then((response) => {
if (!response.ok) {
throw new Error(`Http error: ${response.status}`);
}
return response.json();
})
.then((json) => {
console.log(json.results[randomNumber].name)
showRandomUserData(json)
test = json.results[0].homeworld
test = test.slice(0, -2)
// console.log(test + randomNumber + "/");
// console.log(test + "/" + randomNumber + "/");
test = test + randomNumber + "/";
return fetch(test)
// return fetch("https://swapi.dev/api/planets/2/");
})
.then(response => response.json()).then(json =>
{ test2=json.name
console.log(test2);
planets.innerHTML = test2
})
showRandomUserData = (randomUser) => {
document.querySelector("#name").innerHTML =
randomUser.results[randomNumber].name;
}
Solved
Here's a simple solution that uses fetch() to grab data from both those URLs and then insert all the people and the one planet that is returned into your web page:
function myFetch(...args) {
return fetch(...args).then(response => {
if (!response.ok) {
throw new Error(`fetch failed with status ${response.status}`);
}
return response.json();
});
}
Promise.all([
myFetch("https://swapi.dev/api/people/"),
myFetch("https://swapi.dev/api/planets/2/")
]).then(([people, planet]) => {
const peopleDiv = document.getElementById("people");
let peopleHTML = "";
for (let p of people.results) {
peopleHTML += `<div>${p.name}</div>`;
}
peopleDiv.innerHTML = peopleHTML;
const planetDiv = document.getElementById("planets");
let planetHTML = `<div>${planet.name}</div>`;
planetDiv.innerHTML = planetHTML;
}).catch(err => {
console.log(err);
});
<div id="people"></div>
<hr>
<div id="planets"></div>
As for using the results, the people URL returns a structure that looks like this:
{
count: 82,
next: 'https://swapi.dev/api/people/?page=2',
previous: null,
results: [
{
name: 'Luke Skywalker',
height: '172',
mass: '77',
hair_color: 'blond',
skin_color: 'fair',
eye_color: 'blue',
birth_year: '19BBY',
gender: 'male',
homeworld: 'https://swapi.dev/api/planets/1/',
films: [Array],
species: [],
vehicles: [Array],
starships: [Array],
created: '2014-12-09T13:50:51.644000Z',
edited: '2014-12-20T21:17:56.891000Z',
url: 'https://swapi.dev/api/people/1/'
},
{
name: 'C-3PO',
height: '167',
mass: '75',
hair_color: 'n/a',
skin_color: 'gold',
eye_color: 'yellow',
birth_year: '112BBY',
gender: 'n/a',
homeworld: 'https://swapi.dev/api/planets/1/',
films: [Array],
species: [Array],
vehicles: [],
starships: [],
created: '2014-12-10T15:10:51.357000Z',
edited: '2014-12-20T21:17:50.309000Z',
url: 'https://swapi.dev/api/people/2/'
}
}
So, you have people.results which is an array and you can access people.results[n] to get an item from that array. That item will be an object which has properties like .name, .height, etc...
The specific planet URL you show returns a single planet object like this:
{
name: 'Alderaan',
rotation_period: '24',
orbital_period: '364',
diameter: '12500',
climate: 'temperate',
gravity: '1 standard',
terrain: 'grasslands, mountains',
surface_water: '40',
population: '2000000000',
residents: [
'https://swapi.dev/api/people/5/',
'https://swapi.dev/api/people/68/',
'https://swapi.dev/api/people/81/'
],
films: [
'https://swapi.dev/api/films/1/',
'https://swapi.dev/api/films/6/'
],
created: '2014-12-10T11:35:48.479000Z',
edited: '2014-12-20T20:58:18.420000Z',
url: 'https://swapi.dev/api/planets/2/'
}
So, you access properties on that object as in planet.name.
Notice that the people results are paged. There are 82 total results, but only 10 come in this first result. The rest come with results for other pages such as https://swapi.dev/api/people/?page=2.
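If you ever need every person rather than just the first page, a hedged sketch of walking those next links could look like this; it reuses the myFetch helper from above and assumes nothing beyond the next field shown in the structure.
// Follows the "next" links until the API reports no further pages.
async function fetchAllPeople() {
  let url = "https://swapi.dev/api/people/";
  const all = [];
  while (url) {
    const page = await myFetch(url);
    all.push(...page.results);
    url = page.next; // null on the last page, which ends the loop
  }
  return all;
}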
Similar to the answer above, but using async/await to avoid callback hell; if you can, try this approach. The recommendation in that answer by jfriend00 to use Promise.all instead of separate sequential fetch calls is excellent, as it lets the two fetches happen in parallel.
const fetchData = async (...args) => {
  const response = await fetch(...args);
  // fetch() only rejects on network errors, so check the HTTP status explicitly.
  if (!response.ok) {
    throw new Error(`fetch failed with status ${response.status}`);
  }
  return response.json();
};
const updateDOM = (people, planet) => {
document.getElementById("people").innerHTML =
people.results.reduce((s, p) => s + `<div>${p.name}</div>`, "");
document.getElementById("planets").innerHTML = `<div>${planet.name}</div>`;
};
const populateData = async () => {
try {
const [people, planet] = await Promise.all([
fetchData("https://swapi.dev/api/people/"),
fetchData("https://swapi.dev/api/planets/2/"),
]);
// do stuff with 'people' or 'planet'
// example, get
// const firstPersonsHomeworld = people.results[0].homeworld;
// console.log(firstPersonsHomeworld);
// or
// const planetName = planet.name;
// console.log(planetName);
updateDOM(people, planet);
} catch (err) {
// errorHandler(err);
console.error(err);
}
};
// start app
populateData();
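To get back to the original question (a random person plus that person's own homeworld, rather than a fixed planet), a hedged variation of the same async/await approach might look like the sketch below. It relies only on the homeworld URL shown in the people structure above and on the fetchData helper; the element ids reuse #name from the question's markup and #planets from the answer's.
const populateRandomPerson = async () => {
  try {
    const people = await fetchData("https://swapi.dev/api/people/");
    // Pick a random person from the first page of results.
    const randomIndex = Math.floor(Math.random() * people.results.length);
    const person = people.results[randomIndex];
    // homeworld is already a complete planet URL, so fetch it directly
    // instead of rebuilding the URL by hand.
    const planet = await fetchData(person.homeworld);
    document.getElementById("name").innerHTML = person.name;
    document.getElementById("planets").innerHTML = planet.name;
  } catch (err) {
    console.error(err);
  }
};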
I am trying to write some code that searches through a bunch of objects in a MongoDB database. I want to pull objects from the database by ID; those objects in turn contain ID references of their own. The program should search for a specific ID through this process: first get the object for an ID, then get the IDs from that object, and so on.
async function objectFinder(ID1, ID2, depth, previousList = []) {
let route = []
if (ID1 == ID2) {
return [ID2]
} else {
previousList.push(ID1)
let obj1 = await findObjectByID(ID1)
let connectedID = obj1.connections.concat(obj1.inclusions) //creates array of both references to object and references from object
let mapPromises = connectedID.map(async (id) => {
return findID(id) //async function
})
let fulfilled = await Promise.allSettled(mapPromises)
let list = fulfilled.map((object) => {
return object.value.main, object.value.included
})
list = list.filter(id => !previousList.includes(id))
for (id of list) {
await objectFinder(id, ID2, depth - 1, previousList).then(result => {
route = [ID1].concat(result)
if (route[route.length - 1] == ID2) {
return route
}})
}
}
if (route[route.length - 1] == ID2) {
return route
}
}
I am not sure how to make it so that my code works like a tree search, with each object and ID being a node.
I didn't look too much into your code, as I strongly believe in letting your database do the work for you if possible.
In this case Mongo has the $graphLookup aggregation stage, which allows recursive lookups. Here is a quick example of how to use it:
db.collection.aggregate([
{
$match: {
_id: 1,
}
},
{
"$graphLookup": {
"from": "collection",
"startWith": "$inclusions",
"connectFromField": "inclusions",
"connectToField": "_id",
"as": "matches",
}
},
{
//the rest of the pipeline is just to restore the original structure you don't need this
$addFields: {
matches: {
"$concatArrays": [
[
{
_id: "$_id",
inclusions: "$inclusions"
}
],
"$matches"
]
}
}
},
{
$unwind: "$matches"
},
{
"$replaceRoot": {
"newRoot": "$matches"
}
}
])
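If you need the depth limit from your objectFinder's depth parameter, $graphLookup also accepts a maxDepth option that caps how far the recursion goes. Below is a hedged sketch of running the same pipeline from Node with the official MongoDB driver; the database name and connection setup are assumptions, and the collection name simply mirrors the "collection" used in the pipeline above.
// Runs the $graphLookup pipeline from Node using the official MongoDB driver.
const { MongoClient } = require("mongodb");

async function findConnected(client, startId, depth) {
  return client
    .db("mydb")                 // assumed database name
    .collection("collection")   // assumed collection name, as in the pipeline above
    .aggregate([
      { $match: { _id: startId } },
      {
        $graphLookup: {
          from: "collection",
          startWith: "$inclusions",
          connectFromField: "inclusions",
          connectToField: "_id",
          as: "matches",
          maxDepth: depth,      // caps the recursion depth, like the depth argument
        },
      },
    ])
    .toArray();
}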
If for whatever reason you want to keep this in code then I would take a look at your for loop:
for (id of list) {
await objectFinder(id, ID2, depth - 1, previousList).then(result => {
route = [ID1].concat(result);
if (route[route.length - 1] == ID2) {
return route;
}
});
}
Just from a quick glance I can tell you're executing this:
route = [ID1].concat(result);
Many times at the same level. Additionally, I could not understand your bottom return statements; I feel like there might be an issue there.
I'm trying to render a page with some details I get from an API call.
useEffect(() =>{
getCards();
}, [])
const [userCards, setCards] = useState([])
const getCards = async (event) => {
let token = localStorage.getItem("user");
await api
.get("/fetch-card-balance",
{headers:{"token":`${token}`}})
.then((response) => {
console.log(response);
if (response.data.success === false) {
toast.error(response.data.message);
setCards(false);
} else if (response.data.success === true) {
console.log(response.data.payload)
setCards(response.data.payload)
}
})
.catch((err) => {
toast.error(err.response.data.message);
});
};
console.log(userCards)
Here userCards is logged as
[
{
balance: 0.00,
cifNumber: "0001111222",
createdAt: "2021-08-03T12:19:51.000Z",
first6: "123456",
id: 1234,
last4: "7890"
},
{
balance: 20.00,
cifNumber: "0002222333",
createdAt: "2021-07-03T12:19:51.000Z",
first6: "234567",
id: 2345,
last4: "8901"
}
]
Then I try to use forEach to filter the properties I need
const cardDetails = []
userCards.forEach(option => cardDetails.push(
{
cardNumber: `${option.first6}******${option.last4}`,
balance: `${option.balance}`
}
))
But when I run
console.log(cardDetails[0].balance)
I get "Uncaught TypeError: Cannot read property 'balance' of undefined". I've gone over it several times and the only conclusion I have is that I'm missing something that may not be so obvious. Could someone help point out what it is.
Using cardDetails[0].balance will only work when there is at least one element in cardDetails. Otherwise getting the first element in the array yields undefined, causing your error message. Since you only fill the array after the API request returns, at least your first render will be done with an empty array.
An easy way to handle this would be checking for if (cardDetails.length > 0) first.
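A minimal sketch of that guard, assuming cardDetails is derived from userCards in the component body as shown above:
// Only touch the first element once the API data has actually arrived.
if (cardDetails.length > 0) {
  console.log(cardDetails[0].balance);
} else {
  console.log("cards not loaded yet"); // first render runs with an empty array
}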
Try this out
const cardDetails = userCards.map(function (option) {
  return {
    cardNumber: `${option.first6}******${option.last4}`,
    balance: `${option.balance}`,
  };
});
I am beating my head against a wall. I have updated to Apollo 3 and cannot figure out how to migrate an updateQuery to a typePolicy. I am doing basic continuation-based pagination, and this is how I used to merge the results of fetchMore:
await fetchMore({
query: MessagesByThreadIDQuery,
variables: {
threadId: threadId,
limit: Configuration.MessagePageSize,
continuation: token
},
updateQuery: (prev, curr) => {
// Extract our updated message page.
const last = prev.messagesByThreadId.messages ?? []
const next = curr.fetchMoreResult?.messagesByThreadId.messages ?? []
return {
messagesByThreadId: {
__typename: 'MessagesContinuation',
messages: [...last, ...next],
continuation: curr.fetchMoreResult?.messagesByThreadId.continuation
}
}
  }
})
I have made an attempt to write the merge typePolicy myself, but it just continually loads and throws errors about duplicate identifiers in the Apollo cache. Here is what my typePolicy looks like for my query.
typePolicies: {
Query: {
fields: {
messagesByThreadId: {
keyArgs: false,
merge: (existing, incoming, args): IMessagesContinuation => {
const typedExisting: IMessagesContinuation | undefined = existing
const typedIncoming: IMessagesContinuation | undefined = incoming
const existingMessages = (typedExisting?.messages ?? [])
const incomingMessages = (typedIncoming?.messages ?? [])
const result = existing ? {
__typename: 'MessageContinuation',
messages: [...existingMessages, ...incomingMessages],
continuation: typedIncoming?.continuation
} : incoming
return result
}
}
}
}
}
So I was able to solve my use-case. It seems way harder than it really needs to be. I essentially have to attempt to locate existing items matching the incoming and overwrite them, as well as add any new items that don't yet exist in the cache.
I also have to only apply this logic if a continuation token was provided, because if it's null or undefined, I should just use the incoming value because that indicates that we are doing an initial load.
My document is shaped like this:
{
"items": [{ id: string, ...others }],
"continuation": "some_token_value"
}
I created a generic type policy that I can use for all my documents that have a similar shape. It allows me to specify the name of the items property, what the key args are that I want to cache on, and the name of the graphql type.
export function ContinuationPolicy(keyArgs: Array<string>, itemPropertyKey: string, typeName: string) {
return {
keyArgs,
merge(existing: any, incoming: any, args: any) {
if (!!existing && !!args.args?.continuation) {
const existingItems = (existing ? existing[itemPropertyKey] : [])
const incomingItems = (incoming ? incoming[itemPropertyKey] : [])
let items: Array<any> = [...existingItems]
for (let i = 0; i < incomingItems.length; i++) {
const current = incomingItems[i] as any
const found = items.findIndex(m => m.__ref === current.__ref)
if (found > -1) {
items[found] = current // overwrite the cached item with the incoming one
} else {
items = [...items, current]
}
}
// This new data is a continuation of the last data.
return {
__typename: typeName,
[itemPropertyKey]: items,
continuation: incoming.continuation
}
} else {
// When we have no existing data in the cache, we'll just use the incoming data.
return incoming
}
}
}
}
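For reference, a hypothetical way to wire this into the cache; the key args, item property, and type name below are assumptions based on the earlier messagesByThreadId query, not something prescribed by Apollo.
import { InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Cache messagesByThreadId per thread and merge pages with the policy above.
        messagesByThreadId: ContinuationPolicy(['threadId'], 'messages', 'MessagesContinuation'),
      },
    },
  },
})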
When I fetch new alerts, I want to check if the ID of the new alert was already recorded. The issue is that the ID is nested inside an array: there's the alertDetails array, which contains objects, and those objects have an _id field, which is what I want to check. I am not sure how to achieve that. I have the code below, but then I have to iterate over the result to check the exists value. I'm sure there must be a better way.
const mongoose = require('mongoose');
const { Schema } = mongoose;
const G2AlertsSchema = new Schema(
{
status: { type: String, required: true },
openDate: { type: Date, required: true },
alertType: { type: Array, required: true },
severity: { type: Array, required: true },
locationName: { type: Array, required: true },
history: { type: Array, required: true },
alertDetails: { type: Array, required: false },
assignedTo: { type: Schema.Types.ObjectId, ref: 'user' },
},
{
timestamps: true,
},
);
const G2Alerts = mongoose.model('G2Alert', G2AlertsSchema);
module.exports = G2Alerts;
This is the code I found on MongoDB's website. I just want to see if the ID exists. Basically, when I fetch the new alerts I get an array and iterate over it; I want to check each item's ID against what's in the database. If it's there, skip it and go to the next one. If it's new, create a new alert and save it.
const exists = await G2Alerts.aggregate([
{
$project: {
exists: {
$in: ['5f0b4f508bda3805754ab343', '$alertDetails._id'],
},
},
},
]);
EDIT: Another thing. I am getting an ESLint warning saying I should use array iteration instead of a for loop. The issue is, I need to use await when looking up the alert ID. If I use reduce or filter, I can't use await. If I use async inside the reduce or filter callback, then it returns promises or just an empty array.
The code below works, based on the answer provided by Tom Slabbaert:
const newAlertsData = [];
for (let item of alertData.data.items) {
const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
if (exists.length === 0) {
newAlertsData.push(item);
}
}
if (newAlertsData.length !== 0) {......
But this does not:
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
if (exists.length === 0) {
filtered.push(item);
}
return filtered;
}, []);
You're not far off; here is an example using the correct syntax:
const exists = await G2Alerts.findOne({ 'alertDetails._id': '5f0b4f508bda3805754ab343' });
if (!exists) {
... do something
}
This can also be achieved using aggregate with a $match stage instead of a $project stage, or even better with countDocuments, which just returns the count instead of the entire document if you do not need it.
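For instance, a hedged sketch of the countDocuments variant in Mongoose, using the same example id as above:
// Returns only a number, so no documents are pulled back over the wire.
const count = await G2Alerts.countDocuments({
  'alertDetails._id': '5f0b4f508bda3805754ab343',
});
if (count === 0) {
  // the alert is new, so create and save it
}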
One more thing I'd like to add: make sure alertDetails._id is of string type, since you're using strings in your $in. Otherwise you'll need to cast them to the ObjectId type, in Mongoose like so:
new mongoose.Types.ObjectId('5f0b4f508bda3805754ab343')
And for Mongo:
import {ObjectId} from "mongodb"
...
new ObjectId('5f0b4f508bda3805754ab343')
EDIT
Try something like this?
let ids = alertData.data.items.map(item => item._id.toString());
let existing = await G2Alerts.distinct("alertDetails._id", {"alertDetails._id": {$in: ids}});
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
if (!existing.includes(item._id.toString())) {
return [item].concat(filtered)
}
return filtered;
}, []);
This way you only need to call the db once and not multiple times.
Final code, based on the provided answer:
const ids = alertData.data.items.map(item => item._id);
const existing = await G2Alerts.find({ 'alertDetails._id': { $in: ids } }).distinct(
'alertDetails._id',
(err, alerts) => {
if (err) {
res.send(err);
}
return alerts;
},
);
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
if (!existing.includes(item._id.toString()) && item.openDate > dateLimit) {
return [item].concat(filtered);
}
return filtered;
}, []);