Match two values in a collection - javascript

I have two values I need to match: a user's _id inside one of my locks and the accessid from my action payload.
I tried something like this:
const index = state.locks.users.findIndex(
  stateUser => stateUser._id === action.payload.customerPayload.accessid
);
But I’m getting the error:
findIndex of undefined.
And I guess that’s because of locks being an array.
But I’m really uncertain how to fix this issue. Should I have multiple findIndexes? One for the lock and one to match the users?
Thanks for reading my post. And I appreciate all the help I can get.

The code snippet should be
let itemIndex = -1;
state.locks.map((lock) => {
  lock.users.findIndex(...)
});
Assuming state is an object containing a locks array.
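If you need the actual positions, a small sketch along those lines, assuming the action shape from the question, could track both the lock index and the user index:
// Sketch only: accessid and the payload shape are taken from the question.
let lockIndex = -1;
let userIndex = -1;
state.locks.forEach((lock, i) => {
  const idx = (lock.users || []).findIndex(
    stateUser => stateUser._id === action.payload.customerPayload.accessid
  );
  if (idx !== -1) {
    lockIndex = i;
    userIndex = idx;
  }
});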

My suggestion is to do a double for loop (as you've already figured out) to get the user object you need.
Consider the following snippet (adjust to your data structure):
let state = {
  locks: [
    {
      users: [
        { _id: '123' },
        { _id: '456' }
      ]
    },
    {
      users: [
        { _id: '678' },
        { _id: '789' }
      ]
    },
  ]
};

function getUserObjByID(stateObj, userID) {
  // iterate over the locks of the state object that was passed in
  for (let usersObject of stateObj.locks) {
    for (let user of usersObject.users) {
      if (user._id === userID) {
        return user;
      }
    }
  }
}

let myObj = getUserObjByID(state, '678');
console.log(myObj);
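For the same data shape, an equivalent sketch using flatMap and find (ES2019+) would be:
// Assumes the same state shape as above; returns undefined when nothing matches.
const findUserById = (stateObj, userID) =>
  stateObj.locks.flatMap(lock => lock.users).find(user => user._id === userID);

console.log(findUserById(state, '678')); // { _id: '678' }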

So it works now. What I had to do with my reducer was this:
case 'REMOVE_USER':
  return {
    ...state,
    locks: state.locks.map(lock => {
      return {
        ...lock,
        users: lock.users
          ? lock.users.filter(
              user => user._id != action.payload.customerPayload.accessid
            )
          : []
      }
    })
  }

Related

Trying to write a recursive asynchronous search in JavaScript

I am trying to write some code that searches through a bunch of objects in a MongoDB database. I want to pull the objects from the database by ID; those objects then contain ID references of their own. The program should search for a specific ID through this process: first get an object from an ID, then get IDs from that object.
async function objectFinder(ID1, ID2, depth, previousList = []) {
  let route = []
  if (ID1 == ID2) {
    return [ID2]
  } else {
    previousList.push(ID1)
    let obj1 = await findObjectByID(ID1)
    let connectedID = obj1.connections.concat(obj1.inclusions) // creates an array of both references to the object and references from the object
    let mapPromises = connectedID.map(async (id) => {
      return findID(id) // async function
    })
    let fulfilled = await Promise.allSettled(mapPromises)
    let list = fulfilled.map((object) => {
      return object.value.main, object.value.included
    })
    list = list.filter(id => !previousList.includes(id))
    for (id of list) {
      await objectFinder(id, ID2, depth - 1, previousList).then(result => {
        route = [ID1].concat(result)
        if (route[route.length - 1] == ID2) {
          return route
        }
      })
    }
  }
  if (route[route.length - 1] == ID2) {
    return route
  }
}
I am not sure how to make it so that my code works like a tree search, with each object and ID being a node.
I didn't look too much into your code as I strongly believe in letting your database do the work for you if possible.
In this case Mongo has the $graphLookup aggregation stage, which allows recursive lookups. Here is a quick example of how to use it:
db.collection.aggregate([
  {
    $match: {
      _id: 1,
    }
  },
  {
    "$graphLookup": {
      "from": "collection",
      "startWith": "$inclusions",
      "connectFromField": "inclusions",
      "connectToField": "_id",
      "as": "matches",
    }
  },
  {
    // the rest of the pipeline is just to restore the original structure; you don't need this
    $addFields: {
      matches: {
        "$concatArrays": [
          [
            {
              _id: "$_id",
              inclusions: "$inclusions"
            }
          ],
          "$matches"
        ]
      }
    }
  },
  {
    $unwind: "$matches"
  },
  {
    "$replaceRoot": {
      "newRoot": "$matches"
    }
  }
])
Mongo Playground
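If you want to run the same pipeline from Node rather than the shell, a rough sketch with the official MongoDB driver might look like this (the connection string, database name, and collection name are placeholders):
const { MongoClient } = require('mongodb');

async function findConnected(startId) {
  // Placeholder connection string and database name.
  const client = await MongoClient.connect('mongodb://localhost:27017');
  try {
    const collection = client.db('mydb').collection('collection');
    return await collection.aggregate([
      { $match: { _id: startId } },
      {
        $graphLookup: {
          from: 'collection',
          startWith: '$inclusions',
          connectFromField: 'inclusions',
          connectToField: '_id',
          as: 'matches',
        },
      },
    ]).toArray();
  } finally {
    await client.close();
  }
}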
If, for whatever reason, you want to keep this in code, then I would take a look at your for loop:
for (id of list) {
  await objectFinder(id, ID2, depth - 1, previousList).then(result => {
    route = [ID1].concat(result);
    if (route[route.length - 1] == ID2) {
      return route;
    }
  });
}
Just from a quick glance I can tell you're executing this:
route = [ID1].concat(result);
many times at the same level. Additionally, I could not understand your bottom return statements; keep in mind that a return inside the .then callback only returns from that callback, not from objectFinder, so I feel like there might be an issue there.
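To illustrate that last point, here is a hedged sketch of restructuring the loop so a successful route actually propagates out of objectFinder; it keeps the question's findObjectByID helper as an assumption, simplifies the neighbour expansion, and replaces the .then with a plain await:
// Sketch only: findObjectByID is assumed from the question; the findID
// expansion step is omitted for brevity.
async function objectFinder(ID1, ID2, depth, previousList = []) {
  if (ID1 == ID2) return [ID2];
  if (depth <= 0) return null;
  previousList.push(ID1);

  const obj1 = await findObjectByID(ID1);
  const neighbours = obj1.connections
    .concat(obj1.inclusions)
    .filter(id => !previousList.includes(id));

  for (const id of neighbours) {
    const result = await objectFinder(id, ID2, depth - 1, previousList);
    // Only a route that actually ends at ID2 counts as a hit.
    if (result && result[result.length - 1] == ID2) {
      return [ID1].concat(result);
    }
  }
  return null; // no route found from this node
}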

The fastest way to find an element in an array with a given property name and replace it

I have a performance issue with NgRx. I have an array with thousands of objects that looks like this (I can't change that structure even though I don't like it):
state.alarms structure:
[
  { global: {...} },
  { 282: {...} },
  { 290: {...} },
  { 401: {...} }
  etc...
]
In addNewAlarm(state, alarm), the alarm object is for example:
{ 282: {...} }
As you can see, the object looks something like { someNumber: nestedObjectForThatNumber }.
I'm listening for changes, and when one appears I have to replace the array item whose key is the given number.
In the example above, if I get { 282: {x: 1, y: 2, z: 3} }, I have to replace the array item at index 1.
In my reducer I've created something like this, but it doesn't work as I expected:
export function addNewAlarm(state: State, alarm: AlarmsObject): State | undefined {
  const alarms: AlarmsObject[] = [...state.alarms];
  if (state) {
    const existingRecord = state.alarms.find(alarm1 => alarm1.hasOwnProperty(Object.keys(alarm)[0]));
    if (existingRecord) {
      const index = state.alarms.indexOf(existingRecord);
      alarms[index] = alarm;
    }
  }
  return { ...state, alarms };
}
Maybe someone can give me a hint on how to do it the right way?
You can use findIndex (it returns -1 if nothing is found), but why not create an object instead?
stateObj: any = {};
this.state.forEach((x) => {
  this.stateObj = { ...this.stateObj, ...x };
});
So you only need to use:
// Note: you needn't return anything
addNewAlarm(stateObj: any, alarm: AlarmsObject) {
  const key = Object.keys(alarm)[0]
  this.stateObj[key] = alarm[key]
}
A full StackBlitz example.
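If you would rather keep the array shape from the question, a hedged sketch of the findIndex-based replacement (reusing the State and AlarmsObject types from the question) could look like this:
export function addNewAlarm(state: State, alarm: AlarmsObject): State {
  const key = Object.keys(alarm)[0];
  const index = state.alarms.findIndex(entry => entry.hasOwnProperty(key));
  if (index === -1) {
    // No entry for this key yet: append instead of replacing.
    return { ...state, alarms: [...state.alarms, alarm] };
  }
  const alarms = [...state.alarms];
  alarms[index] = alarm;
  return { ...state, alarms };
}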

javascript string comparison issue in array.filter()

I have an array which contains the following objects:
myArray = [
  { item: { id: 111557 } },
  { item2: { id: 500600 } }
]
and I have a variable
targetItemID = '111557'
Note that one is a string, and the ones in the array are numbers. I'm trying to get the object with the correct item id.
Here is what I have tried:
myArray = [
  { item: { id: 111557 } },
  { item2: { id: 500600 } }
]
targetItemID = '111557'
var newArray = myArray.filter(x => {
  console.log(x.item.id.toString())
  console.log(targetItemID.toString())
  x.item.id.toString() === itemID.toString()
})
console.log(newArray);
I expect all matching objects to be added to newArray. I tried to check the values before the comparison: they are both strings and they seem exactly the same, but my newArray is still empty.
Your second object doesn't have an item property, and it should.
You need a return in your filter function.
You must compare x.item.id against targetItemID, not itemID. Since you are using console.log() you would have seen an error that itemID is not defined ;).
myArray = [
  { item: { id: 111557 } },
  { item: { id: 500600 } }
];
targetItemID = '111557'
var newArray = myArray.filter(x => {
  //console.log(x.item.id.toString())
  //console.log(targetItemID.toString())
  return x.item.id.toString() === targetItemID.toString();
});
console.log(newArray);
There are a few issues here. First, not all your objects have an item property, so you'll need to check it exists. Second, you're comparing them against a non-existent itemID instead of targetItemID. And finally, as #bryan60 mentioned, if you open a block in an anonymous lambda, you need an explicit return statement, although, to be honest, you really don't need the block in this case:
var newArray =
myArray.filter(x => x.item && x.item.id && x.item.id.toString() === targetItemID)
You need to return for filter to work:
return x.item.id.toString() === targetItemID.toString();
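Alternatively, since the ids in the array are numbers, you can convert the target once and compare numerically (a small sketch assuming the same myArray and targetItemID as above, and that every element has an item property):
var newArray = myArray.filter(x => x.item && x.item.id === Number(targetItemID));
console.log(newArray); // [{ item: { id: 111557 } }]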

Apollo GraphQL updateQuery to typePolicy

I am beating my head against a wall. I have updated to Apollo 3 and cannot figure out how to migrate an updateQuery to a typePolicy. I am doing basic continuation-based pagination, and this is how I used to merge the results of fetchMore:
await fetchMore({
  query: MessagesByThreadIDQuery,
  variables: {
    threadId: threadId,
    limit: Configuration.MessagePageSize,
    continuation: token
  },
  updateQuery: (prev, curr) => {
    // Extract our updated message page.
    const last = prev.messagesByThreadId.messages ?? []
    const next = curr.fetchMoreResult?.messagesByThreadId.messages ?? []
    return {
      messagesByThreadId: {
        __typename: 'MessagesContinuation',
        messages: [...last, ...next],
        continuation: curr.fetchMoreResult?.messagesByThreadId.continuation
      }
    }
  }
})
I have made an attempt to write the merge typePolicy myself, but it just continually loads and throws errors about duplicate identifiers in the Apollo cache. Here is what my typePolicy looks like for my query.
typePolicies: {
  Query: {
    fields: {
      messagesByThreadId: {
        keyArgs: false,
        merge: (existing, incoming, args): IMessagesContinuation => {
          const typedExisting: IMessagesContinuation | undefined = existing
          const typedIncoming: IMessagesContinuation | undefined = incoming
          const existingMessages = (typedExisting?.messages ?? [])
          const incomingMessages = (typedIncoming?.messages ?? [])
          const result = existing ? {
            __typename: 'MessageContinuation',
            messages: [...existingMessages, ...incomingMessages],
            continuation: typedIncoming?.continuation
          } : incoming
          return result
        }
      }
    }
  }
}
So I was able to solve my use case. It seems way harder than it really needs to be. I essentially have to locate existing items that match the incoming ones and overwrite them, as well as add any new items that don't yet exist in the cache.
I also only apply this logic if a continuation token was provided; if it's null or undefined, I just use the incoming value, since that indicates an initial load.
My document is shaped like this:
{
  "items": [{ id: string, ...others }],
  "continuation": "some_token_value"
}
I created a generic type policy that I can use for all my documents that have a similar shape. It allows me to specify the name of the items property, what the key args are that I want to cache on, and the name of the graphql type.
export function ContinuationPolicy(keyArgs: Array<string>, itemPropertyKey: string, typeName: string) {
  return {
    keyArgs,
    merge(existing: any, incoming: any, args: any) {
      if (!!existing && !!args.args?.continuation) {
        const existingItems = (existing ? existing[itemPropertyKey] : [])
        const incomingItems = (incoming ? incoming[itemPropertyKey] : [])
        let items: Array<any> = [...existingItems]
        for (let i = 0; i < incomingItems.length; i++) {
          const current = incomingItems[i] as any
          const found = items.findIndex(m => m.__ref === current.__ref)
          if (found > -1) {
            // Overwrite the existing cached item with the incoming one.
            items[found] = current
          } else {
            items = [...items, current]
          }
        }
        // This new data is a continuation of the last data.
        return {
          __typename: typeName,
          [itemPropertyKey]: items,
          continuation: incoming.continuation
        }
      } else {
        // When we have no existing data in the cache, we'll just use the incoming data.
        return incoming
      }
    }
  }
}
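For reference, a hedged sketch of wiring this helper into the cache; the field name, key argument, and type name below simply mirror the question and may need adjusting:
import { InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // keyArgs, items property name, and __typename taken from the question.
        messagesByThreadId: ContinuationPolicy(['threadId'], 'messages', 'MessagesContinuation'),
      },
    },
  },
})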

Data getting added suspiciously after Firebase call

I am struggling with a strange issue and am not able to figure out the root cause.
I am calling Firebase for data using the code below:
public getAllObject(filters: Filter[]): Observable<T[]> {
  return this.afs
    .collection('somecollection')
    .snapshotChanges()
    .pipe(
      take(1),
      map((changes) => {
        return changes.map((a) => {
          console.log(a.payload.doc.data()); // ==============> displays everything is alright
          const data: any = a.payload.doc.data() as T;
          const code = a.payload.doc.id;
          return { code, ...data } as T;
        });
      })
    );
}
and I am consuming the above service as shown below:
this.service.getAllObject(this.service.getFilters()).subscribe(
  (entities) => {
    console.log(entities); // ==============> displays the array and things are wrong
  },
  (error) => {
    console.error(error);
  }
);
Problem description
When I call the above API, I get an array of the objects below. The problem is with the stores attribute: as we move forward in the array, the stores attribute accumulates values from previous elements. This phenomenon is HAPPENING ON THE CLIENT SIDE only.
I was wondering if the attribute name 'stores' is some kind of reserved keyword that is causing this, or whether I am using RxJS in the wrong way.
Current results
{
  code: 123,
  stores: { abc }
},
{
  code: 345,
  stores: { def, abc }
},
{
  code: 678,
  stores: { xyz, def, abc }
},
Expected results
The console.log inside getAllObject displays the following, which is correct:
{
  code: 123,
  stores: { abc }
},
{
  code: 345,
  stores: { def }
},
{
  code: 678,
  stores: { xyz }
},
Current analysis
console.log(a.payload.doc.data()); // ====> shows the correct data
const data: any = a.payload.doc.data();
const code: string = a.payload.doc.id;
console.log({ code, ...data });
return { code, ...data } as T; // =====> shows INCORRECT data, with stores from earlier elements added to the current one
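One way to narrow this down (a debugging sketch, not a fix) is to log a deep-copied snapshot inside the map; if the snapshots logged there are clean but the subscribed array is not, whatever is merging the stores values is happening after the map rather than in Firestore:
map((changes) =>
  changes.map((a) => {
    const data: any = a.payload.doc.data() as T;
    // Deep-copy snapshot so later mutations cannot retroactively change what was logged.
    console.log(JSON.parse(JSON.stringify(data)));
    const code = a.payload.doc.id;
    return { code, ...data } as T;
  })
)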
