I am beating my head against a wall. I have updated to Apollo 3 and cannot figure out how to migrate an updateQuery to a typePolicy. I am doing basic continuation-based pagination, and this is how I used to merge the results of fetchMore:
await fetchMore({
  query: MessagesByThreadIDQuery,
  variables: {
    threadId: threadId,
    limit: Configuration.MessagePageSize,
    continuation: token
  },
  updateQuery: (prev, curr) => {
    // Extract our updated message page.
    const last = prev.messagesByThreadId.messages ?? []
    const next = curr.fetchMoreResult?.messagesByThreadId.messages ?? []

    return {
      messagesByThreadId: {
        __typename: 'MessagesContinuation',
        messages: [...last, ...next],
        continuation: curr.fetchMoreResult?.messagesByThreadId.continuation
      }
    }
  }
})
I have made an attempt to write the merge typePolicy myself, but it just continually loads and throws errors about duplicate identifiers in the Apollo cache. Here is what my typePolicy looks like for my query.
typePolicies: {
  Query: {
    fields: {
      messagesByThreadId: {
        keyArgs: false,
        merge: (existing, incoming, args): IMessagesContinuation => {
          const typedExisting: IMessagesContinuation | undefined = existing
          const typedIncoming: IMessagesContinuation | undefined = incoming

          const existingMessages = (typedExisting?.messages ?? [])
          const incomingMessages = (typedIncoming?.messages ?? [])

          const result = existing ? {
            __typename: 'MessageContinuation',
            messages: [...existingMessages, ...incomingMessages],
            continuation: typedIncoming?.continuation
          } : incoming

          return result
        }
      }
    }
  }
}
So I was able to solve my use case, though it seems way harder than it really needs to be. I essentially have to locate existing items matching the incoming ones and overwrite them, as well as add any new items that don't yet exist in the cache.
I also have to apply this logic only when a continuation token was provided; if it's null or undefined, I just use the incoming value, because that indicates an initial load.
My document is shaped like this:
{
  "items": [{ id: string, ...others }],
  "continuation": "some_token_value"
}
I created a generic type policy that I can use for all my documents that have a similar shape. It allows me to specify the name of the items property, the key args I want to cache on, and the name of the GraphQL type.
export function ContinuationPolicy(keyArgs: Array<string>, itemPropertyKey: string, typeName: string) {
  return {
    keyArgs,
    merge(existing: any, incoming: any, args: any) {
      if (!!existing && !!args.args?.continuation) {
        const existingItems = (existing ? existing[itemPropertyKey] : [])
        const incomingItems = (incoming ? incoming[itemPropertyKey] : [])
        let items: Array<any> = [...existingItems]

        for (let i = 0; i < incomingItems.length; i++) {
          const current = incomingItems[i] as any
          const found = items.findIndex(m => m.__ref === current.__ref)
          if (found > -1) {
            // Overwrite the existing reference (note: assignment, not comparison).
            items[found] = current
          } else {
            items = [...items, current]
          }
        }

        // This new data is a continuation of the last data.
        return {
          __typename: typeName,
          [itemPropertyKey]: items,
          continuation: incoming.continuation
        }
      } else {
        // When we have no existing data in the cache, we'll just use the incoming data.
        return incoming
      }
    }
  }
}
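A policy created this way then gets registered per field when building the cache. The field name and key argument below are just an example of how I wire it up, not part of the policy itself:

import { InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // One cached list per thread; pages are merged by the policy above.
        messagesByThreadId: ContinuationPolicy(['threadId'], 'messages', 'MessagesContinuation'),
      },
    },
  },
})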
I'm trying to work around the fact that Datocms doesn't support a where filter in their GraphQL schema. Since there isn't that much data, I figured I could query all of it, and do the find on my end, but ... I'm not succeeding, at least not using "modern" methods.
What I get back when I query all of the data looks like this:
"foo": {
"data": {
"allGiveawayLandingPages": [
{
"lpSection": [
{},
{},
{},
{},
{},
{},
{},
{
"id": "34525949",
"products": [
{
"__typename": "PurchaseCardRecord",
"discountAmount": 50,
"discountAmountPct": null,
"discountEndDate": "2022-11-01T23:00:00+00:00",
"id": "44144096"
},
{
"__typename": "PurchaseCardRecord",
"discountAmount": null,
"discountAmountPct": null,
"discountEndDate": null,
"id": "44144097"
}
]
}
]
}
]
}
}
I need to find the object down in the "products" array by "id". This general question has been asked and answered lots of times, but the only answer I can get to work is from way back in 2013, and it seems to me there ought to be a more modern way to do it.
I'm doing this inside of a try/catch block, which I mention because it seems to be making this hard to debug (I'll come back to this):
export default async function createPaymentIntentHandler(req, res) {
  const body = JSON.parse(req.body);
  const {
    productId,
    productType
  } = body;

  let data;
  let productObjName;

  if ('POST' === req.method) {
    try {
      switch (productType) {
        case 'SeminarRecord':
          data = await request({ query: singleSeminarQuery(productId) });
          productObjName = 'seminar';
          break;
        default:
          data = await request({ query: singleProductQuery(productId) });
          productObjName = 'product';
      }

      /**
       * Here's where I want to do my query / filtering
       */

      // ... do more stuff and create Stripe paymentIntent
      res.status(200).send({clientSecret: paymentIntent.client_secret})
    } catch (error) {
      logger.error({error}, 'Create Payment Intent error');
      return res.status(400).end(`Create Payment Intent error: ${error.message}`);
    }
  } else {
    res.status(405).end('Method not allowed');
  }
}
My first, naive attempt was
const foo = await request({ query: ALL_PURCHASE_CARDS_QUERY });
const card = foo.data.allGiveawayLandingPages.find((page) => {
  return page.lpSection.find((section) => {
    return section?.products.find((record) => record.id === parentId)
  })
});
logger.debug({card}, 'Got card');
In the abstract, aside from the fact that the above is fairly brittle because it relies on the schema not changing, I'd expect some similar sort of ES6 construction to work. This particular one, however, throws, but not in a particularly useful way:
[08:09:18.690] ERROR: Create Payment Intent error
env: "development"
error: {}
That's what I meant by it being hard to debug — I don't know why the error object is empty. But, in any case, that's when I started searching StackOverflow. The first answer which looked promising was this one, which I implemented as
...
const {
  productId,
  productType,
  parentId
} = body;
...

function findCard(parent, id) {
  logger.debug({parent}, 'searching in parent')
  for (const item of parent) {
    if ('PurchaseCardRecord' === item.__typename && item.id === id) return item;
    if (item.children?.length) {
      const innerResult = findCard(item.children, id);
      if (innerResult) return innerResult;
    }
  }
}

if ('POST' === req.method) {
  try {
    ...
    const foo = await request({ query: ALL_PURCHASE_CARDS_QUERY });
    const card = findCard(foo, parentId);
    logger.debug({card}, 'Got card');
This similarly throws unhelpfully, but my guess is it doesn't work because in the structure, not all children are iterables. Then I found this answer, which uses reduce instead of my original attempt at find, so I took a pass at it:
const card = foo.data.allGiveawayLandingPages.reduce((item) => {
  item?.lpSection.reduce((section) => {
    section?.products.reduce((record) => {
      if ('PurchaseCardRecord' === record.__typename && record.id === parentId) return record;
    })
  })
})
This is actually the closest I've gotten using ES6 functionality. It doesn't throw an error; however, it's also not returning the matching child object, it's returning the first parent object that contains the match (i.e., it's returning the whole "lpSection" object). Also, it has the same brittleness problem of requiring knowledge of the schema. I'm relatively certain something like this is the right way to go, but I'm just not understanding his original construction:
arr.reduce((a, item) => {
  if (a) return a;
  if (item.id === id) return item;
I've tried to understand the MDN documentation for Array.reduce, but, I don't know, I must be undercaffeinated or something. The syntax is described as
reduce((previousValue, currentValue) => { /* … */ } )
and then several variations on the theme. I thought it would return all the way up the stack in my construction, but it doesn't. I also tried
const card = foo.data.allGiveawayLandingPages.reduce((accumulator, item) => {
  return item?.lpSection.reduce((section) => {
    return section?.products.reduce((record) => {
      if ('PurchaseCardRecord' === record.__typename && record.id === parentId) return record;
    })
  })
})
but the result was the same. Finally, not understanding what I'm doing, I went back to an older answer that doesn't use the ES6 methods but relies on recursing the object.
...
function filterCards(object) {
  if (object.hasOwnProperty('__typename') && object.hasOwnProperty('id') && ('PurchaseCardRecord' === object.__typename && parentId === object.id)) return object;
  for (let i = 0; i < Object.keys(object).length; i++) {
    if (typeof object[Object.keys(object)[i]] == 'object') {
      const o = filterCards(object[Object.keys(object)[i]]);
      if (o != null) return o;
    }
  }
  return null;
}

if ('POST' === req.method) {
  try {
    ...
    const foo = await request({ query: ALL_PURCHASE_CARDS_QUERY });
    const card = filterCards(foo);
    logger.debug({card}, 'Got card');
This actually works, but ISTM there should be a more elegant way to solve the problem with modern JavaScript. I'm thinking it's some combination of .find, .some, and .reduce. Or maybe just for ... in.
I'll keep poking at this, but if anyone has an elegant/modern answer, I'd appreciate it!
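For what it's worth, here is roughly the shape I'm imagining, as a sketch only, assuming the response structure shown above and the same parentId. flatMap flattens the nested arrays and find stops at the first match:

// Flatten landing pages -> sections -> products, then find the matching card.
// Optional chaining plus `?? []` guards against empty sections or missing products.
const card = foo.data.allGiveawayLandingPages
  .flatMap((page) => page.lpSection ?? [])
  .flatMap((section) => section?.products ?? [])
  .find((record) => record.__typename === 'PurchaseCardRecord' && record.id === parentId);

The trick in the reduce-based answer, by the way, is that `a` is the accumulator and `if (a) return a;` keeps passing a found match through every remaining iteration; without returning the accumulator, the match never makes it back out, which is why the nested attempts above don't surface the card.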
I need to be able to receive data from an external API and map it dynamically to classes. When the data is a plain object, a simple Object.assign does the job, but when there are nested objects you need to call Object.assign on all of the nested objects.
The approach I used was to create a recursive function, but I stumbled on this case where there's a nested array of objects.
Classes
class Organization {
  id = 'org1';
  admin = new User();
  users: User[] = [];
}

class User {
  id = 'user1';
  name = 'name';
  account = new Account();

  getFullName() {
    return `${this.name} surname`;
  }
}

class Account {
  id = 'account1';
  money = 10;

  calculate() {
    return 10 * 2;
  }
}
Function to initialize a class
function create(instance: object, data: any) {
  for (const [key, value] of Object.entries(instance)) {
    if (Array.isArray(value)) {
      for (const element of data[key]) {
        // get the type of the element in array dynamically
        const newElement = new User();
        create(newElement, element)
        value.push(newElement);
      }
    } else if (typeof value === 'object') {
      create(value, data[key]);
    }
    Object.assign(value, data);
  }
}
const orgWithError = Object.assign(new Organization(), { admin: { id: 'admin-external' }});
console.log(orgWithError.admin.getFullName()); // orgWithError.admin.getFullName is not a function
const org = new Organization();
const data = { id: 'org2', admin: { id: 'admin2' }, users: [ { id: 'user-inside' }]}
create(org, data);
// this case works because I manually initialize the user in the create function
// but I need this function to be generic to any class
console.log(org.users[0].getFullName()); // "name surname"
Initially I was trying to first scan the classes and map it and then do the assign, but the problem with the array of object would happen anyway I think.
As far as I understand from your code, what you basically want to do is, given an object, determine what class it is supposed to represent: Organization, Account or User.
So you need a way to distinguish between different kinds of objects in some way. One option may be to add a type field to the API response, but this will only work if you have access to the API code, which you apparently don't. Another option would be to check if an object has some fields that are unique to the class it represents, like admin for Organization or account for User. But it seems like your API response doesn't always contain all the fields that the class does, so this might also not work.
So why do you need this distinction in the first place? It seems like the only kind of array that your API may send is an array of users, so you could just stick to what you have now; there are no other arrays that may show up anyway.
Also, a solution that I find more logical is not to depend on Object.assign to just assign all properties somehow by itself, but to do it manually, maybe with a factory function like I did in the code below. That approach gives you more control, and you can also perform some validation in these factory methods in case you need it.
class Organization {
  id = 'org1';
  admin = new User();
  users: User[] = [];

  static fromApiResponse(data: any) {
    const org = new Organization()
    if (data.id) org.id = data.id
    if (data.admin) org.admin = User.fromApiResponse(data.admin)
    if (data.users) {
      org.users = data.users.map(user => User.fromApiResponse(user))
    }
    return org
  }
}
class User {
  id = 'user1';
  name = 'name';
  account = new Account();

  getFullName() {
    return `${this.name} surname`;
  }

  static fromApiResponse(data: any) {
    const user = new User()
    if (data.id) user.id = data.id
    if (data.name) user.name = data.name
    if (data.account)
      user.account = Account.fromApiResponse(data.account)
    return user
  }
}

class Account {
  id = 'account1';
  money = 10;

  calculate() {
    return 10 * 2;
  }

  static fromApiResponse(data: any) {
    const acc = new Account()
    if (data.id) acc.id = data.id
    if (data.money) acc.money = data.money
    return acc
  }
}
const data = { id: 'org2', admin: { id: 'admin2' }, users: [ { id: 'user-inside' }]}
const organization = Organization.fromApiResponse(data)
I can't conceive of a way to do this generically without any configuration. But I can come up with a way to do this using a configuration object that looks like this:
{
  org: { _ctor: Organization, admin: 'usr', users: '[usr]' },
  usr: { _ctor: User, account: 'acct' },
  acct: { _ctor: Account }
}
and a pointer to the root node, 'org'.
The keys of this object are simple handles for your type/subtypes. Each one is mapped to an object that has a _ctor property pointing to a constructor function, and a collection of other properties that are the names of members of your object and matching properties of your input. Those then are references to other handles. For an array, the handle is [surrounded by square brackets].
Here's an implementation of this idea:
const create = (root, config) => (data, {_ctor, ...keys} = config [root]) =>
  Object.assign (new _ctor (), Object .fromEntries (Object .entries (data) .map (
    ([k, v]) =>
      k in keys
        ? [k, /^\[.*\]$/ .test (keys [k])
            ? v .map (o => create (keys [k] .slice (1, -1), config) (o))
            : create (keys [k], config) (v)
          ]
        : [k, v]
  )))
class Organization {
  constructor () { this.id = 'org1'; this.admin = new User(); this.users = [] }
}

class User {
  constructor () { this.id = 'user1'; this.name = 'name'; this.account = new Account() }
  getFullName () { return `${this.name} surname`}
}

class Account {
  constructor () { this.id = 'account1'; this.money = 10 }
  calculate () { return 10 * 2 }
}
const createOrganization = create ('org', {
  org: { _ctor: Organization, admin: 'usr', users: '[usr]' },
  usr: { _ctor: User, account: 'acct' },
  acct: { _ctor: Account }
})

const orgWithoutError = createOrganization ({ admin: { id: 'admin-external' }});
console .log (orgWithoutError .admin .getFullName ()) // has the right properties

const data = { id: 'org2', admin: { id: 'admin2' }, users: [ { id: 'user-inside' }]}
const org = createOrganization (data)
console .log (org .users [0] .getFullName ()) // has the right properties
console .log ([
  org .constructor .name,
  org .admin .constructor.name, // has the correct hierarchy
  org .users [0]. account. constructor .name
] .join (', '))
console .log (org) // entire object is correct
The main function, create, receives the name of the root node and such a configuration object. It returns a function which takes a plain JS object and hydrates it into your Object structure. Note that it doesn't require you to pre-construct the objects as does your attempt. All the calling of constructors is done internally to the function.
I'm not much of a Typescript user, and I don't have a clue about how to type such a function, or whether TS is even capable of doing so. (I think there's a reasonable chance that it is not.)
There are many ways that this might be expanded, if needed. We might want to allow for property names that vary between your input structure and the object member name, or we might want to allow other collection types besides arrays. If so, we probably would need a somewhat more sophisticated configuration structure, perhaps something like this:
{
  org: { _ctor: Organization, admin: {type: 'usr'}, users: {type: Array, itemType: 'usr'} },
  usr: { _ctor: User, account: {type: 'acct', renameTo: 'clientAcct'} },
  acct: { _ctor: Account }
}
But that's for another day.
It's not clear whether this approach even comes close to meeting your needs, but it was an interesting problem to consider.
When I fetch new alerts, I want to check if the ID of the new alert was already recorded. The issue is that the ID is nested inside an array: there's the alertDetails array, which contains objects, and those objects have an _id field, which is what I want to check. I am not sure how to achieve that. I got the code below, but then I have to iterate over the result to check the exists value. I'm sure there must be a better way.
const mongoose = require('mongoose');
const { Schema } = mongoose;

const G2AlertsSchema = new Schema(
  {
    status: { type: String, required: true },
    openDate: { type: Date, required: true },
    alertType: { type: Array, required: true },
    severity: { type: Array, required: true },
    locationName: { type: Array, required: true },
    history: { type: Array, required: true },
    alertDetails: { type: Array, required: false },
    assignedTo: { type: Schema.Types.ObjectId, ref: 'user' },
  },
  {
    timestamps: true,
  },
);

const G2Alerts = mongoose.model('G2Alert', G2AlertsSchema);

module.exports = G2Alerts;
This is the code I found on MongoDB's website. I just want to see if the ID exists. Basically, when I fetch the new alerts I get an array and iterate over it; I want to check each item's ID against what's in the database. If it's there, skip it and go to the next. If it's new, create a new alert and save it.
const exists = await G2Alerts.aggregate([
  {
    $project: {
      exists: {
        $in: ['5f0b4f508bda3805754ab343', '$alertDetails._id'],
      },
    },
  },
]);
EDIT: Another thing: I am getting an ESLint warning saying I should use array iteration instead of a for loop. The issue is that I need to use await when looking up the alert ID. If I use reduce or filter, I can't use await. If I make the reduce or filter callback async, it will just return promises or an empty array.
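For completeness, the usual pattern when each item genuinely needs its own awaited lookup is to map the items to promises and await them all together. This is just a sketch of that idea, not the code I ended up using:

// Run the lookups in parallel, then keep only the items with no existing match.
const checks = await Promise.all(
  alertData.data.items.map(async (item) => {
    const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
    return exists.length === 0 ? item : null;
  })
);
const newAlertsData = checks.filter((item) => item !== null);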
This below works, based on the answer provided by Tom Slabbaert
const newAlertsData = [];
for (let item of alertData.data.items) {
  const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
  if (exists.length === 0) {
    newAlertsData.push(item);
  }
}
if (newAlertsData.length !== 0) {......
But this does not
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
  if (exists.length === 0) {
    filtered.push(item);
  }
  return filtered;
}, []);
You're not far off, here is an example using the correct syntax:
const exists = await G2Alerts.findOne({ "alertDetails._id": '5f0b4f508bda3805754ab343' });
if (!exists) {
... do something
}
This can also be achieved using aggregate with a $match stage instead of a $project stage, or, even better, countDocuments, which just returns the count instead of the entire document if you do not require it.
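For example, a countDocuments version of the same check could look roughly like this (sketch only, using the same hard-coded id as above):

// Count alerts whose alertDetails array contains this _id; 0 means it's new.
const count = await G2Alerts.countDocuments({ 'alertDetails._id': '5f0b4f508bda3805754ab343' });
if (count === 0) {
  // ... do something
}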
One more thing I'd like to add: make sure alertDetails._id is a string type, as you're using a string in your $in; otherwise you'll need to cast the ids to ObjectId type in Mongoose like so:
new mongoose.Types.ObjectId('5f0b4f508bda3805754ab343')
And for Mongo:
import {ObjectId} from "mongodb"
...
new ObjectId('5f0b4f508bda3805754ab343')
EDIT
Try something like this?
let ids = alertData.data.items.map(item => item._id.toString());
let existing = await G2Alerts.distinct("alertDetails._id", {"alertDetails._id": {$in: ids}});

const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  if (!existing.includes(item._id.toString())) {
    return [item].concat(filtered)
  }
  return filtered;
}, []);
This way you only need to call the db once and not multiple times.
Final code based on the provided answer.
const ids = alertData.data.items.map(item => item._id);
const existing = await G2Alerts.find({ 'alertDetails._id': { $in: ids } }).distinct(
  'alertDetails._id',
  (err, alerts) => {
    if (err) {
      res.send(err);
    }
    return alerts;
  },
);

const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  if (!existing.includes(item._id.toString()) && item.openDate > dateLimit) {
    return [item].concat(filtered);
  }
  return filtered;
}, []);
MirageJS provides all model ids as strings. Our backend uses integers, which are convenient for sorting and so on. After reading around, it seems MirageJS does not support integer IDs out of the box, and from the conversations I've read the best solution would be to convert the ids in a serializer.
Output:
{
id: "1",
title: "Some title",
otherValue: "Some other value"
}
But what I want is:
Expected Output:
{
id: 1,
title: "Some title",
otherValue: "Some other value"
}
I really want to convert ALL ids. This would include nested objects and serialized IDs.
I think you should be able to use a custom IdentityManager for this. Here's a REPL example. (Note: REPL is a work in progress + currently only works on Chrome).
Here's the code:
import { Server, Model } from "miragejs";

class IntegerIDManager {
  constructor() {
    this.ids = new Set();
    this.nextId = 1;
  }

  // Returns a new unused unique identifier.
  fetch() {
    let id = this.nextId++;
    this.ids.add(id);
    return id;
  }

  // Registers an identifier as used. Must throw if identifier is already used.
  set(id) {
    if (this.ids.has(id)) {
      throw new Error('ID ' + id + ' has already been used.');
    }
    this.ids.add(id);
  }

  // Resets all used identifiers to unused.
  reset() {
    this.ids.clear();
  }
}

export default new Server({
  identityManagers: {
    application: IntegerIDManager,
  },

  models: {
    user: Model,
  },

  seeds(server) {
    server.createList("user", 3);
  },

  routes() {
    this.resource("user");
  },
});
When I make a GET request to /users with this server I get integer IDs back.
My solution is to traverse the data and recursively convert all Ids. It's working pretty well.
I have a number of other requirements, like removing the data key and embedding or serializing Ids.
const ApplicationSerializer = Serializer.extend({
  root: true,

  serialize(resource, request) {
    // required to serializedIds
    // handle removing root key
    const json = Serializer.prototype.serialize.apply(this, arguments)
    const root = resource.models
      ? this.keyForCollection(resource.modelName)
      : this.keyForModel(resource.modelName)
    const keyedItem = json[root]

    // convert single string id to integer
    const idToInt = id => Number(id)
    // convert array of ids to integers
    const idsToInt = ids => ids.map(id => idToInt(id))
    // check if the data being passed is a collection or model
    const isCollection = data => Array.isArray(data)
    // check if data should be traversed
    const shouldTraverse = entry =>
      Array.isArray(entry) || entry instanceof Object
    // check if the entry is an id
    const isIdKey = key => key === 'id'
    // check for serialized Ids
    // don't be stupid and create an array of values with a key like `arachnIds`
    const isIdArray = (key, value) =>
      key.slice(key.length - 3, key.length) === 'Ids' && Array.isArray(value)

    // traverse the passed model and update Ids where required, keeping other entries as is
    const traverseModel = model =>
      Object.entries(model).reduce(
        (a, c) =>
          isIdKey(c[0])
            ? // convert id to int
              { ...a, [c[0]]: idToInt(c[1]) }
            : // convert id array to int
            isIdArray(c[0], c[1])
              ? { ...a, [c[0]]: idsToInt(c[1]) }
              : // traverse nested entries
              shouldTraverse(c[1])
                ? { ...a, [c[0]]: applyFuncToModels(c[1]) }
                : // keep regular entries
                  { ...a, [c[0]]: c[1] },
        {}
      )

    // start traversal of data
    const applyFuncToModels = data =>
      isCollection(data)
        ? data.map(model =>
            // confirm we're working with a model, and not a value
            model instanceof Object ? traverseModel(model) : model
          )
        : traverseModel(data)

    return applyFuncToModels(keyedItem)
  }
})
I had to solve this problem as well (fingers crossed that this gets included in the library), and my use case is simpler than the first answer's.
function convertIdsToNumbers(o) {
  Object.keys(o).forEach((k) => {
    const v = o[k]
    if (Array.isArray(v) || v instanceof Object) convertIdsToNumbers(v)
    if (k === 'id' || /.*Id$/.test(k)) {
      o[k] = Number(v)
    }
  })
}
const ApplicationSerializer = RestSerializer.extend({
  root: false,
  embed: true,

  serialize(object, request) {
    let json = Serializer.prototype.serialize.apply(this, arguments)
    convertIdsToNumbers(json)
    return {
      status: request.status,
      payload: json,
    }
  },
})
I'm having some trouble with this algorithm.
I'm using Redux, though I don't think that is really relevant for this problem. Basically, the console.log statement in this code logs only one object, just as it should, but function A returns an array of the two objects (even the one that didn't pass the test in function C).
I separated the functions into 3 parts to see if that would help me fix it, but I still couldn't figure it out.
Any advice?
const A = (state) => {
  // looks through an array and passes down a resource
  return state.resources.locked.filter((resource) => {
    return B(state, resource);
  })
};

// looks through an array and passes down a building
const B = (state, resource) => {
  return state.bonfire.allStructures.filter((building) => {
    return C(building, resource);
  })
};

// checks if building name and resource requirement are the same, and if building is unlocked
// then returns only that one
const C = (building, resource) => {
  if (building.unlocked && building.name == resource.requires.structure) {
    console.log(resource);
    return resource;
  }
}
When using filter, do realise that the callback functions you pass to it are expected to return a boolean value indicating whether a particular element needs to be filtered in or out.
But in your case, B does not return a boolean, but an array. And even when that array is empty (indicating no resource matches), such a value will not be considered false by filter, and so the corresponding resource will still occur in the array returned by A.
A quick fix: get the length of the array that is returned by B, and return that instead. Zero will be considered false:
const A = (state) => {
  // looks through an array and passes down a resource
  return state.resources.locked.filter((resource) => {
    return B(state, resource).length; /// <---- length!
  })
};

// looks through an array and passes down a building
const B = (state, resource) => {
  return state.bonfire.allStructures.filter((building) => {
    return C(building, resource);
  })
};

// checks if building name and resource requirement are the same, and if building
// is unlocked and then returns only that one
const C = (building, resource) => {
  if (building.unlocked && building.name == resource.requires.structure) {
    return resource;
  }
}

// Sample data. Only x matches.
var state = {
  resources: {
    locked: [{ // resource
      requires: {
        structure: 'x'
      }
    }, { // resource
      requires: {
        structure: 'y'
      }
    }]
  },
  bonfire: {
    allStructures: [{ // building
      unlocked: true,
      name: 'x'
    }, { // building
      unlocked: true,
      name: 'z'
    }]
  }
};

console.log(A(state));
But better would be to really return booleans at each place where they are expected. So C should just return the result of the condition, and B could use some instead of filter, which not only returns a boolean, but also stops looking further once a match is found. In A you can have the original code now, as you really want A to return data (not a boolean).
Note also that you can use the short-cut notation for arrow functions that only have an expression that is evaluated:
// looks through an array and passes down a resource
const A = state => state.resources.locked.filter( resource => B(state, resource) );

// looks through an array and passes down a building
// Use .some instead of .filter: it returns a boolean
const B = (state, resource) =>
  state.bonfire.allStructures.some( building => C(building, resource) );

// checks if building name and resource requirement are the same, and if building
// is unlocked and then returns only that one
// Return boolean
const C = (building, resource) => building.unlocked
  && building.name == resource.requires.structure;

// Sample data. Only x matches.
var state = {
  resources: {
    locked: [{ // resource
      requires: {
        structure: 'x'
      }
    }, { // resource
      requires: {
        structure: 'y'
      }
    }]
  },
  bonfire: {
    allStructures: [{ // building
      unlocked: true,
      name: 'x'
    }, { // building
      unlocked: true,
      name: 'z'
    }]
  }
};

console.log(A(state));