In my React application I have created a custom hook which returns a set of data wherever it's used.
It's used like const projects = useProjects();
It's an array of objects; we can assume it looks like this:
[{ project_id: 123, name: 'p1' }, { project_id: 1234, name: 'p2' }]
Now I need to enrich this data with data from an API. So I have to loop through projects and add a new field to each object, so that the new array of objects looks like this:
[{ project_id: 123, name: 'p1', field3: 'api data' }, { project_id: 1234, name: 'p2', field3: 'api data1' }]
How can I achieve this?
What I did was loop through the projects data and directly add the field inside the loop. But I don't know if I should be mutating the data like that. I was hoping to find out whether this is good practice or not.
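Roughly, what I did looks like this (a simplified sketch; apiDataFor is a placeholder for however the extra field gets looked up):

const projects = useProjects();

// Mutating the objects returned by the hook in place:
projects.forEach(project => {
  project.field3 = apiDataFor(project.project_id); // apiDataFor is hypothetical
});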
There are a few ways you could solve this - it all depends on how you're getting the data back from the API. If you want to have that injected into the hook, you could do something like this -
import { useMemo } from 'react';

const DEFAULT_PROJECTS = [
  { project_id: 123, name: 'p1' },
  { project_id: 1234, name: 'p2' },
];

const useProjects = (dataFromApi) => {
  // Assuming that dataFromApi is what you got back from your API call,
  // and it's a Map keyed on the project_id.
  return useMemo(() => {
    return DEFAULT_PROJECTS.map(proj => {
      const extraData = dataFromApi.get(proj.project_id) || {};
      return {
        ...proj,
        ...extraData,
      };
    });
  }, [dataFromApi]);
};
The useMemo here isn't super helpful if dataFromApi is always changing - it will just rebuild the returned array every time.
If you wanted to fetch the data as part of your hook, you could do something like this -
import { useEffect, useMemo, useState } from 'react';

const DEFAULT_PROJECTS = [
  { project_id: 123, name: 'p1' },
  { project_id: 1234, name: 'p2' },
];

const useProjects = () => {
  const [dataFromApi, setDataFromApi] = useState(null);

  useEffect(() => {
    if (dataFromApi) return;
    // Simulate the data fetch
    const fetchData = async () => {
      return new Promise(resolve => {
        setTimeout(() => {
          const map = new Map();
          map.set(123, {
            field3: 'api data',
            field4: 'other data',
          });
          setDataFromApi(map);
          resolve(map);
        }, 2000);
      });
    };
    fetchData();
  }, [dataFromApi]);

  return useMemo(() => {
    const extraData = dataFromApi || new Map();
    return DEFAULT_PROJECTS.map(proj => {
      const extraFields = extraData.get(proj.project_id) || {};
      return {
        ...proj,
        ...extraFields,
      };
    });
  }, [dataFromApi]);
};

export default useProjects;
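Consuming it is then the same as your original const projects = useProjects(); call. A hypothetical component, just to show the shape of the returned data (field3 is undefined until the simulated fetch finishes):

import useProjects from './useProjects';

const ProjectList = () => {
  const projects = useProjects();
  return (
    <ul>
      {projects.map(p => (
        <li key={p.project_id}>{p.name}: {p.field3 ?? 'loading...'}</li>
      ))}
    </ul>
  );
};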
Here's a code sandbox that shows it in action.
I'm trying to build an index using FlexSearch and Node.js and store it on a local disk, as it takes quite a bit of time to build. The export seems to work, but when trying to import the file again with a new document index I get this error:
TypeError: Cannot read property 'import' of undefined
at Q.t.import (/opt/hermetic/hermetic/server/node_modules/flexsearch/dist/flexsearch.bundle.js:33:330)
at Object.retrieveIndex (/opt/hermetic/hermetic/server/build/search.js:86:25)
at Object.search (/opt/hermetic/hermetic/server/build/search.js:96:32)
at init (/opt/hermetic/hermetic/server/build/server.js:270:27)
I'm running Node.js version 14 and FlexSearch version 0.7.21. Below is the code I am using:
import fs from 'fs';
import Flexsearch from 'flexsearch';

const createIndex = async () => {
  const { Document } = Flexsearch;
  const index = new Document({
    document: {
      id: 'id',
      tag: 'tag',
      store: true,
      index: [
        'record:a',
        'record:b',
        'tag',
      ],
    },
  });
  index.add({ id: 0, tag: 'category1', record: { a: '1 aaa', b: '0 bbb' } });
  index.add({ id: 1, tag: 'category1', record: { a: '1 aaa', b: '1 bbb' } });
  index.add({ id: 2, tag: 'category2', record: { a: '2 aaa', b: '2 bbb' } });
  index.add({ id: 3, tag: 'category2', record: { a: '2 aaa', b: '3 bbb' } });
  console.log('search', index.search('aaa'));
  await index.export((key, data) => fs.writeFile(`./search_index/${key}`, data, err => !!err && console.log(err)));
  return true;
}

const retrieveIndex = async () => {
  const { Document } = Flexsearch;
  const index = new Document({
    document: {
      id: 'id',
      tag: 'tag',
      store: true,
      index: [
        'record:a',
        'record:b',
        'tag',
      ],
    },
  });
  const keys = fs
    .readdirSync('./search_index', { withFileTypes: true }, err => !!err && console.log(err))
    .filter(item => !item.isDirectory())
    .map(item => item.name);
  for (let i = 0, key; i < keys.length; i += 1) {
    key = keys[i];
    const data = fs.readFileSync(`./search_index/${key}`, 'utf8');
    index.import(key, data);
  }
  return index;
}

await createIndex();
const index = await retrieveIndex();
console.log('cached search', index.search('aaa'));
I was trying to find a way to export the index properly too, originally trying to put everything into one file. While it worked, I didn't really like the solution.
That brought me to your SO question; I checked your code and managed to find out why you get that error.
Basically the export is a sync operation, while you also (randomly) use async. In order to avoid the issue, you need to remove all async code and only use sync node.fs operations. For my solution, I also create the Document store only once and then fill it via retrieveIndex(), rather than creating a new Document() per function.
I also added a .json extension to guarantee that node.fs reads the file properly, and for sanity purposes - after all, it's JSON that's stored.
So thanks for giving me the idea to store each key as a file, @Jamie Nicholls 🤝
import fs from 'fs';
import { Document } from 'flexsearch'

const searchIndexPath = '/Users/user/Documents/linked/search-index/'

let index = new Document({
  document: {
    id: 'date',
    index: ['content']
  },
  tokenize: 'forward'
})

const createIndex = () => {
  index.add({ date: "2021-11-01", content: 'asdf asdf asd asd asd asd' })
  index.add({ date: "2021-11-02", content: 'fobar 334kkk' })
  index.add({ date: "2021-11-04", content: 'fobar 234 sffgfd' })
  index.export(
    (key, data) => fs.writeFileSync(`${searchIndexPath}${key}.json`, data !== undefined ? data : '')
  )
}

createIndex()

const retrieveIndex = () => {
  const keys = fs
    .readdirSync(searchIndexPath, { withFileTypes: true })
    .filter(item => !item.isDirectory())
    .map(item => item.name.slice(0, -5))
  for (let i = 0, key; i < keys.length; i += 1) {
    key = keys[i]
    const data = fs.readFileSync(`${searchIndexPath}${key}.json`, 'utf8')
    index.import(key, data ?? null)
  }
}

const searchStuff = () => {
  retrieveIndex()
  console.log('cached search', index.search('fo'))
}

searchStuff()
After looking into this further, the feature is not currently available for document-type searches. See this issue on GitHub for more information.
I am beating my head against a wall. I have updated to Apollo 3, and cannot figure out how to migrate an updateQuery to a typePolicy. I am doing basic continuation-based pagination, and this is how I used to merge the results of fetchMore:
await fetchMore({
  query: MessagesByThreadIDQuery,
  variables: {
    threadId: threadId,
    limit: Configuration.MessagePageSize,
    continuation: token
  },
  updateQuery: (prev, curr) => {
    // Extract our updated message page.
    const last = prev.messagesByThreadId.messages ?? []
    const next = curr.fetchMoreResult?.messagesByThreadId.messages ?? []
    return {
      messagesByThreadId: {
        __typename: 'MessagesContinuation',
        messages: [...last, ...next],
        continuation: curr.fetchMoreResult?.messagesByThreadId.continuation
      }
    }
  }
})
I have made an attempt to write the merge typePolicy myself, but it just continually loads and throws errors about duplicate identifiers in the Apollo cache. Here is what my typePolicy looks like for my query.
typePolicies: {
  Query: {
    fields: {
      messagesByThreadId: {
        keyArgs: false,
        merge: (existing, incoming, args): IMessagesContinuation => {
          const typedExisting: IMessagesContinuation | undefined = existing
          const typedIncoming: IMessagesContinuation | undefined = incoming
          const existingMessages = (typedExisting?.messages ?? [])
          const incomingMessages = (typedIncoming?.messages ?? [])
          const result = existing ? {
            __typename: 'MessagesContinuation',
            messages: [...existingMessages, ...incomingMessages],
            continuation: typedIncoming?.continuation
          } : incoming
          return result
        }
      }
    }
  }
}
So I was able to solve my use case. It seems way harder than it really needs to be. I essentially have to locate existing items that match the incoming ones and overwrite them, as well as add any new items that don't yet exist in the cache.
I also have to only apply this logic if a continuation token was provided, because if it's null or undefined, I should just use the incoming value because that indicates that we are doing an initial load.
My document is shaped like this:
{
"items": [{ id: string, ...others }],
"continuation": "some_token_value"
}
I created a generic type policy that I can use for all my documents that have a similar shape. It allows me to specify the name of the items property, the key args I want to cache on, and the name of the GraphQL type.
export function ContinuationPolicy(keyArgs: Array<string>, itemPropertyKey: string, typeName: string) {
  return {
    keyArgs,
    merge(existing: any, incoming: any, args: any) {
      if (!!existing && !!args.args?.continuation) {
        const existingItems = (existing ? existing[itemPropertyKey] : [])
        const incomingItems = (incoming ? incoming[itemPropertyKey] : [])
        let items: Array<any> = [...existingItems]
        for (let i = 0; i < incomingItems.length; i++) {
          const current = incomingItems[i] as any
          const found = items.findIndex(m => m.__ref === current.__ref)
          if (found > -1) {
            items[found] = current // replace the cached item with the incoming one
          } else {
            items = [...items, current]
          }
        }
        // This new data is a continuation of the last data.
        return {
          __typename: typeName,
          [itemPropertyKey]: items,
          continuation: incoming.continuation
        }
      } else {
        // When we have no existing data in the cache, we'll just use the incoming data.
        return incoming
      }
    }
  }
}
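For reference, wiring the policy into the cache might then look like this (a sketch based on the messagesByThreadId field from the question; the ['threadId'] key args are an assumption):

import { InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Cache one list per thread and merge continuation pages of 'messages'.
        messagesByThreadId: ContinuationPolicy(['threadId'], 'messages', 'MessagesContinuation'),
      },
    },
  },
})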
In my POST request I need to pass an array with an object inside it.
When I try to add new properties to a plain object, they get added.
But when I try to add a property to an object that's inside an array, it's not added.
I have sportsValues as an array: const sportsValues = [{ ...values }];
I am trying to build something like this, so that I can pass it to the API:
[
  {
    "playerName": 3,
    "playerHeight": 1
  }
]
Can you tell me how to fix it?
I'm providing my code snippet below.
export function sports(values) {
  const sportsValues = [{ ...values }];
  sportsValues.push(playerName:'3');
  console.log("sportsValues--->", sportsValues);
  // sportsValues.playerName = 3//'';
  // sportsValues.playerHeight = 1//'';
  console.log("after addition sportsValues--->", sportsValues);
  console.log("after deletion sportsValues--->", sportsValues);
  return dispatch => {
    axios
      .post(`${url}/sport`, sportsValues)
      .then(() => {
        return;
      })
      .catch(error => {
        alert(`Error\n${error}`);
      });
  };
}
Since sportsValues is an array of objects, you can push a new object into it. Check out the code below.
const sportsValues = [];
sportsValues.push({
  playerName: '3',
  playerHeight: 1,
});
console.log(sportsValues);
I don't fully understand what you're trying to do, but here are some pointers:
If you're trying to update the object that's inside the array, you first have to select the object inside the array, then update its attribute:
sportsValues[0].playerName = 3
although I recommend building the object correctly first, then passing it to the array; it makes it a little easier to understand, in my opinion:
const sportsValues = [];
const firstValue = { ...values };
firstValue.playerName = '3';
sportsValues.push(firstValue);
or
const firstValue = { ...values };
firstValue.playerName = '3';
const sportsValues = [firstValue];
or
const sportsValues = [{
  ...values,
  playerName: '3',
}];
if you're trying to add a new object to the array, you can do this:
const sportsValues = [{ ...values }];
sportsValues.push({ playerName: '3' });
etc...
Array.push adds a new item to the array, so in your code you're going to end up with 2 items, because you assign 1 item at the beginning and then push another:
const ar = [];
// []
ar.push('item');
// ['item']
ar.push({ text: 'item 2' });
// ['item', { text: 'item 2' }]
etc...
export function sports(values) {
  const sportsValues = [{ ...values }];
  const playerName = '3';
  sportsValues[0].playerName = playerName; // bind it onto the object inside the array in this way
  console.log("sportsValues--->", sportsValues);
  return dispatch => {
    axios
      .post(`${url}/sport`, sportsValues)
      .then(() => {
        return;
      })
      .catch(error => {
        alert(`Error\n${error}`);
      });
  };
}
I need to add an id number to the nested array in data called action. The code I'm using is:
const { data } = this.state
const newData = Object.assign([...data.action], Object.assign([...data.action],{0:'id' }))
but this code is not working. The result I am looking for is:
{ id: 1, action: "user...}
You can just use the spread operator.
const newData = {
  ...data,
  action: {
    ...data.action,
    id: 1
  }
};
If action is an array, you can try something like this:
const newAction = data.action.map((actionItem, index) => ({
  ...actionItem,
  id: index + 1
}));

const newData = {
  ...data,
  action: newAction
};
The relevant Redux state consists of an array of objects representing layers.
Example:
let state = [
  { id: 1 }, { id: 2 }, { id: 3 }
]
I have a Redux action called moveLayerIndex:
actions.js
export const moveLayerIndex = (id, destinationIndex) => ({
  type: MOVE_LAYER_INDEX,
  id,
  destinationIndex
})
I would like the reducer to handle the action by swapping the position of the elements in the array.
reducers/layers.js
const layers = (state = [], action) => {
  switch (action.type) {
    case 'MOVE_LAYER_INDEX':
      /* What should I put here to make the below test pass */
    default:
      return state
  }
}
The test verifies that the Redux reducer swaps the array's elements in an immutable fashion.
deep-freeze is used to check that the initial state is not mutated in any way.
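(For context: deep-freeze recursively Object.freezes the state, so any mutation inside the reducer throws a TypeError in strict mode instead of passing silently. A minimal illustration:)

import deepFreeze from 'deep-freeze'

const frozen = deepFreeze([{ id: 1 }])
frozen[0].id = 99 // TypeError in strict mode: 'id' is read only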
How do I make this test pass?
test/reducers/index.js
import { expect } from 'chai'
import deepFreeze from 'deep-freeze'
import * as actions from '../../src/actions' // path assumed
import layers from '../../src/reducers/layers' // path assumed

const id = 1
const destinationIndex = 1

it('move position of layer', () => {
  const action = actions.moveLayerIndex(id, destinationIndex)

  const initialState = [
    { id: 1 },
    { id: 2 },
    { id: 3 }
  ]

  const expectedState = [
    { id: 2 },
    { id: 1 },
    { id: 3 }
  ]

  deepFreeze(initialState)

  expect(layers(initialState, action)).to.eql(expectedState)
})
One of the key ideas of immutable updates is that while you should never directly modify the original items, it's okay to make a copy and mutate the copy before returning it.
With that in mind, this function should do what you want:
function immutablySwapItems(items, firstIndex, secondIndex) {
  // Constant reference - we can still modify the array itself
  const results = items.slice();

  const firstItem = items[firstIndex];
  results[firstIndex] = items[secondIndex];
  results[secondIndex] = firstItem;

  return results;
}
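To make the test pass, the MOVE_LAYER_INDEX case can then look up the layer's current index and call the helper (a sketch; it assumes the action means "swap the layer with action.id into action.destinationIndex", which is exactly what the test exercises):

const layers = (state = [], action) => {
  switch (action.type) {
    case 'MOVE_LAYER_INDEX': {
      const sourceIndex = state.findIndex(layer => layer.id === action.id)
      return immutablySwapItems(state, sourceIndex, action.destinationIndex)
    }
    default:
      return state
  }
}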
I wrote a section for the Redux docs called Structuring Reducers - Immutable Update Patterns which gives examples of some related ways to update data.
You could use the map function to do the swap:
function immutablySwapItems(items, firstIndex, secondIndex) {
  return items.map(function(element, index) {
    if (index === firstIndex) return items[secondIndex];
    else if (index === secondIndex) return items[firstIndex];
    else return element;
  });
}
In ES2015 style:
const immutablySwapItems = (items, firstIndex, secondIndex) =>
  items.map(
    (element, index) =>
      index === firstIndex
        ? items[secondIndex]
        : index === secondIndex
          ? items[firstIndex]
          : element
  )
There is nothing wrong with the other two answers, but I think there is an even simpler way to do it with ES6.
const state = [
  { id: 1 },
  { id: 2 },
  { id: 3 }
];
const immutableSwap = (items, firstIndex, secondIndex) => {
  const result = [...items];
  [result[firstIndex], result[secondIndex]] = [result[secondIndex], result[firstIndex]];
  return result;
}
const swapped = immutableSwap(state, 2, 0);
console.log("Swapped:", swapped);
console.log("Original:", state);