I've got these Interfaces:
export interface QueryObject {
id: ID;
path: string[];
filters: Filters;
}
export interface Filters {
state: string;
localeID: string;
role: string;
}
And I'm trying to come up with a functional programming solution that copies just the properties that exist and have a value from a tmp object onto the existing data model. Now, obviously this doesn't work: Filters will be completely overwritten, losing the properties localeID and role.
let queryObject: QueryObject;
let filters: Filters = { state: 'GB'}; // this obviously gives an error on Interface implementation
queryObject.filters = filters;
Right now I'm taking the original object, traversing to the property, and overwriting it with the updated value.
const queryObject: QueryObject = _.cloneDeep(this.queryObject);
queryObject.filters.state = state; // 'GB'
this.portareService.update(queryObject, this.portareQuery.getActiveId());
It would be nice to solve this with an Object.assign or spread (...) solution, something like:
{
return ...createQueryObject, updatedQueryObject
}
I know how to do this with a function using loops, but I'm looking for a functional programming approach.
You could implement a concat method for both QueryObject and Filters. In the concat, you define what "merge logic" you want to use. The QueryObject calls the Filters' concat method internally.
Within the concat methods, you can use the spread syntax or any other logic to ensure new objects are created and you're not mutating anything.
By adding an empty constructor, you can easily start using those concatenators inside a reduce or other automated merge.
I found this blog post on Semigroups by Tom Harding super inspiring. This post about Monoids has some info about the empty part in it.
const QueryObject = ({ id = null, path = null, filters = Filters.empty() }) => ({
id,
path,
filters,
concat: other => QueryObject({
id: other.id || id,
path: other.path || path,
filters: filters.concat(other.filters)
}),
toString: () => `QueryObject(${id}, ${path}, ${filters.toString()})`
});
QueryObject.empty = () => QueryObject({});
QueryObject.merge = (x, y) => x.concat(y);
const Filters = ({ state = null, localeID = null, role = null }) => ({
state,
localeID,
role,
concat: other => Filters({
state: other.state || state,
localeID: other.localeID || localeID,
role: other.role || role
}),
toString: () => `Filters(${state}, ${localeID}, ${role})`
});
Filters.empty = () => Filters({});
Filters.merge = (x, y) => x.concat(y);
const userFilter = Filters({ role: "User" });
const gbFilter = Filters({ localeID: "GB" });
const filterSettings = [userFilter, gbFilter];
const mergedFilter = filterSettings.reduce(Filters.merge, Filters.empty());
console.log(
"Merged Filter:",
mergedFilter.toString()
);
// Some base query
const accountQuery = QueryObject({ id: "CUSTOM_Q_1", path: "/accounts" });
// Derived queries
const userQuery = accountQuery.concat(QueryObject({ filters: userFilter }));
const gbQuery = accountQuery.concat(QueryObject({ filters: gbFilter }));
console.log(
"User Query:",
userQuery.toString()
);
console.log(
"Brittish Users Query",
userQuery.concat(gbQuery).toString()
);
Edit:
Of course, without the "theory", there's also the more generic:
const uniques = xs => Array.from(new Set(xs));
const nullMergeStrategy = (obj1, obj2) =>
uniques(
Object.keys(obj1)
.concat(Object.keys(obj2))
).reduce(
(acc, k) => Object.assign(acc, { [k]: obj2[k] || obj1[k] }),
{}
);
const Filter = ({ state = null, localeID = null, role = null }) =>
({ state, localeID, role });
const userFilter = Filter({ role: "User" });
const gbFilter = Filter({ localeID: "GB" });
console.log(
nullMergeStrategy(userFilter, gbFilter)
)
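Applied to the Filters shape from the question, a quick usage sketch (the concrete values here are made up):

const existingFilters = { state: 'US', localeID: 'en-GB', role: 'Admin' };
const stateUpdate = { state: 'GB' };

console.log(nullMergeStrategy(existingFilters, stateUpdate));
// { state: 'GB', localeID: 'en-GB', role: 'Admin' }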
Related
I have a function in TypeScript that, given a string, returns the key of the object whose array contains that value.
I'm using Array.prototype.includes() to check that the value exists; this can return true or false, so typing the resulting variable as a string gives me a typing error.
This is the error:
Type 'unknown' cannot be assigned to type 'string'
This is my function:
let testData = {
data1: ['CAR','PLANE'],
data2: ['COUNTRY','CITY']
};
let asset = 'car';
function resKey(asset) {
let res = Object.keys(testData).find(key => {
const value = testData[key]
return value.includes(asset.toUpperCase())
})
return res
}
console.log(resKey(asset));
I'm only going to pass it values that are in the object, so I don't need to check if it exists.
My problem: how can I modify the function so that it only returns the key without the need to check if it exists?
As I noted in the comments, you can make your current function TypeScript-friendly by giving it some type annotations.
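A sketch of what those annotations might look like (the keyof typeof cast is just one way to do it, and the return widens to string | undefined because find() can come up empty):

const testData = {
  data1: ['CAR', 'PLANE'],
  data2: ['COUNTRY', 'CITY'],
};

function resKey(asset: string): string | undefined {
  return (Object.keys(testData) as (keyof typeof testData)[])
    .find(key => testData[key].includes(asset.toUpperCase()));
}

console.log(resKey('car')); // "data1"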
Alternatively you can avoid the find() by creating a Map of the testData and just retrieving the key by value directly. Keep in mind that if there are duplicate values between different asset arrays this will return the key of the last instance (as opposed to find() which will return the key of the first instance).
let testData = {
data1: ['CAR', 'PLANE'],
data2: ['COUNTRY', 'CITY']
};
let asset = 'car';
function resKey(asset) {
const resMap = new Map(Object.entries(testData).flatMap(([k, v]) => v.map(a => [a, k])));
return resMap.get(asset.toUpperCase());
}
console.log(resKey(asset));
To avoid creating a new Map on every call of the function you might employ a little currying.
function resKeyFactory(data) {
const resMap = new Map(
Object.entries(data)
.flatMap(([k, v]) => v.map(a => [a, k]))
);
return (asset) => resMap.get(asset.toUpperCase());
}
const
testData = {
data1: ['CAR', 'PLANE'],
data2: ['COUNTRY', 'CITY']
},
asset = 'car',
resKeyTestData = resKeyFactory(testData);
console.log(resKeyTestData(asset));
console.log(resKeyTestData('city'));
TypeScript requires a high enough target/lib (ES2019 or later) to accept flatMap (playground).
You could use Object.entries() and find().
let testData = {
data1: ['CAR', 'PLANE'],
data2: ['COUNTRY', 'CITY']
};
function resKey(data, asset) {
return Object.entries(data).find(([_, v]) => {
return v.includes(asset.toUpperCase())
})?.[0] || "no key found";
}
console.log(resKey(testData, "car"));
console.log(resKey(testData, "something"));
console.log(resKey(testData, "city"));
If you want to add types, it would look something like this.
interface IData {
data1: string[];
data2: string[];
}
let testData: IData = {
data1: ['CAR', 'PLANE'],
data2: ['COUNTRY', 'CITY']
};
function resKey(data: IData, asset: string): string {
return Object.entries(data).find(([_, v]) => {
return v.includes(asset.toUpperCase())
})?.[0] || "no key found";
}
I have a JavaScript object with some nested properties that I want to update based on some conditions. The starting object could be something like:
const options = {
formatOption: {
label: 'Model Format',
selections: {
name: 'Specific Format',
value: '12x28',
}
},
heightOption: {
label: 'Model Height',
selections: {
name: 'Specific Height',
value: '15',
}
}
};
I have come up with a solution using Object.keys, reduce and the spread operator, but I would like to know if this is the best / most concise way as of today, or if there is a better one. I'm not looking for the best-performing option, but for a "best practice" (if there is one) or a more elegant way.
EDIT 30/01/20
As pointed out in the comments by @CertainPerformance, my code was mutating the original options variable, so I am changing the line const option = options[key]; to const option = { ...options[key] };. I hope this is correct and that the function is not mutating the original data.
const newObject = Object.keys(options).reduce((obj, key) => {
const option = { ...options[key] };
const newVal = getNewValue(option.label); // example function to get new values
// update based on existence of new value and key
if (option.selections && option.selections.value && newVal) {
option.selections.value = newVal;
}
return {
...obj,
[key]: option,
};
}, {});
getNewValue is an invented name for a function that I am calling in order to get an 'updated' version of the value I am looking at. In order to reproduce my situation you could just replace
the line const newVal = getNewValue(option.label); with const newVal = "bla bla";
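If you just want to run the snippet as-is, here is a hypothetical stand-in for getNewValue (purely an assumption about its shape, not the real function):

// hypothetical stub: derives an "updated" value from the label
const getNewValue = (label) => (label ? `${label} (updated)` : null);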
Since you tagged this question with functional-programming, here is a functional approach. Functional lenses are an advanced FP tool and hence hard to grasp for newbies. This is just an illustration to give you an idea of how you can solve almost all tasks and issues related to getters/setters with a single approach:
// functional primitives
const _const = x => y => x;
// Identity type
const Id = x => ({tag: "Id", runId: x});
const idMap = f => tx =>
Id(f(tx.runId));
function* objKeys(o) {
for (let prop in o) {
yield prop;
}
}
// Object auxiliary functions
const objSet = (k, v) => o =>
objSetx(k, v) (objClone(o));
const objSetx = (k, v) => o =>
(o[k] = v, o);
const objDel = k => o =>
objDelx(k) (objClone(o));
const objDelx = k => o =>
(delete o[k], o);
const objClone = o => {
const p = {};
for (const k of objKeys(o))
Object.defineProperty(
p, k, Object.getOwnPropertyDescriptor(o, k));
return p;
};
// Lens type
const Lens = x => ({tag: "Lens", runLens: x});
const objLens_ = ({set, del}) => k => // Object lens
Lens(map => ft => o =>
map(v => {
if (v === null)
return del(k) (o);
else
return set(k, v) (o)
}) (ft(o[k])));
const objLens = objLens_({set: objSet, del: objDel});
const lensComp3 = tx => ty => tz => // lens composition
Lens(map => ft =>
tx.runLens(map) (ty.runLens(map) (tz.runLens(map) (ft))));
const lensSet = tx => v => o => // set operation for lenses
tx.runLens(idMap) (_const(Id(v))) (o);
// MAIN
const options = {
formatOption: {
label: 'Model Format',
selections: {
name: 'Specific Format',
value: '12x28',
}
},
heightOption: {
label: 'Model Height',
selections: {
name: 'Specific Height',
value: '15',
}
}
};
const nameLens = lensComp3(
objLens("formatOption"))
(objLens("selections"))
(objLens("name"));
const options_ = lensSet(nameLens) ("foo") (options).runId;
// deep update
console.log(options_);
// reuse of unaffected parts of the Object tree (structural sharing)
console.log(
options.heightOption === options_.heightOption); // true
This is only a teeny-tiny part of the Lens machinery. Functional lenses have the nice property of being composable and of utilizing structural sharing in some cases.
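To make the composability point concrete, here is a sketch of a second lens built from the exact same primitives above, this time targeting heightOption.selections.value:

const heightValueLens = lensComp3(
  objLens("heightOption"))
  (objLens("selections"))
  (objLens("value"));

// another deep, non-mutating update; the original options stays untouched
const options__ = lensSet(heightValueLens) ("20") (options).runId;
console.log(options__.heightOption.selections.value); // "20"
console.log(options.heightOption.selections.value); // still "15"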
If you want to set a value for a nested property in an immutable fashion,
then you should consider adopting a library rather than doing it manually.
In FP there is the concept of lenses.
Ramda provides a nice implementation: https://ramdajs.com/docs/
const selectionsNameLens = R.lensPath(
['formatOption', 'selections', 'name'],
);
const setter = R.set(selectionsNameLens);
// ---
const data = {
formatOption: {
label: 'Model Format',
selections: {
name: 'Specific Format',
value: '12x28',
},
},
heightOption: {
label: 'Model Height',
selections: {
name: 'Specific Height',
value: '15',
},
},
};
console.log(
setter('Another Specific Format', data),
);
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js" integrity="sha256-xB25ljGZ7K2VXnq087unEnoVhvTosWWtqXB4tAtZmHU=" crossorigin="anonymous"></script>
The first comment from CertainPerformance made me realize that I was mutating the original options variable. My first idea was to make a copy with the spread operator, but the spread operator only makes a shallow copy, so even in my edit I was still mutating the original object.
What I think is a solution is to create a new object with only the updated property, and to merge the two objects at the end of the reducer.
EDIT
The new object also needs to be merged with the original option.selections, otherwise I would still overwrite existing keys at that level (i.e. I would overwrite option.selections.name).
Here is the final code:
const newObject = Object.keys(options).reduce((obj, key) => {
const option = options[key];
const newVal = getNewValue(option.label); // example function to get new values
const newOption = {}; // create a new empty object
// update based on existence of new value and key
if (option.selections && option.selections.value && newVal) {
// fill the empty object with the updated value,
// merged with a copy of the original option.selections
newOption.selections = {
...option.selections,
value: newVal
};
}
return {
...obj, // accumulator
[key]: {
...option, // merge the old option
...newOption, // with the new one
},
};
}, {});
A more concise version that has been suggested to me would be to use forEach() instead of reduce(). In this case the only difficult part would be to clone the original object. One way would be to use lodash's _.cloneDeep(), but there are plenty of options (see here).
Here is the code:
const newObject = _.cloneDeep(options);
Object.keys(newObject).forEach(key => {
const newVal = getNewValue(newObject[key].label); // example function to get new values
// update based on existence of new value and key
if (newObject[key].selections && newObject[key].selections.value && newVal) {
newObject[key].selections.value = newVal;
}
});
The only problem is that forEach() mutates a value declared outside the callback, but reduce() can also end up mutating its accumulator (as happened in my original solution), so the problem is not solved by using reduce() alone.
I'm not sure that this is the best solution, but it surely is much more readable for the average developer than my first try or the other solutions.
const originalArr = [
{
personName: 'Joe'
}
]
expected output:
const convertedArr = [
{
name: 'Joe'
}
]
I'm thinking the renamed keys are defined in an object (but fine if there's a better way to map them):
const keymaps = {
personName: 'name'
};
How can I do this with Ramda?
Something with R.map
There is an entry in Ramda's Cookbook for this:
const renameKeys = R.curry((keysMap, obj) =>
R.reduce((acc, key) => R.assoc(keysMap[key] || key, obj[key], acc), {}, R.keys(obj))
);
const originalArr = [{personName: 'Joe'}]
console .log (
R.map (renameKeys ({personName: 'name'}), originalArr)
)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
But with the ubiquity of ES6, it's pretty easy to write this directly:
const renameKeys = (keysMap) => (obj) => Object.entries(obj).reduce(
(a, [k, v]) => k in keysMap ? {...a, [keysMap[k]]: v} : {...a, [k]: v},
{}
)
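For completeness, a usage sketch with the array from the question:

const originalArr = [{ personName: 'Joe' }];

console.log(originalArr.map(renameKeys({ personName: 'name' })));
// [{ name: 'Joe' }]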
You can combine Ramda with Ramda Adjunct. Its renameKeys method (https://char0n.github.io/ramda-adjunct/2.27.0/RA.html#.renameKeys) is very useful. With it you can simply do something like this:
const people = [
{
personName: 'Joe'
}
]
const renameKeys = R.map(RA.renameKeys({ personName: 'name' }));
const __people__ = renameKeys(people);
console.log(__people__) // [ { name: 'Joe' }]
Hope it helped you :)
This is my take on renameKeys. The main idea is to separate the keys and values into two arrays, map the array of keys, replacing them with values from keysMap (where they exist), and then zip everything back into an object:
const { pipe, toPairs, transpose, converge, zipObj, head, map, last } = R
const renameKeys = keysMap => pipe(
toPairs, // convert to entries
transpose, // convert to array of keys, and array of values
converge(zipObj, [ // zip back to object
pipe(head, map(key => keysMap[key] || key)), // rename the keys
last // get the values
])
)
const originalArr = [{ personName: 'Joe', lastName: 'greg' }]
const result = R.map(renameKeys({ personName: 'name' }), originalArr)
console.log(result)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
My idea is to first check that the old prop I want to rename exists, and that the new key I want to create doesn't.
Then, I will use the S_ common combinator to make it point-free.
Find JS common combinators here
const {
allPass, assoc, compose: B, complement, has, omit, prop, when
} = require('ramda');
const S_ = (f) => (g) => (x) => f (g (x)) (x);
const renameKey = (newKey) => (oldKey) => when(allPass([
has(oldKey)
, complement(has)(newKey)
]))
(B(omit([oldKey]), S_(assoc(newKey))(prop(oldKey))))
const obj = { fullname: 'Jon' };
renameKey('name')('fullname')(obj) // => { name: 'Jon' }
Here is my own solution: not too many arrow functions (just one), mostly pure Ramda calls. And it is one of the shortest, if not the shortest ;)
First, based on your example:
const { apply, compose, either, flip, identity, map, mergeAll, objOf, prop, replace, toPairs, useWith } = require('ramda');
const RenameKeys = f => compose(mergeAll, map(apply(useWith(objOf, [f]))), toPairs);
const originalArr = [
{
personName: 'Joe',
},
];
const keymaps = {
personName: 'name',
};
// const HowToRename = flip(prop)(keymaps); // if you don't have keys not specified in keymaps explicitly
const HowToRename = either(flip(prop)(keymaps), identity);
console.log(map(RenameKeys(HowToRename))(originalArr));
Second option, using any arbitrary lambda with renaming rules:
const { apply, compose, map, mergeAll, objOf, replace, toPairs, useWith } = require('ramda');
const RenameKeys = f => compose(mergeAll, map(apply(useWith(objOf, [f]))), toPairs);
const HowToRename = replace(/(?<=.)(?!$)/g, '_'); // for example
console.log(RenameKeys(HowToRename)({ one: 1, two: 2, three: 3 }));
Yields
{ o_n_e: 1, t_w_o: 2, t_h_r_e_e: 3 }
Third, you can use the object-based rename rules from the first example with a fallback strategy, e.g. a replace like in the second example instead of identity, as sketched below.
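A sketch of that third variant, reusing the pieces from the first example (the fallback regex is purely an assumption for illustration):

const HowToRenameWithFallback = either(
  flip(prop)(keymaps), // object-based rules first
  replace(/Name$/, '') // fallback: strip a trailing "Name"
);

console.log(map(RenameKeys(HowToRenameWithFallback))(originalArr));
// => [{ name: 'Joe' }]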
I'm trying to create a set of reducers in order to change an attribute of all objects in a nested list.
The input payload looks like the following:
const payload = [
{
name: "Peter",
children: [
{
name: "Sarah",
children: [
{
name: "Sophie",
children: [
{
name: "Chris"
}
]
}
]
}
]
}
];
I now want to change the name attribute of all elements and child elements.
const mapJustNickname = elem => {
return {
...elem,
nickname: elem.name + "y"
};
};
How do I use this map function recursively on all child elements?
I found a way to do this by putting the recursion within the same mapping function.
const mapToNickname = (elem) => {
return {
nickname: elem.name +'y',
children: elem.children && elem.children.map(mapToNickname)
}
}
console.log(payload.map(mapToNickname));
But I'd like to have the mapping of the name separated from the recursion (to keep the mapping functions as simple as possible) and to be able to chain them later. Is it somehow possible to do this with two reducers and then chain them together?
Let's start by rigorously defining the data structures:
data Person = Person { name :: String, nickname :: Maybe String }
data Tree a = Tree { value :: a, children :: Forest a }
type Forest a = [Tree a]
type FamilyTree = Tree Person
type FamilyForest = Forest Person
Now, we can create mapTree and mapForest functions:
const mapTree = (mapping, { children=[], ...value }) => ({
...mapping(value),
children: mapForest(mapping, children)
});
const mapForest = (mapping, forest) => forest.map(tree => mapTree(mapping, tree));
// Usage:
const payload = [
{
name: "Peter",
children: [
{
name: "Sarah",
children: [
{
name: "Sophie",
children: [
{
name: "Chris"
}
]
}
]
}
]
}
];
const mapping = ({ name }) => ({ name, nickname: name + "y" });
const result = mapForest(mapping, payload);
console.log(result);
Hope that helps.
Create a recursive map function that maps an item and its children (if they exist). Now you can supply recursiveMap with whatever transformer function you want, and the transformer doesn't need to handle the recursive nature of the tree.
const recursiveMap = childrenKey => transformer => arr => {
const inner = (arr = []) =>
arr.map(({ [childrenKey]: children, ...rest }) => ({
...transformer(rest),
...children && { [childrenKey]: inner(children) }
}));
return inner(arr);
};
const mapNickname = recursiveMap('children')(({ name, ...rest }) => ({
name,
nickname: `${name}y`,
...rest
}));
const payload = [{"name":"Peter","children":[{"name":"Sarah","children":[{"name":"Sophie","children":[{"name":"Chris"}]}]}]}];
const result = mapNickname(payload);
console.log(result)
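Since the goal was to keep each transformer simple and chain them later, here is a sketch of composing two (hypothetical) transformers through the same recursiveMap:

const addNickname = ({ name, ...rest }) => ({ name, nickname: `${name}y`, ...rest });
const upperCaseName = ({ name, ...rest }) => ({ name: name.toUpperCase(), ...rest });

// compose the plain transformers; recursiveMap handles the tree
const mapNicknameThenUpper = recursiveMap('children')(
  elem => upperCaseName(addNickname(elem))
);

console.log(mapNicknameThenUpper(payload));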
const fields = ['email', 'password'];
const objFields = {};
fields.forEach(value => {
objFields[value] = '';
});
console.log(objFields);
// Outputs {email: "", password: ""}
I want to achieve the same result but without having to initialize an empty object.
Actually, my case is that I want to set the initial state of a React component.
class App extends Component {
fields = ['email', 'password'];
state = {
fields: // the one liner code here that should return the object created from fields array,
};
...
expected result would be
// state = {fields: {email: "", password: ""}}
Whenever you're looking to reduce an array of values to one value, you're looking for .reduce():
state = {
fields: fields.reduce((acc, key) => ({...acc, [key]: ''}), {}),
};
You could map the keys to single-property objects and assign them all to one object.
const
fields = ['email', 'password'],
object = Object.assign({}, ...fields.map(key => ({ [key]: '' })));
console.log(object);
In modern browsers, or by using polyfills, you can use Object.fromEntries() to create an object from an array, using the array's values as keys/properties, and fill the object's values with a default.
const fields = ['email', 'password'];
const result = Object.fromEntries(fields.map(value => [value, '']));
The result is {email: "", password: ""}.
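To plug that into the React class from the question, a sketch (assuming class fields, which initialize in declaration order and can reference this):

import { Component } from 'react';

class App extends Component {
  fields = ['email', 'password'];
  state = {
    fields: Object.fromEntries(this.fields.map(value => [value, ''])),
  };
}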
You need to transform your array, which contains the keys, into a real object.
To do it you have many possibilities, but you still have to do something; there is no magical trick.
My favorite solution is to use a function you can put in your utility class, so it's easy to read and reusable.
Number 1: The function
function initializeKeys(keys, initialValue, object) {
return keys.reduce((tmp, x) => {
tmp[x] = initialValue;
return tmp;
}, object);
}
const objFields = initializeKeys(['email', 'password'], '', {
otherKey: 'a',
});
console.log(objFields);
Number 2: The forEach
const fields = ['email', 'password'];
const objFields = {};
fields.forEach(value => {
objFields[value] = '';
});
console.log(objFields);
Number 3: The reduce
const fields = ['email', 'password'];
const objFields = {
...fields.reduce((tmp, x) => {
tmp[x] = '';
return tmp;
}, {}),
};
console.log(objFields);