How to add an object in reducer? - javascript

I have a store that looks like this: an array of three objects, each carrying an index. How can I add a new object between the first and second objects, the ones that carry index 1 and 2?
{
  object: [
    {
      "index": 1,
      "title": "title",
      "caption": "caption",
    },
    {
      "index": 2,
      "title": "title",
      "caption": "caption",
    },
    {
      "index": 3,
      "title": "title",
      "caption": "caption",
    },
  ]
}
I would like the final output to look like this after clicking a button that passes in the index value of 1.
{
  object: [
    {
      "index": 1,
      "title": "title",
      "caption": "caption",
    },
    {
      "index": 2,
      "title": "NEW",
      "caption": "NEW",
    },
    {
      "index": 3,
      "title": "title",
      "caption": "caption",
    },
    {
      "index": 4,
      "title": "title",
      "caption": "caption",
    },
  ]
}
I can use the following code to change the index values through an action, but how do I also add a new object between object 1 and object 2 while updating the index values at the same time?
switch (action.type) {
  case ADDCOMPONENT:
    return {
      ...state,
      object: state.object.map(component =>
        (component.index > action.index ?
          { ...component, index: component.index + 1 } : component)),
    };
  default:
    return state;
}

smth.object.splice(index, 0, item);
And don't keep the item index as a string. You can easily derive each item's position from the array and add 1 to that value.
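For example, a minimal sketch of re-deriving every stored index from the array position (assuming the same state shape as in the question):
// re-derive each component's index from its current array position
const renumbered = state.object.map((component, i) => ({ ...component, index: i + 1 }));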

state.object.splice(component.index, 0, component);
should do the trick
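For reference, a small plain-array demonstration of what that splice call does (note that splice mutates the array in place and does not re-number the later items):
const items = [{ index: 1 }, { index: 2 }, { index: 3 }];
items.splice(1, 0, { index: 2, title: "NEW" }); // insert at array position 1
console.log(items.map(o => o.index)); // [1, 2, 2, 3] -- the later indexes still need updating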

The other answers seem to miss the point that you need to increment the indexes of the other objects in the array. I'd approach it in two steps: first adding the new object with splice, then looping through and incrementing all subsequent indexes.
var index = 1;
state.object.splice(index, 0, new_component); // insert the new object
// starting just after the insertion point, add one to all later components
for (var i = index + 1; i < state.object.length; i++) {
  state.object[i].index++;
}
After this, state holds the value you want.
However, I'd encourage you to think about whether the objects really need to know where they are in the array. In my experience, any code that needs to access the objects can already tell where they are in the array.

The shape of your store is created by the combination of all your reducers. The data that you want to update your store with should be created in your application and then sent to your reducer with dispatch(action). The reducer takes the information on the action and updates your state accordingly. If you want to add an object into an array you can use Array.prototype.splice() as follows: myArray.splice( startIndex, 0, itemsToInsert, ... ).
In short, don't add the object in your reducer. Add it to the data you are sending in your action, before you send the action.
If you would like to be able to insert things into an array and not mutate them, you should think about using a function like the one in this snippet:
function nonMutatingArrayInsert( array, index, insert ) {
  // return array unchanged if index is outside the insertable range
  if ( index < 0 || index > array.length ) return array;
  // ensure insert is an array for spread purposes
  let _insert = Array.isArray( insert ) ? insert : [ insert ];
  // handle insert at index 0
  if ( index === 0 ) { return [ ..._insert, ...array ]; }
  // handle insert at the end of the array
  if ( index === array.length ) { return [ ...array, ..._insert ]; }
  // handle everything else
  const before = array.slice( 0, index );
  const after = array.slice( index, array.length );
  // return a new, non-mutated array
  return [ ...before, ..._insert, ...after ];
}
let myArray = [ "one", "four" ];
let newArray = nonMutatingArrayInsert( myArray, 1, ["two", "three"] );
console.log( "myArray:\n", myArray );
console.log( "newArray:\n", newArray );

All of the answers work; however, to avoid mutating the array, I used the following code instead.
Slice the array before the insertion point and insert the new object in between:
let newObject = state.portfolio.components.slice(0, action.index);
newObject = newObject.concat(NEWOBJECT);
newObject = newObject.concat(state.portfolio.components.slice(action.index));
// re-number everything from the insertion point onwards, copying each object
// so the originals held in state are not mutated
newObject = newObject.map((component, i) =>
  i >= action.index ? { ...component, index: i + 1 } : component);
Replace the object with the newObject.
switch (action.type) {
  case ADDCOMPONENT:
    return {
      ...state,
      object: newObject,
    };
  default:
    return state;
}
EDIT: In the end I used immutability-helper, which gives simpler code without mutating.
import update from 'immutability-helper';

return update(state, {
  portfolio: {
    object: {
      $splice: [
        [action.index + 1, 0, action.newObject],
      ],
    },
  },
});

Related

Re-Map an array of Object based on sort order of a property

Let's say I have data in the following structure:
[
{
"Id": "xyz7",
"CurrentRow": 0,
"ReportTime": "2022-07-18T09:00:00+00:00",
"ExitTime": null,
"DateField": "2022-07-18"
},
{
"Id": "xyz8",
"CurrentRow": 1,
"ReportTime": "2022-07-18T08:00:00+00:00",
"ExitTime": null,
"DateField": "2022-07-18"
},
{
"Id": "wxyz0",
"CurrentRow": 0,
"ReportTime": "2022-07-19T00:00:00+00:00",
"ExitTime": null,
"DateField": "2022-07-19"
},
{
"Id": "wxyz1",
"CurrentRow": 1,
"ReportTime": "2022-07-19T00:00:00+00:00",
"ExitTime": null
"DateField": "2022-07-19"
}
]
If I sort the structure based on ReportTime for the date 2022-07-18, that should change the CurrentRow of the entries with DateField 2022-07-18: the first entry goes from 0 to 1 (it now belongs at index 1) and the second entry goes from 1 to 0.
In addition, the CurrentRow of the entries for other dates should also be adapted if they were the same as those of the day being sorted.
To achieve this, my implementation goes like this:
I convert the structure into a two-dimensional array based on CurrentRow.
The index in the first dimension represents the CurrentRow.
Each element of that array is an array of the entries for that row, e.g. [entry_for_date_18, entry_for_date_19] (a kind of spreadsheet with dates as columns and CurrentRow as rows).
Then, to sort, I pick all the entries for a particular date, sort them, and collect them with the original CurrentRow (pass 1).
Then I update the CurrentRow of the original array using the index (pass 2).
e.g. pseudocode:
for (let i = 0; i < sortedDayArray.length; i++) {
  findByInOriginalArray(sortedDayArray[i].CurrentRow).updateCurrentRowTo(i)
}
I was wondering if there is a better or more efficient way to do this, perhaps using map?
This is how I understood your question: you want to sort your array based on ReportTime, then re-assign CurrentRow based on each entry's position within its DateField, and this is the data you are expecting:
[
{
Id: 'xyz8',
CurrentRow: 0,
ReportTime: '2022-07-18T08:00:00+00:00',
ExitTime: null,
DateField: '2022-07-18'
},
{
Id: 'xyz7',
CurrentRow: 1,
ReportTime: '2022-07-18T09:00:00+00:00',
ExitTime: null,
DateField: '2022-07-18'
},
{
Id: 'wxyz0',
CurrentRow: 0,
ReportTime: '2022-07-19T00:00:00+00:00',
ExitTime: null,
DateField: '2022-07-19'
},
{
Id: 'wxyz1',
CurrentRow: 1,
ReportTime: '2022-07-19T00:00:00+00:00',
ExitTime: null,
DateField: '2022-07-19'
}
]
This is the code I came up with:
var tempRow = 0;
var tempDate = '';
YOUR_ARRAY
  .sort((a, b) => (a.ReportTime > b.ReportTime) ? 1 : ((b.ReportTime > a.ReportTime) ? -1 : 0))
  .forEach(row => {
    if (row.DateField != tempDate) {
      tempDate = row.DateField;
      tempRow = 0;
    }
    row.CurrentRow = tempRow;
    tempRow++;
  });
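If you would rather avoid the outer temporary variables and not mutate the original rows, an equivalent sketch using a per-date counter held in a Map might look like this (YOUR_ARRAY as above):
const counters = new Map();
const result = [...YOUR_ARRAY]
  // ISO 8601 timestamps with the same offset sort correctly as plain strings
  .sort((a, b) => a.ReportTime.localeCompare(b.ReportTime))
  .map(row => {
    const next = counters.get(row.DateField) ?? 0; // next row number for this date
    counters.set(row.DateField, next + 1);
    return { ...row, CurrentRow: next };
  });
console.log(result);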

Filter some values of big array in Javascript

I have an array fetched from our server which holds 2,400 objects (about 7 MB in total), and I want to pick out the first few values that match a filter. Right now I'm using a combination of the filter and slice methods:
const keyword = 'whatever word';
const recommendList = bigArray.filter(item => item.name.includes(keyword)).slice(0, 5);
What I know is that filter iterates over every element of the array, and I think that can hurt the performance of my app (React Native) because the data set is large. So is there an approach that filters the array for a few values without iterating over all the elements?
If you simply want to break (stop) the loop when you find the 5th element, you can do the following:
const keyword = 'v';
const bigArray = ['a','v','a','v','a','v','a','v','a','v','a','v','a','v','a','v','a','v'];
const recommendList = [];
for (let i = 0; i < bigArray.length; i++) { // loop over the big array
  if (recommendList.length == 5) { // once 5 items are collected, break the loop
    break;
  }
  if (bigArray[i].includes(keyword)) {
    recommendList.push(bigArray[i]); // add the element if it matches
  }
}
console.log(recommendList);
If you don't want an explicit loop, you can simply use methods such as some or find, which only iterate until their callback first returns true:
const bigArray = [
  { "name": "a" }, { "name": "v" }, { "name": "a" }, { "name": "v" },
  { "name": "a" }, { "name": "v" }, { "name": "a" }, { "name": "v" },
  { "name": "a" }, { "name": "v" }, { "name": "a" }, { "name": "v" },
  { "name": "a" }, { "name": "v" }, { "name": "a" }, { "name": "v" },
  { "name": "a" }, { "name": "v" }
];
const keyword = 'v';
const recommendList = [];
// some operator only iterates till its condition returns true
// so if we get 5 recommended list before the bigArray end we return true and stop the iteration.
bigArray.some(obj => {
  if (obj.name.includes(keyword)) {
    recommendList.push(obj)
  }
  return recommendList.length === 5; // return true once 5 values are found; that terminates the iteration
})
console.log(recommendList);
const keyword = 'whatever word';
const recommendList = [];
for (let i = 0; i < bigArray.length; i++) {
  if (recommendList.length >= 5)
    break;
  const item = bigArray[i];
  if (item.name.includes(keyword))
    recommendList.push(item);
}
The most efficient approach is to process the array as an iterable sequence.
Example below is based on iter-ops library:
import {pipe, filter, take} from 'iter-ops';
const i = pipe(
  bigArray,
  filter(item => item.name.includes(keyword)),
  take(5)
);
console.log('matches:', [...i]);
This way, you won't be iterating through everything even once; it will stop as soon as the first 5 matches are found.
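A dependency-free sketch of the same idea, using a hand-rolled generator instead of iter-ops (bigArray and keyword are assumed to be defined as in the question):
function* takeMatches(array, predicate, limit) {
  let found = 0;
  for (const item of array) {
    if (predicate(item)) {
      yield item;
      if (++found === limit) return; // stop iterating as soon as enough matches are found
    }
  }
}
const recommendList = [...takeMatches(bigArray, item => item.name.includes(keyword), 5)];
console.log(recommendList);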
In this situation, I think filter (complexity O(n)) is the best solution, and its performance impact is already minimal.
Think of it this way: if only some of the 2,400 objects match, say 1,500, then you only get those 1,500 filtered results and the remaining 900 objects are never used. Either way, at least one pass over the array is necessary to know which objects match.

Delete duplicates in an array but add a count property to see the number of duplicates

Basically if I had this array:
[{"id" = 1, "product" = "Book"}, {"id" = 1, "product" = "Book"}, {"id" = 1, "product" = "Book"}, {"id" = 2, "product" = "Chair"}]
It would turn into this array:
[{"id" = 1, "product" = "Book", "count" = 3}, {"id" = 2, "product" = "Chair", "count" = 1}]
I am using React. Another option I have is to add the count property while building the array, so that duplicates never get added, but I am curious whether there is a way to do it with an existing array.
Edit:
If two products have the same id they are duplicates.
I have tried filtering the array by id and taking the first object, then filtering the array by id again to get the length, then adding a new property "count" (the length of the filtered array) to that first object and pushing it into a new array.
The problem with doing it this way is that I have to hard-code this for every possible id, even ids that are not present in my array.
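For the other option the question mentions (maintaining the count while building the array so duplicates never get added), a minimal non-mutating sketch could look like this (the helper name is made up):
// hypothetical helper: add a product to the list, bumping count if its id already exists
function addProduct(list, product) {
  const exists = list.some(item => item.id === product.id);
  if (exists) {
    return list.map(item =>
      item.id === product.id ? { ...item, count: item.count + 1 } : item
    );
  }
  return [...list, { ...product, count: 1 }];
}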
Reduce the array to a Map (or even just a plain object) keyed by id (assuming that's all it takes to identify a duplicate).
The values of that map will be the array you're after.
const arr = [{"id":1,"product":"Book"},{"id":1,"product":"Book"},{"id":1,"product":"Book"},{"id":2,"product":"Chair"}]
const zipped = Array.from(arr.reduce((map, o) => {
  // check if this id is already registered
  if (map.has(o.id)) {
    // increment its count
    map.get(o.id).count++
  } else {
    // otherwise, store a new object with count starting at 1
    map.set(o.id, { ...o, count: 1 })
  }
  return map
}, new Map()).values())
console.log(zipped)
You could reduce the array into a new array with the count property added. The id property is assumed to be sufficient for considering uniqueness. If the element has already been seen then increment the count, otherwise append a new augmented element object.
const data = [{
"id": 1,
"product": "Book"
}, {
"id": 1,
"product": "Book"
}, {
"id": 1,
"product": "Book"
}, {
"id": 2,
"product": "Chair"
}];
const dedupedData = data.reduce((data, el) => {
const index = data.findIndex(item => item.id === el.id);
if (index !== -1) {
data[index].count++;
} else {
data.push({ ...el, count: 1 });
}
return data;
}, []);
console.log(dedupedData);

Finding nested object data using basic JavaScript

I want to loop through 600+ array items in an object and find one particular item based on certain criteria. The array in the object is called "operations" and its items are arrays themselves.
My goal is to get the index of the operations element that contains the deeply nested string "Go".
In the sample below that would be the second element (index 1). My problem is that I can check whether an array element contains "call" and "draw", but I don't know how to test for the nested dictionary "foobar". I only have basic JavaScript available, no special libraries.
let json = {
"head": {},
"operations": [
[
"call",
"w40",
"draw",
{
"parent": "w39",
"style": [
"PUSH"
],
"index": 0,
"text": "Modify"
}
],
[
"call",
"w83.gc",
"draw",
{
"foobar": [
["beginPath"],
[
"rect",
0,
0,
245,
80
],
["fill"],
[
"fillText",
"Go",
123,
24
],
[
"drawImage",
"rwt-resources/c8af.png",
]
]
}
],
[
"create",
"w39",
"rwt.widgets.Menu",
{
"parent": "w35",
"style": [
"POP_UP"
]
}
],
[
"call",
"w39",
"draw",
{
"parent": "w35",
"style": [
"POP_UP"
]
}
]
]
};
let index = "";
let operationList = json.operations;
for (i = 0; i < operationList.length; i++) {
if (operationList[i].includes('call') && operationList[i].includes('draw')) //missing another check if the dictionary "foobar" exists in this element )
{
index = i;
}
}
document.write(index)
I'll preface by saying that this data structure is going to be tough to manage in general. I would suggest a scheme where an operation is an object with well-defined properties, rather than just an "array of stuff".
That said, you can use recursion to search the array.
If any value in the array is another array, continue with the next level of recursion
If any value is an object, search its values
// a plain helper instead of a library, since only basic JavaScript is available
const isPlainObject = (value) =>
  typeof value === 'object' && value !== null && !Array.isArray(value);

const containsTerm = (value, term) => {
  // if value is an object, search its values
  if (isPlainObject(value)) {
    value = Object.values(value);
  }
  // if value is an array, search within it
  if (Array.isArray(value)) {
    return value.some((element) => containsTerm(element, term));
  }
  // otherwise, value is a primitive, so check if it matches
  return value === term;
};

const index = json.operations.findIndex((operation) => containsTerm(operation, 'Go'));
console.log(index); // 1 for the sample data above

update/merge array values in React Redux store correctly without duplicates

My initial state is shown below. When a new book is added or a price changes, an updated array comes back from the service, and I need to merge that result into my initial state.
const initialState = {
  booksData: [
    { "code": "BK01", "price": "5" },
    { "code": "BK02", "price": "30" },
    { "code": "BK03", "price": "332" },
    { "code": "BK04", "price": "123" }
  ]
};
Updated array from the server, with a few records updated or new:
data: [
  { "code": "BK01", "price": "10" },
  { "code": "BK02", "price": "25" },
  { "code": "BK05", "price": "100" }
]
The state should look like this after merging the updated array with the old array:
booksData: [
  { "code": "BK01", "price": "10" },
  { "code": "BK02", "price": "25" },
  { "code": "BK03", "price": "332" },
  { "code": "BK04", "price": "123" },
  { "code": "BK05", "price": "100" }
]
I would filter out elements of the old data that are in the new data, and concat.
const oldBooks = booksData.filter(book => !newData.some(newBook => newBook.code === book.code));
return oldBooks.concat(newData);
Keep in mind you must NOT push values into the old array. In your reducer you MUST create new instances, here a new array; concat does that.
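Inside a reducer, that approach might look roughly like this (the action type and payload names are assumptions):
function booksReducer(state = initialState, action) {
  switch (action.type) {
    case 'BOOKS_UPDATED': {
      const newData = action.payload;
      // keep the old books that are not in the update, then append the updated/new ones
      const oldBooks = state.booksData.filter(
        book => !newData.some(newBook => newBook.code === book.code)
      );
      return { ...state, booksData: oldBooks.concat(newData) };
    }
    default:
      return state;
  }
}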
You can first merge both arrays together and then reduce the result to remove the duplicates, like this:
var booksData = [
{"code":"BK01","price":"5"},
{"code":"BK02","price":"30"},
{"code":"BK03","price":"332"},
{"code":"BK04","price":"123"}
]
var newData = [
{"code":"BK01","price":"10"},
{"code":"BK02","price":"25"},
{"code":"BK05","price":"100"}
]
const result = [...newData, ...booksData].reduce((res, data) => {
  if (res.findIndex(book => book.code === data.code) < 0) {
    res.push(data);
  }
  return res;
}, []);
console.log(result);
Merge the two arrays and filter them using the code property:
const initialState = {
  booksData: [
    { "code": "BK01", "price": "5" },
    { "code": "BK02", "price": "30" },
    { "code": "BK03", "price": "332" },
    { "code": "BK04", "price": "123" }
  ]
};
const data = [
  { "code": "BK01", "price": "10" },
  { "code": "BK02", "price": "25" },
  { "code": "BK05", "price": "100" }
];
let newState = [...initialState.booksData, ...data];
// keep only the last occurrence of each code, so the updated values win
newState = newState.filter((obj, pos, arr) => {
  return arr.map(mapObj => mapObj['code']).lastIndexOf(obj['code']) === pos;
});
console.log(newState);
Collection of Objects
Filter the merged array so it keeps only items that do not already appear earlier in it, by checking every item whose index is before the current index of the "parent" filter iterator:
const mergedUnique = [
...[{id:1}, {id:2}, {id:3}],
...[{id:1}, {id:4}, {id:2}]
]
.filter((item, idx, arr) =>
!arr.some(({id}, subIdx) => subIdx < idx && id == item.id)
)
console.log( mergedUnique )
Basic technique for "simple" arrays
Merge the arrays, then filter the result to keep only the first occurrence of each item, by checking whether the same value already exists before the current item's index in the merged array.
lastIndexOf is used to look backwards for an earlier occurrence of the current value; checking backwards (rather than forwards) preserves the order of the merged array in a way that is often desirable.
The first item is skipped from the check, as it obviously cannot be a duplicate.
const mergedUnique = [...[1,2,3], ...[1,3,4,5,2]] // [1, 2, 3, 1, 3, 4, 5, 2]
  .filter((item, idx, arr) =>
    !~arr.lastIndexOf(item, idx - 1) || !idx
  )
console.log( mergedUnique )
