I am really struggling with an object transformation. I have an array of objects and want to transform it into the input format for a Highcharts multi-line chart.
For the x-axis I need the set of unique dates.
For the y-axis I need a series of objects, each with as many data points as there are unique dates from step 1. For a given ID and date, if a value is available it is added; otherwise null is used.
Original:
[{
"date": "1997-09-29",
"Count": 100,
"ID": "AB12-R"
},
{
"date": "1997-12-30",
"Count": 104.7,
"ID": "AB13-R"
},
{
"date": "1997-12-30",
"Count": 1192,
"ID": "BA12-R"
},
{
"date": "1998-03-30",
"Count": 981,
"ID": "BA12-R"
},
{
"date": "1998-06-01",
"Count": 341,
"ID": "BA12-R"
}
]
Output:
[
{
Identity: 'AB12-R',
data: [100, null, null, null]
},
{
Identity: 'AB13-R',
data: [null, 104.7, null, null]
},
{
Identity: 'BA12-R',
data: [null, 1192, 981, 341]
}
]
Explanation:
In Original I have 4 unique dates, hence the length of data for each identity is 4. For each unique date I check whether there is a matching entry in Original; if there is, I push its Count into data, otherwise I push null.
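For reference, the four unique dates in Original map to data indexes like this:
{ "1997-09-29": 0, "1997-12-30": 1, "1998-03-30": 2, "1998-06-01": 3 }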
My Code:
With some help, this is what I was trying. But it does not account for the unique dates; it just creates data arrays whose length equals the number of objects in Original. Here is that implementation: https://jsfiddle.net/3u9penko/
const data = [{"date":"1997-09-29","Count":100,"ID":"AB12-R"},{"date":"1997-12-30","Count":104.7,"ID":"AB13-R"},{"date":"1997-12-30","Count":1192,"ID":"BA12-R"},{"date":"1998-03-30","Count":981,"ID":"BA12-R"},{"date":"1998-06-01","Count":341,"ID":"BA12-R"}];
const hcData = [];
data.forEach((d, i) => {
const checkIfExist = hcData.find(data => data.id === d["ID"])
if (checkIfExist) {
checkIfExist.data[i] = d["Count"];
} else {
const initialData = [...Array(data.length)]
initialData.fill(null, 0, data.length)
initialData[i] = d["Count"];
hcData.push({
data: initialData,
id: d["ID"]
})
}
})
console.log(hcData)
Create a Map of unique dates to indexes: use a Set to get the unique dates, then convert it to a Map with the date as the key and its index as the value.
Reduce the array to a Map keyed by ID, initializing each value with an array of nulls whose length equals the size of the dates Map. Assign the Count of the current object at the index of its date in the dates Map.
Convert the Map to an array of objects using Array.from().
const fn = arr => {
// create a Map of unique dates with their index
const dates = new Map(Array.from(
new Set(arr.map(o => o.date)),
(v, i) => [v, i]
))
return Array.from(
arr.reduce((acc, o, i) => {
// if an ID doesn't exist on the Map init it with an empty array of null values (counts)
if(!acc.has(o.ID)) acc.set(o.ID, new Array(dates.size).fill(null))
// add the current ID count to the array of counts
acc.get(o.ID)[dates.get(o.date)] = o.Count
return acc
}, new Map),
([Identity, data]) => ({ Identity, data }) // convert to an array of objects
)
}
const arr = [{"date":"1997-09-29","Count":100,"ID":"AB12-R"},{"date":"1997-12-30","Count":104.7,"ID":"AB13-R"},{"date":"1997-12-30","Count":1192,"ID":"BA12-R"},{"date":"1998-03-30","Count":981,"ID":"BA12-R"},{"date":"1998-06-01","Count":341,"ID":"BA12-R"}]
const result = fn(arr)
console.log(result)
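For the sample input above, the logged result should be one Identity/data pair per unique ID, with null for every date that has no entry:
[
{ Identity: 'AB12-R', data: [100, null, null, null] },
{ Identity: 'AB13-R', data: [null, 104.7, null, null] },
{ Identity: 'BA12-R', data: [null, 1192, 981, 341] }
]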
Related
I'm stuck on a situation where values of an object need to become keys in a different shape. Is there a way to shift a value into a key, or would simply deleting and adding be better? I tried looping to see which keys overlap in value and, with if statements and conditions, adding or deleting using array methods. However, since the inner data is an object, I am struggling to find the right methods, or even the right process. I also tried using a function that inserts the data and pushes it to a new empty array which is returned from the function.
If I have objects in an array like so:
const data = [
{
"date": "12/22",
"treatment": "nausea",
"count": 2
},
{
"date": "12/23",
"treatment": "cold",
"count": 3
},
{
"date": "12/22",
"treatment": "cold",
"count": 2
}
];
and I want to change the data like so:
const newData = [
{
"date": "12/22",
"cold": 2,
"nausea": 2
},
{
"date": "12/23",
"cold": 3
}
];
Try this code: it uses reduce to group the entries by date, then a loop to push each grouped entry into a new array.
const data = [
{
"date": "12/22",
"treatment": "nausea",
"count": 2
},
{
"date": "12/23",
"treatment": "cold",
"count": 3
},
{
"date": "12/22",
"treatment": "cold",
"count": 2
}
];
const newData = [];
const dataByDate = data.reduce((acc, curr) => {
if (!acc[curr.date]) {
acc[curr.date] = { date: curr.date };
}
acc[curr.date][curr.treatment] = curr.count;
return acc;
}, {});
for (let date in dataByDate) {
newData.push(dataByDate[date]);
}
console.log(newData);
We want to reduce the data by unique dates. This can be done with:
an object used as a dictionary,
a Set or Map, or
some other custom implementation.
Prefer Array.reduce() when reducing an array: it is standardized and more expressive than a custom implementation.
Using a map-like structure as the accumulator lets us reduce the dates by uniqueness and build the data itself at the same time.
Note: Object property keys are converted to strings (except for Symbols). So if you want to use different "keys" that are equal after conversion (e.g. 0 and "0"), you cannot use a plain object; use a Map instead.
(All our dates are strings already, so this warning does not apply here.)
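A quick illustration of that note:
const obj = {};
obj[0] = "number key";
obj["0"] = "string key"; // overwrites the previous value: both keys coerce to "0"
console.log(Object.keys(obj).length); // 1
const map = new Map();
map.set(0, "number key");
map.set("0", "string key"); // two distinct keys, no coercion
console.log(map.size); // 2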
When using an object we can use the nullish coalescing assignment operator ??=: it lets us assign an initial "empty" entry ({ date: dataEntry.date }) when we encounter a new unique date.
Furthermore, that assignment expression evaluates to the dictionary's entry, whether it was already present or just assigned.
Then we only need to assign the treatment and its count as a key-value pair on that entry.
const data = [
{ "date": "12/22", "treatment": "nausea", "count": 2 },
{ "date": "12/23", "treatment": "cold", "count": 3 },
{ "date": "12/22", "treatment": "cold", "count": 2 }
];
const newData = reduceByDate(data);
console.log(newData);
function reduceByDate(data) {
const dataByDate = data.reduce((dict, dataEntry) => {
const dictEntry = dict[dataEntry.date] // Get existing or ...
??= { date: dataEntry.date }; // ... just initialized entry.
dictEntry[dataEntry.treatment] = dataEntry.count;
return dict;
}, {});
// Transform dictionary to array of reduced entries
return Object.values(dataByDate);
}
You can make use of reduce() and Object.assign().
First we use reduce to combine objects with the same date into one object, and then use Object.assign to merge in the values:
const data = [{
"date": "12/22",
"treatment": "nausea",
"count": 2
},
{
"date": "12/23",
"treatment": "cold",
"count": 3
},
{
"date": "12/22",
"treatment": "cold",
"count": 2
}
];
const newData = data.reduce((acc, curr) => {
const dateIndex = acc.findIndex(item => item.date === curr.date);
if (dateIndex === -1) {
acc.push({
date: curr.date,
[curr.treatment]: curr.count
});
} else {
acc[dateIndex] = Object.assign({}, acc[dateIndex], {
[curr.treatment]: curr.count
});
}
return acc;
}, []);
console.log(newData)
Basically if I had this array:
[{"id" = 1, "product" = "Book"}, {"id" = 1, "product" = "Book"}, {"id" = 1, "product" = "Book"}, {"id" = 2, "product" = "Chair"}]
It would turn into this array:
[{"id" = 1, "product" = "Book", "count" = 3}, {"id" = 2, "product" = "Chair", "count" = 1}]
I am using React. Another option I have is to add the count property when building the array so that duplicates don't get added, but I am curious whether there is a way to do it with an existing array.
Edit:
If two products have the same id they are duplicates.
I have tried filtering the array by id and taking the first object. I filtered the array by id again to get the length, added a new property "count" to the first object equal to the length of the filtered array, and then pushed that first object to a new array.
The problem with doing it this way is that I would have to hard-code this for every possible id, even ids that are not in my array.
Reduce the array to a Map (or even just a plain object) keyed by id (assuming that's all it takes to identify a duplicate).
The values of that map will be the array you're after.
const arr = [{"id":1,"product":"Book"},{"id":1,"product":"Book"},{"id":1,"product":"Book"},{"id":2,"product":"Chair"}]
const zipped = Array.from(arr.reduce((map, o) => {
// check if id already registered
if (map.has(o.id)) {
// increment count
map.get(o.id).count++
} else {
// otherwise, store the new object with count starting at 1
map.set(o.id, { ...o, count: 1 })
}
return map
}, new Map()).values())
console.log(zipped)
You could reduce the array into a new array with the count property added. The id property is assumed to be sufficient to determine uniqueness. If the element has already been seen, increment its count; otherwise append a new, augmented element object.
const data = [{
"id": 1,
"product": "Book"
}, {
"id": 1,
"product": "Book"
}, {
"id": 1,
"product": "Book"
}, {
"id": 2,
"product": "Chair"
}];
const dedupedData = data.reduce((data, el) => {
const index = data.findIndex(item => item.id === el.id);
if (index !== -1) {
data[index].count++;
} else {
data.push({ ...el, count: 1 });
}
return data;
}, []);
console.log(dedupedData);
Here is the data I am getting:
["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0"]
The first four elements are column names and the next four are the values for those columns. I want the data as:
[
{
"node_name": "0YXe_ws",
"ip":"10.0.10.147",
"name":"generic",
"active":"0"
}
]
I have tried splitting the array into chunks of four elements, but it didn't work.
If you can make sure the structure you mentioned is always the same, a simple for loop should do the trick.
Simply loop over the first half of the array (the keys) and add half of the array's length to each key's index to get the corresponding value.
let arr = ["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0"];
let data = {}
for(let i=0;i<arr.length / 2;i++){
data[arr[i]] = arr[i+arr.length/2];
}
console.log(data);
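If you need the array shape shown in the question, wrap the result, e.g. console.log([data]);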
You can chunk your data into arrays of four by first creating a chunk function. You can then make a zip function to merge each array's data together. Once you've done that, you can chunk the chunks into pairs and map over each pair with zip and Object.fromEntries() to obtain your objects:
const arr = ["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0"];
const chunk = (arr, s) =>
arr.length ? [arr.slice(0, s), ...chunk(arr.slice(s), s)] : [];
const zip = arrs => arrs[0].map((_, i) => arrs.map(arr => arr[i]));
const res = chunk(chunk(arr, 4), 2).map(arr => Object.fromEntries(zip(arr)))
console.log(res);
Example of the flow:
["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0"]
Applying chunk(arr, 4) to the above data chunks it into arrays of four elements:
[
["node_name","ip","name","active"],
["0YXe_ws","10.0.10.147","generic","0"]
]
Applying chunk(.., 2) to the above array then groups the keys array and the values array for each object into its own pair:
[
[
["node_name","ip","name","active"],
["0YXe_ws","10.0.10.147","generic","0"]
]
]
This is then iterated over using .map(). Each inner array is zipped first to create an array of [key, value] pair entries. The inner array above, once zipped, looks like this (note it is still wrapped by the outer array):
[
["node_name","0YXe_ws"],
["ip","10.0.10.147"],
["name","generic"],
["active","0"]
]
As this array now has the form [[key1, value1], [key2, value2], ...], we can use Object.fromEntries() on it, which takes an array of this form and converts it into an object for us. So the above gets transformed into the following object, giving the final result:
[
{
node_name: "0YXe_ws",
ip: "10.0.10.147",
name: "generic",
active: "0"
}
]
As a result, this method also works if you have an array containing the keys and values for multiple objects, producing multiple objects like so:
const arr = ["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0", "node_name", "ip", "name", "active", "1ABf_xy", "127.0.0.1", "abc", "1"];
const chunk = (arr, s) =>
arr.length ? [arr.slice(0, s), ...chunk(arr.slice(s), s)] : [];
const zip = arrs => arrs[0].map((_, i) => arrs.map(arr => arr[i]));
const res = chunk(chunk(arr, 4), 2).map(arr => Object.fromEntries(zip(arr)))
console.log(res);
You could separate the keys from the values and reduce the values array by looking at the index and the corresponding key.
var data = ["node_name", "ip", "name", "active", "0YXe_ws", "10.0.10.147", "generic", "0", "sms", "10.0.10.17", "normal", "1"],
length = 4,
keys = data.slice(0, length),
result = data.slice(length).reduce((r, v, i) => {
const index = Math.floor(i / length);
r[index] = r[index] || {};
r[index][keys[i % length]] = v;
return r;
}, []);
console.log(result);
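For the 12-element sample above (two records), the logged result should be:
[
{ node_name: "0YXe_ws", ip: "10.0.10.147", name: "generic", active: "0" },
{ node_name: "sms", ip: "10.0.10.17", name: "normal", active: "1" }
]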
I've got two arrays that have multiple objects
[
{
"name":"paul",
"employee_id":"8"
}
]
[
{
"years_at_school": 6,
"department":"Mathematics",
"e_id":"8"
}
]
How can I achieve the following with either ES6 or Lodash?
[
{
"name":"paul",
"employee_id":"8",
"data": {
"years_at_school": 6,
"department":"Mathematics",
"e_id":"8"
}
}
]
I can merge but I'm not sure how to create a new child object and merge that in.
Code I've tried:
school_data = _.map(array1, function(obj) {
return _.merge(obj, _.find(array2, {employee_id: obj.e_id}))
})
This merges everything into a top-level object like so (which is not what I want):
{
"name":"paul",
"employee_id":"8",
"years_at_school": 6,
"department":"Mathematics",
"e_id":"8"
}
The connector between these two is "employee_id" and "e_id".
It's imperative to take into account that there could be 1000 objects in each array, and that the only way to match these objects up is by "employee_id" and "e_id".
In order to match up employee_id and e_id you should iterate through the first array and create an object keyed by employee_id. Then you can iterate through the second array and add the data to the particular id in question. Here's an example with an extra item added to each array:
let arr1 = [
{
"name":"mark",
"employee_id":"6"
},
{
"name":"paul",
"employee_id":"8"
}
]
let arr2 = [
{
"years_at_school": 6,
"department":"Mathematics",
"e_id":"8"
},
{
"years_at_school": 12,
"department":"Arr",
"e_id":"6"
}
]
// empObj will be keyed to item.employee_id
let empObj = arr1.reduce((obj, item) => {
obj[item.employee_id] = item
return obj
}, {})
// now lookup up id and add data for each object in arr2
arr2.forEach(item=>
empObj[item.e_id].data = item
)
// The values of the object will be an array of your data
let merged = Object.values(empObj)
console.log(merged)
If you perform two nested O(n) loops (map + find), you'll end up with O(n^2) performance. A typical alternative is to create an intermediate indexed structure so the whole thing is O(n). A functional approach with lodash:
const _ = require('lodash');
const dataByEmployeeId = _(array2).keyBy('e_id');
const result = array1.map(o => ({...o, data: dataByEmployeeId.get(o.employee_id)}));
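For comparison, a plain-JS sketch of the same idea using a Map instead of lodash (array1 and array2 named as above):
// build an e_id -> object lookup in one pass (O(n))...
const byId = new Map(array2.map(o => [o.e_id, o]));
// ...then attach each matching record as `data` (O(1) per lookup)
const merged = array1.map(o => ({ ...o, data: byId.get(o.employee_id) }));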
Hope this helps you:
var mainData = [{
name: "paul",
employee_id: "8"
}];
var secondaryData = [{
years_at_school: 6,
department: "Mathematics",
e_id: "8"
}];
var finalData = mainData.map(function(person, index) {
person.data = secondaryData[index];
return person;
});
Sorry, I've also fixed a missing comma in the second object and changed some other stuff.
With the latest ECMAScript versions:
const mainData = [{
name: "paul",
employee_id: "8"
}];
const secondaryData = [{
years_at_school: 6,
department: "Mathematics",
e_id: "8"
}];
// Be careful with the spread operator over objects: it still lacks full browser support, but it works fine in the latest Chrome, for example (69.0)
const finalData = mainData.map((person, index) => ({ ...person, data: secondaryData[index] }));
Your question suggests that both arrays will always have the same size. It also suggests that you want to put the contents of array2 into the data field of the element with the same index in array1. If those assumptions are correct, then:
// Array that will receive the extra data
const teachers = [
{ name: "Paul", employee_id: 8 },
{ name: "Mariah", employee_id: 10 }
];
// Array with the additional data
const extraData = [
{ years_at_school: 6, department: "Mathematics", e_id: 8 },
{ years_at_school: 8, department: "Biology", e_id: 10 },
];
// Array.map iterates through all indices, giving both the element and its index
const merged = teachers.map((teacher, index) => Object.assign({ data: extraData[index] }, teacher));
However, if you want the data to be added to the employee with a matching "id" in both arrays, you need to do the following:
// Create a function to obtain the employee from an ID
const findEmployee = id => extraData.filter(entry => entry.e_id == id);
const mergedById = teachers.map(teacher => {
const employeeData = findEmployee(teacher.employee_id);
if (employeeData.length === 0) {
// Employee not found
throw new Error("Data inconsistency");
}
if (employeeData.length > 1) {
// More than one employee found
throw new Error("Data inconsistency");
}
return Object.assign({ data: employeeData[0] }, teacher);
});
A slightly different approach, using vanilla JS map() with a loop to match the employee ids and add the data from the second array to the matching object in the first array. My guess is that the answer from @MarkMeyer is probably faster.
const arr1 = [{ "name": "paul", "employee_id": "8" }];
const arr2 = [{ "years_at_school": 6, "department": "Mathematics", "e_id": "8" }];
const results = arr1.map((obj1) => {
for (const obj2 of arr2) {
if (obj2.e_id === obj1.employee_id) {
obj1.data = obj2;
break;
}
}
return obj1;
});
console.log(results);
My initial state is shown below. When a new book is added or a price changes, an updated array comes back from the service, and I need to merge its result into my initial state.
const initialState = {
booksData: [
{"Code":"BK01","price":"5"},
{"code":"BK02","price":"30"},
{"code":"BK03","price":"332"},
{"code":"BK04","price":"123"}
]
};
Updated array from the server, with a few records updated or new:
data: [
{"Code":"BK01","price":"10"},
{"code":"BK02","price":"25"},
{"code":"BK05","price":"100"}
]
After merging the updated array with the old one, the state should become:
booksData: [
{"Code":"BK01","price":"10"},
{"code":"BK02","price":"25"},
{"code":"BK03","price":"332"},
{"code":"BK04","price":"123"},
{"code":"BK05","price":"100"}
]
I would filter out elements of the old data that are in the new data, and concat.
const oldBooks = booksData.filter(book => !newData.some(newBook => newBook.code === book.code));
return oldBooks.concat(newData);
Keep in mind that you must NOT push values into the old array. In your reducer you MUST create new instances, here a new array; concat does that.
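A minimal reducer sketch of that idea (the action type and payload shape are assumptions, and the comparison assumes a consistent code key):
function booksReducer(state = initialState, action) {
  switch (action.type) {
    case "BOOKS_UPDATED": { // hypothetical action type
      const newData = action.payload;
      // keep only the old books that are not superseded by the incoming data
      const oldBooks = state.booksData.filter(
        book => !newData.some(newBook => newBook.code === book.code)
      );
      // return a new state object with a new array (no mutation of the old one)
      return { ...state, booksData: oldBooks.concat(newData) };
    }
    default:
      return state;
  }
}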
You can first merge both arrays together and then reduce the result to remove duplicates, like this:
var booksData = [
{"code":"BK01","price":"5"},
{"code":"BK02","price":"30"},
{"code":"BK03","price":"332"},
{"code":"BK04","price":"123"}
]
var newData = [
{"code":"BK01","price":"10"},
{"code":"BK02","price":"25"},
{"code":"BK05","price":"100"}
]
const result = [...newData, ...booksData].reduce((res, data, index, arr) => {
if (res.findIndex(book => book.code === data.code ) < 0) {
res.push(data);
}
return res;
}, [])
console.log(result);
Merge the two arrays and filter on the 'Code' property:
const initialState = {
booksData: [
{ "Code": "BK01", "price": "5" },
{ "code": "BK02", "price": "30" },
{ "code": "BK03", "price": "332" },
{ "code": "BK04", "price": "123" }
]
};
const data =
[
{ "Code": "BK01", "price": "10" },
{ "code": "BK02", "price": "25" },
{ "code": "BK05", "price": "100" }
]
let newState = [...initialState.booksData, ...data];
newState = newState.filter((obj, pos, arr) => {
return arr.map(mapObj => mapObj['Code']).indexOf(obj['Code']) !== pos;
});
console.log(newState);
Collection of objects:
Filter the merged array to pick only first occurrences: keep an item only if no earlier item (one with a lower index than the current filter index) has the same id.
const mergedUnique = [
...[{id:1}, {id:2}, {id:3}],
...[{id:1}, {id:4}, {id:2}]
]
.filter((item, idx, arr) =>
!arr.some(({id}, subIdx) => subIdx < idx && id == item.id)
)
console.log( mergedUnique )
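For the two sample arrays above, the logged result keeps the first occurrence of each id:
[ { id: 1 }, { id: 2 }, { id: 3 }, { id: 4 } ]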
Basic technique for "simple" arrays:
Merge the arrays and filter the result to pick only non-duplicate items, by checking whether the same value already exists before the current item's index in the merged array.
lastIndexOf is used to search backwards from just before the current index; checking backwards rather than forwards keeps the first occurrence of each value, which preserves the order of the merged array in a predictable way.
The first item is skipped from the check - it is obviously not a duplicate.
const mergedUnique = [...[1,2,3], ...[1,3,4,5,2]] // [1, 2, 3, 1, 3, 4, 5, 2]
.filter((item, idx, arr) =>
!~arr.lastIndexOf(item, idx-1) || !idx
)
console.log( mergedUnique )
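For the sample arrays above, the logged result is [1, 2, 3, 4, 5].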