So I was looking for a workaround for flatMap, since it doesn't work in IE, and I found this one.
But I don't really understand why it works:
var gadjets = [
  {
    computers: ['asus', 'hp'],
    sellphones: ['Galaxy', 'Nokia']
  },
  {
    computers: ['dell', 'insys'],
    sellphones: ['iphone', 'samsung']
  }
];

const getValues = gadjets.reduce((acc, gadjet) => acc.concat(gadjet['computers']), []); // instead of gadjets.flatMap(gadjet => gadjet['computers'])
This code returns:
['asus','hp','dell','insys']
But shouldn't it return:
['asus','hp'],['dell', 'insys']
This is because reduce adds up the elements you give it. For example, take the following code:
let arr = [1, 2, 3, 4, 5];
console.log(arr.reduce((before, value) => before + value)); // 15
This code takes each value and adds it to before. It then passes that added value into the next iteration of reduce, in the before variable.
In your code, the accumulator (before, or in your case acc) starts as an empty array; each iteration concatenates (merges) the gadjet['computers'] array onto it and returns the result. This builds a single flat list of the computers from the array of objects.
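For instance, the same reduce/concat pattern can be wrapped in a small reusable helper (just a sketch; the name flatMapCompat is only for illustration, not a standard API):

function flatMapCompat(arr, fn) {
  // reduce + concat gives the same single-level flattening that flatMap would
  return arr.reduce((acc, item) => acc.concat(fn(item)), []);
}

console.log(flatMapCompat(gadjets, gadjet => gadjet['computers'])); // ['asus', 'hp', 'dell', 'insys']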
More info on reduce here.
But shouldn't it return
I'm not sure what you're trying to show us there, but if you mean
[['asus','hp'],['dell', 'insys']]
then no, it shouldn't. concat flattens arrays you pass it (to a single level):
const a = [].concat(['asus','hp'], ['dell', 'insys']);
console.log(a); // ["asus", "hp", "dell", "insys"]
So acc.concat(gadjet['computers']) flattens out each of those computers arrays into a new array, which is the accumulation result of the reduce.
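To see that the flattening is only a single level deep, here is a quick demonstration (not from the original post):

console.log([].concat([['a'], 'b'], 'c')); // [['a'], 'b', 'c'] - the inner ['a'] stays nested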
In case you want the output to be an array of arrays, try this:
var gadjets = [
  { computers: ["asus", "hp"], sellphones: ["Galaxy", "Nokia"] },
  { computers: ["dell", "insys"], sellphones: ["iphone", "samsung"] }
];

const groupBy = key => {
  let res = gadjets.reduce((objectsByKeyValue, obj) => {
    let arr = [];
    arr.push(obj[key]);
    return objectsByKeyValue.concat(arr);
  }, []);
  return res;
};

console.log(groupBy("computers")); // [["asus", "hp"], ["dell", "insys"]]
I know that if there is an array of values, this approach can be used:
console.log(['joe', 'jane', 'mary'].includes('jane')); // true
But in the case of an array of arrays, is there a short way to do it, without other computations in between?
For this input:
[['jane'],['joe'],['mary']]
You can use the flat method to flatten the array. For more deeply nested arrays, you can also specify a depth, like flat(depth).
let arr = [["jane"],["joe"],["mary"]];
arr.flat().includes('jane'); //true
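For a deeper nesting you would pass a depth; a quick illustration with made-up data:

let deep = [[['jane']], [['joe']], [['mary']]];
console.log(deep.flat(2).includes('jane')); // true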
You can easily achieve this result using some:
arr.some((a) => a.includes("jane"))
const arr = [
["jane"],
["joe"],
["mary"]
];
const arr2 = [
["joe"],
["mary"]
];
console.log(arr.some((a) => a.includes("jane"))); // true
console.log(arr2.some((a) => a.includes("jane"))); // false
It can also be done by first flattening the 2D array into a 1D array and then using includes to check whether the array contains the element or not:
var arr = [['jane'], ['joe'], ['mary']];
var newarr = [].concat(...arr);
var v = newarr.includes('jane');
console.log(v); // true
I'm trying to get this output ['1', '12', '123'] from:
getSubstrings('123');
function getSubstrings(string) {
  let array = string.split('');
  let subarray = [];
  return array.map(element => {
    subarray.push(element);
    console.log(subarray); // this gets me ['1'], then ['1','2'] ...
    return subarray; // but this is not returning the same as in console.log(subarray)
  });
}
I know I can get the result I want by adding .join('') to the end of subarray. My question is not about how to get ['1', '12', '123']. Rather: why does returning the same subarray that is printed by console.log result in [['1','2','3'], ['1','2','3'], ['1','2','3']] instead of [['1'], ['1','2'], ['1','2','3']]? Strangely, the output from console.log(subarray) is not the same as the subarray returned and collected by .map(). Am I missing something here? Thanks in advance.
The problem is that subarray changes on each map 'iteration', because you are not creating a new subarray; you return the same reference every time, so all three entries end up pointing at the same array.
First it returns ['1'], but then it returns ['1', '2'] and the first entry changes to ['1', '2'] as well. The same happens with the last element. In the end, you have an array made of three copies of the same reference (subarray).
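Here is a quick standalone sketch of the shared-reference effect (not from the question, just an illustration):

const shared = [];
const copies = ['1', '2', '3'].map(n => {
  shared.push(n);
  return shared; // the same array object every time
});
console.log(copies[0] === copies[1]); // true - every entry is the same array
console.log(copies); // [['1', '2', '3'], ['1', '2', '3'], ['1', '2', '3']]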
Fix it by copying the array with the spread operator:
function getSubstrings(string) {
  let array = string.split('');
  let subarray = [];
  return array.map(element => {
    subarray.push(element);
    return [...subarray].join(""); // copy, then join to get the desired output
  });
}

console.log(getSubstrings("123")); // ["1", "12", "123"]
Notice that subarray is being modified on every iteration and a new subarray is never created. So what is actually happening is that you're returning the exact same reference each time, hence all of the results change together.
I suggest refactoring this to use forEach, something like this:
const arr = getSubstrings('123');

function getSubstrings(string) {
  let array = string.split('');
  let mainArray = [];
  let cur = '';
  array.forEach(x => {
    cur += x;
    mainArray.push(cur);
  });
  return mainArray;
}

console.log(arr); // ["1", "12", "123"]
I have two arrays and I would like to check whether they have duplicated values, and then return the values that aren't duplicated. Based on these two arrays I would like to return the string Eucalipto.
const plants = [
  {
    id: 59,
    kind: "Cana-de-açucar"
  },
  {
    id: 60,
    kind: "Citros"
  }
];

const auxPlants = [
  "Cana-de-açucar",
  "Citros",
  "Eucalipto"
];
You can use Array#map to find all the kind values, pass that to the Set constructor, and then use Array#filter to find all elements of the array not in that Set.
const plants = [
  {
    id: 59,
    kind: "Cana-de-açucar"
  },
  {
    id: 60,
    kind: "Citros"
  }
];

const auxPlants = [
  "Cana-de-açucar",
  "Citros",
  "Eucalipto"
];

const set = new Set(plants.map(({ kind }) => kind));
const res = auxPlants.filter(x => !set.has(x));
console.log(res); // ["Eucalipto"]
Sounds like you want to filter the array of values you're interested in based on whether they're not found in the other array, like so:
const nonDuplicates = auxPlants.filter(a => !plants.find(p => p.kind === a))
It's unclear whether you'd also want the non-duplicated values from the plants array as well, or if you're only interested in uniques from the auxPlants array; see the sketch below for the former.
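In case you do want the non-duplicated kinds from both sides, a rough symmetric-difference sketch could look like this (plantsOnly and allUniques are made-up names for illustration):

const plantsOnly = plants
  .filter(p => !auxPlants.includes(p.kind))
  .map(p => p.kind);
const allUniques = [...plantsOnly, ...nonDuplicates];
console.log(allUniques); // ["Eucalipto"] for the sample data, since every kind in plants also appears in auxPlants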
Here is another solution; how it works is explained in the comments:
// create a set in order to store values in it
// (assuming you have unique values)
let set = new Set();

// iterate over the array of objects and store the value of 'kind' in the set
for (const obj of plants) {
  set.add(obj.kind);
}

// iterate over the other array and check each value against the set;
// if it is not in the set, print it
for (const ele of auxPlants) {
  if (!set.has(ele)) {
    console.log(ele); // "Eucalipto"
  }
}
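The same idea can also collect the results into an array instead of printing them (a small variation on the answer above):

const notInPlants = auxPlants.filter(ele => !set.has(ele));
console.log(notInPlants); // ["Eucalipto"]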
As said, please search for an already posted solution first; similar questions have been answered before.
Anyhow, the solution would be to separate the types of plants from the first array, like so:
const plantsTypes = plants.map(obj => obj.kind)
Then, filter out the duplicates, keeping only the values that aren't in plantsTypes:
const nonDuplicates = auxPlants.filter(plant => !plantsTypes.includes(plant))
Note that it matters which array you call the .filter() function on.
I am trying to split an array of integers into an array of arrays grouped by duplicate values. The original array is a list of 6-digit integers; some of these integers come in pairs, others come in groups of 3 or 4. I'd like to get these duplicates pushed to their own arrays and have all of those arrays of duplicates collected into an array of arrays that I can later loop through.
I've looked in the lodash library for some method, or combination of methods, but can't quite find anything that seems to work. I've also tried a few different configurations with nested for loops, but am struggling with that as well.
const fs = require('fs'); // assuming Node.js, since readdirSync is used

const directory = "X/";
let files = fs.readdirSync(directory);
let first6Array = [];

for (let i = 0; i < files.length; i++) {
  let first6 = files[i].substring(0, 6);
  first6Array.push(first6);
}
console.log(first6Array);
example output of first6Array:
[ '141848',
'141848',
'141848',
'142851',
'142851',
'143275',
'143275']
I'd like to end up with something like
let MasterArray = [[141848,141848,141848],[142851,142851],[143275,143275]];
You can use new Set() to filter out the duplicates.
Then you iterate over the unique values and filter the full array for each of them.
const firstArray = ['141848', '141848', '141848', '142851', '142851', '143275', '143275'];
const numberArray = firstArray.map(Number);
const masterArray = [];
const unique = new Set(numberArray); // Set {141848, 142851, 143275}

unique.forEach(u => {
  masterArray.push(numberArray.filter(e => e === u));
});

console.log(masterArray); // [[141848, 141848, 141848], [142851, 142851], [143275, 143275]]
Using lodash, you can create a function with flow:
map the items by truncating them and converting to numbers.
groupBy the value (the default).
convert to an array of arrays using values.
const { flow, partialRight: pr, map, truncate, groupBy, values } = _;
const truncate6 = s => truncate(s, { length: 6, omission: '' });
const fn = flow(
pr(map, flow(truncate6, Number)),
groupBy,
values,
);
const firstArray = [ '141848abc', '141848efg', '141848hij', '142851klm', '142851opq', '143275rst', '143275uvw'];
const result = fn(firstArray);
console.log(result); // [[141848, 141848, 141848], [142851, 142851], [143275, 143275]]
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.11/lodash.min.js"></script>
Use reduce to create an object of arrays, indexed by number, and push to the associated array on each iteration (creating the array at the key first if needed), then get the values of the object:
const directory = "X/";
const files = fs.readdirSync(directory);

const output = Object.values(
  files.reduce((a, file) => {
    const num = Number(file.slice(0, 6));
    if (!a[num]) a[num] = [];
    a[num].push(num);
    return a;
  }, {})
);
It's pretty weird to have an array of identical values, though - you might consider a different data structure like
{
'141848': 3,
'142851': 2
}
to keep track of the number of occurrences of each number:
const output = files.reduce((a, file) => {
  const num = file.slice(0, 6);
  a[num] = (a[num] || 0) + 1;
  return a;
}, {});
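If the array-of-arrays shape from the question is still needed, it can be rebuilt from this counts object (a rough sketch; grouped is just an illustrative name):

const grouped = Object.entries(output).map(
  ([num, count]) => Array(count).fill(Number(num))
);
console.log(grouped); // e.g. [[141848, 141848, 141848], [142851, 142851]]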
To obtain exactly the result you desire, you need a nested find; something like this should work:
const directory = "X/";
let files = fs.readdirSync(directory);

let first6Array = files.reduce((acc, value) => {
  let n = +value.substr(0, 6); // assumes it can't be NaN
  let arr = acc.find(nested => nested.find(item => item === n));
  if (arr) {
    arr.push(n);
  } else {
    acc.push([n]);
  }
  return acc;
}, []);
console.log(first6Array);
Note that a hashmap with the value and the number of occurrences would be better instead, also in terms of performance, but it probably doesn't matter here since you have very few elements.
Also, it assumes the first six characters are actually numbers, otherwise the conversion would fail and you'll get NaN.
It would be safer adding a check to skip this scenario:
let n = +value.substr(0, 6);
if (isNaN(n)) {
return acc;
}
// etc
Are there any substantial reasons why modifying Array.push() to return the object pushed rather than the length of the new array might be a bad idea?
I don't know if this has already been proposed or asked before; Google searches returned only a myriad number of questions related to the current functionality of Array.push().
Here's an example implementation of this functionality, feel free to correct it:
;(function() {
  var _push = Array.prototype.push;
  Array.prototype.push = function() {
    return this[_push.apply(this, arguments) - 1];
  };
}());
You would then be able to do something like this:
var someArray = [],
    value = "hello world";

function someFunction(value, obj) {
  obj["someKey"] = value;
}

someFunction(value, someArray.push({}));
Where someFunction modifies the object passed in as the second parameter, for example. Now the contents of someArray are [{"someKey": "hello world"}].
Are there any drawbacks to this approach?
See my detailed answer here
TLDR;
You can get the resulting array back as a return value when you instead add an element using array.concat().
concat is a way of "adding" or "joining" two arrays together. The awesome thing about this method is that it returns the resultant array, so it can be chained.
newArray = oldArray.concat([newItem]);
This also allows you to chain calls together:
updatedArray = oldArray
  .filter((item) => item.id !== updatedItem.id)
  .concat([updatedItem]);
Where item = { id: someID, value: someUpdatedValue }.
The main thing to notice is that you need to pass an array to concat.
So make sure that you put the value to be "pushed" inside a pair of square brackets, and you're good to go.
This will give you the functionality you expected from push().
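A minimal runnable sketch of the concat-instead-of-push idea with concrete values:

const oldArray = [1, 2];
const newArray = oldArray.concat([3]);
console.log(newArray); // [1, 2, 3]
console.log(oldArray); // [1, 2] - concat returns a new array and leaves the original untouched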
You can "add" two arrays together by calling concat() on one of them, or by passing both arrays as parameters to [].concat():
let arrayAB = arrayA.concat(arrayB);
let arrayCD = [].concat(arrayC, arrayD);
Note that by using the concat method, you can take advantage of "chaining" commands before and after concat.
Are there any substantial reasons why modifying Array.push() to return the object pushed rather than the length of the new array might be a bad idea?
Of course there is one: Other code will expect Array::push to behave as defined in the specification, i.e. to return the new length. And other developers will find your code incomprehensible if you did redefine builtin functions to behave unexpectedly.
At least choose a different name for the method.
You would then be able to do something like this: someFunction(value, someArray.push({}));
Uh, what? Yeah, my second point already strikes :-)
However, even if you didn't use push, this does not get across what you want to do. The composition that you should express is "add an object which consists of a key and a value to an array". With a more functional style, let someFunction return this object, and you can write
var someArray = [],
    value = "hello world";

function someFunction(value, obj) {
  obj["someKey"] = value;
  return obj;
}

someArray.push(someFunction(value, {}));
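For completeness, logging the result shows the same contents as in the original example:

console.log(someArray); // [{ someKey: "hello world" }]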
Just as a historical note -- There was an older version of JavaScript -- JavaScript version 1.2 -- that handled a number of array functions quite differently.
In particular to this question, Array.push did return the item, not the length of the array.
That said, 1.2 has not been used for decades now -- but some very old references might still refer to this behavior.
http://web.archive.org/web/20010408055419/developer.netscape.com/docs/manuals/communicator/jsguide/js1_2.htm
Since the coming of ES6, it is recommended to extend the Array class in the proper way and then override the push method:
class XArray extends Array {
  push() {
    super.push(...arguments);
    return (arguments.length === 1) ? arguments[0] : arguments;
  }
}

//---- Application
let list = [1, 3, 7, 5];
list = new XArray(...list);

console.log('Push one item : ', list.push(4));
console.log('Push multi-items :', list.push(-9, 2));
console.log('Check length :', list.length);
The push() method returns the new length of the array rather than the element added, which makes it very inconvenient when writing short functions/reducers. Also, push() is rather archaic in JS. On the other hand, we have the spread syntax [...], which does what you need: it returns an array.
// to concat arrays
const a = [1,2,3];
const b = [...a, 4, 5];
console.log(b) // [1, 2, 3, 4, 5];
// to concat and get a length
const arrA = [1,2,3,4,5];
const arrB = [6,7,8];
console.log([0, ...arrA, ...arrB, 9].length); // 10
// to reduce
const arr = ["red", "green", "blue"];
const liArr = arr.reduce( (acc,cur) => [...acc, `<li style='color:${cur}'>${cur}</li>`],[]);
console.log(liArr);
//[ "<li style='color:red'>red</li>",
//"<li style='color:green'>green</li>",
//"<li style='color:blue'>blue</li>" ]
var arr = [];
var element = Math.random();
// push returns the new length, so the element just pushed is at index length - 1
console.assert(element === arr[arr.push(element) - 1]);
How about doing someArray[someArray.length] = {} instead of someArray.push({})? The value of an assignment expression is the value being assigned.
var someArray = [],
    value = "hello world";

function someFunction(value, obj) {
  obj["someKey"] = value;
}

someFunction(value, someArray[someArray.length] = {});
console.log(someArray); // [{ someKey: "hello world" }]