Consider the following:
const arr = [ 1, 5, null, null, 10 ];
console.log(arr.join(',')); // '1,5,,,10'
console.log(`${arr}`); // '1,5,,,10'
I need to keep these null values, how can I do this?
The only thing I could think of is something with reduce:
const result = arr.reduce((acc, el, index, self) => `${acc += el}${index !== self.length - 1 ? ',' : ''}`, '');
Any better way?
Using reduce()
const arr = [ 1, 5, null, null, 10 ];
const jin = arr.reduce((p, c) => `${p},${c}`);
console.log(jin);
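One caveat: without an initial value, reduce() throws a TypeError on an empty array. If empty input is possible, a guarded variant (just a sketch) could be:
const join = a => a.length ? a.reduce((p, c) => `${p},${c}`) : '';
console.log(join([ 1, 5, null, null, 10 ])); // '1,5,null,null,10'
console.log(join([])); // ''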
Using map() and String()
Or use map() with the String function to convert each value to a string, so that join() will keep it:
const arr = [ 1, 5, null, null, 10 ];
const jin = arr.map(String).join(',');
console.log(jin);
Output
1,5,null,null,10
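The same approach also preserves undefined, which join() would otherwise render as an empty string, since String(undefined) is 'undefined'. For example:
const mixed = [ 1, undefined, null, 10 ];
console.log(mixed.join(',')); // '1,,,10'
console.log(mixed.map(String).join(',')); // '1,undefined,null,10'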
Not a pretty answer, but you could turn the nulls into strings.
const arr = [ 1, 5, null, null, 10 ];
const arr2 = arr.map(x => String(x));
console.log(arr2);
console.log(arr2.join(','));
const arr = [ 1, 5, null, null, 10 ]
console.log(String(arr.map(String)))
You could map everything to strings first by concatenating with an empty string:
const arr = [ 1, 5, null, null, 10 ];
console.log(arr.map((item) => item + "").join(',')); // '1,5,null,null,10'
console.log(`${arr}`); // '1,5,,,10'
Related
I have an array of arrays like below.
const array1 = [[8,1,2,3,1],[3,1,1,1],[4,2,1]];
What I need to do is insert six empty values ('') between the last two values of each element.
Expected output:
[ [ 8, 1, 2, '', '', '', '', '', '', 3, 1 ],
  [ 3, 1, '', '', '', '', '', '', 1, 1 ],
  [ 4, '', '', '', '', '', '', 2, 1 ] ]
What I tried:
I know how to append the values to the end of each element, like below. Can I modify my code to insert them at a specific position instead?
What is the most efficient way to do this?
const array1 = [[8,1,2,3,1],[3,1,1,1],[4,2,1]];
const appendArray = new Array(6).fill('');
const map1 = array1.map(x => x.concat(appendArray));
console.log(map1)
Array#splice could be one way:
const array1 = [[8,1,2,3,1],[3,1,1,1],[4,2,1]];
const map1 = array1.map(x => {
  const copy = [...x];
  copy.splice(-2, 0, ...Array(6).fill(''));
  return copy;
});
console.log(map1)
Although... personally I hate splice... this is better because it's a one-liner :p
const array1 = [[8,1,2,3,1],[3,1,1,1],[4,2,1]];
const map1 = array1.map(x => [...x.slice(0, -2), ...Array(6).fill(''), ...x.slice(-2)])
console.log(map1)
What concat does is just add the empty-value array to the end of array x. What you need is to separate the beginning and the end, then return the array with the spread values, like so:
const array1 = [[8,1,2,3,1],[3,1,1,1],[4,2,1]];
const appendArray = new Array(6).fill('');
const map1 = array1.map(x => {
  const beginning = x.slice(0, x.length - 2);
  const end = x.slice(-2);
  return [...beginning, ...appendArray, ...end];
});
console.log(map1)
Given the array const vals = [1, 2, 3, 4, 5, 6, 7, 8, 9];
How can I filter and return a new array of indexed key/value pair objects, for example:
const vals = [1, 2, 3, 4, 5, 6, 7, 8, 9];
// My fail attempt using filter()
let obj = vals.filter((n, i) => {
  return new Object({ i: n % 2 });
});
return obj;
// expected result [{1:2}, {3:4}, {5:6}, {7:8}]
I need to keep the index values, as I will filter 2 different arrays with different criteria and associate them later.
Update
Second attempt using map() as suggested in the comments
let obj = vals.map((n, i) => {
  if (n % 2) {
    return { [i]: n };
  }
});
Gives me the following:
[{0:1}, undefined, {2:3}, undefined, {4:5}, undefined, {6:7}, undefined, {8:9}]
To get a list of { key: value } objects where the key is the index and only the even values are kept, you can do this:
const vals = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const result = vals.map((v, i) => [i, v])
  .filter(([_, v]) => v % 2 == 0)
  .map(([i, v]) => ({ [i]: v }));
console.log(result);
With the first map, you make a list of [[0, 1], ...] pairs to save the index for later.
Then you filter your index-value pairs so only even values remain.
Then you pack those pairs into an object in another map.
This can be done more efficiently with a single iteration using reduce:
const vals = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const result = vals.reduce((a, v, i) => {
  if (v % 2 == 0) {
    a.push({ [i]: v });
  }
  return a;
}, []);
console.log(result);
You can try a simple for loop or the reduce function:
const vals = [1, 2, 3, 4, 5, 6, 7, 8, 9];
let arr = [];
for (let i = 0; i < vals.length - 1; i += 2) {
  let obj = {};
  obj[vals[i]] = vals[i + 1];
  arr.push(obj);
}
console.log(arr); // [{1: 2}, {3: 4}, {5: 6}, {7: 8}]
I have a nested/multi-dimensional array like so:
[ [ 1, 1, 'a' ], [ 1, 1, 'b' ], [ 2, 2, 'c' ], [ 1, 1, 'd' ] ]
And I want to filter it so that it returns only unique values of the outer array based on the 1st value of each nested array.
So from the above array, it would return:
[ [ 1, 1, 'a' ], [ 2, 2, 'c' ] ]
I'm trying to do this in vanilla JavaScript if possible. Thanks for any input! =)
Here is my solution.
const dedup = arr.filter((item, idx) => arr.findIndex(x => x[0] == item[0]) == idx)
It looks simple, but it's also a bit tricky.
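For example, run against the array from the question, it keeps only the first occurrence of each leading value:
const arr = [ [ 1, 1, 'a' ], [ 1, 1, 'b' ], [ 2, 2, 'c' ], [ 1, 1, 'd' ] ];
const dedup = arr.filter((item, idx) => arr.findIndex(x => x[0] == item[0]) == idx);
console.log(dedup); // [ [ 1, 1, 'a' ], [ 2, 2, 'c' ] ]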
I realize there are already three solutions, but I don't like them. My solution is:
Generic - you can use unique with any selector function
O(n) - it uses a Set, so it doesn't run in O(n^2) time
So here it is:
/**
 * @param arr - The array to get the unique values of
 * @param uniqueBy - Takes the value and selects a criterion by which unique values should be taken
 *
 * @returns A new array containing the original values
 *
 * @example unique(["hello", "hElLo", "friend"], s => s.toLowerCase()) // ["hello", "friend"]
 */
function unique(arr, uniqueBy) {
  const temp = new Set()
  return arr.filter(v => {
    const computed = uniqueBy(v)
    const isContained = temp.has(computed)
    temp.add(computed)
    return !isContained
  })
}
const arr = [ [ 1, 1, 'a' ], [ 1, 1, 'b' ], [ 2, 2, 'c' ], [ 1, 1, 'd' ] ]
console.log(unique(arr, v => v[0]))
You could filter with a Set and a given index.
const
  uniqueByIndex = (i, s = new Set) => array => !s.has(array[i]) && s.add(array[i]),
  data = [[1, 1, 'a'], [1, 1, 'b'], [2, 2, 'c'], [1, 1, 'd']],
  result = data.filter(uniqueByIndex(0));
console.log(result);
const input = [[1,1,'a'], [1,1,'b'], [2,2,'c'], [1,1,'d']]
const res = input.reduce((acc, e) => acc.find(x => x[0] === e[0])
  ? acc
  : [...acc, e], [])
console.log(res)
Create an object keyed by the first element of each array. Iterate over the arrays and check whether the first element already exists as a key in the object; if not, add the array under that key.
const nestedArr = [ [1,1,"a"], [1,1,"b"], [2,2,"c"], [1,1,"d"] ];
const output = {};
for (let arr of nestedArr) {
  if (!output[arr[0]]) {
    output[arr[0]] = arr;
  }
}
console.log(Object.values(output));
Another solution would be to maintain a count of each array's first element and, when the count equals 1, push the array into the final array.
const input = [ [1,1,"a"], [1,1,"b"], [2,2,"c"], [1,1,"d"] ],
      count = {},
      output = [];
input.forEach(arr => {
  count[arr[0]] = (count[arr[0]] || 0) + 1;
  if (count[arr[0]] === 1) {
    output.push(arr);
  }
})
console.log(output);
Given this input:
const set1 = new Set([10, "someText", {a: 1, b: 2}]);
const set2 = new Set([10, "someText", {a: 1, b: 2}]);
const set3 = new Set([5, "someText", {a: 3, b: 4}]);
const arr = [set1, set2, set3];
combineDupSets(arr);
Wanted result:
[
  Set { 10, 'someText', { a: 1, b: 2 } },
  Set { 5, 'someText', { a: 3, b: 4 } }
]
I am writing a function to eliminate all the duplicate sets. Since a Set won't catch duplicates when the values are objects (or Sets themselves), I wrote the following:
function combineDupSets(arr) {
  const hold = [];
  arr.forEach(set => {
    const copySet = [...set];
    const stringify = JSON.stringify(copySet);
    if (hold.indexOf(stringify) === -1) {
      hold.push(stringify);
    }
  });
  const end = hold.map(item => JSON.parse(item));
  const res = end.map(item => new Set(item));
  return res;
}
Here, I have to use 3 arrays of size O(n) to check for this, and I was just wondering whether there's another readable solution that is more efficient in both time and space complexity?
Thank you
Instead of using indexOf in an array, consider putting the sets onto an object or Map, where the key is the stringified set and the value is the original set. Assuming that the values are in order:
function combineDupSets(arr) {
  const uniques = new Map();
  for (const set of arr) {
    uniques.set(JSON.stringify([...set]), set);
  }
  return [...uniques.values()];
}
This:
iterates over the arr (O(n))
iterates over each item inside once (total of O(n * m) - there's no getting around that)
iterates over the created Map and puts it into an array (O(n))
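As a quick check with the input from the question (when two sets stringify the same, the later one wins, but their contents are identical):
const set1 = new Set([10, "someText", {a: 1, b: 2}]);
const set2 = new Set([10, "someText", {a: 1, b: 2}]);
const set3 = new Set([5, "someText", {a: 3, b: 4}]);
console.log(combineDupSets([set1, set2, set3]).map(s => [...s]));
// [ [ 10, 'someText', { a: 1, b: 2 } ], [ 5, 'someText', { a: 3, b: 4 } ] ]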
If the set values aren't necessarily in order - eg, if you have
Set([true, 'foo'])
Set(['foo', true])
that should be considered equal, then it'll get a lot more complicated, since every item in each Set not only has to be iterated over, but also compared against every other item in every other Set somehow. One way to implement this is to sort by the stringified values:
function combineDupSets(arr) {
  const uniques = new Map();
  for (const set of arr) {
    const key = [...set].map(JSON.stringify).sort().join();
    uniques.set(key, set);
  }
  return [...uniques.values()];
}
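With this version, the two order-shuffled sets from the example above collapse into one:
const s1 = new Set([true, 'foo']);
const s2 = new Set(['foo', true]);
console.log(combineDupSets([s1, s2]).length); // 1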
You could iterate the sets and compare the values, treating objects as equal only if they share the same object reference.
function combineDupSets(array) {
  return array.reduce((r, s) => {
    const values = [...s];
    if (!r.some(t => s.size === t.size && values.every(Set.prototype.has, t))) r.push(s);
    return r;
  }, []);
}
const
  a = { a: 1, b: 2 },
  b = { a: 3, b: 4 },
  set1 = new Set([10, "someText", a]),
  set2 = new Set([10, "someText", a]),
  set3 = new Set([5, "someText", b]),
  arr = [set1, set2, set3];
console.log(combineDupSets(arr).map(s => [...s]));
Suppose I want to loop through an array of characters and build up an object which represents the frequency of each character, for example:
const frequency = {};
const str = 'stackoverflow';
for (let i = 0; i < str.length; i++) {
  frequency[str[i]] = (frequency[str[i]] + 1) || 1;
}
With the above, we would expect an object in the form of:
{
  s: 1,
  t: 1,
  a: 1,
  c: 1,
  k: 1,
  o: 2,
  v: 1,
  e: 1,
  r: 1,
  f: 1,
  l: 1,
  w: 1
}
Now suppose I wanted to loop through an array of nested arrays, each with a form of [id, val]. Can I create an object such that I would end up with multiple keys, each representing a different id, and a corresponding array, filled with the values, in the same shorthand form as above?
For example:
[ [1,2], [1,3], [1,5], [2,7], [3,0], [1,10] ]
{
  1: [2, 3, 5, 10],
  2: [7],
  3: [0]
}
Is there something similar to:
const map = {};
for (let i = 0; i < list.length; i++) {
  map[list[i][0]] = (map[list[i][0].push(map[list[i][1])) || [list[i][1]];
}
Functional approach with .reduce():
const list = [[1,2],[1,3],[1,5],[2,7],[3,0],[1,10]]
const x = list.reduce((a, [k, v]) => (a[k] = [...a[k] || [], v], a), {})
console.log(x)
Edit: stole @jered's much better conditional instead of ternary
Classic use case for a reducer:
const arr = [ [1,2], [1,3], [1,5], [2,7], [3,0], [1,10] ];
const map = arr.reduce((acc, cur) => {
  return {
    ...acc,
    [cur[0]]: [...(acc[cur[0]] || []), cur[1]]
  };
}, {});
Of course, it could be a lot more readable than this... but computed property names and the spread syntax make it pretty slick and compact (it'd fit on a single line if you wanted to).
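If readability is the priority, a more spelled-out version of the same reducer (this sketch mutates the accumulator instead of spreading it) could look like:
const pairs = [ [1,2], [1,3], [1,5], [2,7], [3,0], [1,10] ];
const grouped = pairs.reduce((acc, [id, val]) => {
  (acc[id] = acc[id] || []).push(val); // create the bucket on first sight, then append
  return acc;
}, {});
console.log(grouped); // { 1: [ 2, 3, 5, 10 ], 2: [ 7 ], 3: [ 0 ] }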
Edit: ooo I like @woat's use of single character var names and destructuring assignment in the function signature :) We could throw that in too:
const arr = [ [1,2], [1,3], [1,5], [2,7], [3,0], [1,10] ];
const map = arr.reduce((a, [k, v]) => ({...a, [k]: [...(a[k] || []), v]}), {});
const list = [ [1,2], [1,3], [1,5], [2,7], [3,0], [1,10] ];
const map = {};
for (let item of list) {
  map[item[0]] = [...map[item[0]] || [], item[1]];
}
console.log(map);