I am new to ES6 destructuring. I have an object which contains another object. I want to store certain values from the nested object.
For example -
z = {g: 1, h: 2, i: {d1:5, d2:6, d3:7}}
When I do
let { g, i : {d1, d3}, ...less } = z
the less variable only stores h and not d2.
Is there a way to make it so it is
less = {h, i : {d2}}
No, there is not. What you could do is
let { g, i: { d1, d3, ...less2 }, ...less } = z;
less = { ...less, i: less2 };
This extracts both remainders and merges them back together while preserving the shape.
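With the z from the question, a quick check of the result (not part of the original answer):
console.log(less); // { h: 2, i: { d2: 6 } }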
No, unfortunately this is not possible.
You can, however, extract the missing values from i with a second rest property:
let z = {g: 1, h: 2, i: {d1:5, d2:6, d3:7}};
let { g, i : {d1, d3, ...i_less}, ...rest_less } = z;
let less = { i: i_less, ...rest_less };
console.log(less)
This is my way; hope it helps.
let z = {
  g: 1,
  h: 2,
  i: {
    d1: 5,
    d2: 6,
    d3: 7
  }
}
let {g, i: {d1, d3, ...less1}, ...less2} = z
let less = {
  i: less1,
  ...less2,
}
console.log(less); // output: {i: {d2: 6}, h: 2}
I have two arrays, a and b. When I change one, both change. Is there a way to edit one without changing the other?
a = [[0,0,0,0,0],[0,0,0,0,0]]
b = [[1,2,3,4,5],[6,7,8,9,10]]
a = b.slice(0)
a[0][0] = 10
console.log(a) /* [[10,2,3,4,5],[6,7,8,9,10]] */
console.log(b) /* [[10,2,3,4,5],[6,7,8,9,10]] */
a is fine, but I need b to stay [[1,2,3,4,5],[6,7,8,9,10]].
When you do slice, a gets a new outer array, but the inner arrays are still shared between a and b. Update your code to the following, using Array.map:
a = [[0,0,0,0,0],[0,0,0,0,0]]
b = [[1,2,3,4,5],[6,7,8,9,10]]
a = b.map(x => [...x])
a[0][0] = 10
console.log(a) /* [[10,2,3,4,5],[6,7,8,9,10]] */
console.log(b) /* [[1,2,3,4,5],[6,7,8,9,10]] */
You can use map to slice each array.
a = [[0,0,0,0,0],[0,0,0,0,0]]
b = [[1,2,3,4,5],[6,7,8,9,10]]
a = b.map(o=>o.slice(0));
a[0][0] = 10
console.log(a);
console.log(b);
Doc: map()
You take a shallow copy with Array#slice, which means nested arrays are taken by their object reference.
You could use Array#map with a check for arrays and map these recursively.
const deep = a => Array.isArray(a) ? a.map(deep) : a;
var a = [[0, 0, 0, 0, 0], [0, 0, 0, 0, 0]],
b = [[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]];
a = b.map(deep);
a[0][0] = 10;
console.log(a);
console.log(b);
slice(), like Object.freeze(), is shallow, so this works:
var a = [1,2,3,4];
var b = a.slice(0);
a[0] = 10;
console.log(b); // [1, 2, 3, 4]
console.log(a); // [10, 2, 3, 4]
But this doesn't work:
var a = [[0,0,0,0,0],[0,0,0,0,0]]; //multidimensional!
var b = [[1,2,3,4,5],[6,7,8,9,10]];
a = b.slice(0);
a[0][0] = 10;
console.log(a);
console.log(b);
So the key is to go deep, using slice() inside a for loop or something similar. Here is an example using for:
var a = [];
for (var i = 0, len = b.length; i < len; i++) {
  a[i] = b[i].slice();
}
Keep in mind that const won't help either, since it only prevents reassignment, not mutation of the nested arrays:
var a = [[0,0,0,0,0],[0,0,0,0,0]];
const b = [[1,2,3,4,5],[6,7,8,9,10]]; // const doesn't protect the contents
var a = b.slice(0);
a[0][0] = 10; // a changes b
console.log(a);
console.log(b);
The following code produces (in the Chrome JavaScript console)
a: (3) [1, 2, 3] b: (4) [1, 2, 3, 99] c: 4
I expected c to look like b. Why doesn't it?
function snafu(){
  var a = [1,2,3];
  var b = a.slice();
  var c = a.slice().push(99);
  b.push(99);
  console.log("a:",a," b:",b," c:",c);
}
Array.prototype.push() returns the new value of Array.length, not the array itself.
var a = [];
var b = a.push(8); /* returns length of array after pushing value into array */
console.log('a = ', a, ', b = ', b);
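If you do want the copy-and-append in a single expression, one option (a small sketch, not from the original answer) is concat, since it returns the new array rather than its length:
var a = [1, 2, 3];
var c = a.slice().concat(99); // concat returns the new array, so chaining works
console.log('a = ', a, ', c = ', c); // a = [1, 2, 3], c = [1, 2, 3, 99]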
Remember that Array.slice() returns a new array, but push() returns the new length of the array, not the array itself. Split the push into its own statement:
function snafu(){
  var a = [1,2,3];
  var b = a.slice();
  var c = a.slice();
  c.push(99);
  b.push(99);
  console.log("a:",a," b:",b," c:",c);
}
snafu();
Now c holds a new array, so you can do whatever you want with it. That's it.
What is the difference between the spread operator and Array.concat()?
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log([...numbers, ...parts]);
Array.concat() function
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log(numbers.concat(parts));
Both results are the same. So in what kinds of scenarios should we use each of them? And which one is better for performance?
concat and spread are very different when the argument is not an array: concat adds it as a whole, while ... tries to iterate it and fails if it can't. Consider:
a = [1, 2, 3]
x = 'hello';
console.log(a.concat(x)); // [ 1, 2, 3, 'hello' ]
console.log([...a, ...x]); // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
Here, concat treats the string atomically, while ... uses its default iterator, char-by-char.
Another example:
x = 99;
console.log(a.concat(x)); // [1, 2, 3, 99]
console.log([...a, ...x]); // TypeError: x is not iterable
Again, for concat the number is an atom, while ... tries to iterate it and fails.
Finally:
function* gen() { yield *'abc' }
console.log(a.concat(gen())); // [ 1, 2, 3, Object [Generator] {} ]
console.log([...a, ...gen()]); // [ 1, 2, 3, 'a', 'b', 'c' ]
concat makes no attempt to iterate the generator and appends it as a whole, while ... nicely fetches all values from it.
To sum it up, when your arguments are possibly non-arrays, the choice between concat and ... depends on whether you want them to be iterated.
The above describes the default behaviour of concat; however, ES6 provides a way to override it with Symbol.isConcatSpreadable. By default, this symbol is true for arrays and false for everything else. Setting it to true tells concat to iterate the argument, just like ... does:
str = 'hello'
console.log([1,2,3].concat(str)) // [1,2,3, 'hello']
str = new String('hello');
str[Symbol.isConcatSpreadable] = true;
console.log([1,2,3].concat(str)) // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
Performance-wise concat is faster, probably because it can benefit from array-specific optimizations, while ... has to conform to the common iteration protocol. Timings:
let big = (new Array(1e5)).fill(99);
let i, x;
console.time('concat-big');
for(i = 0; i < 1e2; i++) x = [].concat(big)
console.timeEnd('concat-big');
console.time('spread-big');
for(i = 0; i < 1e2; i++) x = [...big]
console.timeEnd('spread-big');
let a = (new Array(1e3)).fill(99);
let b = (new Array(1e3)).fill(99);
let c = (new Array(1e3)).fill(99);
let d = (new Array(1e3)).fill(99);
console.time('concat-many');
for(i = 0; i < 1e2; i++) x = [1,2,3].concat(a, b, c, d)
console.timeEnd('concat-many');
console.time('spread-many');
for(i = 0; i < 1e2; i++) x = [1,2,3, ...a, ...b, ...c, ...d]
console.timeEnd('spread-many');
Well console.log(['one', 'two', 'three', 'four', 'five']) has the same result as well, so why use either here? :P
In general, you would use concat when you have two (or more) arrays from arbitrary sources, and you would use spread syntax in an array literal when the additional elements that are always part of the array are known beforehand. So if you would otherwise have an array literal with concat in your code, just go for spread syntax, and use concat in the remaining cases:
[...a, ...b] // bad :-(
a.concat(b) // good :-)
[x, y].concat(a) // bad :-(
[x, y, ...a] // good :-)
Also the two alternatives behave quite differently when dealing with non-array values.
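A minimal sketch of that difference, using a string as the non-array value (the same behaviour the answer above walks through in detail):
let nums = [1, 2];
console.log(nums.concat('xy'));  // [ 1, 2, 'xy' ] – appended as a whole
console.log([...nums, ...'xy']); // [ 1, 2, 'x', 'y' ] – iterated character by character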
I am replying just to the performance question, since there are already good answers regarding the scenarios. I wrote a test and executed it on the most recent browsers. Below are the results and the code.
/*
* Performance results.
* Browser Spread syntax concat method
* --------------------------------------------------
* Chrome 75 626.43ms 235.13ms
* Firefox 68 928.40ms 821.30ms
* Safari 12 165.44ms 152.04ms
* Edge 18 1784.72ms 703.41ms
* Opera 62 590.10ms 213.45ms
* --------------------------------------------------
*/
Below is the code I wrote and used.
const array1 = [];
const array2 = [];
const mergeCount = 50;
let spreadTime = 0;
let concatTime = 0;
// Used to populate the arrays to merge with 10,000,000 elements.
for (let i = 0; i < 10000000; ++i) {
  array1.push(i);
  array2.push(i);
}
// The spread syntax performance test.
for (let i = 0; i < mergeCount; ++i) {
  const startTime = performance.now();
  const array3 = [ ...array1, ...array2 ];
  spreadTime += performance.now() - startTime;
}
// The concat performance test.
for (let i = 0; i < mergeCount; ++i) {
  const startTime = performance.now();
  const array3 = array1.concat(array2);
  concatTime += performance.now() - startTime;
}
console.log(spreadTime / mergeCount);
console.log(concatTime / mergeCount);
One difference I think is valid is that using the spread operator with a large array can give you a "Maximum call stack size exceeded" error, which you can avoid by using concat.
var someArray = new Array(600000);
var newArray = [];
var tempArray = [];
someArray.fill("foo");
try {
  newArray.push(...someArray);
} catch (e) {
  console.log("Using spread operator:", e.message)
}
tempArray = newArray.concat(someArray);
console.log("Using concat function:", tempArray.length)
There is one very important difference between concat and push: the former does not mutate the underlying array, requiring you to assign the result to the same or a different array:
let things = ['a', 'b', 'c'];
let moreThings = ['d', 'e'];
things.concat(moreThings);
console.log(things); // [ 'a', 'b', 'c' ]
things.push(...moreThings);
console.log(things); // [ 'a', 'b', 'c', 'd', 'e' ]
I've seen bugs caused by the assumption that concat changes the array (talking for a friend ;).
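The usual fix for that kind of bug is simply to keep the result of concat (a small sketch, not part of the original snippet):
let things = ['a', 'b', 'c'];
const moreThings = ['d', 'e'];
things = things.concat(moreThings); // assign the result back
console.log(things); // [ 'a', 'b', 'c', 'd', 'e' ]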
Update:
Concat is now always faster than spread. The following benchmark shows both small and large-size arrays being joined: https://jsbench.me/nyla6xchf4/1
// preparation
const a = Array.from({length: 1000}).map((_, i)=>`${i}`);
const b = Array.from({length: 2000}).map((_, i)=>`${i}`);
const aSmall = ['a', 'b', 'c', 'd'];
const bSmall = ['e', 'f', 'g', 'h', 'i'];
const c = [...a, ...b];
// vs
const c = a.concat(b);
const c = [...aSmall, ...bSmall];
// vs
const c = aSmall.concat(bSmall)
Previous:
Although some of the replies are correct when it comes to performance on big arrays, the performance is quite different when you are dealing with small arrays.
You can check the results for yourself at https://jsperf.com/spread-vs-concat-size-agnostic.
As you can see, spread is 50% faster for smaller arrays, while concat is multiple times faster on large arrays.
The answer by @georg was helpful for the comparison. I was also curious about how .flat() would compare, and it was by far the worst. Don't use .flat() if speed is a priority. (Something I wasn't aware of until now.)
let big = new Array(1e5).fill(99);
let i, x;
console.time("concat-big");
for (i = 0; i < 1e2; i++) x = [].concat(big);
console.timeEnd("concat-big");
console.time("spread-big");
for (i = 0; i < 1e2; i++) x = [...big];
console.timeEnd("spread-big");
console.time("flat-big");
for (i = 0; i < 1e2; i++) x = [[], big].flat();
console.timeEnd("flat-big");
let a = new Array(1e3).fill(99);
let b = new Array(1e3).fill(99);
let c = new Array(1e3).fill(99);
let d = new Array(1e3).fill(99);
console.time("concat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3].concat(a, b, c, d);
console.timeEnd("concat-many");
console.time("spread-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, ...a, ...b, ...c, ...d];
console.timeEnd("spread-many");
console.time("flat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, a, b, c, d].flat();
console.timeEnd("flat-many");
Coming from a MATLAB background, I am trying to replicate the following scenario in JavaScript:
A = [1, 2, 3, 4, 5];
B = 4;
C = A == B;
answer => C = [0, 0, 0, 1, 0]
In other words, it generates a logical array where only the matching value is set to 1. I can do this using a loop, but I was wondering if there is a one-liner solution to this in JavaScript?
You can use the map() function to do something similar to what you were looking for:
var A = [1, 2, 3, 4, 5];
var B = 4;
var C = function (x) { return +(x === B); };
var answer = A.map(C);
var C = x => +(x === B); would look cleaner, but that's ES6 code (experimental).
About the fanciest you could get would be
var C = A.map(function(v) { return v == B ? 1 : 0; });
That's supported in newer JavaScript runtime systems.
In JavaScript it'd probably be more idiomatic to prefer a result array containing boolean values:
var C = A.map(function(v) { return v == B; });
There's not a one-liner, but using Array.map can get you pretty close to what you want:
var a = [1, 2, 3, 4, 5];
var b = 4;
var c = a.map(function(item) { return item === b? 1: 0; });
console.log(c);
Fiddle
Note that map isn't supported by older browsers; the MDN link above has polyfill code, or you can include any number of libraries that provide something equivalent (e.g. jQuery has a .map() function).
You could write your own function :
function equals(a, b) {
  var result = [];
  while (result.length < a.length) {
    result.push(+(a[result.length] == b));
  }
  return result;
}
var A = [1, 2, 3, 4, 5];
var B = 4;
var C = equals(A, B); // [0, 0, 0, 1, 0]