JS Array Destructuring - Using a Binding Pattern as the Rest Property - javascript

MDN recently updated some of their docs and I came across the below piece of code. Can anyone explain what would be a practical use case of the following?
From MDN:
The rest property of array destructuring assignment can be another array or object binding pattern. This allows you to simultaneously unpack the properties and indices of arrays.
const [a, b, ...{ pop, push }] = [1, 2];
console.log(a, b); // 1 2
console.log(pop, push)
push() is called on an array, so what is the point of unpacking the push() method out of an array? I can't think of a single practical use for this.

That's ... not a good example of that feature. :-D It works because the rest element creates a new array, and the object destructuring pattern { pop, push } then picks those properties off that new array (not the original array).
The closest I can come to a useful example is if you want to know how many additional elements there were beyond the ones you wanted, but you don't want/need the actual array of them:
let a, b, length;
[a, b, ...{ length }] = [1, 2];
console.log(length); // 0, there were no extra elements
[a, b, ...{ length }] = [1, 2, 3];
console.log(length); // 1, there was one extra
[a, b, ...{ length }] = [1, 2, 3, 4];
console.log(length); // 2, there were two extra
...but I think the fact is that while you can use object/array destructuring on the rest element because it falls out naturally from the way destructuring patterns work, it's unlikely to be all that useful.
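For completeness, here's a minimal sketch of the other half of the MDN quote, using an array binding pattern (rather than an object pattern) as the rest property; the names first, second and third are just illustrative:
const [first, ...[second, third]] = [10, 20, 30, 40];
console.log(first, second, third); // 10 20 30 (40 is simply not bound)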
I should note that in the more general case, using object destructuring on an array can indeed be useful, just probably not when applied to a rest element. It's especially useful for augmented arrays like the one you get from RegExp.prototype.exec, which includes not just the array elements for the match and any capture group values, but also index (where the match occurred), input, groups, and indices. You might want those. Object destructuring on an array lets you get the specific elements you want as well as non-element properties, for example:
const str = "an example string";
const match = /example/.exec(str);
if (match) {
  const { 0: matched, index } = match;
  console.log(`Matched ${JSON.stringify(matched)} at index ${index}.`);
}
It's also useful for picking out just a couple of elements from the middle:
const array = ["a", "b", "c", "d", "e", "f", "g", "h", "i"];
const {2: two, 7: seven} = array;
console.log(`two = ${two}, seven = ${seven}`);

Related

JS .reverse array using .sort - a trick which doesn't seem to work?

I want to get the same results which method Array.prototype.reverse() gives. I found this trick using .sort(), but it doesn't work in my Chrome console.
Why doesn't this code reverse an array?
const trickReverse = arr => arr.sort(x => 1);
console.log(trickReverse([1, 2, 3])); // [1, 2, 3]
Explanation: The .sort method accepts a function that normally takes two arguments; in this case we don't care about them, so the function just takes 'x' and ignores it. That function compares two elements and returns a negative number if they are in the correct order, zero if they are considered equal, and a positive number if they are in the wrong order. Usually you give it a function that actually does something useful. This solution exploits it by always returning a positive number (1), so sort() should think every pair of elements it compares is in the wrong order, supposedly reversing the entire array.
The sort() method sorts the elements of an array in place and returns the sorted array. The default sort order is ascending, built upon converting the elements into strings, then comparing their sequences of UTF-16 code unit values.
// Functionless: sort()
// Arrow function: sort((a, b) => { /* ... */ })
// a is the first element for comparison and b is the second element for comparison.
The optional parameter of sort() is a function that defines the sort order. If it is omitted, the array elements are converted to strings and sorted according to each character's Unicode code point value.
Note that the array is sorted in place, and no copy is made.
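A quick sketch of that default string comparison, contrasted with a numeric comparator (the numbers are chosen arbitrarily):
const nums = [10, 2, 1];
console.log([...nums].sort());                // [1, 10, 2] - compared as strings
console.log([...nums].sort((a, b) => a - b)); // [1, 2, 10] - numeric ascending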
The answer you need is probably the one given below, or something close to it.
The sort method can be conveniently used with function expressions:
const numbers = [1, 2, 3];
numbers.sort(function(a, b) {
return b - a;
});
console.log(numbers);
// [3, 2, 1]
This sorts the numbers in descending order, which amounts to a reversal when the input is already sorted in ascending order.
For reference, you can read sort() method documentation here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort#:~:text=The%20sort%20method%20can%20be%20conveniently%20used%20with%20function%20expressions%3A
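For contrast, a small sketch showing that a descending sort only coincides with a reversal when the input is already sorted ascending; for an arbitrary array, .reverse() (on a copy, if you need one) is the direct tool:
const arr = [3, 1, 2];
console.log([...arr].sort((a, b) => b - a)); // [3, 2, 1] - descending order
console.log([...arr].reverse());             // [2, 1, 3] - the actual reversal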
The sort algorithm keeps the first element before the second when the comparison function returns a negative value, and moves it after when the value is positive. A comparator that ignores its arguments doesn't describe a consistent ordering, so the result depends on the engine's sort implementation, but changing the return value to -1:
const trickReverse = arr => arr.sort(e => -1);
console.log(trickReverse([1, 2, 3])); // [3, 2, 1]
will do the trick in engines where that happens to work.

A copy of an array with spread syntax modifies the copied array, how come? [duplicate]

From mdn: Spread Syntax
Note: Typically the spread syntax in ES2015 goes one level deep while copying an array. Therefore, it is unsuitable for copying multidimensional arrays. It's the same case with Object.assign() and Object spread syntax. Look at the example below for a better understanding.
var a = [[1], [2], [3]];
var b = [...a];
b.shift().shift(); // 1
// Now array b is: [[2], [3]]
What is the point of the above statement? The above code sample works just the same as if you'd copied the array in a to b using the .slice() method. I tried adding another dimension to the array here: https://repl.it/HKOq/2 and things still worked as expected.
So why is the spread syntax unsuitable for copying multidimensional arrays?
I'd appreciate any help.
EDIT:
Reading the answers by estus and vol7ron helped me figure things out. Basically, as estus points out technically there are just arrays inside arrays rather than multidimensional arrays.
And as vol7ron explains only the first level of the array is copied so the objects in memory remain the same for any further nested elements.
I was also wrong to suspect that using the spread syntax was supposed to behave any differently from the .slice() method.
Man, programmers are really poor at displaying examples that actually show the difference.
var a = [[['a', 'b'], ['c', 'd']], 'e'];
var b = [...a];
b[0][0][0] = 'z';
b[1] = 'x';
console.log('a', a);
console.log('b', b);
This outputs:
a [[["z", "b"], ["c", "d"]], "e"]
b [[["z", "b"], ["c", "d"]], "x"]
Notice something fishy? The [0][0][0] value changed in both arrays, meaning the object sitting at [0][0][0] in both arrays is a reference to the same object, not a copy. However, the [1] values are different, meaning that level is indeed a copy.
Shallow copy means the first level is copied, deeper levels are referenced.
Arrays are objects, and [...a] creates a shallow copy of the array object a.
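A minimal sketch of what "shallow" means in terms of object identity:
const a = [[1], [2]];
const b = [...a];
console.log(b !== a);       // true - the outer array is a new object
console.log(b[0] === a[0]); // true - the inner arrays are still shared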
As far as the language is concerned there are no multidimensional arrays - there are just arrays inside an array. It doesn't matter whether an array contains arrays, plain objects, functions or primitives. For primitives, their values will be copied. Otherwise, the references to the objects will be copied. This is what
It's the same case with Object.assign() and Object spread operators
part refers to.
And regarding
The above code sample works just the same as if you'd copied the array in a to b using the .slice() method
...it truly does. This is a neater way to write a.slice() or [].concat(a), with one considerable difference: ES6 spread syntax (as well as Array.from(a)) works equally for all iterables, not just for arrays.
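A small sketch of that last point, spreading non-array iterables (a Set and a string here, purely for illustration):
const fromSet = [...new Set([1, 2, 2, 3])]; // [1, 2, 3]
const fromString = Array.from("abc");       // ["a", "b", "c"]
console.log(fromSet, fromString);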
For a deep copy of an object ES6 offers nothing new, an object (which an array is) should be recursively copied by hand. To address all the concerns it still makes sense to use proven third-party helper functions, such as Lodash cloneDeep.
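As a rough illustration of "recursively copied by hand", here is a minimal sketch that handles nested arrays only; anything else (plain objects, Dates, Maps, ...) would need extra handling, which is exactly why a tested helper like cloneDeep is recommended for real code:
function deepCopyArray(arr) {
  // copy arrays recursively, keep every other value as-is
  return arr.map(el => Array.isArray(el) ? deepCopyArray(el) : el);
}

const original = [[1], [2, [3]]];
const copy = deepCopyArray(original);
copy[1][1][0] = 99;
console.log(original[1][1][0]); // 3 - the original is untouched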
New arrays are not created for the inner array elements (i.e. for a multi-dimensional array):
// One-dimensional array
var a = [1,2,3];
var b = [...a];
a[0]='a';
console.log('a',a);
console.log('b',b);
// expected: b[0] == 1
// got: b[0] == 1
// Multi-dimensional array
var a = [[1], [2], [3]];
var b = [...a];
a[0][0]='a';
console.log('a',a);
console.log('b',b);
// expected: b[0][0] == 1
// got: b[0][0] == 'a'
It works like slice(), so you would have to traverse the array and create new arrays for each dimension. Here's one quick example:
// Multi-dimensional array
var a = [[1], [2], [3]];
var b = (function fn(ar) {
  return ar.map(el => Array.isArray(el) ? fn(el) : el);
})(a);
a[0][0]='a';
console.log('a',a);
console.log('b',b);
// expected: b[0][0] == 1
// got: b[0][0] == 1
So what the example is trying to convey is that var b = [...a]; will not unroll the inner arrays of a (e.g. b = [1, 2, 3]); instead, b will be [[1], [2], [3]]. So b.shift() removes and returns the first element of b, which is [1], and the second shift() then removes 1 from that returned array. In short, ... only reaches one level down into your spread array: var b = [...a] is equivalent to var b = [a[0], a[1], a[2]], not var b = [a[0][0], a[1][0], a[2][0]], in the example.

Objects assigned to array don't want to start at index 1 when loaded with splice

I try to start an array index at 1 with
var entry = {};
entry['idSong'] = idSongVoted;
entry['voted'] = votedsong;
$scope.votedList.splice(idSongVoted, idSongVoted, entry);
but the resulting array always starts at index zero.
I need the index to be the same as the idSong.
idSongVoted is an integer with values from 1 to 12, never zero.
JavaScript arrays always start at index 0. You can get an index starting at 1 with a sparse array (which is really just an object). So what you need here is a JavaScript object.
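A minimal sketch of that idea, reusing the names from the question (idSongVoted, votedsong and $scope.votedList are assumed to exist in the surrounding controller code):
// Use a plain object keyed by the song id instead of an array.
$scope.votedList = {};

// When a vote comes in (idSongVoted and votedsong come from the question's code):
$scope.votedList[idSongVoted] = { idSong: idSongVoted, voted: votedsong };

// Lookup later is just property access:
var entry = $scope.votedList[idSongVoted];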
This problem is caused by the incorrect usage of splice. While JavaScript arrays are 0-based, neither this nor being a "sparse array" nor "not a normal object" are immediately relevant.
The splice function is simply not the correct tool for this task.¹
var a = []
a.splice(1, 1, "a")
// a => ["a"]
a.splice(3, 3, "c")
// a => ["a", "c"]
a.splice(2, 2, "b")
// a => ["a", "c", "b"]
A trivial fix is to assign to each array index directly - if a "normal object" was used it would be considered assigning to each property directly.
var a = []
a[1] = "a"
a[3] = "c"
a[2] = "b"
// a => [undefined, "a", "b", "c"]
In context of the code the fix would simply be:
$scope.votedList[idSongVoted] = entry;
One nice thing about using a "sparse array" as shown - as no value was assigned to index 0 - is that Array.prototype.forEach and angular.forEach will correctly enumerate items in the numerical order of "id" and skip the indexes without values assigned. (If a value like null or undefined were assigned to the 0th index then the forEach loops would need guards added!)
However, if the "id" values are not relatively densely packed non-negative integers around 0 then a "normal object", which is supported by angular.forEach, would be a much better choice for an "id"-to-entry Map. This is for both logical/semantic and performance reasons.
¹ Per the MDN Array.splice documentation:
Index at which to start changing the array. If greater than the length of the array, actual starting index will be set to the length of the array. If negative, will begin that many elements from the end.
This means a.splice(1, 1, "a"), when a is an empty array, is equivalent to a.splice(0, 1, "a"). This same issue also affects a.splice(3, 3, "c") when the array only has two elements - which leads to the overall incorrect ordering of ["a", "c", "b"] when the "id" used is not strictly ordered.
Supplying the "id" to howMany is also problematic, although such is not shown in the example. Just don't use splice here.

Why can't I concat an array reference in JavaScript?

I have two arrays, one comes as a reference (parameter) from a function, and the other is created as part of the function - exactly the same scenario as described here:
Add two arrays without using the concat method
I was using the push.apply() method as per the suggestion above, but can someone please explain to me, why I can't use concat() to merge two arrays if the array is sent into the function as a reference?
Refer to Array.concat on MDN:
Any operation on the new array will have no effect on the original arrays, and vice versa.
This makes it behave differently from Array.push.apply which will mutate the original Array object - the return value of Array.concat must be used. Otherwise, it works as explained in the MDN link above.
If you use concat the original array will be unmodified. If you have a reference to it, you won't see the new elements.
var arr1 = [ "a", "b" ];
var arr2 = [ "c", "d" ];
arr1.push.apply(arr1, arr2);
Basically does this:
[ "a", "b" ].push("c", "d");
apply turns an array into a list of arguments. The first argument to apply is the this value, by the way: arr1 in this case, since you want push applied to arr1.
You can use concat:
var arr1 = [ "a", "b" ];
var arr2 = [ "c", "d" ];
var arr3 = arr1.concat(arr2);
This leaves the original arr1 as it was. You've created a new array that has both arr1 and arr2 elements in it. If you have a reference to the original arr1 it will be unmodified. That might be a reason to not want to use concat.
Let's say we have two arrays a and b. The Array.prototype.concat method returns a new Array instance c representing the concatenation of a and b, without mutating either of them. Array.prototype.push, on the other hand, returns the new length of the array and mutates the instance it is called on.
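A small sketch of that difference in return values and mutation:
const a = [1, 2];
const b = [3];
const c = a.concat(b);  // new array; neither a nor b is modified
console.log(c);         // [1, 2, 3]
console.log(a.push(4)); // 3 - push returns the new length and mutates a
console.log(a);         // [1, 2, 4]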
Since ES6 (ES2015) it's possible to spread an array into individual arguments, so you can use push to concatenate in place as well:
a = [1,2,3,4]; // a=[1,2,3,4];
b = [5,6,7,8]; // b=[5,6,7,8];
a.push(...b) // a=[1,2,3,4,5,6,7,8]; b=[5,6,7,8]

How to extend an existing JavaScript array with another array, without creating a new array

There doesn't seem to be a way to extend an existing JavaScript array with another array, i.e. to emulate Python's extend method.
I want to achieve the following:
>>> a = [1, 2]
[1, 2]
>>> b = [3, 4, 5]
[3, 4, 5]
>>> SOMETHING HERE
>>> a
[1, 2, 3, 4, 5]
I know there's an a.concat(b) method, but it creates a new array instead of simply extending the first one. I'd like an algorithm that works efficiently when a is significantly larger than b (i.e. one that does not copy a).
Note: This is not a duplicate of How to append something to an array? -- the goal here is to add the whole contents of one array to the other, and to do it "in place", i.e. without copying all elements of the extended array.
The .push method can take multiple arguments. You can use the spread operator to pass all the elements of the second array as arguments to .push:
>>> a.push(...b)
If your browser does not support ECMAScript 6, you can use .apply instead:
>>> a.push.apply(a, b)
Or perhaps, if you think it's clearer:
>>> Array.prototype.push.apply(a,b)
Please note that all these solutions will fail with a stack overflow error if array b is too long (trouble starts at about 100,000 elements, depending on the browser).
If you cannot guarantee that b is short enough, you should use a standard loop-based technique described in the other answer.
Update 2018: A better answer is a newer one of mine: a.push(...b). Don't upvote this one anymore, as it never really answered the question; it was a 2015 hack around what was then the first hit on Google :)
For those that simply searched for "JavaScript array extend" and got here, you can very well use Array.concat.
var a = [1, 2, 3];
a = a.concat([5, 4, 3]);
Concat returns a new array (a copy), which is what the thread starter didn't want. But you might not care (certainly for most kinds of use this will be fine).
There's also some nice ECMAScript 6 sugar for this in the form of the spread operator:
const a = [1, 2, 3];
const b = [...a, 5, 4, 3];
(It also copies.)
You should use a loop-based technique. Other answers on this page that are based on using .apply can fail for large arrays.
A fairly terse loop-based implementation is:
Array.prototype.extend = function (other_array) {
  /* You should include a test to check whether other_array really is an array */
  other_array.forEach(function (v) { this.push(v) }, this);
};
You can then do the following:
var a = [1,2,3];
var b = [5,4,3];
a.extend(b);
DzinX's answer (using push.apply) and other .apply based methods fail when the array that we are appending is large (tests show that for me large is > 150,000 entries approx in Chrome, and > 500,000 entries in Firefox). You can see this error occurring in this jsperf.
An error occurs because the call stack size is exceeded when 'Function.prototype.apply' is called with a large array as the second argument. (MDN has a note on the dangers of exceeding call stack size using Function.prototype.apply - see the section titled "apply and built-in functions".)
For a speed comparison with other answers on this page, check out this jsperf (thanks to EaterOfCode). The loop-based implementation is similar in speed to using Array.push.apply, but tends to be a little slower than Array.slice.apply.
Interestingly, if the array you are appending is sparse, the forEach based method above can take advantage of the sparsity and outperform the .apply based methods; check out this jsperf if you want to test this for yourself.
By the way, do not be tempted (as I was!) to further shorten the forEach implementation to:
Array.prototype.extend = function (array) {
array.forEach(this.push, this);
}
because this produces garbage results! Why? Because Array.prototype.forEach provides three arguments to the function it calls - these are: (element_value, element_index, source_array). All of these will be pushed onto your first array for every iteration of forEach if you use "forEach(this.push, this)"!
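A quick sketch of the garbage that the shortened version produces:
const target = [];
[10, 20].forEach(target.push, target);
// push was called as push(value, index, sourceArray) for every element:
console.log(target); // [10, 0, [10, 20], 20, 1, [10, 20]]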
I feel the most elegant these days is:
arr1.push(...arr2);
The MDN article on the spread operator mentions this nice sugary way in ES2015 (ES6):
A better push
Example: push is often used to push an array to the end of an existing
array. In ES5 this is often done as:
var arr1 = [0, 1, 2];
var arr2 = [3, 4, 5];
// Append all items from arr2 onto arr1
Array.prototype.push.apply(arr1, arr2);
In ES6 with spread this becomes:
var arr1 = [0, 1, 2];
var arr2 = [3, 4, 5];
arr1.push(...arr2);
Do note that arr2 can't be huge (keep it under about 100 000 items), because the call stack overflows, as per jcdude's answer.
Overview
a.push(...b) - limited, fast, modern syntax
a.push.apply(a, b) - limited, fast
a = a.concat(b) - unlimited, slow if a is large
for (let i in b) { a.push(b[i]); } - unlimited, slow if b is large
Each snippet modifies a to be extended with b.
The "limited" snippets pass each array element as an argument, and the maximum number of arguments you can pass to a function is limited. From that link, it seems that a.push(...b) is reliable until there are about 32k elements in b (the size of a does not matter).
Relevant MDN documentation: spread syntax, .apply(), .concat(), .push()
Speed considerations
Every method is fast if both a and b are small, so in most web applications you'll want to use push(...b) and be done with it.
If you're handling more than a few thousand elements, what you want to do depends on the situation:
you're adding a few elements to a large array
→ push(...b) is very fast
you're adding many elements to a large array
→ concat is slightly faster than a loop
you're adding many elements to a small array
→ concat is much faster than a loop
you're usually adding only a few elements to any size array
→ loops are about as fast as the limited methods for small additions, but will never throw an exception even if it is not the most performant when you add many elements
you're writing a wrapper function to always get the maximum performance
→ you'll need to check the lengths of the inputs dynamically and choose the right method, perhaps calling push(...b_part) (with slices of the big b) in a loop; see the sketch just after this list
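As a rough illustration of that wrapper idea, here is a minimal sketch; the chunk size of 10000 is an arbitrary, conservative choice, well below typical argument-count limits:
function extendInPlace(a, b) {
  const CHUNK = 10000; // arbitrary; real argument-count limits are engine-specific
  for (let i = 0; i < b.length; i += CHUNK) {
    a.push(...b.slice(i, i + CHUNK)); // each call stays under the argument limit
  }
  return a;
}

const big = Array.from({ length: 1000000 }, (_, i) => i);
const a = [0];
extendInPlace(a, big);
console.log(a.length); // 1000001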
This surprised me: I thought a=a.concat(b) would be able to do a nice memcpy of b onto a without bothering to do individual extend operations as a.push(...b) would have to do, thus always being the fastest. Instead, a.push(...b) is much, much faster especially when a is large.
The speed of different methods was measured in Firefox 88 on Linux using:
a = [];
for (let i = 0; i < Asize; i++){
a.push(i);
}
b = [];
for (let i = 0; i < Bsize; i++){
b.push({something: i});
}
t=performance.now();
// Code to test
console.log(performance.now() - t);
Parameters and results:
ms | Asize | Bsize | code
----+-------+-------+------------------------------
~0 | any | any | a.push(...b)
~0 | any | any | a.push.apply(a, b)
480 | 10M | 50 | a = a.concat(b)
0 | 10M | 50 | for (let i in b) a.push(b[i])
506 | 10M | 500k | a = a.concat(b)
882 | 10M | 500k | for (let i in b) a.push(b[i])
11 | 10 | 500k | a = a.concat(b)
851 | 10 | 500k | for (let i in b) a.push(b[i])
Note that a Bsize of 500 000 is the largest value accepted by all methods on my system, that's why it is smaller than Asize.
All tests were run multiple times to see if the results are outliers or representative. The fast methods are almost immeasurable in just one run using performance.now(), of course, but since the slow methods are so obvious and the two fast methods both work on the same principle, we needn't bother repeating it a bunch of times to split hairs.
The concat method is always slow if a is large, but the loop is only slow when it has to make a lot of function calls (i.e. when b is large) and doesn't care how large a is. A loop is thus similar to push(...b) or push.apply for small bs, but it keeps working if b gets large; however, when you approach the limit, concat is a bit faster again.
First a few words about apply() in JavaScript to help understand why we use it:
The apply() method calls a function with a given this value, and
arguments provided as an array.
Push expects a list of items to add to the array. The apply() method, however, takes the arguments for the function call as an array. This allows us to easily push the elements of one array onto another with the built-in push() method.
Imagine you have these arrays:
var a = [1, 2, 3, 4];
var b = [5, 6, 7];
and simply do this:
Array.prototype.push.apply(a, b);
The result will be:
a = [1, 2, 3, 4, 5, 6, 7];
The same thing can be done in ES6 using the spread operator ("...") like this:
a.push(...b); //a = [1, 2, 3, 4, 5, 6, 7];
Shorter and better but not fully supported in all browsers at the moment.
Also if you want to move everything from array b to a, emptying b in the process, you can do this:
while(b.length) {
a.push(b.shift());
}
and the result will be as follows:
a = [1, 2, 3, 4, 5, 6, 7];
b = [];
If you want to use jQuery, there is $.merge()
Example:
a = [1, 2];
b = [3, 4, 5];
$.merge(a,b);
Result: a = [1, 2, 3, 4, 5]
I like the a.push.apply(a, b) method described above, and if you want you can always create a library function like this:
Array.prototype.append = function(array)
{
this.push.apply(this, array)
}
and use it like this
a = [1,2]
b = [3,4]
a.append(b)
It is possible to do it using splice():
b.unshift(b.length)                // prepend what will become the deleteCount argument (there is nothing past a.length to delete anyway)
b.unshift(a.length)                // prepend the start index: the current end of a
Array.prototype.splice.apply(a, b) // i.e. a.splice(a.length, deleteCount, ...originalB)
b.shift() // Restore b
b.shift() //
But despite being uglier it is not faster than push.apply, at least not in Firefox 3.0.
As the top-voted answer says, a.push(...b) is probably the correct answer, taking into account the size limit issue.
On the other hand, some of the answers on performance seem out of date.
The numbers (from here, as of 2022-05-20) suggest that push is fastest across the board in 2022. That may change in the future.
Answers that ignore the question (by generating a new array) are missing the point. Lots of code might need/want to modify an array in place, given that there can be other references to the same array:
let a = [1, 2, 3];
let b = [4, 5, 6];
let c = a;
a = a.concat(b); // a and c are no longer referencing the same array
Those other references could be deep in some object, something that was captured in a closure, etc...
As a probably bad design but as an illustration, imagine you had
const carts = [
{ userId: 123, cart: [item1, item2], },
{ userId: 456, cart: [item1, item2, item3], },
];
and a function
function getCartForUser(userId) {
  const user = carts.find(c => c.userId === userId);
  return user && user.cart;
}
Then you want to add items to the cart
const cart = getCartForUser(userId);
if (cart) {
cart.concat(newItems); // FAIL 😢
cart.push(...newItems); // Success! 🤩
}
As an aside, the answers suggesting modifying Array.prototype are arguably bad advice. Changing the native prototypes is basically a landmine in your code. Another implementation may be different from yours, and so it will break your code or you'll break their code expecting the other behavior. This includes if/when a native implementation gets added that clashes with yours. You might say "I know what I'm using so no issue", and that might be true at the moment while you're a single dev, but add a second dev and they can't read your mind. And you are that second dev in a few years, when you've forgotten and then graft some other library (analytics? logging? ...) onto your page and forget the landmine you left in the code.
This is not just theory. There are countless stories on the net of people running into these landmines.
Arguably there are just a few safe uses for modifying a native object's prototype. One is to polyfill an existing and specified implementation in an old browser. In that case the spec is defined, the spec is implemented and shipping in new browsers, and you just want the same behavior in old browsers. That's pretty safe. Pre-patching (spec in progress but not shipping) is arguably not safe, since specs change before shipping.
This solution works for me (using the spread operator of ECMAScript 6):
let array = ['my', 'solution', 'works'];
let newArray = [];
let newArray2 = [];
newArray.push(...array); // Adding to same array
newArray2.push([...array]); // Adding as child/leaf/sub-array
console.log(newArray);
console.log(newArray2);
I'm adding this answer, because despite the question stating clearly without creating a new array, pretty much every answer just ignores it.
Modern JavaScript works well with arrays and similar iterable objects. This makes it possible to implement a version of concat that builds upon that, and logically spans the array data across its parameters.
The example below makes use of the iter-ops library, which provides such logic:
import {pipe, concat} from 'iter-ops';
const i = pipe(
originalArray,
concat(array2, array3, array4, ...)
); //=> Iterable
for(const a of i) {
console.log(a); // iterate over values from all arrays
}
Above, no new array is created. Operator concat will iterate through the original array, then will automatically continue into array2, then array3, and so on, in the specified order.
This is the most efficient way of joining arrays in terms of memory usage.
And if, at the end, you decide to convert it into an actual physical array, you can do so via the spread operator or Array.from:
const fullArray1 = [...i]; // pulls all values from iterable, into a new array
const fullArray2 = Array.from(i); // does the same
Combining the answers...
Array.prototype.extend = function (array) {
  if (array.length < 150000) {
    this.push.apply(this, array);
  } else {
    for (var i = 0, len = array.length; i < len; ++i) {
      this.push(array[i]);
    }
  }
};
You can create a polyfill for extend as I have below. It will add to the array; in-place and return itself, so that you can chain other methods.
if (Array.prototype.extend === undefined) {
  Array.prototype.extend = function (other) {
    this.push.apply(this, arguments.length > 1 ? arguments : other);
    return this;
  };
}

function print() {
  document.body.innerHTML += [].map.call(arguments, function (item) {
    return typeof item === 'object' ? JSON.stringify(item) : item;
  }).join(' ') + '\n';
}
document.body.innerHTML = '';
var a = [1, 2, 3];
var b = [4, 5, 6];
print('Concat');
print('(1)', a.concat(b));
print('(2)', a.concat(b));
print('(3)', a.concat(4, 5, 6));
print('\nExtend');
print('(1)', a.extend(b));
print('(2)', a.extend(b));
print('(3)', a.extend(4, 5, 6));
body {
font-family: monospace;
white-space: pre;
}
Another solution to merge more than two arrays
var a = [1, 2],
b = [3, 4, 5],
c = [6, 7];
// Merge the contents of multiple arrays together into the first array
var mergeArrays = function () {
  var i, len = arguments.length;
  if (len > 1) {
    for (i = 1; i < len; i++) {
      arguments[0].push.apply(arguments[0], arguments[i]);
    }
  }
};
Then call and print as:
mergeArrays(a, b, c);
console.log(a)
Output will be: Array [1, 2, 3, 4, 5, 6, 7]
The answer is super simple.
>>> a = [1, 2]
[1, 2]
>>> b = [3, 4, 5]
[3, 4, 5]
>>> SOMETHING HERE
(The following code will combine the two arrays.)
a = a.concat(b);
>>> a
[1, 2, 3, 4, 5]
Concat acts very much like JavaScript string concatenation: it returns the array you call it on, with the parameter you pass in appended at the end. The crux is that you have to assign the returned value to a variable or it gets lost. So, for example:
a.concat(b); // <-- This accomplishes nothing on its own, since it only returns the combined array without doing anything with it.
Another option, if you have lodash installed:
import { merge } from 'lodash';
var arr1 = merge(arr1, arr2);
(Be aware that lodash's merge combines arrays index by index, overwriting arr1's entries with arr2's, rather than appending arr2's elements to the end.)
Use Array.extend instead of Array.push for > 150,000 records.
if (!Array.prototype.extend) {
  Array.prototype.extend = function (arr) {
    if (!Array.isArray(arr)) {
      return this;
    }
    for (let record of arr) {
      this.push(record);
    }
    return this;
  };
}
You can do that by simply adding new elements to the array with the help of the push() method.
let colors = ["Red", "Blue", "Orange"];
console.log('Array before push: ' + colors);
// append new value to the array
colors.push("Green");
console.log('Array after push : ' + colors);
Another method, this time for appending elements to the beginning of an array, is the unshift() function. It adds one or more elements to the start, shifts the existing elements to higher indexes, and returns the new length of the array:
let colors = ["Red", "Blue", "Orange"];
console.log('Array before unshift: ' + colors);
// append new value to the array
colors.unshift("Black", "Green");
console.log('Array after unshift : ' + colors);
There are other methods too. You can check them out here.
Super simple, and it doesn't rely on the spread operator or apply, if that's an issue.
b.map(x => a.push(x));
After running some performance tests on this: it's terribly slow, but it answers the question with regard to not creating a new array. Concat is significantly faster; even jQuery's $.merge() whoops it.
https://jsperf.com/merge-arrays19b/1
