spread operator vs array.concat() - javascript

What is the difference between the spread operator and array.concat()?
Spread operator
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log([...numbers, ...parts]);
Array.concat() function
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log(numbers.concat(parts));
Both results are the same. So, in what kinds of scenarios would we want to use each of them? And which one is better for performance?

concat and spread are very different when the argument is not an array: concat adds it as a whole, while ... tries to iterate it and fails if it can't. Consider:
a = [1, 2, 3]
x = 'hello';
console.log(a.concat(x)); // [ 1, 2, 3, 'hello' ]
console.log([...a, ...x]); // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
Here, concat treats the string atomically, while ... uses its default iterator, char-by-char.
Another example:
x = 99;
console.log(a.concat(x)); // [1, 2, 3, 99]
console.log([...a, ...x]); // TypeError: x is not iterable
Again, for concat the number is an atom, ... tries to iterate it and fails.
Finally:
function* gen() { yield *'abc' }
console.log(a.concat(gen())); // [ 1, 2, 3, Object [Generator] {} ]
console.log([...a, ...gen()]); // [ 1, 2, 3, 'a', 'b', 'c' ]
concat makes no attempt to iterate the generator and appends it as a whole, while ... nicely fetches all values from it.
To sum it up, when your arguments are possibly non-arrays, the choice between concat and ... depends on whether you want them to be iterated.
The above describes the default behaviour of concat; however, ES6 provides a way to override it with Symbol.isConcatSpreadable. By default, arrays are treated as spreadable and everything else is not. Setting this symbol to true tells concat to iterate the argument, just like ... does:
str = 'hello'
console.log([1,2,3].concat(str)) // [1,2,3, 'hello']
str = new String('hello');
str[Symbol.isConcatSpreadable] = true;
console.log([1,2,3].concat(str)) // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
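The flag also works in the other direction; a small sketch (the nested variable here is just for illustration) where switching it off keeps concat from flattening an array:
const nested = [4, 5];
nested[Symbol.isConcatSpreadable] = false;
console.log([1, 2, 3].concat(nested)); // [ 1, 2, 3, [ 4, 5 ] ]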
Performance-wise concat is faster, probably because it can benefit from array-specific optimizations, while ... has to conform to the common iteration protocol. Timings:
let big = (new Array(1e5)).fill(99);
let i, x;
console.time('concat-big');
for(i = 0; i < 1e2; i++) x = [].concat(big)
console.timeEnd('concat-big');
console.time('spread-big');
for(i = 0; i < 1e2; i++) x = [...big]
console.timeEnd('spread-big');
let a = (new Array(1e3)).fill(99);
let b = (new Array(1e3)).fill(99);
let c = (new Array(1e3)).fill(99);
let d = (new Array(1e3)).fill(99);
console.time('concat-many');
for(i = 0; i < 1e2; i++) x = [1,2,3].concat(a, b, c, d)
console.timeEnd('concat-many');
console.time('spread-many');
for(i = 0; i < 1e2; i++) x = [1,2,3, ...a, ...b, ...c, ...d]
console.timeEnd('spread-many');

Well console.log(['one', 'two', 'three', 'four', 'five']) has the same result as well, so why use either here? :P
In general you would use concat when you have two (or more) arrays from arbitrary sources, and you would use the spread syntax in the array literal if the additional elements that are always part of the array are known before. So if you would have an array literal with concat in your code, just go for spread syntax, and just use concat otherwise:
[...a, ...b] // bad :-(
a.concat(b) // good :-)
[x, y].concat(a) // bad :-(
[x, y, ...a] // good :-)
Also the two alternatives behave quite differently when dealing with non-array values.
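A quick runnable illustration of those two rules, with throwaway values just for the example:
const a = [1, 2], b = [3, 4];
const x = 0, y = 9;
console.log(a.concat(b)); // [ 1, 2, 3, 4 ] - two arrays from arbitrary sources
console.log([x, y, ...a]); // [ 0, 9, 1, 2 ] - known leading elements plus an array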

I am replying just to the performance question, since there are already good answers regarding the scenarios. I wrote a test and executed it on the most recent browsers. Below are the results and the code.
/*
* Performance results.
* Browser Spread syntax concat method
* --------------------------------------------------
* Chrome 75 626.43ms 235.13ms
* Firefox 68 928.40ms 821.30ms
* Safari 12 165.44ms 152.04ms
* Edge 18 1784.72ms 703.41ms
* Opera 62 590.10ms 213.45ms
* --------------------------------------------------
*/
Below is the code I wrote and used.
const array1 = [];
const array2 = [];
const mergeCount = 50;
let spreadTime = 0;
let concatTime = 0;
// Used to populate the arrays to merge, with 10,000,000 elements each.
for (let i = 0; i < 10000000; ++i) {
array1.push(i);
array2.push(i);
}
// The spread syntax performance test.
for (let i = 0; i < mergeCount; ++i) {
const startTime = performance.now();
const array3 = [ ...array1, ...array2 ];
spreadTime += performance.now() - startTime;
}
// The concat performance test.
for (let i = 0; i < mergeCount; ++i) {
const startTime = performance.now();
const array3 = array1.concat(array2);
concatTime += performance.now() - startTime;
}
console.log(spreadTime / mergeCount);
console.log(concatTime / mergeCount);

The one difference I think is valid is that spreading a large array into a function call (for example newArray.push(...someArray)) can give you the error Maximum call stack size exceeded, because every element becomes a separate function argument. You can avoid that by using concat.
var someArray = new Array(600000);
var newArray = [];
var tempArray = [];
someArray.fill("foo");
try {
newArray.push(...someArray);
} catch (e) {
console.log("Using spread operator:", e.message)
}
tempArray = newArray.concat(someArray);
console.log("Using concat function:", tempArray.length)
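Since the limit comes from spreading into a function call's argument list, pushing in chunks also avoids it; this is only a sketch, and the chunk size below is an arbitrary choice:
var bigArray = new Array(600000).fill("foo");
var out = [];
var chunkSize = 10000; // arbitrary; just needs to stay below the engine's argument limit
for (var i = 0; i < bigArray.length; i += chunkSize) {
    out.push(...bigArray.slice(i, i + chunkSize));
}
console.log("Using chunked push:", out.length); // 600000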

There is one very important difference between concat and push in that the former does not mutate the underlying array, requiring you to assign the result to the same or different array:
let things = ['a', 'b', 'c'];
let moreThings = ['d', 'e'];
things.concat(moreThings);
console.log(things); // [ 'a', 'b', 'c' ]
things.push(...moreThings);
console.log(things); // [ 'a', 'b', 'c', 'd', 'e' ]
I've seen bugs caused by the assumption that concat changes the array (talking for a friend ;).
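If you do want things to hold the combined elements while using concat, assign the result back; a minimal illustration:
let things = ['a', 'b', 'c'];
things = things.concat(['d', 'e']);
console.log(things); // [ 'a', 'b', 'c', 'd', 'e' ]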

Update:
Concat is now consistently faster than spread in this benchmark. The following benchmark joins both small and large arrays: https://jsbench.me/nyla6xchf4/1
// preparation
const a = Array.from({length: 1000}).map((_, i)=>`${i}`);
const b = Array.from({length: 2000}).map((_, i)=>`${i}`);
const aSmall = ['a', 'b', 'c', 'd'];
const bSmall = ['e', 'f', 'g', 'h', 'i'];
// joining large arrays
const c = [...a, ...b];
// vs
const c = a.concat(b);
// joining small arrays
const c = [...aSmall, ...bSmall];
// vs
const c = aSmall.concat(bSmall);
Previous:
Although some of the replies are correct when it comes to performance on big arrays, the performance is quite different when you are dealing with small arrays.
You can check the results for yourself at https://jsperf.com/spread-vs-concat-size-agnostic.
As you can see, spread is 50% faster for smaller arrays, while concat is multiple times faster on large arrays.

The answer by @georg was helpful for the comparison. I was also curious how .flat() would compare, and it was by far the worst. Don't use .flat() if speed is a priority. (Something I wasn't aware of until now.)
let big = new Array(1e5).fill(99);
let i, x;
console.time("concat-big");
for (i = 0; i < 1e2; i++) x = [].concat(big);
console.timeEnd("concat-big");
console.time("spread-big");
for (i = 0; i < 1e2; i++) x = [...big];
console.timeEnd("spread-big");
console.time("flat-big");
for (i = 0; i < 1e2; i++) x = [[], big].flat();
console.timeEnd("flat-big");
let a = new Array(1e3).fill(99);
let b = new Array(1e3).fill(99);
let c = new Array(1e3).fill(99);
let d = new Array(1e3).fill(99);
console.time("concat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3].concat(a, b, c, d);
console.timeEnd("concat-many");
console.time("spread-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, ...a, ...b, ...c, ...d];
console.timeEnd("spread-many");
console.time("flat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, a, b, c, d].flat();
console.timeEnd("flat-many");

Related

React js Array and Array Reverse issue without mutating original array [duplicate] (same question as "Reverse array in Javascript without mutating original array" below)

Eliminate duplicates of several arrays

I have 3 arrays:
array1 = [ 'A', 'B', 'A', 'B']
array2 = [ 5, 5, 7, 5]
array3 = [true,true,true,true]
I was wondering if there is any easy way (maybe with lodash) to eliminate the duplicates and end with this:
array1 = [ 'A', 'B', 'A']
array2 = [ 5, 5, 7]
array3 = [true,true,true]
I know I can do a function and compare the previous value, but is there a more clever way to do it?
Update
Please note that I don't need to eliminate the duplicates of each array.
What I'm looking for is a way to eliminate the duplicates "vertically".
Update 2
Please note that each "column" is a record.
record1 = ['A',5,true]
record2 = ['B',5,true]
record3 = ['A',7,true]
record4 = ['B',5,true]
TL;DR
const records = array1.map((a, i) => [a, array2[i], array3[i]]);
const index = {};
records.filter(column => {
const key = JSON.stringify(column);
return key in index ? false : index[key] = true;
});
Huh?
There are a lot of ways to solve this, with varying degrees of efficiency, and the best solution will depend on the size of your data. A simple but naïve solution iterates over each "column" and checks all of the preceding columns for equality. It looks like this:
const array1 = [ 'A', 'B', 'A', 'B'];
const array2 = [ 5, 5, 7, 5];
const array3 = [true,true,true,true];
const newArray1 = array1.slice(0,1); // column 0 is never duplicate
const newArray2 = array2.slice(0,1);
const newArray3 = array3.slice(0,1);
// loop over columns starting with index 1
outer: for (let i = 1; i < array1.length; i++) {
const a = array1[i];
const b = array2[i];
const c = array3[i];
// check all preceding columns for equality
for (let j = 0; j < i; j++) {
if (a === array1[j] && b === array2[j] && c === array3[j]) {
// duplicate; continue at top of outer loop
continue outer;
}
}
// not a duplicate; add to new arrays
newArray1.push(a);
newArray2.push(b);
newArray3.push(c);
}
console.log(newArray1);
console.log(newArray2);
console.log(newArray3);
As you can see, we have to check each row within each column for equality, every time. If you're curious, the complexity of this is O(n(n+1)/2) (technically O(mn(n+1)/2), where m is 3 for three columns).
For larger data sets it's advantageous to keep track of values you've already seen in a data structure that's quick to access: a hash, a.k.a. a JavaScript object. Since all of your values are primitive, a quick way to construct a key is JSON.stringify. Some might consider this a "hack" (and it's important to note that it will fail with values that can't be represented in JSON, e.g. Infinity or NaN), but it's a fast and easy one with data this simple.
const array1 = ['A', 'B', 'A', 'B'];
const array2 = [5, 5, 7, 5];
const array3 = [true, true, true, true];
const newArray1 = [];
const newArray2 = [];
const newArray3 = [];
const index = {};
for (let i = 0; i < array1.length; i++) {
const a = array1[i];
const b = array2[i];
const c = array3[i];
const key = JSON.stringify([a,b,c]);
if (key in index) {
// duplicate; skip to top of loop
continue;
}
// not a duplicate; record in index and add to new arrays
index[key] = true;
newArray1.push(a);
newArray2.push(b);
newArray3.push(c);
}
console.log(newArray1);
console.log(newArray2);
console.log(newArray3);
The complexity of this is O(n), or maybe O(2mn) where m, again, is 3 for three columns, and the 2 is another m to very roughly account for the cost of JSON.stringify. (Figuring out the cost of hash access is left as an exercise for the pedants among us; I'm content to call it O(1).)
That's still pretty verbose. Part of the reason is that using three different variables for the data (which is really a single "table") leads to a lot of repetition. We can preprocess the data to make it easier to deal with. Once it's "transposed" into a single two-dimensional array, we can use Array.prototype.filter with the key technique from above, for some very terse code:
const array1 = ['A', 'B', 'A', 'B'];
const array2 = [5, 5, 7, 5];
const array3 = [true, true, true, true];
// turn "columns" into "rows" of a 2D array
const records = array1.map((a, i) => [a, array2[i], array3[i]]);
const index = {};
const newData = records.filter(column => {
const key = JSON.stringify(column);
return key in index ? false : index[key] = true;
});
console.log(newData);
Of course, pre-processing isn't free, so this code isn't any more performant than the more verbose version; you'll have to decide how important that is to you. If you want you can now extract the columns from newData into three variables (newData.forEach(([a,b,c]) => { newArray1.push(a); newArray2.push(b); /* ... */ })), but for many purposes the "transposed" 2D array will be easier to work with.
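For reference, a minimal sketch of that extraction step, assuming the newData variable from the snippet above:
const newArray1 = [], newArray2 = [], newArray3 = [];
newData.forEach(([a, b, c]) => {
    newArray1.push(a);
    newArray2.push(b);
    newArray3.push(c);
});
console.log(newArray1); // [ 'A', 'B', 'A' ]
console.log(newArray2); // [ 5, 5, 7 ]
console.log(newArray3); // [ true, true, true ]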
You can use ES6 Set https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set
Set -> lets you store unique values of any type, whether primitive values or object references.
and then convert back to an array
check this snippet
const array1 = ['A','B','A','B']
const array2 = [5,5,7,5]
const array3 = [true,true,true,true]
const uniqA1= new Set(array1)
const uniqA2= new Set(array2)
const uniqA3= new Set(array3)
console.log(Array.from(uniqA1))
console.log(Array.from(uniqA2))
console.log(Array.from(uniqA3))
Hope it helps
You need to find duplicate elements with same indexes in all arrays and then filter out those elements.
var array1 = ['A', 'B', 'A', 'B'],
array2 = [5, 5, 7, 5],
array3 = [true, true, true, true];
var dupes = []
var arrays = [array1, array2, array3];
arrays.forEach(function(arr, i) {
arr.forEach((e, j) => !this[e] ? this[e] = true : dupes[i] = (dupes[i] || []).concat(j))
}, {})
var index = dupes[0].filter(e => dupes.every(a => a.includes(e)))
var result = arrays.map(e => e.filter((a, i) => !index.includes(i)))
console.log(result)
You're going to need a couple of helper functions (lodash provides them also):
let zip = (...arys) => arys[0].map((_, i) => arys.map(a => a[i]));
let uniq = (ary, key) => uniq2(ary, ary.map(key), new Set);
let uniq2 = (ary, keys, set) => ary.filter((_, i) => !set.has(keys[i]) && set.add(keys[i]))
// test
var array1 = ['A', 'B', 'A', 'B'];
var array2 = [5, 5, 7, 5];
var array3 = [true, true, true, true];
var [x, y, z] = zip(
...uniq(
zip(array1, array2, array3),
JSON.stringify
)
);
console.log(x, y, z)
Another way, with filter():
array1 = ['A', 'B', 'A', 'B'];
array2 = [5, 5, 7, 5];
array3 = [true, true, true, true];
uniqueArray1 = array1.filter(function(item, pos) {
return array1.indexOf(item) == pos;
})
uniqueArray2 = array2.filter(function(item, pos) {
return array2.indexOf(item) == pos;
})
uniqueArray3 = array3.filter(function(item, pos) {
return array3.indexOf(item) == pos;
})
console.log(uniqueArray1);
console.log(uniqueArray2);
console.log(uniqueArray3);
One method I can think of is using an object to keep track, which will also coincidentally remove any duplicates as keys have to be unique. The only thing is I can't think of how to extract it back into an array for now. I will think about it tomorrow.
This utilizes jquery for deep cloning. If you want it only in vanilla javascript, you could probably just implement a deep clone function.
var array1 = [ 'A', 'B', 'A', 'B'];
var array2 = [ 5, 5, 7, 5];
var array3 = [true,true,true,true];
all_arrays = [array1, array2, array3];
let obj = {};
for (let i = 0; i < all_arrays[0].length; i++)
{
let new_obj = recursive_objects(all_arrays, 0, i)
$.extend(true, obj, new_obj);
}
console.log(obj);
function return_array(array, temp_obj)
{
let keys = Object.keys(temp_obj);
for (let key of keys)
{
}
}
function recursive_objects(arrays, arrays_index, index)
{
let obj = {}
if (arrays_index < arrays.length)
{
obj[arrays[arrays_index][index]] = recursive_objects(arrays, ++arrays_index, index);
}
return obj;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
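For what it's worth, one way to finish the extraction step mentioned above without jQuery is to key the object by the whole record and then split the keys back out. This is only a sketch of that idea (a flat JSON key instead of nested objects), and it assumes the values survive a JSON round trip:
const arrays = [array1, array2, array3];
const seen = {};
for (let i = 0; i < arrays[0].length; i++) {
    seen[JSON.stringify(arrays.map(arr => arr[i]))] = true; // duplicate records collapse onto one key
}
const result = arrays.map(() => []);
for (const key of Object.keys(seen)) {
    JSON.parse(key).forEach((value, col) => result[col].push(value));
}
console.log(result); // [ [ 'A', 'B', 'A' ], [ 5, 5, 7 ], [ true, true, true ] ]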

Fast group items in an array in javascript

I have an array which is returned from an API in the format of [a, b, c, d, e, f ... ], where a,c,e and b,d,f are of the same type, respectively. Now I want to group the array into [ [a,b], [c,d], [e,f] ...]. It's fairly easy by creating a new array, but the array is large so that could be slow.
So I'm wondering if there're any methods that can do it in-place?
Do you want it in chunks of 2?
var o = ['a', 'b', 'c', 'd', 'e', 'f'],
size = 2, i, ar = []; // The new array
for (i = 0; i < o.length; i += size) ar.push(o.slice(i,i + size));
Now, ar is:
[
['a', 'b'],
['c', 'd'],
['e', 'f']
]
No matter how you do it, there is always going to be some looping. The engine has to go through all the array elements to make the new array.
Speed Tests
So I'll create an array with this:
var l = 10000000, // The length
o = [], j;
for (j = 0; j < l; j += 1) o.push(j);
So that will make an array with l items now to test the speed:
var start = performance.now(),
size = 2, ar = [];
for (i = 0; i < o.length; i += size) ar.push(o.slice(i,i + size));
console.log(performance.now() - start);
Tests:
100 Thousand: 0.092909533996135 seconds
1 Million: 0.359059600101318 seconds
10 Million: 10.138852232019417 seconds
The 10 million time might surprise you, but if you have an array that big you have bigger problems, such as memory issues. And if this array is coming from a server you are probably going to be putting excessive strain on the server.
This is wanton use of a library even though op is concerned about performance, but I like using lodash/underscore for easily-comprehensible code:
_.partition('a,b,c,d,e,f'.split(','), function(_, idx) {return !(idx % 2);})
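If you want the adjacent pairs the question asks for, the same libraries also provide _.chunk; a quick sketch:
_.chunk('a,b,c,d,e,f'.split(','), 2); // [ ['a','b'], ['c','d'], ['e','f'] ]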
An in place solution is to just iterate as normal, building arrays and 'skipping' elements by splicing them before you reach them.
DEMO
var arr = ['a', 'b', 'c', 'd', 'e', 'f'];
function compact (arr) {
for(var i = 0; i < arr.length; i++) {
arr[i] = [arr[i], arr[i + 1]];
arr.splice(i + 1, 1);
}
return arr; // debug only
}
console.log(compact(arr.slice()));
// >> [["a", "b"], ["c", "d"], ["e", "f"]]
Untested as far as performance goes. I would agree with the comments that it's most likely slower to manipulate the array in place, as opposed to building a new array.

Reverse array in Javascript without mutating original array

Array.prototype.reverse reverses the contents of an array in place (with mutation)...
Is there a similarly simple strategy for reversing an array without altering the contents of the original array (without mutation)?
You can use slice() to make a copy then reverse() it
var newarray = array.slice().reverse();
var array = ['a', 'b', 'c', 'd', 'e'];
var newarray = array.slice().reverse();
console.log('a', array);
console.log('na', newarray);
In ES6:
const newArray = [...array].reverse()
Another ES6 variant:
We can also use .reduceRight() to create a reversed array without actually reversing it.
let A = ['a', 'b', 'c', 'd', 'e', 'f'];
let B = A.reduceRight((a, c) => (a.push(c), a), []);
console.log(B);
Useful Resources:
Array.prototype.reduceRight()
Arrow Functions
Comma Operator
const originalArray = ['a', 'b', 'c', 'd', 'e', 'f'];
const newArray = Array.from(originalArray).reverse();
console.log(newArray);
There are multiple ways of reversing an array without modifying the original. Three of them are
var array = [1,2,3,4,5,6,7,8,9,10];
// Using slice
var reverseArray1 = array.slice().reverse();
// Using spread syntax
var reverseArray2 = [...array].reverse();
// Using a for loop
var reverseArray3 = [];
for(var i = array.length-1; i>=0; i--) {
reverseArray3.push(array[i]);
}
Performance test http://jsben.ch/guftu
Try this recursive solution:
const reverse = ([head, ...tail]) =>
tail.length === 0
? [head] // Base case -- cannot reverse a single element.
: [...reverse(tail), head] // Recursive case
reverse([1]); // [1]
reverse([1,2,3]); // [3,2,1]
reverse('hello').join(''); // 'olleh' -- Strings too!
An ES6 alternative using .reduce() and spreading.
const foo = [1, 2, 3, 4];
const bar = foo.reduce((acc, b) => ([b, ...acc]), []);
Basically it creates a new array with the next element of foo first, spreading the previously accumulated array after it on each iteration.
[]
[1] => [1]
[2, ...[1]] => [2, 1]
[3, ...[2, 1]] => [3, 2, 1]
[4, ...[3, 2, 1]] => [4, 3, 2, 1]
Alternatively .reduceRight() as mentioned above here, but without the .push() mutation.
const baz = foo.reduceRight((acc, b) => ([...acc, b]), []);
const arrayCopy = Object.assign([], array).reverse()
This solution:
-Successfully copies the array
-Doesn't mutate the original array
-Makes it clear what it is doing
There's a new tc39 proposal, which adds a toReversed method to Array that returns a copy of the array and doesn't modify the original.
Example from the proposal:
const sequence = [1, 2, 3];
sequence.toReversed(); // => [3, 2, 1]
sequence; // => [1, 2, 3]
As it's currently in stage 3, it will likely be implemented in browser engines soon, but in the meantime a polyfill is available here or in core-js.
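Until it lands everywhere, a minimal sketch of such a shim (not the official polyfill, just the obvious fallback for plain arrays):
if (!Array.prototype.toReversed) {
    Array.prototype.toReversed = function () {
        return this.slice().reverse(); // copy first, then reverse the copy
    };
}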
Reversing in place with variable swap just for demonstrative purposes (but you need a copy if you don't want to mutate)
const myArr = ["a", "b", "c", "d"];
const copy = [...myArr];
for (let i = 0; i < (copy.length - 1) / 2; i++) {
const lastIndex = copy.length - 1 - i;
[copy[i], copy[lastIndex]] = [copy[lastIndex], copy[i]]
}
Jumping into 2022, and here's the most efficient solution today (highest-performing, and no extra memory usage).
For any ArrayLike type, the fastest way to reverse is logically, by wrapping it into a reversed iterable:
function reverse<T>(input: ArrayLike<T>): Iterable<T> {
return {
[Symbol.iterator](): Iterator<T> {
let i = input.length;
return {
next(): IteratorResult<T> {
return i
? {value: input[--i], done: false}
: {value: undefined, done: true};
},
};
},
};
}
This way you can reverse-iterate through any Array, string or Buffer, without any extra copy or processing for the reversed data:
for(const a of reverse([1, 2, 3])) {
console.log(a); //=> 3 2 1
}
It is the fastest approach, because you do not copy the data, and do no processing at all, you just reverse it logically.
Is there a similarly simple strategy for reversing an array without altering the contents of the original array (without mutation) ?
Yes, this can be achieved with the new to[Operation] methods (such as toReversed), which return a new collection with the operation applied. (This is currently at stage 3 and will be available soon.)
Usage looks like:
const arr = [5, 4, 3, 2, 1];
const reversedArr = arr.toReversed();
console.log(arr); // [5, 4, 3, 2, 1]
console.log(reversedArr); // [1, 2, 3, 4, 5]
Not the best solution but it works
Array.prototype.myNonMutableReverse = function () {
const reversedArr = [];
for (let i = this.length - 1; i >= 0; i--) reversedArr.push(this[i]);
return reversedArr;
};
const a = [1, 2, 3, 4, 5, 6, 7, 8];
const b = a.myNonMutableReverse();
console.log("a",a);
console.log("////////")
console.log("b",b);
In plain Javascript:
function reverseArray(arr) {
var newArray = [];
for (let i = arr.length - 1; i >= 0; i--) {
newArray.push(arr[i]);
}
return newArray;
}
es6:
const reverseArr = [1,2,3,4].sort(()=>1)

Javascript equivalent of Python's zip function

Is there a javascript equivalent of Python's zip function? That is, given multiple arrays of equal lengths create an array of pairs.
For instance, if I have three arrays that look like this:
var array1 = [1, 2, 3];
var array2 = ['a','b','c'];
var array3 = [4, 5, 6];
The output array should be:
var outputArray = [[1,'a',4], [2,'b',5], [3,'c',6]]
2016 update:
Here's a snazzier ECMAScript 6 version:
zip= rows=>rows[0].map((_,c)=>rows.map(row=>row[c]))
Illustration (equivalent to Python's zip(*args)):
> zip([['row0col0', 'row0col1', 'row0col2'],
['row1col0', 'row1col1', 'row1col2']]);
[["row0col0","row1col0"],
["row0col1","row1col1"],
["row0col2","row1col2"]]
(and FizzyTea points out that ES6 has variadic argument syntax, so the following function definition will act like python, but see below for disclaimer... this will not be its own inverse so zip(zip(x)) will not equal x; though as Matt Kramer points out zip(...zip(...x))==x (like in regular python zip(*zip(*x))==x))
Alternative definition (equivalent to Python's zip):
> zip = (...rows) => [...rows[0]].map((_,c) => rows.map(row => row[c]))
> zip( ['row0col0', 'row0col1', 'row0col2'] ,
['row1col0', 'row1col1', 'row1col2'] );
// note zip(row0,row1), not zip(matrix)
same answer as above
(Do note that the ... syntax may have performance issues at this time, and possibly in the future, so if you use the second answer with variadic arguments, you may want to perf test it. That said it's been quite a while since it's been in the standard.)
Make sure to note the addendum if you wish to use this on strings (perhaps there's a better way to do it now with es6 iterables).
Here's a oneliner:
function zip(arrays) {
return arrays[0].map(function(_,i){
return arrays.map(function(array){return array[i]})
});
}
// > zip([[1,2],[11,22],[111,222]])
// [[1,11,111],[2,22,222]]
// If you believe the following is a valid return value:
// > zip([])
// []
// then you can special-case it, or just do
// return arrays.length==0 ? [] : arrays[0].map(...)
The above assumes that the arrays are of equal size, as they should be. It also assumes you pass in a single list of lists argument, unlike Python's version where the argument list is variadic. If you want all of these "features", see below. It takes just about 2 extra lines of code.
The following will mimic Python's zip behavior on edge cases where the arrays are not of equal size, silently pretending the longer parts of arrays don't exist:
function zip() {
var args = [].slice.call(arguments);
var shortest = args.length==0 ? [] : args.reduce(function(a,b){
return a.length<b.length ? a : b
});
return shortest.map(function(_,i){
return args.map(function(array){return array[i]})
});
}
// > zip([1,2],[11,22],[111,222,333])
// [[1,11,111],[2,22,222]]
// > zip()
// []
This will mimic Python's itertools.zip_longest behavior, inserting undefined where arrays are not defined:
function zip() {
var args = [].slice.call(arguments);
var longest = args.reduce(function(a,b){
return a.length>b.length ? a : b
}, []);
return longest.map(function(_,i){
return args.map(function(array){return array[i]})
});
}
// > zip([1,2],[11,22],[111,222,333])
// [[1,11,111],[2,22,222],[undefined,undefined,333]]
// > zip()
// []
If you use these last two version (variadic aka. multiple-argument versions), then zip is no longer its own inverse. To mimic the zip(*[...]) idiom from Python, you will need to do zip.apply(this, [...]) when you want to invert the zip function or if you want to similarly have a variable number of lists as input.
addendum:
To make this handle any iterable (e.g. in Python you can use zip on strings, ranges, map objects, etc.), you could define the following:
function iterView(iterable) {
// returns an array equivalent to the iterable
return [...iterable];
}
However if you write zip in the following way, even that won't be necessary:
function zip(arrays) {
return Array.apply(null,Array(arrays[0].length)).map(function(_,i){
return arrays.map(function(array){return array[i]})
});
}
Demo:
> JSON.stringify( zip(['abcde',[1,2,3,4,5]]) )
[["a",1],["b",2],["c",3],["d",4],["e",5]]
(Or you could use a range(...) Python-style function if you've written one already. Eventually you will be able to use ECMAScript array comprehensions or generators.)
Check out the library Underscore.
Underscore provides over 100 functions that support both your favorite workaday functional helpers: map, filter, invoke, as well as more specialized goodies: function binding, javascript templating, creating quick indexes, deep equality testing, and so on.
– Say the people who made it
I recently started using it specifically for the zip() function and it has left a great first impression. I am using jQuery and CoffeeScript, and it just goes perfectly with them. Underscore picks up right where they leave off and so far it hasn't let me down. Oh by the way, it's only 3kb minified.
Check it out:
_.zip(['moe', 'larry', 'curly'], [30, 40, 50], [true, false, false]);
// returns [["moe", 30, true], ["larry", 40, false], ["curly", 50, false]]
Modern ES6 example with a generator:
function *zip (...iterables){
let iterators = iterables.map(i => i[Symbol.iterator]() )
while (true) {
let results = iterators.map(iter => iter.next() )
if (results.some(res => res.done) ) return
else yield results.map(res => res.value )
}
}
First, we get the list of iterables as iterators. This usually happens transparently, but here we do it explicitly, as we yield step by step until one of them is exhausted. We check whether any of the results in the given array is exhausted (using the .some() method), and if so, we return, which ends the while loop.
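A quick usage example, spreading the generator into an array (the inputs here are made up):
console.log([...zip([1, 2, 3], ['a', 'b', 'c', 'd'])]);
// [ [ 1, 'a' ], [ 2, 'b' ], [ 3, 'c' ] ] - stops at the shortest input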
In addition to ninjagecko's excellent and comprehensive answer, all it takes to zip two JS-arrays into a "tuple-mimic" is:
//Arrays: aIn, aOut
Array.prototype.map.call( aIn, function(e,i){return [e, aOut[i]];})
Explanation:
Since Javascript doesn't have a tuple type, functions for tuples, lists and sets weren't a high priority in the language specification.
Otherwise, similar behavior is accessible in a straightforward manner via Array map in JS > 1.6 (map was often implemented by JS engine makers in many > JS 1.4 engines, despite not being specified).
The major difference to Python's zip, izip, ... results from map's functional style, since map requires a function argument. Additionally it is a method of the Array instance. One may use Array.prototype.map instead, if an extra declaration for the input is an issue.
Example:
_tarrin = [0..constructor, function(){}, false, undefined, '', 100, 123.324,
2343243243242343242354365476453654625345345, 'sdf23423dsfsdf',
'sdf2324.234dfs','234,234fsf','100,100','100.100']
_parseInt = function(i){return parseInt(i);}
_tarrout = _tarrin.map(_parseInt)
_tarrin.map(function(e,i,a){return [e, _tarrout[i]]})
Result:
//'('+_tarrin.map(function(e,i,a){return [e, _tarrout[i]]}).join('),\n(')+')'
>>
(function Number() { [native code] },NaN),
(function (){},NaN),
(false,NaN),
(,NaN),
(,NaN),
(100,100),
(123.324,123),
(2.3432432432423434e+42,2),
(sdf23423dsfsdf,NaN),
(sdf2324.234dfs,NaN),
(234,234fsf,234),
(100,100,100),
(100.100,100)
Related Performance:
Using map over for-loops:
See: What is the most efficient way of merging [1,2] and [7,8] into [[1,7], [2,8]]
Note: base types such as false and undefined do not possess a prototypal object hierarchy and thus do not expose a toString function. Hence these are shown as empty in the output.
As parseInt's second argument is the radix to convert the number to, and since map passes the index as the second argument to its callback, a wrapper function is used.
Along with other Python-like functions, pythonic offers a zip function, with the extra benefit of returning a lazily evaluated Iterator, similar to the behaviour of its Python counterpart:
import {zip, zipLongest} from 'pythonic';
const arr1 = ['a', 'b'];
const arr2 = ['c', 'd', 'e'];
for (const [first, second] of zip(arr1, arr2))
console.log(`first: ${first}, second: ${second}`);
// first: a, second: c
// first: b, second: d
for (const [first, second] of zipLongest(arr1, arr2))
console.log(`first: ${first}, second: ${second}`);
// first: a, second: c
// first: b, second: d
// first: undefined, second: e
// unzip
const [arrayFirst, arraySecond] = [...zip(...zip(arr1, arr2))];
Disclosure I'm author and maintainer of Pythonic
Python has two function to zip sequences: zip and itertools.zip_longest. An implementation in Javascript for the same functionality is this:
Implementation of Python's zip in JS/ES6
const zip = (...arrays) => {
const length = Math.min(...arrays.map(arr => arr.length));
return Array.from({ length }, (value, index) => arrays.map((array => array[index])));
};
Results in:
console.log(zip(
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111],
[11, 221]
));
[ [ 1, 667, 111, 11 ] ]
console.log(zip(
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111, 212, 323, 433, '1111']
));
[ [ 1, 667, 111 ], [ 2, false, 212 ], [ 3, -378, 323 ], [ 'a', '337', 433 ] ]
console.log(zip(
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111],
[]
));
[]
Implementation of Python's zip_longest in JS/ES6
(https://docs.python.org/3.5/library/itertools.html?highlight=zip_longest#itertools.zip_longest)
const zipLongest = (placeholder = undefined, ...arrays) => {
const length = Math.max(...arrays.map(arr => arr.length));
return Array.from(
{ length }, (value, index) => arrays.map(
array => array.length - 1 >= index ? array[index] : placeholder
)
);
};
Results:
console.log(zipLongest(
undefined,
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111],
[]
));
[ [ 1, 667, 111, undefined ], [ 2, false, undefined, undefined ], [ 3, -378, undefined, undefined ], [ 'a', '337', undefined, undefined ] ]
console.log(zipLongest(
null,
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111],
[]
));
[ [ 1, 667, 111, null ], [ 2, false, null, null ], [ 3, -378, null, null ], [ 'a', '337', null, null ] ]
console.log(zipLongest(
'Is None',
[1, 2, 3, 'a'],
[667, false, -378, '337'],
[111],
[]
));
[ [ 1, 667, 111, 'Is None' ], [ 2, false, 'Is None', 'Is None' ], [ 3, -378, 'Is None', 'Is None' ], [ 'a', '337', 'Is None', 'Is None' ] ]
You can make a utility function using ES6.
console.json = obj => console.log(JSON.stringify(obj));
const zip = (arr, ...arrs) =>
arr.map((val, i) => arrs.reduce((a, arr) => [...a, arr[i]], [val]));
// Example
const array1 = [1, 2, 3];
const array2 = ['a','b','c'];
const array3 = [4, 5, 6];
console.json(zip(array1, array2)); // [[1,"a"],[2,"b"],[3,"c"]]
console.json(zip(array1, array2, array3)); // [[1,"a",4],[2,"b",5],[3,"c",6]]
However, in the above solution the length of the first array defines the length of the output array.
Here is a solution in which you have more control over it. It's a bit complex but worth it.
function _zip(func, args) {
const iterators = args.map(arr => arr[Symbol.iterator]());
let iterateInstances = iterators.map((i) => i.next());
const ret = [];
while(iterateInstances[func](it => !it.done)) {
ret.push(iterateInstances.map(it => it.value));
iterateInstances = iterators.map((i) => i.next());
}
return ret;
}
const array1 = [1, 2, 3];
const array2 = ['a','b','c'];
const array3 = [4, 5, 6];
const zipShort = (...args) => _zip('every', args);
const zipLong = (...args) => _zip('some', args);
console.log(zipShort(array1, array2, array3)) // [[1, 'a', 4], [2, 'b', 5], [3, 'c', 6]]
console.log(zipLong([1,2,3], [4,5,6, 7]))
// [
// [ 1, 4 ],
// [ 2, 5 ],
// [ 3, 6 ],
// [ undefined, 7 ]]
1. Npm Module: zip-array
I found an npm module that can be used as a javascript version of python zip:
zip-array - A javascript equivalent of Python's zip function. Merges together the values of each of the arrays.
https://www.npmjs.com/package/zip-array
2. tf.data.zip() in Tensorflow.js
Another alternate choice is for Tensorflow.js users: if you need a zip function in python to work with tensorflow datasets in Javascript, you can use tf.data.zip() in Tensorflow.js.
tf.data.zip() in Tensorflow.js documented at here
Original answer (see update below)
I modified flm's nifty answer to take an arbitrary number of arrays:
function* zip(arrays, i = 0) {
while (i<Math.min(...arrays.map(({length})=>length))) {
yield arrays.map((arr, j) => arr[j < arrays.length - 1 ? i : i++])
}
}
Updated answer
As pointed out by Tom Pohl, this function can't deal with arrays containing falsy values. Here is an updated/improved version that can deal with any types and also unequal-length arrays:
function* zip(arrays, i = 0) {
while (i<Math.min(...arrays.map(arr=>arr.length))) {
yield arrays.map((arr, j) => arr[j < arrays.length - 1 ? i : i++])
}
}
const arr1 = [false,0,1,2]
const arr2 = [100,null,99,98,97]
const arr3 = [7,8,undefined,"monkey","banana"]
console.log(...zip([arr1,arr2,arr3]))
Not built-in to Javascript itself. Some of the common Javascript frameworks (such as Prototype) provide an implementation, or you can write your own.
Like @Brandon, I recommend Underscore's zip function. However, it acts like zip_longest, appending undefined values as needed to return something the length of the longest input.
I used the mixin method to extend underscore with a zipShortest, which acts like Python's zip, based off of the library's own source for zip.
You can add the following to your common JS code and then call it as if it were part of underscore: _.zipShortest([1,2,3], ['a']) returns [[1, 'a']], for example.
// Underscore library addition - zip like python does, dominated by the shortest list
// The default injects undefineds to match the length of the longest list.
_.mixin({
zipShortest : function() {
var args = Array.prototype.slice.call(arguments);
var length = _.min(_.pluck(args, 'length')); // changed max to min
var results = new Array(length);
for (var i = 0; i < length; i++) {
results[i] = _.pluck(args, "" + i);
}
return results;
}});
A variation of the lazy generator solution:
function* iter(it) {
yield* it;
}
function* zip(...its) {
its = its.map(iter);
while (true) {
let rs = its.map(it => it.next());
if (rs.some(r => r.done))
return;
yield rs.map(r => r.value);
}
}
for (let r of zip([1,2,3], [4,5,6,7], [8,9,0,11,22]))
console.log(r.join())
// the only change for "longest" is some -> every
function* zipLongest(...its) {
its = its.map(iter);
while (true) {
let rs = its.map(it => it.next());
if (rs.every(r => r.done))
return;
yield rs.map(r => r.value);
}
}
for (let r of zipLongest([1,2,3], [4,5,6,7], [8,9,0,11,22]))
console.log(r.join())
And this is Python's classic "n-group" idiom zip(*[iter(a)]*n):
triples = [...zip(...Array(3).fill(iter(a)))]
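For example, with the zip and iter defined above (a quick check, values made up):
let a = [1, 2, 3, 4, 5, 6, 7, 8, 9];
let triples = [...zip(...Array(3).fill(iter(a)))];
console.log(triples); // [ [ 1, 2, 3 ], [ 4, 5, 6 ], [ 7, 8, 9 ] ]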
ES2020 shortest variant:
function * zip(arr1, arr2, i = 0) {
while(arr1[i] || arr2[i]) yield [arr1[i], arr2[i++]].filter(x => !!x);
}
[ ...zip(arr1, arr2) ] // result
You could reduce the array of arrays and map a new array by taking the values at each index of the inner arrays.
var array1 = [1, 2, 3],
array2 = ['a','b','c'],
array3 = [4, 5, 6],
array = [array1, array2, array3],
transposed = array.reduce((r, a) => a.map((v, i) => (r[i] || []).concat(v)), []);
console.log(transposed);
Fun with spread.
const
transpose = (r, a) => a.map((v, i) => [...(r[i] || []), v]),
array1 = [1, 2, 3],
array2 = ['a','b','c'],
array3 = [4, 5, 6],
transposed = [array1, array2, array3].reduce(transpose, []);
console.log(transposed);
I took a run at this in pure JS wondering how the plugins posted above got the job done. Here's my result. I'll preface this by saying that I have no idea how stable this will be in IE and the like. It's just a quick mockup.
init();
function init() {
var one = [0, 1, 2, 3];
var two = [4, 5, 6, 7];
var three = [8, 9, 10, 11, 12];
var four = zip(one, two, one);
//returns array
//four = zip(one, two, three);
//returns false since three.length !== two.length
console.log(four);
}
function zip() {
for (var i = 0; i < arguments.length; i++) {
if (!arguments[i].length || !arguments.toString()) {
return false;
}
if (i >= 1) {
if (arguments[i].length !== arguments[i - 1].length) {
return false;
}
}
}
var zipped = [];
for (var j = 0; j < arguments[0].length; j++) {
var toBeZipped = [];
for (var k = 0; k < arguments.length; k++) {
toBeZipped.push(arguments[k][j]);
}
zipped.push(toBeZipped);
}
return zipped;
}
It's not bulletproof, but it's still interesting.
A generator approach to Python's zip function.
function* zip(...arrs){
for(let i = 0; i < arrs[0].length; i++){
const a = arrs.map(e=>e[i])
if(a.indexOf(undefined) == -1){ yield a }else{ return undefined; }
}
}
}
// use as multiple iterators
for( let [a,b,c] of zip([1, 2, 3, 4], ['a', 'b', 'c', 'd'], ['hi', 'hello', 'howdy', 'how are you']) )
console.log(a,b,c)
// creating new array with the combined arrays
let outputArr = []
for( let arr of zip([1, 2, 3, 4], ['a', 'b', 'c', 'd'], ['hi', 'hello', 'howdy', 'how are you']) )
outputArr.push(arr)
I have created a simple function to do so, with an option to provide a zipper function:
function zip(zipper, ...arrays) {
if (zipper instanceof Array) {
arrays.unshift(zipper)
zipper = (...elements) => elements
}
const length = Math.min(...arrays.map(array => array.length))
const zipped = []
for (let i = 0; i < length; i++) {
zipped.push(zipper(...arrays.map(array => array[i])))
}
return zipped
}
https://gist.github.com/AmrIKhudair/4b740149c29c492859e00f451832975b
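A couple of example calls to illustrate both forms (these inputs are made up, not from the gist):
zip([1, 2], [3, 4]); // [ [ 1, 3 ], [ 2, 4 ] ]
zip((a, b) => a + b, [1, 2], [3, 4]); // [ 4, 6 ]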
I'm not a javascript guy but I feel like many of these answers are trying to find the cutest and most clever solution using Array.map which is fine, but for someone like me that doesn't use javascript every day here are some alternatives that might possibly be a bit more readable.
Maybe a way to avoid some cute and clever code would be:
function zip(a,b){
// pre-allocate an array to hold the results
const rval = Array(Math.max(a.length, b.length));
for(let i = 0; i < rval.length; i++){
rval[i] = [a[i], b[i]]
}
return rval
}
If you like generators:
function* _zip(a,b){
const len = Math.max(a.length, b.length) // handle different sized arrays
for(let i = 0; i < len; i++) { yield [a[i], b[i]] }
}
Or if you really want to use Array.map:
function map(a,b){
const x = a.length > b.length ? a : b // call map on the biggest array
return x.map((_,i)=>[a[i],b[i]])
}
As I said, I'm not an everyday javascript guy so, these aren't going to be the most elegant solutions but they are readable to me.
Below is a fast and efficient way of doing this, using iter-ops library, operator zip:
const {pipe, zip} = require('iter-ops');
const i = pipe(array1, zip(array2, array3));
console.log(...i); //=> [ 1, 'a', 4 ] [ 2, 'b', 5 ] [ 3, 'c', 6 ]
The library processes all inputs as iterables, so they are iterated over just once. And it can handle, in the same way, all types of iterable objects - Iterable, AsyncIterable, Iterator, AsyncIterator.
P.S. I'm the author of iter-ops.
The Mochikit library provides this and many other Python-like functions. developer of Mochikit is also a Python fan, so it has the general style of Python, and also the wraps the async calls in a twisted-like framework.
There is no equivalent function. If you have only a few arrays you should use a for loop to get an index and then use the index to access the arrays:
var array1 = [1, 2, 3];
var array2 = ['a','b','c'];
for (let i = 0; i < Math.min(array1.length, array2.length); i++) {
doStuff(array1[i], array2[i]);
}
You can have an inner loop over the arrays if you have more.
Here is my solution
let zip = (a, b) => (a.length < b.length
? a.map((e, i) => [e, b[i]])
: b.map((e, i) => [a[i], e]))
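For example, with inputs of unequal length it pairs up to the shorter one:
zip([1, 2, 3], ['a', 'b']); // [ [ 1, 'a' ], [ 2, 'b' ] ]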
This shaves a line off Ddi's iterator-based answer:
function* zip(...toZip) {
const iterators = toZip.map((arg) => arg[Symbol.iterator]());
const next = () => toZip = iterators.map((iter) => iter.next());
while (next().every((item) => !item.done)) {
yield toZip.map((item) => item.value);
}
}
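Example usage; it works on any iterables, strings included:
console.log([...zip([1, 2, 3], 'abc')]); // [ [ 1, 'a' ], [ 2, 'b' ], [ 3, 'c' ] ]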
If you are fine with ES6:
const zip = (arr,...arrs) =>(
arr.map(
(v,i) => arrs.reduce((a,arr)=>[...a, arr[i]], [v])))
