Is it possible to use multiple arguments when determining indexOf on an array?
I want to determine if my array contains any of three integers. It's important to note at this stage that the array will only ever have one value (if it has more, it won't reach this code block).
array.indexOf(123 || 124 || 125) === 0
So if array = [123] then my indexOf should be 0 and therefore true.
If array = [124] then my indexOf should be 0 and therefore true.
What I'm finding is that [123] works OK, but it doesn't even check indexOf for the 2nd or 3rd arguments and just returns false.
http://codepen.io/anon/pen/WxmyGp?editors=0011
The || operator returns the left-hand side if it is truthy, otherwise it returns the right-hand side. 123 || 124 || 125 just means 123.
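A quick illustration of that short-circuiting behaviour (plain operator usage, not tied to any particular array):
console.log(123 || 124 || 125); // 123 (the first truthy operand is returned)
console.log(0 || 124 || 125);   // 124 (0 is falsy, so evaluation moves to the next operand)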
If you want to test if any of multiple values are in an array, then you have to test each one in turn.
array.indexOf(123) == 0 || array.indexOf(124) == 0 || array.indexOf(125) == 0
Since you only care about one specific index of the array, you can turn the whole thing on its head:
[123, 124, 125].indexOf(array[0]) > -1
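For example, with a single-element array like the one in the question (a quick sketch reusing the same values):
var array = [124];
console.log([123, 124, 125].indexOf(array[0]) > -1); // true
array = [999];
console.log([123, 124, 125].indexOf(array[0]) > -1); // false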
You can do this using Array.some().
The nice thing about this approach is that you will only have to iterate through the array once. If you || multiple indexOf() calls together, you're going to keep iterating the array with every missed search.
function checkArray(array) {
    return array.some(function(item) {
        return item == 123 || item == 124 || item == 125;
    });
}
console.log(checkArray([123, 456, 789]));
console.log(checkArray([456, 789]));
console.log(checkArray([789, 456, 125]));
If you want to get multiple results, another way of doing this is Array.prototype.filter(). But this will return the matching elements themselves. If you are interested in the indices, then you have to combine it with Array.prototype.reduce(), such as:
var a = [1, 2, 3],
    b = a.filter(e => e === 1 || e === 2)
         .reduce((p, c) => p.concat(a.indexOf(c)), []);
console.log(b);
Related
a=[1,2,3,4]
console.log(a.includes(4 || 5)); //this yields true
console.log(a.includes(5 || 4)); //this yields false
Why does the function includes() behave in this particular way?
includes only takes one value. You may think you're telling it to check for two things, but you're not. The || operator evaluates its two operands and resolves to a single value. If the left-hand side of the || is "truthy", then that value is used.
In short: 4 || 5 is no different from just 4, and 5 || 4 is no different from just 5. So your first line checks if a includes 4, and the second checks if it includes 5.
If you need to check multiple values, either call includes multiple times, or use a different method like .find
const a = [1,2,3,4];
const element = a.find(val => val === 4 || val === 5);
const found = !!element; // turning it into true/false instead of 4/5/undefined
console.log(found);
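If you'd rather stick with includes, the "call it multiple times" option mentioned above is just as short (reusing the same array a):
const hasEither = a.includes(4) || a.includes(5); // each value gets its own check
console.log(hasEither); // true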
a=[1,2,3,4]
console.log(a.includes(4 || 5)); //this yields true
4 || 5 is evaluated first; the result is 4.
4 is included in a, so the result is true.
console.log(a.includes(5 || 4)); //this yields false
5 || 4 results in 5.
5 is not included in a, so the result is false.
The function includes() behaves as it should.
I guess the confusion comes from this part:
4 || 5 = 4
5 || 4 = 5
For example: X || Y
If X is falsy (undefined, null, 0, '', NaN, false), the result will be Y.
If X is truthy, the result will be X.
In both of your examples, X is a non-zero Number, so the result is, as expected, the first Number.
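A couple of lines that show the rule directly (just the operator, no arrays involved):
console.log(undefined || 5); // 5 (left side is falsy, so the right side is returned)
console.log(4 || 5);         // 4 (left side is truthy, so it is returned as-is)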
I hope that was helpful :)
I'm currently studying JavaScript algorithms. Below is the algorithm I'm currently trying to learn/understand.
function same(arr1, arr2){
    if(arr1.length !== arr2.length){
        return false;
    }
    let frequencyCounter1 = {}
    let frequencyCounter2 = {}
    for(let val of arr1){
        frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1
        console.log(frequencyCounter1);
    }
    for(let val of arr2){
        frequencyCounter2[val] = (frequencyCounter2[val] || 0) + 1
    }
    for(let key in frequencyCounter1){
        if(!(key ** 2 in frequencyCounter2)){
            return false
        }
        if(frequencyCounter2[key ** 2] !== frequencyCounter1[key]){
            return false
        }
    }
    return true
}
same([1,2,3,2,5], [9,1,4,4,11])
I understand the code, except for 1 line.
frequencyCounter1[val] = (frequencyCounter1[val] || 0) +1
So what this algorithm does is compare 2 arrays. If array b contains the square of each number in array a, then it should return true, else it will return false.
So in this example, it will return false
If I do [1,2,3,4,5] and [1,4,9,16,25], it will return true.
I know what this line does:
frequencyCounter1[val] = (frequencyCounter1[val] || 0) +1
It makes a key-value pair: say, for the first iteration, it takes 1 as the key and (frequencyCounter1[val] || 0) + 1 as the value. This value represents the number of times a number appears in the array, so if 1 appears 10 times it'll have the key-value pair 1:10.
I understand this very clearly; I just wanted to know how this statement is evaluated and what's happening behind the scenes.
(frequencyCounter1[val] || 0) +1
The idea is that if frequencyCounter1[val] is undefined it defaults to 0. undefined + 1 returns NaN, which wouldn't work as the programmer intended, so they use || to work around that problem without having to write additional lines of code.
In JavaScript, the || operator doesn't return true or false as you might expect; it returns the first operand that would evaluate as true if converted to a boolean, or defaults to the last operand if none is found.
For example, (null || "" || undefined || false || NaN || "test" || 2) will return "test"
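A concrete trace of what happens the first and second time the same value is seen (a sketch reusing the variable names from the question):
let frequencyCounter1 = {};
let val = 1;
// first encounter: frequencyCounter1[1] is undefined, so (undefined || 0) + 1 === 1
frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1;
console.log(frequencyCounter1); // { '1': 1 }
// second encounter: frequencyCounter1[1] is 1 (truthy), so (1 || 0) + 1 === 2
frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1;
console.log(frequencyCounter1); // { '1': 2 }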
This code starts with the first value of the array and checks whether it already exists as a key. If it does not exist, the lookup yields undefined. undefined is falsy, so the || operator falls back to 0, and after adding 1 the value is registered with a count of 1.
Here we are basically setting the key:value pairs as seen in the below example
obj = {
    1: 1,
    2: 2,
    3: 5
}
obj[3] = 7
console.log(obj);
If val already exists, the left side of the OR is truthy, so the existing count is used and 1 is added to it. This one-liner is very useful as a counter operation in JavaScript, and the big O for the code is O(N), which is better than the nested-loop O(N^2) approach that is a common solution for such problems.
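For example, running that counter pattern over a small array builds an object keyed by value (a short sketch; the counts name is just illustrative):
const counts = {};
for (const val of [1, 2, 2, 3, 3, 3]) {
    counts[val] = (counts[val] || 0) + 1; // same (existing || 0) + 1 trick
}
console.log(counts); // { '1': 1, '2': 2, '3': 3 }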
Given two arrays myArray1 and myArray2, which may be null, how can I output a Boolean which tells me if at least one array is non-empty?
Assuming that I have the following variables:
myArray1 = ["one", "two", "three"]; //-> non-empty array
myArray2 = null; //-> not an array (in my case this happens from .match() returning no results)
I want an expression, such that myArray1 && myArray2 will be FALSE, but myArray1 || myArray2 will be TRUE.
I did look through other relevant Stack Overflow questions (see an abridged list below), but since I still struggled to figure out the solution, I thought I would post it as a separate question since answers might also benefit others.
The common way of testing for empty arrays is:
myBooleanOr = (myArray1.length || myArray2.length); //3
myBooleanAnd = (myArray1.length && myArray2.length); //Error
This works if both variables are arrays, but in this case, the second one will throw up Error: cannot read property length of 'null'. Using the Boolean() function does not solve the problem since the following also throws up the same error:
myBooleanAnd = (Boolean(myArray1.length) && Boolean(myArray2.length)); //error
A solution for testing empty arrays which was accepted in several Stack Overflow questions is to use typeof myArray !== "undefined", but that still does not solve the problem, because neither variable's typeof is "undefined", so myBooleanAnd will still throw an error:
var bool = (typeof myArray1 !== "undefined"); //true
var bool = (typeof myArray2 !== "undefined"); //true
var myBooleanAnd = ((typeof myArray1 !== "undefined" && myArray1.length) || (typeof myArray2 !== "undefined" && myArray2.length)); //Error: cannot read property length of null
Comparing the arrays against [], which also seems intuitive, doesn't work either, because neither of the arrays is equal to []:
var bool = (myArray1 !== []); //true
var bool = (myArray2 !== []); //true
Other relevant posts
A number of other questions on Stack Overflow deal with testing for empty Javascript arrays, including:
Testing for empty arrays: Check if array is empty or exists
Testing for empty arrays (jQuery): Check if array is empty or null
Relative advantages of methods for testing empty arrays: Testing for an empty array
Testing for empty objects: How do I test for an empty JavaScript object?
And there are also questions about the truth value of empty arrays in Javascript:
JavaScript: empty array, [ ] evaluates to true in conditional structures. Why is this?
UPDATE
I have corrected the following errors posted in the original question (thanks to those who pointed them out). I am listing them here since they might also be helpful to others:
==! changed to !==
typeof x === undefined, changed to typeof x === "undefined"
I would suggest using a helper function to determine if a single array is non-empty, and then use that twice. This is simple and straightforward:
function isNonEmptyArray(arr) {
    return !!(Array.isArray(arr) && arr.length);
}
var myBooleanAnd = isNonEmptyArray(myArray1) && isNonEmptyArray(myArray2);
var myBooleanOr = isNonEmptyArray(myArray1) || isNonEmptyArray(myArray2);
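With the example values from the question (myArray1 = ["one", "two", "three"], myArray2 = null), these evaluate as intended:
console.log(myBooleanAnd); // false (myArray2 is null, so it is not a non-empty array)
console.log(myBooleanOr);  // true (myArray1 has three elements)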
OK, so there are a bunch of errors in the code examples; I'll try to explain them all:
myBooleanOr = (myArray1.length || myArray2.length); //3
myBooleanAnd = (myArray1.length && myArray2.length); //Error
Here, the first line returns the first truthy value it encounters. Since myArray1 has a length > 0, it returns that value and never evaluates the second part of the condition; that's why you're not getting the error. Swap the operands and it will break.
The second line combines the two values to produce a result, so it will always give an error when one of the two variables is null.
var bool = (typeof myArray1 === undefined); //false
typeof returns a string; if you compare it to the undefined value, the comparison will always be false. The correct statement is typeof myArray1 === "undefined", as written in most of the posts you linked.
var bool = (myArray2 ==! null);
the "strictly not equal" operator is !== and NOT ==!. You're doing a different operation and that's why you get surprising results.
Putting the right spaces in the syntax, this is your code var bool = (myArray2 == !null);
So you boolean-flip the value of null, which is falsy by nature, getting true, and then check whether myArray2 is loosely equal to true. Since myArray2 is null, which never loosely equals true, the comparison gives back false.
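A short illustration of the difference between what was written and what was intended (a sketch with myArray2 = null, as in the question):
var myArray2 = null;
console.log(!null);              // true (null is falsy, so ! flips it)
console.log(myArray2 == !null);  // false (null is not loosely equal to true)
console.log(myArray2 !== null);  // false (the strict "not equal" test that was intended)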
That said, for the solution to the question, I'd propose a slightly longer syntax that is more explicit, clear to understand, and you can scale to check how many arrays you like without adding more complexity:
var myArray1 = [1,2,3]
var myArray2 = null
var arrays = [myArray1, myArray2]
var oneNotEmpty = arrays.some( a => typeof a != "undefined" && a != null && a.length > 0)
console.log("At least one array non-empty?", oneNotEmpty)
You could use isArray to check whether something is an array or not. Most modern browsers support it. https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray
You can then combine isArray and the array's length to check whether something is a valid non-empty array.
function isOneNotEmpty(arrayOne, arrayTwo){
    return (Array.isArray(arrayOne) ? arrayOne.length > 0 : false)
        || (Array.isArray(arrayTwo) ? arrayTwo.length > 0 : false);
}
console.log(isOneNotEmpty([],null));
console.log(isOneNotEmpty([1,2,3,4],null));
console.log(isOneNotEmpty([3,4,5],[1]));
Testing for an array is simple enough:
var blnIsPopulatedArray = (myArray != null
    && typeof myArray == "object"
    && typeof myArray.length == "number"
    && myArray.length > 0);
Will return false if 'myArray' isn't an array with at least one item, or true if it is an array with at least one item.
One solution is the following (with or without Boolean()):
Using Array.isArray() and array.length:
var myBooleanAnd = Boolean(Array.isArray(myArray2) && myArray2.length) && Boolean(Array.isArray(myArray1) && myArray1.length) ; //false -> OK
var myBooleanOr = Boolean(Array.isArray(myArray2) && myArray2.length) || Boolean(Array.isArray(myArray1) && myArray1.length) ; //true -> OK
It is also possible to use myArray1 !== null instead of Array.isArray(myArray1), but since Array.isArray() is the more specific test (it also rules out non-array values), it seems preferable.
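One reason the Array.isArray() guard is safer: a non-array value with a length property, such as a string, would slip through the null check (a sketch with a hypothetical notAnArray variable):
var notAnArray = "abc";
console.log(notAnArray !== null && notAnArray.length > 0);       // true, a string slips through
console.log(Array.isArray(notAnArray) && notAnArray.length > 0); // false, correctly rejected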
UPDATE
I had previously suggested using the following:
var myBooleanAnd = Boolean(myArray1 && myArray2); //returns false -> OK
var myBooleanOr = Boolean(myArray1 || myArray2); //returns true -> OK
but since, as pointed out by @JLRishe, the following expression also returns TRUE, this is not a safe solution; it will only work in situations where the arrays can never be empty.
var bool = Boolean([] && []); //returns true -> false positive
function arrayIsEmpty(array) {
    if (!Array.isArray(array)) {
        // not an array at all
        return false;
    }
    if (array.length == 0) {
        return true;
    }
    // a non-empty array
    return false;
}
in JavaScript:
(1 == 1) === true;
(1 === 1) === true;
and
var a = 1;
var b = [1];
(a == b) === true
but
([1]==[1]) === false;
Why is it so? I have no idea
[1] and the other [1] are different objects, and object equality is defined as identity. In other words, an object is only equal to itself.
> a = [1]
[1]
> b = [1]
[1]
> a == b
false
> b = a
[1]
> a == b
true
Reference: http://es5.github.io/#x11.9.3, step 1.f is what applies here.
Because [1] and [1] are two instances of the Array object. As such, they are not equal (two objects are equal only if they are the exact same instance).
However, when you compare 1 and [1], there needs to be type juggling involved.
Arrays can be coerced to a string by joining all items with a comma. This results in the string "1".
In turn, "1" can be cast to a number, giving 1. 1 == 1 is clearly true.
please see here: How to compare arrays in JavaScript?
the [] syntax is an array literal that constructs a new Array object each time, and two distinct objects never compare as equal (they can be compared, but the result is always negative).
Strings and numbers are immutable in JavaScript, so you're effectively comparing the same value with itself when you do something like "string" === "string" or 1 === 1, whereas you're constructing a new Array object each time you use the literal.
I hope this makes sense.
Because arrays are objects, and objects are compared by reference rather than by value, only references to the same instance are equal.
For example:
var array1 = [1];
var array2 = array1;
(array1 == array2) === true;
(array1 == [1]) === false
I was wondering why / when -1 is used in JavaScript or jQuery, respectively.
I'm using this snippet:
if ($.inArray('false', answers) > -1) {
    window.console.log('you Lose!');
}
Cheers,
Chris
In this case, -1 is returned if the expression you are looking for is not in the array you are looking in.
While, according to its documentation, $.inArray returns the index of the element (if it was found), an index of -1 is impossible for an actual match: indices start at 0 as the lowest possible value.
Hence -1 simply means a non-valid index position, i.e. not found.
So, in your snippet, the test for > -1 means: Check whether any valid index was found, or in other words, check whether the given value was in the array.
This is even mentioned in the documentation:
Because JavaScript treats 0 as loosely equal to false (i.e. 0 == false, but 0 !== false), if we're checking for the presence of value within array, we need to check if it's not equal to (or greater than) -1.
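The pitfall the documentation warns about looks like this (a sketch, assuming jQuery is loaded as in the question):
var arr = ['false', 'true'];
// wrong: $.inArray returns 0 here, and 0 is falsy, so the branch never runs
if ($.inArray('false', arr)) {
    console.log('never reached');
}
// right: compare the returned index against -1 explicitly
if ($.inArray('false', arr) > -1) {
    console.log('found it');
}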
Hope this helps :-)
Because indexes start at 0 (zero-based indexing).
var arr = ['a', 'b', 'c'];
alert( arr[0] ); // a
So when checking the index of an element in an array ($.inArray and [].indexOf) it would be wrong to return 0 if the element isn't present:
var arr = ['a', 'b', 'c'];
alert( arr.indexOf('a') ); // 0
alert( arr.indexOf('b') ); // 1
alert( arr.indexOf('c') ); // 2
alert( arr.indexOf('d') ); // -1
I would say that jQuery has a design flaw here; I would rather have jQuery.inArray return a boolean (true/false).