var i = ['5000','35000'];
alert((i[0] < i[1])?'well duh!':'fuzzy math?');
alert((Number(i[0]) < Number(i[1]))?'well duh!':'fuzzy math?');
What's happening here? In the first alert, the text string "5000" evaluates as not less than "35000". I assumed Javascript used Number() when numerically comparing strings, but apparently that's not the case. Just curious how exactly Javascript handles numerically comparing the strings of numbers by default.
Javascript compares strings by character value, whether the strings look like numbers or not.
You can see this in the spec, section 11.8.5, point 4.
'a' < 'b' and 'ab' < 'ac' are both true.
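To see this on the original values (a quick sketch; the variable names are mine), only the first characters matter, and converting first gives the numeric result:
var a = '5000', b = '35000';
console.log(a.charCodeAt(0), b.charCodeAt(0)); // 53 51 – '5' vs '3'
console.log(a < b);                 // false – compared as strings
console.log(Number(a) < Number(b)); // true – compared as numbers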
Related
I have these two examples below:
console.log("a" > "3") // outputs true
console.log("hello" > "3") // outputs true
According to MDN, If both values are strings, they are compared as strings, based on the values of the Unicode code points they contain.
But then they also wrote the following in the next paragraph, Strings are converted based on the values they contain, and are converted as NaN if they do not contain numeric values.
Following this logic, shouldn't both statements be false? No matter what the operator is, "a" and "hello" are words in strings and they don't have a numerical value, so they should be converted to NaN; and since any comparison involving NaN is false, shouldn't both output false?
If the former statement is the one that applies here, could you please walk me through its logic?
Thanks.
The key phrase from the MDN article is
If both values are strings, […].
Otherwise JavaScript attempts to convert non-numeric types to numeric values
So no, "a", "3" and "hello" are all strings, and when comparing them, they get compared as strings. No conversion to anything occurs.
You are comparing two strings (in quotation marks), not a string to a number. This should answer your question:
"a" > "3" //true
"a" > 3 //false
As you've written,
If both values are strings, they are compared as strings, based on the
values of the Unicode code points they contain
So, on both examples, both values are strings. If you change "3" to 3, the result is:
console.log("a" > 3)
// expected output: false
console.log("hello" > 3)
// expected output: false
That's because the string is first converted to NaN, and then:
If either value is NaN, the operator returns false.
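A small sketch of the mixed case, showing the NaN conversion step (these are just the values Number() produces):
console.log(Number("a"));     // NaN
console.log(Number("hello")); // NaN
console.log(NaN > 3);         // false – any comparison involving NaN is false
console.log(NaN < 3);         // false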
The docs:
If both values are strings, they are compared as strings, based on the values of the Unicode code points they contain.
Otherwise JavaScript attempts to convert non-numeric types to numeric values: ....
Reading the docs, my understanding is: if BOTH operands are strings, they are compared as strings, based on the Unicode code points they contain. Otherwise (if at least one operand is not a string), the operands are converted to numbers before the comparison.
But anyway, my suggestion for a JS beginner: do not get too attached to how the language works internally for now. Even though it can compare strings with "greater than" and "less than", I don't think there would (or should) be a real-life scenario where you would need it.
Also, just for fun, check this pen I made when I first started with js (I was also very confused about comparison and typing xD):
"" == null // this one is fun
Later, after learning more about the language, you'll get why some things work the way they do in JS. I also recommend learning about the history of JS and the web; it explains a lot of the language's weird behaviours.
EDIT: I will rephrase my question. When I type Number < String it returns true, and it also works when I do typeof(2) < typeof("2").
Number < String => true
typeof(2) < typeof("2") => true
I'm guessing it is the ASCII value of each letter in Number and String, but I am not sure that is the reason this returns true. I want to know why this happens: what process does the interpreter go through to get this result?
First answer:
The charCodeAt() method returns the numeric Unicode value of the character at the given index. Read here
If you do not specify an index position, the character at index 0 is used. The character code of S is 83 and the character code of N is 78, so those are the numbers you are getting. Check here.
And 78 < 83 => true is obvious.
Try "String".charCodeAt(1) and you will get 116 which is ASCII value of t
Second answer based on OP's edited question:
Frankly speaking, your comparison Number < String is "technically" incorrect, because the less-than operator < (or any similar operator) is meant for expressions, and Number and String are functions, not expressions. However, @Pointy explained how Number < String is evaluated and why it gave you that result.
More insight on comparison operators
Comparison operators like < work on expressions; read here. Typically, you should have a valid expression or a resolved value on both the RHS and the LHS.
Now this is the definition of expression, read more here - "An expression is any valid unit of code that resolves to a value. Conceptually, there are two types of expressions: those that assign a value to a variable and those that simply have a value."
So (x = 7) < (x = 2) or new Number() < new String() is a "technically" valid comparison, and even Object.toString() < Number.toString() is, but Object < Function really is not.
Below are rules/features for comparisons, read more here
Two strings are strictly equal when they have the same sequence of characters, same length, and same characters in corresponding positions.
Two numbers are strictly equal when they are numerically equal (have the same number value). NaN is not equal to anything, including NaN. Positive and negative zeros are equal to one another.
Two Boolean operands are strictly equal if both are true or both are false.
Two distinct objects are never equal for either strict or abstract comparisons.
An expression comparing Objects is only true if the operands reference the same Object.
Null and Undefined Types are strictly equal to themselves and abstractly equal to each other.
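A quick sketch illustrating those rules (the values are ones I picked for illustration):
console.log("abc" === "abc");    // true – same characters in the same positions
console.log(NaN === NaN);        // false – NaN is not equal to anything
console.log(0 === -0);           // true – positive and negative zero are equal
console.log({} === {});          // false – distinct objects are never equal
var obj = {};
console.log(obj === obj);        // true – both operands reference the same object
console.log(null == undefined);  // true – abstractly equal to each other
console.log(null === undefined); // false – but not strictly equal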
The result of
Number < String
is not the result of comparing the strings "Number" and "String", or not exactly that. It's the result of comparing the strings returned from Number.toString() and String.toString(). Those strings will (in all the runtimes I know of) have more stuff in them than just the strings "Number" and "String", but those two substrings will be the first place that they're different.
You can see what those actual strings are by typing
Number.toString()
in your browser console.
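In most browsers you will see something along these lines (the exact text is implementation-dependent, so treat this as an illustration rather than a guaranteed output):
Number.toString(); // e.g. "function Number() { [native code] }"
String.toString(); // e.g. "function String() { [native code] }"
// The first characters that differ are 'N' (78) and 'S' (83),
// and 78 < 83, hence Number < String evaluates to true.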
JavaScript does the following thing:
"String".charCodeAt(); => 83
"S".charCodeAt(); => 83
"String".charCodeAt(0); => 83
The method charCodeAt(a) gets the character code at position a; the default position is 0.
So if you compare 'N' < 'S' you are effectively comparing 78 < 83 => true
For complete strings, JavaScript compares the character codes position by position; the first position where the strings differ decides the result (it does not add the codes together).
So I can answer your question with yes: it is the character values that produce this result.
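A quick sketch showing that it is the first differing position, not any kind of total, that decides the result:
console.log("Ab" < "B");    // true – 'A' (65) < 'B' (66); the remaining characters are ignored
console.log("Az" < "Ba");   // true – decided at position 0
console.log("abc" < "abd"); // true – first difference is at position 2: 'c' < 'd'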
I have the following script
document.write("12" < "2");
which returns true. Any reason why? The documentation says that JavaScript compares strings numerically, but I don't see how "12" is less than "2".
JavaScript compares strings character by character until one of the characters is different.
1 is less than 2 so it stops comparing after the first character.
I believe it is doing a lexicographic comparison - the first char in string one is '1', which is less than the first char of string two, which is '2'. More about Lexicographic order here: http://en.wikipedia.org/wiki/Lexicographical_order
This is because the first character of "12" is "1", which comes before "2"; JavaScript string comparison is lexical/alphabetical rather than numerical. It only looks partially numeric because "1" happens to sort ahead of "2".
You can, however, simply compare the numbers as numbers:
document.write(parseFloat("12") < parseFloat("2"));
Try:
document.write(parseInt("12") < parseInt("2"));
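Any of the usual string-to-number conversions works here; which one you pick is mostly a matter of taste (with parseInt it is a good habit to pass the radix explicitly):
document.write(Number("12") < Number("2"));              // false
document.write(+"12" < +"2");                            // false
document.write(parseInt("12", 10) < parseInt("2", 10));  // false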
I am trying to understand this expression:
((ch = stream.getChar()) > ' ')
Here, getChar() gets a character. How does this greater-than comparison operator check if any char is greater than a space?
Is this possible?
An empty space has a character code. Even though it doesn't look like much, it still has a value. So does the character taken from the stream. Comparing the character codes of these values is what produces the output.
Let's take a gander at the language specification (the algorithm itself is described in here) (do note that it defines <, but the > operator simply flips the resulting value).
What the operator does is try to convert both operands to primitive types, with a preference for numbers:
2. a. Let py be the result of calling ToPrimitive(y, hint Number).
2. b. Let px be the result of calling ToPrimitive(x, hint Number).
In our case, x === stream.getChar() and y === ' '. Since both of the operands are primitive strings already, that results in the original values (px = x, py = y), and we move on to:
4. Else, both px and py are Strings
Now it checks to see whether either operand is a prefix of the other, for example:
'abc' > 'abcd' // false
'foo' > 'foobar' // false
Which is relevant if getChar() results in a space, since the space is a prefix of itself:
' ' > ' ' // false
We move on to finding the first position at which x and y have different characters:
Let k be the smallest nonnegative integer such that the character at position k within px is different from the character at position k within py. (There must be such a k, for neither String is a prefix of the other.)
(e.g., 'efg' and 'efh', we want g and h)
The characters we've found are then converted to their integer values:
Let m be the integer that is the code unit value for the character at position k within px.
Let n be the integer that is the code unit value for the character at position k within py.
And finally, a comparison is made:
If m < n, return true. Otherwise, return false.
And that's how it's compared to the space.
tl;dr It converts both arguments to their code-unit integer representations, and compares that.
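Concretely, for the space check (a minimal sketch; stream.getChar() is the OP's method, so a literal character stands in for it here):
var ch = 'a';                   // pretend this came from stream.getChar()
console.log(' '.charCodeAt(0)); // 32 – the space character's code unit
console.log(ch.charCodeAt(0));  // 97
console.log(ch > ' ');          // true, because 97 > 32
console.log('\t' > ' ');        // false – tab (9) is below space (32)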
In Javascript strings are compared in alphabetical order. These expressions are true:
'abacus' <= 'calculator'
'abacus' < 'abate'
In most (if not all) programming languages, characters are represented internally by a number. When you do equality/greater-than/less-than checks what you're actually checking is the underlying number.
hence in JS:
alert('c' > 'b'); // alerts true
alert('a' > 'b'); // alerts false
A space character also has a numeric representation, therefore the check is a valid one.
[string] > [string] will compare the character(s) by their representative values (see ASCII Table)
Characters are stored in the computer's memory as a number (usually a byte or two).
Each character has a unique identifying number.
By checking if a character is greater than a space, you are actually comparing their places in that table.
See http://en.wikipedia.org/wiki/ASCII for more.
Check out this link, it'll explain how the comparison works in JS: http://javascript.about.com/od/decisionmaking/a/des02.htm
Basically, you're comparing the ASCII value of each character to the ASCII value of the blank space, which is also a character and therefore has a corresponding ASCII value.
I understand that using the "===" compares type, so running the following code results in "not equal" because it's comparing a number type to a string type.
var a = 20;
var b = "20";
if (a === b) {
alert("They are equal");
} else {
alert("They are not equal");
}
But I don't understand how using the "==" to compare only the value results in the "They are equal" message.
var a = 20;
var b = "20";
if (a == b) {
alert("They are equal");
} else {
alert("They are not equal");
}
How are the values equal? Isn't the string "20" stored as the ASCII characters 50 and 48 (0110010 and 0110000 in binary) and 20 stored as the actual binary number 0010100?
EDIT: Thanks everyone! I think all the responses are great and have helped me understand this much better.
The == operator compares only the values of the variables. If the types are different, a conversion is performed first: here the string "20" is converted to the number 20, and the results are compared.
The === operator compares not only the values but also the types, so no conversion is performed. In this case "20" !== 20.
The JavaScript engine sees the a as a number and casts the b to number before the valuation.
When type conversion is needed, JavaScript converts String, Number, Boolean, or Object operands as follows.
When comparing a number and a string, the string is converted to a number value. JavaScript attempts to convert the string numeric literal to a Number type value. First, a mathematical value is derived from the string numeric literal. Next, this value is rounded to nearest Number type value.
If one of the operands is Boolean, the Boolean operand is converted to 1 if it is true and +0 if it is false.
If an object is compared with a number or string, JavaScript attempts to return the default value for the object. Operators attempt to convert the object to a primitive value, a String or Number value, using the valueOf and toString methods of the objects. If this attempt to convert the object fails, a runtime error is generated.
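A sketch of each conversion rule in action (the object example uses a valueOf I defined purely for illustration):
console.log("20" == 20);  // true – the string is converted to a number
console.log(true == 1);   // true – the boolean is converted to 1
console.log(false == 0);  // true – the boolean is converted to +0
var box = { valueOf: function () { return 20; } };
console.log(box == 20);   // true – the object is converted via its valueOf method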
The problem with the == comparison is that JavaScript version 1.2 doesn't perform type conversion, whereas versions 1.1 and 1.3 onwards do.
The === comparison has been available since version 1.3, and is the best way to check whether two variables match.
If you need your code to be compatible with JavaScript versions 1.1, 1.2, and 1.3, you should ensure that the variables all match as if an === comparison were being performed.
Part of the definition of "==" is that the values will be converted to the same types before comparison, when possible. This is true of many loosely typed languages.
JavaScript is designed such that a string containing a number is considered "equal" to that number. The reason for that is simplicity of use in the common case of a user entering a number into an input field and the site validating it in JS -- you don't have to cast the entered string to a number before comparing.
It simplifies a common use case, and the === operator still allows you to compare with the type considered as well.
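For example, in the input-field scenario described above (the variable names are made up for the sketch):
var input = "20";  // what a user might type into a text field
var ageLimit = 20;
console.log(input == ageLimit);          // true – == converts the string for you
console.log(input === ageLimit);         // false – the types differ
console.log(Number(input) === ageLimit); // true – explicit conversion, then strict comparison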
As far as I know, JavaScript does automatic data type conversion on the fly, so the variables are probably cast to equivalent types automatically.