Comparing numbers of type string - javascript

I'm implementing a sort function and came across the following:
'49' > '5' // false
'49' > '4' // true
new String(49).localeCompare('4') // 1
new String(49).localeCompare('5') // -1
The expected behaviour is obviously that 49 should be greater than both 4 and 5. Is there any way to solve this without converting the strings to numbers?

That is actually expected behavior when comparing strings, as described here. The easiest fix in this situation is to convert the values to numbers, if you want to compare them as numbers.
Thinking outside the box a little, you could first compare the length of the strings before using the > operator. If they are numeric strings, the longer string would have a higher value (assuming you don't have numbers like '0024'). If they are equal in length, the > operator would then work as you expect.
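A minimal sketch of that idea, assuming non-negative integer strings with no leading zeros (the helper name compareNumericStrings is just for illustration):
function compareNumericStrings(a, b) {
  // A shorter numeric string is always the smaller number (no leading zeros assumed).
  if (a.length !== b.length) return a.length - b.length;
  // Equal lengths: lexicographic order matches numeric order.
  return a < b ? -1 : a > b ? 1 : 0;
}
compareNumericStrings('49', '5'); // 1 (positive: '49' is greater)
compareNumericStrings('49', '4'); // 1
compareNumericStrings('5', '49'); // -1 (negative: '5' is smaller)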

Related

Comparison String to String

I got these 2 examples below -
console.log("a" > "3") // outputs true
console.log("hello" > "3") // outputs true
According to MDN, If both values are strings, they are compared as strings, based on the values of the Unicode code points they contain.
But then they also wrote the following in the next paragraph, Strings are converted based on the values they contain, and are converted as NaN if they do not contain numeric values.
Following this logic, shouldn't both statements be false? No matter what the operator is, "a" and "hello" are words in strings with no numerical value, so they should be converted to NaN; and since any comparison involving NaN is false, as soon as one of the operands is NaN the result should be false, right?
If I need to adhere to the former statement above, could you please walk me through this logic?
Thanks.
The key phrase from the MDN article is
If both values are strings, […].
Otherwise JavaScript attempts to convert non-numeric types to numeric values
So no, "a", "3" and "hello" are all strings, and when comparing them, they get compared as strings. No conversion to anything occurs.
You are comparing two strings (in quotation marks), not a string to a number. This should answer your question:
"a" > "3" //true
"a" > 3 //false
As you've written,
If both values are strings, they are compared as strings, based on the
values of the Unicode code points they contain
So, in both examples, both values are strings. If you change "3" to 3, the result is:
console.log("a" > 3)
// expected output: false
console.log("hello" > 3)
// expected output: false
That's because "a" and "hello" are first converted to NaN, and then:
If either value is NaN, the operator returns false.
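You can check that rule directly in the console; any relational comparison involving NaN yields false:
NaN < 3 // false
NaN > 3 // false
NaN >= NaN // false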
The docs:
If both values are strings, they are compared as strings, based on the values of the Unicode code points they contain.
Otherwise JavaScript attempts to convert non-numeric types to numeric values: ....
Reading the docs, my understanding is: if BOTH operands are strings, they're compared by their Unicode code points. Otherwise (if at least one operand is not a string), both are converted to numeric values, which may be NaN.
But anyway, my suggestion for a JS beginner: don't get too attached to how the language works for now, and even though it can compare strings with "greater than" and "less than", I don't think there would (or should) be a real-life scenario where you would need it.
Also, just for fun, check this pen I made when I first started with js (I was also very confused about comparison and typing xD):
"" == null // this one is fun
Later, after learning more about the language, you'll get why some things work the way they do in JS. I also recommend understanding the history of JS and the web; it explains a lot of the weird behaviours of the language.

When comparing strings, does JavaScript start by comparing their length?

I would like to know if, when comparing two (potentially massive) strings, JavaScript internally starts by comparing their length before looping over the characters.
If not, would that mean doing something along the lines of
if (string1.length !== string2.length || string1 !== string2)
// do something if data changed
should lead to a gain of time, right? (I am assuming the read-only length property of a string is computed when the string is set, not when it is asked for; not sure if that's actually the case.)
For context, I am dealing with massive strings because I am lazily JSON.stringifying arrays of objects received from some API. (Comparing the latest element only would probably be better, but this question is more out of curiosity than anything; please just assume the strings I would compare would be humongous ^^)
Relevant references?
https://javascript.info/comparison
Comparing large strings in JavaScript with a hash
Edit
Thank you @RobG in the comments for linking me to ECMA-262's SameValueNonNumber.
In its step definitions, step 5 is:
SameValueNonNumber ( x, y )
5. If Type(x) is String, then
a. If x and y are exactly the same sequence of code units (same length and same code units at corresponding indices), return true; otherwise, return false.
And I think that answers the question
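If you want to test whether the manual length guard actually buys anything in practice, here is a rough sketch (not a rigorous microbenchmark; timings vary by engine, and note that a string's length is stored with the string, so reading it is cheap):
// Rough sketch only; engines may optimize parts of the loop bodies away.
const a = 'x'.repeat(1e7);
const b = a + 'y'; // differs in length
let r;
console.time('plain !==');
for (let i = 0; i < 1000; i++) r = a !== b;
console.timeEnd('plain !==');
console.time('length guard');
for (let i = 0; i < 1000; i++) r = a.length !== b.length || a !== b;
console.timeEnd('length guard');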

Why does Number < String return true in JavaScript?

EDIT: I will rephrase my question: I type Number < String and it returns true; it also works when I do typeof(2) < typeof("2").
Number < String => true
typeof(2) < typeof("2") => true
I'm guessing it is the ASCII values of the letters in Number and String, but I am not sure that is why this returns true. I want to know why this happens and how the interpreter arrives at this result.
First answer:
The charCodeAt() method returns the numeric Unicode value of the character at the given index. Read here
Now, if you do not specify any index position, the character at index 0 is considered. The ASCII value of S is 83 and the ASCII value of N is 78, so those are the numbers you are getting. Check here.
And 78 < 83 => true is obvious.
Try "String".charCodeAt(1) and you will get 116 which is ASCII value of t
Second answer based on OP's edited question:
Frankly speaking, your comparison Number < String is "technically" incorrect, because the less-than operator < (or any similar operator) is meant for expressions, and Number and String are functions, not expressions. However, @Pointy explained how Number < String works and gave you results.
More insight on comparison operators
Comparison operators like < work on expressions; read here. Typically, you should have a valid expression or resolved value on both the RHS and LHS.
Now this is the definition of expression, read more here - "An expression is any valid unit of code that resolves to a value. Conceptually, there are two types of expressions: those that assign a value to a variable and those that simply have a value."
So, (x = 7) < (x = 2) or new Number() < new String() is a "technically" valid/good comparison, and even Object.toString < Number.toString(), but not really Object < Function.
Below are the rules/features for comparisons; read more here:
Two strings are strictly equal when they have the same sequence of characters, same length, and same characters in corresponding positions.
Two numbers are strictly equal when they are numerically equal (have the same number value). NaN is not equal to anything, including NaN. Positive and negative zeros are equal to one another.
Two Boolean operands are strictly equal if both are true or both are false.
Two distinct objects are never equal for either strict or abstract comparisons.
An expression comparing Objects is only true if the operands reference the same Object.
Null and Undefined Types are strictly equal to themselves and abstractly equal to each other.
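A few console checks illustrating those rules:
NaN === NaN // false: NaN is not equal to anything
0 === -0 // true: positive and negative zeros are equal
({}) === ({}) // false: distinct objects are never equal
null == undefined // true: abstractly equal
null === undefined // false: but not strictly equal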
The result of
Number < String
is not the result of comparing the strings "Number" and "String", or not exactly that. It's the result of comparing the strings returned from Number.toString() and String.toString(). Those strings will (in all the runtimes I know of) have more stuff in them than just the strings "Number" and "String", but those two substrings will be the first place that they're different.
You can see what those actual strings are by typing
Number.toString()
in your browser console.
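For example, in V8 (Chrome and Node) the sources render as shown below; other engines may format them differently, so treat the exact strings as illustrative:
Number.toString(); // "function Number() { [native code] }"
String.toString(); // "function String() { [native code] }"
// The first differing characters are "N" (code 78) and "S" (code 83),
// so the Number string sorts first:
Number.toString() < String.toString(); // true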
JavaScript does the following thing:
"String".charCodeAt(); => 83
"S".charCodeAt(); => 83
"String".charCodeAt(0); => 83
The charCodeAt(a) method gets the char code at position a; the default position is 0.
If you compare "N" < "S" you get 78 < 83 => true.
For longer strings, JavaScript compares the char codes position by position until it finds a difference; it does not sum them.
So I can answer your question with yes: the character codes are what decide the result.

javascript comparison of strings

I have the following script
document.write("12" < "2");
which returns true. Any reason why? The documentation says that JavaScript compares strings numerically, but I don't see how "12" is less than "2".
JavaScript compares strings character by character until one of the characters is different.
1 is less than 2 so it stops comparing after the first character.
I believe it is doing a lexicographic comparison - the first char in string one is '1', which is less than the first char of string two, which is '2'. More about Lexicographic order here: http://en.wikipedia.org/wiki/Lexicographical_order
This is because the first character of "12" is "1", which comes before "2"; JavaScript string comparison is lexical/alphabetical rather than numeric, even though it can appear partially numeric, since "1" sorts ahead of "2".
You can, however, simply compare the numbers as numbers:
document.write(parseFloat("12") < parseFloat("2"));
Try:
document.write(parseInt("12", 10) < parseInt("2", 10));
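If you want a numeric-aware ordering without parsing the strings yourself, localeCompare (and Intl.Collator) accepts a numeric option:
"12".localeCompare("2", undefined, { numeric: true }); // positive: "12" sorts after "2"
"2".localeCompare("12", undefined, { numeric: true }); // negative
new Intl.Collator(undefined, { numeric: true }).compare("2", "12"); // negative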

how exactly do Javascript numeric comparison operators handle strings?

var i = ['5000','35000'];
alert((i[0] < i[1])?'well duh!':'fuzzy math?');
alert((Number(i[0]) < Number(i[1]))?'well duh!':'fuzzy math?');
What's happening here? In the first alert, the string "5000" evaluates as not less than "35000". I assumed JavaScript used Number() when comparing numeric strings, but apparently that's not the case. I'm just curious how exactly JavaScript compares numeric strings by default.
JavaScript compares strings by character value, whether the strings look like numbers or not.
You can see this in the spec, section 11.8.5, point 4.
'a' < 'b' and 'ab' < 'ac' are both true.
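The practical upshot for the original snippet: pass a numeric comparator to sort, since the default comparator converts elements to strings and compares them lexicographically:
const i = ['5000', '35000'];
[...i].sort(); // ['35000', '5000'] (lexicographic)
[...i].sort((a, b) => Number(a) - Number(b)); // ['5000', '35000'] (numeric)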
