I've seen examples of code that uses only operators and "" to perform complex string operations. Basically, the idea was that something like ((+"+")+"")[+""] gives you a letter N, etc. I forgot where I found it, and I'm having no luck finding proper google keywords. Does anyone have a link at hand?
Basically there are two main concepts used here:
making a Number out of a string, i.e. Number(str), whose shorthand is +str;
stringifying numeric values, i.e. String(n), whose shorthand is n + "".
Hence, if we look at the expression step by step, we'll see:
+"+"        // NaN, since "+" is not a valid number
NaN + ""    // "NaN"
+""         // 0
"NaN"[0]    // "N"
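Spelling those four steps out as runnable code makes the trick easy to verify:

```javascript
// Each intermediate value of ((+"+")+"")[+""], one step at a time.
const n = +"+";      // NaN: "+" cannot be parsed as a number
const s = n + "";    // "NaN": number-to-string coercion
const i = +"";       // 0: the empty string coerces to 0
const ch = s[i];     // "N": index 0 of "NaN"

console.log(ch);                 // "N"
console.log(((+"+")+"")[+""]);   // "N", the original one-liner
```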
There are a lot of things you can do in JavaScript in the same way. One funny example is provided in the following question: What are JavaScript's builtin strings?
Related
I have some old, uncommented code that uses JavaScript differently than I've ever used it. The following is doing math on strings.
if ((toFind > nextOptionText) && (toFind < lookAheadOptionText))
I've found questions like this one, which basically states that "a" < "b":
how exactly do Javascript numeric comparison operators handle strings?
However, in my example it is comparing against some special characters. In my code the parameters from above are:
if (("A" > "+") && ("A" < ">EINFUHRUNG ZUM"))
This evaluates to TRUE. In my case I need it to be FALSE, but I'm not asking how to make it false; I really want to understand what the developer who wrote this code was thinking, and how the above if statement works.
Obviously I'm also dealing with a foreign language (German); I'm pretty sure this code was written before the application became multi-lingual.
If there are other suggestions I should look into, please let me know (e.g. using the locale or doing a different type of comparison).
My quick test shows that this code evaluates to FALSE, as expected.
if (("A" > "+") && ("A" < ">EINFUHRUNG ZUM")) {
alert('true')
} else {
alert('false')
}
In general, the comparison is done as usual according to the character codes, therefore "A" > ">" and "A" > "+".
To compare strings with non-ASCII letters, you might find this reference useful.
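A quick way to see what the original comparison is doing is to look at the character codes directly; the locale-aware alternative would be localeCompare, where the "de" locale tag below is an assumption about this app's language:

```javascript
// Relational operators on strings compare UTF-16 code units,
// one character at a time, left to right.
console.log("A".charCodeAt(0)); // 65
console.log("+".charCodeAt(0)); // 43
console.log(">".charCodeAt(0)); // 62

console.log("A" > "+");               // true  (65 > 43)
console.log("A" < ">EINFUHRUNG ZUM"); // false (65 > 62, decided by the first character)

// For language-aware ordering (umlauts etc.), localeCompare is the
// usual tool; "de" is an assumed locale tag here.
console.log("Einfuhrung".localeCompare("Einführung", "de"));
```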
I'm facing a problem: string comparison in C# works a bit strangely:
"0".CompareTo("@") // is 1
And I'm very surprised by that, because the ASCII codes are:
ASCII '@' // 64
ASCII '0' // 48
If I compare chars or use String.CompareOrdinal, everything is fine:
'0' > '@' // false
String.CompareOrdinal("0", "@") // -16
And in JS it also works as expected:
"0" > "@" // false - in JavaScript
One more thing: I can't change the C# code; it uses CompareTo.
But I need the same sorting rules in JavaScript.
I can't find any solution smarter than replacing the '@' sign with '#', because its ASCII code is less than that of zero:
ASCII '#' // 35
Maybe somebody can explain why:
"0".CompareTo("@") // is 1
Or suggest a better workaround for making the comparison behave the same in JavaScript.
It's not strange, it's culture-specific. I'm not an expert in JS, but I guess that localeCompare may help you.
CompareTo() will return 1 if the first value is meant to be sorted after the second value (in ascending order), according to its call to String.Compare(firstValue, (String)secondValue, StringComparison.CurrentCulture), which considers 0 to come after @ in a list sorted for display, so you would have @foo, 0foo, 1foo, afoo, Afoo, foo.
In JavaScript, you can pass a function to Array.sort() that would mimic the behavior for the relevant C# culture.
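A comparator sketch along these lines shows the two orderings side by side; the sample strings are assumed test data, and the "en" locale tag is an assumption, since the exact locale-aware order depends on the engine's collation data and may not match a given .NET culture one-to-one:

```javascript
// Ordinal sort (UTF-16 code units) vs. a locale-aware sort via
// localeCompare, which approximates C#'s culture-sensitive CompareTo.
const items = ["@foo", "0foo", "1foo", "afoo", "Afoo"];

// Default sort compares code units: digits (48-57) before '@' (64)
// before uppercase (65-90) before lowercase (97-122).
const ordinal = [...items].sort();

// Locale-aware sort, passed to Array.prototype.sort as a comparator.
const cultural = [...items].sort((a, b) => a.localeCompare(b, "en"));

console.log(ordinal);  // ["0foo", "1foo", "@foo", "Afoo", "afoo"]
console.log(cultural);
```

Whether the locale-aware order matches the C# side closely enough would need to be checked against the actual culture the C# code runs under.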
Say I have a string whose value is already a number, e.g. var str = "1234". Now I want to convert it to a number.
I have seen two tricks on the internet so far,
Use the unary +: var num = +str
Use the multiplication operator *: var num = str * 1
I want to know which one is better in general.
As I saw from the comment of the accepted answer here: Converting Json Results to a Date, it seems like *1 is best to be avoided. Is this true and what is the reason behind it?
Fewer operations, basically.
The unary plus calls the internal ToNumber operation, whereas the multiplication operator calls ToNumber as well and then performs a math operation on the result.
Why do the extra steps?
http://www.ecma-international.org/ecma-262/6.0/#sec-unary-plus-operator
http://www.ecma-international.org/ecma-262/6.0/#sec-applying-the-mul-operator
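A quick sketch of the equivalence; both forms go through the same ToNumber conversion, so their results agree even in the edge cases:

```javascript
const str = "1234";

// Unary plus: one coercion, nothing else.
console.log(+str);       // 1234

// Multiplication: the same coercion, followed by a multiply by 1.
console.log(str * 1);    // 1234

// Edge cases behave identically, because the conversion step is shared.
console.log(+"");        // 0
console.log("abc" * 1);  // NaN
```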
I'm following Beginning JavaScript and learning about data type conversions, more specifically floats and integers.
In the Chrome console, when I try subtraction:
parseFloat("2.1" - "0.1");
=>2
but when I try the same with addition I get this:
parseFloat("2.1" + "0.1");
=>2.1
Can someone elaborate on why addition doesn't behave the way I thought it would?
Thanks in advance
Have a look at the results of the subtraction and addition before calling parseFloat:
"2.1" - "0.1"
=> 2
In this case, JavaScript has inferred that you want to subtract numbers, so the strings have been parsed for you before the subtraction operation has been applied. Let's have a look at the other case:
"2.1" + "0.1"
=> "2.10.1"
Here the plus operator is ambiguous. Did you want to convert the strings to numbers and add them, or did you want to concatenate the strings? Remember that "AB" + "CDE" === "ABCDE". You then have:
parseFloat("2.10.1")
Now, "2.10.1" is not a valid number, but parseFloat does its best: it parses the longest valid numeric prefix, "2.10", and returns 2.1.
I think what you really wanted to do is to parse the two numbers separately before adding or subtracting them. That way you don't rely on JavaScript's automatic conversion from strings to numbers, and any mathematical operation should work as you expect:
parseFloat("2.1") - parseFloat("0.1")
=> 2
parseFloat("2.1") + parseFloat("0.1")
=> 2.2
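The whole walkthrough above can be checked in one runnable snippet:

```javascript
console.log("2.1" - "0.1");         // 2: both strings are coerced to numbers
console.log("2.1" + "0.1");         // "2.10.1": + concatenates the strings instead
console.log(parseFloat("2.10.1"));  // 2.1: parses the longest valid numeric prefix
console.log(parseFloat("2.1") + parseFloat("0.1"));  // 2.2: parse first, then add
```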
What's the difference between
"2" * 1 + 5
and
parseInt("2") + 5
Is it just a way to write more readable code, or are there compatibility issues with the first form?
parseInt is used to grab integers from a string. Consider the following code:
var myString = "3 blind mice";
var myInteger = parseInt(myString); //3
JavaScript does automatic type conversion, so something like this works:
"2" * 1 + 5 //7
The string "2" gets converted to a number.
As noted above in the comments, parseInt takes an additional argument for the base.
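For example, passing the radix explicitly pins down base 10 and also lets you parse other bases:

```javascript
// parseInt reads a numeric prefix and stops at the first invalid character.
console.log(parseInt("3 blind mice", 10)); // 3: stops at the space
console.log(parseInt("08", 10));           // 8: explicit base 10
console.log(parseInt("ff", 16));           // 255: base 16
console.log(parseInt("mice", 10));         // NaN: no leading digits
```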
JavaScript has a lot of very weird rules about type conversion, and sometimes it's not exactly clear what JavaScript will do in every situation. Keep in mind that the + operator is also used for concatenation as well as addition.
If you're trying to explicitly convert something to a number, you can use the Number constructor provided by JavaScript. Considering the following:
var myString = "2";
var myNum = Number(myString); //2
console.log(typeof myNum); //number
Without the new keyword, Number can be used to convert strings to numbers. While parseInt does work, I am not sure it should be used for plain conversion; just use Number.
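The practical difference between the two shows up with trailing non-numeric characters:

```javascript
console.log(Number("42"));          // 42
console.log(Number("42px"));        // NaN: Number() must consume the whole string
console.log(parseInt("42px", 10));  // 42: parseInt reads the numeric prefix
console.log(Number(""));            // 0: quirk, an empty string converts to 0
```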
I don't think there are compatibility issues with using coercion in the first form, it is a language feature that should be widely supported.
However, since you have to add code to do the conversion either way (* 1 vs. parseInt), I would vote for parseInt from a style perspective, because it makes your intention clearer. It is hard enough sometimes to keep types straight in JavaScript without using implicit conversions.
Someone not familiar with your code, or with JavaScript, might also wonder what is going on with that first form.
Plus, as indicated in the comments, parseInt is faster, so it's a win all around.