I am trying to write code in JavaScript to convert binary-coded decimal into decimal and vice-versa.
How can I achieve this?
To encode and decode values using Binary Coded Decimal (BCD):
const dec2bcd = dec => parseInt(dec.toString(10), 16);
const bcd2dec = bcd => parseInt(bcd.toString(16), 10);
console.log(dec2bcd(42)); // 66 (0x42)
console.log(bcd2dec(66)); // 42
Explanation
Suppose you have the number 42. To encode this as binary-coded decimal, you place the '4' in the high nibble and the '2' in the low nibble. The most straightforward way of doing this is to reinterpret the string representation of the decimal number as hex. So convert the value 42 to the string '42', then parse this as hex to arrive at the number 66, which, if you examine it in binary, is 01000010b — and that indeed has a 4 (0100b) in the high nibble and a 2 (0010b) in the low nibble.
To decode, just format the number as a string using hexadecimal encoding, and then reinterpret this string as decimal.
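A minimal round-trip check makes this concrete (the two helpers above are restated so the snippet is self-contained):

```javascript
// Encode: reinterpret the decimal digit string as hex.
const dec2bcd = dec => parseInt(dec.toString(10), 16);
// Decode: format as hex, reinterpret the digit string as decimal.
const bcd2dec = bcd => parseInt(bcd.toString(16), 10);

// 42 encodes to 66, whose bits are 0100 0010: a 4 and a 2, one per nibble.
console.log(dec2bcd(42).toString(2).padStart(8, '0')); // "01000010"
console.log(bcd2dec(0b01000010));                      // 42
```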
Related
I am manually sending numbers to an arduino board as hexadecimal like this: sendToBoard(0xE)
I am now trying to get decimal numbers converted into hexadecimal, but I can only get strings
const number = 14
number.toString(16) //e --> string
Could I possibly get this 'e' as hexadecimal to send it to the board like sendToBoard(number) //number === e (in hex)
In JavaScript, numbers are implemented in double-precision 64-bit binary format (IEEE 754). So even if you write them in hexadecimal notation, under the hood they are stored in floating-point representation; a Number keeps no memory of the base it was written in.
If you have a number and you want the sendToBoard function to receive a number as its input, then just pass in the number:
function sendToBoard(number){
// you can later convert it to a string:
const str = '0x' + number.toString(16).toUpperCase();
}
Alternatively, if you have a string in a hexadecimal representation, and you want sendToBoard to receive a number type, you could do the following:
const number = parseInt('0xf', 16);
sendToBoard(number);
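To see that the notation makes no difference to the callee, here is a quick sketch (sendToBoard is stubbed out purely for illustration — the real function is the asker's):

```javascript
// A hex literal and a decimal literal with the same value are the same Number;
// the function cannot tell which notation the caller used.
function sendToBoard(number) {
  // Formatting back to a hex string, as in the answer above.
  return '0x' + number.toString(16).toUpperCase();
}

console.log(0xe === 14);       // true
console.log(sendToBoard(0xe)); // "0xE"
console.log(sendToBoard(14));  // "0xE"
```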
Hope this helps :)
Let's say I have a hexadecimal, for example "0xdc", how do I convert this hexadecimal string to a hexadecimal Number type in JS?
Literally just losing the quotes. The Number() constructor and parseInt() just converted it to an integer between 0 and 255, I just want 0xdc.
EDIT:
To make my point more clear:
I want to go from "0xdc" (of type String), to 0xdc (of type Number)
You can use the Number function, which parses a string into a number according to a certain format.
console.log(Number("0xdc")); // 220
JavaScript recognizes number formats by their prefix:
0x = Hexadecimal
0b = Binary
0o = Octal
If you want to convert the string to a number, you can parse it with 16 as the radix.
parseInt("0xdc", 16) // 220, which is the same value as 0xdc
TL;DR
#dhaker's answer of
parseInt(Number("0xdc"), 10) is correct.
In-Memory Number Representation
Both numbers 0xdc and 220 are represented in the same way in JavaScript, so
0xdc == 220 will return true.
The prefix 0x is only used to tell JavaScript that the digits that follow are hex.
So wherever you are passing 220 you can safely pass 0xdc, or vice-versa.
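The equivalence is easy to confirm in the console — the notation disappears after parsing:

```javascript
// All of these are the very same Number value; only the source text differs.
console.log(0xdc === 220);           // true
console.log(Number("0xdc") === 220); // true
console.log(typeof 0xdc);            // "number"
```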
String Format
Numbers are always displayed in base 10 unless you specify otherwise.
'0x' + Number(220).toString(16) gives '0xdc' if you want to print it as string.
In a nutshell
parseInt('0x' + Number(220).toString(16),16) => 220
I am using bitwise operations in javascript but I have noticed something that appears inconsistent.
Right now, I'm using the XOR (^) operation. When I do '0101'^'0001', I get 100, which seems to make sense since 5 ^ 1 = 4, and 4 = 0100 in binary.
But when I do '10001'^'01111', I get 9030, when I think I should get 11110.
The format is, to best that I can tell, the same; only the strings are different.
console.log(5^1); //4
console.log('0101'^'0001'); // 100 = 4
console.log(17^15); //30
console.log('10001'^'01111'); //9030...why? shouldn't it be '11110'?
Why is this code producing this result?
Now, If I do this:
console.log(('0b'+'10001')^('0b' + '01111')); //30
Why did I have to add '0b' to specify that the strings were binary sequences when doing bitwise ops on 17 and 15, but not with 5 and 1?
As has been pointed out already, you are not using binary numbers. Use the 0b prefix for binary numbers and toString(2) to convert back:
console.log((0b10001 ^ 0b01111).toString(2));
11110
The first example only appears to work because 1 in decimal is the same as 1 in binary. The result is the decimal number 100, though, not binary 100 (= 4): JavaScript doesn't store the number's original format internally.
You are using the ^ operator on strings. When using any numerical operator on a string JavaScript will implicitly convert the string to a number. Adding 0b tells JavaScript to treat your number as a binary or base 2 number.
Otherwise, by default they will be converted to decimal or base 10 values.
To do operations on binary values represented as strings, you have to convert your values to numerical base-2 values first.
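For strings that only exist at runtime, the 0b prefix is no help (it works in source literals, not in string concatenation in general); parsing with radix 2 is the reliable route. A small helper sketch (xorBinaryStrings is just an illustrative name):

```javascript
// Parse each operand as base 2, XOR the resulting numbers,
// then format the result back as a binary digit string.
const xorBinaryStrings = (a, b) =>
  (parseInt(a, 2) ^ parseInt(b, 2)).toString(2);

console.log(xorBinaryStrings('10001', '01111')); // "11110"
```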
You can verify in a calculator that 10001 ^ 1111, treated as decimal numbers, is 9030.
In binary 1111=15 and 10001=17. 15^17= 30 which is 11110 in binary.
101 XOR 1 is a coincidental case: in decimal, 101 XOR 1 = 100; in binary, 101 = 5, and 5 XOR 1 = 4, which is written out in binary as 100.
Consider the following btoa output
btoa(99999999999999999999)
"MTAwMDAwMDAwMDAwMDAwMDAwMDAw"
btoa(99999999999999999998)
"MTAwMDAwMDAwMDAwMDAwMDAwMDAw"
btoa("99999999999999999998")
"OTk5OTk5OTk5OTk5OTk5OTk5OTg="
btoa("99999999999999999999")
"OTk5OTk5OTk5OTk5OTk5OTk5OTk="
We see that btoa is unable to produce a unique encoding for a 20-digit int but was able to do so for a 20-digit string. Why is this?
My original guess is that since btoa is base 64 it can only encode something that is less than base 64, but why is it capable of encoding a 20 digit string instead?
Moreover, btoa seems unable to distinguish ints even below 9223372036854775807, which is 2^63 − 1:
btoa(9223372036854775302)
"OTIyMzM3MjAzNjg1NDc3NjAwMA=="
btoa(9223372036854775303)
"OTIyMzM3MjAzNjg1NDc3NjAwMA=="
Because of floating point imprecision. Remember, JavaScript doesn't have integers except temporarily during certain calculations; all JavaScript numbers are IEEE-754 double-precision floating point. 99999999999999999999 is not a safe integer value in IEEE-754 numbers, and in fact if you do this:
console.log(99999999999999999999);
...you'll see
100000000000000000000
The max safe integer (i.e., the largest integer that won't be affected by floating-point imprecision) is 9007199254740991, which is 2^53 − 1, available as Number.MAX_SAFE_INTEGER.
Since btoa accepts a string (when you pass it a number, the number just gets converted to string), just put quotes around your value:
btoa("99999999999999999999")
=> OTk5OTk5OTk5OTk5OTk5OTk5OTk=
Of course, if the value is the result of a math calculation, you can't do that. You'll have to change whatever it is that's calculating such large numbers, as they exceed the precise range of the number type.
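One hedged workaround, assuming the large value can be kept as a BigInt (or a string) end to end so it never passes through double-precision rounding: do the arithmetic in BigInt, then stringify before calling btoa. (btoa is a Web API, also available as a global in modern Node.)

```javascript
// BigInt literals keep every digit exactly; no rounding to 1e20 occurs.
const big = 99999999999999999999n;
// Encode the exact decimal digits, matching btoa("99999999999999999999").
const encoded = btoa(big.toString());
console.log(encoded); // "OTk5OTk5OTk5OTk5OTk5OTk5OTk="
```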
hex = Number(-59).toString(16)
hex is -3b
hex should be ffffffffffffffC5
Thanks for any help!
If the number is negative, the sign is preserved. For example, if the radix is 2, it returns the binary digits (zeros and ones) of the magnitude preceded by a - sign, not the two's complement.
This is how the toString() method of the Number type works: it doesn't output the two's complement.
In other words, the toString() method converts the number as if it were positive, displaying its hexadecimal representation; if the number is negative, it just puts a minus - in front of it.
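If the two's-complement form is what's wanted, one approach is to reinterpret the bits as unsigned first — the unsigned right shift gives a 32-bit view, and BigInt.asUintN gives a 64-bit view:

```javascript
// toString(16) alone keeps the sign rather than emitting two's complement.
console.log((-59).toString(16));                    // "-3b"

// >>> 0 reinterprets the value as an unsigned 32-bit integer.
console.log((-59 >>> 0).toString(16));              // "ffffffc5"

// BigInt.asUintN(64, ...) reinterprets a BigInt as unsigned 64-bit.
console.log(BigInt.asUintN(64, -59n).toString(16)); // "ffffffffffffffc5"
```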