Conversion safety for numbers received from JSON API - javascript

I have a backend that sends numbers with up to 19 digits before the decimal point and 6 digits after the decimal point. The backend is exact and correctly converts these digits into JSON like
{
"myNumber": 9999999999999999999.123456
}
On the frontend I use JavaScript and the standard fetch API response.json().
By default, myNumber is converted into JavaScript's number type by response.json(). Can there be any loss of precision during this conversion?
Later on I might convert the number type back to a string. Can there be any loss of precision during this conversion?
I need to understand which numbers can be safely converted from (a) the JSON representation to (b) JavaScript's number type and (c) back to a string. In which ranges will (a) and (c) be identical, and when will I run into issues?
I hope I've abstracted that away enough, but in case anyone is interested: the backend is a C# Web API with SQL Server, which uses DECIMAL(19,6).

Basically, use strings instead of JSON numbers. RFC 8259 states:
This specification allows implementations to set limits on the range
and precision of numbers accepted. Since software that implements
IEEE 754 binary64 (double precision) numbers [IEEE754] is generally
available and widely used, good interoperability can be achieved by
implementations that expect no more precision or range than these
provide, in the sense that implementations will approximate JSON
numbers within the expected precision. A JSON number such as 1E400
or 3.141592653589793238462643383279 may indicate potential
interoperability problems, since it suggests that the software that
created it expects receiving software to have greater capabilities
for numeric magnitude and precision than is widely available.
The numbers you're dealing with have significantly more digits than IEEE-754 double precision will preserve.
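You can see the loss directly in any JavaScript console; the nearest binary64 value to the number in the question is 10000000000000000000:

const parsed = JSON.parse('{"myNumber": 9999999999999999999.123456}');
console.log(parsed.myNumber);         // 10000000000000000000
console.log(String(parsed.myNumber)); // "10000000000000000000"
// Only integers up to 2^53 - 1 are guaranteed to round-trip exactly:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991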
If you use a string in the JSON like this:
{
"myNumber": "9999999999999999999.123456"
}
... then you can handle the conversion to/from any high/arbitrary-precision type yourself; it's completely in your control.
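For illustration, a minimal sketch of handling such a string yourself; the 10^6 scale factor matches the DECIMAL(19,6) column mentioned in the question, and BigInt is just one option for the arbitrary-precision side:

const body = '{"myNumber": "9999999999999999999.123456"}';
const data = JSON.parse(body);
console.log(data.myNumber); // "9999999999999999999.123456" -- exact round trip

// For arithmetic, scale to an integer and use BigInt (or a decimal library):
const [intPart, fracPart = ""] = data.myNumber.split(".");
const micros = BigInt(intPart + fracPart.padEnd(6, "0")); // value * 10^6
console.log(micros.toString()); // "9999999999999999999123456"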

Related

Why is Node.js automatically rounding my floating point?

I'm trying to write a function that fetches the number of decimal places after the decimal point in a floating-point literal, using the answer from here as a reference.
Although this seems to work fine when tried in browser consoles, in the Node.js environment while running test cases the precision is truncated to only 14 digits.
let data = 123.834756380650877834678
console.log(data) // 123.83475638065088
And the function returns 14 as the answer.
Why is this rounding happening? Is it default behavior?
The floating-point format used in JavaScript (and presumably Node.js) is IEEE-754 binary64. When 123.834756380650877834678 is used in source code, it is converted to the nearest representable value, which is 123.834756380650873097692965529859066009521484375.
When this is converted to a string with default formatting, JavaScript uses just enough digits to uniquely distinguish the value. For 123.834756380650873097692965529859066009521484375, this should produce “123.83475638065087”. If you are getting “123.83475638065088”, which differs in the last digit, then the software you are using does not conform to the JavaScript specification (ECMAScript).
In any case, the binary64 format does not have sufficient precision to preserve the information that the original numeral, “123.834756380650877834678”, has 21 digits after the decimal point.
The code you link to also does not and cannot compute the number of digits in an original numeral. It computes the number of digits needed to uniquely distinguish the value represented after conversion to binary64. For sufficiently short numerals without trailing zeros after the decimal point, this is the same as the number of digits after the decimal point in the original numeral. For others, it may not be.
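You can see both effects in a console; the default conversion gives the shortest round-tripping string, while toPrecision exposes more digits of the value actually stored:

const data = 123.834756380650877834678;
// The literal was rounded to the nearest binary64 at parse time, so no
// function can recover the original 21 fractional digits.
console.log(String(data));         // shortest string that round-trips
console.log(data.toPrecision(21)); // more digits of the stored value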
It is the default behavior of JavaScript, and it is the same in Node.js.
In JS, the default string representation of a number has at most 17 significant digits.
For more details, take a look here.

Hexadecimal Number in Javascript

I'm writing a small program using the serialport module of Node.js.
The docs for my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
However, I have the values stored in decimal notation.
Using myDecimalnumber.toString(16) returns hex notation, but as a string. I need it in Number format. Converting the result back into a number makes it decimal again, not hex!
I'm confused as to how to send the data to the chip. Please suggest!
Numbers are just numbers; they don't have a number base.
However, I have the values stored in decimal notation.
No, you don't. The only way it could be in decimal notation would be if it were a string, but if myDecimalnumber.toString(16) gives you a hex string, then myDecimalnumber is a Number, not a string (or you have a custom String.prototype.toString method, but I'm sure you don't).
Using myDecimalnumber.toString(16) returns hex notation, but as a string. I need it in Number format.
A number has no concept of a number base. That's a concept related to the representation of a number. That is, 10 decimal is 12 octal is A hex. They're all the same number. It's just their representation (e.g., how we write it down, its string form) that involves a number base.
The docs for my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
That seems really unlikely. If that's the case, it was written by a hyper-junior engineer or mistranslated from another language.
The chip probably requires an array of integers (numbers), but you'll need to refer to the documentation to see what size of integers it expects (8-bit, 16-bit, 32-bit, 64-bit, etc.). But it could be that it requires an array of characters with the data encoded as hex. In that case, you need to know how many digits per number it requires (likely values are 2, 4, etc.).
But again, fundamentally, number bases are only related to the representation of numbers (the way we write them down, or keep them in strings), not actual numbers. Numbers are just numbers; they don't have a number base.
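Putting that into practice, a minimal sketch assuming the chip wants raw bytes; the command bytes and the port variable here are hypothetical:

// Hex literals are just numbers: 0xA4 === 164
const command = [0x01, 0xA4, 0xFF];
const payload = Buffer.from(command); // a byte array, ready to transmit
// port.write(payload); // the serialport module accepts Buffers

// If the chip instead expects ASCII hex text, two digits per byte:
const asHexText = command.map(n => n.toString(16).padStart(2, "0")).join("");
console.log(asHexText); // "01a4ff"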

Implications of the unique binary representation for NaN in JavaScript

Would I be correct in assuming that the reason JavaScript only supports one binary representation for NaN is that it allows interpreters to speed up operations involving NaNs by checking for that specific bit pattern as a 64-bit integer, rather than relying on the FPU to handle them?
Stephen Canon's comment spurred me to run some timing tests, which I guess I should have done in the first place. I'm posting the results as an answer in case they're of interest to anyone else...
Using my Intel Core2 Quad CPU (2.33GHz) PC I compared
for(i=0;i<100000000;++i) x += 1.0;
in C++ and JavaScript with x first equal to 0.0 and then to NaN.
C++ x==0.0: 124ms
C++ x==NaN: 11888ms
JS x==0.0: 268ms
JS x==NaN: 432ms
So it seems that JavaScript is exploiting the fact that it already has to dynamically dispatch arithmetic operators to treat NaN as a special case.
I'm guessing that it's a hidden type that has no data, hence only one binary representation for NaN.
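For reference, a rough JavaScript version of the timed loop (absolute numbers will vary by engine and CPU):

function time(label, x) {
  const start = Date.now();
  for (let i = 0; i < 100000000; ++i) x += 1.0;
  console.log(label, Date.now() - start + " ms");
  return x; // return x so the additions aren't trivially dead code
}
time("x = 0.0:", 0.0);
time("x = NaN:", NaN);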

Do Javascript Numbers get represented as 64bit numbers in a 32bit browser?

I'm a bit confused about the size of a JavaScript number in a 32-bit browser. Is it still represented as a 64-bit number with a max value of 2^53?
The other answers couldn't be more wrong; it does depend on the engine.
In V8 (Google Chrome, Opera, Node.js) 32-bit:
Integers that fit in a 31-bit signed representation (from -1073741824 to 1073741823) are represented directly by embedding them in pointers.
Any other number is generally represented as a heap object that has a 64-bit double as a field for the numeric value (think of Java's Double wrapper). In optimized functions, such numbers can be temporarily stored directly on the stack and in registers. Also, certain kinds of arrays can store doubles directly, "permanently".
In V8 64-bit:
Same as 32-bit, except integers can now fit in a 32-bit signed representation (from -2147483648 to 2147483647) instead of 31-bit.
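Whatever the engine does internally, the observable arithmetic is plain binary64 on every platform, which you can verify:

console.log(2 ** 53);     // 9007199254740992
console.log(2 ** 53 + 1); // 9007199254740992 (9007199254740993 is not representable)
console.log(0.1 + 0.2);   // 0.30000000000000004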
Yes. A number in JavaScript is a double-precision floating-point number. It's the same regardless of the platform it runs on.
I suppose my answer lies in MDN's section on 64-bit integers.

Best way to deal with very large Long numbers in Ajax?

JavaScript represents all numbers as double-precision floating-point. This means it loses precision when dealing with numbers at the very highest end of the 64-bit Java Long datatype -- anything past about 17 digits. For example, the number:
714341252076979033
... becomes:
714341252076979100
My database uses long IDs and some happen to be in the danger zone. I could change the offending values in the database, but that'd be difficult in my application. Instead, right now I rather laboriously ensure the server encodes Long IDs as Strings in all ajax responses.
However, I'd prefer to deal with this in the Javascript. My question: is there a best practice for coercing JSON parsing to treat a number as a string?
You do have to send your values as strings (i.e. enclosed in quotes) to ensure that JavaScript will treat them as strings instead of numbers.
There's no way I know of to get around that.
JavaScript represents all numbers as 64-bit IEEE 754 floats.
If your integer can't be represented exactly in the 53-bit significand, it will be rounded, which is what happened here.
If you can change your database, change it to send integers that fit in 53 signed bits.
Otherwise, send them as strings.
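Standard JSON.parse has no option for this; its reviver only sees values after they have already been converted to Numbers. One workaround is to quote long digit runs in the raw text before parsing. This is a sketch, not robust JSON rewriting: the 16-digit threshold is an assumption, and it could misfire if digit runs of that length can appear in other contexts in your payloads.

const raw = '{"id": 714341252076979033, "name": "example"}';
// Quote any bare integer of 16+ digits so it survives as a string.
const safe = raw.replace(/([:\[,]\s*)(\d{16,})(?=\s*[,\}\]])/g, '$1"$2"');
const obj = JSON.parse(safe);
console.log(obj.id);               // "714341252076979033" (exact, now a string)
const idAsBigInt = BigInt(obj.id); // for arithmetic, if needed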
