I'm writing a small program using the serialport module of Node.js.
The docs of my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
However, I have the values stored in decimal notation.
Using myDecimalnumber.toString(16) returns the value in hex notation, but as a string. I need it in Number format. Converting the result back into a number makes it decimal again, not hex!
I'm confused as to how to send the data to the chip. Please suggest!
Numbers are just numbers, they don't have a number base.
However, I have the values stored in decimal notation.
No, you don't. The only way it could be in decimal notation would be if it were a string, but if myDecimalnumber.toString(16) gives you a hex string, then myDecimalnumber is a Number, not a string (or you have a custom String.prototype.toString method, but I'm sure you don't).
Using myDecimalnumber.toString(16) returns the value in hex notation, but as a string. I need it in Number format.
A number has no concept of a number base. That's a concept related to the representation of a number. That is, 10 decimal is 12 octal is A hex. They're all the same number. It's just their representation (e.g., how we write it down, its string form) that involves a number base.
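For example, one and the same Number can be rendered in any base; only the string changes:

const n = 10;
n.toString(10);    // "10"
n.toString(8);     // "12"
n.toString(16);    // "a"
parseInt("a", 16); // 10, the very same value back again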
The docs of my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
That seems really unlikely. If it's the case, it was written by a hyper-junior engineer or mistranslated from another language.
The chip probably requires an array of integers (numbers), but you'll need to refer to the documentation to see what size of integers (8-bit, 16-bit, 32-bit, 64-bit, etc.). But it could be that it requires an array of characters with the data encoded as hex. In that case, you need to know how many digits per number it requires (likely values are 2, 4, etc.).
But again, fundamentally, number bases are only related to the representation of numbers (the way we write them down, or keep them in strings), not actual numbers. Numbers are just numbers, they don't have a number base.
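If it does turn out to be raw bytes, a minimal sketch (assuming the pre-v10 serialport API; the device path, baud rate, and values are placeholders):

const SerialPort = require('serialport'); // v10+ exports { SerialPort } instead
const port = new SerialPort('/dev/ttyUSB0', { baudRate: 9600 });

// The decimal values you already have are the right numbers; just wrap them in a Buffer.
const values = [10, 31, 255]; // identical to 0x0A, 0x1F, 0xFF
port.write(Buffer.from(values)); // transmits the raw bytes 0x0A 0x1F 0xFF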
Related
I have a backend that sends numbers with up to 19 digits before the decimal point and 6 digits after the decimal point. The backend is exact and correctly converts these digits into JSON like
{
"myNumber": 9999999999999999999.123456
}
On the frontend I use JavaScript and the standard fetch API response.json().
By default, myNumber is converted into JavaScript's number type by response.json(). Can there be any loss of precision during this conversion?
Later on I might convert the number type back to a string. Can there be any loss of precision during this conversion?
I need to understand which numbers can be safely converted from (a) the JSON representation to (b) JavaScript's number type and (c) back to a string. In which ranges will (a) and (c) be identical, and when will I run into issues?
I hope I abstracted that enough away, but in case anyone is interested: The backend is C# Web API with SQL Server which uses DECIMAL(19,6).
Basically, use strings instead of JSON numbers. RFC 8259 states:
This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.
The numbers you're dealing with have significantly more digits than IEEE-754 double precision will preserve.
If you use a string in the JSON like this:
{
"myNumber": "9999999999999999999.123456"
}
... then you can handle the conversion to/from any high/arbitrary-precision type yourself; it's completely in your control.
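A quick illustration of the difference (the rounded value follows from IEEE-754 binary64):

JSON.parse('{"myNumber": 9999999999999999999.123456}').myNumber
// 10000000000000000000, precision is lost during parsing

JSON.parse('{"myNumber": "9999999999999999999.123456"}').myNumber
// "9999999999999999999.123456", exact, ready for a big-decimal library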
I'm trying to write a function that fetches the number of decimal places after the decimal point in a floating-point literal, using the answer from here as a reference.
Although this seems to work fine in browser consoles, in the Node.js environment, while running test cases, the precision is truncated to only 14 digits.
let data = 123.834756380650877834678
console.log(data) // 123.83475638065088
And the function returns 14 as the answer.
Why is this rounding off happening at all? Is it a default behavior?
The floating-point format used in JavaScript (and presumably Node.js) is IEEE-754 binary64. When 123.834756380650877834678 is used in source code, it is converted to the nearest representable value, which is 123.834756380650873097692965529859066009521484375.
When this is converted to a string with default formatting, JavaScript uses just enough digits to uniquely distinguish the value. For 123.834756380650873097692965529859066009521484375, this should produce “123.83475638065087”. If you are getting “123.83475638065088”, which differs in the last digit, then the software you are using does not conform to the JavaScript specification (ECMAScript).
In any case, the binary64 format does not have sufficient precision to preserve the information that the original numeral, “123.834756380650877834678”, has 21 digits after the decimal point.
The code you link to also does not and cannot compute the number of digits in an original numeral. It computes the number of digits needed to uniquely distinguish the value represented after conversion to binary64. For sufficiently short numerals without trailing zeros after the decimal point, this is the same as the number of digits after the decimal point in the original numeral. For others, it may not be.
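Both effects are easy to demonstrate. The literal is rounded once, at parse time, and any digit-counting function only ever sees the shortest round-trip string (the counting helper below is a sketch of what the linked code boils down to, not a copy of it):

// The stored value, printed exactly, shows where the literal's digits stopped surviving:
const data = 123.834756380650877834678;
console.log(data.toPrecision(48)); // 123.834756380650873097692965529859066009521484375

// Counting digits in the default string form can only ever see 14 of them:
function decimalPlaces(n) {
  const s = String(n);
  const i = s.indexOf('.');
  return i === -1 ? 0 : s.length - i - 1;
}
console.log(decimalPlaces(data)); // 14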
It is the default behavior of JavaScript, and I think it will be the same in Node.js.
In JS, a number retains at most 17 significant decimal digits.
For more details, take a look here.
Suppose I have a string of a non-decimal representation of a number beyond Number.MAX_SAFE_INTEGER. How would I get a BigInt of that number?
Were the string a representation of a decimal number, I'd just use BigInt(string), but the number is represented non-decimally.
Note: for my application, efficiency matters.
Edit: I'm looking for a general technique that works for arbitrary bases.
You could try one of the npm packages for large numbers, such as the big-number package on npm.
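If you'd rather stay with the native BigInt: bases 2, 8, and 16 come for free via the literal prefixes BigInt already accepts, and for arbitrary bases a sketch like the following (hypothetical helper, no input validation) keeps it reasonably fast by consuming several digits per BigInt multiply:

// Prefixed strings parse directly:
BigInt('0x' + 'ffffffffffffffffff'); // base 16
BigInt('0o' + '777');                // base 8
BigInt('0b' + '1010');               // base 2

// Arbitrary base 2-36: chunk the digits so most of the work happens in
// cheap Number arithmetic; 36^6 < 2^53, so each chunk converts exactly.
function parseBigInt(str, base) {
  const chunkSize = 6;
  const chunkBase = BigInt(base ** chunkSize);
  let result = 0n;
  let i = 0;
  for (; i + chunkSize <= str.length; i += chunkSize) {
    result = result * chunkBase + BigInt(parseInt(str.slice(i, i + chunkSize), base));
  }
  if (i < str.length) {
    const rest = str.slice(i);
    result = result * BigInt(base ** rest.length) + BigInt(parseInt(rest, base));
  }
  return result;
}

parseBigInt('zzzzzzzzzzzz', 36); // 4738381338321616895n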
I am using Node 6.x (npm 3.x) with restify (latest). If a JavaScript object contains a property set to an integer, by default it looks like restify.send() will serialize that integer into "low" and "high" parts -- presumably representing the low/high 32-bit components of a 64-bit integer.
How can I turn off this default behavior, so that integers are not encoded into low and high parts?
Thanks.
I can reproduce this behaviour when using integer. Is that what you're using to represent integer values that may exceed JavaScript's Number.MAX_SAFE_INTEGER?
If so, then you need to convert those integer instances to a proper JS number, otherwise they can't be represented as a numerical value in JSON:
Number(obj.intProperty) // or: obj.intProperty.toNumber()
HOWEVER: I assume there's a reason for you using integer. If the number represented by obj.intProperty is too big to be represented as a plain JS Number, converting it may yield invalid results (that's why the JSON representation of an integer is an object consisting of two 32-bit values).
EDIT: turns out that the issue was caused by the Neo4J driver's representation of 64-bit integers, as documented here: https://www.npmjs.com/package/neo4j-driver#a-note-on-numbers-and-the-integer-type
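For the Neo4j case, something along these lines should work (a sketch based on the driver docs linked above, which describe isInt, inSafeRange, toNumber, and toString on the driver's Integer type):

const neo4j = require('neo4j-driver').v1; // the 1.x driver exports its API under .v1

// Collapse a driver Integer to a JS number only when it is safe to do so.
function toPlainValue(value) {
  if (neo4j.isInt(value)) {
    return neo4j.integer.inSafeRange(value)
      ? value.toNumber()   // fits within Number.MAX_SAFE_INTEGER
      : value.toString();  // too big; keep it as a string
  }
  return value;
}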
JavaScript represents all numbers as double-precision floating-point. This means it loses precision when dealing with numbers at the very highest end of the 64-bit Java Long datatype -- anything past about 17 digits. For example, the number:
714341252076979033
... becomes:
714341252076979100
My database uses long IDs and some happen to be in the danger zone. I could change the offending values in the database, but that'd be difficult in my application. Instead, right now I rather laboriously ensure the server encodes Long IDs as Strings in all ajax responses.
However, I'd prefer to deal with this in the Javascript. My question: is there a best practice for coercing JSON parsing to treat a number as a string?
You do have to send your values as strings (i.e. enclosed in quotes) to ensure that JavaScript will treat them as strings instead of numbers.
There's no way I know of to get around that.
JavaScript represents all numbers as 64-bit IEEE 754 floats.
If your integer can't fit in the 53-bit significand, it will be rounded, which is what happened here.
If you do need to change your database, change it to send integers that fit within 53 bits (i.e. within Number.MAX_SAFE_INTEGER).
Otherwise, send them as strings.
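To make the trade-off concrete:

// A bare JSON number is rounded while parsing; nothing downstream can undo that:
JSON.parse('{"id": 714341252076979033}').id;   // 714341252076979100

// Quoted, the digits survive exactly, and BigInt is available if you need arithmetic:
const id = JSON.parse('{"id": "714341252076979033"}').id; // "714341252076979033"
BigInt(id); // 714341252076979033n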