Best way to deal with very large Long numbers in Ajax? - javascript

JavaScript represents all numbers as double-precision floating-point. This means it loses precision when dealing with numbers at the very high end of the 64-bit Java Long datatype -- anything beyond about 17 significant digits. For example, the number:
714341252076979033
... becomes:
714341252076979100
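This is easy to reproduce by pasting the literal into any JavaScript console:
// The literal exceeds 2^53, so it is stored as the nearest representable
// double and printed back in its shortest round-tripping decimal form.
console.log(714341252076979033);                        // 714341252076979100
console.log(714341252076979033 === 714341252076979100); // true -- both parse to the same double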
My database uses long IDs and some happen to be in the danger zone. I could change the offending values in the database, but that'd be difficult in my application. Instead, right now I rather laboriously ensure the server encodes Long IDs as Strings in all ajax responses.
However, I'd prefer to deal with this in the Javascript. My question: is there a best practice for coercing JSON parsing to treat a number as a string?

You do have to send your values as strings (i.e. enclosed in quotes) to ensure that Javascript will treat them as strings instead of numbers.
There's no way I know of to get around that.
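For example, a response shaped like the sketch below (field names made up for illustration) keeps the full value intact, because the client never converts the ID back to a number:
// Server sends the ID quoted, so JSON.parse leaves it as a string.
var response = JSON.parse('{"id": "714341252076979033", "name": "example"}');
console.log(response.id);            // "714341252076979033" -- full precision preserved
console.log(typeof response.id);     // "string"
// Compare with an unquoted ID, which is rounded during parsing:
console.log(JSON.parse('{"id": 714341252076979033}').id); // 714341252076979100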

JavaScript represents all numbers as 64-bit IEEE 754 floats.
The format has a 53-bit significand, so any integer larger in magnitude than 2^53 is rounded to the nearest representable double, which is what happened here.
If you do change your database, restrict it to integers that fit within 53 bits (magnitudes up to 2^53).
Otherwise, send them as strings.
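A quick way to check whether a given ID is still safe, using nothing but the standard Number helpers (ES2015+):
// Number.MAX_SAFE_INTEGER is 2^53 - 1, the largest integer guaranteed to be exact.
console.log(Number.MAX_SAFE_INTEGER);                  // 9007199254740991
console.log(Number.isSafeInteger(9007199254740991));   // true
console.log(Number.isSafeInteger(714341252076979033)); // false -- send this one as a string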

Related

What is optimal way of storing Javascript Number in PostgreSQL

I have numbers generated from JavaScript code and I want to store them in a PostgreSQL table. I have a legacy table where the whole JSON object is stored as a JSONB column, and in the new table I'd like to flatten the JSON into separate columns.
Ideally I want to avoid loss of precision as much as possible. In particular, I'd like to avoid turning JS integer numbers into floats and vice versa; inserting an integer and getting back a float is something I'd like to mitigate (if possible).
So far I've experimented with the DOUBLE PRECISION and NUMERIC types. I think NUMERIC is the better fit because the documentation states that, within the implementation limits, there is no loss of precision. On the other hand, DOUBLE PRECISION will probably be faster for numeric operations, and I plan to do a lot of statistical operations.
I am not sure which one to choose. What is the optimal or recommended PostgreSQL data type for maximum compatibility with the JavaScript Number type?
I am not a JavaScript expert, but from what I've found, JavaScript uses 64-bit floats. That is the same as the DOUBLE PRECISION type: 8 bytes on both sides.
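A small sketch of that point in plain Node (no database involved): a JS Number is exactly the 8-byte IEEE 754 double that DOUBLE PRECISION stores, so a value round-trips through that representation without loss, while integers are only exact up to 2^53.
const value = 0.1 + 0.2;                    // 0.30000000000000004
const buf = Buffer.alloc(8);
buf.writeDoubleLE(value);                   // the same 8 bytes DOUBLE PRECISION would hold
console.log(buf.readDoubleLE(0) === value); // true -- bit-for-bit round trip
console.log(Number.isSafeInteger(9007199254740993)); // false -- beyond 2^53, NUMERIC (or a string) is safer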

How to resolve the JSON.stringify() method overflow? [duplicate]

I have a WCF service operation that returns an object with long and List<string> properties. When I test the operation in a WCF application, everything works fine and the values are correct. However, I need to be able to call the service using jQuery and JSON format. The value of the long property apparently changes when I read it back in the OnSucceed function.
After searching I've found that JSON.stringify changes big values. So in code like this:
alert(JSON.stringify(25001509088465005));
...it will show the value as 25001509088465004.
What is happening?
Demo here: http://jsfiddle.net/naveen/tPKw7/
JavaScript represents numbers using the IEEE 754 double-precision (64-bit) format. As I understand it, this gives you 53 bits of precision, or fifteen to sixteen decimal digits. Your number has more digits than JavaScript can cope with, so you end up with an approximation.
Do you need to do maths operations on this big number? Because if it's just some kind of ID, you can return it as a string and avoid the problem.
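A short comparison (the property name is made up) showing the difference between serializing the value as a number and as a string:
// As a number, the literal is rounded before JSON.stringify ever sees it:
console.log(JSON.stringify({ id: 25001509088465005 }));   // {"id":25001509088465004}
// As a string, every digit survives the round trip:
console.log(JSON.stringify({ id: "25001509088465005" })); // {"id":"25001509088465005"}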

Hexadecimal Number in Javascript

I'm writing a small program using the serialport module of Node.js.
The docs of my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
However, I have the values stored in decimal notation.
Using myDecimalnumber.toString(16) returns the value in hex notation, but as a string. I need it in Number format. Converting the result back into a number makes it decimal again, not hex!
I'm confused as to how to send the data to the chip. Please suggest!
Numbers are just numbers, they don't have a number base.
However, I have the values stored in decimal notation.
No, you don't. The only way it could be in decimal notation would be if it were a string, but if myDecimalnumber.toString(16) gives you a hex string, then myDecimalnumber is a Number, not a string (or you have a custom String.prototype.toString method, but I'm sure you don't).
Using myDecimalnumber.toString(16) returns the value in hex notation, but as a string. I need it in Number format.
A number has no concept of a number base. That's a concept related to the representation of a number. That is, 10 decimal is 12 octal is A hex. They're all the same number. It's just their representation (e.g., how we write it down, its string form) that involves a number base.
The docs of my hardware chip specify that the transmitted data must be a byte array of hexadecimal numbers.
That seems really unlikely. If it's the case, it was written by a hyper-junior engineer or mistranslated from another language.
The chip probably requires an array of integers (numbers), but you'll need to refer to the documentation to see what size of integers (8-bit, 16-bit, 32-bit, 64-bit, etc.). But it could be that it requires an array of characters with data encoded as hex. In that case, you need to know how many digits per number it requires (likely values are 2, 4, etc.).
But again, fundamentally, number bases are only related to the representation of numbers (the way we write them down, or keep them in strings), not actual numbers. Numbers are just numbers, they don't have a number base.
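To make that concrete, here is a rough sketch (the values and the port.write call are placeholders; check your serialport version for the exact constructor options). The bytes you send are just numbers; hex is only one way of writing them down:
// Values you already have, written in decimal:
const decimalValues = [27, 255, 16];
console.log(decimalValues[0] === 0x1b);            // true -- same number, different notation
// A byte array suitable for sending over the wire:
const payload = Buffer.from(decimalValues);        // <Buffer 1b ff 10>
// port.write(payload);                            // send with your serialport instance
// If the protocol instead expects a hex *string* (e.g. "1BFF10"), build it explicitly:
const hexString = decimalValues.map(n => n.toString(16).padStart(2, '0')).join('').toUpperCase();
console.log(hexString);                            // "1BFF10"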

Does rounding values speed up JSON communication (ajax)?

Since I want to receive a lot of data multiple times per second with a $.ajax (or jQuery.getJSON) call, I wonder if it makes sense to round long float values (1.23242342344 ...) to a shorter version.
As far as I know it doesn't make a difference in most programming languages whether it's 2.2 or 2.202312323, since both take up the space of a float, but I'm not sure how JSON handles this -- maybe it's more like a string, and the string would get shorter with rounded values?
So, can I speed up JSON calls with rounded values?
Rounding values will make a difference, proportional to the amount of data you transfer.
The JSON you send over HTTP is just text, so numbers are transferred as their decimal string representation.
Therefore 12.3456789 takes 10 bytes, whereas 12 only needs 2 (one byte per character).
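A quick way to see the effect (field names invented for the example) is to round before serializing and compare the string lengths:
const raw     = { temperature: 2.202312323, humidity: 1.23242342344 };
const rounded = { temperature: Math.round(raw.temperature * 100) / 100,
                  humidity:    Math.round(raw.humidity * 100) / 100 };
console.log(JSON.stringify(raw));      // {"temperature":2.202312323,"humidity":1.23242342344}
console.log(JSON.stringify(rounded));  // {"temperature":2.2,"humidity":1.23}
console.log(JSON.stringify(raw).length, JSON.stringify(rounded).length); // 52 vs 35 characters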

How does number precision affect performance in JavaScript, or does it?

In JavaScript, there's only one type for all different kinds of numbers. Does the amount of decimals in the numbers used (precision) affect performance especially in JavaScript? If it does, how?
How about saving numbers in MongoDB: Do precise numbers take more space than less precise ones?
Generally no. There are some possible performance implications when a number doesn't fit in a 31-bit signed int.
A tour of V8: object representation explains:
According to the spec, all numbers in JavaScript are 64-bit floating point doubles. We frequently work with integers though, so V8 represents numbers with 31-bit signed integers whenever possible (the low bit is always 0; this helps the garbage collector distinguish numbers from pointers). So objects with the fast small integers elements kind only contain this type of number. If we want to store a fractional number or a larger integer or a special value like -0, then we need to upgrade the array to fast doubles. This involves a potentially expensive copy-and-convert operation, but it doesn't happen often in practice. fast doubles objects are still pretty fast because all of the numbers are stored in an unboxed representation. If we want to store any other kind of value, e.g., a string or an object, we must upgrade to a general array of fast elements.
Does the amount of decimals in the numbers used (precision) affect performance especially in JavaScript? If it does, how?
No. The number type in JavaScript is a 64-bit base-2 floating-point value and always has the same precision. The computer works on that data bit by bit, and it doesn't matter whether that data represents something that looks simple to a human like 1.0 or something seemingly complicated like 123423.5645632. In fact, for base-2 floats, the 'human' values are just as 'hard', because 1.1 is truly represented by a much longer number (something like 1.1000000000000000888). All this doesn't matter, because the computer truly operates on 64 ones and zeros. There are always some arcane exceptions in floating point, but those usually don't matter in practice.
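For instance, asking for more digits than the default display shows reveals the value the hardware actually works with:
console.log((1.1).toPrecision(20)); // "1.1000000000000000888" -- the double closest to 1.1
console.log(0.1 + 0.2);             // 0.30000000000000004
console.log((1.0).toPrecision(20)); // "1.0000000000000000000" -- exactly representable, same 64 bits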
How about saving numbers in MongoDB: Do precise numbers take more space than less precise ones?
Decimal numbers are stored as doubles (64-bit), and it doesn't matter whether that is 1.0 or 1.1221234423. Again, the number of bits is constant for these data types.
The same is true for ints, but MongoDB has support for both 32- and 64-bit ints. So NumberLong is indeed larger than regular 32-bit ints and just as large as doubles.
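As a rough sketch in the legacy mongo shell (assuming Object.bsonsize and the NumberInt / NumberLong wrappers are available in your shell), the stored size depends on the declared type, not on how many decimals the value happens to have:
Object.bsonsize({ n: 1.0 });                            // double: 8 bytes for the value
Object.bsonsize({ n: 1.1221234423 });                   // double: same 8 bytes
Object.bsonsize({ n: NumberInt(42) });                  // int32: 4 bytes for the value
Object.bsonsize({ n: NumberLong("9007199254740993") }); // int64: 8 bytes, same as a double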
