How to resolve the JSON.stringify() method overflow? [duplicate] - javascript

I have a WCF service operation that returns an object with long and List<string> properties. When I test the operation in a WCF application, everything works fine and the values are correct. However, I need to be able to call the service using jQuery and JSON format. The value of the long property apparently changes when I read it back in the OnSucceed function.
After searching I've found that JSON.stringify changes big values. So in code like this:
alert(JSON.stringify(25001509088465005));
...it will show the value as 25001509088465004.
What is happening?
Demo here: http://jsfiddle.net/naveen/tPKw7/

JavaScript represents numbers using the IEEE-754 double-precision (64-bit) format. As I understand it, this gives you 53 bits of precision, or fifteen to sixteen decimal digits. Your number has more digits than JavaScript can cope with, so you end up with an approximation.
Do you need to do maths operations on this big number? If it's just some kind of ID, you can return it as a string and avoid the problem.
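For illustration, a minimal sketch of the limit and of the string workaround; the input value and the rounded output are the ones reported in the question:

// The value exceeds Number.MAX_SAFE_INTEGER (2^53 - 1), so it is rounded to the
// nearest representable double before JSON.stringify even sees it.
console.log(Number.MAX_SAFE_INTEGER);              // 9007199254740991
console.log(JSON.stringify(25001509088465005));    // logs 25001509088465004 -- last digit changed

// If the value is only an identifier, send it as a string and the digits survive:
console.log(JSON.stringify("25001509088465005"));  // logs "25001509088465005" -- digits intact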

Javascript Decimal issues [duplicate]

Using the parseFloat method, I converted the string value from the DB to a float and multiplied it by 100, but the output looks odd. Here is the piece of code I've used:
parseFloat(upliftPer) * 100 // upliftPer is read from the DB and its value is 0.0099
When it is multiplied by 100 I get 0.9900000000000001; I expected 0.99, but some junk digits get appended. I also tried the same thing in the Chrome browser console and got the same result. What I need is for 0.0099 * 100 to give 0.99. I can't apply round / toFixed since I need more precision.
This is because JavaScript stores numbers internally as double-precision floats. There's always a certain degree of noise due to floating-point inaccuracy. Here's some information on that.
You can just round it using toFixed(x) with as many decimal places of precision as you want (or as many as JavaScript will allow you).
It is not specific to JavaScript, nor to any particular programming language.
It's caused by the conversion from decimal to binary floating-point representation: for many decimal fractions the conversion produces an infinitely repeating binary expansion, which has to be truncated because of the limited number of bits available for storing numbers (and thus the limited amount of memory in computers).
Check out the process of conversion, and take a look at this converter, which tells you how much error is introduced during the conversion.
As @sebasaenz pointed out in his answer, you can use toFixed(x) to round your number and get rid of the junk digits.
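A small sketch of the effect described above, plus the rounding workaround (the variable name and value are the ones from the question):

var upliftPer = parseFloat("0.0099");                  // value as read from the DB
console.log(upliftPer * 100);                          // 0.9900000000000001

// Rounding to a fixed number of decimal places removes the binary noise:
console.log((upliftPer * 100).toFixed(2));             // "0.99" (note: a string)
console.log(Math.round(upliftPer * 100 * 100) / 100);  // 0.99  (still a number)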

What is optimal way of storing Javascript Number in PostgreSQL

I have numbers generated from Javascript code and I want to store them in PostgreSQL table. I have legacy table where the whole JSON object is stored as JSONB type and in the new table I'd like to flatten the JSON to separate columns.
Ideally I want to avoid loss of precision as much as possible. Especially I'd like to avoid turning JS integer numbers into float numbers and vice versa. In other words inserting integer and getting back float is something I'd like to mitigate (if possible).
So far I've experimented with the DOUBLE PRECISION and NUMERIC types. I think NUMERIC is the better fit because the documentation states that, within the implementation limits, there is no loss of precision. On the other hand, DOUBLE PRECISION will probably be faster for numeric operations, and I plan to do a lot of statistical operations.
I am not sure which one to choose. What is the optimal or recommended PostgreSQL data type with regard to maximum compatibility with the JavaScript Number type?
I am not a JavaScript expert, but from what I found on the net, JavaScript uses 64-bit floats. That is the same as the DOUBLE PRECISION type: 8 bytes in both cases.
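A minimal sketch of the limits involved (none of these values come from the question): every JavaScript Number is already an IEEE-754 double, which is exactly what DOUBLE PRECISION stores, so a Number -> DOUBLE PRECISION -> Number round trip should not change the value; NUMERIC, by contrast, can hold more digits than a JavaScript Number can carry on the way back.

// The largest integer JavaScript can represent exactly:
console.log(Number.MAX_SAFE_INTEGER);            // 9007199254740991 (2^53 - 1)
console.log(Number.isSafeInteger(2 ** 53 + 1));  // false: larger integers are approximated
// Fractions are already binary-rounded inside JavaScript, before Postgres is involved:
console.log(0.1 + 0.2);                          // 0.30000000000000004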

In JavaScript, should stringified numbers be converted to integers before subtracting them? (best practices)

I recently wrote some code that grabs text from two separate elements and then subtracts them. To my surprise, I didn't have to convert them to integers first. I did a little looking around and it seems JavaScript converts strings to numbers when using the subtraction operator. Does anyone know if it is OK to leave them as strings, or whether for best practice they should first be converted to integers? And if so, why? Thank you.
example:
"10" - "6" = 4
[…] to my surprise, I didn't have to convert them to integers first […]
A couple more surprises, then:
JavaScript resolves many operations (such as arithmetic) by implicitly coercing incompatible values to a different type, instead of raising an exception. This makes JavaScript “weakly typed” to that extent.
There is no integer type built into JavaScript; the only number type is IEEE-754 floating-point.
So, your string values were coerced to floating-point values, in the context of the arithmetic operation. JavaScript didn't tell you this was happening.
This is a source of bugs that can remain hidden for a long time: if your string values happen to convert successfully to numbers, the operation will succeed even where you might expect those values to raise an error.
js> "1e15" - "0x45" // The reader might have expected this to raise an error.
999999999999931
The brief “Wat” presentation by Gary Bernhardt is packed with other surprising (and hilarious) results of JavaScript's implicit type coercion.
Does anyone know if it is OK to leave them as strings, or whether for best practice they should first be converted to [numbers]?
Yes, in my opinion you should do arithmetic only on explicitly-converted numbers, because (as you discovered) for newcomers reading the code, the implicit coercion rules are not always obvious.
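A short sketch of the explicit conversion recommended above (the element IDs are hypothetical placeholders, not from the question):

const a = document.getElementById("price-a").textContent;  // e.g. "10"
const b = document.getElementById("price-b").textContent;  // e.g. "6"

// Implicit coercion happens to work for subtraction...
console.log(a - b);              // 4

// ...but explicit conversion states the intent and makes bad input easy to detect:
const diff = Number(a) - Number(b);
console.log(diff);               // 4
console.log(Number("ten"));      // NaN -- check with Number.isNaN() before using it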

Does rounding values speed up JSON communication (ajax)?

Since I want to receive a lot of data multiple times per second with a $.ajax (or jQuery.getJSON) call, I wonder if it could make sense to round long float values (1.23242342344 ...) to a shorter version.
As far as I know, it doesn't make a difference in most programming languages whether it's 2.2 or 2.202312323, since both occupy the space of a float, but I'm not sure how JSON handles this. Maybe it's more like a string, and the string would get shorter with rounded values?
So, can I speed up JSON calls with rounded values?
Rounding values will make a difference, proportional to the amount of data you transfer.
All HTTP communication is done with strings, and JSON is a text format for transferring data.
Therefore 12.3456789 will take 10 bytes, whereas 12 will only need 2 (one byte per character).
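A quick way to see the size difference for yourself (the payload shape is made up for illustration):

const raw     = { value: 1.23242342344 };
const rounded = { value: Math.round(raw.value * 100) / 100 };  // 1.23

console.log(JSON.stringify(raw).length);      // 23 characters
console.log(JSON.stringify(rounded).length);  // 14 characters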

Best way to deal with very large Long numbers in Ajax?

JavaScript represents all numbers as double-precision floating-point. This means it loses precision when dealing with numbers at the high end of the 64-bit Java Long datatype -- anything beyond 2^53, roughly 16 significant digits. For example, the number:
714341252076979033
... becomes:
714341252076979100
My database uses long IDs and some happen to be in the danger zone. I could change the offending values in the database, but that'd be difficult in my application. Instead, right now I rather laboriously ensure the server encodes Long IDs as Strings in all ajax responses.
However, I'd prefer to deal with this in the Javascript. My question: is there a best practice for coercing JSON parsing to treat a number as a string?
You do have to send your values as strings (i.e. enclosed in quotes) to ensure that Javascript will treat them as strings instead of numbers.
There's no way I know of to get around that.
JavaScript represents all numbers as 64-bit IEEE 754 floats.
If your integer needs more than 53 bits, it will be rounded to the nearest representable value, which is what happened here.
If you can change your database, restrict the IDs to values that fit in 53 bits (up to Number.MAX_SAFE_INTEGER, 2^53 - 1).
Otherwise, send them as strings.
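A minimal sketch of the difference, using the ID from the question; the quoted form survives the round trip, and in environments that support BigInt it can be converted later if arithmetic is ever needed:

const asNumber = JSON.parse('{"id": 714341252076979033}');
console.log(asNumber.id);    // 714341252076979100 -- precision already lost during parsing

const asString = JSON.parse('{"id": "714341252076979033"}');
console.log(asString.id);    // 714341252076979033 -- exact, but now a string
// If arithmetic is needed and BigInt is available:
// console.log(BigInt(asString.id) + 1n);  // 714341252076979034n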
