On Wed, Apr 25, 2012 at 10:42 PM, Richard O'Keefe
<ok@cs.otago.ac.nz> wrote:
> Note that the conversion *IS* lossy in practice.
> If you send a JSON message to a Javascript program,
> or a Python program, or a Go program (if I am reading src/pkg/encoding/json/decode.go
> correctly) what you get will be a 64-bit float.
> The Jackson parser for Java uses Double for numbers with a '.' or 'e' by default,
> although it can be configured to use BigDecimal.
> If you want numbers outside the domain of finite 64-bit floats to travel
> unscathed through JSON, then you must control not only which languages are
> used at each end, but which versions of which libraries and how configured.
Right, for better or for worse, the absence of numeric semantics in the
JSON standard means that what a number means is up to the implementation(s)
involved, and the onus is on the user(s) to coordinate between them.
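
To make the lossiness concrete, here is what a stock decoder does in Python
(whose json module, like the decoders you mention, hands anything with a '.'
or an exponent to float by default); the literals are just examples I picked:

    import json

    # A 64-bit float carries only ~15-17 significant decimal digits,
    # so anything beyond that is silently rounded or overflows.
    print(json.loads('0.1000000000000000000000000001'))  # 0.1
    print(json.loads('9007199254740993.0'))              # 9007199254740992.0
    print(json.loads('1e400'))                           # inf
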
> I argued the other end of this in the mailing list for another language,
> saying that I wanted things that look like integers to be decoded as integers,
> and was stepped on hard. Some people found their programs much simpler if
> they always got the same kind of Number whatever the input looked like (in
> Jackson, a Number might be returned as an instance of any of five classes).
My view is that the only reasonable approach is to decode JSON numbers into
arbitrary-precision rationals, so that the interpretation is arguably
lossless (modulo the loss of lexical distinctions: 1.0, for example, can no
longer be distinguished from 1).
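
To make that concrete, here is a rough sketch of such a decoder in Python
(the helper name decode_exact is mine; Python's json module happens to
expose parse_int/parse_float hooks, so Fraction does all the work):

    import json
    from fractions import Fraction

    # Every JSON number, integral or not, becomes an exact rational.
    def decode_exact(text):
        return json.loads(text, parse_int=Fraction, parse_float=Fraction)

    doc = decode_exact('{"n": 9007199254740993, "x": 0.1, "big": 1e400}')
    print(doc['n'])    # 9007199254740993
    print(doc['x'])    # 1/10
    print(doc['big'])  # 1 followed by 400 zeros, exactly

    # The only thing lost is the spelling of the number:
    print(decode_exact('1.0') == decode_exact('1'))  # True
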
Alvaro