On Jan 15, 2009, at 12:32 PM, Kris Zyp wrote:

we can't switch JSON from double to decimal by
default, when decoding *or* encoding.
How do you switch to double or decimal by default on encoding? The
input defines it, not any default setting.

A JSON encoder in a current self-hosted or native ES3.1 implementation sees a number (binary double precision) and encodes it using JSON number syntax. If we add decimal, you want the encoder to stringify a decimal value as a JSON number too. That is a choice -- a design decision (and a mistake :-/).

The alternatives are to stringify or to throw, requiring a custom (non-default) hook to be used, as Bob's simplejson package allows.
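To make the hazard concrete, here is a sketch in plain ES. Since no decimal type exists yet, the decimal encoder is simulated by splicing exact digits directly into the JSON text; the point is what a double-based peer then does to those digits:

```javascript
// Hypothetical: an encoder writes a decimal's digits using JSON number
// syntax. We simulate that by splicing the digits into the text directly.
var decimalDigits = "0.12345678901234567890";   // 20 significant digits
var wire = '{"rate":' + decimalDigits + '}';    // looks like an ordinary number

// A peer without decimal support decodes it as a binary double...
var peer = JSON.parse(wire);

// ...and on re-encoding, the extra digits are silently gone.
console.log(JSON.stringify(peer));
console.log(JSON.stringify(peer) === wire);     // false: round trip broken
```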


3.3 is exactly the sum of 1.1 and 2.2 without errors as decimal math
produces

None of these numbers is exactly representable using binary finite precision. Depending on the JSON codec implementation, they may not even round trip to string and back -- see Bob's reply.
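For concreteness, the arithmetic in question, using today's binary doubles:

```javascript
// 1.1, 2.2, and 3.3 all round to nearby binary doubles, so the double
// sum is not the decimal sum:
var sum = 1.1 + 2.2;
console.log(sum);          // 3.3000000000000003
console.log(sum === 3.3);  // false
```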


You are saying there are latent hard-to-find bugs because people believe
that JSON somehow implies that the sum of {"p":1.1, "q":2.2} must be
3.3000000000000003?

I never wrote any such thing.

Please look at the previous messages again. If C1 uses double to decode JSON from S1 but C2 uses decimal, then results can differ unexpectedly (from S1's point of view). Likewise, encoding decimal using JSON's number syntax also breaks interoperation with implementations using double to decode and (re-)encode.
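A minimal sketch of that divergence in today's ES (C2's decimal side is only described in comments, since decimal doesn't exist yet):

```javascript
// S1 sends this payload:
var payload = '{"p":1.1,"q":2.2}';

// C1 decodes with doubles (today's JSON.parse), computes, re-encodes:
var c1 = JSON.parse(payload);
var reencoded = JSON.stringify({ sum: c1.p + c1.q });
console.log(reencoded);  // {"sum":3.3000000000000003}

// C2, decoding the same payload into decimals, would compute exactly and
// re-encode {"sum":3.3} -- a different answer for the same bytes, which
// S1 has no way to anticipate.
```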


If people are returning 3.3, then the argument
that JSON numbers are universally treated computed as binary is not
valid. Is there a less hand-wavy way of stating that?

I don't know what "treated computed as binary" means, even if I delete one of "treated" and "computed". JSON number syntax may be encoded from decimals by some implementations, and decoded into decimals too. This is not interoperable with implementations that transcode using doubles. Period, full stop.


I thought JSON serialization and typeof results could be considered
separate issues.

You brought up Dojo code examples, including Dojo's JSON codec, as evidence that defining typeof 1.1m == "number" would relieve you of having to change that codec while preserving correctness. I replied showing that the correctness claim is false in that case, and that the supposed relief from evolving the codec is in fact an obligation to evolve it.

We then talked more about JSON than typeof, but the two are related: in both JSON *implementations* and your proposed typeof 1.1m == "number" && typeof 1.1 == "number" world, incompatible number formats are conflated. This is a mistake.


You're arguing by assertion that rounding errors due to double's
finite binary precision, which are the most reported JS bug at
https://bugzilla.mozilla.org, are somehow insignificant when JSON
transcoding is in the mix. That's a bold assertion.
The issue here is relying on another machine to do a computation. I
have trouble believing that all these people that are experiencing
rounding errors are then using these client-side computations for
their server.

Please. No one wrote "all these people". We're talking about subtle latent and future bugs, and the likelihood of such bugs (vs. ruling them out by not conflating incompatible number types). Correctness is not a matter of wishful thinking or alleged "good enough" current-code behavior.


The compensation for rounding errors that we are
concerned are usually going to be kept as close to the error as
possible. Why would you build a client-server infrastructure around it?

People do financial stuff in JS. No medical equipment or rocket control yet, AFAIK (I could be wrong). I'm told Google Finance uses integral double values to count pennies. It would not be surprising if JSON transcoding were already interposed between parts of such a system. And it should be possible to do so, of course -- one can always encode bignums or bigdecimals in strings.
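A sketch of that string-encoding escape hatch; the field names here are made up for illustration:

```javascript
// Pennies fit exactly in integral doubles; exact decimal amounts travel
// as strings, so no digits depend on binary rounding:
var invoice = { currency: "USD", amountCents: 129999, amount: "1299.99" };

var wire = JSON.stringify(invoice);
var back = JSON.parse(wire);

console.log(back.amount);                  // "1299.99" -- digits intact
console.log(back.amountCents === 129999);  // true -- integral, hence exact
```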

What's at issue between us is whether the default encoding of decimal should use JSON's number syntax. If someone builds client-server infrastructure, uses JSON in the middle, and switches from double today to decimal tomorrow, what can go wrong if we follow your proposal and encode decimals using JSON number syntax? Assume the JSON is not in the middle of a closed network where one entity controls the version and quality of all peer software. We can't assume otherwise in the standard.


What should JSON.parse use then, if not double (binary)? JSON.parse
is in ES3.1, and decimal is not.
It should use double. I presume that if a "use decimal" pragma or a
switch was available, it might parse to decimal, but the default would
be double, I would think.

Good -- we had already agreed on decoding to double, but it's great to confirm this.


which breaks round-tripping, which breaks interoperation.
JSON doesn't round-trip JS, and it never will.

That's a complete straw man. Yes, NaN and the infinities won't round trip. But number syntax in JSON per the RFC, in combination with correct, Steele-and-Gay-conformant dtoa and strtod code (ftp://ftp.ccs.neu.edu/pub/people/will/retrospective.pdf), can indeed round-trip finite values. This should be reliable.
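A quick spot check of that round-trip guarantee, which holds in any ES implementation with correctly rounded number-to-string conversion:

```javascript
// Finite doubles survive JSON.stringify/JSON.parse round trips:
var samples = [0.1, 1.1, 2.2, 3.3000000000000003, 1e-308, Number.MAX_VALUE];
samples.forEach(function (x) {
  console.log(JSON.parse(JSON.stringify(x)) === x);  // true for each
});

// NaN and the infinities are the exception: JSON has no syntax for them,
// and JSON.stringify emits null in their place.
console.log(JSON.stringify(NaN));       // "null"
console.log(JSON.stringify(Infinity)); // "null"
```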


I presume that if a receiver had a "use decimal" pragma
they could count as opt-in to parsing numbers into decimal and then
you could round-trip decimals, but only if the sender was properly
encoding decimals as JSON numbers (decimals).

Yeah, only if. Receiver makes it wrong. Nothing in the over-the-wire data requires the receiver to use decimal, or fail if it lacks decimal support.

You wrote in your last message:

I am not asserting that JSON decoding should automatically convert JSON numbers to binary, only that JSON encoding should serialize decimals to numbers.

This is likely to create real bugs in the face of decoders that lack decimal. There is zero likelihood of such bugs if we don't incompatibly encode decimal using JSON number syntax when adding decimal to a future standard.


Encoding the decimals as strings is far worse.

Throwing by default -- in the absence of explicit opt-in by the encoder-user -- is far better.

It's still up to the producer of the data to worry about tagging the decimal type of the JSON number, or hoping that the data stays in its silo where it's implicitly typed as decimal. But we won't have accidents where, implicitly -- by default -- decimal users encode JSON data that is incorrectly decoded.
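That throw-by-default, opt-in-explicitly behavior can be sketched with today's replacer hook; Decimal here is a hypothetical wrapper class, and encode a hypothetical helper, since ES has no decimal type:

```javascript
// Hypothetical decimal wrapper -- stands in for a future decimal type.
function Decimal(digits) { this.digits = digits; }

// Throw by default; encode only when the encoder-user explicitly opts in.
function encode(value, decimalToJSON) {
  return JSON.stringify(value, function (key, v) {
    if (v instanceof Decimal) {
      if (!decimalToJSON) {
        throw new TypeError("no default JSON encoding for decimal");
      }
      return decimalToJSON(v);
    }
    return v;
  });
}

var data = { price: new Decimal("3.3") };

// Default: fail loudly instead of silently conflating number formats.
var threw = false;
try { encode(data); } catch (e) { threw = true; }
console.log(threw);  // true

// Opt-in: the producer chooses a string encoding, keeping the digits exact.
var wire = encode(data, function (d) { return d.digits; });
console.log(wire);   // {"price":"3.3"}
```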


We are not condemned to repeat history if we pay attention to what
went before. JSON implementations in future ES specs cannot by
default switch either encoding or decoding to use decimal instead
of number.
The decimal number has been around much longer than the computer. Are
you saying that a particular language type has more permanence?

I think you know exactly what I'm saying. One particular format is better than many (a lingua franca, as French was the common diplomatic language). And we are stuck with double today. So we cannot start encoding decimals as JSON numbers tomorrow. Please stop ducking and weaving and address this head on. If you really endorse "receiver makes it right", give a spirited and explicit defense.

/be
_______________________________________________
Es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss
