Hashing -0.0 and 0.0 the same has been discussed before, but it has an unfortunate interaction with sorting. See #9381 <https://github.com/JuliaLang/julia/issues/9381> and #18485 <https://github.com/JuliaLang/julia/issues/18485>, which I just opened.
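For comparison, here is the same tension sketched in Python, which already hashes -0.0 and 0.0 identically (this illustrates the general IEEE-754 issue, not Julia's actual hash/isless implementation):

```python
import math

# Python hashes -0.0 and 0.0 the same because they compare ==:
assert -0.0 == 0.0
assert hash(-0.0) == hash(0.0)

# Yet the two values are distinguishable via the sign bit:
assert math.copysign(1.0, -0.0) == -1.0
assert math.copysign(1.0, 0.0) == 1.0

# A comparison-based sort cannot separate them: neither -0.0 < 0.0
# nor 0.0 < -0.0 holds, so a stable sort leaves them where they were.
xs = sorted([0.0, -0.0])
assert math.copysign(1.0, xs[0]) == 1.0   # 0.0 is still first
assert math.copysign(1.0, xs[1]) == -1.0  # -0.0 is still last
```

So a hash that identifies the two zeros, combined with an order that does not, means equal-hashing values can land in either order after a sort, which is the interaction the issues above are about.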
On Tue, Sep 13, 2016 at 10:41 AM, Steven G. Johnson <stevenj....@gmail.com> wrote:
>
> On Tuesday, September 13, 2016 at 10:07:49 AM UTC-4, Fengyang Wang wrote:
>>
>> This is an intuitive explanation, but the mathematics of IEEE floating
>> point seems to be designed so that 0.0 represents a "really small positive
>> number" and -0.0 represents "exact zero" or at least "an even smaller
>> really small negative number"; hence -0.0 + 0.0 = 0.0. I never understood
>> this either.
>>
> For one thing, the signed zero preserves 1/(1/x) == x even when x is +Inf
> or -Inf, since 1/-Inf is -0.0 and 1/-0.0 is -Inf. More generally, when
> there is underflow (numbers get so small they can no longer be
> represented), you lose the value but you don't lose the sign. Also, the
> sign of zero is useful in evaluating complex-valued functions that have a
> branch cut along the real axis, so that you know which side of the branch
> you are on (see the classic paper "Much Ado About Nothing's Sign Bit"
> <https://people.freebsd.org/~das/kahan86branch.pdf> by Kahan).