On Thu, Oct 15, 2020 at 11:18 AM Steven D'Aprano <st...@pearwood.info> wrote:
>
> On Wed, Oct 14, 2020 at 03:33:22PM -0400, Random832 wrote:
>
> > That is nonsense. "exactly representable" is a plain english phrase
> > and has a clear meaning that only involves the actual data format, not
> > the context.
>
> Perhaps your understanding of plain English is radically different from
> mine, but I don't understand how that can be.
>
> The actual data format has some humongous limits (which may or may not
> be reachable in practice, due to memory constraints). It is obvious to
> me that "exactly representable" must take into account the current
> context:
>
> - If it didn't, then the context precision would be meaningless;
>   changing it wouldn't actually change the precision of calculations.
>
> - Decimal(1)/3 is Decimal('0.3333333333333333333333333333') by default,
>   a mere 28 digits, not MAX_PREC (999999999999999999) digits.
>

Neither 1/3 nor sqrt(2) can be *exactly represented* as a decimal
fraction. It doesn't matter what the precision is set to, there is
absolutely no way that they can be perfectly represented (other than
symbolically or as a fraction or something, which the decimal module
doesn't do).
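
To make that concrete, here's a small check using nothing but the standard
decimal module (Decimal and getcontext): no matter what the precision is set
to, 1/3 only ever comes back rounded.

    from decimal import Decimal, getcontext

    for prec in (5, 28, 100):
        getcontext().prec = prec
        q = Decimal(1) / Decimal(3)
        print(prec, q)       # 0.33333, then 28 threes, then 100 threes
        print(q * 3 == 1)    # False every time: the quotient was rounded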

OTOH, 2**X * 5**Y * Z can be exactly represented, for any integers X,
Y, and Z; but the precision required might exceed the module's limits.
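
The module even tells you which case you're in: the Inexact flag stays clear
when a result is exact and gets set when rounding happened. A quick sketch
with the standard localcontext/flags API:

    from decimal import Decimal, localcontext, Inexact

    with localcontext() as ctx:
        ctx.prec = 28
        print(Decimal(1) / Decimal(8), bool(ctx.flags[Inexact]))  # 0.125 False: exact
        ctx.clear_flags()
        print(Decimal(1) / Decimal(3), bool(ctx.flags[Inexact]))  # 0.333... True: rounded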

I don't really understand your complaint here. The plain English
interpretation of "exactly representable" is, within a margin of error,
a perfectly representable concept. (The "margin of error" here is
that, barring infinite RAM, there will always be *some* limit to the
precision stored.)

> (To be honest, I was surprised to learn that the context precision is
> ignored when creating a Decimal from a string. I still find it a bit odd
> that if I set the precision to 3, I can still create a Decimal with
> twelve digits.)
>

The context is used for arithmetic, not construction.
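
For anyone who finds that surprising, a two-line demonstration with the
standard decimal module: construction keeps all twelve digits even with
prec set to 3, but the first arithmetic operation (here a unary +) rounds.

    from decimal import Decimal, getcontext

    getcontext().prec = 3
    d = Decimal('123456789012')
    print(d)     # 123456789012 -- construction ignores the context
    print(+d)    # 1.23E+11     -- arithmetic applies the 3-digit precision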

All that being said, though, I still don't think the decimal module
needs an option for unlimited precision, even in cases where the result
could be truly exact. The potential for an unexpected performance hit is too high,
and the temptation to set this flag would also be very high. (Imagine
the Stack Overflow answers: "yeah, the decimal module is inaccurate by
default, just set this and it becomes accurate".)

ChrisA