On Thu, 7 Apr 2022 at 05:37, Oscar Benjamin <oscar.j.benja...@gmail.com> wrote:
>
> On Wed, 6 Apr 2022 at 17:48, Chris Angelico <ros...@gmail.com> wrote:
> >
> > On Thu, 7 Apr 2022 at 02:17, Christopher Barker <python...@gmail.com> wrote:
> > >
> > > On Wed, Apr 6, 2022 at 5:11 AM Oscar Benjamin 
> > > <oscar.j.benja...@gmail.com> wrote:
> > >> > I'd be curious to know what alternatives you see. When a user writes 
> > >> > `x + y` with both `x` and `y` instances of `decimal.Decimal`, the 
> > >> > decimal module needs to know what precision to compute the result to
> > >> > (as well as what rounding mode to use, etc.). Absent a thread-local 
> > >> > context or task-local context, where would that precision information 
> > >> > come from?
> > >
> > >> One possibility is to attach the context information to the instances
> > >> so it's like:
> > >
> > > That seems the obvious thing to me -- a lot more like we already have 
> > > with mixing integers and floats, and/or mixing different precision floats 
> > > in other languages (and numpy).
> >
> > Not so obvious to me, as it would require inordinate amounts of
> > fiddling around when you want to dynamically adjust your precision.
> > You'd have to reassign every Decimal instance to have the new
> > settings.
>
> Why do you need to "dynamically adjust your precision"?

Some algorithms work just fine when you start with X digits of
precision and then increase that to X+Y digits later on (simple
example: Newton's method for calculating square roots).
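
For instance, something like this rough sketch (the function name and the
exact precision bookkeeping are mine, not from anywhere official):

    from decimal import Decimal, getcontext

    def newton_sqrt(x, digits):
        # Sketch: square root by Newton's method, raising the precision
        # partway through.  Assumes x > 0; a real version would restore
        # the caller's precision afterwards.
        x = Decimal(x)

        getcontext().prec = 10              # cheap low-precision first pass
        guess = x / 2 if x > 1 else Decimal(1)
        for _ in range(10):
            guess = (guess + x / guess) / 2

        getcontext().prec = digits + 2      # raise the precision; the same
        prev = None                         # Decimal objects just carry on
        while True:
            nxt = (guess + x / guess) / 2
            if nxt in (guess, prev):        # converged (or bouncing on the last digit)
                break
            prev, guess = guess, nxt

        getcontext().prec = digits
        return +guess                       # unary plus re-rounds to `digits`

The same `guess` object keeps flowing through the arithmetic after the
precision changes; nothing has to be rebuilt or reassigned.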

> Note that round/quantize give you the control that is most likely
> needed in decimal calculations without changing the context: rounding
> to fixed point including with precise control of rounding mode.
>
> If you're otherwise using the decimal module in place of a scientific
> multiprecision library then I think it's not really the right tool for
> the job. It's unfortunate that the docs suggest making functions like
> sin and cos: there is no good reason to use decimal over binary for
> transcendental or irrational numbers or functions.
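
(Side note for anyone following along: the quantize/rounding control Oscar
mentions above looks like this:

    >>> from decimal import Decimal, ROUND_HALF_UP, ROUND_DOWN
    >>> Decimal('2.675').quantize(Decimal('0.01'), rounding=ROUND_HALF_UP)
    Decimal('2.68')
    >>> Decimal('2.675').quantize(Decimal('0.01'), rounding=ROUND_DOWN)
    Decimal('2.67')

i.e. fixed-point rounding per call, without touching any context.)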

Suppose you want to teach people how sin and cos are calculated. What
would YOU recommend? Python already comes with an arbitrary-precision
numeric data type. Do we need to use something else?
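
The recipe in the decimal docs is, from memory, roughly this kind of thing --
a plain Taylor series run with a couple of guard digits:

    from decimal import Decimal, getcontext

    def cos(x):
        # x is a Decimal, in radians.  Work with two extra digits of
        # precision, then re-round the result at the end.
        getcontext().prec += 2
        i, lasts, s, fact, num, sign = 0, 0, 1, 1, 1, 1
        while s != lasts:
            lasts = s
            i += 2
            fact *= i * (i - 1)
            num *= x * x
            sign *= -1
            s += num / fact * sign
        getcontext().prec -= 2
        return +s

which is exactly the kind of exercise I mean: bump the working precision,
run the series, re-round at the end.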

> > Also: what happens when there's a conflict? Which one wins? Let's say
> > you do "a + b" where the two were created with different contexts - do
> > you use the lower precision? the higher precision? What about rounding
> > settings?
>
> I suggested simply disallowing this. If I really care about having
> each operation use the right context then I'll be happy to see an
> error message if I mess up like this by forgetting to convert in the
> right place.

That would make the above exercise extremely annoying. I'm sure you'd
like it for your use case, but I would hate it for mine, and if it's
disallowed at the language level, that's about as global a choice as
it can ever be.

> It is possible to be explicit about which context you want to use by
> using context methods like ctx.divide. From a quick skim of the
> decimal docs I don't see a single example showing how to use these
> rather than modify/replace the global contexts, though. Conceptually
> using the context's divide method is appropriate since it is the
> operation (divide) that the context affects.
>
> >>> from decimal import Context
> >>> ctx = Context(prec=2)
> >>> ctx.divide(1, 3)
> Decimal('0.33')

So, not only does your proposal make things harder for some use cases,
it also sacrifices all use of operators? What's the advantage here?
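
Spelled entirely with context methods, even something as simple as
(a + b) / (c - d) turns into

    ctx.divide(ctx.add(a, b), ctx.subtract(c, d))

which is precisely the operator syntax Decimal exists to give us.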

> > >> Realistically do many users want to use many different contexts and
> > >> regularly switch between them? I expect the common use case is wanting
> > >> to do everything in a particular context that just isn't the default
> > >> one.
> > >
> > > I don't know that that's true in the least -- sure, for a basic script, 
> > > absolutely, but Python has become a large ecosystem of third party 
> > > packages -- people make a LOT of large systems involving many complex 
> > > third party packages -- the builder of the system may not even know a 
> > > package is using Decimals -- let alone two different third party packages 
> > > using them in very different ways -- it's literally impossible for the 
> > > developer of package A to know how package B works or that someone might 
> > > be using both.
> >
> > Indeed. But I don't hear people complaining that they need to have
> > per-module Decimal contexts, possibly since it's never actually a
> > module-by-module consideration.
>
> If packages A and B are using decimal module contexts without their
> users knowing then I should hope that each is very careful about
> messing with the global contexts to avoid interfering with each other
> as well as anything that the user does. If I wrote a library that does
> this I probably would use the context methods like ctx.divide(a, b)
> just to be sure about things.
>

Local contexts exist for a reason. They just aren't *module* contexts,
because that doesn't actually help anyone.
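
That's what decimal.localcontext() already gives us -- a scoped context
where the operators keep working:

    from decimal import Decimal, Context, localcontext

    with localcontext(Context(prec=4)):
        print(Decimal(1) / Decimal(3))    # 0.3333, only inside the block
    print(Decimal(1) / Decimal(3))        # back to the surrounding precision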

ChrisA
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/JG6ABBAU5LXOU7YFY6DCAJK7DSM465GS/
Code of Conduct: http://python.org/psf/codeofconduct/
