On 1 May, 18:36, Ondrej Certik <ond...@certik.cz> wrote:
> There is some misunderstanding (possibly on my side): I thought that
> the branch has removed the assumptions completely, thus it uses no
> assumptions. Then the idea was to use *local* assumptions, by using
> the Assume class.
>
> Can you clarify what you mean by "global assumptions" in this branch?

You have tests like:

def test_PDF():
    a = Symbol('a')
    x = Symbol('x')

    global_assumptions.add(Assume(a, Q.positive, True))
    global_assumptions.add(Assume(x, Q.real, True))

    exponential = PDF(exp(-x/a), (x,0,oo))
    exponential = exponential.normalize()
    assert exponential.pdf(x) == 1/a*exp(-x/a)
    assert exponential.cdf(x) == 1 - exp(-x/a)
    assert exponential.mean == a
    assert exponential.variance == a**2
    assert exponential.stddev == a

    global_assumptions.discard(Assume(a, Q.positive, True))
    global_assumptions.discard(Assume(x, Q.real, True))


To me it does not make sense to have one big set that stores
absolutely all assumptions.
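To illustrate the fragility (a toy sketch in plain Python, no SymPy; the names are made up): if a test forgets the discard step, its assumptions leak into everything that runs afterwards.

```python
# Hypothetical illustration of why one global set is fragile.
global_assumptions = set()

def test_exponential():
    global_assumptions.add(('a', 'positive'))
    # ... assertions that rely on the assumption ...
    # oops: the matching discard() was forgotten

def test_unrelated():
    # this test now silently runs under the leftover assumption
    return ('a', 'positive') in global_assumptions

test_exponential()
leaked = test_unrelated()
assert leaked  # the assumption escaped its test
```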

> I am aware that the code in sympy/assumptions also (besides other
> things) allows using global assumptions, and I stress that this is
> not the approach I am voting for: I am voting for using a local
> Assume (or AssumeContext), whatever you want to call it.

This adds a lot of verbosity compared to the old way.
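For comparison, the old way attaches the assumption at symbol creation, so there is nothing to register beforehand or discard afterwards:

```python
from sympy import Symbol

# Old way: the assumption travels with the symbol itself; there is no
# global set to populate before the test and clean up after it.
a = Symbol('a', positive=True)
x = Symbol('x', real=True)

assert a.is_positive is True
assert x.is_real is True
```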

> >> I would be interested in the community vote on this idea. I vote +1. I
> >> am aware that Ronan voted -1 last year. What do others think?
>
> > I don't think that any branch is ready to be merged now, so maybe
> > let's rather discuss the approach for future development. For
>
> Yes, I am voting for that approach in the branch. Of course it has to
> be rebased on top of the latest master (or redo from scratch, whatever
> is easier).
>
> > instance, what do you think about the local assumptions approach (see
> > the wiki page and issue 1884)? It's a hack, but I think it is much
> > better than using global assumptions.
>
> I looked at the code, most importantly here:
>
> https://github.com/vks/sympy/commit/bfc837b8859992c7ce43bd4967902bfce...
>
> and even though the hack is simple, it is still a hack, so my gut
> feeling tells me that this will bite us quite substantially in
> the future. (E.g., if you want to rewrite the core in C/C++, will
> this design still work?)

This is a good point; I took the liberty of adding it to the wiki. We
don't have to use local assumptions: the idea is that they allow a
more painless transition, and we can still replace every use of them
with more explicit context passing later. We could also implement a
special data structure that stores the local contexts and emulates
the same behavior, so that it can be implemented in C. That is,
however, much more work than this simple hack, where Python does the
hard part for us.

> However, apart from this hack, your approach seems identical to what I
> have in mind --- except that I would pass the AssumptionContext (that
> you embed automatically in the local scope) around by hand explicitly
> ("explicit is better than implicit").

I think even if it is a hack, it is a well-defined one, with clear
behavior. How do you imagine the example above with explicit
assumption passing? Something like

def test_PDF():
    a = Symbol('a')
    x = Symbol('x')

    ctx = AssumptionContext()
    ctx.add(Assume(a, Q.positive, True))
    ctx.add(Assume(x, Q.real, True))

    exponential = PDF(exp(-x/a), (x,0,oo), ctx)
    exponential = exponential.normalize()
    assert exponential.pdf(x) == 1/a*exp(-x/a)
    assert exponential.cdf(x) == 1 - exp(-x/a)
    assert exponential.mean == a
    assert exponential.variance == a**2
    assert exponential.stddev == a

?

This would need a lot of reimplementing.

Vinzent

-- 
You received this message because you are subscribed to the Google Groups 
"sympy" group.
To post to this group, send email to sympy@googlegroups.com.
To unsubscribe from this group, send email to 
sympy+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/sympy?hl=en.
