On Mar 31, 2010, at 3:15 PM, Vinzent Steinberg wrote:

> 2010/3/31 Aaron S. Meurer <asmeu...@gmail.com>:
>> Commutativity is treated differently from other assumptions in the core 
>> because the core needs to know about it for automatic simplification.  See, 
>> for example, the source of Mul.flatten().
>> 
>> Maybe this is related too:
>> 
>> In [6]: x = Symbol('x', commutative=False)
>> 
>> In [7]: x.assumptions0
>> Out[7]: {}
>> 
>> In [8]: x = Symbol('x', commutative=False, positive=True)
>> 
>> In [9]: x.assumptions0
>> Out[9]: {commutative: True, complex: True, imaginary: False, negative: 
>> False, nonnegative: True, nonpositive: False, nonzero: True, positive: True, 
>> real: True, zero: False}
>> 
>> This looks like two bugs: the first doesn't put commutativity in the 
>> assumptions dictionary, and the second silently overrides the commutativity 
>> option (though the symbol still seems to act non-commutative anyway).
>> 
>> Non-commutativity is a mess.  Is it worth it fixing it the way it is now, or 
>> should we just go ahead and make it work with the new assumptions?
> 
> Do you think that even the most trivial assumptions should be handled
> by the new system? I think yes; there might, however, be performance
> problems, but those can be fixed.

On the one hand, x being non-commutative should imply that x is not 
real/complex/etc. because those are commutative (we don't seem to even do that 
now).  On the other hand, how would you refine a non-commutative symbol?  I 
guess I could see the reverse: temporarily assume a symbol is commutative in 
order to simplify it, so something like

refine(x*y - y*x, Assume(x, Q.commutative))

Other than that, I don't see the new assumptions being really beneficial in 
this case.  But maybe there is something I am forgetting.

Also, like I said, commutativity is somewhat different from other assumptions 
because it has to be handled in the core somehow.  Part of the idea behind the 
new assumptions is to remove automatic simplification based on assumptions, 
like sqrt(x**2) == x when x is positive, and put them into refine, but that 
won't work in this case because we need to prevent Mul/Add from changing x*y - 
y*x into 0 at instantiation time (i.e., it's not an automatic simplification 
that we want to remove).
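
To make the instantiation-time point concrete, here is what the core does today (a small session, runnable against SymPy as it stands):

```python
from sympy import Symbol

# Non-commutative symbols: Mul/Add must not reorder factors, so x*y - y*x
# survives instantiation instead of collapsing to 0.
x = Symbol('x', commutative=False)
y = Symbol('y', commutative=False)
nc = x*y - y*x
print(nc)  # x*y - y*x

# With ordinary (commutative) symbols the same expression is canceled
# to 0 the moment it is built.
a = Symbol('a')
b = Symbol('b')
print(a*b - b*a)  # 0
```

That cancellation happens inside Mul/Add flattening, which is exactly why commutativity can't simply be deferred to refine().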

> 
> And we could make x.is_something a wrapper to ask(x, Q.something,
> assumptions). The problem is where to store the assumptions, which are
> currently generated by Symbol(). Tying assumptions to the symbol has
> the advantage that it is only valid for the local scope and afterwards
> forgotten, but one of the motivations of the new assumption system was
> to untie them.

I like this idea.  Maybe it doesn't necessarily have to be tied to Symbol.  If 
there is some kind of global assumptions table, or you are in some "with 
Assume(x, Q.something):" context, then x.is_something would just be a shortcut 
to look it up there.  It may make it seem to the user that the assumptions 
are stored in the symbol, but that doesn't really matter: it's just an 
implementation detail that users won't care about either way, unless they want 
to implement some assumptions or a special symbol type of their own.
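
A minimal sketch of what that shortcut could look like (all names here are invented for illustration, not SymPy API):

```python
# Hypothetical: make is_positive a thin lookup into a global assumptions
# table, so the familiar attribute survives while the storage moves out
# of the Symbol itself.
_global_assumptions = {}  # maps symbol name -> set of predicate names

class Sym:
    """Toy stand-in for Symbol; stores no assumptions of its own."""
    def __init__(self, name):
        self.name = name

    @property
    def is_positive(self):
        facts = _global_assumptions.get(self.name)
        if facts is None:
            return None  # nothing known about this symbol
        return True if 'positive' in facts else None

x = Sym('x')
print(x.is_positive)                    # None: nothing assumed yet
_global_assumptions['x'] = {'positive'}
print(x.is_positive)                    # True: found in the global table
```

The user never sees where the fact is stored; only someone implementing a custom symbol type would notice the difference.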

> 
> Maybe we could implement a set_local_assumption_context() call which
> sets some place up to be used by local calls to Symbol() to store
> assumptions and which is used by default by ask()?

Exactly.
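
Something along these lines, perhaps (a hedged sketch of the proposed call; every name here is invented, and a real implementation would live in the assumptions module):

```python
from contextlib import contextmanager

# A stack of scoped assumption dicts: Symbol() would write into the
# innermost scope, ask() would consult the stack, and leaving the scope
# forgets the assumptions, which preserves the locality of the current
# Symbol-based system.
_context_stack = [{}]

@contextmanager
def local_assumption_context():
    _context_stack.append({})
    try:
        yield _context_stack[-1]
    finally:
        _context_stack.pop()

def assume(name, predicate):
    _context_stack[-1].setdefault(name, set()).add(predicate)

def ask(name, predicate):
    # Search from the innermost scope outward.
    for scope in reversed(_context_stack):
        if name in scope:
            return predicate in scope[name]
    return None  # unknown

with local_assumption_context():
    assume('x', 'positive')
    print(ask('x', 'positive'))  # True inside the scope

print(ask('x', 'positive'))      # None: forgotten once the scope ends
```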

> 
> Vinzent

Aaron Meurer

-- 
You received this message because you are subscribed to the Google Groups 
"sympy-patches" group.
To post to this group, send email to sympy-patc...@googlegroups.com.
To unsubscribe from this group, send email to 
sympy-patches+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/sympy-patches?hl=en.
