[sage-devel] Re: problems with testing sandpiles.py on 6.1b2

2013-12-31 Thread Nils Bruin
On Tuesday, December 31, 2013 8:23:35 PM UTC-8, Volker Braun wrote:
>
> I also agree that it would be much better if the pexpect interfaces had 
> been written against a non-echoing tty. Of course, that will only work if 
> the subprocess doesn't turn echoing back on. The downside is, of course, 
> that you have to rewrite the way the pexpect interface works.
>

I tried: with Maxima it's easy to insert a self._pexpect.setecho(0) in the 
_start method. The echo line is explicitly read by a .readline(), and 
deactivating that read is enough to get the doctests to pass.
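
For reference, what setecho does can be sketched with only the standard pty and termios modules (a minimal POSIX-only sketch, using `cat` as a stand-in subprocess; this is not Sage's actual _start code):

```python
import os
import pty
import termios

# Spawn `cat` on a pseudo-terminal; `fd` is the master side.
pid, fd = pty.fork()
if pid == 0:  # child: cat copies stdin to stdout
    os.execvp("cat", ["cat"])

# Clear the ECHO bit in the local flags (essentially what pexpect's
# setecho(0) does), so input written to the master is no longer
# reflected back by the tty layer.
attrs = termios.tcgetattr(fd)
attrs[3] &= ~termios.ECHO
termios.tcsetattr(fd, termios.TCSANOW, attrs)

os.write(fd, b"1+2;\n")
out = os.read(fd, 1024)
# Only cat's own copy of the line comes back, not a tty echo of it,
# so the command text appears exactly once in the output:
print(out.count(b"1+2;"))  # -> 1
```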

For other interfaces (especially Singular) it's not so easy: the standard 
eval in interfaces/pexpect returns the string between the first "\n" and 
the last "\r", so it actually uses the echoed "\n" to delimit the 
returned result. Making that work without echo requires a bit more work 
than I have attempted so far.
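
Concretely, the delimiting trick described above can be sketched like this (hypothetical buffer contents, not Sage's actual eval code):

```python
# With echo on, the raw pty buffer starts with the echoed command line,
# so slicing between the first "\n" and the last "\r" isolates the result:
raw_with_echo = "print(f);\r\nx2^2-x0^2\r\n"
result = raw_with_echo[raw_with_echo.find("\n") + 1 : raw_with_echo.rfind("\r")]
print(result)  # -> x2^2-x0^2

# Without echo the leading command line is gone, and the same slice
# silently returns an empty string, which is why eval needs reworking:
raw_no_echo = "x2^2-x0^2\r\n"
broken = raw_no_echo[raw_no_echo.find("\n") + 1 : raw_no_echo.rfind("\r")]
print(repr(broken))  # -> ''
```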

It does seem that at least Singular, Maxima, and gp don't turn echoing back 
on. GAP does seem to turn it back on.

-- 
You received this message because you are subscribed to the Google Groups 
"sage-devel" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sage-devel+unsubscr...@googlegroups.com.
To post to this group, send email to sage-devel@googlegroups.com.
Visit this group at http://groups.google.com/group/sage-devel.
For more options, visit https://groups.google.com/groups/opt_out.


[sage-devel] Re: Conversions without mathematical sense between polynomial rings

2013-12-31 Thread john_perry_usm
I sometimes illustrate techniques of polynomial factorization in a class by 
starting with a polynomial in ZZ[x] and converting it to a polynomial in 
GF(insert_large_prime)[x]. I realize this is not an instance of two 
_finite_ fields, but (a) you mention "the characteristics are different", 
and (b) _are_ there similar applications for finite fields, when the 
characteristics are different?

Or am I misunderstanding the proposal?

john perry

On Tuesday, December 31, 2013 6:59:03 AM UTC-5, Jean-Pierre Flori wrote:
>
> Dear all,
>
> Currently one can obtain surprising results in Sage when converting 
> polynomials over finite fields (or elements of quotient rings of univariate 
> polynomial rings, even though that's not the primary concern of the ticket).
> See http://trac.sagemath.org/ticket/11239
> Basically the generators of the finite fields are just exchanged, even 
> when the characteristics are different.
> The changes suggested in this ticket would more or less only let a 
> coercion be used if there is one, and rant if the characteristics are 
> different or the base finite fields are not part of a common lattice when 
> there is no canonical embedding.
>
> Note though that the original disturbing conversion fits with the current 
> behavior described in the doc:
> * 
> http://www.sagemath.org/doc/reference/coercion/index.html#maps-between-parents
> where it's stated that
>
> "Conversions need not be canonical (they may for example involve a choice 
> of lift) or even make sense mathematically (e.g. constructions of some 
> kind)."
>
> Does anyone have a strong opinion about what we should let Sage do in such 
> a situation?
>
> Should we leave the old conversion when there is no coercion, even though 
> that might easily lead to wrong mathematical results for a careless user? Or 
> never look for a coercion unless the user explicitly asks for it (i.e. what 
> the situation is currently in Sage without the patches of the ticket)?
>
> Best,
>
> JP
>



[sage-devel] Re: problems with testing sandpiles.py on 6.1b2

2013-12-31 Thread Volker Braun
Yes, I agree that this is slightly different.

I also agree that it would be much better if the pexpect interfaces had 
been written against a non-echoing tty. Of course, that will only work if 
the subprocess doesn't turn echoing back on. The downside is, of course, 
that you have to rewrite the way the pexpect interface works.

The alternative is to get rid of pexpect altogether and patch the 
subprocess input to instead set up & read from a socket (and flush on 
command). I did some work in that direction but haven't had the time to 
finish it...

On Monday, December 30, 2013 7:09:19 PM UTC, Nils Bruin wrote:
>
> On Monday, December 30, 2013 9:09:38 AM UTC-8, Volker Braun wrote:
>>
>> That is exactly what I am talking about: The pexpect pattern must match 
>> at a point where the subprocess has stopped processing and is just waiting 
>> for further input. But if you match for a prompt, say, and then the 
>> subprocess spits out another space or return before it waits for input then 
>> the additional characters will end up in random places in the output stream.
> I still have trouble fitting that diagnosis to what I observe:
>
> Parent:
>
> P1: write(26, "print(sage698);", 15)= 15
> P2: select(0, NULL, NULL, NULL, {0, 0}) = 0 (Timeout)
> P3: write(26, "\n", 1)  = 1
>
> Child:
>
> C1: read(0, "print(sage698);\n", 1024)  = 16
> C2: write(1, "x2^2-x0^2", 9)= 9
> C3: write(1, "\n", 1)   = 1
> C4: write(1, "> ", 2)   = 2
>
> Parent:
>
> P4: read(26, "print(sage698);x2^2-x0^2\r\n> \r\n", 1000) = 30
>
> The fact that C1 ends in a newline suggests that both P1 and P3 have 
> completed before C1 executes. However, P4 suggests that it is seeing the 
> result of the echo of P1, then C2, C3, C4, then the echo of P3. That looks 
> more like a race condition between the pty echo and the output produced by 
> the child. I don't see how pexpect could avoid this from happening (and 
> it's rare indeed).
>
>>> Actually, it seems to me that the pty is being a bit careless: shouldn't 
>>> it make sure that it's done echoing before it actually delivers the 
>>> character to the process?
>>>
>>
>> There is no such guarantee, pexpect is akin to typing into a terminal. If 
>> echo is on then the typed key shows up immediately and in-between stdout.
>>
>
> Hm, I've never seen a key I typed end up being echoed *after* the next 
> prompt had been printed, and I thought this was because the prompt was 
> printed only *after* the input was received, hence *after* the echo had 
> completed. That is not quite the same scenario as two processes writing to 
> the same stream in an uncoordinated way.
>
> Possibly, if we merge P1 and P3 we can significantly reduce the 
> probability of this happening, since then P1+P3 will arrive together at the 
> pty, so it has a better chance of echoing them both.
>
> Alternatively, if we get rid of the echo altogether, then there is nothing 
> to have a race condition with, so this particular issue might be going away 
> completely.
>
> On #15440 it also seems to be an *echoed* space that gets in the way, so 
> it might help there too. Preliminary tests suggest:
>
> sage: import pexpect
> sage: child=pexpect.spawn("maxima")
> sage: child.expect("(%i.*) ")
> 0
> sage: child.sendline("1+2;")
> 5
> sage: child.readline()
> '1+2;\r\n'
> sage: child.readline()
> '(%o1)  3\r\n'
>
> whereas with echo turned off:
>
> sage: import pexpect
> sage: child=pexpect.spawn("maxima")
> sage: child.setecho(0)
> sage: child.expect("(%i.*) ")
> 0
> sage: child.sendline("1+2;")
> 5
> sage: child.readline()
> '(%o1)  3\r\n'
>
>
>



[sage-devel] Re: Conversions without mathematical sense between polynomial rings

2013-12-31 Thread Peter Bruin
Hi Simon,

Warning: I am partially playing "advocatus diaboli" here. 
>

That can be very useful!
 

> > so let me just say here that I think this 
> > principle (that conversions need not be canonical) shouldn't be pushed 
> > further than reasonable. 
>
> Yes, but for a rather relaxed notion of "reasonable". For example, you 
> couldn't even say that you have a "conversion from K[x]/(f) to 
> K[x]/(g)", since a conversion is not a map. It is a *partial* map, 
> hence, it often is the case that only polynomials of degree zero can be 
> converted and higher degree polynomials result in an error. 
>

Sure, that still counts as reasonable for me.
 

> >  For example, if K is a field and f, g in K[x] are 
> > polynomials, then applying a conversion K[x]/(f) -> K[x]/(g) is a 
> sensible 
> > thing to do if g divides f (the canonical map) or if f divides g (the 
> > lifting map).  However, I don't see why it is important to insist that 
> the 
> > composition of two such maps (e.g. K[x]/(f) -> K[x] -> K[x]/(g) when f, 
> g 
> > have nothing to do with each other) should again be a valid conversion 
> that 
> > can be invoked with the same ease. 
>
> Hang on. Nobody says that the composition of two conversions A->B and 
> B->C yields a conversion A->C (even if we forget for the moment that 
> conversions are not more than partially defined maps)! Composability 
> is an axiom for *coercions*, but not for *conversions*. Actually, 
> conversions have no axioms worth mentioning. 
>

Yes, I shouldn't have said "insist" above; maybe "suggest".  I don't know 
whether the fact that we have maps K[x]/(f) -> K[x]/(g) for arbitrary f, g 
(or Z/mZ -> Z/nZ for arbitrary m, n) was intended or just a side effect of 
not doing any checks and simply converting whenever the implementation 
allows it.  However, I'm really not convinced that it is easy for the 
average user to avoid the impression that such conversions have some 
mathematical meaning, just because Sage makes them so easy to perform.

Hence, it would be perfectly fine to have K[x]/(f)->K[x] and 
> K[x]->K[x]/(g) for *all* elements of the involved rings, but 
> no conversion from K[x]/(f)->K[x]/(g), or actually: Only conversion for 
> polynomials of degree zero.
>

Yes, that is more or less what I wanted to say.
 

> That said, do you really think one should do more or less expensive 
> operations (such as: compute gcd or a prime factorisation) to 
> verify that an easy conversion rule makes sense in a particular case? 
>

I don't know of an example where one has to do such expensive things; in 
the above examples, it is just a divisibility check.  In the examples of 
K[x]/(f) and Z/mZ, this divisibility check does look like a good sanity 
check to me.
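
In the Z/mZ analogue, that sanity check amounts to a single modulus operation (a hypothetical sketch with a made-up helper name, not Sage's implementation):

```python
def convert_zmod(value, m, n):
    """Map a residue mod m to a residue mod n, but only when the
    canonical projection Z/mZ -> Z/nZ exists, i.e. when n divides m."""
    if m % n != 0:
        raise ValueError("no canonical map Z/%dZ -> Z/%dZ" % (m, n))
    return value % n

print(convert_zmod(7, 12, 4))  # -> 3, via the projection Z/12Z -> Z/4Z
```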

You say it should not be pushed further than reasonable. Is it reasonable 
> to have an expensive test to decide whether or not to apply an easy 
> conversion rule? Recall: Coercions should obey axioms and thus expensive 
> tests may be needed. But why should one have expensive tests for something 
> that has no defined properties? How *could* you possibly test something 
> that has no defined properties?
>

It depends on how expensive the test is.  If I may play the devil's 
advocate for a moment: why would you want to *use* something that has no 
defined properties?
 

> > Hence I think that R(h), for R = 
> > K[x]/(g) and h in L[x]/(f), should only be allowed when K has a 
> canonical 
> > map to L and either f divides g (in L[x]) or vice versa. 
>
> Do you claim we should drop conversion entirely? After all, we also have a 
> *coercion* system, and coercions (as opposed to conversions) are supposed 
> to 
> be mathematically consistent. Do you think this would be reasonable? 
>

No, I am certainly not suggesting that conversion should go away.  Using 
conversion to find preimages in the maps Z -> Z/mZ or K[x] -> K[x]/(f) is 
something that makes a lot of sense.  But when it comes to being maximally 
permissive in allowing conversions (allowing for example direct conversions 
Z/3Z -> Z/2Z): despite some arguments for it (less checking, allowing 
people who really want to do that to save a few keystrokes) I don't see why 
it is a good thing.

If there is a coercion map then conversion must give the same result. 
> Everything else would be a bug.
>

At least there is no misunderstanding about that!

Best regards,

Peter



[sage-devel] Re: Conversions without mathematical sense between polynomial rings

2013-12-31 Thread Simon King
Hi Peter,

On 2013-12-31, Peter Bruin  wrote:
> I posted a comment at #11239,

Then perhaps I should answer there, but anyway, here goes.

Warning: I am partially playing "advocatus diaboli" here.

> so let me just say here that I think this 
> principle (that conversions need not be canonical) shouldn't be pushed 
> further than reasonable.

Yes, but for a rather relaxed notion of "reasonable". For example, you
couldn't even say that you have a "conversion from K[x]/(f) to
K[x]/(g)", since a conversion is not a map. It is a *partial* map,
hence, it often is the case that only polynomials of degree zero can be
converted and higher degree polynomials result in an error. Think of
QQ[x]->ZZ:
  sage: a = QQ['x'](1)
  sage: ZZ(a)
  1
but
  sage: a = QQ['x'](1/2)
  sage: ZZ(a)
  (this raises a TypeError)

>  For example, if K is a field and f, g in K[x] are 
> polynomials, then applying a conversion K[x]/(f) -> K[x]/(g) is a sensible 
> thing to do if g divides f (the canonical map) or if f divides g (the 
> lifting map).  However, I don't see why it is important to insist that the 
> composition of two such maps (e.g. K[x]/(f) -> K[x] -> K[x]/(g) when f, g 
> have nothing to do with each other) should again be a valid conversion that 
> can be invoked with the same ease.

Hang on. Nobody says that the composition of two conversions A->B and
B->C yields a conversion A->C (even if we forget for the moment that
conversions are not more than partially defined maps)! Composability
is an axiom for *coercions*, but not for *conversions*. Actually,
conversions have no axioms worth mentioning.

Hence, it would be perfectly fine to have K[x]/(f)->K[x] and
K[x]->K[x]/(g) for *all* elements of the involved rings, but
no conversion from K[x]/(f)->K[x]/(g), or actually: Only conversion for
polynomials of degree zero.

That said, do you really think one should do more or less expensive
operations (such as: compute gcd or a prime factorisation) to
verify that an easy conversion rule makes sense in a particular case?

You say it should not be pushed further than reasonable. Is it reasonable
to have an expensive test to decide whether or not to apply an easy
conversion rule? Recall: Coercions should obey axioms and thus expensive
tests may be needed. But why should one have expensive tests for something
that has no defined properties? How *could* you possibly test something
that has no defined properties?

> Hence I think that R(h), for R = 
> K[x]/(g) and h in L[x]/(f), should only be allowed when K has a canonical 
> map to L and either f divides g (in L[x]) or vice versa.

Do you claim we should drop conversion entirely? After all, we also have a
*coercion* system, and coercions (as opposed to conversions) are supposed to
be mathematically consistent. Do you think this would be reasonable?

>> Should we leave the old conversion when there is no coercion, even though 
>> that might easily lead to wrong mathematical results for a careless user? Or 
>> never look for a coercion unless the user explicitly asks for it (i.e. what 
>> the situation is currently in Sage without the patches of the ticket)?
>>
> I thought that conversion should always give the same result as a coercion 
> map whenever one exists; is there any (intended) situation where this is 
> not the case?

If there is a coercion map then conversion must give the same result.
Everything else would be a bug.

Best regards,
Simon




[sage-devel] Re: Conversions without mathematical sense between polynomial rings

2013-12-31 Thread Peter Bruin
Jean-Pierre Flori wrote:
 

> Currently one can obtain surprising results in Sage when converting 
> polynomials over finite fields (or elements of quotient rings of univariate 
> polynomial rings, even though that's not the primary concern of the ticket).
> See http://trac.sagemath.org/ticket/11239
> Basically the generators of the finite fields are just exchanged, even 
> when the characteristics are different.
> The changes suggested in this ticket would more or less only let a 
> coercion be used if there is one, and rant if the characteristics are 
> different or the base finite fields are not part of a common lattice when 
> there is no canonical embedding.
>
> Note though that the original disturbing conversion fits with the current 
> behavior described in the doc:
> * 
> http://www.sagemath.org/doc/reference/coercion/index.html#maps-between-parents
> where it's stated that
>
> "Conversions need not be canonical (they may for example involve a choice 
> of lift) or even make sense mathematically (e.g. constructions of some 
> kind)."
>
> Does anyone have a strong opinion about what we should let Sage do in such 
> a situation?
>
I posted a comment at #11239, so let me just say here that I think this 
principle (that conversions need not be canonical) shouldn't be pushed 
further than reasonable.  For example, if K is a field and f, g in K[x] are 
polynomials, then applying a conversion K[x]/(f) -> K[x]/(g) is a sensible 
thing to do if g divides f (the canonical map) or if f divides g (the 
lifting map).  However, I don't see why it is important to insist that the 
composition of two such maps (e.g. K[x]/(f) -> K[x] -> K[x]/(g) when f, g 
have nothing to do with each other) should again be a valid conversion that 
can be invoked with the same ease.  Hence I think that R(h), for R = 
K[x]/(g) and h in L[x]/(f), should only be allowed when K has a canonical 
map to L and either f divides g (in L[x]) or vice versa.  This should cover 
almost all practical uses, and in the rare remaining cases it seems better 
to require being a bit more explicit and doing the conversion in two steps.
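
The trouble with granting the composite the same ease of use shows up already in the Z/mZ analogue: lifting and projecting are each sensible, but their composite is not even additive (a sketch with made-up helper names):

```python
def lift(value, m):
    # Z/mZ -> Z: choose the representative in [0, m)
    return value % m

def project(value, n):
    # Z -> Z/nZ: the canonical projection
    return value % n

def composite(value):
    # Z/6Z -> Z -> Z/4Z: each step is sensible, the composite is not
    return project(lift(value, 6), 4)

a, b = 3, 5                               # elements of Z/6Z
lhs = (composite(a) + composite(b)) % 4   # sum of the images, in Z/4Z
rhs = composite((a + b) % 6)              # image of the sum, taken in Z/6Z
print(lhs, rhs)  # -> 0 2, so the composite is not a ring homomorphism
```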

> Should we leave the old conversion when there is no coercion, even though 
> that might easily lead to wrong mathematical results for a careless user? Or 
> never look for a coercion unless the user explicitly asks for it (i.e. what 
> the situation is currently in Sage without the patches of the ticket)?
>
I thought that conversion should always give the same result as a coercion 
map whenever one exists; is there any (intended) situation where this is 
not the case?

Peter



[sage-devel] Conversions without mathematical sense between polynomial rings

2013-12-31 Thread Jean-Pierre Flori
Dear all,

Currently one can obtain surprising results in Sage when converting 
polynomials over finite fields (or elements of quotient rings of univariate 
polynomial rings, even though that's not the primary concern of the ticket).
See http://trac.sagemath.org/ticket/11239
Basically the generators of the finite fields are just exchanged, even when 
the characteristics are different.
The changes suggested in this ticket would more or less only let a coercion 
be used if there is one, and rant if the characteristics are different or 
the base finite fields are not part of a common lattice when there is no 
canonical embedding.

Note though that the original disturbing conversion fits with the current 
behavior described in the doc:
* 
http://www.sagemath.org/doc/reference/coercion/index.html#maps-between-parents
where it's stated that

"Conversions need not be canonical (they may for example involve a choice 
of lift) or even make sense mathematically (e.g. constructions of some 
kind)."

Does anyone have a strong opinion about what we should let Sage do in such 
a situation?

Should we leave the old conversion when there is no coercion, even though 
that might easily lead to wrong mathematical results for a careless user? Or 
never look for a coercion unless the user explicitly asks for it (i.e. what 
the situation is currently in Sage without the patches of the ticket)?

Best,

JP
