[sage-devel] vector-valued functions

2010-05-03 Thread Jason Grout

I just uploaded a patch (needs review!) to

http://trac.sagemath.org/sage_trac/ticket/8866

that lets a user do some natural things with vector-valued functions.
I'm posting here to call for any feedback on the syntax:

f(x,y) = tuple or list

Here are some examples:

sage: T(r,theta)=[r*cos(theta),r*sin(theta)]
sage: T
((r, theta) |--> r*cos(theta), (r, theta) |--> r*sin(theta))
sage: T.diff() # Jacobian matrix
[   (r, theta) |--> cos(theta) (r, theta) |--> -r*sin(theta)]
[   (r, theta) |--> sin(theta)  (r, theta) |--> r*cos(theta)]
sage: diff(T) # Jacobian matrix
[   (r, theta) |--> cos(theta) (r, theta) |--> -r*sin(theta)]
[   (r, theta) |--> sin(theta)  (r, theta) |--> r*cos(theta)]
sage: T.diff().det() # Jacobian
(r, theta) |--> r*sin(theta)^2 + r*cos(theta)^2
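
The determinant presumably simplifies to the familiar result with the
usual methods, e.g. (illustrative, the exact method name may differ):

sage: T.diff().det().simplify_full()
(r, theta) |--> r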

sage: r(t)=[cos(t),sin(t)]
sage: parametric_plot(r(t), (t,0,2*pi))

sage: f(x,y)=x^2+y
sage: f.diff() # gradient
((x, y) |--> 2*x, (x, y) |--> 1)
sage: f.diff().diff() # Hessian matrix
[(x, y) |--> 2 (x, y) |--> 0]
[(x, y) |--> 0 (x, y) |--> 0]

sage: # multivariable 2nd derivative test
sage: f(x,y)=x^2*y+y^2+y
sage: f.diff() # gradient
((x, y) |--> 2*x*y, (x, y) |--> x^2 + 2*y + 1)
sage: solve(list(f.diff()),[x,y])
[[x == -I, y == 0], [x == I, y == 0], [x == 0, y == (-1/2)]]
sage: f.diff(2)  # Hessian matrix
[(x, y) |--> 2*y (x, y) |--> 2*x]
[(x, y) |--> 2*x   (x, y) |--> 2]
sage: f.diff(2)(x=0,y=-1/2)
[-1  0]
[ 0  2]
sage: f.diff(2)(x=0,y=-1/2).eigenvalues()
[-1, 2]
sage: # we have a saddle point

I made it so that the syntax

f(x,y) = tuple or list

makes a vector whose base ring is the ring of callable symbolic
expressions.  Are there any comments on this syntax?  I think it's
pretty natural, and it's not currently allowed.
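
For instance, with the patch one should get something like this (the
exact printed form of the parent is a guess, so treat it as
illustrative):

sage: T(r,theta) = [r*cos(theta), r*sin(theta)]
sage: T.parent()
Vector space of dimension 2 over Callable function ring with arguments (r, theta)
sage: T(1, pi/2)
(0, 1)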


I think this will make the computations in a calc 3 class much more 
natural, for example.


Thanks,

Jason




Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread Tim Joseph Dumol
A new spkg, 0.8.p2, with the bug fix is up at the ticket mentioned
below and needs review. So you were right regarding that problem --
zope.testbrowser's dependencies changed, and the download script
wasn't updated.

On Tue, May 4, 2010 at 12:05 PM, Tim Joseph Dumol  wrote:

> Actually, it seems that that's not the only problem, as J. Cremona noted:
> http://trac.sagemath.org/sage_trac/ticket/8861
>
> [snip]

Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread Tim Joseph Dumol
Actually, it seems that that's not the only problem, as J. Cremona noted:
http://trac.sagemath.org/sage_trac/ticket/8861

On Mon, May 3, 2010 at 11:59 PM, William Stein  wrote:

> On Mon, May 3, 2010 at 8:49 AM, Tim Joseph Dumol  wrote:
> > This isn't related to my new package includes. Jinja2 wasn't one of those
> > new packages. The problem is that SageNB is installed before Jinja2 is
> > installed, so it's more of a problem in the dependency script.
>
> Cool -- then that should be very easy to fix.   Thanks for debugging
> this, even though it wasn't what I thought (and I was wrong).
>
> William
>
> >
> > On Mon, May 3, 2010 at 11:01 PM, William Stein  wrote:
> >>
> >> Hi,
> >>
> >> This is now
> >>   http://trac.sagemath.org/sage_trac/ticket/8858
> >>
> >> William
> >>
> >> On Mon, May 3, 2010 at 7:57 AM, John Cremona 
> >> wrote:
> >> > Harald,  I made almost the same point earlier today (but in my case it
> >> > was sagenb building which tried to access the internet.  Which failed
> >> > as I was building overnight and had turned off my home internet
> >> > connection.)
> >> >
> >> > John
> >> >
> >> > On 3 May 2010 15:39, Harald Schilly  wrote:
> >> >> Hi, while watching the compilation of 4.4.1 I saw that it stopped
> >> >> compiling and waited for a package to download. I'm curious if this is
> >> >> intended or just a strange glitch. At least my idea of a self-contained
> >> >> source package is that it doesn't need to download software
> >> >> from the internet!
> >> >>
> >> >> ...
> >> >> creating 'dist/Sphinx-0.6.3-py2.6.egg' and adding 'build/bdist.linux-
> >> >> i686/egg' to it
> >> >> removing 'build/bdist.linux-i686/egg' (and everything under it)
> >> >> Processing Sphinx-0.6.3-py2.6.egg
> >> >> creating /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
> >> >> site-packages/Sphinx-0.6.3-py2.6.egg
> >> >> Extracting Sphinx-0.6.3-py2.6.egg to /scratch/scratch/schilly/sage/
> >> >> sage-4.4.1/local/lib/python2.6/site-packages
> >> >> Adding Sphinx 0.6.3 to easy-install.pth file
> >> >> Installing sphinx-build script to /scratch/scratch/schilly/sage/
> >> >> sage-4.4.1/local/bin
> >> >> Installing sphinx-quickstart script to /scratch/scratch/schilly/sage/
> >> >> sage-4.4.1/local/bin
> >> >> Installing sphinx-autogen script to /scratch/scratch/schilly/sage/
> >> >> sage-4.4.1/local/bin
> >> >>
> >> >> Installed /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
> >> >> site-packages/Sphinx-0.6.3-py2.6.egg
> >> >> Processing dependencies for Sphinx==0.6.3
> >> >> Searching for docutils==0.5
> >> >> Best match: docutils 0.5
> >> >> Adding docutils 0.5 to easy-install.pth file
> >> >>
> >> >> Using /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
> >> >> site-packages
> >> >> Searching for Jinja2==2.3.1
> >> >> Reading http://pypi.python.org/simple/Jinja2/
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >> Download error: [Errno 110] Connection timed out -- Some packages may
> >> >> not be found!
> >> >> Reading http://jinja.pocoo.org/
> >> >>
> >> >> And well, it waits the usual 2 minutes socket timeout in each line and
> >> >> the pocoo website is really down.
> >> >>
> >> >> H
> >> >>
> >> >
> >>
> >>
> >>
> >> --
> >> William Stein
> >> Professor of Mathematics
> >> University of Washington
> >> http://wstein.org
> >
> >
> >
> > --
> > Tim Joseph Dumol 
> > http://timdumol.com
> >
>
>
>
> --
> William Stein
> Professor of Mathematics
> University of Washington
> http://wstein.org
>



-- 
Tim Joseph Dumol 
http://timdumol.com


Re: [sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Tim Daly

I have also found that it has the side-effect you mention.
It makes debugging easier, if it is needed at all.
Hopefully this will also be true of the person who ends up
maintaining our code after we're gone.

Thanks for the permission. Your quote appears on the
documentation page of the Axiom website (which I won't
link here).

Tim

Bill Hart wrote:

> Ah you arrived right on cue. LOL!
>
> Ha ha, you can quote me if you want, but I have written a couple of
> literate programs in my life, so I'm hardly an expert.
>
> But I was surprised at how much difference it made to the debugging
> time.
>
> Bill.
>
> On May 3, 10:04 pm, Tim Daly  wrote:
>> [snip]


[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread rjf
Could I be agreeing with Tom?

Well, sort of.
If you are writing a program in the context of some on-going project,
trying to improve the program that does (say) multiplication, then it
is exactly relevant to compare your new program to the one you propose
to replace. And if you have yet another idea, you compare your new'
program, and new'' etc. That is, I think, what Bill is doing. So this
benchmarking is pretty much what you want to do, though if you are too
clever in choosing examples, you can make really really bad decisions.

For example, the modular polynomial GCD of Collins and Brown was PROVED
to be of complexity comparable to division, and clearly of far lower
complexity than (say) the next best at the time, the subresultant PRS.
Yet it was terribly terribly slow. Why? Wrong model of polynomials. The
complexity analysis assumed completely dense multivariate polynomials.
And if they were sparse, it didn't matter; it would make believe they
were dense.

So analysis AND benchmarks can be deceptive, even if not deliberately so.

(Eventually, GCD methods that accounted for sparseness took over.)

As for having some Sage-ist compare different systems, it can be pretty
hopeless. If you don't know about the canonical rational function
representation in Maxima, you will have very slow times. (Hint: any use
of the function "expand" is wrong.) So Maxima might seem slow, while
what the benchmark may show is that you really don't know enough about
how to use Maxima. For Maple, there are substantial subtleties in
timing, or at least there used to be. For instance, you cannot simply
time a command, because that includes an extra simplification, so you
have to write a little program and time that. And for Mathematica, you
can mess with two (or more?) kinds of arrays by accident. And you may
be comparing machine floats with MPFR floats with quad-precision
floats, UNINTENTIONALLY.

So my view is that if I have a good idea and I can make something work
neatly in my system (short program, clever implementation, new
algorithm, etc.), it is up to me to describe it in sufficient detail
that someone could redo it in some other (similarly equipped) system
and see if it represents an improvement or not.

That someone would be sufficiently expert in that other system, and
have enough interest in making that program neat, clever, etc., that
he/she would do a good job.

For something at the level of multiplying two integers, I, for one, am
not particularly keen on hacking. Not that it is uninteresting. Not
that it is solved...


There are maybe a few distinct cases of interest. These come to mind:

small integers (16 bit?), moderate (32 or sometimes 64 bit), ones
that fit in a float mantissa (52 bit),
small numbers X big numbers,
big numbers X equally big numbers,
numbers X powers of 2 (shift).

Also of course numbers just smaller and just larger than the critical
points for switching between algorithms, if there are a bunch.

There may also be cases of alignment of data on cache-line boundaries
or not.
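
(A toy Sage harness for exercising a few of these size classes; the
bit sizes and iteration count are illustrative choices, not rjf's:)

sage: import timeit
sage: for bits in [16, 52, 64, 10^5]:  # size classes from the list above
....:     a = ZZ.random_element(2^bits); b = ZZ.random_element(2^bits)
....:     print bits, timeit.timeit(lambda: a*b, number=1000)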





On May 3, 3:36 pm, Tom Boothby  wrote:
> > I've always been confused about people who present benchmarks of their
> > own code against more of their own code, rather than to have the
> > courage to compare against other people's code.
>
> I think this can be useful in some contexts.  It can "normalize away"
> the skill of the programmer and the amount of effort put into the
> code, which can make an effective comparison of two strategies /
> algorithms.  Also, if you consider something like MPIR, it can be
> tricky for an outsider to predict which algorithm gets used to
> multiply two integers -- good production code is rarely as simple as a
> straightforward implementation of a single algorithm.



Re: [sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Ralf Hemmecke
On 05/04/2010 12:16 AM, Bill Hart wrote:
[snip]

> But I was surprised at how much difference it made to the debugging
> time.

I have had the same experience. Literate programming is not only
beneficial for the people who read the literate program; if the matter
is complex enough, it's also good for the original programmer. (But
I'll shut up now. Some things one must discover for oneself.
Unfortunately there is always this hurdle to actually start trying
something out.)

Ralf

> On May 3, 10:04 pm, Tim Daly  wrote:
>> Bill Hart wrote:
[snip]
>>> Another thing I've been enjoying lately is literate programming.
>>> Amazingly it turns out to be faster to write a literate program than
>>> an ordinary program because debugging takes almost no time.



Re: [sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Tom Boothby
> I've always been confused about people who present benchmarks of their
> own code against more of their own code, rather than to have the
> courage to compare against other people's code.

I think this can be useful in some contexts.  It can "normalize away"
the skill of the programmer and the amount of effort put into the
code, which can make an effective comparison of two strategies /
algorithms.  Also, if you consider something like MPIR, it can be
tricky for an outsider to predict which algorithm gets used to
multiply two integers -- good production code is rarely as simple as a
straightforward implementation of a single algorithm.



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart


On May 3, 9:32 pm, rjf  wrote:
> Your comments are interesting:
> 1. Theorists tend to reject all papers that merely demonstrate working
> programs,
> in favor of papers that have proofs.  That is what has made the ISSAC
> conferences
> so boring and low in attendance.

Interesting observation. I haven't been doing this sort of thing long
enough to have an opinion on that.

>
> 2. I don't have a separate implementation of the Schoenhage-Strassen
> FFT, (for integer multiplication) but rely on
> GMP to choose that method if it makes sense.
>

That's fair enough. I merely meant that you could mention it. I
realise that GMP will use it.

> 3. Benchmarks are always easy to attack, and some of the most annoying
> comments from referees
> look like this:  (a) why did you use Lisp and not system XYZ?  If you
> used system XYZ, why did you not use
> version XYZ 8.2 instead of version 8.0?

I've always been confused about people who present benchmarks of their
own code against more of their own code, rather than to have the
courage to compare against other people's code.

>
> When I write a paper that (for example) suggests that method X is
> faster than method Y in Lisp,
> I am suggesting that maybe an expert in some other system will make a
> fair judgment for that other system.
>

It's really hard to know. Other systems implement some of the ideas
you implemented. I wouldn't have a clue whether the times you give are
good, bad or ugly.

> My suggestion is that, since I have provided full program listings,
> that if you wish to run the
> program in your favorite computer system, you go ahead and do so.  If
> you come up with a
> result that says you now have a superior method for your system, go
> ahead and install it.

That's entirely up to you. I just feel that your paper would have more
value if it actually supported the thesis that reasonable performance
can be gained with very short programs in a (particular) high level
language.

It's not that I see no value in your paper. To have these
implementations is useful for a Lisp programmer. I'm sure you also
completely enjoyed the challenge you set for yourself! :-)

>
> If you have comments that would improve the presentation, or fix
> typos, you can email me.

Of course.

> Thanks.
> RJF.
>
> On May 3, 11:29 am, Bill Hart  wrote:
>
> > [snip]

[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart
Ah you arrived right on cue. LOL!

Ha ha, you can quote me if you want, but I have written a couple of
literate programs in my life, so I'm hardly an expert.

But I was surprised at how much difference it made to the debugging
time.

Bill.

On May 3, 10:04 pm, Tim Daly  wrote:
> Bill Hart wrote:
> > [snip]
>
> Can I quote you on that in the Axiom system (which is moving toward
> being fully literate)?
>
> > [snip]



Re: [sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Tim Daly



Bill Hart wrote:
> That's actually a very interesting paper. I've recently been playing
> with Forth, which is a kind of "Lisp type language" (yeah I know you
> won't agree with that), based on a data stack. I also worked through a
> book on Lisp up to the point where macros were defined, as I wanted to
> understand how that was handled in Lisp. I actually "get" Lisp now,
> but it was a round about way that I got there. It's clearly not for
> everyone.
>
> I've also been experimenting with how short programs can be that still
> give reasonable performance. The answer is, amazingly short, if one
> spends a lot of time thinking about it before coding.
>
> Another thing I've been enjoying lately is literate programming.
> Amazingly it turns out to be faster to write a literate program than
> an ordinary program because debugging takes almost no time.

Can I quote you on that in the Axiom system (which is moving toward
being fully literate)?

> Anyhow, I'm going to read this paper of yours now.
>
> Bill.
>
> On May 3, 3:37 pm, rjf  wrote:
>> [snip]




[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread rjf
Your comments are interesting:

1. Theorists tend to reject all papers that merely demonstrate working
programs, in favor of papers that have proofs. That is what has made
the ISSAC conferences so boring and low in attendance.

2. I don't have a separate implementation of the Schoenhage-Strassen
FFT (for integer multiplication), but rely on GMP to choose that
method if it makes sense.

3. Benchmarks are always easy to attack, and some of the most annoying
comments from referees look like this: (a) why did you use Lisp and
not system XYZ? If you used system XYZ, why did you not use version
XYZ 8.2 instead of version 8.0?

When I write a paper that (for example) suggests that method X is
faster than method Y in Lisp, I am suggesting that maybe an expert in
some other system will make a fair judgment for that other system.

My suggestion is that, since I have provided full program listings, if
you wish to run the program in your favorite computer system, you go
ahead and do so. If you come up with a result that says you now have a
superior method for your system, go ahead and install it.

If you have comments that would improve the presentation, or fix
typos, you can email me.
Thanks.
RJF.




On May 3, 11:29 am, Bill Hart  wrote:
> [snip]

[sage-devel] Fwd: Reminder: Scientific Software Day 2010

2010-05-03 Thread William Stein
A heads up if you are in Texas... next week I will be giving a Sage tutorial.

-- Forwarded message --
From: TACC Announcements 
Date: Monday, May 3, 2010
Subject: Reminder: Scientific Software Day 2010
To: wst...@gmail.com


Reminder: Scientific Software Day 2010
From: Bob Garza

Scientific Software Day 2010

May 10, 2010  (Monday)
9:00 am - 4:00 pm
J.J. Pickle Research Campus
ROC 1.603 (Building 196)
10100 Burnet Rd.
Austin, TX 78758

The Texas Advanced Computing Center, in association with the Jackson
School of Geosciences at The University of Texas at Austin, is pleased
to present the 4th annual Scientific Software Day event. The purpose
of the event is to increase awareness of new scientific software and
to inform users of relevant and timely issues.

The day will include a number of short presentations related to the
developments in scientific software.

Keynote Address
Can you trust the code? An analysis of the software quality of global
climate models.
by Professor Steve Easterbrook of the University of Toronto

Tutorial
Project leader William Stein will present a tutorial on SAGE, a free
open-source alternative to Magma, Maple, Mathematica, and Matlab.

There is no cost for attending, but space is limited.  To register please visit:
http://www.tacc.utexas.edu/softwareday/




-- 
William Stein
Professor of Mathematics
University of Washington
http://wstein.org



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart
I didn't see any mention in your paper of the Schoenhage-Strassen FFT
for exact arithmetic. It would be faster than using CRT for a whole
lot of multiplications modulo small primes.

No number theorist would accept the results of "exact arithmetic" done
using FFTW or other floating point FFT's unless the FFT implementation
has proven accuracy bounds and a proof that these are never exceeded.
Certainly this is not the case for FFTW.

I did find the idea of splitting multivariate problems into smaller
problems until the exponents fit into a machine word interesting. For
some reason I hadn't thought of doing that. I guess one needs to be
careful that this doesn't just split the polynomial up into single
monomials to be multiplied.

I'd also not seen the radix tree idea before, though I probably should
have.

The results of the benchmarking are interesting, but rather
meaningless when not compared to other standard implementations on a
similar system. Your contention is that with short programs in a high
level language, these computations can be done quickly. How do we know
that the constant factor involved isn't 6000 compared to an
implementation in a low level language with a long program?

I realise this is probably not a final version as such, and so I'll
leave off mentioning the handful of minor typos I noted (e.g. you have
^10 instead of ^{10} somewhere).

Bill.


On May 3, 5:57 pm, Bill Hart  wrote:
> [snip]



Re: [sage-devel] Re: erf + solve

2010-05-03 Thread Burcin Erocal
Hi Ross,

On Sun, 2 May 2010 19:43:17 -0700 (PDT)
Ross Kyprianou  wrote:

> > You should add a new integrator function and register it in the
> > dictionary sage.symbolic.integration.integral.available_integrators.
> >
> > At some point we also need to come up with a protocol to allow these
> > functions to transform the input and pass it on to be processed by
> > the next one in the queue.
> 
> So I guess registering means adding (to
> sage.symbolic.integration.integral) the last line (shown here)
> 
> available_integrators['maxima'] = external.maxima_integrator
> available_integrators['sympy'] = external.sympy_integrator
> available_integrators['mathematica_free'] =
> external.mma_free_integrator
> available_integrators['sage'] = external.sage_integrator
> 
> and including a corresponding function to
> sage.symbolic.integration.external

You are on the right track. Sorry for not being more explicit before.

Note that most of this is new and we are just developing the necessary
framework as we go along. Your example is a good test case, so please
keep on trying, sending emails, and poking people (me) to work on this.

I suggest giving a more specific name to your function. Assuming that
you're working on the erf function, erf_integrator() might be a good
name.

You should definitely add your function to the list of available
integrators. This will let people use only your function if they want
to, by using the algorithm keyword argument of the top level integrate
function.

The new function you implement doesn't need to go in the file
sage/symbolic/integration/external.py. That file contains the
routines which call external systems for integration. I suggest putting
a new file in sage/symbolic/integration, or adding it somewhere close
to the definition of the erf() function. 

Later we can also consider letting functions automatically register
custom integration routines, similar to the way they register custom
numeric evaluation or differentiation methods. I don't really like this
idea, since the order we call these functions might be important.


Can you post some example code (your integrator function) so I have
something to experiment with?
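
For concreteness, a minimal sketch of the sort of function being
discussed might look like this, assuming the signature Ross proposes
below; the wildcard matching and the single known pattern are
illustrative, not from an actual patch:

from sage.all import SR, erf, exp, sqrt, pi

def erf_integrator(expression, v, a=None, b=None):
    # Toy "last chance" integrator: it only recognizes the one pattern
    #   integrate(exp(-t^2), t) = sqrt(pi)/2 * erf(t)
    # and defers to the rest of the queue otherwise.
    w = SR.wild(0)
    m = expression.match(exp(-w**2))
    if m is not None and bool(m[w] == v):
        F = sqrt(pi)/2 * erf(v)  # an antiderivative
        return F if a is None else F.subs({v: b}) - F.subs({v: a})
    raise NotImplementedError("pattern not handled")

Registered as available_integrators['erf'] = erf_integrator, it could
then be requested explicitly with integrate(exp(-x^2), x,
algorithm='erf').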


> I think the aim would be that anybody needing a known integral that is
> not caught by the other 3 integrators could patch it into this
> "last chance" "sage" integrator, which we have more control over than
> the other integrators. (Alternatively, can we patch sympy as an
> alternative and get an upstream update to their integrator instead?)
> 
> I've looked at the 1-2 year old threads in this list on integration and
> I guess most of the discussion has been implemented. It seems the
> schema above was implemented so that we use existing integrators
> rather than recreating another large internal integrator (is
> that correct?). If so, the integrals made available
> very quickly via patches to this new sage integrator would ideally
> find themselves implemented in maxima and/or sympy eventually and
> be dropped from this new sage integrator to keep it compact.

We are definitely aiming to implement symbolic integration natively in
Sage. There are some existing implementations even. It's just a matter
of getting things cleaned up and submitted to Sage. Since the work
is done for research, it can be hard to get them ready for public
consumption. :)

> Regardless of the responses to the above, is the following what I
> should implement?
> 1) Add to sage.symbolic.integration.integral.available_integrators...
> available_integrators['sage'] = external.sage_integrator

You should also add it to the lists on line 60 and 146 of
sage/symbolic/integration/integral.py.

> 2) Include a corresponding function (to return the integral result) in
> sage.symbolic.integration.external
> def sage_integrator(expression, v, a=None, b=None):
> ...


Thanks.

Burcin



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart
That's actually a very interesting paper. I've recently been playing
with Forth, which is a kind of "Lisp type language" (yeah I know you
won't agree with that), based on a data stack. I also worked through a
book on Lisp up to the point where macros were defined, as I wanted to
understand how that was handled in Lisp. I actually "get" Lisp now,
but it was a round about way that I got there. It's clearly not for
everyone.

I've also been experimenting with how short programs can be that still
give reasonable performance. The answer is, amazingly short, if one
spends a lot of time thinking about it before coding.

Another thing I've been enjoying lately is literate programming.
Amazingly it turns out to be faster to write a literate program than
an ordinary program because debugging takes almost no time.

Anyhow, I'm going to read this paper of yours now.

Bill.

On May 3, 3:37 pm, rjf  wrote:
> If you are not doing floating point arithmetic with machine
> arithmetic, but using MPFR, then you are sacrificing a huge amount of
> time.  You might as well be using rational arithmetic, or the kind of
> arithmetic that Collins once proposed, where the denominator is a
> power of 2.  Makes reducing to lowest terms relatively fast because
> the
> GCD is trivial.  Compare that to boosting the overall precision in
> MPFR to "big enough".
>
> If you want to read more about multiplying polynomials, you can read
> the (unpublished, unfinished, too-long) paper here:
>
> www.cs.berkeley.edu/~fateman/papers/shortprog.tex
>
> RJF



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart
There is such a module in Sage. It's just not the one we are talking
about here.

It's actually not necessary to use Collins' arithmetic to get good
speed. The algorithm I'm currently using can make use of a fast
Kronecker Segmentation algorithm and will actually work *faster* than
Collins' arithmetic in some cases. It doesn't just jack up the
precision when it is too low!

Bill.
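
For readers unfamiliar with the trick: Kronecker segmentation packs
the coefficients into one large integer, does a single big integer
multiplication, and unpacks the digits. A toy Sage version over ZZ[x]
with small nonnegative coefficients (the real thing over RR[x] must
also handle signs, coefficient scaling and careful bit-size bounds):

sage: def ks_mul(p, q, b):
....:     # pack at x = 2^b, multiply once, read off base-2^b digits
....:     N = p(2^b) * q(2^b)
....:     coeffs = []
....:     while N > 0:
....:         coeffs.append(N % 2^b)
....:         N = N >> b
....:     return coeffs
sage: R.<x> = ZZ[]
sage: p = 3*x^2 + 2*x + 1; q = x + 5
sage: ks_mul(p, q, 32) == (p*q).list()
True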

On May 3, 3:37 pm, rjf  wrote:
> [snip]



[sage-devel] Fwd: [mpir-devel] New MPIR-related project

2010-05-03 Thread Gonzalo Tornaria
On Fri, Apr 30, 2010 at 4:11 AM, Sergey Bochkanov
 wrote:
> Hello, William.
>
>> In Sage we (=mostly Gonzalo Tornaria) spent an enormous amount of time
>> writing two very efficient C functions, one to convert from mpz to
>> Python ints, and one to convert back.   Yes, writing this code is a
>> lot of work.  But no, the resulting code is not slow.  Just because
>> something is hard doesn't mean "we can't do it".
>> If you want this code, I bet Gonzalo would be OK with letting you have
>> it under another license (it's GPL v2+ right now); it's not long, just
>> tricky to write.
>
> That  would  be  nice.  Currently X-MPIR is licensed under LGPL, but I
> think  that  the  same  automatic codegen technology may be applied to
> other  projects  under  different  licenses. So having this code under
> something BSD-like would be important step in Python support.

Although I'm more of a GPL (v2) type, and I wouldn't write BSD code
unless I had a very good reason, I could consider relicensing some
interface code like this.
I've actually relicensed this code to LGPL v2+ (for gmpy), but I'm not
sure I can go farther than that, since my code was based on GMP code
(itself LGPL v2+ at the time).

See the forwarded message below for details.

Best,
Gonzalo

-- Forwarded message --
From: Gonzalo Tornaria 
Date: Tue, Sep 8, 2009 at 10:58 PM
Subject: Re: mpmath and files
To: William Stein 
Cc: casevh , mpm...@googlegroups.com


On Tue, Sep 8, 2009 at 8:54 PM, William Stein wrote:
> On Tue, Sep 8, 2009 at 4:38 PM, casevh wrote:
>> On Sep 7, 5:00 pm, William Stein  wrote:
>>> I think there is very complicated highly optimized code somewhere in Sage
>>> (http://sagemath.org) for conversion between Python longs and GMP
>>> mpz_t's.  If this is what you need, it could be dug up.
>>
>> I maintain gmpy, the Python wrapper for GMP or MPIR, and I would like
>> to look at including Sage's conversion code. I'm currently using
>> mpz_import and mpz_export and am always looking for faster code.
>>
>> Case Van Horsen
>>
>
> Hi,
>
> Look in the src and include directories here:
>
> http://sage.math.washington.edu/home/wstein/build/sage-4.1.1/devel/sage/c_lib/
>
>> Would there be any licensing issues since gmpy is LGPL2+?
>
> Yes, there is a problem since the code is licensed GPL2+.  However,
> I've cc'd the author -- Gonzalo Tornaria -- and I bet he would be
> willing to contribute a LGPL2+ version of the code to gmpy.  I
> certainly endorse that!

I can definitely contribute the mpz<-->pylong conversion code to gmpy
under LGPL2+.

You want to look at the files mpn_pylong.c and mpz_pylong.c. The high
level interface (mpz <-> pylong) is given by functions
mpz_get_pylong() and mpz_set_pylong(). There's also a function
mpz_pythonhash() which computes a python-compatible hash of mpz
integers.

The function mpz_get_pyintlong() is a variation of mpz_get_pylong()
written by David Harvey --- you should ask him about that one.

Note also that the base conversion code used in mpn_get_pylong() is
based on GMP code --- I think GMP was LGPLv2+ at the time, so it
should be fine.
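
For anyone wondering what these functions do in practice, this is the
boundary they sit on, as seen from a Sage session (illustrative):

sage: n = ZZ(2)^100000 + 12345
sage: m = int(n)          # mpz -> Python long (mpz_get_pylong)
sage: ZZ(m) == n          # Python long -> mpz (mpz_set_pylong)
True
sage: hash(m) == hash(n)  # mpz_pythonhash keeps the hashes compatible
True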

Let me know if you have any issues with using this code in gmpy.

If you use the code, can you add a comment to the files along these lines:

"""
Originally written for sage (http://sagemath.org) -- if you improve
these functions, please contribute them back to sage by posting to
sage-devel@googlegroups.com or sending me an email.
"""

Best, Gonzalo



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread Bill Hart
I finished rewriting and cleaning up my code. There is now a function
mpfr_poly_mul(mpfr_poly_t res, mpfr_poly_t pol1, mpfr_poly_t pol2,
ulong guard_bits).

If pol1 and pol2 are computed to sufficient precision in the first
place (i.e. a bit more than the precision you want in the end) and
guard_bits is set to some small quantity (usually 8 is enough even for
diabolical examples), then this function handles all the "interesting"
cases, including the challenges presented above. Certainly it can do
much better than classical multiplication, both speed-wise (it'll use
the FFT if the problem gets big enough) and accuracy-wise (in a
precisely defined way, which I can't be bothered going into here).

The guard_bits actually does some filtering on the output of the
classical and FFT multiplications which are used internally, removing
spurious data below some threshold. This allows it to handle the
examples rjf provided.

Basically the algorithm automatically chooses a Newton box and scaling
for each polynomial and uses the first few tricks in Joris' paper. All
the infrastructure is now there for anyone who wants to implement
Joris' algorithm in full (which should now be quite straightforward),
although I don't know any interesting examples where it is needed at
this point. Certainly the code now does everything *I* want it to do.
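
As a rough illustration of the sort of accuracy statement meant here
(plain Sage at 53 vs 300 bits, nothing to do with the flint2 code
itself): multiply at working precision, recompute at much higher
precision, and compare coefficientwise relative errors:

sage: R = RealField(53)['x']; S = RealField(300)['x']
sage: c1 = [2^(-40), 2^40, 1]; c2 = [2^40, 1, 2^(-40)]
sage: approx = (R(c1) * R(c2)).list()
sage: exact = (S(c1) * S(c2)).list()
sage: max(abs((RealField(300)(a) - e)/e) for a, e in zip(approx, exact))  # ideally ~ 2^-53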

I've cleaned up all the test code and profiling code too, and
committed to my flint2 repo. Again if anyone is interested in
implementing any other interesting RR[x] algorithms, let me know.

As for speeding it up further, Kronecker segmentation, and even direct
use of an integer FFT, could be used. Karatsuba and Toom-Cook
algorithms could also be implemented. But I have no plans to work on
those improvements just now. They could be added quite easily to the
inner routine _mpfr_poly_mul_inplace, which finally does the actual
multiplication once all the other jazz has been taken care of. It is
defined in mpfr_poly/mul.c.
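
For reference, the classical part of such an inner routine is just the
schoolbook product; a minimal version (my sketch, not the flint2 code)
looks like this:

#include <mpfr.h>

/* Schoolbook product: res[k] += sum over i+j=k of a[i]*b[j].
   res must have length m + n - 1 and be initialised to zero. */
static void classical_mul(mpfr_t *res, mpfr_t *a, long m,
                          mpfr_t *b, long n, long prec)
{
    mpfr_t t;
    long i, j;

    mpfr_init2(t, prec);
    for (i = 0; i < m; i++)
        for (j = 0; j < n; j++) {
            mpfr_mul(t, a[i], b[j], GMP_RNDN);
            mpfr_add(res[i + j], res[i + j], t, GMP_RNDN);
        }
    mpfr_clear(t);
}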

Bill.

On May 3, 2:43 am, Bill Hart  wrote:
> That should say arbitrary exponents, not arbitrary precision.
>
> On May 3, 2:36 am, Bill Hart  wrote:
>
> > This thread is about multiple precision floating point arithmetic.
> > What have machine floats got to do with it?
>
> > I'm using mpfr, which is what Sage uses. It has guaranteed rounding
> > for *arbitrary precision* floats with essentially arbitrary precision
> > exponents (there is a limit of course).
>
> > There's no need to even think about what happens when you switch from
> > machine doubles to multiple precision, because the writers of the mpfr
> > library already thought it through. Their mpfr_t type just
> > works.
>
> > Bill.
>
> > On May 3, 12:06 am, rjf  wrote:
>
> > > On May 2, 9:02 am, Bill Hart  wrote:
>
> > > > On May 2, 4:14 pm, rjf  wrote:
>
> > > > > I repeat,
>
> > > > > The interesting cases are obvious those which are not covered.
>
> > > > Sorry, I don't know what you mean. Are you saying that by definition
> > > > they are interesting because they are not covered by Joris' algorithm,
> > > > whatever they may be?
>
> > > I haven't looked at Joris' algorithm, and if all you are doing is
> > > copying what he has done, that might be better than making up
> > > something that you haven't defined.  I assumed Joris defined what
> > > he is doing.
>
> > > > > I don't know what your fix is, nor do I especially care, but I gather
> > > > > that, now, at least your "stable" word is meant to indicate something
> > > > > like a small bound in the maximum over all coefficients of the
> > > > > difference in relative error between the true result and the computed
> > > > > result.
>
> > > > That sounds reasonable as a definition to me. However it isn't
> > > > precisely the measure Joris defines.
>
> > > > > I have no reason to believe this is an especially relevant measure,
> > > > > since some coefficients (especially the first and the last) are
> > > > > probably far more important and, incidentally, far easier to compute.
>
> > > > > Here are some more cases.
> > > ... snip..
> > > > > Here is another
>
> > > > > p=1.7976931348623157E308;
> > > > > q= 10*x
>
> > > > > What do you do when the coefficients overflow?
>
> > > > I actually don't understand what you mean. Why would there be an
> > > > overflow?
>
> > > there would be an overflow if you are using machine floating point
> > > numbers,
> > > since p is approximately the largest double-float, and 10*p cannot be
> > > represented
> > > in a machine double-float.
>
> > > > I'm missing something important here. I'm using floating
>
> > > > point and the exponents can be 64 bits or something like that. There
> > > > should be no overflow.
>
> > > Really?  So you are not using IEEE double-floats?
>
> > > What, then, do you do if the number exceeds whatever bounds you have
> > > for your floats?
>
> > > ... snip...
>
> > > In fact, what does Sage do?  Probably you can't say, because
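
For what it's worth, the exponent-range question is easy to settle
experimentally: with mpfr the following prints roughly
1.7976931348623157e309 instead of overflowing, since mpfr exponents
are not confined to the IEEE double range (a sketch; assumes mpfr >=
2.4 for mpfr_printf and uses the old GMP_RNDN rounding-mode name):

#include <stdio.h>
#include <mpfr.h>

int main(void)
{
    mpfr_t x;
    mpfr_init2(x, 53);
    mpfr_set_d(x, 1.7976931348623157e308, GMP_RNDN);  /* largest double */
    mpfr_mul_ui(x, x, 10, GMP_RNDN);     /* would overflow a double */
    mpfr_printf("%.17Re\n", x);          /* ~1.7976931348623157e309 */
    mpfr_clear(x);
    return 0;
}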

[sage-devel] funny page of quotes: Ginac

2010-05-03 Thread William Stein
Hi Sage-Devel,

Some amusing quotes from the GiNaC page:
http://www.ginac.de/People.html   (Note that GiNaC is a C++ library
that is at the core of Sage's symbolic manipulation system.)

 * Richard J. Fateman must be mentioned for his stimulating scepticism
("Maybe what should be done is [...] to do it with experts instead of
physicists-who-can-write-programs.") in the early days of the GiNaC
project, for encouragement ("If you are willing to believe that the
cost of writing the code is justified by the speed of the code, then
you are entitled to do anything, I suppose.") ...

 * Richard M. Stallman deserves credit for discouraging comments ("I
hope you are not dead set on using C++. C++ is a very badly designed
language.")...


-- 
William Stein
Professor of Mathematics
University of Washington
http://wstein.org



Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread William Stein
On Mon, May 3, 2010 at 8:49 AM, Tim Joseph Dumol  wrote:
> This isn't related to my new package includes. Jinja2 wasn't one of those
> new packages. The problem is that SageNB is installed before Jinja2 is
> installed, so it's more of a problem in the dependency script.

Cool -- then that should be very easy to fix.   Thanks for debugging
this, even though it wasn't what I thought (and I was wrong).

William

>
> On Mon, May 3, 2010 at 11:01 PM, William Stein  wrote:
>>
>> Hi,
>>
>> This is now
>>   http://trac.sagemath.org/sage_trac/ticket/8858
>>
>> William
>>
>> On Mon, May 3, 2010 at 7:57 AM, John Cremona 
>> wrote:
>> > Harald,  I made almost the same point earlier today (but in my case it
>> > was sagenb building which tried to access the internet.  Which failed
>> > as I was building overnight and had turned off my home internet
>> > connection.)
>> >
>> > John
>> >
>> > On 3 May 2010 15:39, Harald Schilly  wrote:
>> >> [... snip: Harald's message, quoted in full below ...]
>>
>>
>>
>> --
>> William Stein
>> Professor of Mathematics
>> University of Washington
>> http://wstein.org
>
>
>
> --
> Tim Joseph Dumol 
> http://timdumol.com
>



-- 
William Stein
Professor of Mathematics
University of Washington
http://wstein.org



Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread Tim Joseph Dumol
This isn't related to my new package includes. Jinja2 wasn't one of those
new packages. The problem is that SageNB is installed before Jinja2 is
installed, so it's more of a problem in the dependency script.

On Mon, May 3, 2010 at 11:01 PM, William Stein  wrote:

> Hi,
>
> This is now
>   http://trac.sagemath.org/sage_trac/ticket/8858
>
> William
>
> On Mon, May 3, 2010 at 7:57 AM, John Cremona 
> wrote:
> > Harald,  I made almost the same point earlier today (but in my case it
> > was sagenb building which tried to access the internet.  Which failed
> > as I was building overnight and had turned off my home internet
> > connection.)
> >
> > John
> >
> > On 3 May 2010 15:39, Harald Schilly  wrote:
> >> [... snip: Harald's message, quoted in full below ...]
>
>
>
> --
> William Stein
> Professor of Mathematics
> University of Washington
> http://wstein.org
>



-- 
Tim Joseph Dumol 
http://timdumol.com



Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread William Stein
Hi,

This is now
   http://trac.sagemath.org/sage_trac/ticket/8858

William

On Mon, May 3, 2010 at 7:57 AM, John Cremona  wrote:
> Harald,  I made almost the same point earlier today (but in my case it
> was sagenb building which tried to access the internet.  Which failed
> as I was building overnight and had turned off my home internet
> connection.)
>
> John
>
> On 3 May 2010 15:39, Harald Schilly  wrote:
>> [... snip: Harald's message, quoted in full below ...]
>



-- 
William Stein
Professor of Mathematics
University of Washington
http://wstein.org



Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread William Stein
On Mon, May 3, 2010 at 7:39 AM, Harald Schilly  wrote:
> Hi, while watching the compilation of 4.4.1 I saw that it stopped
> compiling and waited for a package to download. I'm curious if this is
> intended or just a strange glitch. At least my idea of a self-contained
> source package is that it doesn't need to download software
> from the internet!

This is a major bug.  It's caused by the Sagenb spkg using setuptools
to pull in some of its dependencies.  I complained to Tim Dumol that
this might happen when he changed sagenb to do this, and he strongly
assured me that it wouldn't be the case, and until now it hadn't been.
So I'm hoping that he'll have a robust fix ASAP.

Harald -- please don't post sage-4.4.1.tar on the official sagemath
website until this issue is resolved.
If it gets resolved promptly, then I can post an updated sage-4.4.1.tar
with the fix.

 -- William

>
> [... snip: rest of Harald's message, quoted in full below ...]



-- 
William Stein
Professor of Mathematics
University of Washington
http://wstein.org



Re: [sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread John Cremona
Harald, I made almost the same point earlier today (but in my case it
was the sagenb build which tried to access the internet, which failed
as I was building overnight and had turned off my home internet
connection).

John

On 3 May 2010 15:39, Harald Schilly  wrote:
> Hi, while watching the compilation of 4.4.1 I saw that it stopped
> compiling and waited for a package to download. I'm curious if this is
> intended or just a strange glitch. At least my idea of a self-contained
> source package is that it doesn't need to download software
> from the internet!
>
> [... snip: rest of Harald's message, quoted in full below ...]



[sage-devel] 4.4.1 accessing internet during compilation

2010-05-03 Thread Harald Schilly
Hi, while watching the compilation of 4.4.1 I saw that it stopped
compiling and waited for a package to download. I'm curious if this is
intended or just a strange glitch. At least my idea of a self-contained
source package is that it doesn't need to download software
from the internet!

...
creating 'dist/Sphinx-0.6.3-py2.6.egg' and adding 'build/bdist.linux-
i686/egg' to it
removing 'build/bdist.linux-i686/egg' (and everything under it)
Processing Sphinx-0.6.3-py2.6.egg
creating /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
site-packages/Sphinx-0.6.3-py2.6.egg
Extracting Sphinx-0.6.3-py2.6.egg to /scratch/scratch/schilly/sage/
sage-4.4.1/local/lib/python2.6/site-packages
Adding Sphinx 0.6.3 to easy-install.pth file
Installing sphinx-build script to /scratch/scratch/schilly/sage/
sage-4.4.1/local/bin
Installing sphinx-quickstart script to /scratch/scratch/schilly/sage/
sage-4.4.1/local/bin
Installing sphinx-autogen script to /scratch/scratch/schilly/sage/
sage-4.4.1/local/bin

Installed /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
site-packages/Sphinx-0.6.3-py2.6.egg
Processing dependencies for Sphinx==0.6.3
Searching for docutils==0.5
Best match: docutils 0.5
Adding docutils 0.5 to easy-install.pth file

Using /scratch/scratch/schilly/sage/sage-4.4.1/local/lib/python2.6/
site-packages
Searching for Jinja2==2.3.1
Reading http://pypi.python.org/simple/Jinja2/
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/
Download error: [Errno 110] Connection timed out -- Some packages may
not be found!
Reading http://jinja.pocoo.org/

And well, it waits through the usual 2-minute socket timeout on each
line, and the pocoo website is really down.

H



[sage-devel] Re: numerically stable fast univariate polynomial multiplication over RR[x]

2010-05-03 Thread rjf

If you are not doing floating point arithmetic with machine
arithmetic, but using MPFR, then you are sacrificing a huge amount of
time.  You might as well be using rational arithmetic, or the kind of
arithmetic that Collins once proposed, where the denominator is a
power of 2.  That makes reducing to lowest terms relatively fast,
because the GCD is trivial.  Compare that to boosting the overall
precision in MPFR to "big enough".
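
To make the Collins-style arithmetic concrete (my sketch, not Collins'
construction verbatim): represent each number as num * 2^exp with an
integer num; reducing a product to lowest terms is then a binary shift
rather than a GCD:

#include <gmp.h>

/* A "dyadic rational" num * 2^exp (exp may be negative). */
typedef struct { mpz_t num; long exp; } dyadic_t;

/* r = a * b; r->num must already be initialised with mpz_init.
   "Lowest terms" just means shifting out trailing zero bits. */
static void dyadic_mul(dyadic_t *r, const dyadic_t *a, const dyadic_t *b)
{
    mpz_mul(r->num, a->num, b->num);
    r->exp = a->exp + b->exp;
    if (mpz_sgn(r->num) != 0) {
        unsigned long k = mpz_scan1(r->num, 0);  /* trailing zero bits */
        mpz_tdiv_q_2exp(r->num, r->num, k);      /* exact shift */
        r->exp += (long) k;
    }
}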

If you want to read more about multiplying polynomials, you can read
the (unpublished, unfinished, too-long) paper here:

www.cs.berkeley.edu/~fateman/papers/shortprog.tex

RJF



[sage-devel] Problem compiling Sage 4.4.1 on Mac OS X 10.6

2010-05-03 Thread Andri Egilsson
Hi all,

I ran into the following problem trying to compile Sage 4.4.1 on Mac
OS X 10.6 (Snow Leopard). My configuration:

MacBook Pro 17" 2.16 GHz Intel Core Duo (32bit) (Full specs at
http://www.everymac.com/systems/apple/macbook_pro/stats/macbook_pro_2.16_17.html)
2 GB 667 MHz DDR2 SDRAM
Mac OS X 10.6.3
MacPorts disabled by moving /opt/local to /opt/local-macports

Sage 4.4 binary works fine (sage-4.4-OSX-32bit-10.4-i386-Darwin.dmg)

The relevant part of install.log:
 gcc -std=gnu99 -c -DHAVE_CONFIG_H -m32 -O2 -fomit-frame-pointer -
mtune=pentiumpro -march=pentiumpro -D__GMP_WITHIN_GMP -I.. -
DOPERATION_dive_1 -I. -I. -I.. tmp-dive_1.s -fno-common -DPIC -o .libs/
dive_1.o
tmp-dive_1.s:108:junk `...@got' after expression
make[4]: *** [dive_1.lo] Error 1
make[3]: *** [all-recursive] Error 1
make[2]: *** [all] Error 2
Error building MPIR.

real    1m22.957s
user    0m42.088s
sys     0m27.357s
sage: An error occurred while installing mpir-1.2.2.p0

This is the same problem as I had trying to compile 4.4 a few days
ago. Any thoughts?

Regards,
Andri



[sage-devel] Re: sage-4.4.1

2010-05-03 Thread Harald Schilly
On May 3, 6:27 am, William Stein  wrote:
> I've released sage-4.4.1:

Cool, I've put it on the mirror network and it will be available there
soon.

H
