On 3/15/12 9:05 AM, William Stein wrote:
On Thu, Mar 15, 2012 at 6:53 AM, Daniel Krenn <kr...@aon.at> wrote:
How do I change the default precision used? E.g., I want to enter {{{a
= 1.2}}} and have a be an element of RealField(100) without
explicitly specifying that field each time.

What does not work is:
sage: RR = RealField(100)
sage: a = 1.2
sage: a.parent()
Real Field with 53 bits of precision

You could do the following to force every floating-point number you enter
to have exactly 100 bits of precision by default:

sage: RealNumber = lambda x: RealField(100)(x)
sage: 2.5
2.5000000000000000000000000000
sage: 4.2992038490283409823094820938492834082093482834
4.2992038490283409823094820939
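
The reason this works (and why rebinding RR did nothing) is that the
Sage preparser rewrites every float literal into a call to RealNumber
before the line is executed. You can inspect the rewrite directly; the
output below is an illustration, and the exact quoting may differ
between versions:

sage: preparse("a = 1.2")
"a = RealNumber('1.2')"

So the literal 1.2 never consults RR at all; it goes through whatever
RealNumber is currently bound to.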

Or you could even force all the floating-point numbers you enter
to be intervals with 100 bits of precision:

sage: RealNumber = lambda x: RealIntervalField(100)(x)
sage: 2.5
2.5000000000000000000000000000000?
sage: sin(2.5)
0.598472144103956494051854702186?
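
If you later want plain 53-bit literals back, Sage's restore() resets
predefined globals to their defaults (a quick sketch; restore takes the
variable name as a string):

sage: restore('RealNumber')
sage: (1.2).parent()
Real Field with 53 bits of precision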

I'm curious: why do you use lambda functions?

sage: RealNumber = RealField(100)
sage: (1.2).parent()
Real Field with 100 bits of precision
sage: RealNumber = RealIntervalField(100)
sage: (1.2).parent()
Real Interval Field with 100 bits of precision
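
As far as I can tell, both forms behave the same because the preparser
simply calls whatever RealNumber is bound to, so any callable will do.
A quick sketch with a tracing wrapper (the print is purely illustrative)
makes that visible:

sage: def RealNumber(s):
....:     print("literal seen: " + s)
....:     return RealField(100)(s)
....:
sage: 1.2
literal seen: 1.2
1.2000000000000000000000000000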

Thanks,

Jason


