Hello!
I’m a newcomer to Python; I’m reading the documentation and trying some of
the examples.
I’m using Windows 7 x64 and Python 2.7.3 (default, Apr 10 2012, 23:24:47)
[MSC v.1500 64 bit (AMD64)].
I’m trying to understand how the string formatting operator (%) works. Almost
everything is clear except one case: using the g (G) conversion type with the
# flag.
Let’s take a look at the documentation here:
http://docs.python.org/release/2.7.3/library/stdtypes.html#sequence-types-str-unicode-list-tuple-bytearray-buffer-xrange
For the g (G) conversion type with the # flag, the document declares (note 4):
“The precision determines the number of significant digits before and after the
decimal point and defaults to 6.”
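
For reference, here is my own comparison of %g with and without the # flag
(this is my experiment, not an example from the documentation): without the
flag, trailing zeros are removed; with the flag, they are kept.

>>> "%g" % 0.3    # no # flag: trailing zeros removed
'0.3'
>>> "%#g" % 0.3   # with # flag: trailing zeros kept
'0.300000'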

I have noticed behavior that does not seem to match this declaration and looks
like a bug: when the g (G) conversion type is used with the # flag, the
precision is omitted, and the integer part of the number is zero, more digits
appear than I expect. Could someone please comment on whether this is a bug or
the correct result? If it is correct, please explain why.

Steps to reproduce:

1. Start Python in interactive mode.
2. Enter an expression using the g (G) conversion type and the # flag, for
example "%#g" % 0.3 (the precision is omitted and the integer part of the
number is zero).
3. Watch the output.

Actual result:

When the integer part of the number is zero, Python outputs the value with
more significant digits than the default of 6:
>>> "%#g"%0.3
'0.300000'
>>> "%#G"%0.3
'0.300000'
>>> "%#G"%0.004
'0.00400000'
>>>
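
For comparison (again my own experiment, not from the documentation), passing
an explicit precision of 6 produces exactly the same output as omitting the
precision, so the default really does seem to be 6; what puzzles me is how the
digits are counted.

>>> "%#.6g" % 0.3
'0.300000'
>>> "%#.6g" % 0.004
'0.00400000'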

Expected results:

As declared in the documentation, 6 significant digits before and after the
decimal point by default.

Thanks and regards,
Lesya
