On 6/5/17 12:53 PM, H. S. Teoh via Digitalmars-d wrote:
On Mon, Jun 05, 2017 at 04:29:06PM +0000, Seb via Digitalmars-d wrote:
On Monday, 5 June 2017 at 15:37:42 UTC, Steven Schveighoffer wrote:
It appears that the precision parameter in std.format differs from its
meaning in printf. Is that expected behavior?

Example:

import std.stdio;
import core.stdc.stdio;

void main()
{
    auto f = 20.66666;
    writeln(f);
    writefln("%0.3s", f);
[...]

That should be "%0.3f", not "%0.3s".

If you use the "%s" specifier, precision is interpreted differently,
i.e., as "maximum number of characters", as per "%s" in C's printf.

Interesting. I thought '%s' just stood for "interpret based on the type" and would automatically switch to the floating-point 'f' specifier. I see in the docs now that it uses 'g', something I've never used.

Curious that 'f' isn't used; I thought it would have been the default.
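A quick illustration of what I mean (expected output per my reading of the docs, not verified):

import std.stdio;

void main()
{
    auto f = 20.66666;
    writefln("%0.3g", f); // 'g': precision = significant digits, e.g. 20.7
    writefln("%0.3f", f); // 'f': precision = fractional digits, e.g. 20.667
}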

In any case, I have a fix for my code, move along :)

-Steve
