On Mon, Jun 05, 2017 at 04:29:06PM +0000, Seb via Digitalmars-d wrote:
> On Monday, 5 June 2017 at 15:37:42 UTC, Steven Schveighoffer wrote:
> > It appears that the precision parameter in std.format differs from
> > its meaning in printf. Is that expected behavior?
> >
> > Example:
> >
> > import std.stdio;
> > import core.stdc.stdio;
> >
> > void main()
> > {
> >     auto f = 20.66666;
> >     writeln(f);
> >     writefln("%0.3s", f);
[...]
That should be "%0.3f", not "%0.3s". With the "%s" specifier, precision
is interpreted differently, i.e., as the maximum number of characters,
just as for "%s" in C's printf.


T

-- 
If it tastes good, it's probably bad for you.