On Sunday, 25 February 2018 at 13:33:07 UTC, psychoticRabbit wrote:
can someone please design a language that does what I tell it!

please!!

is that so hard??

print 1.0 does not mean go and print 1 .. it means go and print 1.0

languages are too much like people.. always thinking for themselves.

I fed up!

fed up I say!

That's in fact a data representation problem, not a language problem. In C#, if you use the *decimal* data type, it prints as expected:

decimal one = 1m;       //internally represented as 1 × 10^^0
decimal one2 = 1.0m;    //internally represented as 10 × 10^^-1
decimal one3 = 1.00m;   //internally represented as 100 × 10^^-2
//one == one2 == one3, but the output is different:
Console.WriteLine(one);  //outputs 1
Console.WriteLine(one2); //outputs 1.0
Console.WriteLine(one3); //outputs 1.00

Neither Java nor D has a built-in decimal type, so the internal representation of floating point values is always double (or float, or real). A double has a single representation for 1, 1.0, and 1.00, and it's always 2^^0. How the writeln/println functions output that value is a design decision. Since D inherits C concepts (including printf), it uses the %g format, as C does. I'm not a Java fan, so I don't know what was behind the language creators' decision to output floating point values with at least one decimal digit.
