Don:
> I think that even DMD should be using a 128-bit emulator for
> internal constants, regardless of the machine precision.
Can you tell me why? (Note: LDC supports 128-bit integers too. Maybe even 128-bit floating point.)

> the compiler should not depend on NaNs being handled correctly in the
> C++ compiler.

By "C++ compiler" do you mean the back-end? I think GCC, LLVM, ICC and DMD support NaNs well enough (LLVM supports signaling NaNs too, so I'm told).

Sorry for not understanding your post fully,
bye,
bearophile