http://d.puremagic.com/issues/show_bug.cgi?id=9957
           Summary: [2.061 -> 2.062] Taking pointer of enum float array
                    gives some garbage
           Product: D
           Version: D2
          Platform: All
        OS/Version: Mac OS X
            Status: NEW
          Severity: regression
          Priority: P2
         Component: DMD
        AssignedTo: nob...@puremagic.com
        ReportedBy: st...@kraybit.com


--- Comment #0 from st...@kraybit.com 2013-04-18 08:09:28 PDT ---
import std.stdio;

void main()
{
    // A is a manifest constant (enum) two-dimensional float array
    enum float[3][1] A = [[1.0, 2.0, 3.0]];
    auto a = A[0].ptr;
    writeln(a[0], " should be ", A[0][0]);
    writeln(a[1], " should be ", A[0][1]);
    writeln(a[2], " should be ", A[0][2]);
}

————
> rdmd -m32 test.d
1 should be 1
1.76941e-40 should be 2
-1.98565 should be 3

> rdmd -m64 test.d
1 should be 1
1.4013e-45 should be 2
4.33594e+15 should be 3

> _
————

Example output above; it changes every run. This worked in 2.061. I don't know
whether it is supposed to work, but if not, it should at least produce an
error.

MacOS 10.8.3
DMD 2.062
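
If the cause is that enum declares a compile-time manifest constant, so that
A[0] is materialized as a fresh temporary at each use and the pointer taken
from it is left dangling, then giving the array real storage should sidestep
the problem. A minimal sketch of that workaround (my guess at the idiom, not
verified against 2.062):

————
import std.stdio;

void main()
{
    // static immutable gives the table actual storage, so .ptr refers to
    // that storage instead of a temporary built from a manifest constant
    static immutable float[3][1] A = [[1.0, 2.0, 3.0]];
    auto a = A[0].ptr;
    writeln(a[0], " should be ", A[0][0]);  // expected: 1
    writeln(a[1], " should be ", A[0][1]);  // expected: 2
    writeln(a[2], " should be ", A[0][2]);  // expected: 3
}
————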