I have a library where I use a lot of voldemort types, a la std.range.
While debugging, I had an exception triggered, and I found that the
library *pauses* significantly while printing the exception.
What I found is, essentially, that using voldemort types results in
horrible stack traces.
To demonstrate the problem:
struct S(T)
{
    void foo() { throw new Exception("1"); }
}

auto s(T)(T t)
{
    struct Result
    {
        void foo() { throw new Exception("2"); }
    }
    return Result();
}
void main(string[] args)
{
    version(bad)
        auto x = 1.s.s.s.s.s;
    else
        S!(S!(S!(S!(S!(int))))) x;
    x.foo;
}
Building without version=bad and running, I get this as the stack frame
for the foo call:
4 testexpansion 0x0000000103c3fc14 pure @safe
void
testexpansion.S!(testexpansion.S!(testexpansion.S!(testexpansion.S!(testexpansion.S!(int).S).S).S).S).S.foo()
+ 144
Now, if I compile with version=bad:
4 testexpansion 0x000000010fb5dbec pure @safe
void
testexpansion.s!(testexpansion.s!(testexpansion.s!(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).s(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).Result).s(testexpansion.s!(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).s(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).Result).Result).s(testexpansion.s!(testexpansion.s!(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).s(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).Result).s(testexpansion.s!(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).s(testexpansion.s!(testexpansion.s!(int).s(int).Result).s(testexpansion.s!(int).s(int).Result).Result).Result).Result).!
Result.foo
()
+ 144
I believe what is happening is that both the template parameter and the
function argument type are being printed, but they are the same! And
each level of nesting doubles the printout again. So you have an
exponential effect, and the resulting stack trace is horrendously useless.
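The doubling can be sketched with a rough model. This is just string
arithmetic, so I'll write it in Python; the name patterns are copied
from the demangled trace above and are only illustrative, not D's exact
mangling. Each nesting level embeds the full name of the previous level
twice (once as the template argument, once as the parameter type), so
the length follows L(n) = 2*L(n-1) + c, which is exponential; the
non-voldemort struct embeds it once, which stays linear.

```python
# Rough model of how the demangled type name grows per nesting level.
# Patterns mimic the trace above; this is NOT D's real mangling scheme.

def voldemort_name(level):
    # Each level embeds the previous full name twice: once as the
    # template argument and once as the function parameter type.
    if level == 0:
        return "int"
    inner = voldemort_name(level - 1)
    return "s!({0}).s({0}).Result".format(inner)

def struct_name(level):
    # A module-level struct only embeds the previous name once.
    if level == 0:
        return "int"
    return "S!({}).S".format(struct_name(level - 1))

for n in range(1, 6):
    print(n, len(voldemort_name(n)), len(struct_name(n)))
    # lengths: 21, 57, 129, 273, 561 vs. 9, 15, 21, 27, 33
```

Five levels of nesting already gives a name more than sixteen times
longer than the linear version, and every further level doubles it again.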
What's more, the template bloat factor skyrockets:
dmd -c testexpansion.d
ls -l testexpansion.o
-rw-r--r--+ 1 steves staff 5664 Feb 7 00:06 testexpansion.o
dmd -c -version=bad testexpansion.d
ls -l testexpansion.o
-rw-r--r--+ 1 steves staff 15312 Feb 7 00:07 testexpansion.o
As a final test, I tried this:
auto s(T)(T t)
{
    return S!(T)();
}
And the resulting .o file:
-rw-r--r--+ 1 steves staff 7104 Feb 7 00:11 testexpansion.o
Obviously, the exception trace now prints in the less verbose form. So
the cost in template bloat of using a voldemort type over a private type
is 8k here, more than double the existing size. With more nesting, I'm
sure that factor gets worse.
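Plugging in the object-file sizes from the three builds above, the
overhead works out like this (trivial arithmetic, shown in Python):

```python
# .o sizes from the three builds above, in bytes.
voldemort = 15312  # dmd -c -version=bad (nested voldemort types)
private   = 7104   # s() returning the module-level S!(T)
baseline  = 5664   # using S!(...) directly, no wrapper function

print(voldemort - private)   # 8208 bytes: the ~8k voldemort penalty
print(voldemort > 2 * private)  # True: more than double the private-type build
```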
Is there a better way we should be doing this? I'm wondering if
voldemort types are really worth it. They offer a lot of convenience,
and are much DRYer than separate private template types. But the bloat
cost is not really worth the convenience IMO.
Thoughts?
-Steve