Bill Baxter wrote:
It seems macros are implemented as compiler extensions.  You compile
your macros into DLLs first, which are then loaded into the compiler as
plugins.  On the plus side, doing things that way you really do have
access to any API you need at compile time, using the same syntax as at
run time.  All of .NET can be used at compile time in your macros.  No
more "can't CTFE that" gotchas.

But it does raise security concerns.  I wonder if they have some way
to prevent macros from running malicious code.  I guess you better run
your web-based compiler service in a tightly partitioned VM.

.NET has security levels (Code Access Security). If you run the Nemerle compiler in a low-trust mode, fewer bad things can happen.

Overall it seems pretty nifty to me, really.  Giving macros access to
an actual compiler API seems less hackish than throwing in a
smattering of diverse functionality under the heading of __traits.
And less prone to gotchas than trying to create a separate
compile-time D interpreter that runs inside the D compiler.

What do you see as the down sides?  Just that some rogue macro might
mess up the AST?

It makes macros highly compiler-specific, or requires the compiler's AST to be part of the language.

Nemerle took the nuclear option, and its macros are all-powerful. That's a reasonable way of doing things. I'd be happy with a more restricted system that's easier to standardize, especially if it got rid of all the hacky string manipulation in current D metaprogramming. (Seriously, even __traits returns string arrays for a lot of stuff. It's ridiculous.)
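
To make that concrete, here is a toy sketch of the string-shuffling I mean (the Point struct and makePrinter helper are invented for illustration): __traits(allMembers, ...) hands back member names as plain strings, and the generated code is itself assembled as a string and pasted back in with mixin().

import std.stdio : writeln;

struct Point { int x; int y; }

// Build the body of a "print every field" function as one big string.
string makePrinter(T)()
{
    string code;
    foreach (name; __traits(allMembers, T))
        code ~= `writeln("` ~ name ~ ` = ", obj.` ~ name ~ `);` ~ "\n";
    return code;
}

void printAll(T)(T obj)
{
    mixin(makePrinter!T());   // paste the generated string back in as code
}

void main()
{
    printAll(Point(1, 2));    // prints "x = 1" then "y = 2"
}

A macro system with a real compiler API would hand you typed reflection objects and let you return an AST node instead of a string.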
