Don Wrote:

> > I think VladD2 is right: you need to keep track of both the "current"
> > system and the "target" system. Unfortunately, there is some information
> > about the "target" system that the compile-time code wouldn't be able to
> > discern without giving it the ability to run code (RPC? Virtualization?
> > A really, really good emulator?) on the target system, but then again,
> > that's a limitation with any cross-compiling scenario.
>
> Note that for this to work at all, the compiler needs to be able to
> generate executable code for platform X as well as for Y -- that is, it
> needs to include two back-ends.
If the macros have already been compiled and are in binary (executable) form, the compiler only needs to be able to generate code for platform X and to run the macros (that is, execute code loaded from a DLL). This is exactly what the Nemerle compiler does. In that case, compiling the macros themselves looks like any other compilation process (on platform X, for platform Y). A sketch of this plugin-style loading is given below.

> I don't think it's quite the same. In a makefile, every executable is
> listed, and so you can have some degree of control over it.

Trusting rmdir... lol! And what about NAnt or MSBuild, which can have binary extensions? I think you are completely wrong.

> But in this scenario, the compiler is making calls to arbitrary shared
> libraries with arbitrary parameters. It means the compiler cannot be
> trusted *at all*.

The experience of Lisp (50 years!) and Nemerle (about 6 years) shows that the ability to access any library is not a problem; it is a huge advantage. And to limit what a macro can do, you can simply forbid it from using certain libraries (a second sketch of that idea is below).
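To make the first point concrete, here is a minimal sketch of the plugin model, not Nemerle's actual implementation: the macro library is a pre-compiled binary that the compiler merely loads and executes on the host, while the back-end is the only component that needs to know anything about the target platform. A JAR loaded via URLClassLoader stands in for a .NET assembly; the file name, the macro class name, and the emitTargetCode back-end are all hypothetical.

    // Minimal sketch (hypothetical names): the compiler runs on the host and
    // executes a pre-compiled macro library; only the back-end targets platform Y.
    import java.lang.reflect.Method;
    import java.net.URL;
    import java.net.URLClassLoader;

    public class MacroHost {
        public static void main(String[] args) throws Exception {
            // The macro library was compiled earlier, for the *host* runtime.
            URL macroJar = new URL("file:macros.jar");  // hypothetical path
            try (URLClassLoader loader = new URLClassLoader(new URL[] { macroJar })) {
                Class<?> macroClass = loader.loadClass("macros.StringFormatMacro"); // hypothetical
                Method expand = macroClass.getMethod("expand", String.class);

                // The macro executes here, on the host (platform X)...
                String expandedSource = (String) expand.invoke(null, "print(\"{x} + {y}\")");

                // ...while the back-end below is the only piece that knows about platform Y.
                emitTargetCode(expandedSource, "platform-Y");
            }
        }

        // Hypothetical back-end: takes the expanded source and emits code for the target.
        static void emitTargetCode(String source, String target) {
            System.out.println("emitting code for " + target + ": " + source);
        }
    }

Note that compiling the macro library itself is just an ordinary, separate compiler run; once it exists as a host binary, the cross-compiling run never needs a second back-end.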
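And on the last point, here is one way, assuming the compiler controls how macro binaries are loaded, to forbid macros from using particular libraries: resolve their dependencies through a loader that only accepts a whitelist of packages. This is an illustrative sketch of the idea, not how Nemerle actually enforces anything; the allowed prefixes are an assumed policy.

    // Sketch of a restrictive loader: macro code loaded through it can only
    // reference classes from explicitly allowed packages (assumed policy).
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.util.Set;

    public class RestrictedMacroLoader extends URLClassLoader {
        // Packages the compiler is willing to expose to macro code.
        private static final Set<String> ALLOWED_PREFIXES =
                Set.of("java.lang.", "java.util.", "macros.");

        public RestrictedMacroLoader(URL[] macroJars) {
            super(macroJars, RestrictedMacroLoader.class.getClassLoader());
        }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            boolean allowed = ALLOWED_PREFIXES.stream().anyMatch(name::startsWith);
            if (!allowed) {
                // A macro trying to pull in, say, java.io or java.net is rejected here.
                throw new ClassNotFoundException("library forbidden to macro code: " + name);
            }
            return super.loadClass(name, resolve);
        }
    }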