On Saturday, 4 August 2018 at 12:18:21 UTC, tide wrote:
On Saturday, 4 August 2018 at 01:45:44 UTC, Laeeth Isharc wrote:
On Friday, 3 August 2018 at 22:55:51 UTC, Rubn wrote:

The difference is they would have to rework their existing code. If you are writing D source code bindings for your code, then you are essentially writing new code. You don't have to worry about backwards compatibility.

Why would you write bindings by hand if the computer can do it for you better, faster, and more consistently?

The current tools that generate D files aren't very good. They evaluate macros based on the environment they run in: if the header has a define like MACHINE_X86 or MACHINE_x64, those macros and #if's get evaluated for whatever system is running the tool, instead of being translated into equivalent version() statements.
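
To illustrate the point (MACHINE_X86/MACHINE_x64 stand in for whatever the real header defines, and reg_t is a made-up declaration), a faithful translator would turn the C conditionals into D version blocks instead of evaluating them on the host. A minimal sketch:

    /* Hypothetical C header:
     *   #if defined(MACHINE_X86)
     *   typedef int reg_t;
     *   #else
     *   typedef long long reg_t;
     *   #endif
     *
     * Run on an x64 host, a naive generator bakes in one branch:
     *   alias reg_t = long;
     *
     * What you'd want instead is both branches, kept behind
     * D's built-in version identifiers: */
    version (X86)
        alias reg_t = int;      // 32-bit build
    else version (X86_64)
        alias reg_t = long;     // 64-bit build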

If the D files are to be checked in, then yes, that'd be a problem. If they're not, as is the case with dpp, then... that's actually what you want.

dpp: I fought the preprocessor and the preprocessor won.
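
For context, dpp's approach (per its README at the time of writing, so details may have drifted) is to skip pre-generated bindings entirely: you write a .dpp file that #includes the C header directly, and the d++ wrapper preprocesses and translates it at build time, so the generated D is a throwaway build artifact rather than checked-in source. Roughly:

    // hello.dpp -- translated and built with: d++ hello.dpp
    #include <stdio.h>

    void main()
    {
        // printf comes from the translated stdio.h; a D string
        // literal converts implicitly to const(char)*
        printf("hello from dpp\n");
    }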

It would be, but I don't think it'll ever cover 100% of cases, and it will require manual intervention.

If manual intervention is required, dpp has failed. Some problems will be tricky, especially where the preprocessor is concerned. But a lot of real-life production code already works with it.

