On Wednesday, 27 May 2020 at 05:49:49 UTC, Walter Bright wrote:
> On 5/26/2020 9:31 AM, Bruce Carneal wrote:
>> Currently a machine-checked @safe function calling an unannotated extern(C) routine will error out during compilation. This is great, as the C routine was not machine checked and generally cannot be checked. Post 1028, IIUC, the compilation will go through without complaint. This seems quite clear. What am I missing?
>
> Nothing at all.
>
> But I doubt there is much legacy non-compiling code around.

The point isn't that incorrect legacy code that didn't compile now does (although that does matter a bit); it's that newly written code will compile when it shouldn't.

Existing code is full of extern(C) declarations that are implicitly, and correctly, @system today and will become @safe under DIP 1028. That means that when I write new @safe code calling them (possibly through a deep dependency chain of inferred-@safety APIs spanning multiple dub packages) I could easily find my new code compiling when it shouldn't.
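A minimal sketch of that failure mode (the C routine and module names here are hypothetical; the post-DIP behavior is as I understand the spec):

// binding.d -- a typical existing binding, no attributes anywhere
module binding;
extern (C) void scribble(ubyte* buf, size_t len); // body lives in C, unverifiable

// user.d -- newly written application code
module user;
import binding;

@safe void fill(ubyte[] buf)
{
    // today: error, cannot call @system function scribble from @safe code
    // under DIP 1028: compiles silently, scribble defaults to @safe
    scribble(&buf[0], buf.length);
}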


Effectively, by silently @safe-ing declarations the compiler can't check or infer, you are changing APIs from @system to @trusted without any review.
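Put differently, under DIP 1028 the following two declarations become interchangeable from a caller's perspective (a sketch; scribble is the same hypothetical routine as above):

// what the library author wrote, expecting @system by default
extern (C) void scribble(ubyte* buf, size_t len);

// what callers effectively get under DIP 1028: callable from @safe
// with zero verification, which is exactly what @trusted grants
extern (C) void scribble(ubyte* buf, size_t len) @trusted;

The difference is that @trusted is supposed to be a deliberate, greppable promise made by a human reviewer; DIP 1028 grants it implicitly.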


Just in case there's any confusion, here's a timeline:

1. Library A is written containing a dangerous but useful extern(C) declaration, assuming @system by default.
2. Application B is written for and compiled with DIP 1028, with @safe: at the top of every file.
3. B adds a dependency on A. It continues to compile as @safe, calling an unsafe C function.
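The same timeline as code (the module and function names are made up; A's declaration is typical of bindings on dub today):

// liba.d -- step 1: written before DIP 1028, relying on @system by default
module liba;
extern (C) void launch(int count); // dangerous but useful C routine

// app.d -- steps 2 and 3: compiled under DIP 1028
module app;
@safe:
import liba;

void main()
{
    launch(42); // accepted: the bodiless launch defaults to @safe
}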


This seems like one of those things where either I'm wrong or it's a showstopper.
