On 4 January 2016 at 03:53, Walter Bright via Digitalmars-d
<digitalmars-d@puremagic.com> wrote:
> On 1/3/2016 9:14 AM, Marc Schütz wrote:
>>
>> I guess in reality this would not be a frequent thing. Most real C++ code
>> will have both instances of `identifier` declared in different header
>> files, and D's modules will usually closely mirror those, so they will
>> end up in different modules on the D side.
>
> My experience with "who would ever write such code" is they exist and you
> cannot wish them away.
There's nothing wrong with it if they do; they can easily address the
situation, and they would almost certainly do so by default as they
naturally structure the API into modules.

> In a more general case, we should allow as much C++
> compatibility as we can, because every shortcoming will be complained about
> at length. And, as Manu pointed out, one is often not able to adjust the
> C++ side of the bridge.

How would this ever lead to adjustment of the C++ code in any case? That's
not an issue here. We're going to mangle the appropriate C++ symbol either
way; what we're discussing is purely about the D code, and about not making
it awkward for D users.

It's not a 'shortcoming' to let the D coder lay out their modules however
they want; that's normal, expected behaviour. Anything that inhibits the
ability to organise your D code is a shortcoming, and this behaviour is the
biggest shortcoming of the extern(C++) experience by far. It causes a bunch
of really awkward problems, and it's completely pointless; we have modules!

This behaviour inhibits C++ namespaces with the same name as the top-level
D package (surely this is the 99% _expected_ case), and it also needlessly
reduces the pool of compatible namespaces to valid D identifiers; a
pointless restriction.

You're arguing for "as much C++ compatibility as we can", while arguing for
something that does nothing but inhibit C++ compatibility, make things
needlessly painful for the D library author, and feel surprising and
unconventional to the end user... this is Schrödinger's argument.

This decision impacts _every_ extern(C++) declaration everywhere! They all
need to be aliased around into the locations where they should have been
declared in the first place (see the first sketch at the end of this mail).
That's messy, hard to follow, hard to explain, and hard to maintain.

Also, the risk of 'invalid' C++ namespace names, which is an absurd problem
to have, and only possible under this design since every C++ namespace name
must also be a valid D identifier, is much greater than your hypothetical
non-problem case. I have already run into this multiple times.

I am demonstrating legitimate real-world problems with the design, and you
hold fast to a broken design because it protects against a single
hypothetical 'problem' with a trivial and natural solution. I don't
understand how it's possible to hold an objective resistance to this change.

>> the same identifier actually does appear twice in the same
>> module, static structs can be used:
>>
>>     struct ns1 {
>>         extern(C++, ns1):
>>         int identifier;
>>     }
>>     struct ns2 {
>>         extern(C++, ns2):
>>         int identifier;
>>     }
>
> Offhand, I can't think at the moment why that wouldn't work, but using
> structs as C++ namespaces did have problems I don't recall at the moment,
> and:
>
> 1. It's ugly.
> 2. You're an expert, and it's wrong - the fields need to be 'static'. How
> will others fare?
> 3. I suspect that this ugly thing will become "best practice", and we
> won't be able to fix it in a backwards compatible way.

'Best practice' would be to make another module. It's not so much 'best
practice' as 'normal D behaviour'.

> 4. Structs carry other baggage with them, such as 'init' fields,
> TypeInfo's, default member functions, etc.

No structs, just make another module! >_< .. that's what they're for;
that's how you do it in D!

> 5. I don't share the opinion that a C++ namespace introducing a scope,
> just as it does in C++, is something weird and unexpected.
Because we're not writing C++ code, we're writing D code, and any deviation
from normal D behaviour should surely be taken as weird and unexpected. The
fact that the lib was compiled from C++ code is uninteresting, irrelevant,
and possibly even unknown to the library user. As far as they know, they're
using a D lib, with an API presented to them in D's terms. That's the point
of writing bindings.

Almost all the quality D bindings I've ever used have made adaptations to
present nicely for D use, and extern(C++) is no different; the binding
author will organise their D modules in a sensible and logical manner with
respect to the API. C libs are small and flat enough that they can
sometimes get away with direct 'machine' translation, but a good C++
binding will almost certainly require some care from the binding author to
make it work nicely with D semantics and feel natural to D users.

This hypothetical problem you're attached to will tend to iron out
naturally, and even if it doesn't, the workaround is trivial: put it in
another module (see the second sketch at the end of this mail). The effect
is exactly the same as the scope created by extern(C++) currently, but it's
well understood because it conforms to all existing rules, and as a bonus,
every single problem I've complained about is instantly resolved, and
enhancement requests like supporting "import x.y : X = ns.X;" become
redundant.

I'll add this: if I can't make the D bindings feel nice to use (which is
very hard as it is now), that will significantly reduce the motivation to
USE said D bindings. If the bindings are riddled with awkwardness, why
would anyone prefer them over the C++ API they already have? The entire
point of writing D bindings is to offer a superior experience using the
library. If I fail to attain that goal, then *the whole point* of my effort
is moot, and we will remain C++ users.

I'm doing this after-hours as an avenue to give people in my office an
opportunity to write some production D code as a plugin to one of our major
products, and maybe even ship something! Likewise, even if I wrangle it
into something nice at the front end despite this issue, if the binding
itself appears archaic and awkward to implement and maintain, that will
equally make a terrible impression on the squad of engineers who will
critique this code, and I will not be able to explain or defend this
decision to them.
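
To make the first point concrete, here's roughly what every binding module
ends up looking like under the current rules. (A rough sketch; the module
and symbol names are made up purely for illustration.)

    // mylib/graphics.d -- hypothetical binding module
    module mylib.graphics;

    extern (C++, gfx)
    {
        // Mangles as ::gfx::createDevice(int), which is what we want...
        void* createDevice(int flags);
    }

    // ...but under the current rules the declaration also lands in a
    // nested scope 'gfx' inside the module, so after 'import
    // mylib.graphics;' user code reaches it as gfx.createDevice (fully
    // qualified: mylib.graphics.gfx.createDevice). To present it at module
    // scope, every symbol has to be hoisted back out by hand:
    alias createDevice = gfx.createDevice;

Multiply that alias line by every declaration in the binding and you get
the mess I'm describing.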
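
And here's the second sketch: what I'm arguing for. Let the D module
provide the scoping, one module per C++ namespace (reusing the ns1/ns2
names from the quoted example), and let extern(C++, ...) affect mangling
only. Disambiguation is then identical to any other pair of same-named D
symbols. (Again, a sketch of the proposed behaviour, not of what the
compiler does today; today you'd still need the aliases from the previous
sketch.)

    // mylib/ns1.d
    module mylib.ns1;
    extern (C++, ns1) int identifier;   // ::ns1::identifier

    // mylib/ns2.d
    module mylib.ns2;
    extern (C++, ns2) int identifier;   // ::ns2::identifier

    // user code
    import mylib.ns1;
    import mylib.ns2;
    // mylib.ns1.identifier and mylib.ns2.identifier -- disambiguated by
    // the module system, exactly like any other colliding D symbols.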