On Sunday, 28 September 2014 at 12:09:50 UTC, Andrei Alexandrescu wrote:
> On 9/27/14, 4:31 PM, H. S. Teoh via Digitalmars-d wrote:
>> On Sat, Sep 27, 2014 at 11:00:16PM +0000, bearophile via Digitalmars-d wrote:
>>> H. S. Teoh:
>>>> If we can get Andrei on board, I'm all for killing off autodecoding.
>>>
>>> Killing auto-decoding for std.algorithm functions will break most of my D2 code... perhaps we can do that in a D3 language.
>> [...]
>
> Well, obviously it's not going to be done in a careless, drastic way!
>
> Stuff that's missing:
>
> * Reasonable effort to improve the performance of auto-decoding;
> * A study of the matter revealing either new artifacts and idioms, or the insufficiency of such;
> * An assessment of the impact on compilability of existing code;
> * An assessment of the impact on correctness of existing code (that compiles and runs in both cases);
> * An assessment of the speed improvement from eliminating auto-decoding.
>
> I think there's a very strong need for this stuff, because the claims that the current alternatives for selectively avoiding auto-decoding fall short rest on the throwing of hands (and the occasional chair out a window) without any real investigation into how library artifacts may help. This approach to justifying risky moves is frustratingly unprincipled.
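
For concreteness, this is the behavior in question: string range primitives decode UTF-8 into dchar on the fly, and the opt-out artifacts work at the code-unit level instead. A minimal sketch, using `std.string.representation` as one such existing artifact:

```d
import std.range : walkLength;
import std.string : representation;

void main()
{
    string s = "résumé";  // 6 code points, 8 UTF-8 code units
    assert(s.length == 8);                     // arrays count code units
    assert(s.walkLength == 6);                 // range API auto-decodes to dchar
    assert(s.representation.walkLength == 8);  // ubyte view: no decoding
}
```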
As far as I can see, backward compatibility is fairly easy: extract the auto-decoding modules into an `autodecoding` dub package and clean the Phobos modules up to non-decoding behavior. The Phobos code becomes simpler, since it deals with ranges as-is, without string specializations. The `autodecoding` package is simple too: it just wraps strings into a dchar range, invokes the non-decoding Phobos function, and preserves the current module interface so that legacy D code keeps working. A sketch of one such wrapper follows below.
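
Here is a minimal sketch of what one wrapper might look like, assuming a future Phobos whose std.algorithm no longer special-cases strings; the module layout and the use of `std.uni.byCodePoint` are illustrative only:

```d
// Hypothetical module from the proposed `autodecoding` dub package.
module autodecoding.algorithm;

import std.traits : isSomeString;
import std.uni : byCodePoint;   // lazily presents a string as dchars
static import std.algorithm;

/// Legacy-compatible find: wraps strings into a dchar range and
/// forwards to the (now non-decoding) Phobos implementation.
auto find(alias pred = "a == b", R, E)(R haystack, E needle)
{
    static if (isSomeString!R)
        // Restore the old dchar-by-dchar semantics for string callers.
        // (A full shim would also translate the result back into a
        // string slice to preserve the legacy return type.)
        return std.algorithm.find!pred(haystack.byCodePoint, needle);
    else
        // Non-string ranges pass through untouched.
        return std.algorithm.find!pred(haystack, needle);
}
```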
Run dfix on your sources; it will replace `import std.algorithm` with `import autodecoding.algorithm`, and then the code should work.
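
The mechanical rewrite dfix would perform is tiny, e.g. (dfix does not do this today; the package name is as proposed above):

```d
// Before:
import std.algorithm : find, canFind;

// After the dfix rewrite:
import autodecoding.algorithm : find, canFind;
```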
What do you think? Worth a DIP?