"Andrei Alexandrescu" <seewebsiteforem...@erdani.org> wrote in message 
news:4df7d92a.8050...@erdani.org...
> On 6/14/11 4:38 PM, Nick Sabalausky wrote:
>> - Putting it in the compiler forces it all to be written in C++. As an
>> external tool, we could use D.
>
> Having the compiler communicate with a download tool supplied with the 
> distribution seems to be a very promising approach that would address this 
> concern.
>

A two-way "compiler <-> build tool" channel is messier than "build tool 
invokes compiler", and I don't really see much benefit.
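
To be concrete about the one-way model I mean, here's a minimal sketch 
(the file names and the recovery step are hypothetical; the only real 
interfaces used are dmd's existing -deps and -o- switches):

    // "Build tool invokes compiler": the tool drives dmd; dmd never
    // calls back out. File names here are hypothetical.
    import std.process : execute;
    import std.stdio : writeln;

    void main()
    {
        // Ask dmd for the import graph without generating code.
        auto r = execute(["dmd", "-deps=deps.txt", "-o-", "app.d"]);
        if (r.status != 0)
        {
            // A missing import surfaces as an ordinary compile error;
            // the tool maps the module to a package, fetches it, reruns.
            writeln("unresolved imports, fetch and retry:\n", r.output);
        }
    }

One process boundary, one direction, no back-channel protocol to design 
or debug.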

>> - By default, it ends up downloading an entire library one inferred
>> source file at a time. Why? Libraries are a packaged whole. Standard
>> behavior should be to treat them as such.
>
> Fair point, though in fact the effect is that one ends up downloading 
> exactly the modules used from that library, and potentially from others.
>

I really don't see a problem with that. And you'll typically end up needing 
most, if not all, of them anyway. It's very difficult to see this as an actual 
drawback.

> Although it may seem that libraries are packaged as a whole, that view 
> ignores the interdependencies across them. This proposal solves the 
> interdependencies organically.
>

How does my proposal not handle that? I think it does.

>> - Are we abandoning zdmd now? (Or is it "dmdz"?)
>
> It is a related topic. That project, although it has been implemented, 
> has unfortunately not captured people's interest.
>

Not surprising since there's been very little mention of it. In fact, I've 
been under the impression that it wasn't even finished. Is this not so? If 
it is done, I bet I'm not the only one who didn't know. Plus, I bet most 
people aren't even aware of it at all. RDMD gets trotted out and promoted 
*far* more often and I come across a lot of D users (usually newbies) who 
aren't even aware of *it*.

>> - Does it automatically *compile* the files it downloads or merely use 
>> them
>> to satisfy imports?
>
> We need to arrange things such that the downloaded files are also compiled 
> and linked together with the project.
>

And that's awkward under the model you're proposing. But with package 
management handled by a separate tool, it's a non-issue.

>> - Does every project that uses libX have to download it separately? If 
>> not
>> (or really even if so), how does the compiler handle different versions 
>> of
>> the lib and prevent "dll hell"? Versioning seems to be an afterthought in
>> this DIP - and that's a guaranteed way to eventually find yourself in dll
>> hell.
>
> Versioning is a policy matter that can, I think, be addressed within the 
> URL structure. This proposal tries to support versioning without 
> explicitly imposing it or standing in its way.
>

That's exactly my point. If you leave it open like that, everyone will come 
up with their own way to do it, many won't give it any attention at all, 
and most of those approaches will end up being wrong WRT avoiding dll hell. 
Hence, dll hell will creep in and library users will end up having to deal 
with it. The only way to avoid that is to design it out of the system up 
front, *by explicitly imposing it*.
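
To illustrate what leaving it open looks like in practice, suppose the 
DIP's pragma gets used with versions baked into ad-hoc URL schemes (the 
pragma name and URLs are hypothetical; this is the failure mode, not a 
recommendation):

    // Two libraries each pin their own copy of libX, each with its own
    // home-grown URL convention. Hypothetical pragma name and URLs.
    pragma(importpath, "http://example.org/libA/deps/libX-1.2/");
    pragma(importpath, "http://example.org/libB/deps/libX_v1.3/");
    // Nothing in the system can even notice that libX 1.2 and 1.3 are
    // now both in the build - that's dll hell in the making.

A real package manager resolves "libX" once, globally, against declared 
version constraints.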

>> - How do you tell it to "update libX"? Not by expecting the user to 
>> manually
>> clear the cache, I hope.
>
> The external tool that would work in conjunction with dmd could have such 
> a flag.
>

That's a messier solution than what I outlined.

>> - With a *real* package management tool, you'd have a built-in (and
>> configurable) list of central data sources.
>
> I don't see why you can't have that with this approach too.
>

The problem is you end up having both. One of them, the default, is a mess 
and shouldn't really be used; the other is the one you'd already get 
anyway with a real package management tool.

>> If you want to use something you
>> don't have installed, and it exists in one of the stores (maybe even one 
>> of
>> the built-in ones), you don't have to edit *ANYTHING AT ALL*. It'll just
>> grab it, no changes to your source needed at all, and any custom steps
>> needed would be automatically handled. And if it was only in a data store
>> that you didn't already have in your list, all you have to do is add 
>> *one*
>> line. Which is just as easy as the DIP, but that *one* step will also
>> suffice for any other project that needs libX - no need to add the line 
>> for
>> *each* of your libX-using projects. Heck, you wouldn't even need to edit 
>> a
>> file, just do "package-tool addsource http://...". The DIP doesn't even
>> remotely compare.
>
> I think it does. Clearly a command-line equivalent for the pragma needs to 
> exist, and the appropriate pragmas can be added to dmd.conf. With the 
> appropriate setup, a program would just issue:
>
> import dsource.libX;
>
> and get everything automatically.
>

The approach in the DIP encourages such things not to be used and leaves 
them as afterthoughts. I think this is backwards.
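
For reference, my reading of the setup being described - the pragma name, 
dmd.conf syntax, and URL are guesses at the DIP's intent, not an 
implemented feature:

    // In dmd.conf, a hypothetical command-line equivalent of the pragma:
    //   DFLAGS=... -importpath=http://dsource.org/projects/
    //
    // after which a program needs nothing but the import itself:
    import dsource.libX;   // compiler maps dsource.* onto the URL and
                           // downloads the module(s) on first use

    void main() {}

Which is my complaint in a nutshell: the ad-hoc path is wired into the 
compiler, while the well-behaved path is left as an add-on.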

>> - I think you're severely overestimating the amount of extra 
>> dmd invocations
>> that would be needed by using an external build tool.
>
> I'm not estimating much. It's Adam who shared impressions from actual use.
>
>> I believe this is
>> because your idea centers around discovering one file at a time instead 
>> of
>> properly handling packages at the *package* level.
>
> The issue with package-level is that http does not have a protocol for 
> listing files in a directory. However, if we arrange to support zip files, 
> the tool could detect that a zip file is at the location of the package 
> and download it entirely.
>

There is no need to deal with individual files. Like I've said, that's the 
wrong level to be dealing with this anyway.
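
That said, if zip files are the route taken, package-level fetching is 
not much code. A sketch using Phobos (the zip-at-the-package-URL 
convention is assumed, per Andrei's description above):

    // Fetch a whole package as one zip: one HTTP round trip per
    // library instead of one per inferred source file.
    import std.file : mkdirRecurse, write;
    import std.net.curl : HTTP, get;
    import std.path : dirName;
    import std.zip : ZipArchive;

    void fetchPackage(string url)
    {
        auto zip = new ZipArchive(get!(HTTP, ubyte)(url));
        foreach (name, member; zip.directory)
        {
            if (name.length == 0 || name[$ - 1] == '/')
                continue;                   // skip directory entries
            zip.expand(member);             // decompress into memory
            mkdirRecurse(dirName(name));    // recreate package layout
            write(name, member.expandedData);
        }
    }

    void main()
    {
        // Hypothetical URL, per the convention above.
        fetchPackage("http://dsource.org/projects/libX/libX.zip");
    }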

>> Consider this:
>>
>> You tell BuildToolX to build MyApp. It looks at MyApp.config to see what
>> libs it needs. It discovers LibX is needed. It fetches LibX.config, and
>> finds its dependencies, and so on, building up a dependency graph. It checks 
>> for
>> any problems with the dependency graph before doing any real work 
>> (something
>> the DIP can't do). Then it downloads the libs, and *maybe* runs some 
>> custom
>> setup on each one. If the libs don't have any custom setup, you only have
>> *one* DMD invocation (two if you use RDMD). If the libs do have any 
>> custom
>> setup, and it involves running dmd, then that *only* happens the first 
>> time
>> you build MyApp (until you update one of the libs, causing its one-time
>> setup to run once more).
>>
>> I think this proposal is a hasty idea that just amounts to chasing after
>> "the easy way out".
>
> I'm just trying to define a simple backend that facilitates sharing code 
> and using of shared code, without arrogating the role and merits of a more 
> sophisticated package management tool and without standing in the way of 
> one. Ideally, the backend should be useful to such a tool - e.g. I imagine 
> a tool could take a plain file format and transform it into a series of 
> pragmas directing library locations.
>

I appreciate the motivation behind it, but I see the whole approach as:

1. Not really helping a package management tool, and likely even getting in 
its way.

2. Encouraging people to use a dangerously ad-hoc "package management" 
scheme instead of a proper, fully-thought-out one.

I see this as adding more to the language/compiler in order to make the 
wrong things easier.
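
To put the flow from my earlier post into code: the heart of such a tool 
is a plain graph walk done before any downloading or compiling. The 
config format, fetchConfig, and the conflict rule below are all 
hypothetical:

    // Resolution phase of the hypothetical BuildToolX: read each
    // project's config, build the full dependency graph, and fail
    // fast on conflicts *before* doing any real work.
    struct Dep { string name, ver; }

    Dep[] fetchConfig(string project)
    {
        // e.g. download and parse <project>.config from a data source
        return [];
    }

    Dep[string] resolve(string root)
    {
        Dep[string] graph;
        auto queue = [root];
        while (queue.length)
        {
            auto cur = queue[0];
            queue = queue[1 .. $];
            foreach (d; fetchConfig(cur))
            {
                if (auto prev = d.name in graph)
                {
                    // Caught up front - something the download-as-you-
                    // compile model structurally cannot do.
                    if (prev.ver != d.ver)
                        throw new Exception("version conflict on " ~ d.name);
                }
                else
                {
                    graph[d.name] = d;
                    queue ~= d.name;
                }
            }
        }
        return graph; // then: fetch each lib once, one dmd invocation
    }

Only after resolve() succeeds does anything get downloaded, so a broken 
dependency graph costs seconds, not a half-fetched build tree.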

> As always, criticism is appreciated, particularly of the kind that prompts 
> pushing things forward - as was the case with the idea of a download tool 
> that's a separate executable, companion to dmd.
>

Maybe I'm tired, or maybe it's just the unfortunate nature of text, but I 
can't tell if you're saying you appreciate the criticism I've given here or 
implying that you want better criticism than what I've given...?

