On 6/15/11 9:56 AM, Robert Clipsham wrote:
On 15/06/2011 15:33, Andrei Alexandrescu wrote:
On 6/15/11 9:13 AM, Steven Schveighoffer wrote:
We have been getting along swimmingly without pragmas for adding local
include paths. Why do we need to add them using pragmas for network
include paths?

That doesn't mean the situation is beyond improvement. If I had my way
I'd add pragma(liburl) AND pragma(libpath).
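
For concreteness, a source file using such pragmas might read as follows;
pragma(liburl) and pragma(libpath) are hypothetical, and the path, URL,
and library name are made up:

// hypothetical -- only pragma(lib) exists today
pragma(libpath, "/usr/local/lib/dcollections");     // local library search path
pragma(liburl, "http://example.com/dcollections");  // remote base to fetch from
pragma(lib, "dcollections");                        // existing pragma: link it
import dcollections.TreeMap;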

pragma(lib) doesn't (and can't) work as it is, so why do you want to add
more useless pragmas?

Then we should yank it or change it. That pragma was defined in a completely different context from today's, and right now we have a much larger user base to draw experience and insight from.

Command line arguments are the correct way to go
here.

Why? At this point enough time has been collectively spent on this that I'm genuinely curious to find a reason that would have me go, "huh, I hadn't thought about it that way. Fine, no need for the DIP."

Not to mention that paths most likely won't be standardized across
machines, so the latter would be useless.

version() for the win.
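
A sketch of what "version() for the win" could look like, assuming the
hypothetical pragma(libpath) above and user-defined version identifiers
passed with -version=...:

// select a machine-specific library path at compile time (sketch)
version (BuildFarm)
    pragma(libpath, "/opt/buildfarm/lib");
else version (DevBox)
    pragma(libpath, "/home/dev/lib");
else
    static assert(false, "no library path configured for this machine");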

Also, I don't see the major difference for someone who's making a piece
of software between adding the include path to their source file and
adding it to their build script.

Because in the former case the whole need for a build script may be
obviated. That's where I'm trying to be.

This can't happen in a lot of cases, e.g. if you're interfacing with a
scripting language, you need certain files automatically generated
during the build, etc.

Sure. For those cases, use tools. For everything else, there's liburl.

Admittedly, for the most part, you'll just want to be
able to build libraries given a directory, or an executable given a file
with _Dmain() in it.

That's the spirit. This is what the proposal aims at: you have the root file and the process takes care of everything - no configs, no metadata, no XML info, no command-line switches, no fuss, no muss.

With such a feature, "hello world" equivalents demoing dcollections, qt, mysql (some day), etc. will be simple few-liners that anyone can download and compile flag-free. I find it difficult to understand how only a few find that appealing.
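
As an illustration, such a few-liner might look like the following; the
URL and pragma are hypothetical, and the dcollections usage is only
indicative of the API:

// hello_dcollections.d -- hypothetical, built flag-free with: dmd hello_dcollections.d
pragma(liburl, "http://example.com/dcollections");
import dcollections.TreeMap;

void main()
{
    auto words = new TreeMap!(string, int);
    words["hello"] = 1;
}

The compiler (or the tool it delegates to) would fetch anything under the
dcollections package from the given URL, so no build script or flags
would be needed.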

There'll still be a lot of cases where you want to
specify some things to be dynamic libs, others static libs, and what, if
any, of it you want in the resulting binary.

Sure. But don't you think it's okay to have the DIP leave such cases to other tools without impeding them in any way?

Sounds good. I actually had the same notion, just forgot to mention it
in the DIP (fixed).

I'd agree with Steven that we need command line arguments for it; I
completely disagree about pragmas, though, given that they don't work
(as mentioned above). Just because I know you're going to ask:

# a.d has a pragma(lib) in it
$ dmd -c a.d
$ dmd -c b.d
$ dmd a.o b.o
<Linker errors>

This is unavoidable unless you put metadata in the object files, and
even then you leave clutter in the resulting binary unless you specify
that the linker should remove it (I don't know if it can).

I now understand, thanks. So I take it a compile-and-link command would succeed, whereas a succession of separate compile commands wouldn't? That wouldn't mean the pragma doesn't work, just that it only works under certain build scenarios.
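
In other words, the scenario being asked about is roughly:

# a.d has a pragma(lib) in it
$ dmd a.d b.d
<does this link, the compiler passing the library through to the linker?>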

This assumes the URL contains the package prefix. That would work, but
imposes too much on the URL structure. I find the notation -Upackage=url
more general.
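
To make the notation concrete, a build under that scheme might be
invoked as below; the flag spelling and URL are illustrative only:

# map the dcollections package prefix to a remote base URL (hypothetical)
$ dmd -Udcollections=http://example.com/dcollections hello_dcollections.d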

I personally think there should be a central repository listing packages
and their URLs etc., which would massively simplify what needs to be
passed on the command line. E.g. -RmyPackage would cause myPackage to be
looked up on the central server, which would have the relevant URL etc.

Of course, there should be some sort of override method for private
remote servers.
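
Robert's suggestion might look roughly like this on the command line;
the -R flag, the registry, and the dmd.conf override are all
hypothetical:

# resolve myPackage through the central registry
$ dmd -RmyPackage app.d

# override for a private server, e.g. as a setting in dmd.conf
REGISTRY=https://pkg.internal.example/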

That is tantamount to planting a flag in the distributed dmd.conf. Sounds fine.

As I said in another post, you could also specify a zip file or tarball
as a base path, and the whole package would be downloaded instead. We
may need some sort of manifest in order to verify that the import will
be found, rather than downloading the entire package to find out.
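
A manifest could be as simple as a plain-text module listing published
next to the archive, so the tool can check whether an import exists
before fetching the whole package. Everything below is hypothetical:

http://example.com/dcollections.zip        <- the package archive
http://example.com/dcollections.manifest   <- list of modules inside

# contents of dcollections.manifest
dcollections/TreeMap.d
dcollections/HashMap.d
dcollections/LinkList.d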

Sounds cool.

I don't believe this tool should exist without compression being default.

Hm. Well fine.


Andrei
