On 2011-08-11 19:07, Steven Schveighoffer wrote:
On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley
<wiley.andre...@gmail.com> wrote:

On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer
<schvei...@yahoo.com>wrote:
I think the benefit of this approach over a build tool which wraps the
compiler is, the compiler already has the information needed for
dependencies, etc. To a certain extent, the wrapping build tool has to
re-implement some of the compiler pieces.


This last bit doesn't really come into play here because you can
already ask the compiler to output all that information and easily
use it in a separate program. That much is already done.

Yes, but then you have to restart the compiler to figure out what's
next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d
and b.d are on the network. You potentially need to run the compiler 3
times just to make sure you have all the files, then run it a fourth
time to compile.
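
To make the example concrete (file names as in the paragraph above):

// main.d -- the only file you start out with locally
import a;              // a.d lives on the network
void main() { foo(); }

// a.d -- only visible once it has been fetched
import b;              // b.d is also remote
void foo() { bar(); }

// b.d -- only visible once a.d has been fetched
void bar() {}

Run 1 on main.d reveals that a is missing, run 2 reveals that b is missing, run 3 confirms nothing else is missing, and only the fourth run actually compiles.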

So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler scans through them looking for URL imports, then asks a tool to download the dependencies it found, and starts all over again.

This is how my package manager will work. You have a local file listing all the direct dependencies needed to build your project. When invoked, the package manager fetches a file from the repository that lists every package and its dependencies. From that it figures out all dependencies, both direct and indirect, and downloads them. All of this happens before the compiler is even invoked once.
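
A rough sketch of that resolution step (the package names and the index are made up, it's not the actual tool):

import std.stdio;

void main()
{
    // What the repository file boils down to:
    // package -> its direct dependencies.
    string[][string] index = [
        "myapp":    ["derelict", "orange"],
        "derelict": ["derelict-util"]
    ];

    // Compute the transitive closure of myapp's dependencies.
    bool[string] needed;

    void visit(string pkg)
    {
        if (pkg in needed)
            return;
        needed[pkg] = true;

        // A package with no entry simply has no dependencies here.
        foreach (dep; index.get(pkg, null))
            visit(dep);
    }

    foreach (dep; index["myapp"])
        visit(dep);

    // Everything that has to be downloaded before the compiler runs.
    writeln(needed.keys);
}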

Then, preferably but optionally, it hands over to a build tool that builds everything. The build tool needs to invoke the compiler twice: first to get the dependencies of all the local files in the project being built, then a second time to actually build everything.
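
Roughly something like this, assuming dmd's -deps switch (file names invented):

dmd -deps=deps.txt -o- src/main.d
dmd src/main.d src/extra.d -ofmyapp

The first invocation only writes the import graph to deps.txt without generating any code; the second is the real build, with whatever additional files the first pass turned up.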

Well, actually, if you're using a build tool it drives everything. You have the package dependencies in the build script. The build tool starts by invoking the package manager (see above), then queries it for include paths, library paths, and libraries to link with. As the final step it invokes the compiler to build everything (see above).
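
So the compile line the build tool ends up producing might look something like this (paths made up):

dmd -Ipath/to/pkg/import path/to/pkg/lib/libpkg.a src/main.d -ofmyapp

where the -I flag and the library come straight from the package manager's answers.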

And there is no parsing of the output data; the problem boils down to a
simple get tool. Running a simple get tool over and over doesn't consume
as much time or as many resources as running the compiler over and over.

There are still problems with the DIP -- there is no way yet to say "oh
yeah, compiler, you have to build this file that I downloaded too". But
if nothing else, I like the approach of having the compiler drive
everything. It reduces the problem space to a smaller, more focused task
-- get a file based on a URL. We also already have many tools in
existence that can parse a URL and download a file/package.

-Steve
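
For what it's worth, the get tool barely needs to be a program at all; a minimal sketch, assuming Phobos' curl wrapper (std.net.curl) is available:

import std.net.curl;

void main(string[] args)
{
    // args[1]: the URL to fetch, args[2]: where to save it locally.
    download(args[1], args[2]);
}

Error handling and caching left out, obviously.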

The best would be if the compiler could be a library. Then the build tool could drive everything and ask other tools, like the package manager and the compiler, for the information it needs to build everything.

--
/Jacob Carlborg
