> Instantaneous building of a complex project from source.
> (I'm defining instantaneous as less than 1 second for this.)

Depends on how complex.  I spent two years retrofitting a commercial
parallel make (which only promises a 20x speedup, even with dedicated
hardware) into the build system of a telecommunications product.  In
retrospect, it would have taken less time to write a new build system
with parallelism designed into it, but it seemed less risky to be
incremental.

There are a lot of dependencies in a complex project.  Bundles wrap up
sets of files which include executable tasks.  Each task is composed
of libraries (linked from their own objects, derived from source code)
and its own source code: some hand-coded, and some derived from
object-oriented models, from lexical analyzers and compiler-compilers,
and from message-passing code generators (it can take a surprisingly
long time to generate optimized C code with a functional language).
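To make that concrete, here is a rough sketch of one such chain as
makefile rules.  The generator msggen and the file names are
hypothetical, and recipe lines would be tab-indented in a real
makefile:

    # model -> generated C -> objects -> library -> task executable
    proto.c: messages.model
            msggen -o proto.c messages.model  # message-passing generator
    lexer.c: tokens.l
            flex -o lexer.c tokens.l          # lexical analyzer
    libproto.a: proto.o lexer.o
            ar rcs libproto.a proto.o lexer.o
    task: main.o libproto.a
            cc -o task main.o libproto.a

A change to messages.model ripples through four derivation steps
before the task relinks, and a real bundle has dozens of such chains
crossing each other.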

Compile some of this for an ordinary unixy platform, some for any
platform which supports Java, and some for systems without a
filesystem, where all code runs in the same space as the kernel.  Each
unit of code wants its own options; all code is expected to honor any
global options; and the build system should not restrict porting code
between platforms with different build processes (or produce any delay
in the schedule at all ;).
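In GNU make, one way to layer per-unit options on top of global ones
is target-specific variables; a minimal sketch, assuming every rule
honors CFLAGS:

    CFLAGS = -Wall -O2       # global options all code must honor
    # per-unit options; kernel-space code drops the standard library
    kernel_task.o: CFLAGS += -ffreestanding -nostdlib
    %.o: %.c
            cc $(CFLAGS) -c -o $@ $<

The per-unit line only changes how one object is built, so the same
source can move to another platform's build without editing the rule
itself.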

All of these factors influence the build time of a project through a
complex web of dependencies, even after you write or modify all the
build tools to be reentrant so that you can run them all at once.
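Where a tool can't be made reentrant, a stopgap is to serialize just
that tool behind a lock so the rest of the build still runs in
parallel under make -j.  A sketch using flock(1), with msggen again a
hypothetical generator:

    %.c: %.model
            flock /tmp/msggen.lock msggen -o $@ $<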

The most effective build strategy I've found is avoidance: just don't
build what you don't have to, and make sure you build each thing only
once.  One thing complicating this is that make and its common
variants compare timestamps, not contents: when a version control
system regresses a file to an earlier revision and stamps it with a
date older than the derived object, make sees nothing to rebuild even
though the source changed.
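One workaround, a sketch assuming GNU make and md5sum, is to compare
content checksums instead of trusting timestamps: keep a stamp file
that is rewritten only when the checksum changes, and derive from the
stamp:

    %.md5: FORCE
            @md5sum $*.c | cmp -s - $@ || md5sum $*.c > $@
    %.o: %.md5
            cc $(CFLAGS) -c -o $@ $*.c
    FORCE:

The stamp's date only advances when the contents really change, so a
regressed file with an older date still triggers a rebuild.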

In a nutshell, my experience is that unless developers abandon all the
fancy tools that supposedly make it easier for them to write mountains
of brittle, special-purpose, especially model-generated code, the tool
chain created by these dependencies will defeat efforts to make a
single build run faster in parallel.  So all your extra processors
will only be useful for running many of these heavy builds at once, as
you try to have each developer build and test before integration.

> Sam

Jason Catena
