On 1/13/11, Ralf Wildenhues <ralf.wildenh...@gmx.de> wrote:
> make is a bit flawed for real large projects because it always walks
> the whole dependency graph, unlike beta build systems that use a notify
> daemon and a database to only walk subgraphs known to be outdated.
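
For concreteness, the picture I have of that approach is roughly the
following; a toy Python sketch of the idea only, not modeled on any
particular build tool, and with the change notification simulated by a
plain function call where a real daemon would hook into inotify or the
like:

    from collections import defaultdict

    class BuildGraph:
        def __init__(self):
            self.dependents = defaultdict(set)  # source -> targets built from it
            self.dirty = set()                  # the "database" of pending changes

        def add_edge(self, target, source):
            self.dependents[source].add(target)

        def mark_changed(self, path):
            # In a real system the notify daemon would call this on file change.
            self.dirty.add(path)

        def rebuild(self):
            # Visit only the subgraph downstream of dirty files, instead of
            # stat()ing every node the way make does on each invocation.
            work, seen = list(self.dirty), set()
            while work:
                node = work.pop()
                if node in seen:
                    continue
                seen.add(node)
                work.extend(self.dependents[node])
            for target in seen - self.dirty:
                print("rebuilding", target)
            self.dirty.clear()

    g = BuildGraph()
    g.add_edge("foo.o", "foo.c")
    g.add_edge("bar.o", "bar.c")
    g.add_edge("prog", "foo.o")
    g.add_edge("prog", "bar.o")
    g.mark_changed("foo.c")   # only foo.o and prog get revisited; bar.o is untouched
    g.rebuild()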

How big is "real large"?  GCC uses make, for instance, and it's the
biggest public project that I personally know about.  I worked on a
250 kSLOC project, but that was in Java and just used Eclipse's native
tooling (not even Ant).  I worked on a 1 MSLOC project in Ada, but for
that project every build was a complete, from-scratch build, for
reasons outside the scope of this thread.

At what magnitude does make break down, do you think?  And how/where
does it become flawed?

In retrospect, even when dealing with GCC, I never do a partial
rebuild.  Every patch for me, no matter how simple, involves a complete
fresh build (and, because of the nature of what I do with GCC, that
also involves a complete fresh build of binutils and of our libc).  So
I really don't have any good context from which to extrapolate your
meaning.
