On Mon, 17 Jan 2011 17:05:27 +0200 Ciprian Dorin Craciun 
<ciprian.crac...@gmail.com>  wrote:
> On Mon, Jan 17, 2011 at 16:59, erik quanstrom <quans...@quanstro.net> wrote:
> >> Any ideas what could cause this?
> >
> > have you tried profiling mk?
> >
> > - erik
> 
>     In fact I tried to `strace -f -T` it and it seems that in the
> first second or so it `stats` all the files that exist, and then it
> just waits 14 seconds computing something (100% processor), and
> concludes that all is already built. (This is after I've already
> successfully built it once).

strace tells you what system calls were made and when.  To
find out which functions use the most time, compile mk with
-pg and look at the gprof output once it finishes.  Those 14
seconds were probably spent computing dependencies.  You can
convert your test.mk to a Makefile with a trivial sed script
and see how bsdmake or gmake fares time-wise. {bsd,g}make
have been abused with huge Makefiles for far longer and are
likely to be friendlier to them :-)
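A rough sketch of such a conversion (hypothetical; it assumes the
generated mkfile uses only plain `target: prereq' rules, $var
references, and <file includes, with none of mk's extensions like
virtual targets or rc-specific recipes):

```shell
# Tiny stand-in mkfile, just to show the two rewrites:
#   $VAR      -> $(VAR)      (make wants parens on multi-char names)
#   <file.mk  -> include file.mk
printf 'OBJ=$O.out\n<config.mk\n' > test.mk

sed -e 's/\$\([A-Za-z_][A-Za-z0-9_]*\)/$(\1)/g' \
    -e 's/^<\(.*\)/include \1/' \
    test.mk > Makefile

cat Makefile
# OBJ=$(O).out
# include config.mk

# Then compare wall-clock time on the real file:
#   time mk -f test.mk
#   time gmake -f Makefile
```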

But the real issue is that mk has to check all the long
dependency chains your generator creates, and it is probably
not tuned for such large mkfiles: typically one factors the
build logic out into a small set of mkfiles and uses meta-rules
where appropriate.
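For reference, a single mk meta-rule can stand in for thousands of
generated per-file rules (a generic sketch, not taken from the
thread's test.mk; $CC and $CFLAGS are assumed to be set elsewhere):

```mk
# % matches any stem; $stem is the matched part inside the recipe.
%.o: %.c
	$CC $CFLAGS -c $stem.c
```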
