We have a large non-recursive makefile system that works very well on
the whole. Sources are grouped into functional directories and,
following compilation, are linked into static archive libraries (one per
directory) which get fully linked into one overall executable at the
end.

 

Source dependencies are held in .d files (one per source) and these are
updated as a side-effect of compiling with gcc - we use Tom Tromey's
method. The source dependencies are pulled into the build with an
'-include <list of .d files>' for each directory.
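For readers unfamiliar with the scheme, here is a minimal sketch of that per-source .d setup. The variable names (SRCS, OBJS, DEPS) are illustrative, not taken from our build, and it uses gcc's -MMD/-MP flags, the modern equivalent of Tromey's original sed-based postprocessing:

```make
# Illustrative names; one .d file per source, written as a
# side-effect of compilation.
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(SRCS:.c=.d)

# -MMD writes foo.d alongside foo.o; -MP adds phony targets for
# headers so deleting a header does not break the build.
%.o: %.c
	gcc -MMD -MP -c $< -o $@

# Pull in every per-source dependency file; the leading '-' tells
# make to ignore the ones that do not exist yet.
-include $(DEPS)
```

With one such include list per directory, make must open and parse every .d file on every run, which is where the start-up cost below comes from.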

 

One disadvantage of the system is that the start-up time for make is
seen as too long by some users when only one source actually needs to
be re-compiled.

 

I've started to wonder about reducing the start-up time by placing all
source dependencies in a per-directory .d file generated at the archive
link stage and including this instead of including all the source .d
files. A quick test verifies that this significantly reduces the
start-up time, since make has to open far fewer files. I've yet to
figure out quite what the implications are for ensuring the
per-directory .d file gets updated correctly.
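To make the idea concrete, something along these lines is what I have in mind - a rough sketch only, with hypothetical names (libfoo.a, deps.d), and it deliberately leaves open the update-correctness question raised above:

```make
# Hypothetical: fold all per-source .d files for this directory into
# one deps.d when the archive is (re)built, then include only that.
libfoo.a: $(OBJS)
	$(AR) rcs $@ $^
	cat $(DEPS) > deps.d

# One open()/parse per directory instead of one per source.
-include deps.d
```

The open question is what happens between a compile and the next archive link: during that window deps.d is stale relative to the per-source .d files, so the rule for regenerating it needs care.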

 

Before I go any further, has anyone else done anything like this? If so,
were there any pitfalls?

Thanks in advance,

Philip.

 

_______________________________________________
Help-make mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/help-make