Quoting Basile Starynkevitch <bas...@starynkevitch.net>:

I am also in favor of software linked dynamically against shared
libraries, for a very pragmatic reason: if a program is built from shared
libraries, then modifying one such library in its implementation (not its
interface) is very often easier for the developer, who can, thanks to
dynamic linking, test and use the improved library more easily and more
quickly. In particular, if GCC were made of shared libraries, I believe
the build time would be much faster for the developer (i.e. the GCC
contributor), and that is a big comfort in practice.
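
The workflow he has in mind is roughly the following (a minimal sketch
with hypothetical file names; the point is that the interface header
never changes, so only libwidget.so is rebuilt and the main binary picks
it up at run time without relinking):

    /* widget.h -- the interface; unchanged between iterations */
    int widget_cost(int n);

    /* widget.c -- the implementation; the only file rebuilt per tweak */
    #include "widget.h"
    int widget_cost(int n)
    {
        return n * 3;   /* change this, rebuild libwidget.so, rerun */
    }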

I don't see the claimed speedup.  Re-linking cc1 / cc1plus is reasonably
fast.  And you'll have to rebuild all the target libraries for a full
test, whether you have one monolithic executable or one made of lots of
DSOs.  But in the latter case, you'll also have to dynamically link
against numerous DSOs for every single library-file compilation.

Where we could save rebuild time is in cutting unwanted header-file
dependencies, such as tm.h being included by the front ends.
So, in that sense, more modularity does help build times.
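
For concreteness, here is a hypothetical illustration (not actual GCC
code) of cutting such a dependency: a front end that only stores a
pointer to the target description can get by with a forward declaration,
so editing tm.h no longer forces a rebuild of every front end.

    /* frontend.h -- note: no #include of tm.h is needed */
    struct target_info;                     /* forward declaration only */
    void set_target(struct target_info *ti);

    /* frontend.c -- still no tm.h: it merely stores the pointer */
    #include "frontend.h"
    static struct target_info *current_target;
    void set_target(struct target_info *ti) { current_target = ti; }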

But any gain from not statically re-linking the full executable is
likely lost already to the more resource-hungry -fpic compilation, and
even more so to the dynamic-link overhead paid on each invocation.
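
To make the -fpic cost concrete, consider this toy example (my own, not
from GCC).  In a shared library, an access to an interposable global
goes through the GOT:

    extern int opt_level;  /* global exported from the DSO, interposable */

    int scaled(int x)
    {
        /* non-PIC: one direct load of opt_level.
           -fpic:   first load opt_level's address from the GOT, then
                    load the value -- an extra indirection per access,
                    and on 32-bit x86 a general-purpose register is
                    additionally tied up as the PIC register. */
        return x * opt_level;
    }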

Time savings when compiling the compiler itself with LTO might be more
noticeable, but only because the DSO splits add additional boundaries
which severely limit what LTO can do for you.  LTO and fast turn-around
times don't mix very well, anyway.
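
A contrived example of the boundary effect (hypothetical file names):

    /* fold.c -- compiled into libfold.so */
    int fold_const(int a, int b) { return a + b; }

    /* caller.c -- with static linking plus LTO, the call below folds
       to the constant 5; across a DSO boundary it stays a call through
       the PLT on every execution, because LTO never sees the callee. */
    extern int fold_const(int a, int b);
    int simplify(void) { return fold_const(2, 3); }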
