On Saturday, 6 January 2024 15:28:53 GMT Wols Lists wrote:

> As far as I'm aware, there's no mystery. On a single machine you get the
> exact same thing ... it's all down to parallelism.
> 
> Make asks itself "how many separate tasks can I do at the same time,
> which won't interfere with each other". In gcc's case, the answer
> appears to be two. It doesn't matter how much resource is available,
> make can only make use of two cores.

Yet if I set -distcc with -j12 -l12, I get 12 jobs running in parallel. That's the 
mystery.
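
For context, the setup I'm describing is roughly this; the job counts and the localhost-only host list are illustrative, not necessarily my exact values:

```
# /etc/portage/make.conf -- sketch of a single-host distcc setup
FEATURES="distcc"
MAKEOPTS="-j12 -l12"
USE="-fortran"
```

with the distcc host list pointing only at the local machine, e.g. via `distcc-config --set-hosts "localhost"` or an entry in /etc/distcc/hosts.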
 
> In other cases, there may be a hundred separate tasks, make fires off a
> hundred tasks shared amongst all the resource it can find, and sits back
> and waits.

And that's how the very first installation goes with single-host distcc. Then, 
when it gets to gcc, it collapses to 2 jobs and everything gained so far is lost 
many times over. (I set USE=-fortran to avoid pointless recompilation, since 
nothing here needs it.)

> Think of a hundred compile jobs all running at the same time, but then
> the linker is invoked, and you can only have the one linker running,
> after all the compile jobs have finished.

I hadn't thought of that - another thing to consider.
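
That fan-in shows up even in a toy Makefile (file names hypothetical): the pattern-rule compiles can all run concurrently under -j, but the single link rule cannot start until every object exists, so parallelism briefly collapses to one job.

```
# Toy example: compiles parallelize, the link step serializes.
OBJS = a.o b.o c.o d.o

prog: $(OBJS)            # one link job; waits for ALL objects
	$(CC) -o $@ $(OBJS)

%.o: %.c                 # these run in parallel under 'make -j'
	$(CC) -c -o $@ $<
```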

> And this is a HARD problem, I haven't seen it recently, but there used
> to be plenty of threads about hard-to-debug compile failures that went
> away with -j1. The obvious cause was two compile jobs being set off in
> parallel, when in reality one depended on the other, and things messed up.

I haven't seen it recently either.
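
For what it's worth, the classic shape of those -j1-only failures is an undeclared prerequisite; a hypothetical sketch:

```
# gen.h is generated, and parser.o really needs it -- but the
# dependency is not declared, so under 'make -j' parser.o may be
# compiled before gen.h exists. Under -j1 the left-to-right rule
# order happens to hide the bug.
all: gen.h parser.o

gen.h:
	./genheader.sh > gen.h

parser.o: parser.c       # missing prerequisite: gen.h
	$(CC) -c -o $@ parser.c
```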

-- 
Regards,
Peter.