Not to say you shouldn't do this, but I worry that increasingly computing
is being done in "containers" where e.g. the number of CPUs is doubling
every year but only a small number are actually available to any given
process.  If availableProcessors reports 1 million, what should we do?
(no need to answer...)
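
For illustration only, a rough sketch in Java of the kind of clamping I
have in mind; the bound is a made-up number, not a proposal for the
actual build logic:

    class ProcessorClamp {
        // Don't trust an enormous availableProcessors() report blindly when
        // sizing parallel work; the bound here is entirely made up.
        static int usableCpus(int bound) {
            int reported = Runtime.getRuntime().availableProcessors();
            return Math.min(reported, bound);
        }

        public static void main(String[] args) {
            System.out.println(usableCpus(32)); // 32 is a hypothetical cap
        }
    }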

On Tue, Dec 1, 2015 at 1:55 AM, Erik Joelsson <erik.joels...@oracle.com>
wrote:

> Hello,
>
> The current heuristic for choosing the default value of the -j flag passed
> to make needs some tweaking.
>
> In JDK 9, it looks at the amount of memory and the number of cpus in the
> system. It divides memory by 1024 to get a safe number of jobs that will
> fit into memory. The lower of that number and the number of cpus is then
> picked. The number is then scaled down to about 90% of the number of cpus
> to leave some resources for other activities. It is also capped at 16.
>
> Since we now have the build using "nice" to make sure the build isn't
> bogging down the system, I see no reason to do the 90% scaling anymore.
> Also, the performance issues that forced us to cap at 16 have long been
> fixed, and even if we don't scale well beyond 16, we do still scale. So I
> propose we remove that arbitrary limitation too.
>
> Bug: https://bugs.openjdk.java.net/browse/JDK-8144312
> Webrev: http://cr.openjdk.java.net/~erikj/8144312/webrev.01/
>
> /Erik
>
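
For my own understanding, here is a minimal sketch in Java of the heuristic
as described above, next to the proposed simplification. The real logic
lives in the build's configure scripts, not in Java, and the memory unit
(MB) and the exact order of the steps are my assumptions:

    class DefaultJobs {
        // JDK 9 behaviour as I read the description: one job per GB of
        // memory, at most one per CPU, scaled down to ~90% of the CPUs,
        // and capped at 16.
        static int current(int memoryMb, int cpus) {
            int jobs = Math.min(memoryMb / 1024, cpus);
            jobs = Math.min(jobs, cpus * 90 / 100);
            return Math.max(1, Math.min(jobs, 16));
        }

        // Proposed behaviour: keep the memory bound, drop the 90% scaling
        // and the cap of 16.
        static int proposed(int memoryMb, int cpus) {
            return Math.max(1, Math.min(memoryMb / 1024, cpus));
        }

        public static void main(String[] args) {
            // 64 GB of memory, 32 CPUs: 16 today, 32 with the proposal.
            System.out.println(current(64 * 1024, 32));
            System.out.println(proposed(64 * 1024, 32));
        }
    }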
