If you're running out of memory during C++ compilation tasks, I don't have 
any useful suggestions beyond what you've already done. If you only run out 
of memory while running multiple linking tasks, you could try setting the 
gn argument `concurrent_links` to a smaller value than the default. This 
would limit how many linking tasks Ninja attempts to run at the same time, 
although it could still run any number of other tasks concurrently with a 
linking task.
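
For example, a sketch of that setting (out/Debug is a placeholder for your build directory, and 2 is an arbitrary value; pick one based on how much RAM each link step needs on your machine):

```
# out/Debug/args.gn
concurrent_links = 2   # allow at most two link steps to run simultaneously
```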

But what exactly is the default `concurrent_links` value? That depends on 
the platform, memory size, and build configuration; see 
https://source.chromium.org/chromium/chromium/src/+/main:build/toolchain/concurrent_links.gni. 
To find out what your default value is, I'd recommend adding 
`print(concurrent_links)` to the end of that file and starting a build. If 
you're building standalone V8 without the rest of Chromium, your copy of 
concurrent_links.gni is in the build/ directory within the V8 repo.
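
Concretely, that debugging step might look like this (out/Debug is a placeholder; the path below is relative to a Chromium checkout, or to the V8 repo root for a standalone build):

```
echo 'print(concurrent_links)' >> build/toolchain/concurrent_links.gni
gn gen out/Debug   # gn prints the computed concurrent_links value
```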

On Tuesday, April 30, 2024 at 3:56:33 AM UTC-7 Jakob Kummerow wrote:

> I don't think there's a built-in way to take RAM into account. ninja and 
> autoninja both look at the number of CPU cores/threads.
>
> FWIW, with 2 GiB memory per thread (so e.g. 16 GiB on a 4C/8T CPU), I 
> haven't had OOM issues on a fairly wide range of machines from laptops to 
> workstations.
>
> is_component_build = true may help with linker memory requirements (as 
> well as disk space consumption when building multiple targets).
>
> On Tue, Apr 30, 2024 at 9:01 AM Ben Noordhuis <in...@bnoordhuis.nl> wrote:
>
>> On Tue, Apr 30, 2024 at 3:56 AM Paul Harris <harr...@gmail.com> wrote:
>> >
>> > Hi,
>> >
>> > I'd like to run Ninja build with as many jobs as I have CPUs.
>> > However, some of the build jobs require a lot of RAM (especially for 
>> Debug builds), so I have to reduce the number of jobs artificially, just to 
>> keep the number of jobs under the ram limit.
>> >
>> > Is there a clever set of flags or options in the v8 build system that 
>> would automatically manage the number of jobs launched based on max RAM?
>> >
>> > Or do people simply learn from the past and set a max job limit for 
>> that particular machine?
>> >
>> > Thanks,
>> > Paul
>>
>> It's been my experience that it's really only the link commands that
>> consume lots of memory, not compilation, so I changed the $ld ninja
>> variable from `c++` to `flock c++`, to stop them from running in
>> parallel.
>>
>>
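For reference, the `flock` trick Ben describes serializes link steps on an advisory lock: each wrapped invocation blocks until it holds an exclusive lock on a file, so only one link proceeds at a time while other build tasks continue in parallel. A minimal sketch (the lock-file path is an arbitrary writable location I chose for illustration, not a V8 convention; in the real setup you would prepend the wrapper to the `$ld` variable in the generated ninja files):

```shell
# Wrapping the linker as `flock <lockfile> c++ ...` makes concurrent link
# commands queue on the lock instead of running at once. Demonstrated here
# with a stand-in command rather than a real link:
flock /tmp/v8-link.lock echo "holding the link lock"
```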

-- 
v8-dev mailing list
v8-dev@googlegroups.com
http://groups.google.com/group/v8-dev