If you run tests with newer Java versions, you can also try:

- UseNUMA: -XX:+UseNUMA (NUMA-aware memory allocation for G1, available since JDK 14). See https://openjdk.org/jeps/345 and the sketch below.
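
A minimal sketch of how such a flag could be passed to Spark (assuming spark-submit; spark.executor.extraJavaOptions and spark.driver.extraJavaOptions are the standard Spark settings for extra JVM flags - for a client-mode driver, --driver-java-options may be needed instead):

    spark-submit \
      --conf "spark.executor.extraJavaOptions=-XX:+UseNUMA" \
      --conf "spark.driver.extraJavaOptions=-XX:+UseNUMA" \
      ...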

You can also evaluate the newer Java GC algorithms (a sketch of enabling them follows after the list):
- -XX:+UseShenandoahGC - works with terabyte-sized heaps and is more memory 
efficient than ZGC for heaps < 32 GB. See also: 
https://developers.redhat.com/articles/2021/09/16/shenandoah-openjdk-17-sub-millisecond-gc-pauses
- -XX:+UseZGC - also works with terabyte-sized heaps - see also: 
https://www.baeldung.com/jvm-zgc-garbage-collector
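
A sketch of what selecting one of these collectors for the executors could look like (same spark-submit assumption as above; note that Shenandoah is only available in JDK builds that ship it, e.g. most OpenJDK distributions but not Oracle's builds):

    # Shenandoah
    spark-submit --conf "spark.executor.extraJavaOptions=-XX:+UseShenandoahGC" ...

    # ZGC
    spark-submit --conf "spark.executor.extraJavaOptions=-XX:+UseZGC" ...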

Note: in JDK 21, ZGC has an additional option that could make sense to activate 
(it must be combined with -XX:+UseZGC):

-XX:+ZGenerational
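
A sketch of the combination (same spark-submit assumption as above):

    spark-submit --conf "spark.executor.extraJavaOptions=-XX:+UseZGC -XX:+ZGenerational" ...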

See also 
https://developers.redhat.com/articles/2021/11/02/how-choose-best-java-garbage-collector

Note: it might also be worth trying JDK 21 - it includes optimizations for 
certain GCs (among other things - I wonder how much improvement virtual 
threads could bring to Spark; see the sketch below).
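
For illustration only - a minimal, self-contained JDK 21 sketch of what virtual threads look like at the API level (a hypothetical demo class, not Spark code; Spark itself would have to adopt virtual threads internally for this to matter):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class VirtualThreadsDemo {
        public static void main(String[] args) {
            // JDK 21: each submitted task runs on a cheap virtual thread
            // instead of a dedicated OS thread.
            try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
                for (int i = 0; i < 10_000; i++) {
                    final int taskId = i;
                    executor.submit(() -> {
                        // Blocking call: the virtual thread unmounts and
                        // frees its carrier thread while sleeping.
                        Thread.sleep(10);
                        return taskId;
                    });
                }
            } // try-with-resources: close() waits for all tasks to finish
        }
    }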

> On 08.12.2023 at 01:02, Faiz Halde <haldef...@gmail.com> wrote:
> 
> 
> Hello,
> 
> We are planning to switch to Java 17 for Spark and were wondering if there are 
> any obvious learnings from anybody related to JVM tuning?
> 
> We've been running on Java 8 for a while now and used to use the parallel GC, 
> as that used to be a general recommendation for high-throughput systems. How 
> has the default G1GC worked out with Spark?
> 
> Thanks
> Faiz
