Hi,

I am running with spark.yarn.executor.memoryOverhead=8192, yet my executors still crash with "java.lang.OutOfMemoryError: GC overhead limit exceeded". Does that mean I genuinely don't have enough RAM, or is this a matter of config tuning? Other config options in use:

    spark.storage.memoryFraction=0.3
    SPARK_EXECUTOR_MEMORY=14G

This is Spark 1.2.0 running as yarn-client on a cluster of 10 nodes; the workload is ALS trainImplicit on a ~15GB dataset.

Thanks for any ideas,
Antony.
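P.S. In case it helps, the job looks roughly like the sketch below. The input path and the ALS parameters (rank, iterations, lambda, alpha) are placeholders, not the exact values I use; only the memory settings match my actual config:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.recommendation.{ALS, Rating}

    val conf = new SparkConf()
      .setMaster("yarn-client")
      .set("spark.executor.memory", "14g")               // SPARK_EXECUTOR_MEMORY=14G
      .set("spark.yarn.executor.memoryOverhead", "8192") // MB of off-heap headroom per executor
      .set("spark.storage.memoryFraction", "0.3")        // cap the block cache at 30% of heap
    val sc = new SparkContext(conf)

    // Parse "user,product,rating" lines into MLlib Ratings
    // (placeholder path and format).
    val ratings = sc.textFile("hdfs:///path/to/ratings").map { line =>
      val Array(user, product, rating) = line.split(',')
      Rating(user.toInt, product.toInt, rating.toDouble)
    }

    // Implicit-feedback ALS as in Spark 1.2.0 MLlib;
    // hyperparameters below are illustrative only.
    val model = ALS.trainImplicit(ratings, rank = 50, iterations = 10,
      lambda = 0.01, alpha = 40.0)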