To: user@spark.apache.org
Subject: Yarn containers getting killed, error 52, multiple joins
Hi,
I have a Spark 1.6.2 app (previously tested on 2.0.0 as well). It requires a ton of memory (1.5 TB) for a small dataset (~500 MB). The memory usage seems to jump when I loop through and inner join […]
=0.75
--conf spark.yarn.executor.memoryOverhead=5120
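
For reference, a minimal sketch (not the original app) of supplying that same overhead setting from inside the driver with SparkConf instead of via --conf, assuming Scala; the app name is made up for illustration, and the 5120 MB value is taken from the flag above:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("join-loop-app")                        // hypothetical app name
  .set("spark.yarn.executor.memoryOverhead", "5120")  // per-executor off-heap headroom, in MB
val sc = new SparkContext(conf)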
Has anyone seen this, or does anyone have an idea how to tune it? There is no way it should need so much memory.
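
For context, a rough sketch of the loop-and-inner-join pattern described above, assuming Scala DataFrames; the join key column "id" and the helper name are assumptions for illustration, not taken from the original app:

import org.apache.spark.sql.DataFrame

// Hypothetical reconstruction of the described pattern, not the original code.
def joinLoop(base: DataFrame, others: Seq[DataFrame]): DataFrame = {
  var result = base
  for (other <- others) {
    // each pass adds another shuffle stage and grows the query plan/lineage
    result = result.join(other, Seq("id"), "inner")
  }
  result
}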
Thanks,
Ian