A SIGTERM on YARN generally means the NodeManager (NM) is killing your
executor because it is running over its requested memory limit. Check your
NM logs to confirm. If that's the cause, take a look at the "memoryOverhead"
settings for the driver and executors
(http://spark.apache.org/docs/latest/running-on-yarn.html).
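For example, a minimal sketch of raising the overhead at submit time (the
class name, jar, and memory sizes below are placeholders; the overhead
values are plain megabytes):

    spark-submit \
      --master yarn-cluster \
      --class com.example.MyApp \
      --executor-memory 4g \
      --conf spark.yarn.executor.memoryOverhead=1024 \
      --conf spark.yarn.driver.memoryOverhead=1024 \
      my-app.jar

The default overhead is a fraction of the executor memory (10%, with a
384 MB floor, in current releases), so applications that do a lot of
off-heap allocation often need to raise it.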

On Tue, Jul 7, 2015 at 7:43 AM, Kostas Kougios <
kostas.koug...@googlemail.com> wrote:

> I've recompiled Spark, deleting the -XX:OnOutOfMemoryError=kill declaration,
> but I am still getting a SIGTERM!


-- 
Marcelo
