Re: Spark dump in slave Node EMR

2016-12-16 Thread Selvam Raman
If I want to take it specifically for the task number that failed, is it possible to take a heap dump? "16/12/16 12:25:54 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Container killed by YARN for exceeding memory limits. 20.0 GB of 19.8 GB physical memory used. Consider boosting
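That warning means the executor JVM grew past its YARN container limit. One hedged way to capture a dump for the failing task is to have every executor JVM write a heap dump automatically when it hits OutOfMemoryError, and to raise the off-heap overhead the warning alludes to. This is a sketch, not a confirmed recipe from the thread: the dump path, overhead size, and application file are illustrative placeholders.

```shell
# Sketch: configure executors to dump heap on OOM and give YARN more
# overhead headroom. /mnt/tmp and 2048 (MB) are example values; the
# dump is written on the slave node's local disk, not the master.
spark-submit \
  --master yarn \
  --conf "spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/mnt/tmp" \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  your_job.py   # placeholder application
```

The resulting `.hprof` file appears under the configured path on whichever slave node hosted the dying executor, and can be pulled back and opened in a heap analyzer such as Eclipse MAT.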

Spark dump in slave Node EMR

2016-12-16 Thread Selvam Raman
Hi, how can I take a heap dump on an EMR slave node to analyze? I have one master and two slaves. If I run the jps command on the master, I can see SparkSubmit with its PID, but I cannot see anything on the slave nodes. How can I take a heap dump for the Spark job? -- Selvam Raman "லஞ்சம் தவிர்த்து நெஞ்சம்
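A likely reason `jps` shows nothing on the slaves is that Spark executors run there as YARN containers (main class `CoarseGrainedExecutorBackend`) under a different OS user (typically `yarn` or `hadoop` on EMR), and `jps` only lists JVMs owned by the invoking user. A hedged sketch of finding the executor PID and dumping its heap follows; the user name and dump path are assumptions, and the PID-extraction step is wrapped in a small function so it can be exercised against sample `jps` output without a live cluster:

```shell
# Extract the executor PID from `jps` output (first match wins).
executor_pid() {
  awk '/CoarseGrainedExecutorBackend/ {print $1; exit}'
}

# On a live EMR slave you would run something like (assumes the
# executor JVM belongs to the `yarn` user; adjust for your cluster):
#   sudo jps -l                       # confirm the executor is listed
#   pid=$(sudo jps | executor_pid)
#   sudo -u yarn jmap -dump:live,format=b,file=/tmp/executor-${pid}.hprof "$pid"

# Demonstration of the PID extraction against canned jps output:
printf '12345 CoarseGrainedExecutorBackend\n6789 Jps\n' | executor_pid
```

Running `jps` via `sudo` (or as the container's owner) is the key step; the `jmap` dump must likewise be taken as the JVM's owning user or it will fail to attach.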