Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-12 Thread Vadim Semenov
Yeah, then the easiest would be to fork Spark and run using the forked version; in the case of YARN it should be pretty easy to do:

git clone https://github.com/apache/spark.git
cd spark
export MAVEN_OPTS="-Xmx4g -XX:ReservedCodeCacheSize=512m"
./build/mvn -DskipTests clean package
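For submitting the forked build to YARN, a runnable distribution is usually more convenient than the bare build tree. A hedged sketch of that follow-up step, using the `dev/make-distribution.sh` script that ships in the Spark repo; the distribution name, Hadoop profile, and the `SparkPi` example class are illustrative placeholders:

```shell
# Build a distributable tarball from the forked tree (script is part of the Spark repo).
./dev/make-distribution.sh --name custom-fault-injection --tgz -Pyarn -Phadoop-2.7

# Unpack and submit with the forked build; the example class and jar path are placeholders.
tar -xzf spark-*-bin-custom-fault-injection.tgz
cd spark-*-bin-custom-fault-injection
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_*.jar 100
```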

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-12 Thread Serega Sheypak
I tried a similar approach; it works well for user functions, but I need to crash tasks or an executor when the Spark application runs "repartition". I didn't find any way to inject a "poison pill" into the repartition call :(

Mon, 11 Feb 2019 at 21:19, Vadim Semenov:
> something like this
>
> import
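Since `repartition` itself offers no hook, one workaround is to crash the stage that reads the shuffle output instead: chain a `mapPartitions` directly after the repartition and throw there. A hedged sketch, not the thread's own solution; the dataset, partition count, and target partition are placeholders:

```scala
import org.apache.spark.TaskContext
import org.apache.spark.sql.SparkSession

object PoisonPill {
  // Pure rule: poison exactly one shuffle-read partition.
  def isPoisoned(partitionId: Int, target: Int): Boolean = partitionId == target

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("poison-pill").getOrCreate()
    import spark.implicits._

    val ds = spark.range(0, 1000000).map(_.toString)

    // The repartition call cannot be intercepted, but the stage that reads
    // its shuffle output can: throw in a mapPartitions chained right after it.
    val crashed = ds.repartition(100).mapPartitions { iter =>
      if (isPoisoned(TaskContext.get.partitionId, 0))
        throw new RuntimeException("poison pill in post-repartition stage")
      iter
    }
    crashed.count() // triggers the job; the task for partition 0 fails
  }
}
```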

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-11 Thread Vadim Semenov
something like this:

import org.apache.spark.TaskContext

ds.map(r => {
  val taskContext = TaskContext.get()
  if (taskContext.partitionId == 1000) {
    throw new RuntimeException
  }
  r
})

On Mon, Feb 11, 2019 at 8:41 AM Serega Sheypak wrote:
>
> I need to crash task which does repartition.
>
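A refinement on the same trick: `TaskContext` also exposes `attemptNumber`, so the injected failure can be limited to a task's first attempt. The retry then succeeds and the job completes, which mimics a transient task failure rather than a fatal one. A sketch under the same assumptions; the dataset and the chosen partition are placeholders:

```scala
import org.apache.spark.TaskContext
import org.apache.spark.sql.SparkSession

object TransientFailure {
  // Pure rule: fail only the first attempt of one chosen partition.
  def shouldFail(partitionId: Int, attemptNumber: Int, target: Int): Boolean =
    partitionId == target && attemptNumber == 0

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("transient-failure").getOrCreate()
    import spark.implicits._

    val ds = spark.range(0, 100000).map(_.toLong)
    val out = ds.map { r =>
      val tc = TaskContext.get()
      if (shouldFail(tc.partitionId, tc.attemptNumber, 0))
        throw new RuntimeException(s"injected failure, attempt ${tc.attemptNumber}")
      r
    }
    out.count() // first attempt of partition 0 fails; its retry succeeds
  }
}
```

Note this relies on the scheduler retrying failed tasks (the default on YARN, governed by `spark.task.maxFailures`); in local mode a single task failure may fail the whole job.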

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-11 Thread Serega Sheypak
I need to crash the task which does the repartition.

Mon, 11 Feb 2019 at 10:37, Gabor Somogyi:
> What blocks you to put if conditions inside the mentioned map function?
>
> On Mon, Feb 11, 2019 at 10:31 AM Serega Sheypak wrote:
>> Yeah, but I don't need to crash entire app, I want to fail

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-11 Thread Gabor Somogyi
What blocks you from putting if conditions inside the mentioned map function?

On Mon, Feb 11, 2019 at 10:31 AM Serega Sheypak wrote:
> Yeah, but I don't need to crash entire app, I want to fail several tasks
> or executors and then wait for completion.
>
> Sun, 10 Feb 2019 at 21:49, Gabor Somogyi

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-11 Thread Serega Sheypak
Yeah, but I don't need to crash the entire app; I want to fail several tasks or executors and then wait for completion.

Sun, 10 Feb 2019 at 21:49, Gabor Somogyi:
> Another approach is adding artificial exception into the application's
> source code like this:
>
> val query = input.toDS.map(_ /

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Gabor Somogyi
Another approach is adding an artificial exception into the application's source code, like this:

val query = input.toDS.map(_ / 0).writeStream.format("console").start()

G

On Sun, Feb 10, 2019 at 9:36 PM Serega Sheypak wrote:
> Hi BR,
> thanks for your reply. I want to mimic the issue and kill

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Serega Sheypak
Hi BR,
thanks for your reply. I want to mimic the issue and kill tasks at a certain stage. Killing an executor is also an option for me. I'm curious: how do core Spark contributors test Spark's fault tolerance?

Sun, 10 Feb 2019 at 16:57, Gabor Somogyi:
> Hi Serega,
>
> If I understand your

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Gabor Somogyi
Hi Serega,

If I understand your problem correctly, you would like to kill one executor only and leave the rest of the app untouched. If that's true, yarn application -kill is not what you want, because it stops the whole application. I've done a similar thing when testing Spark's HA features.

- jps

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Jörn Franke
yarn application -kill <applicationId> ?

On 10.02.2019 at 13:30, Serega Sheypak wrote:
>
> Hi there!
> I have weird issue that appears only when tasks fail at specific stage. I
> would like to imitate failure on my own.
> The plan is to run problematic app and then kill entire executor or
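For completeness, the whole-application route (which, as noted elsewhere in the thread, kills everything rather than a single task or executor): list the running apps to find the id, then kill it. The application id below is a placeholder in the standard `application_<timestamp>_<sequence>` format:

```shell
# List running YARN applications to find the id.
yarn application -list -appStates RUNNING

# Kill the whole application -- every executor and the driver go down.
yarn application -kill application_1549000000000_0001
```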

Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Serega Sheypak
Hi there!
I have a weird issue that appears only when tasks fail at a specific stage. I would like to imitate the failure on my own.
The plan is to run the problematic app and then kill the entire executor, or some tasks, when execution reaches a certain stage.
Is it doable?