And it might not work completely. Spark only officially supports JDK 8;
I'm not sure whether JDK 9+ support is complete.
From: Jungtaek Lim
Sent: Thursday, February 7, 2019 5:22 AM
To: Gabor Somogyi
Cc: Hande, Ranjit Dilip (Ranjit); user@spark.apache.org
Subject
Another approach is adding an artificial exception to the application's
source code, like this:
val query = input.toDS.map(_ / 0).writeStream.format("console").start()
G
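To make the failure land only in particular tasks or a particular stage (rather than in every task, as the division by zero above does), something along the following lines could work. This is only a sketch, not from the thread: it assumes the same streaming `input` as above with spark.implicits._ in scope, and the injected condition (partition 0, first attempt) is an arbitrary illustrative choice.

    import org.apache.spark.TaskContext

    val ds = input.toDS.map { v =>
      val ctx = TaskContext.get()
      // Fail only the first attempt of partition 0, so the retried task
      // succeeds and the recovery path can be observed.
      if (ctx != null && ctx.partitionId() == 0 && ctx.attemptNumber() == 0) {
        throw new RuntimeException("injected failure for fault-tolerance testing")
      }
      v
    }
    val query = ds.writeStream.format("console").start()

TaskContext also exposes stageId(), so the check could instead target a specific stage id read from the Spark UI.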
On Sun, Feb 10, 2019 at 9:36 PM Serega Sheypak wrote:
Hi BR,
thanks for your reply. I want to mimic the issue and kill tasks at a
certain stage. Killing an executor is also an option for me.
I'm curious: how do core Spark contributors test Spark's fault tolerance?
On Sun, 10 Feb 2019 at 16:57, Gabor Somogyi wrote:
Hi Serega,
If I understand your problem correctly, you would like to kill only one
executor and leave the rest of the app untouched.
If that's the case, yarn application -kill is not what you want, because it
stops the whole application.
I've done a similar thing when testing Spark's HA features:
- jps -vl (lists the running JVMs, so you can find the executor process to kill)
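As a programmatic alternative to finding the executor JVM with jps and killing it at the OS level, the driver can ask the cluster manager to kill a specific executor. This is only a sketch, not from the thread: SparkContext.killExecutors is a developer API, it works only with cluster managers that support it (e.g. YARN), and the executor id "1" below is a placeholder you would read from the Executors tab of the Spark UI.

    // In spark-shell or driver code, with sc: SparkContext in scope.
    // "1" is a placeholder executor id taken from the Spark UI's Executors tab.
    val accepted: Boolean = sc.killExecutors(Seq("1"))
    println(s"kill request accepted: $accepted")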
yarn application -kill <applicationId> ?
On 10.02.2019 at 13:30, Serega Sheypak wrote:
Hi there!
I have a weird issue that appears only when tasks fail at a specific stage, and
I would like to imitate the failure on my own.
The plan is to run the problematic app and then kill the entire executor, or
some tasks, when execution reaches a certain stage.
Is it do-able?
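One way to know when execution has reached the stage of interest is a SparkListener registered on the driver. The sketch below is only illustrative and not from the thread: the stage id (3) is a placeholder taken from the Spark UI, and what happens once the flag flips (killing an executor, letting tasks throw, etc.) would use one of the approaches discussed earlier in the thread.

    import java.util.concurrent.atomic.AtomicBoolean
    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageSubmitted}

    val reachedTargetStage = new AtomicBoolean(false)

    sc.addSparkListener(new SparkListener {
      override def onStageSubmitted(event: SparkListenerStageSubmitted): Unit = {
        // 3 is a placeholder stage id; take the real one from the Spark UI / DAG view.
        if (event.stageInfo.stageId == 3) {
          reachedTargetStage.set(true) // trigger the failure injection from here
        }
      }
    })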