On Tue, Oct 18, 2016 at 3:01 PM, Elkhan Dadashov <elkhan8...@gmail.com> wrote:
> Does my map task need to wait until Spark job finishes ?

No...

> Or is there any way, my map task finishes after launching Spark job, and I
> can still query and get status of Spark job outside of map task (or failure
> reason, if it has failed) ? (maybe by querying Spark job id ?)

...but if the SparkAppHandle returned by SparkLauncher goes away, then
you lose the ability to track the app's state, unless you talk
directly to the cluster manager.
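
For the "talk directly to the cluster manager" route: on YARN, the
ResourceManager's REST API can report (and kill) an application by its
application ID. A rough sketch — the host/port and application ID below
are placeholders, not values from this thread:

```shell
# Query YARN's ResourceManager REST API for an application's state.
# "rm-host:8088" and the application ID are assumed placeholders.
curl -s "http://rm-host:8088/ws/v1/cluster/apps/application_1476800000000_0001" \
  | grep -o '"state":"[A-Z_]*"'

# The same API can kill the app with a PUT to its /state resource:
curl -s -X PUT -H "Content-Type: application/json" \
  -d '{"state":"KILLED"}' \
  "http://rm-host:8088/ws/v1/cluster/apps/application_1476800000000_0001/state"
```

This only works if you recorded the application ID before the handle
went away (e.g. from SparkAppHandle.getAppId()).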

> I guess also if i want my Spark job to be killed, if corresponding delegator
> map task is killed, that means my map task needs to stay alive, so i still
> have SparkAppHandle reference ?

Correct, unless you talk directly to the cluster manager.
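
To make that concrete, a minimal sketch of the launcher pattern being
discussed — keeping the process alive so the SparkAppHandle can track
and, if necessary, kill the app. The jar path, main class, and master
URL are assumptions for illustration:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical app resource and main class; substitute your own.
        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("/path/to/app.jar")
            .setMainClass("com.example.MyApp")
            .setMaster("yarn")
            .startApplication(new SparkAppHandle.Listener() {
                @Override
                public void stateChanged(SparkAppHandle h) {
                    System.out.println("state: " + h.getState());
                }
                @Override
                public void infoChanged(SparkAppHandle h) {
                    // getAppId() becomes available here; worth recording
                    // in case you later need the cluster-manager route.
                }
            });

        // The launching process (e.g. the map task) must stay alive and
        // hold `handle` to keep polling state or to tear the app down.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
            // handle.kill();  // would forcibly kill the Spark app
        }
    }
}
```

If this process exits, the handle is lost and only the cluster manager
can report or kill the app, as noted above.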

-- 
Marcelo
