[ https://issues.apache.org/jira/browse/SPARK-26886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
luzengxiang updated SPARK-26886:
--------------------------------
    Description: 
When embedding a deep learning framework in Spark, a Spark worker has to launch an external process (e.g. an MPI task) in some cases.

{quote}
val nothing = inputData.barrier().mapPartitions { _ =>
  val barrierTask = BarrierTaskContext.get()
  // save data to disk
  barrierTask.barrier()
  barrierTask.barrier()
  // launch external process, e.g. MPI task + TensorFlow
}
{quote}

This Jira is about properly terminating external processes launched by a Spark worker when the Spark task is killed or interrupted.

  was:
When embedding a deep learning framework in Spark, a Spark worker has to launch an external process (e.g. an MPI task) in some cases.

val nothing = inputData.barrier().mapPartitions { _ =>
  val barrierTask = BarrierTaskContext.get()
  // save data to disk
  barrierTask.barrier()
  // launch external process, e.g. MPI task + TensorFlow
}

This Jira is about properly terminating external processes launched by a Spark worker when the Spark task is killed or interrupted.


> Proper termination of external processes launched by the worker
> ---------------------------------------------------------------
>
>                 Key: SPARK-26886
>                 URL: https://issues.apache.org/jira/browse/SPARK-26886
>             Project: Spark
>          Issue Type: New JIRA Project
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: luzengxiang
>            Priority: Minor
>
> When embedding a deep learning framework in Spark, a Spark worker has to
> launch an external process (e.g. an MPI task) in some cases.
> {quote}
> val nothing = inputData.barrier().mapPartitions { _ =>
>   val barrierTask = BarrierTaskContext.get()
>   // save data to disk
>   barrierTask.barrier()
>   barrierTask.barrier()
>   // launch external process, e.g. MPI task + TensorFlow
> }
> {quote}
> This Jira is about properly terminating external processes launched by a
> Spark worker when the Spark task is killed or interrupted.
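A minimal sketch of the kind of cleanup the issue asks for, in plain Scala (the object name `ExternalProcessCleanup` and the helper names are illustrative, not Spark API): track each external process launched by a task and terminate it gracefully first, then forcibly if it does not exit within a grace period.

```scala
import java.util.concurrent.TimeUnit

object ExternalProcessCleanup {
  // Start a command and register a JVM shutdown hook as a last-resort
  // guard, so the child does not outlive an abruptly dying executor.
  def launchTracked(cmd: Seq[String]): Process = {
    val proc = new ProcessBuilder(cmd: _*).start()
    sys.addShutdownHook(terminate(proc))
    proc
  }

  // Ask the process to exit (SIGTERM on Unix); if it is still alive
  // after the grace period, kill it forcibly (SIGKILL).
  def terminate(proc: Process): Unit = {
    if (proc.isAlive) {
      proc.destroy()
      if (!proc.waitFor(5, TimeUnit.SECONDS)) {
        proc.destroyForcibly()
        proc.waitFor()
      }
    }
  }
}
```

Inside a real barrier task, `terminate` could be hooked into `TaskContext.get().addTaskCompletionListener` (and `addTaskFailureListener`) so that the external process is also torn down when the Spark task is killed or interrupted, not only on JVM shutdown.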
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org