[ https://issues.apache.org/jira/browse/SPARK-26365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17422001#comment-17422001 ]

Vivien Brissat edited comment on SPARK-26365 at 9/29/21, 8:09 AM:
------------------------------------------------------------------

Hello,

I have the same issue: on a Kubernetes cluster, in cluster deploy mode, spark-submit
does not receive the driver's state.

(We run Spark from Argo Workflows.)
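
Until spark-submit itself surfaces the driver's state, one possible stopgap is to look
up the driver pod's terminated exit code with kubectl after spark-submit returns and
propagate it. A rough sketch follows; the pod name, namespace, and script are
illustrative assumptions, not something Spark provides (the driver pod name is pinned
up front via spark.kubernetes.driver.pod.name so it can be queried afterwards):

# check_driver_exit.py -- rough workaround sketch, not an official Spark mechanism.
# Assumptions: the job was submitted with
#   --conf spark.kubernetes.driver.pod.name=my-spark-driver
# and kubectl can reach the target namespace. "my-spark-driver" and
# "spark-jobs" are placeholder names.
import subprocess
import sys

DRIVER_POD = "my-spark-driver"   # fixed via spark.kubernetes.driver.pod.name
NAMESPACE = "spark-jobs"         # example namespace

def driver_exit_code() -> int:
    """Read the driver container's terminated exit code from the pod status."""
    jsonpath = "{.status.containerStatuses[0].state.terminated.exitCode}"
    result = subprocess.run(
        ["kubectl", "get", "pod", DRIVER_POD, "-n", NAMESPACE,
         "-o", f"jsonpath={jsonpath}"],
        capture_output=True, text=True, check=True,
    )
    out = result.stdout.strip()
    return int(out) if out else 1   # treat a missing exit code as a failure

if __name__ == "__main__":
    code = driver_exit_code()
    print(f"driver pod exit code: {code}")
    sys.exit(code)   # propagate what spark-submit did not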

Since this issue is old and has no official answer, I am just bumping the subject. :)

Thanks in advance. This is a real issue, and I don't know why it is marked as Minor.

Regards, Vivien



> spark-submit for k8s cluster doesn't propagate exit code
> --------------------------------------------------------
>
>                 Key: SPARK-26365
>                 URL: https://issues.apache.org/jira/browse/SPARK-26365
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Spark Core, Spark Submit
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: Oscar Bonilla
>            Priority: Minor
>         Attachments: spark-2.4.5-raise-exception-k8s-failure.patch, 
> spark-3.0.0-raise-exception-k8s-failure.patch
>
>
> When launching apps using spark-submit in a Kubernetes cluster, if the Spark
> application fails (returns exit code 1, for example), spark-submit will
> still exit gracefully and return exit code 0.
> This is problematic, since there is no way to know whether there has been a
> problem with the Spark application.
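
For concreteness, a minimal reproduction sketch of the reported behaviour (the master
URL, container image, and file path below are placeholders): a trivial PySpark job
that deliberately exits non-zero; per this report, when it is submitted with
--deploy-mode cluster against Kubernetes, the driver exits 1 but spark-submit on the
client still returns 0.

# fail_job.py -- trivial PySpark app that deliberately fails, to reproduce the report.
# Submitted with placeholders for the master URL, image, and path, e.g.:
#   spark-submit --master k8s://https://<api-server> --deploy-mode cluster \
#     --conf spark.kubernetes.container.image=<spark-image> \
#     local:///opt/app/fail_job.py
# Per this report, the driver terminates with exit code 1 while spark-submit
# on the client still exits 0.
import sys

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("exit-code-repro").getOrCreate()
spark.range(10).count()   # do a little work so the driver actually runs
spark.stop()
sys.exit(1)               # non-zero driver exit code that should be propagated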


