[ https://issues.apache.org/jira/browse/SPARK-27697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16859717#comment-16859717 ]
Henry Yu commented on SPARK-27697:
----------------------------------

[~dongjoon] I fixed it by adding a pod phase check: if the driver pod exits in any phase other than Succeeded, a SparkException is thrown in org.apache.spark.deploy.k8s.submit.LoggingPodStatusWatcherImpl#awaitCompletion.

> KubernetesClientApplication alway exit with 0
> ---------------------------------------------
>
>                 Key: SPARK-27697
>                 URL: https://issues.apache.org/jira/browse/SPARK-27697
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Henry Yu
>            Priority: Minor
>
> When submitting a Spark job to Kubernetes, workflows try to determine the job status from the submission process exit code. By contrast, YarnClient throws a SparkException when the application fails. I have fixed this in our in-house Spark version and can open a PR for this issue.

--
This message was sent by Atlassian JIRA (v7.6.3#76005)
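The fix described in the comment can be sketched as follows. This is a hypothetical, simplified illustration, not the actual Spark patch: the class name `PodPhaseCheck` and method `exitCodeForPhase` are made up for this example, and a plain `RuntimeException` stands in for Spark's `SparkException`. The real change would live in `LoggingPodStatusWatcherImpl#awaitCompletion` and inspect the driver pod's terminal phase via the Kubernetes client.

```java
// Hypothetical sketch of the described fix: map the driver pod's terminal
// phase to a submission exit code, throwing when the pod did not succeed.
public class PodPhaseCheck {

    /**
     * Returns 0 only when the driver pod finished in the "Succeeded" phase.
     * Any other terminal phase (e.g. "Failed", "Unknown") raises an
     * exception, so the submission process exits non-zero instead of 0.
     */
    public static int exitCodeForPhase(String phase) {
        if (!"Succeeded".equals(phase)) {
            // In Spark this would be a SparkException thrown from
            // LoggingPodStatusWatcherImpl#awaitCompletion.
            throw new RuntimeException(
                "Driver pod finished with phase: " + phase);
        }
        return 0;
    }
}
```

The key design point, per the comment, is parity with YARN mode: workflow engines that wrap spark-submit rely on the process exit code, so a failed driver pod must propagate as an exception rather than a silent exit 0.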