[ https://issues.apache.org/jira/browse/SPARK-25922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16677276#comment-16677276 ]

Yinan Li commented on SPARK-25922:
----------------------------------

The application ID used to set the {{spark-app-selector}} label for the driver 
pod comes from this line: 
[https://github.com/apache/spark/blob/3404a73f4cf7be37e574026d08ad5cf82cfac871/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala#L217]
The application ID used to set the {{spark-app-selector}} label for the 
executor pods comes from this line: 
[https://github.com/apache/spark/blob/5264164a67df498b73facae207eda12ee133be7d/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/KubernetesClusterSchedulerBackend.scala#L87]
Agreed that it's problematic that the driver and executor pods end up with two 
different IDs under the same label.
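
In rough terms, the two paths mint the IDs like this (a paraphrased sketch of 
the linked lines, not the exact source):
{code:scala}
import java.util.UUID

// Driver pod label: generated by spark-submit in KubernetesClientApplication,
// before the driver JVM (and its SparkContext) exists.
val kubernetesAppId = s"spark-${UUID.randomUUID().toString.replaceAll("-", "")}"

// Executor pod label: SchedulerBackend's default applicationId(), minted once
// the driver's SparkContext starts.
val appId = "spark-application-" + System.currentTimeMillis
{code}
Because the two values are minted independently at different times, they can 
never agree.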

> [K8] Spark Driver/Executor "spark-app-selector" label mismatch
> --------------------------------------------------------------
>
>                 Key: SPARK-25922
>                 URL: https://issues.apache.org/jira/browse/SPARK-25922
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>         Environment: Spark 2.4.0 RC4
>            Reporter: Anmol Khurana
>            Priority: Major
>
> Hi,
> I have been testing Spark 2.4.0 RC4 on Kubernetes to run Python Spark 
> applications and am running into an issue where the application ID labels on 
> the driver and executors mismatch. I am using the 
> [https://github.com/GoogleCloudPlatform/spark-on-k8s-operator] to run these 
> applications. 
> I see a spark.app.id of the form spark-* as the "spark-app-selector" label on 
> the driver, as well as in the K8s config-map which gets created for the 
> driver via spark-submit. My guess is this comes from 
> [https://github.com/apache/spark/blob/f6cc354d83c2c9a757f9b507aadd4dbdc5825cca/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala#L211]
>  
> But when the driver actually comes up and brings up executors etc., I see 
> that the "spark-app-selector" label on the executors, as well as the 
> spark.app.id config within the user code on the driver, is something of the 
> form spark-application-* (probably from 
> [https://github.com/apache/spark/blob/b19a28dea098c7d6188f8540429c50f42952d678/core/src/main/scala/org/apache/spark/SparkContext.scala#L511]
> and 
> [https://github.com/apache/spark/blob/bfb74394a5513134ea1da9fcf4a1783b77dd64e4/core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala#L26]
> ).
> We were consuming this "spark-app-selector" label on the driver pod to get 
> the app ID and using it to look up the app in the Spark History Server 
> (among other use cases), but due to this mismatch, that logic no longer 
> works. This was working fine in the Spark 2.2 fork for Kubernetes which I 
> was using earlier. Is this expected behavior, and if so, what's the correct 
> way to fetch the applicationId from outside the application?
> Let me know if I can provide any more details or if I am doing something 
> wrong. Here is an example run with different *spark-app-selector* labels on 
> the driver and executor: 
>  
> {code:none}
> Name:              pyfiles-driver
> Namespace:         default
> Priority:          0
> PriorityClassName: <none>
> Start Time:        Thu, 01 Nov 2018 18:19:46 -0700
> Labels:            spark-app-selector=spark-b78bb10feebf4e2d98c11d7b6320e18f
>                    spark-role=driver
>                    sparkoperator.k8s.io/app-name=pyfiles
>                    sparkoperator.k8s.io/launched-by-spark-operator=true
>                    version=2.4.0
> Status:            Running
> 
> Name:              pyfiles-1541121585642-exec-1
> Namespace:         default
> Priority:          0
> PriorityClassName: <none>
> Start Time:        Thu, 01 Nov 2018 18:24:02 -0700
> Labels:            spark-app-selector=spark-application-1541121829445
>                    spark-exec-id=1
>                    spark-role=executor
>                    sparkoperator.k8s.io/app-name=pyfiles
>                    sparkoperator.k8s.io/launched-by-spark-operator=true
>                    version=2.4.0
> Status:            Pending
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
