[ https://issues.apache.org/jira/browse/SPARK-25500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16623335#comment-16623335 ]

Abhishek Rao commented on SPARK-25500:
--------------------------------------

The proposed solution is as follows.

Add the following properties when doing spark-submit:

*For ConfigMaps*

spark.kubernetes.driver.volumes.configMap.<volume-name>.mount.path=<mount location inside pod>

spark.kubernetes.driver.volumes.configMap.<volume-name>.options.name=<configmap name>

 

*For Secrets*

spark.kubernetes.driver.volumes.secret.<volume-name>.mount.path=<mount location inside pod>

spark.kubernetes.driver.volumes.secret.<volume-name>.options.name=<secret name>

 

Using these properties, the ConfigMap and Secret would be mounted into the spark-driver and spark-executor pods.
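
As an illustration only, assuming the proposed properties are accepted as-is, and assuming a ConfigMap named spark-conf-cm and a Secret named spark-secret have already been created (e.g. via kubectl create configmap / kubectl create secret generic), a spark-submit invocation could look roughly like this. All names, mount paths and the image are placeholders, and the executor-side properties are assumed to mirror the driver-side ones under the existing spark.kubernetes.executor.volumes prefix:

bin/spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --deploy-mode cluster \
  --class <main-class> \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kubernetes.driver.volumes.configMap.conf-vol.mount.path=/opt/spark/userconf \
  --conf spark.kubernetes.driver.volumes.configMap.conf-vol.options.name=spark-conf-cm \
  --conf spark.kubernetes.driver.volumes.secret.secret-vol.mount.path=/opt/spark/usersecrets \
  --conf spark.kubernetes.driver.volumes.secret.secret-vol.options.name=spark-secret \
  --conf spark.kubernetes.executor.volumes.configMap.conf-vol.mount.path=/opt/spark/userconf \
  --conf spark.kubernetes.executor.volumes.configMap.conf-vol.options.name=spark-conf-cm \
  --conf spark.kubernetes.executor.volumes.secret.secret-vol.mount.path=/opt/spark/usersecrets \
  --conf spark.kubernetes.executor.volumes.secret.secret-vol.options.name=spark-secret \
  local://<path-to-application-jar>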

Changes are required in "KubernetesVolumeSpec" and "MountVolumesFeatureStep". It will be along similar lines as SPARK-23529. A rough sketch of the possible change is below.

I plan to submit a PR for this shortly.

> Specify configmap and secrets in Spark driver and executor pods in Kubernetes
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-25500
>                 URL: https://issues.apache.org/jira/browse/SPARK-25500
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.1
>            Reporter: Abhishek Rao
>            Priority: Minor
>
> This builds on SPARK-23529. Support for specifying ConfigMaps and Secrets as 
> Spark configuration is requested.
> Using PR #22146, the above functionality can be achieved by passing a template 
> file. However, for Spark properties files (like log4j.properties, 
> fairscheduler.xml and metrics.properties), we are proposing this approach as it 
> is consistent with how other configuration options are specified in Spark.
> The ConfigMaps and Secrets have to be pre-created before being used as Spark 
> configuration.



