[ https://issues.apache.org/jira/browse/SPARK-25500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16625404#comment-16625404 ]
Yinan Li edited comment on SPARK-25500 at 9/24/18 5:51 AM:
-----------------------------------------------------------

We don't plan to add more configuration properties for pod customization as we move to a pod template model. See https://issues.apache.org/jira/browse/SPARK-24434. It supports all the use cases you mentioned above. BTW: we already have {{spark.kubernetes.[driver|executor].secrets.[SecretName]=[MountPath]}} since Spark 2.3.

> Specify configmap and secrets in Spark driver and executor pods in Kubernetes
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-25500
>                 URL: https://issues.apache.org/jira/browse/SPARK-25500
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.1
>            Reporter: Abhishek Rao
>            Priority: Minor
>
> This uses SPARK-23529. Support for specifying configmaps and secrets as Spark configuration is requested.
> Using PR #22146, the above functionality can be achieved by passing a template file. However, for Spark properties (like log4j.properties, fairscheduler.xml and metrics.properties), we are proposing this approach because it is native to how other configuration options are specified in Spark.
> The configmaps and secrets have to be pre-created before being used as Spark configuration.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
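As a concrete sketch of the two mechanisms discussed in the comment above (the property keys are Spark configuration names; the secret name, ConfigMap name, and mount paths are placeholder values, not from this issue):

```
# spark-defaults.conf sketch: mount a pre-created Kubernetes secret
# into the driver and executor pods (available since Spark 2.3).
# "my-secret" and "/etc/secrets" are example values.
spark.kubernetes.driver.secrets.my-secret=/etc/secrets
spark.kubernetes.executor.secrets.my-secret=/etc/secrets
```

Under the pod template model of SPARK-24434, the same kind of customization is instead expressed as a pod spec fragment, for example a template that mounts a pre-created ConfigMap holding log4j.properties:

```yaml
# Hypothetical pod template: mount the pre-created ConfigMap
# "spark-log4j-config" into the driver container.
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: spark-kubernetes-driver
      volumeMounts:
        - name: spark-conf-volume
          mountPath: /opt/spark/conf-extra
  volumes:
    - name: spark-conf-volume
      configMap:
        name: spark-log4j-config
```

Such a template is passed to Spark via the {{spark.kubernetes.[driver|executor].podTemplateFile}} properties.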