[ https://issues.apache.org/jira/browse/SPARK-24434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17218056#comment-17218056 ]
Liu Runzhong edited comment on SPARK-24434 at 10/21/20, 2:34 AM:
-----------------------------------------------------------------

I think yes, you need to provide a pod template file to spark-submit, along the following lines [~prakki79]:

```yaml
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: sidecar-container
      # Simple sidecar: serves log files using nginx.
      # In practice, this sidecar would be a custom image
      # that uploads logs to a third-party or storage service.
      image: nginx:1.7.9
      ports:
        - containerPort: 80
```

> Support user-specified driver and executor pod templates
> --------------------------------------------------------
>
>                 Key: SPARK-24434
>                 URL: https://issues.apache.org/jira/browse/SPARK-24434
>             Project: Spark
>          Issue Type: New Feature
>          Components: Kubernetes, Spark Core
>    Affects Versions: 2.4.0
>            Reporter: Yinan Li
>            Assignee: Onur Satici
>            Priority: Major
>             Fix For: 3.0.0
>
> With more requests for customizing the driver and executor pods coming in, the current approach of adding new Spark configuration options has two serious drawbacks: 1) it means more Kubernetes-specific configuration options to maintain, and 2) it widens the gap between the declarative model used by Kubernetes and the configuration model used by Spark. We should design a solution that allows users to specify pod templates as central places for all customization needs for the driver and executor pods.
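For reference, a sketch of how a pod template file like the one above can be handed to spark-submit once this feature landed (Spark 3.0+). The master URL, container image, template path, and application jar below are placeholders, not values from this issue:

```shell
# Hypothetical spark-submit invocation pointing Spark 3.0+ at a pod
# template file. <k8s-apiserver>, <your-spark-image>, and the paths
# are placeholders -- substitute your own values.
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  --conf spark.kubernetes.driver.podTemplateFile=/path/to/pod-template.yaml \
  --conf spark.kubernetes.executor.podTemplateFile=/path/to/pod-template.yaml \
  local:///opt/spark/examples/jars/spark-examples.jar
```

Spark merges the template with the pod spec it builds itself; template fields that Spark also sets (for example the main driver/executor container) are overridden by Spark's own values, so templates are best suited for additions like the sidecar container shown above.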