Benjamin Miao CAI created SPARK-29487:
-----------------------------------------
             Summary: Ability to run Spark Kubernetes other than from /opt/spark
                 Key: SPARK-29487
                 URL: https://issues.apache.org/jira/browse/SPARK-29487
             Project: Spark
          Issue Type: Improvement
          Components: Kubernetes, Spark Submit
    Affects Versions: 2.4.4
            Reporter: Benjamin Miao CAI


In the Spark Kubernetes Dockerfile, the Spark binaries are copied to /opt/spark. If we build our own Dockerfile that installs Spark somewhere other than /opt/spark, the resulting image will not run. Looking at the source code, it seems that in various places the path is hard-coded to /opt/spark.

Example, in Constants.scala:

    // Spark app configs for containers
    val SPARK_CONF_VOLUME = "spark-conf-volume"
    val SPARK_CONF_DIR_INTERNAL = "/opt/spark/conf"

Is it possible to make this configurable so we can put Spark somewhere other than /opt/?

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
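One possible direction, sketched purely as an illustration (this is not Spark's current code; the name SPARK_HOME_INTERNAL and the SPARK_HOME environment-variable lookup are assumptions for this sketch): derive the constants from an overridable base directory, keeping /opt/spark as the default so existing images keep working.

```scala
// Hypothetical sketch only -- Spark's Constants.scala does not currently do
// this. The idea: fall back to /opt/spark, but allow the install location to
// be overridden via an environment variable (variable name is an assumption).
object Constants {
  // Base install dir: overridable instead of hard-coded.
  val SPARK_HOME_INTERNAL: String =
    sys.env.getOrElse("SPARK_HOME", "/opt/spark")

  // Spark app configs for containers
  val SPARK_CONF_VOLUME = "spark-conf-volume"

  // Conf dir derived from the base dir rather than a literal "/opt/spark/conf".
  val SPARK_CONF_DIR_INTERNAL: String = s"$SPARK_HOME_INTERNAL/conf"
}
```

With this shape, an image built with Spark under, say, /usr/lib/spark could set the override at container start instead of rebuilding against a patched constant.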