Github user skonto commented on a diff in the pull request: https://github.com/apache/spark/pull/21462#discussion_r192023771

--- Diff: docs/running-on-kubernetes.md ---

@@ -121,8 +121,8 @@
This URI is the location of the example jar that is already in the Docker image. If your application's dependencies are all hosted in remote locations like HDFS or HTTP servers, they may be referred to by their appropriate remote URIs. Also, application dependencies can be pre-mounted into custom-built Docker images.
-Those dependencies can be added to the classpath by referencing them with `local://` URIs and/or setting the
-`SPARK_EXTRA_CLASSPATH` environment variable in your Dockerfiles. The `local://` scheme is also required when referring to

--- End diff --

@mccheah yes, putting the jars under `/opt/spark/jars` is one option, because that directory is on the classpath by default. I will then add `SPARK_EXTRA_CLASSPATH` back and let the jars placed there be picked up by the spark-submit invocation inside the container. Sounds good?
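To illustrate the option discussed above, here is a minimal sketch of a custom Dockerfile that pre-mounts dependency jars, assuming a generic Spark base image; the image tag, jar names, and `/opt/extra` path are hypothetical, not taken from the PR:

```dockerfile
# Hypothetical custom image; base image tag and jar names are illustrative.
FROM apache/spark:latest

# Jars copied into /opt/spark/jars are on the default classpath,
# so no extra configuration is needed for them.
COPY my-dep.jar /opt/spark/jars/

# Jars placed elsewhere can be exposed via SPARK_EXTRA_CLASSPATH,
# which the container entrypoint appends to the classpath.
COPY my-other-dep.jar /opt/extra/
ENV SPARK_EXTRA_CLASSPATH=/opt/extra/my-other-dep.jar
```

An application jar baked into the image this way would then be referenced with the `local://` scheme mentioned in the quoted docs, e.g. `local:///opt/spark/jars/my-dep.jar`, telling spark-submit the file already exists inside the container.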