Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20192#discussion_r160874070
  
    --- Diff: resource-managers/kubernetes/docker/src/main/dockerfiles/executor/Dockerfile ---
    @@ -1,35 +0,0 @@
    -#
    -# Licensed to the Apache Software Foundation (ASF) under one or more
    -# contributor license agreements.  See the NOTICE file distributed with
    -# this work for additional information regarding copyright ownership.
    -# The ASF licenses this file to You under the Apache License, Version 2.0
    -# (the "License"); you may not use this file except in compliance with
    -# the License.  You may obtain a copy of the License at
    -#
    -#    http://www.apache.org/licenses/LICENSE-2.0
    -#
    -# Unless required by applicable law or agreed to in writing, software
    -# distributed under the License is distributed on an "AS IS" BASIS,
    -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    -# See the License for the specific language governing permissions and
    -# limitations under the License.
    -#
    -
    -FROM spark-base
    -
    -# Before building the docker image, first build and make a Spark distribution following
    -# the instructions in http://spark.apache.org/docs/latest/building-spark.html.
    -# If this docker file is being used in the context of building your images from a Spark
    -# distribution, the docker build command should be invoked from the top level directory
    -# of the Spark distribution. E.g.:
    -# docker build -t spark-executor:latest -f kubernetes/dockerfiles/executor/Dockerfile .
    -
    -COPY examples /opt/spark/examples
    -
    -CMD SPARK_CLASSPATH="${SPARK_HOME}/jars/*" && \
    -    env | grep SPARK_JAVA_OPT_ | sed 's/[^=]*=\(.*\)/\1/g' > /tmp/java_opts.txt && \
    -    readarray -t SPARK_EXECUTOR_JAVA_OPTS < /tmp/java_opts.txt && \
    -    if ! [ -z ${SPARK_MOUNTED_CLASSPATH}+x} ]; then SPARK_CLASSPATH="$SPARK_MOUNTED_CLASSPATH:$SPARK_CLASSPATH"; fi && \
    -    if ! [ -z ${SPARK_EXECUTOR_EXTRA_CLASSPATH+x} ]; then SPARK_CLASSPATH="$SPARK_EXECUTOR_EXTRA_CLASSPATH:$SPARK_CLASSPATH"; fi && \
    --- End diff ---
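    As a side note on the deleted CMD above: the guards rely on the shell
    `${VAR+x}` "is-set" test (note that the first guard, as quoted, appears
    to carry a stray `}` in `${SPARK_MOUNTED_CLASSPATH}+x}`, which defeats
    that idiom). A minimal sketch of how the idiom behaves, using a
    hypothetical variable `MAYBE`:

    ```shell
    # ${MAYBE+x} expands to "x" if MAYBE is set (even to the empty string),
    # and to nothing if MAYBE is unset, so the -z test distinguishes the two.
    unset MAYBE
    if ! [ -z ${MAYBE+x} ]; then echo "MAYBE is set"; else echo "MAYBE is unset"; fi
    # -> MAYBE is unset

    MAYBE=""   # empty but set
    if ! [ -z ${MAYBE+x} ]; then echo "MAYBE is set"; fi
    # -> MAYBE is set
    ```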
    
    To clarify, by that I mean we no longer have the ability to customize different classpaths for the executor and the driver.
    For reference, see spark.driver.extraClassPath vs spark.executor.extraClassPath.
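
    As a sketch of the per-role distinction those properties allow (paths,
    class name, and jar are illustrative, not from this PR):

    ```shell
    # Hypothetical spark-submit invocation: driver and executor classpaths
    # are configured independently via the two extraClassPath properties.
    spark-submit \
      --conf spark.driver.extraClassPath=/opt/libs/driver/* \
      --conf spark.executor.extraClassPath=/opt/libs/executor/* \
      --class org.example.Main app.jar
    ```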


---
