Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22608#discussion_r230228691
  
    --- Diff: resource-managers/kubernetes/docker/src/test/scripts/run-kerberos-test.sh ---
    @@ -0,0 +1,40 @@
    +#!/usr/bin/env bash
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +sed -i -e 's/#//' -e 's/default_ccache_name/# default_ccache_name/' /etc/krb5.conf
    +export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true"
    +export HADOOP_JAAS_DEBUG=true
    +export HADOOP_ROOT_LOGGER=DEBUG,console
    +cp ${TMP_KRB_LOC} /etc/krb5.conf
    +cp ${TMP_CORE_LOC} /opt/spark/hconf/core-site.xml
    +cp ${TMP_HDFS_LOC} /opt/spark/hconf/hdfs-site.xml
    +mkdir -p /etc/krb5.conf.d
    +/opt/spark/bin/spark-submit \
    +      --deploy-mode cluster \
    +      --class ${CLASS_NAME} \
    +      --master k8s://${MASTER_URL} \
    +      --conf spark.kubernetes.namespace=${NAMESPACE} \
    +      --conf spark.executor.instances=1 \
    +      --conf spark.app.name=spark-hdfs \
    +      --conf spark.driver.extraClassPath=/opt/spark/hconf/core-site.xml:/opt/spark/hconf/hdfs-site.xml:/opt/spark/hconf/yarn-site.xml:/etc/krb5.conf \
    --- End diff --
    
    Adding individual files to the classpath does not do anything; the JVM only scans directories and jars for resources.
    
    ```
    $ scala -cp /etc/krb5.conf
    scala> getClass().getResource("/krb5.conf")
    res0: java.net.URL = null
    
    $ scala -cp /etc
    scala> getClass().getResource("/krb5.conf")
    res0: java.net.URL = file:/etc/krb5.conf
    ```
    
    So this seems unnecessary. I'd also expect spark-submit or the k8s backend code to add the Hadoop conf to the driver's classpath anyway.
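    The same behavior can be reproduced programmatically without touching `/etc`. A minimal sketch (the `ClasspathDemo` class name and the temp-dir layout are illustrative, not part of the PR): a `URLClassLoader` whose only entry is a plain file finds nothing, while one pointing at the file's parent directory resolves the resource.

    ```java
    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.net.URL;
    import java.net.URLClassLoader;

    public class ClasspathDemo {
        // Resolve `name` through a classloader whose single classpath
        // entry is `entry` (either a plain file or a directory).
        static URL lookup(File entry, String name) throws IOException {
            try (URLClassLoader cl =
                     new URLClassLoader(new URL[]{ entry.toURI().toURL() }, null)) {
                return cl.getResource(name);
            }
        }

        public static void main(String[] args) throws IOException {
            // Create a temp directory containing a dummy resource file.
            File dir = new File(System.getProperty("java.io.tmpdir"), "cpdemo");
            dir.mkdirs();
            File conf = new File(dir, "krb5.conf");
            try (FileWriter w = new FileWriter(conf)) { w.write("[libdefaults]\n"); }

            // A plain file as a classpath entry is ignored for resource lookup...
            System.out.println("file entry -> " + lookup(conf, "krb5.conf"));
            // ...while its parent directory is scanned and the lookup succeeds.
            System.out.println("dir entry  -> " + lookup(dir, "krb5.conf"));
        }
    }
    ```

    This mirrors the interactive `scala -cp` experiment above: only directory (or jar) entries participate in resource resolution.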

