Tobias Munk created SPARK-24599:
-----------------------------------

             Summary: SPARK_MOUNTED_CLASSPATH contains incorrect semicolon on Windows
                 Key: SPARK-24599
                 URL: https://issues.apache.org/jira/browse/SPARK-24599
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes, Windows
    Affects Versions: 2.3.1, 2.3.0
            Reporter: Tobias Munk


When running spark-submit in cluster mode on Kubernetes from a Windows machine, 
the environment variable {{SPARK_MOUNTED_CLASSPATH}} incorrectly contains a 
semicolon:

 
{code:java}
$ echo $SPARK_MOUNTED_CLASSPATH

/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar
{code}
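Presumably the ';' is produced on the Windows side, where ';' is the classpath separator, while the Linux JVM inside the container splits {{-cp}} on ':' only. The semicolon-joined pair is therefore treated as one nonexistent file name, so the class cannot be loaded. A minimal sketch of the mismatch (paths shortened for illustration):

```shell
# The Linux JVM recognizes only ':' as a classpath separator; a ';' is just
# an ordinary character inside a path element.
CP='/opt/a.jar;/opt/a.jar'

# Splitting on ':' yields a single element that still contains the ';':
IFS=':' read -r first _rest <<< "$CP"
echo "$first"   # /opt/a.jar;/opt/a.jar -- no such file, so class loading fails
```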
 

When running spark-submit, the driver aborts:

{code}
./bin/spark-submit.cmd --master k8s://https://localhost:6445 --deploy-mode cluster --name spark-pi --class org.apache.spark.examples.SparkPi --conf spark.executor.instances=1 --conf spark.kubernetes.container.image=spark:k8s-spark1 local:///opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar
{code}
{code:java}
kubectl logs  spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver
++ id -u
+ myuid=0
++ id -g
+ mygid=0
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/ash
+ '[' -z root:x:0:0:root:/root:/bin/ash ']'
+ SPARK_K8S_CMD=driver
+ '[' -z driver ']'
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sed 's/[^=]*=\(.*\)/\1/g'
+ sort -t_ -k4 -n
+ readarray -t SPARK_JAVA_OPTS
+ '[' -n '/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar' ']'
+ SPARK_CLASSPATH=':/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar'
+ '[' -n '' ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=(${JAVA_HOME}/bin/java "${SPARK_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" 
-Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY 
-Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS 
$SPARK_DRIVER_ARGS)
+ exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -Dspark.kubernetes.driver.pod.name=spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver -Dspark.jars=/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar,/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar -Dspark.app.name=spark-pi -Dspark.submit.deployMode=cluster -Dspark.driver.blockManager.port=7079 -Dspark.kubernetes.executor.podNamePrefix=spark-pi-b12d0501f2fc309d89e8634937b7f52c -Dspark.executor.instances=1 -Dspark.app.id=spark-65f2c8cc3ccf462694a67c18e947158c -Dspark.driver.port=7078 -Dspark.master=k8s://https://localhost:6445 -Dspark.kubernetes.container.image=spark:k8s-spark1 -Dspark.driver.host=spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver-svc.default.svc -cp ':/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar' -Xms1g -Xmx1g -Dspark.driver.bindAddress=10.1.0.150 org.apache.spark.examples.SparkPi
Error: Could not find or load main class org.apache.spark.examples.SparkPi
{code}
 

As a workaround, you can override {{SPARK_MOUNTED_CLASSPATH}} in 
{{$SPARK_HOME/kubernetes/dockerfiles/spark/entrypoint.sh}}, removing the part 
with the semicolon, and then rebuild the Docker image with 
{{$SPARK_HOME/bin/docker-image-tool.sh}}. After that, spark-submit succeeds.
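Instead of dropping the entry, a more general tweak would be to normalize the Windows-style ';' to ':' before the value is appended to {{SPARK_CLASSPATH}}. This is only a sketch; the exact placement inside entrypoint.sh and the sample values below are assumptions:

```shell
# Sample values for illustration only:
SPARK_CLASSPATH=':/opt/spark/jars/*'
SPARK_MOUNTED_CLASSPATH='/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar'

# Normalize any Windows-style ';' separator to the ':' the Linux JVM expects,
# then append as entrypoint.sh already does:
SPARK_MOUNTED_CLASSPATH="$(printf '%s' "$SPARK_MOUNTED_CLASSPATH" | tr ';' ':')"
if [ -n "$SPARK_MOUNTED_CLASSPATH" ]; then
  SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_MOUNTED_CLASSPATH"
fi
echo "$SPARK_CLASSPATH"
```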



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
