Xiu Juan Xiang created SPARK-33340:
--------------------------------------

             Summary: Spark on Kubernetes fails with "Could not load KUBERNETES 
classes"
                 Key: SPARK-33340
                 URL: https://issues.apache.org/jira/browse/SPARK-33340
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 3.0.1
         Environment: Kubernetes 1.16

Spark (master branch code)
            Reporter: Xiu Juan Xiang


Hi, I am trying to run Spark on my Kubernetes cluster (it is not a minikube 
cluster). I followed this doc: 
[https://spark.apache.org/docs/latest/running-on-kubernetes.html] to create 
the Spark Docker image and then submit the application step by step. However, 
the submission failed, and the log of the Spark driver showed the error below:

```
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.30.140.13 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.deploy.PythonRunner file:/root/Work/spark/examples/src/main/python/wordcount.py
Exception in thread "main" org.apache.spark.SparkException: Could not load KUBERNETES classes. This copy of Spark may not have been compiled with KUBERNETES support.
	at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:942)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:265)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:877)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1013)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1022)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
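For reference, this error generally means the spark-kubernetes integration jar is missing from the driver's classpath. A minimal sketch of a check for it (the jar names and the temporary directory standing in for $SPARK_HOME/jars are illustrative assumptions, based on what a -Pkubernetes build of Spark 3.x normally produces):

```shell
#!/bin/sh
set -e
# Sketch: check whether a Spark jars directory contains the Kubernetes
# integration jar (spark-kubernetes_*.jar); its absence is what triggers
# "Could not load KUBERNETES classes". Jar names here are illustrative.
check_k8s_jars() {
  if ls "$1" | grep -q 'spark-kubernetes'; then
    echo "kubernetes support: present"
  else
    echo "kubernetes support: MISSING"
  fi
}

# Demo against a temporary directory standing in for $SPARK_HOME/jars
tmp=$(mktemp -d)
touch "$tmp/spark-core_2.12-3.0.1.jar"
check_k8s_jars "$tmp"                  # kubernetes support: MISSING
touch "$tmp/spark-kubernetes_2.12-3.0.1.jar"
check_k8s_jars "$tmp"                  # kubernetes support: present
rm -rf "$tmp"
```

The same check can be pointed at the jars directory inside the running container to see which classpath the driver actually got.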

I am not sure which step I am missing. I have been blocked on this for several 
days. Could you please help me with this? Thanks in advance!

 

By the way, below are the steps I did:
 # Prepare a Kubernetes cluster and check that I have appropriate permissions 
to list, create, edit, and delete pods.
About this, I am sure I have all the necessary permissions.
 # Build the distribution
```
./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive 
-Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
```
 # Build the Spark Docker image
```
./bin/docker-image-tool.sh spark -t latest build
```

 # Submit the application
```
./bin/spark-submit --master 
k8s://https://c7.us-south.containers.cloud.ibm.com:31937 --deploy-mode cluster 
--name spark-pi --class org.apache.spark.examples.SparkPi --conf 
spark.executor.instances=5 --conf 
spark.kubernetes.container.image=docker.io/bluebosh/spark:python3 
examples/src/main/python/wordcount.py
```
BTW, I am sure the master URL is correct, and my Docker image does contain 
Python.
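Since step 2 produces the distribution tarball that step 3 bakes into the image, one sanity check between them is to confirm the spark-kubernetes jar made it into the tarball at all. A sketch (the tiny fabricated tarball below merely stands in for the real make-distribution.sh output; jar names are illustrative):

```shell
#!/bin/sh
set -e
# Sketch: confirm a Spark distribution tarball contains the spark-kubernetes
# jar before baking it into a Docker image. The tarball built here is a tiny
# stand-in for the real make-distribution.sh output.
dist_has_k8s() {
  if tar -tzf "$1" | grep -q 'spark-kubernetes'; then
    echo present
  else
    echo missing
  fi
}

# Demo: build a stand-in tarball with the expected jar and check it
work=$(mktemp -d)
mkdir -p "$work/spark/jars"
touch "$work/spark/jars/spark-kubernetes_2.12-3.0.1.jar"
tar -czf "$work/dist.tgz" -C "$work" spark
dist_has_k8s "$work/dist.tgz"    # prints: present
rm -rf "$work"
```

If the real tarball reports "missing", the problem is in the build step rather than in the image or the submit command.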



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
