My previous example was a bad paste (I tried a lot of variants, sorry for the
wrong paste).
PS C:\WINDOWS\system32> spark-submit --master k8s://https://ip:8443 `
  --deploy-mode cluster --name spark-pi `
  --class org.apache.spark.examples.SparkPi `
  --conf spark.executor.instances=1 --executor-memory 1G `
  --conf spark.kubernetes.container.image=andrusha/spark-k8s:2.3.0-hadoop2.7 `
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
This returns:
Image:
  andrusha/spark-k8s:2.3.0-hadoop2.7
Environment variables:
  SPARK_DRIVER_MEMORY: 1g
  SPARK_DRIVER_CLASS: org.apache.spark.examples.SparkPi
  SPARK_DRIVER_ARGS:
  SPARK_DRIVER_BIND_ADDRESS:
  SPARK_MOUNTED_CLASSPATH: /opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
  SPARK_JAVA_OPT_0: -Dspark.kubernetes.driver.pod.name=spark-pi-46f48a0974d43341886076bc3c5f31c4-driver
  SPARK_JAVA_OPT_1: -Dspark.kubernetes.executor.podNamePrefix=spark-pi-46f48a0974d43341886076bc3c5f31c4
  SPARK_JAVA_OPT_2: -Dspark.app.name=spark-pi
  SPARK_JAVA_OPT_3: -Dspark.driver.host=spark-pi-46f48a0974d43341886076bc3c5f31c4-driver-svc.default.svc
  SPARK_JAVA_OPT_4: -Dspark.submit.deployMode=cluster
  SPARK_JAVA_OPT_5: -Dspark.driver.blockManager.port=7079
  SPARK_JAVA_OPT_6: -Dspark.master=k8s://https://ip:8443
  SPARK_JAVA_OPT_7: -Dspark.jars=/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar,/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
  SPARK_JAVA_OPT_8: -Dspark.kubernetes.container.image=andrusha/spark-k8s:2.3.0-hadoop2.7
  SPARK_JAVA_OPT_9: -Dspark.executor.instances=1
  SPARK_JAVA_OPT_10: -Dspark.app.id=spark-16eb67d8953e418aba96c2d12deecd11
  SPARK_JAVA_OPT_11: -Dspark.executor.memory=1G
  SPARK_JAVA_OPT_12: -Dspark.driver.port=7078

-Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS)
+ exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -Dspark.app.id=spark-16eb67d8953e418aba96c2d12deecd11
-Dspark.executor.memory=1G -Dspark.driver.port=7078
-Dspark.driver.blockManager.port=7079 -Dspark.submit.deployMode=cluster
-Dspark.jars=/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar,/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
-Dspark.master=k8s://https://172.20.10.12:8443
-Dspark.kubernetes.executor.podNamePrefix=spark-pi-46f48a0974d43341886076bc3c5f31c4
-Dspark.kubernetes.driver.pod.name=spark-pi-46f48a0974d43341886076bc3c5f31c4-driver
-Dspark.driver.host=spark-pi-46f48a0974d43341886076bc3c5f31c4-driver-svc.default.svc
-Dspark.app.name=spark-pi -Dspark.executor.instances=1
-Dspark.kubernetes.container.image=andrusha/spark-k8s:2.3.0-hadoop2.7 -cp
':/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar'
-Xms1g -Xmx1g -Dspark.driver.bindAddress=172.17.0.2
org.apache.spark.examples.SparkPi
Error: Could not find or load main class org.apache.spark.examples.SparkPi
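One thing I notice in the exec line above: the `-cp` value mixes `:` and `;` separators. If I understand correctly, a Linux JVM splits the classpath on `:` only, so the two `;`-joined jar paths end up as a single entry that does not exist on disk. A minimal sketch of that tokenization (the string is copied from the log; this is my reading, not a confirmed diagnosis):

```python
# How a Linux JVM would tokenize the -cp string from the exec line above:
# classpath entries are split on ':' only; ';' is treated as part of a path.
cp = ":/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar"
entries = cp.split(":")
for e in entries:
    print(repr(e))
# The examples jar lands inside one entry that still contains ';',
# so no file by that name exists and the class is never found.
```

If that is indeed the cause, the `;` presumably comes from submitting on Windows, where the platform path separator is `;` rather than `:`.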

I found this Stack Overflow question
https://stackoverflow.com/questions/49331570/spark-2-3-minikube-kubernetes-windows-demo-sparkpi-not-found
but it has no answer.
I also checked the container file system; it does contain
/opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
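In case it helps anyone reproduce the check: a small helper (hypothetical, not part of Spark) that verifies a class is actually inside a jar, e.g. after copying the jar out of the container with `kubectl cp`:

```python
import zipfile

def jar_contains_class(jar_path, class_name):
    """Return True if the dotted class name is present in the jar.

    A jar is just a zip archive, so we look for the corresponding
    .class member, e.g. org/apache/spark/examples/SparkPi.class.
    """
    member = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return member in jar.namelist()

# Example (jar path assumed copied locally beforehand):
# jar_contains_class("spark-examples_2.11-2.3.0.jar",
#                    "org.apache.spark.examples.SparkPi")
```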



2018-04-11 1:17 GMT+08:00 Yinan Li <liyinan...@gmail.com>:

> The example jar path should be
> local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar.
>
> On Tue, Apr 10, 2018 at 1:34 AM, Dmitry <frostb...@gmail.com> wrote:
>
>> Hello, I spent a lot of time trying to find what I did wrong, but found
>> nothing. I have a minikube Windows-based cluster (Hyper-V as hypervisor) and
>> am trying to run the examples against Spark 2.3. I tried several Docker
>> image builds:
>> * several builds that I built myself
>> * andrusha/spark-k8s:2.3.0-hadoop2.7 from Docker Hub
>> But when I submit a job, the driver log returns a class-not-found exception
>> for org.apache.spark.examples.SparkPi:
>>
>> spark-submit --master k8s://https://ip:8443 --deploy-mode cluster \
>>   --name spark-pi --class org.apache.spark.examples.SparkPi \
>>   --conf spark.executor.instances=1 --executor-memory 1G \
>>   --conf spark.kubernetes.container.image=andrusha/spark-k8s:2.3.0-hadoop2.7 \
>>   local:///opt/spark/examples/spark-examples_2.11-2.3.0.jar
>>
>> I tried the https://github.com/apache-spark-on-k8s/spark fork and it works
>> without problems; more complex examples work as well.
>>
>
>
