On Tue, May 22, 2018 at 12:45 AM, Makoto Hashimoto wrote:
> local:///usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar
Is that the path of the jar inside your docker image? The default
image puts that in /opt/spark IIRC.
--
Marcelo
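
For reference, a Kubernetes submission along the lines Marcelo suggests might look like the sketch below. The API server URL, namespace, image name, and executor count are placeholders (not from the thread); the /opt/spark jar location is Marcelo's recollection of the default image layout, so verify it inside your actual image before relying on it.

```shell
# Sketch of a spark-submit invocation against Kubernetes.
# local:// means "path inside the driver/executor container image",
# NOT a path on the submitting machine.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --deploy-mode cluster \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar 100
```

You can confirm the jar's path inside the image with something like `docker run --rm <your-spark-image> ls /opt/spark/examples/jars`.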
Hi,

I am trying to run a Spark job on Kubernetes. Running the job locally works
fine, as follows:
$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master local[4] examples/jars/spark-examples_2.11-2.3.0.jar 100
..
2018-05-20 21:49:02 INFO DAGScheduler:54 - Job 0 finished: reduce