I am using a standalone deployment, with Spark 1.4.1.
When I submit the job, I get no error at the submission terminal. Then when I
check the web UI, I can find the driver section, which has my driver
submission with this error: java.io.FileNotFoundException ..., which points to
the full path of my jar as ...
As I remember, you don't need to upload the application jar manually; Spark
will do it for you when you use spark-submit. Would you mind posting your
spark-submit command?
On Wed, Sep 30, 2015 at 3:13 PM, Christophe Schmitz
wrote:
> Hi there,
>
> I am trying to use the
Hi there,
I am trying to use the "--deploy-mode cluster" option to submit my job
(Spark 1.4.1). When I do that, the spark driver (on the cluster) looks
for my application jar. I can manually copy my application jar to all the
workers, but I was wondering if there is a way to submit the
Hi Saisai
I am using this command:
spark-submit --deploy-mode cluster --properties-file file.conf --class
myclass test-assembly-1.0.jar
The application starts only if I manually copy test-assembly-1.0.jar to all
the workers (or the master, I don't remember) and provide the full path of
the file.
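For what it's worth, a hedged sketch of what seems to be required in standalone cluster mode: the driver is launched on one of the workers, so the jar argument must resolve on that machine, not just on the submitting host. The host names and paths below are placeholders, not from this thread:

```shell
# In --deploy-mode cluster on a standalone master, the driver process runs
# on a worker, so the application jar must be reachable from that worker.

# Option 1: a filesystem path that exists on every node (placeholder path):
spark-submit --master spark://master-host:7077 \
  --deploy-mode cluster \
  --properties-file file.conf \
  --class myclass \
  /shared/path/test-assembly-1.0.jar

# Option 2: a globally visible URL such as HDFS (placeholder URL):
spark-submit --master spark://master-host:7077 \
  --deploy-mode cluster \
  --properties-file file.conf \
  --class myclass \
  hdfs://namenode:8020/jars/test-assembly-1.0.jar
```

Either way, the point is that a bare local filename on the submitting machine is not visible to the worker that hosts the driver, which would explain the FileNotFoundException.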
Are you running in standalone deploy mode? What Spark version are you
running?
Can you explain a little more specifically what exception occurs and how you
provide the jar to Spark?
I tried in my local machine with command:
./bin/spark-submit --verbose --master spark://hw12100.local:7077