Re: Submitting with --deploy-mode cluster: uploading the jar

2015-10-01 Thread Christophe Schmitz
I am using standalone deployment, with Spark 1.4.1.

When I submit the job, I get no error in the submission terminal. When I then
check the web UI, I can find the driver section, which shows my driver
submission with this error: java.io.FileNotFoundException ... which points to
the full path of my jar as it is on my computer (where I run spark-submit).
If I now create that full path on the worker(s) with my jar inside, the
application gets processed.
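
[Editor's note: the manual workaround described above can be scripted. This is only a sketch, assuming passwordless SSH access to the workers; the hostnames and jar path below are placeholders, not values from this thread.]

```shell
#!/bin/sh
# In standalone cluster mode the driver is launched on a worker and resolves
# the local jar path it was given at submit time, so the jar must exist at
# the same absolute path on every worker.
JAR=/home/me/project/target/test-assembly-1.0.jar   # placeholder path
for host in worker1 worker2 worker3; do             # placeholder hostnames
  ssh "$host" mkdir -p "$(dirname "$JAR")"          # recreate the directory
  scp "$JAR" "$host:$JAR"                           # copy the jar into place
done
```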


Thanks!


On Thu, Oct 1, 2015 at 10:51 AM, Saisai Shao wrote:

> Are you running in standalone deploy mode, and what Spark version are you
> running?
>
> Can you explain a little more specifically what exception occurs and how
> you provide the jar to Spark?
>
> I tried in my local machine with command:
>
> ./bin/spark-submit --verbose --master spark://hw12100.local:7077
> --deploy-mode cluster --class org.apache.spark.examples.SparkPi
> examples/target/scala-2.10/spark-examples-1.5.1-hadoop2.7.1.jar
>
> It seems Spark uploads this example jar automatically, so there is no need
> to handle it manually.
>
> Thanks
> Saisai
>
>
>
> On Thu, Oct 1, 2015 at 8:36 AM, Christophe Schmitz wrote:
>
>> Hi Saisai
>>
>> I am using this command:
>> spark-submit --deploy-mode cluster --properties-file file.conf --class
>> myclass test-assembly-1.0.jar
>>
>> The application starts only if I manually copy test-assembly-1.0.jar to
>> all the workers (or the master, I don't remember) and provide the full
>> path of the file.
>>
>> On the other hand, with --deploy-mode client I don't need to do that, but
>> then I need to accept incoming connections in my client to serve the jar
>> (which is not possible behind a firewall I don't control).
>>
>> Thanks,
>>
>> Christophe
>>
>>
>> On Wed, Sep 30, 2015 at 5:19 PM, Saisai Shao wrote:
>>
>>> As I remember, you don't need to upload the application jar manually;
>>> Spark will do it for you when you use spark-submit. Would you mind
>>> posting your spark-submit command?
>>>
>>>
>>> On Wed, Sep 30, 2015 at 3:13 PM, Christophe Schmitz wrote:
>>>
>>>> Hi there,
>>>>
>>>> I am trying to use the "--deploy-mode cluster" option to submit my job
>>>> (Spark 1.4.1). When I do that, the spark-driver (on the cluster) looks
>>>> for my application jar. I can manually copy my application jar to all
>>>> the workers, but I was wondering if there is a way to ship the
>>>> application jar when running spark-submit.
>>>>
>>>> Thanks!

>>>
>>>
>>
>
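
[Editor's note: the Spark documentation for standalone cluster mode states that the application jar must be globally visible inside the cluster, for instance via an hdfs:// path or a file:// path present on all nodes. A sketch of the hdfs:// approach, reusing the command from this thread; the master URL and HDFS path are placeholders:]

```shell
#!/bin/sh
# Stage the jar somewhere every worker can read it, then submit with that
# URL instead of a local filesystem path, so no per-worker copy is needed.
hdfs dfs -mkdir -p /apps/spark                        # placeholder HDFS dir
hdfs dfs -put -f test-assembly-1.0.jar /apps/spark/test-assembly-1.0.jar

spark-submit --deploy-mode cluster \
  --master spark://master-host:7077 \
  --properties-file file.conf \
  --class myclass \
  hdfs:///apps/spark/test-assembly-1.0.jar
```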

