BTW, this always worked for me until we upgraded the cluster to
Spark 1.1.1...
J

JIMMY MCERLAIN

DATA SCIENTIST (NERD)

IF WE CAN’T DOUBLE YOUR SALES,
ONE OF US IS IN THE WRONG BUSINESS.

E: ji...@sellpoints.com

M: 510.303.7751

On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA <
aharipriy...@gmail.com> wrote:

> Hello,
>
> Can you check if the jar file is available in the target/scala-2.10
> folder?
>
> When you build the jar with sbt package, that is where it is placed.
>
> The following command works well for me:
>
> spark-submit --class "Classname" --master yarn-cluster
> jarfile (with complete path)
>
> Can you try this first, and add the other options later?
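>
> For example, a minimal sketch with a hypothetical main class and an
> sbt-built jar (the class name and jar name below are placeholders, not
> from this thread):
>
> spark-submit --class "com.example.MyApp" --master yarn-cluster \
>   target/scala-2.10/my-app_2.10-1.0.jar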
>
> On Mon, Oct 13, 2014 at 7:36 PM, Jimmy <ji...@sellpoints.com> wrote:
>
>> Having the exact same error with the exact same jar... Do you work for
>> Altiscale? :)
>> J
>>
>> Sent from my iPhone
>>
>> On Oct 13, 2014, at 5:33 PM, Andy Srine <andy.sr...@gmail.com> wrote:
>>
>> Hi Guys,
>>
>>
>> Spark rookie here. I am getting a file-not-found exception for a jar
>> passed via --jars. This is in yarn-cluster mode, and I am running the
>> following command on our recently upgraded Spark 1.1.1 environment.
>>
>>
>> ./bin/spark-submit --verbose --master yarn --deploy-mode cluster --class
>> myEngine --driver-memory 1g --driver-library-path
>> /hadoop/share/hadoop/mapreduce/lib/hadoop-lzo-0.4.18-201406111750.jar
>> --executor-memory 5g --executor-cores 5 --jars
>> /home/andy/spark/lib/joda-convert-1.2.jar --queue default --num-executors 4
>> /home/andy/spark/lib/my-spark-lib_1.0.jar
>>
>>
>> This is the error I am hitting. Any tips would be much appreciated. The
>> file permissions look fine on my local disk.
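>>
>> For instance, one quick local check (paths taken from the command
>> above; this only verifies the client side, not the YARN containers):
>>
>> ls -l /home/andy/spark/lib/joda-convert-1.2.jar \
>>       /home/andy/spark/lib/my-spark-lib_1.0.jar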
>>
>>
>> 14/10/13 22:49:39 INFO yarn.ApplicationMaster: Unregistering
>> ApplicationMaster with FAILED
>>
>> 14/10/13 22:49:39 INFO impl.AMRMClientImpl: Waiting for application to be
>> successfully unregistered.
>>
>> Exception in thread "Driver" java.lang.reflect.InvocationTargetException
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>> at java.lang.reflect.Method.invoke(Method.java:606)
>>
>> at
>> org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
>>
>> Caused by: org.apache.spark.SparkException: Job aborted due to stage
>> failure: Task 3 in stage 1.0 failed 4 times, most recent failure: Lost task
>> 3.3 in stage 1.0 (TID 12, 122-67.vb2.company.com):
>> java.io.FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)
>>
>>         java.io.FileOutputStream.open(Native Method)
>>
>>         java.io.FileOutputStream.<init>(FileOutputStream.java:221)
>>
>>         com.google.common.io.Files$FileByteSink.openStream(Files.java:223)
>>
>>         com.google.common.io.Files$FileByteSink.openStream(Files.java:211)
>>
>>
>> Thanks,
>> Andy
>>
>>
>
>
> --
> Regards,
> Haripriya Ayyalasomayajula
>
>
