Hello,

I have already posted a message with the exact same problem, and proposed a 
patch (the subject is "Application failure in yarn-cluster mode").
Can you test it, and see if it works for you?
I would also be glad if someone could confirm that it is a bug in Spark 1.1.0.

Regards,
Christophe.

On 14/10/2014 03:15, Jimmy McErlain wrote:
BTW, this always worked for me until we upgraded the cluster to Spark 1.1.1...
J





JIMMY MCERLAIN

DATA SCIENTIST (NERD)

. . . . . . . . . . . . . . . . . .


IF WE CAN’T DOUBLE YOUR SALES,

ONE OF US IS IN THE WRONG BUSINESS.


E: ji...@sellpoints.com

M: 510.303.7751

On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA 
<aharipriy...@gmail.com> wrote:
Hello,

Can you check if the jar file is available in the target/scala-2.10 folder?

When you use sbt package to build the jar file, that is where it will be located.
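
As a rough sketch (the Scala version directory and jar name depend on your build.sbt, so treat them as placeholders):

# Build the application jar; sbt package writes it under target/scala-<version>/
sbt package

# Confirm the jar really exists before giving its full path to spark-submit
ls -l target/scala-2.10/*.jar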

The following command works well for me:


spark-submit --class "Classname" --master yarn-cluster jarfile (with complete path)

Can you try with just this first, and add the other options back later?
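
For instance, something like this (a sketch only; the class name and paths are placeholders, not values from this thread):

# Step 1: minimal submission, just the class, the master and the application jar
spark-submit --class "MyClass" --master yarn-cluster /full/path/to/my-app_2.10-1.0.jar

# Step 2: if that works, re-add the remaining options one at a time, e.g. --jars
spark-submit --class "MyClass" --master yarn-cluster \
  --jars /full/path/to/joda-convert-1.2.jar \
  /full/path/to/my-app_2.10-1.0.jar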

On Mon, Oct 13, 2014 at 7:36 PM, Jimmy 
<ji...@sellpoints.com> wrote:
Having the exact same error with the exact same jar.... Do you work for 
Altiscale? :)
J

Sent from my iPhone

On Oct 13, 2014, at 5:33 PM, Andy Srine 
<andy.sr...@gmail.com> wrote:


Hi Guys,


Spark rookie here. I am getting a file-not-found exception for the --jars file. This is in yarn-cluster mode, and I am running the following command on our recently upgraded Spark 1.1.1 environment.


./bin/spark-submit --verbose --master yarn --deploy-mode cluster --class 
myEngine --driver-memory 1g --driver-library-path 
/hadoop/share/hadoop/mapreduce/lib/hadoop-lzo-0.4.18-201406111750.jar 
--executor-memory 5g --executor-cores 5 --jars 
/home/andy/spark/lib/joda-convert-1.2.jar --queue default --num-executors 4 
/home/andy/spark/lib/my-spark-lib_1.0.jar


This is the error I am hitting. Any tips would be much appreciated. The file permissions look fine on my local disk.
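
(For what it's worth, the local permissions can be double-checked with something like the following; the paths are the ones from the command above.)

ls -l /home/andy/spark/lib/joda-convert-1.2.jar
ls -l /home/andy/spark/lib/my-spark-lib_1.0.jar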


14/10/13 22:49:39 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED
14/10/13 22:49:39 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
Exception in thread "Driver" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 1.0 failed 4 times, most recent failure: Lost task 3.3 in stage 1.0 (TID 12, 122-67.vb2.company.com): java.io.FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)
        java.io.FileOutputStream.open(Native Method)
        java.io.FileOutputStream.<init>(FileOutputStream.java:221)
        com.google.common.io.Files$FileByteSink.openStream(Files.java:223)
        com.google.common.io.Files$FileByteSink.openStream(Files.java:211)



Thanks,
Andy




--
Regards,
Haripriya Ayyalasomayajula




