Hi,

Try running the following in the Spark folder:

bin/run-example SparkPi 10

If this runs fine, look at the set of arguments being passed by this
script, and try submitting your application in a similar way.
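If you want to see exactly what the script ends up launching, tracing it with
bash -x shows every command and argument it forwards. A minimal, self-contained
illustration using a stand-in wrapper script (bin/run-example itself can be
traced the same way):

```shell
# Create a tiny stand-in wrapper script purely for illustration;
# in practice you would run: bash -x bin/run-example SparkPi 10
WRAPPER=$(mktemp)
cat > "$WRAPPER" <<'EOF'
#!/bin/bash
exec echo "launching with args:" "$@"
EOF
# -x echoes each command the script executes (on stderr),
# exposing the arguments it forwards to the real launcher.
OUT=$(bash -x "$WRAPPER" SparkPi 10 2>/dev/null)
echo "$OUT"
```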

Thanks,


On Thu, Oct 16, 2014 at 2:59 PM, Christophe Préaud <
christophe.pre...@kelkoo.com> wrote:

>  Hi,
>
> I have created a JIRA (SPARK-3967
> <https://issues.apache.org/jira/browse/SPARK-3967>), can you please
> confirm that you are hit by the same issue?
>
> Thanks,
> Christophe.
>
>
> On 15/10/2014 09:49, Christophe Préaud wrote:
>
> Hi Jimmy,
> Did you try my patch?
> The problem on my side was that hadoop.tmp.dir (in the Hadoop
> core-site.xml) was not handled properly by Spark when it is set across
> multiple partitions/disks, i.e.:
>
> <property>
>   <name>hadoop.tmp.dir</name>
>   <value>
> file:/d1/yarn/local,file:/d2/yarn/local,file:/d3/yarn/local,file:/d4/yarn/local,file:/d5/yarn/local,file:/d6/yarn/local,file:/d7/yarn/local
> </value>
> </property>
>
> Hence, you won't be hit by this bug if your hadoop.tmp.dir is set on one
> partition only.
> If your hadoop.tmp.dir is also set on several partitions, I agree that it
> looks like a bug in Spark.
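A quick way to check how many directories your hadoop.tmp.dir spans is to
count the comma-separated entries in its value. The sketch below writes a
sample core-site.xml inline so it is self-contained; point CONF at your real
file instead (typically under /etc/hadoop/conf):

```shell
# Sample config written inline for illustration only;
# in practice set e.g. CONF=/etc/hadoop/conf/core-site.xml
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<property>
  <name>hadoop.tmp.dir</name>
  <value>file:/d1/yarn/local,file:/d2/yarn/local</value>
</property>
EOF
# Count the comma-separated file: entries in the hadoop.tmp.dir value.
N=$(grep -A 2 '<name>hadoop.tmp.dir</name>' "$CONF" | grep -o 'file:[^,<]*' | wc -l)
echo "$N"   # more than 1 means the multi-partition case described above applies
```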
>
> Christophe.
>
> On 14/10/2014 18:50, Jimmy McErlain wrote:
>
> So the only way that I could make this work was to build a fat jar file, as
> suggested earlier.  To me (and I am no expert) it seems like this is a
> bug.  Everything was working for me prior to our upgrade to Spark 1.1 on
> Hadoop 2.2, but now it does not...  i.e. packaging my jars locally, then
> pushing them out to the cluster and pointing to the corresponding
> dependent jars.
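For reference, the fat-jar route described above usually looks like the
following (a hedged sketch, not a definitive recipe: the class name and jar
path are illustrative, and it assumes the sbt-assembly plugin is configured
in the project):

```shell
# Build one assembly jar containing the application and its dependencies,
# so nothing has to be shipped separately via --jars.
sbt assembly

# Submit the single fat jar (names/paths are illustrative).
./bin/spark-submit --class myEngine --master yarn --deploy-mode cluster \
  target/scala-2.10/my-spark-lib-assembly-1.0.jar
```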
>
> Sorry I cannot be of more help!
> J
>
>
>
>
> JIMMY MCERLAIN
>
> DATA SCIENTIST (NERD)
>
> . . . . . . . . . . . . . . . . . .
>
> IF WE CAN’T DOUBLE YOUR SALES,
> ONE OF US IS IN THE WRONG BUSINESS.
>
> E: ji...@sellpoints.com
>
> M: 510.303.7751
>
> On Tue, Oct 14, 2014 at 4:59 AM, Christophe Préaud <
> christophe.pre...@kelkoo.com> wrote:
>
>>  Hello,
>>
>> I have already posted a message with the exact same problem, and proposed
>> a patch (the subject is "Application failure in yarn-cluster mode").
>> Can you test it, and see if it works for you?
>> I would also be glad if someone could confirm that it is a bug in Spark
>> 1.1.0.
>>
>> Regards,
>> Christophe.
>>
>>
>> On 14/10/2014 03:15, Jimmy McErlain wrote:
>>
>> BTW this always worked for me until we upgraded the cluster to Spark
>> 1.1.1...
>> J
>>
>>
>> On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA <
>> aharipriy...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> Can you check if the jar file is available in the target/scala-2.10
>>> folder?
>>>
>>> When you use sbt package to build the jar file, that is where it will
>>> be located.
>>>
>>> The following command works well for me:
>>>
>>> spark-submit --class "Classname" --master yarn-cluster jarfile (with
>>> complete path)
>>>
>>> Can you try with this first, and add the other options later?
>>>
>>> On Mon, Oct 13, 2014 at 7:36 PM, Jimmy <ji...@sellpoints.com> wrote:
>>>
>>>>  Having the exact same error with the exact same jar.... Do you work
>>>> for Altiscale? :)
>>>> J
>>>>
>>>> Sent from my iPhone
>>>>
>>>> On Oct 13, 2014, at 5:33 PM, Andy Srine <andy.sr...@gmail.com> wrote:
>>>>
>>>>   Hi Guys,
>>>>
>>>>
>>>> Spark rookie here. I am getting a file-not-found exception on the
>>>> --jars argument. This is in yarn-cluster mode, and I am running the
>>>> following command on our recently upgraded Spark 1.1.1 environment.
>>>>
>>>>
>>>>  ./bin/spark-submit --verbose --master yarn --deploy-mode cluster
>>>> --class myEngine --driver-memory 1g --driver-library-path
>>>> /hadoop/share/hadoop/mapreduce/lib/hadoop-lzo-0.4.18-201406111750.jar
>>>> --executor-memory 5g --executor-cores 5 --jars
>>>> /home/andy/spark/lib/joda-convert-1.2.jar --queue default --num-executors 4
>>>> /home/andy/spark/lib/my-spark-lib_1.0.jar
>>>>
>>>>
>>>> This is the error I am hitting; any tips would be much appreciated.
>>>> The file permissions look fine on my local disk.
>>>>
>>>>
>>>>  14/10/13 22:49:39 INFO yarn.ApplicationMaster: Unregistering
>>>> ApplicationMaster with FAILED
>>>>
>>>> 14/10/13 22:49:39 INFO impl.AMRMClientImpl: Waiting for application to
>>>> be successfully unregistered.
>>>>
>>>> Exception in thread "Driver" java.lang.reflect.InvocationTargetException
>>>>
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>
>>>> at
>>>> org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
>>>>
>>>> Caused by: org.apache.spark.SparkException: Job aborted due to stage
>>>> failure: Task 3 in stage 1.0 failed 4 times, most recent failure: Lost task
>>>> 3.3 in stage 1.0 (TID 12, 122-67.vb2.company.com):
>>>> java.io.FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)
>>>>
>>>>         java.io.FileOutputStream.open(Native Method)
>>>>
>>>>         java.io.FileOutputStream.<init>(FileOutputStream.java:221)
>>>>
>>>>
>>>> com.google.common.io.Files$FileByteSink.openStream(Files.java:223)
>>>>
>>>>
>>>> com.google.common.io.Files$FileByteSink.openStream(Files.java:211)
>>>>
>>>>
>>>> Thanks,
>>>> Andy
>>>>
>>>>
>>>
>>>
>>>   --
>>> Regards,
>>> Haripriya Ayyalasomayajula
>>>
>>>
>>
>>
>> ------------------------------
>> Kelkoo SAS
>> Société par Actions Simplifiée
>> Share capital: €4,168,964.30
>> Registered office: 8, rue du Sentier 75002 Paris
>> 425 093 069 RCS Paris
>>
>> This message and its attachments are confidential and intended solely
>> for their addressees. If you are not the intended recipient, please
>> delete this message and notify the sender.
>>
>
>
>
>
>
>
>
