Hi,

This is the log trace:
https://gist.github.com/uditmehta27/511eac0b76e6d61f8b47

On the yarn RM UI, I see:

Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher


The command I run is: bin/spark-shell --master yarn-client

The spark defaults I use are:
spark.yarn.jar hdfs://namenode1-dev.snc1:8020/spark/spark-assembly-1.3.0-hadoop2.4.0.jar
spark.yarn.access.namenodes hdfs://namenode1-dev.snc1:8032
spark.dynamicAllocation.enabled false
spark.scheduler.mode FAIR
spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
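
For reference, a quick sanity check that the class is actually inside the
assembly I put on hdfs (paths as in the config above) would be something like:

  hdfs dfs -get /spark/spark-assembly-1.3.0-hadoop2.4.0.jar /tmp/
  jar tf /tmp/spark-assembly-1.3.0-hadoop2.4.0.jar | grep ExecutorLauncher

If that grep comes back empty, the assembly itself is missing the yarn module.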

Is there anything wrong with what I am trying to do?

Thanks again!


On Fri, Apr 17, 2015 at 2:56 PM, Zhan Zhang <zzh...@hortonworks.com> wrote:

>  Hi Udit,
>
>  By the way, do you mind sharing the whole log trace?
>
>  Thanks.
>
>  Zhan Zhang
>
>  On Apr 17, 2015, at 2:26 PM, Udit Mehta <ume...@groupon.com> wrote:
>
>  I am just trying to launch a spark shell and not do anything fancy. I
> got the binary distribution from apache and put the spark assembly on hdfs.
> I then specified the spark.yarn.jar option in spark defaults to point to
> the assembly in hdfs. I still got the same error, so I thought I had to
> build it for hdp. I am using hdp 2.2 with hadoop 2.6.
>
> On Fri, Apr 17, 2015 at 2:21 PM, Udit Mehta <ume...@groupon.com> wrote:
>
>> Thanks. Would that distribution work for hdp 2.2?
>>
>> On Fri, Apr 17, 2015 at 2:19 PM, Zhan Zhang <zzh...@hortonworks.com>
>> wrote:
>>
>>>  You don’t need to put any yarn assembly in hdfs. The spark assembly
>>> jar will include everything. It looks like your package does not include
>>> the yarn module, although I didn’t find anything wrong with your mvn
>>> command. Can you check whether the ExecutorLauncher class is in your jar
>>> file or not?
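>>>
>>>  For example, something like this should tell (the exact assembly path
>>> and name depend on your build, so adjust accordingly):
>>>
>>>    jar tf assembly/target/scala-2.10/spark-assembly-1.3.0-hadoop2.6.0.jar | grep ExecutorLauncher
>>>
>>>  If it prints org/apache/spark/deploy/yarn/ExecutorLauncher.class, the
>>> yarn module made it in.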
>>>
>>>  BTW: For spark-1.3, you can use the binary distribution from apache.
>>>
>>>  Thanks.
>>>
>>>  Zhan Zhang
>>>
>>>
>>>
>>>  On Apr 17, 2015, at 2:01 PM, Udit Mehta <ume...@groupon.com> wrote:
>>>
>>>    I followed the steps described above and I still get this error:
>>>
>>>
>>> Error: Could not find or load main class 
>>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>>
>>>
>>>  I am trying to build spark 1.3 on hdp 2.2.
>>>  I built spark from source using:
>>> build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive
>>> -Phive-thriftserver -DskipTests package
>>>
>>>  Maybe I am not putting the correct yarn assembly on hdfs, or is it
>>> some other issue?
>>>
>>>  Thanks,
>>>  Udit
>>>
>>> On Mon, Mar 30, 2015 at 10:18 AM, Zhan Zhang <zzh...@hortonworks.com>
>>> wrote:
>>>
>>>> Hi Folks,
>>>>
>>>>  Just to summarize how to run Spark on the HDP distribution.
>>>>
>>>>  1. The spark version has to be 1.3.0 or above if you are using the
>>>> upstream distribution. This configuration is mainly for HDP rolling
>>>> upgrade purposes, and the patch only went into upstream spark as of 1.3.0.
>>>>
>>>>  2. In $SPARK_HOME/conf/spark-defaults.conf, add the following
>>>> settings.
>>>>     spark.driver.extraJavaOptions -Dhdp.version=xxxxx
>>>>     spark.yarn.am.extraJavaOptions -Dhdp.version=xxxxx
>>>>
>>>>  3. In $SPARK_HOME/java-opts, add the following option (see the note
>>>> below on finding the version string).
>>>>    -Dhdp.version=xxxxx
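>>>>
>>>>  Note: the concrete value for xxxxx is your cluster's HDP build string
>>>> (e.g. 2.2.0.0-2041). If I remember correctly, on an HDP node you can
>>>> find it with something like:
>>>>
>>>>    hdp-select status hadoop-client
>>>>    ls /usr/hdp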
>>>>
>>>>  Thanks.
>>>>
>>>>  Zhan Zhang
>>>>
>>>>
>>>>
>>>>  On Mar 30, 2015, at 6:56 AM, Doug Balog <doug.sparku...@dugos.com>
>>>> wrote:
>>>>
>>>> The “best” solution to spark-shell’s problem is to create a file
>>>> $SPARK_HOME/conf/java-opts
>>>> with “-Dhdp.version=2.2.0.0-2014”.
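>>>>
>>>> For example, something like (using the version string from my cluster;
>>>> substitute your own):
>>>>
>>>>   echo "-Dhdp.version=2.2.0.0-2014" > $SPARK_HOME/conf/java-opts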
>>>>
>>>> Cheers,
>>>>
>>>> Doug
>>>>
>>>> On Mar 28, 2015, at 1:25 PM, Michael Stone <mst...@mathom.us> wrote:
>>>>
>>>> I've also been having trouble running 1.3.0 on HDP. The
>>>> spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
>>>> configuration directive seems to work with pyspark, but does not
>>>> propagate when using spark-shell. (That is, everything works fine with
>>>> pyspark, and spark-shell fails with the "bad substitution" message.)
>>>>
>>>> Mike Stone
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>
>
