You don’t need to put any YARN assembly in HDFS; the Spark assembly jar
includes everything. It looks like your package does not include the YARN
module, although I don’t see anything wrong with your mvn command. Can you
check whether the ExecutorLauncher class is in your jar file?
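
One quick way to check (a sketch; the assembly path assumes a default Spark 1.3 
build layout and may differ on your machine):

    jar tf assembly/target/scala-2.10/spark-assembly-1.3.0-hadoop2.6.0.jar | grep ExecutorLauncher

If the YARN module was built in, you should see 
org/apache/spark/deploy/yarn/ExecutorLauncher.class in the output.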

BTW: for Spark 1.3, you can use the binary distribution from Apache.

Thanks.

Zhan Zhang



On Apr 17, 2015, at 2:01 PM, Udit Mehta <ume...@groupon.com> wrote:

I followed the steps described above and I still get this error:

Error: Could not find or load main class 
org.apache.spark.deploy.yarn.ExecutorLauncher


I am trying to build Spark 1.3 on HDP 2.2.
I built Spark from source using:

    build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests package

Maybe I am not putting the correct YARN assembly on HDFS, or is it some other issue?
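
For reference, the way I understand staging the assembly on HDFS (a rough 
sketch; the paths below are placeholders) is to upload it and point 
spark.yarn.jar at it:

    hdfs dfs -put assembly/target/scala-2.10/spark-assembly-1.3.0-hadoop2.6.0.jar /user/spark/
    # then, in spark-defaults.conf:
    spark.yarn.jar hdfs:///user/spark/spark-assembly-1.3.0-hadoop2.6.0.jar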

Thanks,
Udit

On Mon, Mar 30, 2015 at 10:18 AM, Zhan Zhang <zzh...@hortonworks.com> wrote:
Hi Folks,

Just to summarize how to run Spark on the HDP distribution:

1. The Spark version has to be 1.3.0 or above if you are using the upstream 
distribution. This configuration is mainly for HDP rolling-upgrade purposes, 
and the patch only went into Spark upstream as of 1.3.0.

2. In $SPARK_HOME/conf/spark-defaults.conf, add the following settings:

    spark.driver.extraJavaOptions -Dhdp.version=xxxxx
    spark.yarn.am.extraJavaOptions -Dhdp.version=xxxxx

3. In $SPARK_HOME/conf/java-opts, add the following option (a filled-in example 
follows these steps):

    -Dhdp.version=xxxxx
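
For example, assuming an HDP 2.2 cluster where the version string is 
2.2.0.0-2041 (substitute your own cluster’s value, e.g. from hdp-select), the 
two files would contain:

    # $SPARK_HOME/conf/spark-defaults.conf
    spark.driver.extraJavaOptions  -Dhdp.version=2.2.0.0-2041
    spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041

    # $SPARK_HOME/conf/java-opts
    -Dhdp.version=2.2.0.0-2041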

Thanks.

Zhan Zhang



On Mar 30, 2015, at 6:56 AM, Doug Balog <doug.sparku...@dugos.com> wrote:

The “best” solution to spark-shell’s problem is to create a file 
$SPARK_HOME/conf/java-opts containing “-Dhdp.version=2.2.0.0-2014”.
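
As a one-liner (assuming that version string matches your cluster):

    echo "-Dhdp.version=2.2.0.0-2014" > $SPARK_HOME/conf/java-opts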

Cheers,

Doug

On Mar 28, 2015, at 1:25 PM, Michael Stone <mst...@mathom.us> wrote:

I've also been having trouble running 1.3.0 on HDP. The 
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
configuration directive seems to work with pyspark, but does not propagate when 
using spark-shell. (That is, everything works fine with pyspark, and 
spark-shell fails with the "bad substitution" message.)
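
One way I'd sanity-check this (a sketch, reusing the same version string; 
--driver-java-options and --conf are standard spark-shell flags) is to pass 
both settings directly on the command line and see whether the AM still fails:

    spark-shell --master yarn-client \
      --driver-java-options "-Dhdp.version=2.2.0.0-2041" \
      --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.2.0.0-2041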

Mike Stone
