You probably want to first try the basic configuration to see whether it works,
instead of setting SPARK_JAR to point to the HDFS location. This error is
caused by ExecutorLauncher not being found on the classpath, and is not HDP
specific, I think.
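For example, reverting to the basic configuration would look something like this (the property name spark.yarn.jar and the path shown are illustrative of a typical setup, adjust to yours):

    # in conf/spark-defaults.conf, comment out the assembly override:
    # spark.yarn.jar hdfs:///apps/spark/spark-assembly-1.3.0-hadoop2.6.0.jar

    # make sure the deprecated SPARK_JAR environment variable is unset too,
    # then let Spark upload the local assembly itself:
    unset SPARK_JAR
    bin/spark-shell --master yarn-client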

Thanks.

Zhan Zhang

On Apr 17, 2015, at 2:26 PM, Udit Mehta <ume...@groupon.com> wrote:

I am just trying to launch a spark-shell and not do anything fancy. I got the
binary distribution from Apache and put the spark assembly on HDFS. I then
specified the spark.yarn.jar option in spark-defaults to point to the assembly
in HDFS. I still got the same error, so I thought I had to build it for HDP. I
am using HDP 2.2 with Hadoop 2.6.

On Fri, Apr 17, 2015 at 2:21 PM, Udit Mehta <ume...@groupon.com> wrote:
Thanks. Would that distribution work for HDP 2.2?

On Fri, Apr 17, 2015 at 2:19 PM, Zhan Zhang <zzh...@hortonworks.com> wrote:
You don’t need to put any yarn assembly in HDFS. The spark assembly jar will
include everything. It looks like your package does not include the yarn
module, although I didn’t find anything wrong in your mvn command. Can you
check whether the ExecutorLauncher class is in your jar file or not?
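One quick way to check (the jar path is a guess based on your mvn flags; adjust it to whatever is actually under assembly/target):

    jar tf assembly/target/scala-2.10/spark-assembly-1.3.0-hadoop2.6.0.jar | grep ExecutorLauncher

If the yarn module was built in, this should print org/apache/spark/deploy/yarn/ExecutorLauncher.class.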

BTW: for Spark 1.3, you can use the binary distribution from Apache.

Thanks.

Zhan Zhang



On Apr 17, 2015, at 2:01 PM, Udit Mehta <ume...@groupon.com> wrote:

I followed the steps described above and I still get this error:

Error: Could not find or load main class 
org.apache.spark.deploy.yarn.ExecutorLauncher


I am trying to build Spark 1.3 on HDP 2.2.
I built Spark from source using:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver 
-DskipTests package

Maybe I am not putting the correct yarn assembly on HDFS, or is it some other issue?

Thanks,
Udit

On Mon, Mar 30, 2015 at 10:18 AM, Zhan Zhang <zzh...@hortonworks.com> wrote:
Hi Folks,

Just to summarize how to run Spark on the HDP distribution.

1. The Spark version has to be 1.3.0 or above if you are using the upstream
distribution. This configuration is mainly for HDP rolling-upgrade purposes,
and the patch only went into upstream Spark starting with 1.3.0.

2. In $SPARK_HOME/conf/spark-defaults.conf, add the following settings:
    spark.driver.extraJavaOptions -Dhdp.version=xxxxx
    spark.yarn.am.extraJavaOptions -Dhdp.version=xxxxx

3. In $SPARK_HOME/conf/java-opts, add the following option (a filled-in example of both files is sketched below):
    -Dhdp.version=xxxxx
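For a concrete sketch, with 2.2.0.0-2041 standing in as an example build number (on HDP 2.2 you can usually read yours off hdp-select status hadoop-client), conf/spark-defaults.conf would contain:

    spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
    spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041

and conf/java-opts would contain the single line:

    -Dhdp.version=2.2.0.0-2041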

Thanks.

Zhan Zhang



On Mar 30, 2015, at 6:56 AM, Doug Balog <doug.sparku...@dugos.com> wrote:

The “best” solution to spark-shell’s problem is to create a file
$SPARK_HOME/conf/java-opts containing “-Dhdp.version=2.2.0.0-2014”.
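A quick way to create it (using the version string from this message; substitute your cluster’s HDP build number):

    echo "-Dhdp.version=2.2.0.0-2014" > $SPARK_HOME/conf/java-opts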

Cheers,

Doug

On Mar 28, 2015, at 1:25 PM, Michael Stone <mst...@mathom.us> wrote:

I've also been having trouble running 1.3.0 on HDP. The
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
configuration directive seems to work with pyspark, but does not propagate when
using spark-shell. (That is, everything works fine with pyspark, and
spark-shell fails with the "bad substitution" message.)

Mike Stone
