Hi Folks,

Just to summarize what is needed to run Spark on the HDP distribution.

1. The Spark version has to be 1.3.0 or above if you are using the upstream 
distribution.  This configuration exists mainly for HDP rolling-upgrade 
purposes, and the patch only went into Spark upstream starting with 1.3.0.

2. In $SPARK_HOME/conf/spark-defaults.conf, add the following settings:
   spark.driver.extraJavaOptions -Dhdp.version=xxxxx
   spark.yarn.am.extraJavaOptions -Dhdp.version=xxxxx

3. In $SPARK_HOME/conf/java-opts, add the following option (a combined 
   example of steps 2 and 3 follows this list).
   -Dhdp.version=xxxxx
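
Putting steps 2 and 3 together, a minimal sketch of the setup (assuming a 
bash shell, that $SPARK_HOME is set, and a made-up HDP build number; 
substitute your cluster's actual version string, which hdp-select versions 
should report on an HDP node):

    # Hypothetical build number; use your cluster's real HDP version string.
    HDP_VERSION=2.2.0.0-2041

    # Step 2: driver and YARN AM Java options in spark-defaults.conf.
    echo "spark.driver.extraJavaOptions -Dhdp.version=$HDP_VERSION"  >> "$SPARK_HOME/conf/spark-defaults.conf"
    echo "spark.yarn.am.extraJavaOptions -Dhdp.version=$HDP_VERSION" >> "$SPARK_HOME/conf/spark-defaults.conf"

    # Step 3: java-opts is read when the spark-shell driver JVM starts.
    echo "-Dhdp.version=$HDP_VERSION" > "$SPARK_HOME/conf/java-opts"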

Thanks.

Zhan Zhang



On Mar 30, 2015, at 6:56 AM, Doug Balog 
<doug.sparku...@dugos.com> wrote:

The “best” solution to spark-shell’s problem is creating a file 
$SPARK_HOME/conf/java-opts
with “-Dhdp.version=2.2.0.0-2014”
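
For example, from a shell (assuming $SPARK_HOME points at your Spark install):

    echo "-Dhdp.version=2.2.0.0-2014" > "$SPARK_HOME/conf/java-opts"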

Cheers,

Doug

On Mar 28, 2015, at 1:25 PM, Michael Stone 
<mst...@mathom.us> wrote:

I've also been having trouble running 1.3.0 on HDP. The 
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
configuration directive seems to work with pyspark, but it does not propagate 
when using spark-shell. (That is, everything works fine with pyspark, and 
spark-shell fails with the "bad substitution" message.)
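
A possible alternative I haven't verified: spark-shell, like spark-submit, 
accepts a --driver-java-options flag that sets driver JVM options at launch 
time, which should have the same effect as the java-opts file mentioned above:

    spark-shell --master yarn-client --driver-java-options "-Dhdp.version=2.2.0.0-2041"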

Mike Stone
