The “best” solution to spark-shell’s problem is to create a file
$SPARK_HOME/conf/java-opts
containing “-Dhdp.version=2.2.0.0-2041”
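A minimal sketch of that fix, assuming SPARK_HOME points at your Spark install (the build number 2.2.0.0-2041 is the one from this thread; substitute the value reported by `hdp-select` on your cluster):

```shell
# Fall back to a temp dir for this sketch if SPARK_HOME is not set;
# on a real HDP node it would be something like /usr/hdp/current/spark-client.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"

mkdir -p "$SPARK_HOME/conf"

# spark-shell's launcher reads extra JVM options from conf/java-opts,
# so hdp.version gets substituted into the YARN classpath entries.
echo "-Dhdp.version=2.2.0.0-2041" > "$SPARK_HOME/conf/java-opts"

cat "$SPARK_HOME/conf/java-opts"
```

Unlike spark.yarn.am.extraJavaOptions, this file is picked up by the spark-shell launcher itself, which is why it avoids the “bad substitution” failure described below.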

Cheers,

Doug

> On Mar 28, 2015, at 1:25 PM, Michael Stone <mst...@mathom.us> wrote:
> 
> I've also been having trouble running 1.3.0 on HDP. The 
> spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
> configuration directive seems to work with pyspark, but does not propagate when 
> using spark-shell. (That is, everything works fine with pyspark, and 
> spark-shell fails with the "bad substitution" message.)
> 
> Mike Stone
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 

