On Wed, Nov 9, 2016 at 1:34 PM, Tseytlin, Keren <
keren.tseyt...@capitalone.com> wrote:

> Hi All,
>
> I’ve just set up Zeppelin, and I’ve also set up my own Spark with a
> connection to Alluxio. I installed Zeppelin using the binary. When I use
> Zeppelin, it seems to be using some internal Spark, not the one that I set
> up. What configuration should I set so that notebooks and Spark jobs
> execute on my own Spark?
>
> I edited zeppelin-env.sh and added SPARK_HOME, but that caused anything I
> tried to run in my notebook to just return “ERROR” with no output.
>
> Any help would be much appreciated! Thanks!!
>
> Best,
>
> Keren
>
Try updating the following in zeppelin-env.sh:
export MASTER="spark://spark-02.softlayer.com:7077"
export SPARK_HOME=/opt/spark-1.6.2-bin-hadoop2.6

Then, in the Zeppelin UI, update the Spark interpreter configuration so
that the master URL is set correctly.
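Since the question also mentions Alluxio, here is a slightly fuller
zeppelin-env.sh sketch. The master host, Spark path, and Alluxio client
jar path below are placeholders, not values from this thread — adjust
them to match your cluster and Alluxio version:

```shell
# conf/zeppelin-env.sh — example values only, edit for your environment

# Point Zeppelin at the external standalone Spark master instead of the
# embedded local Spark it falls back to by default:
export MASTER="spark://your-master-host:7077"

# Directory of your own Spark installation (the one built/configured
# against Alluxio):
export SPARK_HOME=/opt/spark-1.6.2-bin-hadoop2.6

# If notebooks read alluxio:// paths, ship the Alluxio client jar to the
# Spark executors. The jar path/name here is an assumption — use the
# client jar matching your Alluxio release:
export SPARK_SUBMIT_OPTIONS="--jars /opt/alluxio/client/alluxio-client.jar"
```

After editing, restart Zeppelin so the new environment is picked up, and
verify with a notebook paragraph such as `sc.version` that the version
matches your external Spark rather than the bundled one.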

-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
