Easier solution: create a separate instance of the Spark interpreter for each
use case:

1) For embedded Spark, just leave the master property set to local[*]
2) For system-provided Spark, edit the Spark interpreter settings and
change the master to something like spark://<master_ip>:7077
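
As a rough sketch only (the name "spark_cluster" and the master URL
below are placeholders for illustration, and the exact property layout
depends on your Zeppelin version), you would end up with two Spark
interpreter settings on the Interpreter page:

    spark            (embedded)
      master = local[*]

    spark_cluster    (system-provided cluster)
      master = spark://<master_ip>:7077

You can then bind the interpreter you want per note from the notebook's
interpreter binding menu, or start a paragraph with %spark_cluster
(assuming that is what you named the second setting) to target the
cluster explicitly.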

On Mon, Aug 8, 2016 at 9:52 AM, Patrick Duflot <patrick.duf...@iba-group.com> wrote:

> Hello Zeppelin users,
>
> I was looking to configure Zeppelin so that it uses the embedded Spark for
> some notebooks but the system-provided Spark for others.
>
> However, it seems that SPARK_HOME is a global parameter in
> zeppelin-env.sh.
>
> Is it possible to override this setting at the notebook level?
>
> Thanks!
>
> Patrick
>
