Hi all,

We're looking to support multiple Spark versions in the same Zeppelin
instance. Can this be done with multiple Spark interpreter groups, or is
there another way?

We already use multiple interpreters (created via "Create" in the
Interpreter UI) to configure different Spark environments, all using the
group "spark".

How can I copy the spark group and adjust its SPARK_HOME? I could not find
an interpreter/spark/interpreter-setting.json, which I assume is where this
would be configured.
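
To make it concrete, what I had in mind is roughly a second group entry
next to the existing spark one, pointing at its own SPARK_HOME. The JSON
below is only a sketch from memory; the exact schema may differ, and the
group name "spark24" and the path are placeholders:

    [
      {
        "group": "spark24",
        "name": "spark",
        "className": "org.apache.zeppelin.spark.SparkInterpreter",
        "properties": {
          "SPARK_HOME": {
            "envName": "SPARK_HOME",
            "propertyName": "SPARK_HOME",
            "defaultValue": "/opt/spark-2.4.8",
            "description": "Spark installation used by this group"
          }
        }
      }
    ]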

Thanks,
Fabian
