Hi all,

I've been struggling with this and feel there should be a way to fix it.
With the new Zeppelin 0.5.6 we are able to point SPARK_HOME at an external
Spark installation, which is good because we keep a lot of Spark
configuration in SPARK_HOME/conf/spark-defaults.conf. However, one
property, spark.executor.memory, is set to the default 512m inside
zeppelin/conf/interpreter.json, and whenever we restart Zeppelin this
value is always picked up instead of the one in spark-defaults.conf. The
only way we have found to work around this is to pass
-Dspark.executor.memory in ZEPPELIN_JAVA_OPTS.
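For reference, here is roughly what our current workaround looks like in
conf/zeppelin-env.sh (the path and the 4g value are illustrative, not our
exact settings):

    # Point Zeppelin at the external Spark installation
    export SPARK_HOME=/opt/spark
    # Forces executor memory, overriding the 512m from interpreter.json
    export ZEPPELIN_JAVA_OPTS="-Dspark.executor.memory=4g"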

In addition, if we go to the Interpreter settings page in a running
Zeppelin and restart the Spark interpreter, it also restarts the Spark
context with the default values from interpreter.json rather than those in
spark-defaults.conf.
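For illustration, the offending entry in zeppelin/conf/interpreter.json
looks roughly like this (trimmed to the relevant property; the surrounding
structure may differ slightly between builds):

    "properties": {
      "spark.executor.memory": "512m"
    }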

What is the proper way to configure Zeppelin with an external Spark so
that Zeppelin reads the values in spark-defaults.conf as its Spark
interpreter settings?

Regards,
Weipu
