I'm trying to run Zeppelin with the local Spark interpreter.
Basically everything works, but if I set
`spark.driver.extraJavaOptions` or `spark.executor.extraJavaOptions`
to a value containing several arguments, I get an exception.
For instance, when providing `-DmyParam=1 -DmyOtherParam=2`, I get:
Error: Unrecognized option: -DmyOtherParam=2

I noticed that the spark-submit command looks as follows:

spark-submit --class
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
--driver-class-path
....   --conf spark.driver.extraJavaOptions=-DmyParam=1 -DmyOtherParam=2
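My guess at why this fails (just an illustration, I haven't fully traced the Zeppelin sources; the class name below is made up): if the launcher builds the submit command as one string that is later split on whitespace, the conf value falls apart into two tokens, which would match the error above.

public class SplitDemo {
    public static void main(String[] args) {
        // Hypothetical illustration, not the actual Zeppelin code: build the
        // conf into a single command string, then split on whitespace.
        String conf = "spark.driver.extraJavaOptions=-DmyParam=1 -DmyOtherParam=2";
        String command = "spark-submit --conf " + conf;
        String[] argv = command.split("\\s+");
        // argv now contains "spark.driver.extraJavaOptions=-DmyParam=1" and a
        // stray "-DmyOtherParam=2", which spark-submit rejects as an
        // unrecognized option.
        for (String a : argv) {
            System.out.println(a);
        }
    }
}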

So I tried to patch SparkInterpreterLauncher to add quotation marks (like
in the example from the Spark documentation:
https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties).

I see that the quotation marks were added: --conf
"spark.driver.extraJavaOptions=-DmyParam=1 -DmyOtherParam=2"
But I still get the same error.
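
If the interpreter process is started directly via something like ProcessBuilder (an assumption on my part), that would explain it: there is no shell to strip the quotes, so they just become literal characters inside the property value. The value with the space has to arrive as a single argv element. A minimal sketch of what I mean (class name and the trimmed argument list are hypothetical):

import java.util.Arrays;
import java.util.List;

public class QuotingSketch {
    public static void main(String[] args) throws Exception {
        // Assumption: the process is launched with ProcessBuilder. Each list
        // element is handed to spark-submit verbatim, so the conf containing
        // a space must be ONE element; embedded quotes are not needed and
        // would only end up as literal characters in the value.
        List<String> cmd = Arrays.asList(
            "spark-submit",
            "--class", "org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer",
            "--conf", "spark.driver.extraJavaOptions=-DmyParam=1 -DmyOtherParam=2"
            // ...the rest of the real command (driver class path, interpreter
            // jar, etc.) is omitted here.
        );
        new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}

If, on the other hand, the command goes through a shell script or an environment variable that gets word-split before spark-submit sees it, the quotes have to survive until the shell that actually parses them, which might be why adding them in SparkInterpreterLauncher alone changed nothing.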

Any idea how I can make it work?
