zlosim commented on PR #4398:
URL: https://github.com/apache/zeppelin/pull/4398#issuecomment-1180703056

   Hi @Reamer, thanks for the clarification.
   Yes, one can set env vars and create a custom interpreter-spec, but that is not what this fix is trying to do, and it does not reach feature parity with Spark on YARN mode. Let me provide an example to better explain myself:
   When using Zeppelin in a multi-tenant environment with Spark on a Hadoop cluster, each user can set their own dependencies and driver configurations via inline configuration or the interpreter settings page, and everything works as expected.
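   For example, on YARN a single user can pin their own dependencies and driver settings in an inline configuration paragraph (run before the interpreter starts) with something like the following; the package coordinate and values here are just placeholders:
   
   ```
   %spark.conf
   spark.jars.packages           org.postgresql:postgresql:42.3.6
   spark.driver.memory           4g
   spark.driver.extraJavaOptions -Dfile.encoding=UTF-8
   ```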
   On the other hand, when setting dependencies or driver configs on Kubernetes, these are ignored, and the only way to set them is via `SPARK_SUBMIT_OPTIONS` as you mentioned (see the sketch after this list), BUT:
   
   - this setting is fixed for every user, while users sometimes want different sets of dependencies and driver configurations
   - as stated in the Zeppelin docs: `To be noticed, SPARK_SUBMIT_OPTIONS is deprecated and will be removed in future release.`
   - in my opinion, this fix lets us give users the same experience with Spark running on Kubernetes as on a Hadoop cluster or in standalone mode
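   
   For reference, the `SPARK_SUBMIT_OPTIONS` workaround mentioned above looks roughly like this in `conf/zeppelin-env.sh` (coordinates are placeholders); note that it is one global value shared by every user and note:
   
   ```bash
   # conf/zeppelin-env.sh -- one fixed value for all users and notes
   export SPARK_SUBMIT_OPTIONS="--packages org.postgresql:postgresql:42.3.6 --driver-memory 4g"
   ```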
   
   
   > Your configuration values like `spark.driver.extraClassPath` and 
`spark.driver.extraLibraryPath` should take no effect, because Zeppelin starts 
Spark on Kubernetes always in client mode.
   
   Yes, it is running in client mode, but the driver has not started yet, so there is still a chance to pass these options to the driver at startup, where they can take effect. We are doing the same in [SparkInterpreterLauncher](https://github.com/apache/zeppelin/blob/c4c580a37fde649553d336984a94bcb1b2821201/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/launcher/SparkInterpreterLauncher.java#L80)
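   
   To illustrate the pattern (a simplified sketch of the idea, not the actual `SparkInterpreterLauncher` code): `spark.*` properties from the interpreter setting are folded into the launch command before the driver JVM exists, which is why driver-side options can still take effect in client mode:
   
   ```java
   import java.util.Properties;
   
   // Simplified sketch of the launcher pattern, not Zeppelin's real code:
   // turn "spark.*" interpreter properties into spark-submit --conf options
   // before the driver JVM is started.
   public class DriverConfSketch {
     static String buildSparkSubmitConf(Properties props) {
       StringBuilder conf = new StringBuilder();
       for (String key : props.stringPropertyNames()) {
         String value = props.getProperty(key);
         // In client mode the driver is not running yet, so even
         // spark.driver.extraClassPath / extraLibraryPath can be applied here.
         if (key.startsWith("spark.") && !value.isEmpty()) {
           conf.append(" --conf ").append(key).append('=').append(value);
         }
       }
       return conf.toString().trim();
     }
   
     public static void main(String[] args) {
       Properties props = new Properties();
       props.setProperty("spark.driver.extraClassPath", "/opt/jdbc/postgresql.jar");
       props.setProperty("spark.driver.memory", "4g");
       // Prints one "--conf key=value" pair per property (order not guaranteed).
       System.out.println(buildSparkSubmitConf(props));
     }
   }
   ```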
   
   > You can also set `SPARK_DRIVER_EXTRAJAVAOPTIONS_CONF` directly in your 
Spark Zeppelin interpreter configuration if you want to override the (possibly 
empty) default setting.
   
   I'm sorry, I was not aware of that. I know we can set a few [params with env vars](https://github.com/apache/zeppelin/blob/934e38add26157b87d6b247e4efca4f795411a2c/spark/interpreter/src/main/resources/interpreter-setting.json#L227), but I can't find how to set an env var from the interpreter configuration.
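   
   For context, the env-var-backed parameters I mean are declared in `interpreter-setting.json` roughly as in the excerpt below (quoted from memory, values abridged), with `envName` controlling which environment variable the property is exported as:
   
   ```json
   "PYSPARK_PYTHON": {
     "envName": "PYSPARK_PYTHON",
     "propertyName": "PYSPARK_PYTHON",
     "defaultValue": "python",
     "description": "Python binary executable for PySpark on driver and executors.",
     "type": "string"
   }
   ```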
   

