Russell Spitzer created ZEPPELIN-2416:
-----------------------------------------

             Summary: Make Spark Properties Optional in Spark Interpreter
                 Key: ZEPPELIN-2416
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2416
             Project: Zeppelin
          Issue Type: Improvement
          Components: Interpreters
    Affects Versions: 0.7.1
            Reporter: Russell Spitzer
            Priority: Trivial


While experimenting with connecting Zeppelin to DataStax Enterprise, I 
noticed that the Spark Interpreter requires the "master" interpreter 
property to be present in order to work. Several locations perform 
null-unsafe operations on the return value of getProperty("master").

See
https://github.com/apache/zeppelin/blob/v0.7.1/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L310

The value of this property is then written into the SparkConf even when a 
value is already set there:
https://github.com/apache/zeppelin/blob/v0.7.1/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L335
and
https://github.com/apache/zeppelin/blob/v0.7.1/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L449

For this and other parameters it might make sense to keep values that are 
already set rather than overwriting them. This would allow Zeppelin to work 
better with "spark-defaults" and with vendors that preset these parameters in 
the environment.

I would propose that all interpreter properties be optional. Operationally 
this means "master" may be left unset when it can be inherited automatically 
from the environment via the SparkConf. Other parameters would simply need a 
null check so that they are skipped, rather than set, when they are null 
(unset).
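To illustrate, a minimal sketch of the proposed behavior (this is hypothetical helper code, not Zeppelin's actual SparkInterpreter; the property and conf key names are just examples):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Sketch: copy an interpreter property into a SparkConf-like map only when
// it is actually set, so values already present (e.g. from spark-defaults
// or a vendor environment) are neither overwritten nor NPE'd on.
public class OptionalPropertySketch {
    static void setIfPresent(Properties interpreterProps,
                             Map<String, String> sparkConf,
                             String propName, String confKey) {
        String value = interpreterProps.getProperty(propName);
        if (value != null && !value.trim().isEmpty()) {
            sparkConf.put(confKey, value);  // an explicit setting still wins
        }
        // otherwise keep whatever the environment already provided
    }

    public static void main(String[] args) {
        Properties props = new Properties();  // "master" left unset by user
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.master", "spark://dse-host:7077"); // preset by vendor

        setIfPresent(props, conf, "master", "spark.master");
        System.out.println(conf.get("spark.master")); // preset value survives
    }
}
```

With this approach, leaving "master" blank in the interpreter settings would fall through to the preset value instead of failing.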



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)