heyang wang created ZEPPELIN-1860:
-------------------------------------

             Summary: Error when starting the Spark interpreter with 
SPARK_SUBMIT_OPTIONS in zeppelin-env.sh for options that require an int
                 Key: ZEPPELIN-1860
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1860
             Project: Zeppelin
          Issue Type: Bug
          Components: zeppelin-interpreter
    Affects Versions: 0.7.0
         Environment: hadoop 2.6.3 Spark 2.0.2
            Reporter: heyang wang
            Priority: Minor


My goal is to have the Zeppelin Spark interpreter use 100 executor cores. 
With the following setting in zeppelin-env.sh, I always get the error 
"java.lang.IllegalArgumentException: spark.executor.cores should be int, but 
was 2.":
export SPARK_SUBMIT_OPTIONS="--driver-memory 2048M --executor-memory 4G 
--num-executors 50 --executor-cores 2".

However, if I reorder the Spark options in zeppelin-env.sh as follows, 
everything works fine:
export SPARK_SUBMIT_OPTIONS="--driver-memory 2048M --num-executors 50 
--executor-cores 1 --executor-memory 4G".

Running "ps auxf | grep zeppelin" shows that the process created with the 
former setting looks like this (only the last few lines shown):

--class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer 
--executor-memory 4G --num-executors 50 --executor-cores 1. 
/usr/local/zeppelin/interpreter/spark/zeppelin-spark_2.11-0.7.0-SNAPSHOT.jar 
14424

while the process created with the latter setting looks like this:
--class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer 
--num-executors 50 --executor-cores 1 --executor-memory 4G. 
/usr/local/zeppelin/interpreter/spark/zeppelin-spark_2.11-0.7.0-SNAPSHOT.jar 
31902

The dot after "--executor-cores 1" is clearly what causes the error. Somehow 
this is not a problem for a string-valued option like --executor-memory, 
since the dot after "4G" doesn't cause an error.
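A likely explanation for where the dot comes from (my assumption, not confirmed against the Zeppelin code): in bash, characters immediately after a closing quote are still concatenated into the variable's value, so the sentence-ending period after the quote in zeppelin-env.sh becomes part of SPARK_SUBMIT_OPTIONS and ends up glued to the last option's value. A minimal sketch:

```shell
#!/usr/bin/env bash
# The period after the closing quote is concatenated into the value,
# so the last option's argument becomes "2." instead of the integer "2".
export SPARK_SUBMIT_OPTIONS="--num-executors 50 --executor-cores 2".
echo "$SPARK_SUBMIT_OPTIONS"   # ...--executor-cores 2.

# Dropping the trailing period keeps the last value a clean integer:
export SPARK_SUBMIT_OPTIONS="--num-executors 50 --executor-cores 2"
echo "$SPARK_SUBMIT_OPTIONS"   # ...--executor-cores 2
```

This would also explain why reordering "fixes" it: the stray dot then attaches to a value whose parser happens to tolerate it.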



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)