We were using Spark 1.6 and are now on 2.0.1; both versions show the same
issue.

I dove deep into the Spark code and identified that the extra Java
options are /not/ added to the executor processes. At this point, I
believe you have to use spark-defaults.conf to set any values that should
take effect. The problem for us is that these extra Java options are not
the same for each job we submit, so we can't put the values in
spark-defaults.conf.
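
For anyone who wants to check this on their own cluster, here is a minimal
sketch of how we verified it (the class name, option name, job id, and
Mesos master URL are all hypothetical placeholders). It reads a system
property inside the executor JVMs that was supposedly set via
spark.executor.extraJavaOptions at submit time:

  import org.apache.spark.sql.SparkSession

  // Submit with, e.g.:
  //   spark-submit --master mesos://<master>:7077 --deploy-mode cluster \
  //     --conf spark.executor.extraJavaOptions=-Dcom.example.jobId=job-42 \
  //     --class CheckExecutorOptions app.jar
  object CheckExecutorOptions {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("check-executor-options")
        .getOrCreate()
      val sc = spark.sparkContext

      // Read the system property inside the executor JVMs. If the extra
      // Java options were dropped on the way to the executors (as
      // described above), this comes back as "null".
      val seen = sc.parallelize(1 to sc.defaultParallelism)
        .map(_ => String.valueOf(System.getProperty("com.example.jobId")))
        .distinct()
        .collect()

      println(s"com.example.jobId on executors: ${seen.mkString(", ")}")
      spark.stop()
    }
  }

With the behavior described above, the property only shows up on the
executors when the same setting is also present in spark-defaults.conf.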

Ivan


