Hi Arun!

I think you can find info at https://spark.apache.org/docs/latest/configuration.html
quote:

Spark provides three locations to configure the system:

  * Spark properties
    <https://spark.apache.org/docs/latest/configuration.html#spark-properties>
    control most application parameters and can be set by using a SparkConf
    <https://spark.apache.org/docs/latest/api/core/index.html#org.apache.spark.SparkConf>
    object, or through Java system properties.
  * Environment variables
    <https://spark.apache.org/docs/latest/configuration.html#environment-variables>
    can be used to set per-machine settings, such as the IP address,
    through the conf/spark-env.sh script on each node.
  * Logging
    <https://spark.apache.org/docs/latest/configuration.html#configuring-logging>
    can be configured through log4j.properties.

For your question, I guess you can use:

  spark.executor.extraJavaOptions (default: none)
    A string of extra JVM options to pass to executors, for instance GC
    settings or other logging. Note that it is illegal to set Spark
    properties or heap size settings with this option. Spark properties
    should be set using a SparkConf object or the spark-defaults.conf file
    used with the spark-submit script. Heap size settings can be set with
    spark.executor.memory.

You can find it under "Runtime Environment" on that configuration page.
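
Since you are launching with spark-submit, a minimal sketch of passing it
on the command line (the class name and jar are placeholders; the GC flags
are the ones from your mail):

    spark-submit \
      --class com.example.MyApp \
      --conf "spark.executor.extraJavaOptions=-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
      my-app.jar

The same key/value pair can also go in conf/spark-defaults.conf so you do
not have to repeat it on every submit.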

Larry

On 9/24/14 10:52 PM, Arun Ahuja wrote:
What is the proper way to specify Java options for the Spark executors
using spark-submit? We had done this previously using

export SPARK_JAVA_OPTS=".."

for example, to attach a debugger to each executor or to add "-verbose:gc
-XX:+PrintGCDetails -XX:+PrintGCTimeStamps".

On spark-submit I see --driver-java-options, but is there an equivalent for individual executors?

Thanks,
Arun
