[ https://issues.apache.org/jira/browse/SPARK-1264?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aaron Davidson updated SPARK-1264:
----------------------------------

    Assignee:     (was: Aaron Davidson)

> Documentation for setting heap sizes across all configurations
> --------------------------------------------------------------
>
>                 Key: SPARK-1264
>                 URL: https://issues.apache.org/jira/browse/SPARK-1264
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.0.0
>            Reporter: Andrew Ash
>
> As a user, there are lots of places to configure heap sizes, and it takes a 
> bit of trial and error to figure out how to configure what you want.
> We need clearer documentation on how to set these for the cross product 
> of Spark components (master, worker, executor, driver, shell) and deployment 
> modes (Standalone, YARN, Mesos, EC2?).
> I'm happy to do the authoring if someone can help pull together the relevant 
> details.
> Here's the best I've got so far:
> {noformat}
> # Standalone cluster
> Master - SPARK_DAEMON_MEMORY - default: 512mb
> Worker - SPARK_DAEMON_MEMORY vs SPARK_WORKER_MEMORY? - default: ?  See 
> WorkerArguments.inferDefaultMemory()
> Executor - spark.executor.memory
> Driver - SPARK_DRIVER_MEMORY - default: 512mb
> Shell - A pre-built driver so SPARK_DRIVER_MEMORY - default: 512mb
> # EC2 cluster
> Master - ?
> Worker - ?
> Executor - ?
> Driver - ?
> Shell - ?
> # Mesos cluster
> Master - SPARK_DAEMON_MEMORY
> Worker - SPARK_DAEMON_MEMORY
> Executor - SPARK_EXECUTOR_MEMORY
> Driver - SPARK_DRIVER_MEMORY
> Shell - A pre-built driver so SPARK_DRIVER_MEMORY
> # YARN cluster
> Master - SPARK_MASTER_MEMORY ?
> Worker - SPARK_WORKER_MEMORY ?
> Executor - SPARK_EXECUTOR_MEMORY
> Driver - SPARK_DRIVER_MEMORY
> Shell - A pre-built driver so SPARK_DRIVER_MEMORY
> {noformat}
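For the standalone-cluster rows above, a `conf/spark-env.sh` along these lines would exercise the settings listed. The values are purely illustrative, not recommended defaults, and this only covers the env-var side; `spark.executor.memory` is a Spark property set via `SparkConf` or `--conf`, not in this file:

```shell
# conf/spark-env.sh -- illustrative values only, not recommendations

# Heap for the standalone Master and Worker daemon JVMs themselves
export SPARK_DAEMON_MEMORY=512m

# Total memory a Worker may hand out to executors on its machine
# (distinct from the daemon's own heap above)
export SPARK_WORKER_MEMORY=4g

# Heap for the driver; spark-shell is a pre-built driver, so this
# applies to the shell as well
export SPARK_DRIVER_MEMORY=512m
```

Per-application executor heap (`spark.executor.memory`) would then be set when submitting, e.g. `--conf spark.executor.memory=2g`.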



--
This message was sent by Atlassian JIRA
(v6.2#6252)
