[ 
https://issues.apache.org/jira/browse/SPARK-1099?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Patrick Wendell updated SPARK-1099:
-----------------------------------

    Summary: Allow inferring number of cores with local[*]  (was: Spark's local 
mode should respect spark.cores.max by default)

> Allow inferring number of cores with local[*]
> ---------------------------------------------
>
>                 Key: SPARK-1099
>                 URL: https://issues.apache.org/jira/browse/SPARK-1099
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>            Reporter: Aaron Davidson
>            Assignee: Aaron Davidson
>            Priority: Minor
>             Fix For: 1.0.0
>
>
> It seems reasonable that the default number of cores used by Spark's local 
> mode (when no value is specified) should be drawn from the spark.cores.max 
> configuration parameter (which, conveniently, is now settable as a 
> command-line option in spark-shell).
> For the sake of consistency, this change would likely also entail making the 
> default number of cores, when spark.cores.max is NOT specified, be as many 
> logical cores as the machine has (which is what standalone mode does). This 
> too seems reasonable, as Spark is inherently a distributed system and I 
> think it's expected that it should use multiple cores by default. However, 
> it is a behavioral change, and thus requires caution.
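The inference described above can be sketched as follows. This is a hypothetical illustration in Python, not Spark's actual (Scala) implementation: it parses a Spark-style local master string and, for "local[*]", infers as many worker threads as the machine has logical cores.

```python
import os
import re

def parse_local_master(master: str) -> int:
    """Return the number of worker threads implied by a Spark-style
    local master string. "local[*]" infers the machine's logical core
    count, mirroring the behavior proposed in this issue."""
    if master == "local":
        # Plain "local" runs with a single thread.
        return 1
    m = re.fullmatch(r"local\[(\*|\d+)\]", master)
    if m is None:
        raise ValueError(f"not a local master string: {master!r}")
    if m.group(1) == "*":
        # Infer as many threads as there are logical cores.
        return os.cpu_count() or 1
    return int(m.group(1))

print(parse_local_master("local[4]"))  # -> 4
print(parse_local_master("local[*]"))  # logical core count of this machine
```

In actual use the value would feed into how many task slots the local scheduler backend creates, so "local[*]" saturates the machine by default while explicit "local[N]" still pins the count.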



--
This message was sent by Atlassian JIRA
(v6.2#6252)
