[ https://issues.apache.org/jira/browse/SPARK-12534?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-12534:
------------------------------
    Comment: was deleted

(was: I don't think this info is worth maintaining redundantly in the configs 
doc. It's not about the CLI.)

> Document missing command line options to Spark properties mapping
> -----------------------------------------------------------------
>
>                 Key: SPARK-12534
>                 URL: https://issues.apache.org/jira/browse/SPARK-12534
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Documentation, YARN
>    Affects Versions: 1.5.2
>            Reporter: Felix Cheung
>            Assignee: Apache Spark
>            Priority: Minor
>
> Several Spark properties equivalent to spark-submit command line options are 
> missing from the documentation.
> {quote}
> The equivalent of spark-submit --num-executors should be 
> spark.executor.instances when used in SparkConf.
> http://spark.apache.org/docs/latest/running-on-yarn.html
> Could you try setting that with sparkR.init()?
> _____________________________
> From: Franc Carter <franc.car...@gmail.com>
> Sent: Friday, December 25, 2015 9:23 PM
> Subject: number of executors in sparkR.init()
> To: <u...@spark.apache.org>
> Hi,
> I'm having trouble working out how to get the number of executors set when 
> using sparkR.init().
> If I start sparkR with
>   sparkR  --master yarn --num-executors 6 
> then I get 6 executors
> However, if I start sparkR with
>   sparkR 
> followed by
>   sc <- sparkR.init(master="yarn-client",
>                     sparkEnvir=list(spark.num.executors='6'))
> then I only get 2 executors.
> Can anyone point me in the direction of what I might be doing wrong? I need 
> to initialise it this way so that RStudio can hook into SparkR.
> thanks
> -- 
> Franc
> {quote}
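> A minimal sketch of the fix suggested above: pass spark.executor.instances 
> (the property behind the --num-executors flag) via sparkEnvir, instead of the 
> unrecognized spark.num.executors used in the original mail. This assumes a 
> working Spark/YARN installation and the SparkR 1.x API shown in the thread.
> {code}
> library(SparkR)
> 
> # spark.executor.instances is the documented equivalent of --num-executors;
> # properties Spark does not recognize (e.g. spark.num.executors) are
> # silently ignored, which is why only the default executors appeared.
> sc <- sparkR.init(
>   master = "yarn-client",
>   sparkEnvir = list(spark.executor.instances = "6")
> )
> {code}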



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
