[ https://issues.apache.org/jira/browse/SPARK-26642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Luca Canali updated SPARK-26642:
--------------------------------
    Description: 
Currently spark-submit supports the option "--num-executors NUM" only for Spark 
on YARN. Users running Spark on K8S can specify the requested number of 
executors in spark-submit with "--conf spark.executor.instances=NUM".

This issue proposes to extend the spark-submit option --num-executors so that 
it also applies to Spark on K8S. The motivation is convenience, for example 
when migrating jobs written for YARN to run on K8S.
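For illustration, a minimal sketch of the invocations involved (the master 
URLs, executor count, and application jar below are placeholders, and other 
settings a real K8S submission needs, such as the container image, are 
omitted):

    # On YARN: the dedicated option is already supported
    spark-submit --master yarn --num-executors 4 my-app.jar

    # On K8S today: the executor count must be passed through --conf
    spark-submit --master k8s://https://<api-server>:6443 \
        --conf spark.executor.instances=4 my-app.jar

    # Proposed: accept --num-executors on K8S as well
    spark-submit --master k8s://https://<api-server>:6443 \
        --num-executors 4 my-app.jar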

  was:
Currently spark-submit supports the option "--num-executors NUM" only for Spark 
on YARN. Users running Spark on K8S can specify the requested number of 
executors in spark-submit with "-conf spark.executor.instances=NUM"

This proposes to extend the spark-submit option -num-executors to be applicable 
to Spark on K8S too. It is motivated by convenience, for example when migrating 
jobs written for YARN to run on K8S.


> Add --num-executors option to spark-submit for Spark on K8S
> -----------------------------------------------------------
>
>                 Key: SPARK-26642
>                 URL: https://issues.apache.org/jira/browse/SPARK-26642
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Luca Canali
>            Priority: Trivial
>
> Currently spark-submit supports the option "--num-executors NUM" only for 
> Spark on YARN. Users running Spark on K8S can specify the requested number of 
> executors in spark-submit with "--conf spark.executor.instances=NUM".
> This issue proposes to extend the spark-submit option --num-executors so 
> that it also applies to Spark on K8S. The motivation is convenience, for 
> example when migrating jobs written for YARN to run on K8S.


