[https://issues.apache.org/jira/browse/SPARK-11555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel]
Marcelo Vanzin resolved SPARK-11555.
------------------------------------
Resolution: Fixed
Fix Version/s: 1.6.0, 1.5.3
> spark on yarn spark-class --num-workers doesn't work
> ----------------------------------------------------
>
> Key: SPARK-11555
> URL: https://issues.apache.org/jira/browse/SPARK-11555
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 1.5.2
> Reporter: Thomas Graves
> Assignee: Thomas Graves
> Priority: Critical
> Fix For: 1.5.3, 1.6.0
>
>
> When using the old spark-class interface with --num-workers, the
> --num-workers parameter is ignored and Spark always uses the default
> number of executors (2).
> bin/spark-class org.apache.spark.deploy.yarn.Client --jar
> lib/spark-examples-1.5.2.0-hadoop2.6.0.16.1506060127.jar --class
> org.apache.spark.examples.SparkPi --num-workers 4 --worker-memory 2g
> --master-memory 1g --worker-cores 1 --queue default
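For comparison, the deprecated Client flags have spark-submit counterparts (--num-workers became --num-executors, --worker-memory became --executor-memory, --master-memory became --driver-memory, --worker-cores became --executor-cores). A hedged sketch of the equivalent modern invocation, assuming the same example jar path from the report:

```shell
# Equivalent spark-submit invocation (sketch; flag mapping per the
# spark-class-to-spark-submit deprecation, jar path taken from the report)
bin/spark-submit \
  --master yarn \
  --class org.apache.spark.examples.SparkPi \
  --num-executors 4 \
  --executor-memory 2g \
  --driver-memory 1g \
  --executor-cores 1 \
  --queue default \
  lib/spark-examples-1.5.2.0-hadoop2.6.0.16.1506060127.jar
```

With spark-submit, --num-executors is honored on YARN, which is the behavior the old --num-workers flag was supposed to provide.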
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)