Re: --driver-cores for Standalone and YARN only?! What about Mesos?

2016-06-02 Thread Holden Karau
Also seems like this might be better suited for dev@

On Thursday, June 2, 2016, Sun Rui wrote:

> Yes, I think you can file a JIRA issue for this.
> But why remove the default value? The default seems to be 1 core, according
> to
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala#L110
> On Jun 2, 2016, at 05:18, Jacek Laskowski wrote:
> [...]


-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau


Re: --driver-cores for Standalone and YARN only?! What about Mesos?

2016-06-02 Thread Sun Rui
Yes, I think you can file a JIRA issue for this.
But why remove the default value? The default seems to be 1 core, according to
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala#L110
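
The fallback Sun Rui points at can be sketched like this. This is an illustrative snippet, not the actual MesosRestServer code: the object and method names here are made up, and only the property key `spark.driver.cores` and the fallback value of 1 come from the linked source.

```scala
// Illustrative sketch (hypothetical names): how a submission server can
// resolve the driver-cores setting, falling back to 1 when the property
// is absent, as the linked MesosRestServer line does.
object DriverCoresDefault {
  val DefaultCores = 1

  def resolveDriverCores(sparkProperties: Map[String, String]): Int =
    sparkProperties.get("spark.driver.cores").map(_.toInt).getOrElse(DefaultCores)

  def main(args: Array[String]): Unit = {
    println(resolveDriverCores(Map.empty))                        // prints 1
    println(resolveDriverCores(Map("spark.driver.cores" -> "4"))) // prints 4
  }
}
```

So even though the --help text never mentions Mesos, a submission without an explicit setting would still end up with one driver core.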

On Jun 2, 2016, at 05:18, Jacek Laskowski wrote:
[...]


--driver-cores for Standalone and YARN only?! What about Mesos?

2016-06-01 Thread Jacek Laskowski
Hi,

I'm reviewing the code of spark-submit and can see that although
--driver-cores is said to be for Standalone and YARN only, it is
applicable for Mesos [1].

➜  spark git:(master) ✗ ./bin/spark-shell --help
Usage: ./bin/spark-shell [options]
...
 Spark standalone with cluster deploy mode only:
  --driver-cores NUM  Cores for driver (Default: 1).
...
 YARN-only:
  --driver-cores NUM  Number of cores used by the driver, only
in cluster mode
  (Default: 1).

I think Mesos has been overlooked (it's not even mentioned in the
--help output). I also can't confirm that the default number of driver
cores for this option is 1.

I can see a few things to fix:

1. List --driver-cores in the "main" help, with no separate entries for
standalone and YARN.
2. Add a note that it works only in cluster deploy mode.
3. Remove the "(Default: 1)" note.
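
The first two points above could look something like this. This is a rough sketch under my own assumptions, not Spark's actual SparkSubmit help code; all names here are hypothetical.

```scala
// Illustrative sketch (hypothetical names): describe --driver-cores once
// in a shared option table, with the cluster-mode caveat in its
// description, instead of repeating it under "Spark standalone with
// cluster deploy mode only" and "YARN-only" sections.
object HelpSketch {
  case class Opt(flag: String, description: String)

  // One shared entry instead of per-cluster-manager duplicates.
  val common = Seq(
    Opt("--driver-cores NUM",
        "Number of cores used by the driver (cluster deploy mode only).")
  )

  def render(opts: Seq[Opt]): String =
    opts.map(o => f"  ${o.flag}%-24s ${o.description}").mkString("\n")

  def main(args: Array[String]): Unit =
    println(render(common))
}
```

Whether the "(Default: 1)" note should stay then depends on whether all cluster managers actually share that default.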

Please confirm (or fix) my understanding before I file a JIRA issue. Thanks!

[1] 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L475-L476

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org