[GitHub] spark issue #21033: [SPARK-19320][MESOS]allow specifying a hard limit on num...

2018-04-13 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/21033
  
Is there anything else we need to do to merge this change?


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #21033: [SPARK-19320][MESOS][WIP]allow specifying a hard ...

2018-04-12 Thread yanji84
Github user yanji84 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21033#discussion_r181231118
  
--- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosCoarseGrainedSchedulerBackendSuite.scala ---
@@ -165,18 +165,47 @@ class MesosCoarseGrainedSchedulerBackendSuite extends SparkFunSuite
   }
 
 
-  test("mesos does not acquire more than spark.mesos.gpus.max") {
-    val maxGpus = 5
-    setBackend(Map("spark.mesos.gpus.max" -> maxGpus.toString))
+  test("mesos acquires spark.mesos.executor.gpus number of gpus per executor") {
+    setBackend(Map("spark.mesos.gpus.max" -> "5",
+      "spark.mesos.executor.gpus" -> "2"))
 
     val executorMemory = backend.executorMemory(sc)
-    offerResources(List(Resources(executorMemory, 1, maxGpus + 1)))
+    offerResources(List(Resources(executorMemory, 1, 5)))
 
     val taskInfos = verifyTaskLaunched(driver, "o1")
     assert(taskInfos.length == 1)
 
     val gpus = backend.getResource(taskInfos.head.getResourcesList, "gpus")
-    assert(gpus == maxGpus)
+    assert(gpus == 2)
+  }
+
+
+  test("mesos declines offers that cannot satisfy spark.mesos.executor.gpus") {
+    setBackend(Map("spark.mesos.gpus.max" -> "5",
--- End diff --

Sounds good. Added the test


---




[GitHub] spark pull request #21033: [SPARK-19320][MESOS][WIP]allow specifying a hard ...

2018-04-10 Thread yanji84
GitHub user yanji84 opened a pull request:

https://github.com/apache/spark/pull/21033

[SPARK-19320][MESOS][WIP]allow specifying a hard limit on number of gpus required in each spark executor when running on mesos

## What changes were proposed in this pull request?

Currently, Spark only allows specifying overall GPU resources as an upper 
limit. This adds a new configuration parameter to allow specifying a hard 
limit on the number of GPU cores for each executor, while still respecting 
the overall GPU resource constraint.
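The per-offer decision this description implies can be sketched as follows. This is a hypothetical illustration, not the actual patch: the names (`executorGpus`, `maxGpus`, `totalGpusAcquired`) are illustrative stand-ins for whatever the real scheduler backend uses.

```scala
// Hypothetical sketch of the per-offer GPU accounting described above.
object GpuAllocationSketch {
  // Returns Some(gpus to take from this offer), or None to decline it.
  def gpusToAcquire(offeredGpus: Int,
                    executorGpus: Int,      // per-executor hard limit, 0 = unset
                    maxGpus: Int,           // overall cap across all executors
                    totalGpusAcquired: Int): Option[Int] = {
    if (executorGpus > 0) {
      // Hard limit set: take exactly executorGpus from the offer, or decline
      // if the offer is too small or the overall budget would be exceeded.
      if (offeredGpus >= executorGpus &&
          totalGpusAcquired + executorGpus <= maxGpus) Some(executorGpus)
      else None
    } else {
      // No hard limit: greedily take up to the remaining overall budget.
      Some(math.min(offeredGpus, maxGpus - totalGpusAcquired))
    }
  }
}
```

Under this sketch, an executor with a hard limit of 2 takes exactly 2 GPUs from a 5-GPU offer, and declines a 1-GPU offer outright, matching the two unit tests discussed in this thread.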

## How was this patch tested?

Unit Testing

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/yanji84/spark ji/hard_limit_on_gpu

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21033.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #21033


commit cec434a1eba6227814ba5a842ff8f41103217539
Author: Ji Yan 
Date:   2017-03-10T05:30:11Z

respect both gpu and maxgpu

commit c427e151dbf63815f25d20fe1b099a7b09e85f51
Author: Ji Yan 
Date:   2017-05-14T20:02:16Z

fix gpu offer

commit 1e61996c31ff3a01396738fd91adf69952fd3558
Author: Ji Yan 
Date:   2017-05-14T20:15:55Z

syntax fix

commit f24dbe17787acecd4c032e25d820cb59d8b6d491
Author: Ji Yan 
Date:   2017-05-15T00:30:50Z

pass all tests

commit f89e5ccae02667d4f55e7aeb1f805a9cfaee1558
Author: Ji Yan 
Date:   2018-04-10T18:37:14Z

remove redundant




---




[GitHub] spark issue #17979: [SPARK-19320][MESOS][WIP]allow specifying a hard limit o...

2018-03-21 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17979
  
We are really in need of this. Can we reopen this?


---




[GitHub] spark pull request #17979: [SPARK-19320][MESOS][WIP]allow specifying a hard ...

2017-05-14 Thread yanji84
GitHub user yanji84 opened a pull request:

https://github.com/apache/spark/pull/17979

[SPARK-19320][MESOS][WIP]allow specifying a hard limit on number of gpus required in each spark executor when running on mesos

## What changes were proposed in this pull request?

Currently, Spark only allows specifying overall GPU resources as an upper 
limit. This adds a new configuration parameter to allow specifying a hard 
limit on the number of GPU cores for each executor, while still respecting 
the overall GPU resource constraint.

## How was this patch tested?

Unit testing

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/yanji84/spark ji/set_allow_set_docker_user

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/17979.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #17979


commit 5f8ccd5789137363e035d1dfb9a05d3b9bf3ce6b
Author: Ji Yan 
Date:   2017-03-10T05:30:11Z

respect both gpu and maxgpu

commit 33ebff693d9b78a15221f931dbbca777cba944e0
Author: Ji Yan 
Date:   2017-03-10T05:43:21Z

Merge branch 'master' into ji/hard_limit_on_gpu

commit c2c1c5b66436a439e1d7342b7a2c58c502e26d6b
Author: Ji Yan 
Date:   2017-03-10T05:30:11Z

respect both gpu and maxgpu

commit c5c5c379fc27f579952700fdf2d15dae9eba104a
Author: Ji Yan 
Date:   2017-05-13T16:25:48Z

Merge branch 'ji/hard_limit_on_gpu' of https://github.com/yanji84/spark 
into ji/hard_limit_on_gpu

commit ba87b35817a7288b9b6aa41f4ac2244e235f2efd
Author: Ji Yan 
Date:   2017-05-13T16:53:59Z

fix syntax

commit 5ef2881a2b1e1180b73d532988bab72c5fdab64c
Author: Ji Yan 
Date:   2017-05-14T20:02:16Z

fix gpu offer

commit c301f3d1e05cc7359142a6cfb8222ad65cbdd9eb
Author: Ji Yan 
Date:   2017-05-14T20:15:55Z

syntax fix

commit 7a07742f4e004e0e88aa2b3bc5143adab3689644
Author: Ji Yan 
Date:   2017-05-15T00:30:50Z

pass all tests




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---




[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-04-15 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@srowen @tnachen confirm merge


---



[GitHub] spark pull request #17109: [SPARK-19740][MESOS]Add support in Spark to pass ...

2017-04-15 Thread yanji84
Github user yanji84 commented on a diff in the pull request:

https://github.com/apache/spark/pull/17109#discussion_r111671944
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackendUtil.scala ---
@@ -99,6 +99,26 @@ private[mesos] object MesosSchedulerBackendUtil extends Logging {
       .toList
   }
 
+  /**
+   * Parse a list of docker parameters, each of which
+   * takes the form key=value
+   */
+  private def parseParamsSpec(params: String): List[Parameter] = {
+    params.split(",").map(_.split("=")).flatMap { spec: Array[String] =>
--- End diff --

Sorry, I missed this comment earlier. I set the limit to 2 and pushed the new 
pull request.
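The fix being discussed (passing a limit of 2 to `split` so that values may themselves contain `=`) can be sketched as below. This is a hypothetical reconstruction: the real method builds Mesos `Protos.Parameter` objects and logs warnings through Spark's `Logging` trait, while this sketch uses plain `(key, value)` pairs to stay self-contained.

```scala
// Hypothetical reconstruction of the parseParamsSpec helper from the diff
// above, with split limited to 2 parts so values may contain '='.
object DockerParamsSketch {
  def parseParamsSpec(params: String): List[(String, String)] = {
    params.split(",").map(_.split("=", 2)).flatMap { spec =>
      if (spec.length == 2) Some(spec(0) -> spec(1))
      else None // the real code would warn about a malformed entry here
    }.toList
  }
}
```

Without the limit of 2, `"label=a=b".split("=")` would yield three parts and the entry would be dropped as malformed; with the limit, the value `a=b` survives intact.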


---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-04-11 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@srowen Is there anything else holding this up? Why is it taking so long? 
Thanks!


---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-03-17 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
Hello, any update on this?


---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-03-11 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@mgummelt comments addressed, please take another look


---



[GitHub] spark pull request #17235: [SPARK-19320][MESOS][WIP]allow specifying a hard ...

2017-03-09 Thread yanji84
GitHub user yanji84 opened a pull request:

https://github.com/apache/spark/pull/17235

[SPARK-19320][MESOS][WIP]allow specifying a hard limit on number of gpus required in each spark executor when running on mesos

## What changes were proposed in this pull request?

Currently, Spark only allows specifying GPU resources as an upper limit. This 
adds a new configuration parameter to allow specifying a hard limit on the 
number of GPU cores for each executor. If this hard limit is greater than 0, 
it overrides the effect of spark.mesos.gpus.max.

## How was this patch tested?

Tests pending

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/yanji84/spark ji/hard_limit_on_gpu

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/17235.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #17235


commit 5f8ccd5789137363e035d1dfb9a05d3b9bf3ce6b
Author: Ji Yan 
Date:   2017-03-10T05:30:11Z

respect both gpu and maxgpu




---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-03-09 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@tnachen does the PR look good? Thanks!


---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-03-05 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@tnachen test ready


---



[GitHub] spark issue #17109: [SPARK-19740][MESOS]Add support in Spark to pass arbitra...

2017-03-01 Thread yanji84
Github user yanji84 commented on the issue:

https://github.com/apache/spark/pull/17109
  
@tnachen I will add test


---



[GitHub] spark pull request #17109: [SPARK-19740][MESOS]

2017-02-28 Thread yanji84
GitHub user yanji84 opened a pull request:

https://github.com/apache/spark/pull/17109

[SPARK-19740][MESOS]

## What changes were proposed in this pull request?

Allow passing arbitrary parameters to Docker when launching Spark executors 
on Mesos with the Docker containerizer. @tnachen

## How was this patch tested?

Manually built and tested with passed-in parameters.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/yanji84/spark ji/allow_set_docker_user

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/17109.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #17109


commit 4f8368ea727e5689e96794884b8d1baf3eccb5d5
Author: Ji Yan 
Date:   2017-02-25T22:36:31Z

allow setting docker user when running spark on mesos with docker 
containerizer

commit bba57f9491703b4b06e82144a57660cbafa193ee
Author: Ji Yan 
Date:   2017-02-26T01:34:49Z

allow arbitrary parameters to pass to docker through spark conf

commit ae30e239e574cebc9774087e038aa0853d9939fc
Author: Ji Yan 
Date:   2017-02-26T21:32:45Z

add some debug prints

commit ecb7a8e87589d4b72fe836f91cdf4d8a7e5a53bc
Author: Ji Yan 
Date:   2017-03-01T01:54:34Z

remove debug print




---