[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-27 Thread asfgit
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/4770

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread srowen
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76062964 Yes, do you mind closing this PR? I think the same underlying issue of temp file cleanup is discussed in https://issues.apache.org/jira/browse/SPARK-5836 and I think the d

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25380094 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25379441 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76057882

> So yarn will use different local dir for launching each executor

But that leaves the case of a single executor being tied to a single local disk after your patch

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25378739 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76057426 Also, using multiple disks for each executor provides speed, but failure of any of the disks will fail all executors on that node. On the other hand, using one d

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76056141 I thought about that case too. Since we will have many executors on one node, YARN will use a different local dir for launching each executor, and that w

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25377996 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76045346 There is also a second thing that is broken by this patch: `YARN_LOCAL_DIRS` can actually be multiple directories, as the name implies. The BlockManager uses that to distr
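To make that objection concrete, here is a minimal Scala sketch of the behavior being described: a comma-separated LOCAL_DIRS / YARN_LOCAL_DIRS value is split into several root directories so block and shuffle files can be spread across disks. This is not the actual Utils code from the truncated diff above; the object and the helper pickLocalDir are hypothetical names for illustration.

    object LocalDirsSketch {
      // Either variable may hold a comma-separated list such as
      // "/disk1/yarn/local,/disk2/yarn/local" (LOCAL_DIRS on Hadoop 2.x,
      // YARN_LOCAL_DIRS on 0.23.x).
      def yarnLocalDirs(env: Map[String, String]): Seq[String] = {
        val raw = env.get("YARN_LOCAL_DIRS").orElse(env.get("LOCAL_DIRS")).getOrElse("")
        require(raw.nonEmpty, "YARN local dirs can't be empty")
        raw.split(",").map(_.trim).filter(_.nonEmpty).toSeq
      }

      // Hash a file name across the available roots, roughly how a block
      // manager would spread files over multiple local disks.
      def pickLocalDir(roots: Seq[String], fileName: String): String = {
        val idx = java.lang.Math.floorMod(fileName.hashCode, roots.size)
        roots(idx)
      }
    }

Replacing that list with the single container working directory keeps cleanup simple but pins each executor's scratch I/O to one disk, which is the concern raised here.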

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25373313 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25373106 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25370530 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76036782 @pankajarora12 Data generated by `DiskBlockManager` cannot be deleted since it may be used by other executors when using the external shuffle service. You may be able to o
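For context on why those files must outlive the executor: with the external shuffle service, the YARN NodeManager serves shuffle blocks written by executors that may already have exited. The standard configuration keys are shown below; the values are only an example of a setup where this matters.

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      // Shuffle blocks are served by the NodeManager-side shuffle service,
      // so the files must remain readable after the executor process exits.
      .set("spark.shuffle.service.enabled", "true")
      // Dynamic allocation is the common reason executors exit while their
      // shuffle output is still needed by other executors.
      .set("spark.dynamicAllocation.enabled", "true")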

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25370059 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25370157 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25369938 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25369644 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread vanzin
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25369132 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread srowen
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25368698 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
Github user pankajarora12 commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25367786 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** G

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread srowen
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/4770#discussion_r25367198 --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala --- @@ -715,12 +715,8 @@ private[spark] object Utils extends Logging { /** Get the

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/4770#issuecomment-76028290 Can one of the admins verify this patch?

[GitHub] spark pull request: [CORE][YARN] SPARK-6011: Used Current Working ...

2015-02-25 Thread pankajarora12
GitHub user pankajarora12 opened a pull request: https://github.com/apache/spark/pull/4770 [CORE][YARN] SPARK-6011: Used Current Working directory for spark local dirs instead of Application Directory so that spark local files get deleted when executor exits abruptly. Spark uses cur
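A rough sketch of the idea behind the patch, assuming the intent described in the (truncated) PR description: instead of reading YARN_LOCAL_DIRS / LOCAL_DIRS, the executor would place its scratch files under the container's current working directory, which the NodeManager deletes when the container exits, even on abrupt failure. This is not the exact diff (which edits Utils.scala as shown in the review comments above); the object and method names are hypothetical.

    // Sketch only; the actual change edits
    // core/src/main/scala/org/apache/spark/util/Utils.scala.
    object CwdLocalDirSketch {
      // A YARN container's working directory is created and removed by the
      // NodeManager, so anything written under it is cleaned up with the
      // container, even when the executor dies abruptly.
      def containerLocalDir(): String =
        new java.io.File(System.getProperty("user.dir"), "spark-local").getAbsolutePath
    }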