spark git commit: [SPARK-10003] Improve readability of DAGScheduler

2015-09-03 Thread kayousterhout
Repository: spark Updated Branches: refs/heads/master 208fbca10 -> cf4213864 [SPARK-10003] Improve readability of DAGScheduler Note: this is not intended to be in Spark 1.5! This patch rewrites some code in the `DAGScheduler` to make it more readable. In particular, there were blocks of cod

spark git commit: [SPARK-10421] [BUILD] Exclude curator artifacts from tachyon dependencies.

2015-09-03 Thread vanzin
Repository: spark Updated Branches: refs/heads/master 08b075097 -> 208fbca10 [SPARK-10421] [BUILD] Exclude curator artifacts from tachyon dependencies. This avoids them being mistakenly pulled instead of the newer ones that Spark actually uses. Spark only depends on these artifacts transitivel
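
The mechanics of such an exclusion, as a minimal SBT-style sketch (the Tachyon coordinates and version here are illustrative assumptions, not the actual change, which touches Spark's Maven build):

    // Keep tachyon-client's transitive (older) curator artifacts off the
    // classpath so the curator version Spark itself declares is the one used.
    libraryDependencies += "org.tachyonproject" % "tachyon-client" % "0.7.1" excludeAll (
      ExclusionRule(organization = "org.apache.curator")
    )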

spark git commit: [SPARK-10435] Spark submit should fail fast for Mesos cluster mode with R

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master db4c130f9 -> 08b075097 [SPARK-10435] Spark submit should fail fast for Mesos cluster mode with R It's not supported yet, so we should error out with a clear message. Author: Andrew Or Closes #8590 from andrewor14/mesos-cluster-r-guard.
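
A hypothetical sketch of what such a fail-fast guard looks like in spark-submit argument validation (names and message are illustrative, not the actual patch):

    // Reject unsupported combinations at submit time instead of failing later.
    def validateSubmitArgs(master: String, deployMode: String, isR: Boolean): Unit = {
      if (master.startsWith("mesos") && deployMode == "cluster" && isR) {
        throw new IllegalArgumentException(
          "Cluster deploy mode is currently not supported for R applications on Mesos.")
      }
    }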

spark git commit: [SPARK-9591] [CORE] Job may fail due to exception when getting remote block

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 11ef32c5a -> db4c130f9 [SPARK-9591] [CORE] Job may fail due to exception when getting remote block [SPARK-9591](https://issues.apache.org/jira/browse/SPARK-9591) When getting the broadcast variable, we can fetch the block from several lo
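
The underlying idea: a block may be replicated at several locations, so an exception from one location should be tolerated and the next location tried, with the job failing only when every location has failed. A generic sketch, not the actual code:

    // Try each candidate location in turn; surface an error only if all fail.
    def fetchBlock[T](locations: Seq[String], fetch: String => T): T = {
      var lastError: Throwable = null
      for (loc <- locations) {
        try {
          return fetch(loc)
        } catch {
          case e: Exception => lastError = e // remember it, try the next location
        }
      }
      throw new java.io.IOException("Failed to fetch block from all locations", lastError)
    }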

spark git commit: [SPARK-10430] [CORE] Added hashCode methods in AccumulableInfo and RDDOperationScope

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master e62f4a46f -> 11ef32c5a [SPARK-10430] [CORE] Added hashCode methods in AccumulableInfo and RDDOperationScope Author: Vinod K C Closes #8581 from vinodkc/fix_RDDOperationScope_Hashcode.
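
The motivation is the standard equals/hashCode contract: a class that overrides equals must also override hashCode so that equal instances produce equal hash codes. A self-contained sketch of the pattern (ScopeId is a made-up stand-in, not the Spark classes):

    class ScopeId(val id: String, val name: String) {
      override def equals(other: Any): Boolean = other match {
        case o: ScopeId => id == o.id && name == o.name
        case _          => false
      }
      // Equal objects must hash identically, or hash-based collections break.
      override def hashCode: Int = java.util.Objects.hash(id, name)
    }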

spark git commit: [SPARK-9672] [MESOS] Don’t include SPARK_ENV_LOADED when passing env vars

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 754f853b0 -> e62f4a46f [SPARK-9672] [MESOS] Don’t include SPARK_ENV_LOADED when passing env vars This contribution is my original work and I license the work to the project under the project's open source license. Author: Pat Shields
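
The gist: SPARK_ENV_LOADED is a guard variable marking spark-env.sh as already sourced, so forwarding it suppresses environment setup on the receiving side. A one-line sketch of the kind of filtering involved (illustrative, not the patch):

    // Drop the guard variable so spark-env.sh gets sourced again downstream.
    val forwarded = sys.env.filter { case (k, _) => k != "SPARK_ENV_LOADED" }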

spark git commit: [SPARK-10431] [CORE] Fix intermittent test failure. Wait for event queue to be clear

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.5 f945b641c -> 4d6333597 [SPARK-10431] [CORE] Fix intermittent test failure. Wait for event queue to be clear Author: robbins Closes #8582 from robbinspg/InputOutputMetricsSuite.

spark git commit: [SPARK-9869] [STREAMING] Wait for all event notifications before asserting results

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.5 f01a96713 -> f945b641c [SPARK-9869] [STREAMING] Wait for all event notifications before asserting results Author: robbins Closes #8589 from robbinspg/InputStreamSuite-fix. (cherry picked from commit 754f853b02e9fd221f138c2446445fd56

spark git commit: [SPARK-9869] [STREAMING] Wait for all event notifications before asserting results

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master d911c682f -> 754f853b0 [SPARK-9869] [STREAMING] Wait for all event notifications before asserting results Author: robbins Closes #8589 from robbinspg/InputStreamSuite-fix.

spark git commit: [SPARK-10431] [CORE] Fix intermittent test failure. Wait for event queue to be clear

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 49aff7b9a -> d911c682f [SPARK-10431] [CORE] Fix intermittent test failure. Wait for event queue to be clear Author: robbins Closes #8582 from robbinspg/InputOutputMetricsSuite.
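
Both test fixes above (SPARK-9869 and SPARK-10431) apply the same remedy: Spark posts listener events asynchronously, so a test must drain the event queue before asserting on what the listeners recorded. Spark's suites use the listener bus's waitUntilEmpty helper for this; a generic sketch of the idea:

    // Block until the queue drains or the timeout expires, then assert.
    def waitUntilEmpty(queueSize: () => Int, timeoutMs: Long): Unit = {
      val deadline = System.currentTimeMillis() + timeoutMs
      while (queueSize() > 0) {
        if (System.currentTimeMillis() > deadline) {
          throw new java.util.concurrent.TimeoutException("Event queue never drained")
        }
        Thread.sleep(10)
      }
    }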

spark git commit: [SPARK-10432] spark.port.maxRetries documentation is unclear

2015-09-03 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master af0e3125c -> 49aff7b9a [SPARK-10432] spark.port.maxRetries documentation is unclear Author: Tom Graves Closes #8585 from tgravescs/SPARK-10432.
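
For reference, spark.port.maxRetries bounds how many successive ports are tried when a bind fails: each retry increments the previous port by one, so a service configured with base port p may end up anywhere in p..p+maxRetries. A plain usage example:

    val conf = new org.apache.spark.SparkConf()
      .set("spark.port.maxRetries", "32") // try up to 32 ports above the base port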

spark git commit: [SPARK-8951] [SPARKR] support Unicode characters in collect()

2015-09-03 Thread shivaram
Repository: spark Updated Branches: refs/heads/master 3abc0d512 -> af0e3125c [SPARK-8951] [SPARKR] support Unicode characters in collect() Spark gives an error message and does not show the output when a field of the result DataFrame contains CJK characters. I changed SerDe.scala in order
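
The crux of the bug: length-prefixing a string by its character count breaks once characters encode to more than one byte, as CJK characters do in UTF-8. A sketch of the corrected write side (illustrative, not the actual SerDe.scala code):

    // Prefix with the UTF-8 *byte* length, not the character count, so
    // multi-byte characters round-trip between the JVM and the R process.
    def writeString(out: java.io.DataOutputStream, value: String): Unit = {
      val bytes = value.getBytes("UTF-8")
      out.writeInt(bytes.length)
      out.write(bytes)
    }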

spark git commit: [SPARK-9596] [SQL] treat hadoop classes as shared ones in IsolatedClientLoader

2015-09-03 Thread marmbrus
Repository: spark Updated Branches: refs/heads/master 67580f1f5 -> 3abc0d512 [SPARK-9596] [SQL] treat hadoop classes as shared ones in IsolatedClientLoader https://issues.apache.org/jira/browse/SPARK-9596 Author: WangTaoTheTonic Closes #7931 from WangTaoTheTonic/SPARK-9596.
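
IsolatedClientLoader decides class by class whether to delegate to the application classloader ("shared") or to the isolated Hive classloader; treating Hadoop classes as shared avoids two incompatible copies of Hadoop types. A sketch of such a predicate (the prefix list is illustrative):

    def isSharedClass(name: String): Boolean =
      name.startsWith("org.apache.hadoop.") ||
      name.startsWith("scala.") ||
      name.startsWith("java.")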

spark git commit: [SPARK-10332] [CORE] Fix yarn spark executor validation

2015-09-03 Thread srowen
Repository: spark Updated Branches: refs/heads/master 0349b5b43 -> 67580f1f5 [SPARK-10332] [CORE] Fix yarn spark executor validation From Jira: Running spark-submit on YARN with --num-executors equal to 0 when not using dynamic allocation should error out. In Spark 1.5.0 it continues and
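
A minimal sketch of the validation being added (names are illustrative): with dynamic allocation disabled, zero executors can never make progress, so the submission should be rejected up front.

    def validateNumExecutors(numExecutors: Int, dynamicAllocationEnabled: Boolean): Unit = {
      if (!dynamicAllocationEnabled) {
        require(numExecutors > 0,
          s"Number of executors must be a positive number, got $numExecutors")
      }
    }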

spark git commit: [SPARK-10332] [CORE] Fix yarn spark executor validation

2015-09-03 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.5 94404ee53 -> f01a96713 [SPARK-10332] [CORE] Fix yarn spark executor validation From Jira: Running spark-submit on YARN with --num-executors equal to 0 when not using dynamic allocation should error out. In Spark 1.5.0 it continues