[2/2] spark git commit: Preparing development version 1.6.0-SNAPSHOT

2015-12-02 Thread pwendell
Preparing development version 1.6.0-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5d915fed Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5d915fed

[1/2] spark git commit: Preparing Spark release v1.6.0-rc1

2015-12-02 Thread pwendell
Repository: spark Updated Branches: refs/heads/branch-1.6 f449a407f -> 5d915fed3 Preparing Spark release v1.6.0-rc1 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bf525845

Git Push Summary

2015-12-02 Thread pwendell
Repository: spark Updated Tags: refs/tags/v1.6.0-rc1 [created] bf525845c

spark git commit: [SPARK-12094][SQL] Prettier tree string for TreeNode

2015-12-02 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 128c29035 -> a1542ce2f [SPARK-12094][SQL] Prettier tree string for TreeNode When examining plans of complex queries with multiple joins, a pain point of mine is that it's hard to immediately see the sibling node of a specific query plan
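The commit above is about making nested query plans readable at a glance. As a rough illustration of the idea (not Spark's actual `TreeNode.treeString` implementation; the class and branch markers below are simplified), child subtrees can be rendered with distinct branch prefixes so the siblings of a join are visually separated:

```python
# Hypothetical sketch of a "prettier" tree string in the spirit of
# SPARK-12094: indent children with branch markers so sibling subtrees
# (e.g. both sides of a join) are easy to tell apart.

class TreeNode:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def tree_string(self):
        lines = []
        self._build(lines, prefix="", is_last=True, is_root=True)
        return "\n".join(lines)

    def _build(self, lines, prefix, is_last, is_root=False):
        if is_root:
            lines.append(self.name)
            child_prefix = ""
        else:
            # ":- " marks a non-final child, "+- " the final one.
            branch = "+- " if is_last else ":- "
            lines.append(prefix + branch + self.name)
            child_prefix = prefix + ("   " if is_last else ":  ")
        for i, child in enumerate(self.children):
            child._build(lines, child_prefix, i == len(self.children) - 1)

plan = TreeNode("Join", [
    TreeNode("Filter", [TreeNode("Scan a")]),
    TreeNode("Scan b"),
])
print(plan.tree_string())
# Join
# :- Filter
# :  +- Scan a
# +- Scan b
```

With this layout, every sibling of a node shares the same column, which is exactly the readability property the commit message is after.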

spark git commit: [SPARK-12094][SQL] Prettier tree string for TreeNode

2015-12-02 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.6 d79dd971d -> f449a407f [SPARK-12094][SQL] Prettier tree string for TreeNode When examining plans of complex queries with multiple joins, a pain point of mine is that it's hard to immediately see the sibling node of a specific query

spark git commit: [SPARK-12001] Allow partially-stopped StreamingContext to be completely stopped

2015-12-02 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master a1542ce2f -> 452690ba1 [SPARK-12001] Allow partially-stopped StreamingContext to be completely stopped If `StreamingContext.stop()` is interrupted midway through the call, the context will be marked as stopped but certain state will have
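The failure mode described above is worth spelling out: if `stop()` is interrupted partway, a naive "already stopped" flag would prevent any later call from finishing the remaining cleanup. A minimal sketch of the idempotent-teardown idea (the `Context` class and its two steps are invented for illustration, not the real `StreamingContext` internals):

```python
# Hypothetical sketch of the problem behind SPARK-12001: each teardown
# step records its own completion, so a second stop() call after an
# interruption finishes only the steps the first call never reached.

class Context:
    def __init__(self):
        self.scheduler_stopped = False
        self.listener_stopped = False

    def stop(self, fail_after_scheduler=False):
        if not self.scheduler_stopped:
            self.scheduler_stopped = True      # step 1: stop the scheduler
        if fail_after_scheduler:
            raise InterruptedError("stop() interrupted midway")
        if not self.listener_stopped:
            self.listener_stopped = True       # step 2: stop the listener bus

ctx = Context()
try:
    ctx.stop(fail_after_scheduler=True)        # first attempt dies midway
except InterruptedError:
    pass
ctx.stop()                                     # second attempt completes cleanup
```

Because each step guards itself rather than the whole method, stopping becomes safely re-entrant, which is the property the commit title promises.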

spark git commit: [SPARK-10266][DOCUMENTATION, ML] Fixed @Since annotation for ml.tunning

2015-12-02 Thread meng
Repository: spark Updated Branches: refs/heads/master 452690ba1 -> de07d06ab [SPARK-10266][DOCUMENTATION, ML] Fixed @Since annotation for ml.tunning cc mengxr noel-smith I worked on these issues based on https://github.com/apache/spark/pull/8729. ehsanmok, thank you for your contribution!

spark git commit: [SPARK-10266][DOCUMENTATION, ML] Fixed @Since annotation for ml.tunning

2015-12-02 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.6 5d915fed3 -> 911259e9a [SPARK-10266][DOCUMENTATION, ML] Fixed @Since annotation for ml.tunning cc mengxr noel-smith I worked on these issues based on https://github.com/apache/spark/pull/8729. ehsanmok, thank you for your contribution!

spark git commit: [SPARK-12093][SQL] Fix the error of comment in DDLParser

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.6 911259e9a -> cb142fd1e [SPARK-12093][SQL] Fix the error of comment in DDLParser Author: Yadong Qi Closes #10096 from watermen/patch-1. (cherry picked from commit d0d7ec533062151269b300ed455cf150a69098c0)

spark git commit: [SPARK-12093][SQL] Fix the error of comment in DDLParser

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/master de07d06ab -> d0d7ec533 [SPARK-12093][SQL] Fix the error of comment in DDLParser Author: Yadong Qi Closes #10096 from watermen/patch-1. Project: http://git-wip-us.apache.org/repos/asf/spark/repo

spark git commit: [SPARK-12000] do not specify arg types when reference a method in ScalaDoc

2015-12-02 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.6 cb142fd1e -> 656d44e20 [SPARK-12000] do not specify arg types when reference a method in ScalaDoc This fixes SPARK-12000, verified on my local with JDK 7. It seems that `scaladoc` tries to match method names and gets messed up with

spark git commit: [SPARK-12000] do not specify arg types when reference a method in ScalaDoc

2015-12-02 Thread meng
Repository: spark Updated Branches: refs/heads/master d0d7ec533 -> 9bb695b7a [SPARK-12000] do not specify arg types when reference a method in ScalaDoc This fixes SPARK-12000, verified on my local with JDK 7. It seems that `scaladoc` tries to match method names and gets messed up with annotations.

spark git commit: [SPARK-12109][SQL] Expressions's simpleString should delegate to its toString.

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/master ae4025337 -> ec2b6c26c [SPARK-12109][SQL] Expressions's simpleString should delegate to its toString. https://issues.apache.org/jira/browse/SPARK-12109 The change of https://issues.apache.org/jira/browse/SPARK-11596 exposed the problem.
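The fix above is a delegation pattern: an expression's short display form should reuse its full string form instead of maintaining a second rendering that can drift out of sync. A toy sketch of the pattern (the `Expression` class and `simple_string` name here are illustrative, not Catalyst's actual API):

```python
# Hypothetical sketch of the delegation in SPARK-12109: simple_string()
# just calls the canonical __str__, so there is only one rendering to
# keep correct.

class Expression:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def __str__(self):
        if not self.children:
            return self.name
        return f"{self.name}({', '.join(str(c) for c in self.children)})"

    def simple_string(self):
        # Delegate, rather than duplicating the formatting logic.
        return str(self)

expr = Expression("add", [Expression("a"), Expression("b")])
print(expr.simple_string())   # add(a, b)
```

Keeping one source of truth for the rendering is what prevents the kind of divergence that SPARK-11596 exposed.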

spark git commit: [SPARK-12109][SQL] Expressions's simpleString should delegate to its toString.

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.6 6914ee9f0 -> 6674fd8aa [SPARK-12109][SQL] Expressions's simpleString should delegate to its toString. https://issues.apache.org/jira/browse/SPARK-12109 The change of https://issues.apache.org/jira/browse/SPARK-11596 exposed the

spark git commit: [SPARK-12082][FLAKY-TEST] Increase timeouts in NettyBlockTransferSecuritySuite

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.6 656d44e20 -> 6914ee9f0 [SPARK-12082][FLAKY-TEST] Increase timeouts in NettyBlockTransferSecuritySuite We should try increasing a timeout in NettyBlockTransferSecuritySuite in order to reduce that suite's flakiness in Jenkins.

spark git commit: [SPARK-12082][FLAKY-TEST] Increase timeouts in NettyBlockTransferSecuritySuite

2015-12-02 Thread rxin
Repository: spark Updated Branches: refs/heads/master 9bb695b7a -> ae4025337 [SPARK-12082][FLAKY-TEST] Increase timeouts in NettyBlockTransferSecuritySuite We should try increasing a timeout in NettyBlockTransferSecuritySuite in order to reduce that suite's flakiness in Jenkins.

spark git commit: [SPARK-3580][CORE] Add Consistent Method To Get Number of RDD Partitions Across Different Languages

2015-12-02 Thread srowen
Repository: spark Updated Branches: refs/heads/master 4375eb3f4 -> 128c29035 [SPARK-3580][CORE] Add Consistent Method To Get Number of RDD Partitions Across Different Languages I have tried to address all the comments in pull request https://github.com/apache/spark/pull/2447. Note that the
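The point of the change above is API consistency: before it, the partition count was read differently in each binding (e.g. `rdd.partitions.size` in Scala versus a `getNumPartitions()` call in Python). A minimal sketch of the idea with an invented stand-in class (not the real RDD API surface):

```python
# Hypothetical sketch of SPARK-3580's goal: one canonical accessor,
# getNumPartitions, that every language binding can expose identically,
# instead of each binding inventing its own spelling.

class FakeRDD:
    def __init__(self, partitions):
        self.partitions = partitions       # list of partition objects

    def getNumPartitions(self):
        # The single shared accessor; bindings delegate here rather
        # than reaching into .partitions directly.
        return len(self.partitions)

rdd = FakeRDD(partitions=[object() for _ in range(4)])
print(rdd.getNumPartitions())   # 4
```

A uniform accessor name means user code and documentation translate directly between the Scala, Java, and Python APIs.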

spark git commit: [SPARK-3580][CORE] Add Consistent Method To Get Number of RDD Partitions Across Different Languages

2015-12-02 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.6 c47a7373a -> d79dd971d [SPARK-3580][CORE] Add Consistent Method To Get Number of RDD Partitions Across Different Languages I have tried to address all the comments in pull request https://github.com/apache/spark/pull/2447. Note that