svn commit: r24128 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_22_01-317b0aa-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Thu Jan 11 06:16:07 2018 New Revision: 24128 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_22_01-317b0aa docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

svn commit: r24127 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_20_01-a6647ff-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Thu Jan 11 04:15:17 2018 New Revision: 24127 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_20_01-a6647ff docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 551ccfba5 -> 317b0aaed [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url ## What changes were proposed in this pull request? Comparing the two filesystems does not take the URI authority into account. This is
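The gist of SPARK-22587 is that two filesystem URIs were compared without considering the authority (host:port) component. A minimal plain-Python sketch of the idea (not Spark's actual Scala code; `same_filesystem` is a hypothetical helper for illustration):

```python
from urllib.parse import urlparse

def same_filesystem(uri_a: str, uri_b: str) -> bool:
    """Compare two filesystem URIs by scheme AND authority.

    Comparing the scheme alone would wrongly treat
    hdfs://nn1:8020/... and hdfs://nn2:8020/... as the same
    filesystem; the authority (host:port) must match too.
    """
    a, b = urlparse(uri_a), urlparse(uri_b)
    return (a.scheme, a.netloc) == (b.scheme, b.netloc)

# Same scheme but different authority: two distinct filesystems.
print(same_filesystem("hdfs://nn1:8020/app.jar", "hdfs://nn2:8020/data"))  # False
print(same_filesystem("hdfs://nn1:8020/app.jar", "hdfs://nn1:8020/data"))  # True
```

Without the authority check, an application jar on one namenode and a `fs.defaultFS` pointing at another would be misclassified as the same filesystem.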

spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark Updated Branches: refs/heads/master 9b33dfc40 -> a6647ffbf [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url ## What changes were proposed in this pull request? Comparing the two filesystems does not take the URI authority into account. This is

svn commit: r24124 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_18_01-551ccfb-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Thu Jan 11 02:15:27 2018 New Revision: 24124 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_18_01-551ccfb docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-23009][PYTHON] Fix for non-str col names to createDataFrame from Pandas

2018-01-10 Thread gurwls223
Repository: spark Updated Branches: refs/heads/branch-2.3 eb4fa551e -> 551ccfba5 [SPARK-23009][PYTHON] Fix for non-str col names to createDataFrame from Pandas ## What changes were proposed in this pull request? This addresses the case when calling `SparkSession.createDataFrame` using a Pandas
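SPARK-23009 concerns Pandas DataFrames whose column labels are not strings (pandas defaults to integer labels), while Spark schema field names must be strings. A hedged sketch of the coercion step (`normalize_column_names` is a hypothetical helper, not Spark's internal function):

```python
def normalize_column_names(columns):
    """Coerce column labels to strings.

    A pandas DataFrame may carry non-string column labels
    (ints, floats, tuples); Spark schema field names must be
    strings, so each label is cast with str() before the
    schema is built.
    """
    return [c if isinstance(c, str) else str(c) for c in columns]

# Integer column labels, as pandas produces by default:
print(normalize_column_names([0, 1, "name"]))  # ['0', '1', 'name']
```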

svn commit: r24120 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_16_01-9b33dfc-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Thu Jan 11 00:15:19 2018 New Revision: 24120 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_16_01-9b33dfc docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-22951][SQL] fix aggregation after dropDuplicates on empty data frames

2018-01-10 Thread lian
Repository: spark Updated Branches: refs/heads/master 344e3aab8 -> 9b33dfc40 [SPARK-22951][SQL] fix aggregation after dropDuplicates on empty data frames ## What changes were proposed in this pull request? (courtesy of liancheng) Spark SQL supports both global aggregation and grouping
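The distinction SPARK-22951 hinges on is global versus grouped aggregation on an empty input: a global aggregate (no GROUP BY) must still return exactly one row, while a grouped aggregate returns zero rows. A plain-Python analogy of the two semantics (illustrative only, not Spark's planner code):

```python
def count_rows(rows, group_key=None):
    """Global vs. grouped aggregation on an empty input.

    Grouped aggregation over zero rows yields zero groups, but
    a global aggregate must still yield exactly one result row
    (e.g. count() == 0) even when the input is empty.
    """
    if group_key is None:
        return [len(rows)]  # global: always exactly one result row
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row)
    return [len(v) for v in groups.values()]  # grouped: one row per group

print(count_rows([]))                  # [0]  -- one row, count is 0
print(count_rows([], group_key="k"))   # []   -- zero groups, zero rows
```

The reported bug was that after `dropDuplicates()` rewrote the plan, an empty frame's global aggregate behaved like the grouped case.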

spark git commit: [SPARK-22951][SQL] fix aggregation after dropDuplicates on empty data frames

2018-01-10 Thread lian
Repository: spark Updated Branches: refs/heads/branch-2.3 5b5851cb6 -> eb4fa551e [SPARK-22951][SQL] fix aggregation after dropDuplicates on empty data frames ## What changes were proposed in this pull request? (courtesy of liancheng) Spark SQL supports both global aggregation and grouping

svn commit: r24116 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_12_01-344e3aa-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Wed Jan 10 20:15:34 2018 New Revision: 24116 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_12_01-344e3aa docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

svn commit: r24115 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_10_01-5b5851c-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Wed Jan 10 18:15:40 2018 New Revision: 24115 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_10_01-5b5851c docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-23019][CORE] Wait until SparkContext.stop() finished in SparkLauncherSuite

2018-01-10 Thread vanzin
Repository: spark Updated Branches: refs/heads/master f340b6b30 -> 344e3aab8 [SPARK-23019][CORE] Wait until SparkContext.stop() finished in SparkLauncherSuite ## What changes were proposed in this pull request? In the current code, the function `waitFor` call
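The flakiness SPARK-23019 targets is a test racing ahead while shutdown is still in flight: the fix is to block until `stop()` has actually completed. A toy plain-Python sketch of the pattern (`FakeContext` is a hypothetical stand-in, not SparkContext's real API):

```python
import threading
import time

class FakeContext:
    """Toy stand-in for a context whose stop() completes
    asynchronously; callers must wait for teardown to finish
    rather than racing ahead."""

    def __init__(self):
        self._stopped = threading.Event()

    def stop(self):
        def _shutdown():
            time.sleep(0.05)        # simulated teardown work
            self._stopped.set()     # signal that stop() really finished
        threading.Thread(target=_shutdown, daemon=True).start()

    def wait_until_stopped(self, timeout=5.0) -> bool:
        return self._stopped.wait(timeout)

ctx = FakeContext()
ctx.stop()
assert ctx.wait_until_stopped(), "stop() did not finish in time"
print("context fully stopped")
```

The key design point is waiting on an explicit completion signal with a timeout instead of sleeping for a fixed duration and hoping shutdown has finished.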

spark git commit: [SPARK-23019][CORE] Wait until SparkContext.stop() finished in SparkLauncherSuite

2018-01-10 Thread vanzin
Repository: spark Updated Branches: refs/heads/branch-2.3 60d4d79bb -> 5b5851cb6 [SPARK-23019][CORE] Wait until SparkContext.stop() finished in SparkLauncherSuite ## What changes were proposed in this pull request? In the current code, the function `waitFor` call

spark git commit: [SPARK-22972] Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc

2018-01-10 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.2 24f1f2a54 -> 0d943d96b [SPARK-22972] Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc ## What changes were proposed in this pull request? Fix the warning: Couldn't find corresponding Hive

svn commit: r24114 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_04_01-f340b6b-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Wed Jan 10 12:15:16 2018 New Revision: 24114 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_04_01-f340b6b docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-22997] Add additional defenses against use of freed MemoryBlocks

2018-01-10 Thread joshrosen
Repository: spark Updated Branches: refs/heads/branch-2.3 2db523959 -> 60d4d79bb [SPARK-22997] Add additional defenses against use of freed MemoryBlocks ## What changes were proposed in this pull request? This patch modifies Spark's `MemoryAllocator` implementations so that

spark git commit: [SPARK-22997] Add additional defenses against use of freed MemoryBlocks

2018-01-10 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master 70bcc9d5a -> f340b6b30 [SPARK-22997] Add additional defenses against use of freed MemoryBlocks ## What changes were proposed in this pull request? This patch modifies Spark's `MemoryAllocator` implementations so that `free(MemoryBlock)`
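SPARK-22997 hardens `free(MemoryBlock)` so that later use of a freed block fails fast instead of silently touching recycled memory. A plain-Python sketch of that guard pattern (illustrative only; Spark's real `MemoryAllocator` is Java code operating on raw offsets, and these class names are hypothetical):

```python
class MemoryBlock:
    def __init__(self, size):
        self.data = bytearray(size)
        self.freed = False

class CheckedAllocator:
    """Use-after-free guard: free() marks the block, and every
    subsequent access raises instead of silently reading
    memory that may have been recycled."""

    def allocate(self, size):
        return MemoryBlock(size)

    def free(self, block):
        assert not block.freed, "double free detected"
        block.freed = True
        block.data = None           # drop the backing storage

    def read(self, block, i):
        if block.freed:
            raise RuntimeError("use of freed MemoryBlock")
        return block.data[i]

alloc = CheckedAllocator()
blk = alloc.allocate(16)
print(alloc.read(blk, 0))           # fresh bytearray is zeroed, prints 0
alloc.free(blk)
try:
    alloc.read(blk, 0)
except RuntimeError as e:
    print(e)                        # use of freed MemoryBlock
```

Poisoning freed blocks turns a subtle memory-corruption bug into an immediate, diagnosable error.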

svn commit: r24111 - in /dev/spark/2.3.0-SNAPSHOT-2018_01_10_00_01-70bcc9d-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-01-10 Thread pwendell
Author: pwendell Date: Wed Jan 10 08:16:20 2018 New Revision: 24111 Log: Apache Spark 2.3.0-SNAPSHOT-2018_01_10_00_01-70bcc9d docs [This commit notification would consist of 1439 parts, which exceeds the limit of 50, so it was shortened to this summary.]