spark git commit: [SPARK-25710][SQL] range should report metrics correctly

2018-10-12 Thread wenchen
Repository: spark Updated Branches: refs/heads/master c9ba59d38 -> 34f229bc2 [SPARK-25710][SQL] range should report metrics correctly ## What changes were proposed in this pull request? Currently `Range` reports metrics in batch granularity. This is acceptable, but it's better if we can
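
As a rough illustration of the per-row metric this change is about, the sketch below runs a small range query and reads back the Range operator's "number of output rows" metric on the driver. It is not taken from the commit; `RangeExec`, `SparkPlan.metrics` and the `"numOutputRows"` key are internal names and should be treated as assumptions.

```scala
// Hedged sketch: inspect the Range operator's row-count metric after an action.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.RangeExec

val spark = SparkSession.builder().master("local[*]").appName("range-metrics").getOrCreate()

val df = spark.range(0, 1000, 1, numPartitions = 4)
df.collect()  // run the query so executors send their metric updates back to the driver

// Find the Range physical node and read its "numOutputRows" SQLMetric (internal API).
val rangeNode = df.queryExecution.executedPlan.collectFirst { case r: RangeExec => r }.get
println(rangeNode.metrics("numOutputRows").value)  // expected to reach 1000
```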

svn commit: r30032 - in /dev/spark/2.4.1-SNAPSHOT-2018_10_12_22_02-5554a33-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Sat Oct 13 05:19:00 2018 New Revision: 30032 Log: Apache Spark 2.4.1-SNAPSHOT-2018_10_12_22_02-5554a33 docs [This commit notification would consist of 1472 parts, which exceeds the limit of 50, so it was shortened to this summary.]

svn commit: r30031 - in /dev/spark/2.3.3-SNAPSHOT-2018_10_12_22_02-b3d1b1b-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Sat Oct 13 05:17:23 2018 New Revision: 30031 Log: Apache Spark 2.3.3-SNAPSHOT-2018_10_12_22_02-b3d1b1b docs [This commit notification would consist of 1443 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: Revert "[SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification"

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.3 182bc85f2 -> b3d1b1bcb Revert "[SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification" This reverts commit 182bc85f2db0b3268b9b93ff91210811b00e1636. Project: http://git-wip-us.apache.org/repos/asf/spark/repo

spark git commit: [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.3 5324a85a2 -> 182bc85f2 [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification ## What changes were proposed in this pull request? ```Scala val df1 = Seq(("abc", 1), (null, 3)).toDF("col1", "col2")

spark git commit: [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.4 0f58b989d -> 5554a33f2 [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification ## What changes were proposed in this pull request? ```Scala val df1 = Seq(("abc", 1), (null, 3)).toDF("col1", "col2")

spark git commit: [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/master 3946de773 -> c9ba59d38 [SPARK-25714] Fix Null Handling in the Optimizer rule BooleanSimplification ## What changes were proposed in this pull request? ```Scala val df1 = Seq(("abc", 1), (null, 3)).toDF("col1", "col2")
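
The snippet above is the start of the PR's reproducer (the same fix also appears on branch-2.3 and branch-2.4 higher up in this digest). As a sketch of why null handling matters for this rule, consider the kind of predicate involved; the filter expression here is an illustration, not the exact query from the PR.

```scala
// Under three-valued logic, `col1 != 'abc' AND col2 = 3` evaluates to NULL when col1 is NULL,
// so the whole OR is NULL for the (null, 3) row and that row must be dropped by the filter.
// A simplification that treats the predicate as two-valued boolean algebra would wrongly keep it.
import spark.implicits._  // assumes a SparkSession named `spark`, as in spark-shell

val df1 = Seq(("abc", 1), (null, 3)).toDF("col1", "col2")
df1.filter("col1 = 'abc' OR (col1 != 'abc' AND col2 = 3)").show()
// correct output: only the ("abc", 1) row
```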

svn commit: r30030 - in /dev/spark/3.0.0-SNAPSHOT-2018_10_12_20_02-3946de7-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Sat Oct 13 03:17:06 2018 New Revision: 30030 Log: Apache Spark 3.0.0-SNAPSHOT-2018_10_12_20_02-3946de7 docs [This commit notification would consist of 1481 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-20327][CORE][YARN] Add CLI support for YARN custom resources, like GPUs

2018-10-12 Thread vanzin
Repository: spark Updated Branches: refs/heads/master 1ddfab8c4 -> 3946de773 [SPARK-20327][CORE][YARN] Add CLI support for YARN custom resources, like GPUs ## What changes were proposed in this pull request? This PR adds CLI support for YARN custom resources, e.g. GPUs and any other
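
A hedged configuration sketch of what requesting such resources might look like; the property names below follow the `spark.yarn.{driver,executor}.resource.*` namespace this change introduces, but the exact keys, the `yarn.io/gpu` resource name and the values are assumptions, not confirmed flags.

```scala
// Illustrative only: hypothetical resource requests against a YARN cluster with GPU
// resource types configured. Verify the actual property names against the Spark docs.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("yarn")
  .appName("gpu-demo")                                      // arbitrary app name
  .config("spark.yarn.driver.resource.yarn.io/gpu", "1")    // hypothetical key and value
  .config("spark.yarn.executor.resource.yarn.io/gpu", "2")  // hypothetical key and value
  .getOrCreate()
```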

svn commit: r30027 - in /dev/spark/3.0.0-SNAPSHOT-2018_10_12_16_02-4e141a4-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 23:16:52 2018 New Revision: 30027 Log: Apache Spark 3.0.0-SNAPSHOT-2018_10_12_16_02-4e141a4 docs [This commit notification would consist of 1481 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-19287][CORE][STREAMING] JavaPairRDD flatMapValues requires function returning Iterable, not Iterator

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master e965fb55a -> 1ddfab8c4 [SPARK-19287][CORE][STREAMING] JavaPairRDD flatMapValues requires function returning Iterable, not Iterator ## What changes were proposed in this pull request? Fix old oversight in API: Java `flatMapValues` needs a
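
The change itself is to the Java wrapper's parameter type; purely as a reminder of the intended semantics (one input value expanding to many output pairs with the key preserved), here is the Scala counterpart, which is not affected by the fix.

```scala
// Scala flatMapValues, shown only to illustrate the semantics the Java API should match.
// Assumes a SparkContext named `sc`, as in spark-shell.
val pairs = sc.parallelize(Seq(("a", 1), ("b", 2)))
val expanded = pairs.flatMapValues(v => Seq(v, v * 10))  // each value expands to two values
expanded.collect()  // expected contents: ("a",1), ("a",10), ("b",2), ("b",20)
```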

spark git commit: [SPARK-25664][SQL][TEST] Refactor JoinBenchmark to use main method

2018-10-12 Thread dongjoon
Repository: spark Updated Branches: refs/heads/master 4e141a416 -> e965fb55a [SPARK-25664][SQL][TEST] Refactor JoinBenchmark to use main method ## What changes were proposed in this pull request? Refactor `JoinBenchmark` to use main method. 1. use `spark-submit`: ```console bin/spark-submit

svn commit: r30025 - in /dev/spark/2.4.1-SNAPSHOT-2018_10_12_14_02-0f58b98-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 21:16:38 2018 New Revision: 30025 Log: Apache Spark 2.4.1-SNAPSHOT-2018_10_12_14_02-0f58b98 docs [This commit notification would consist of 1472 parts, which exceeds the limit of 50, so it was shortened to this summary.]

svn commit: r30024 - in /dev/spark/3.0.0-SNAPSHOT-2018_10_12_12_02-8e039a7-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 19:16:31 2018 New Revision: 30024 Log: Apache Spark 3.0.0-SNAPSHOT-2018_10_12_12_02-8e039a7 docs [This commit notification would consist of 1481 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [STREAMING][DOC] Fix typo & formatting for JavaDoc

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-2.4 1a335444e -> 0f58b989d [STREAMING][DOC] Fix typo & formatting for JavaDoc ## What changes were proposed in this pull request? - Fixed typo for function outputMode - OutputMode.Complete(), changed `these is some updates` to `there

spark git commit: [STREAMING][DOC] Fix typo & formatting for JavaDoc

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master c7eadb5e6 -> 4e141a416 [STREAMING][DOC] Fix typo & formatting for JavaDoc ## What changes were proposed in this pull request? - Fixed typo for function outputMode - OutputMode.Complete(), changed `these is some updates` to `there are

spark git commit: [SPARK-25660][SQL] Fix for the backward slash as CSV fields delimiter

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.4 bb211cf27 -> 1a335444e [SPARK-25660][SQL] Fix for the backward slash as CSV fields delimiter ## What changes were proposed in this pull request? The PR addresses the exception raised on accessing chars out of delimiter string. In

spark git commit: [SPARK-25660][SQL] Fix for the backward slash as CSV fields delimiter

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/master 8e039a755 -> c7eadb5e6 [SPARK-25660][SQL] Fix for the backward slash as CSV fields delimiter ## What changes were proposed in this pull request? The PR addresses the exception raised on accessing chars out of delimiter string. In
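
A minimal sketch of the call path this fix (and its branch-2.4 backport above) touches, using an inline dataset for brevity. Treating the escaped form `\\` as a literal backslash delimiter is an assumption about the post-fix behavior; the message above only says that such delimiters previously raised an exception while being converted to a char.

```scala
// Hedged sketch: read backslash-delimited lines with the CSV reader.
import spark.implicits._  // assumes a SparkSession named `spark`

val lines = Seq("""a\b\c""").toDS()   // fields separated by a single backslash
val df = spark.read
  .option("delimiter", """\\""")      // escaped backslash delimiter (assumption, see above)
  .csv(lines)
df.show()                             // expected: three columns a, b, c
```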

spark git commit: [SPARK-25697][CORE] When zstd compression enabled, InProgress application is throwing Error in the history webui

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 52f9f66d5 -> 8e039a755 [SPARK-25697][CORE] When zstd compression enabled, InProgress application is throwing Error in the history webui ## What changes were proposed in this pull request? When we enable event log compression and

spark git commit: [SPARK-25697][CORE] When zstd compression enabled, InProgress application is throwing Error in the history webui

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-2.4 3dba5d41f -> bb211cf27 [SPARK-25697][CORE] When zstd compression enabled, InProgress application is throwing Error in the history webui ## What changes were proposed in this pull request? When we enable event log compression and
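
The two entries above (master and branch-2.4) carry the same fix. The configuration combination they describe is sketched below as a SparkConf; the keys exist in Spark, the application name is arbitrary. With these settings the event log is written zstd-compressed, which is the case the history web UI was failing on for in-progress applications.

```scala
// Configuration that produces zstd-compressed event logs (app name is arbitrary).
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("zstd-event-log-demo")
  .set("spark.eventLog.enabled", "true")
  .set("spark.eventLog.compress", "true")
  .set("spark.io.compression.codec", "zstd")
```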

spark git commit: [SPARK-25712][CORE][MINOR] Improve usage message of start-master.sh and start-slave.sh

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 541d7e1e4 -> 52f9f66d5 [SPARK-25712][CORE][MINOR] Improve usage message of start-master.sh and start-slave.sh ## What changes were proposed in this pull request? Currently if we run ``` ./sbin/start-master.sh -h ``` We get ``` Usage:

spark git commit: [SPARK-25685][BUILD] Allow running tests in Jenkins in enterprise Git repository

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master d47a25f68 -> 541d7e1e4 [SPARK-25685][BUILD] Allow running tests in Jenkins in enterprise Git repository ## What changes were proposed in this pull request? Many companies have their own enterprise GitHub to manage Spark code. To build

spark git commit: [SPARK-25670][TEST] Reduce number of tested timezones in JsonExpressionsSuite

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 3494b1228 -> d47a25f68 [SPARK-25670][TEST] Reduce number of tested timezones in JsonExpressionsSuite ## What changes were proposed in this pull request? After the changes, total execution time of `JsonExpressionsSuite.scala` dropped from

spark git commit: [SPARK-25566][SPARK-25567][WEBUI][SQL] Support pagination for SQL tab to avoid OOM

2018-10-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 78e133141 -> 3494b1228 [SPARK-25566][SPARK-25567][WEBUI][SQL] Support pagination for SQL tab to avoid OOM ## What changes were proposed in this pull request? Currently SQL tab in the WEBUI doesn't support pagination. Because of that

svn commit: r30019 - in /dev/spark/3.0.0-SNAPSHOT-2018_10_12_04_02-78e1331-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 11:16:37 2018 New Revision: 30019 Log: Apache Spark 3.0.0-SNAPSHOT-2018_10_12_04_02-78e1331 docs [This commit notification would consist of 1481 parts, which exceeds the limit of 50, so it was shortened to this summary.]

svn commit: r30015 - in /dev/spark/2.4.1-SNAPSHOT-2018_10_12_02_02-3dba5d4-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 09:16:40 2018 New Revision: 30015 Log: Apache Spark 2.4.1-SNAPSHOT-2018_10_12_02_02-3dba5d4 docs [This commit notification would consist of 1472 parts, which exceeds the limit of 50, so it was shortened to this summary.]

spark git commit: [SPARK-25708][SQL] HAVING without GROUP BY means global aggregate

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.4 1961f8e62 -> 3dba5d41f [SPARK-25708][SQL] HAVING without GROUP BY means global aggregate According to the SQL standard, when a query contains `HAVING`, it indicates an aggregate operator. For more details please refer to

spark git commit: [SPARK-25708][SQL] HAVING without GROUP BY means global aggregate

2018-10-12 Thread lixiao
Repository: spark Updated Branches: refs/heads/master 368513048 -> 78e133141 [SPARK-25708][SQL] HAVING without GROUP BY means global aggregate ## What changes were proposed in this pull request? According to the SQL standard, when a query contains `HAVING`, it indicates an aggregate
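
A hypothetical minimal query (not taken from the PR) illustrating the rule in the title, as implemented here for master and backported to branch-2.4 above: HAVING with no GROUP BY implies a single, global aggregate group.

```scala
// One global group over the whole input, so this yields a single row with c = 3;
// if the HAVING predicate were false, the result would be empty.
spark.sql("SELECT count(*) AS c FROM VALUES (1), (2), (3) AS t(v) HAVING count(*) > 1").show()
```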

svn commit: r30010 - in /dev/spark/3.0.0-SNAPSHOT-2018_10_12_00_02-3685130-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-10-12 Thread pwendell
Author: pwendell Date: Fri Oct 12 07:16:55 2018 New Revision: 30010 Log: Apache Spark 3.0.0-SNAPSHOT-2018_10_12_00_02-3685130 docs [This commit notification would consist of 1481 parts, which exceeds the limit of 50, so it was shortened to this summary.]