[spark] branch master updated (1c714be -> bf7215c)

2019-12-16 Thread yamamuro
This is an automated email from the ASF dual-hosted git repository.

yamamuro pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1c714be  [SPARK-25100][TEST][FOLLOWUP] Refactor test cases in 
`FileSuite` and `KryoSerializerSuite`
 add bf7215c  [SPARK-30066][SQL][FOLLOWUP] Remove size field for interval 
column cache

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/execution/columnar/ColumnType.scala | 19 +--
 .../sql/execution/columnar/ColumnTypeSuite.scala  |  2 +-
 2 files changed, 14 insertions(+), 7 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r37258 - in /dev/spark/v3.0.0-preview2-rc2-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apa

2019-12-16 Thread yumwang
Author: yumwang
Date: Tue Dec 17 05:34:42 2019
New Revision: 37258

Log:
Apache Spark v3.0.0-preview2-rc2 docs


[This commit notification would consist of 1912 parts, 
which exceeds the limit of 50, so it was shortened to this summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (1da7e82 -> 1c714be)

2019-12-16 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1da7e82  [SPARK-30201][SQL] HiveOutputWriter standardOI should use 
ObjectInspectorCopyOption.DEFAULT
 add 1c714be  [SPARK-25100][TEST][FOLLOWUP] Refactor test cases in 
`FileSuite` and `KryoSerializerSuite`

No new revisions were added by this update.

Summary of changes:
 .../test/scala/org/apache/spark/FileSuite.scala| 41 +-
 .../spark/serializer/KryoSerializerSuite.scala | 14 
 2 files changed, 30 insertions(+), 25 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r37253 - /dev/spark/v3.0.0-preview2-rc2-bin/

2019-12-16 Thread yumwang
Author: yumwang
Date: Tue Dec 17 05:03:58 2019
New Revision: 37253

Log:
Apache Spark v3.0.0-preview2-rc2

Added:
dev/spark/v3.0.0-preview2-rc2-bin/
dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz   (with 
props)
dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.asc
dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.sha512
dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz   (with props)
dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz.asc
dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz.sha512

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz
   (with props)

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.asc

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.sha512
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz   
(with props)
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz.asc

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz.sha512
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz   
(with props)
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz.asc

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz.sha512

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz   
(with props)

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz.asc

dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz.sha512
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2.tgz   (with props)
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2.tgz.asc
dev/spark/v3.0.0-preview2-rc2-bin/spark-3.0.0-preview2.tgz.sha512

Added: dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.asc
==
--- dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.asc (added)
+++ dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.asc Tue Dec 
17 05:03:58 2019
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCgAdFiEE29RHAQwbT32tP339bhtBIvajozgFAl34UBIACgkQbhtBIvaj
+ozjRsRAAiZzMUygHbimZWo/qu00S11zI6AVXnVz76HkPdiT7sikP1hSr6onC1VXP
+eZF1aL+QoMid3kToXkKi4NZ1F6XDQ+G45Lno9tuqDZ5QG8h2tfqm7ZfLDbaFm/vs
+PTMTbC5vgDO7fjSnCs3tKxenMMejo2VozvPMjbdNtpme2BUOVbQHqvrnyBwjqDyM
+Ba3Qxc7o50SJvWcIsT7/8DfCz0AC2Cc6MOUabEALrOzyvZ6LMeWHLlnxcxPq+Oz0
+AiU0fas6KbRY2uDT3WvVPPSXsFwqdEHkBdKMilqG+ATbUNg3HvfDLQ6xSmaKmVkL
+kf+ZmWJm1ujwMASJAWwRR88K2kkqJd0nELQWCMix+4PtHMuZ7gGSIe1NHwyJbR5e
+RaxbivVnTdN62V5lijnkD27O/NiJjDzmdW3GRktEhvbC22nFB3hsnp3SgwfCxmR9
+fVcMGjzF6Aqf4nZkqQ+86hBpd/YHGn8FmcvDujkhKgpGmoP9gdDReJ3YL0FKfHwI
+LzK7cVtfRaWLRA5DcxfEAG5TDokDi92sF0rOSCIKYoLEd85OZH7/Oo2BiL8DFGOt
+oJFtz5sJVdikxzIa3tir6BGgUI5LTRG0gMLk3whE+PPPF+ReIa+oU5Qde4J9YnKu
+xMDG34ZPr4VCAKhHDPBTpplj8tNI8uVbymt2/MWEBAGIhaTnL5U=
+=xelq
+-----END PGP SIGNATURE-----

Added: dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.sha512
==
--- dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.sha512 
(added)
+++ dev/spark/v3.0.0-preview2-rc2-bin/SparkR_3.0.0-preview2.tar.gz.sha512 Tue 
Dec 17 05:03:58 2019
@@ -0,0 +1,4 @@
+SparkR_3.0.0-preview2.tar.gz: 22BDD254 43C898CE 7D365F1A 8A811327 460E8F09
+  F254FB0C D1FFEB92 FD78D1A4 34C4A559 BCD9C432
+  9D046F4D 98AE208C C91A52D8 7A1259A7 11D4996F
+  00A00622

Added: dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz.asc
==
--- dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz.asc (added)
+++ dev/spark/v3.0.0-preview2-rc2-bin/pyspark-3.0.0.dev2.tar.gz.asc Tue Dec 17 
05:03:58 2019
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCgAdFiEE29RHAQwbT32tP339bhtBIvajozgFAl34UBMACgkQbhtBIvaj
+ozgeTBAAsiOKnEoUm2XSnqqmO9p7lRliFUsMjM8r9S/wr2IQxWv5iNS43dxcodL1

[spark] branch master updated (e75d9af -> 1da7e82)

2019-12-16 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from e75d9af  [SPARK-30094][SQL] Apply current namespace for the 
single-part table name
 add 1da7e82  [SPARK-30201][SQL] HiveOutputWriter standardOI should use 
ObjectInspectorCopyOption.DEFAULT

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/sql/hive/HiveInspectors.scala |  9 ++--
 .../spark/sql/hive/execution/HiveFileFormat.scala  |  7 ++-
 .../org/apache/spark/sql/hive/InsertSuite.scala| 24 ++
 3 files changed, 37 insertions(+), 3 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org




[spark] branch master updated (696288f -> e75d9af)

2019-12-16 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 696288f  [INFRA] Reverts commit 56dcd79 and c216ef1
 add e75d9af  [SPARK-30094][SQL] Apply current namespace for the 
single-part table name

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/analysis/ResolveCatalogs.scala |  2 +-
 .../spark/sql/connector/catalog/LookupCatalog.scala   | 11 ++-
 .../spark/sql/connector/catalog/LookupCatalogSuite.scala  |  3 ++-
 .../sql/catalyst/analysis/ResolveSessionCatalog.scala |  2 +-
 .../apache/spark/sql/connector/DataSourceV2SQLSuite.scala | 15 +++
 .../spark/sql/execution/command/PlanResolutionSuite.scala |  1 +
 6 files changed, 30 insertions(+), 4 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] 01/01: Preparing Spark release v3.0.0-preview2-rc2

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to tag v3.0.0-preview2-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git

commit bcadd5c3096109878fe26fb0d57a9b7d6fdaa257
Author: Yuming Wang 
AuthorDate: Tue Dec 17 03:06:28 2019 +

Preparing Spark release v3.0.0-preview2-rc2
---
 R/pkg/R/sparkR.R   | 4 ++--
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 44 insertions(+), 44 deletions(-)

diff --git a/R/pkg/R/sparkR.R b/R/pkg/R/sparkR.R
index cdb5909..b648c51 100644
--- a/R/pkg/R/sparkR.R
+++ b/R/pkg/R/sparkR.R
@@ -336,8 +336,8 @@ sparkR.session <- function(
 
   # Check if version number of SparkSession matches version number of SparkR 
package
   jvmVersion <- callJMethod(sparkSession, "version")
-  # Remove -SNAPSHOT from jvm versions
-  jvmVersionStrip <- gsub("-SNAPSHOT", "", jvmVersion)
+  # Remove -preview2 from jvm versions
+  jvmVersionStrip <- gsub("-preview2", "", jvmVersion)
   rPackageVersion <- paste0(packageVersion("SparkR"))
 
   if (jvmVersionStrip != rPackageVersion) {
diff --git a/assembly/pom.xml b/assembly/pom.xml
index ef916fb..715a112 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index a1c8a8e..965c1f3 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 163c250..557bd7a 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a6d9981..8c718d8 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 76a402b..65f491c 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 

[spark] tag v3.0.0-preview2-rc2 created (now bcadd5c)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to tag v3.0.0-preview2-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git.


  at bcadd5c  (commit)
This tag includes the following new commits:

 new bcadd5c  Preparing Spark release v3.0.0-preview2-rc2

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (56dcd79 -> 696288f)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 56dcd79  Preparing development version 3.0.1-SNAPSHOT
 add 696288f  [INFRA] Reverts commit 56dcd79 and c216ef1

No new revisions were added by this update.

Summary of changes:
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 43 insertions(+), 43 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] 01/01: Preparing development version 3.0.1-SNAPSHOT

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git

commit 56dcd79992ff1eb67ffab337bf0ea68d9641ed2b
Author: Yuming Wang 
AuthorDate: Tue Dec 17 01:57:27 2019 +

Preparing development version 3.0.1-SNAPSHOT
---
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 43 insertions(+), 43 deletions(-)

diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 2d34a0c..392afec 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 3.0.0-preview2
+Version: 3.0.1
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' 
.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 715a112..e8a296d 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-preview2
+3.0.1-SNAPSHOT
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 965c1f3..fc1441d 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-preview2
+3.0.1-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 557bd7a..de2a6fb 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-preview2
+3.0.1-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 8c718d8..6c0c016 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-preview2
+3.0.1-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 65f491c..b8df191 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-preview2
+3.0.1-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index 2bc7cd8..8119709 

[spark] branch master updated (5de5e46 -> 56dcd79)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 5de5e46  [SPARK-30268][INFRA] Fix incorrect pyspark version when 
releasing preview versions
 add c216ef1  Preparing Spark release v3.0.0-preview2-rc2
 new 56dcd79  Preparing development version 3.0.1-SNAPSHOT

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 43 insertions(+), 43 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] 01/01: Preparing Spark release v3.0.0-preview2-rc2

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to tag v3.0.0-preview2-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git

commit c216ef1d0371170847b007249c33f6f9f18700a0
Author: Yuming Wang 
AuthorDate: Tue Dec 17 01:57:21 2019 +

Preparing Spark release v3.0.0-preview2-rc2
---
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 43 insertions(+), 43 deletions(-)

diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 95d3e52..2d34a0c 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 3.0.0
+Version: 3.0.0-preview2
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' 
.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
diff --git a/assembly/pom.xml b/assembly/pom.xml
index ef916fb..715a112 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index a1c8a8e..965c1f3 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 163c250..557bd7a 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a6d9981..8c718d8 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 76a402b..65f491c 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index 

[spark] tag v3.0.0-preview2-rc2 created (now c216ef1)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to tag v3.0.0-preview2-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git.


  at c216ef1  (commit)
This tag includes the following new commits:

 new c216ef1  Preparing Spark release v3.0.0-preview2-rc2

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (b03ce63 -> 5de5e46)

2019-12-16 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from b03ce63  [SPARK-30258][TESTS] Eliminate warnings of deprecated Spark 
APIs in tests
 add 5de5e46  [SPARK-30268][INFRA] Fix incorrect pyspark version when 
releasing preview versions

No new revisions were added by this update.

Summary of changes:
 dev/create-release/release-build.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org




[spark] branch master updated (5ed72a1 -> b03ce63)

2019-12-16 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 5ed72a1  [SPARK-30247][PYSPARK] GaussianMixtureModel in py side should 
expose gaussian
 add b03ce63  [SPARK-30258][TESTS] Eliminate warnings of deprecated Spark 
APIs in tests

No new revisions were added by this update.

Summary of changes:
 .../apache/spark/sql/avro/AvroFunctionsSuite.scala |  23 +++--
 ...te.scala => DeprecatedAvroFunctionsSuite.scala} |  84 +--
 .../apache/spark/sql/DatasetAggregatorSuite.scala  |  61 ---
 .../org/apache/spark/sql/DateFunctionsSuite.scala  |  87 +---
 .../sql/DeprecatedDatasetAggregatorSuite.scala |  77 ++
 .../spark/sql/DeprecatedDateFunctionsSuite.scala   | 113 +
 .../scala/org/apache/spark/sql/JoinSuite.scala |   6 +-
 .../scala/org/apache/spark/sql/SQLQuerySuite.scala |   6 +-
 .../DeprecatedWholeStageCodegenSuite.scala}|  33 +++---
 .../sql/execution/WholeStageCodegenSuite.scala |  14 ---
 .../DeprecatedStreamingAggregationSuite.scala  |  62 +++
 .../sql/streaming/StreamingAggregationSuite.scala  |  11 --
 12 files changed, 293 insertions(+), 284 deletions(-)
 copy 
external/avro/src/test/scala/org/apache/spark/sql/avro/{AvroFunctionsSuite.scala
 => DeprecatedAvroFunctionsSuite.scala} (62%)
 create mode 100644 
sql/core/src/test/scala/org/apache/spark/sql/DeprecatedDatasetAggregatorSuite.scala
 create mode 100644 
sql/core/src/test/scala/org/apache/spark/sql/DeprecatedDateFunctionsSuite.scala
 copy 
sql/core/src/test/scala/org/apache/spark/sql/{CountMinSketchAggQuerySuite.scala 
=> execution/DeprecatedWholeStageCodegenSuite.scala} (52%)
 create mode 100644 
sql/core/src/test/scala/org/apache/spark/sql/streaming/DeprecatedStreamingAggregationSuite.scala


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (dd217e1 -> 5ed72a1)

2019-12-16 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from dd217e1  [SPARK-25392][CORE][WEBUI] Prevent error page when accessing 
pools page from history server
 add 5ed72a1  [SPARK-30247][PYSPARK] GaussianMixtureModel in py side should 
expose gaussian

No new revisions were added by this update.

Summary of changes:
 python/pyspark/ml/clustering.py | 14 ++
 1 file changed, 14 insertions(+)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-2.4 updated: [SPARK-25392][CORE][WEBUI] Prevent error page when accessing pools page from history server

2019-12-16 Thread vanzin
This is an automated email from the ASF dual-hosted git repository.

vanzin pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 6d90298  [SPARK-25392][CORE][WEBUI] Prevent error page when accessing 
pools page from history server
6d90298 is described below

commit 6d90298438e627187088a5d8c53d470646d051f4
Author: shahid 
AuthorDate: Mon Dec 16 15:02:34 2019 -0800

[SPARK-25392][CORE][WEBUI] Prevent error page when accessing pools page 
from history server

### What changes were proposed in this pull request?

### Why are the changes needed?

Currently, the pool info cannot be accessed from the history server, because 
nothing other than the pool name is written to the event log. Spark already 
hides the pool table when the UI is served from the history server, but the 
pool column in the stage table still links to the pools page, which throws an 
error when opened. To prevent the error page, the pool column also needs to 
be hidden in the stage table.
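
A minimal, hypothetical Scala sketch of the guard this change introduces (the 
class and field names below are illustrative stand-ins, not the actual Spark 
UI code): the fair-scheduler check is short-circuited on whether a live 
SparkContext is present, so a history-server rendering never shows the pool 
column that would link to the broken pools page.

```scala
// Illustrative sketch only: UiTabLike and SchedulerInfo are made-up stand-ins
// for the live-UI state that JobsTab/StagesTab consult.
case class SchedulerInfo(schedulingMode: String)

class UiTabLike(liveContext: Option[AnyRef], scheduler: SchedulerInfo) {
  // Pool information is only meaningful for a live UI: require a live
  // context before even consulting the scheduling mode.
  def isFairScheduler: Boolean =
    liveContext.isDefined && scheduler.schedulingMode == "FAIR"
}

object PoolColumnGuardExample extends App {
  val liveUi    = new UiTabLike(Some(new Object), SchedulerInfo("FAIR"))
  val historyUi = new UiTabLike(None, SchedulerInfo("FAIR"))
  println(liveUi.isFairScheduler)    // true  -> pool column rendered
  println(historyUi.isFairScheduler) // false -> pool column hidden on the history server
}
```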

### Does this PR introduce any user-facing change?

No

### How was this patch tested?
Manual test

Before change:
![Screenshot 2019-11-21 at 6 49 40 
AM](https://user-images.githubusercontent.com/23054875/69293868-219b2280-0c30-11ea-9b9a-17140d024d3a.png)
![Screenshot 2019-11-21 at 6 48 51 
AM](https://user-images.githubusercontent.com/23054875/69293834-147e3380-0c30-11ea-9dec-d5f67665486d.png)

After change:
![Screenshot 2019-11-21 at 7 29 01 
AM](https://user-images.githubusercontent.com/23054875/69293991-9cfcd400-0c30-11ea-98a0-7a6268a4e5ab.png)

Closes #26616 from shahidki31/poolHistory.

Authored-by: shahid 
Signed-off-by: Marcelo Vanzin 
(cherry picked from commit dd217e10fc0408831c2c658fc3f52d2917f1a6a2)
Signed-off-by: Marcelo Vanzin 
---
 core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala | 3 +--
 core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala   | 2 ++
 core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala | 2 ++
 3 files changed, 5 insertions(+), 2 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala 
b/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
index f672ce0..d8a93ad 100644
--- a/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
+++ b/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
@@ -30,7 +30,6 @@ import org.apache.spark.ui.{UIUtils, WebUIPage}
 private[ui] class AllStagesPage(parent: StagesTab) extends WebUIPage("") {
   private val sc = parent.sc
   private val subPath = "stages"
-  private def isFairScheduler = parent.isFairScheduler
 
   def render(request: HttpServletRequest): Seq[Node] = {
 // For now, pool information is only accessible in live UIs
@@ -57,7 +56,7 @@ private[ui] class AllStagesPage(parent: StagesTab) extends 
WebUIPage("") {
 
   
 
-val poolsDescription = if (sc.isDefined && isFairScheduler) {
+val poolsDescription = if (parent.isFairScheduler) {
 
   
diff --git a/core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala 
b/core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala
index ff1b75e..102ec4d 100644
--- a/core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala
+++ b/core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala
@@ -33,7 +33,9 @@ private[ui] class JobsTab(parent: SparkUI, store: 
AppStatusStore)
   val sc = parent.sc
   val killEnabled = parent.killEnabled
 
+  // Show pool information for only live UI.
   def isFairScheduler: Boolean = {
+sc.isDefined &&
 store
   .environmentInfo()
   .sparkProperties
diff --git a/core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala 
b/core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala
index 10b0320..b7a8c56 100644
--- a/core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala
+++ b/core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala
@@ -36,7 +36,9 @@ private[ui] class StagesTab(val parent: SparkUI, val store: 
AppStatusStore)
   attachPage(new StagePage(this, store))
   attachPage(new PoolPage(this))
 
+  // Show pool information for only live UI.
   def isFairScheduler: Boolean = {
+sc.isDefined &&
 store
   .environmentInfo()
   .sparkProperties


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (5954311 -> dd217e1)

2019-12-16 Thread vanzin
This is an automated email from the ASF dual-hosted git repository.

vanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 5954311  [SPARK-29043][CORE] Improve the concurrent performance of 
History Server
 add dd217e1  [SPARK-25392][CORE][WEBUI] Prevent error page when accessing 
pools page from history server

No new revisions were added by this update.

Summary of changes:
 core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala | 3 +--
 core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala   | 2 ++
 core/src/main/scala/org/apache/spark/ui/jobs/StagesTab.scala | 2 ++
 3 files changed, 5 insertions(+), 2 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (dddfeca -> 5954311)

2019-12-16 Thread vanzin
This is an automated email from the ASF dual-hosted git repository.

vanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from dddfeca  [SPARK-30209][SQL][WEB-UI] Display stageId, attemptId and 
taskId for max metrics in Spark UI
 add 5954311  [SPARK-29043][CORE] Improve the concurrent performance of 
History Server

No new revisions were added by this update.

Summary of changes:
 .../spark/deploy/history/FsHistoryProvider.scala   | 121 ++---
 .../deploy/history/FsHistoryProviderSuite.scala|  39 ++-
 2 files changed, 118 insertions(+), 42 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated: [SPARK-30209][SQL][WEB-UI] Display stageId, attemptId and taskId for max metrics in Spark UI

2019-12-16 Thread tgraves
This is an automated email from the ASF dual-hosted git repository.

tgraves pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new dddfeca  [SPARK-30209][SQL][WEB-UI] Display stageId, attemptId and 
taskId for max metrics in Spark UI
dddfeca is described below

commit dddfeca175bdce5294debe00d4a993daef92ca60
Author: Niranjan Artal 
AuthorDate: Mon Dec 16 15:27:34 2019 -0600

[SPARK-30209][SQL][WEB-UI] Display stageId, attemptId and taskId for max 
metrics in Spark UI

### What changes were proposed in this pull request?

SPARK-30209 discusses adding additional metrics such as stageId, attemptId 
and taskId for max metrics. The data required for display is already available 
in LiveStageMetrics; it only needs to be captured and passed through to the 
UI. To minimize memory usage, only the maximum of each metric id is kept per 
stage, so the additional memory per stage is (#metrics * 4 * sizeof(Long)).
The overall max for each metric id across all stages is then computed and 
passed to the stringValue method, so the memory overhead is minimal. A runtime 
benchmark shows that Stage.Proc time increased to around 1.5-2.5x while the 
Aggregate time decreased.
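
A hedged Scala sketch of the bookkeeping described above (the names below are 
illustrative, not Spark's actual SQLAppStatusListener internals): for each 
metric id, only the per-stage maximum is retained together with the stageId, 
attemptId and taskId that produced it, and the overall max across stages is 
what would be rendered as "max (stageId (attemptId): taskId)".

```scala
// Illustrative only: a per-stage max tracker for a single metric id.
import scala.collection.mutable

final case class MaxOrigin(value: Long, stageId: Int, attemptId: Int, taskId: Long)

final class MetricMaxTracker {
  // One entry per stage: only the max (plus its origin) is retained, so the
  // extra memory per stage is bounded by a few Longs per metric.
  private val perStage = mutable.Map.empty[Int, MaxOrigin]

  def update(stageId: Int, attemptId: Int, taskId: Long, value: Long): Unit = {
    val current = perStage.get(stageId)
    if (current.forall(_.value < value)) {
      perStage(stageId) = MaxOrigin(value, stageId, attemptId, taskId)
    }
  }

  // Overall max across stages, i.e. what the UI label
  // "max (stageId (attemptId): taskId)" would report.
  def overallMax: Option[MaxOrigin] =
    if (perStage.isEmpty) None else Some(perStage.values.maxBy(_.value))
}
```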

### Why are the changes needed?

These additional metrics (stageId, attemptId and taskId) could help debug 
jobs more quickly. For a given operator, it becomes easy to identify, directly 
from the SQL tab, which task is taking the longest to complete.

### Does this PR introduce any user-facing change?

Yes. stageId, attemptId and taskId are shown only for executor-side metrics. 
For driver metrics, "(driver)" is displayed in the UI.
![image 
(3)](https://user-images.githubusercontent.com/50492963/70763041-929d9980-1d07-11ea-940f-88ac6bdce9b5.png)

"Driver"
![image 
(4)](https://user-images.githubusercontent.com/50492963/70763043-94675d00-1d07-11ea-95ab-3478728cb435.png)

### How was this patch tested?

Manually tested, ran benchmark script for runtime.

Closes #26843 from nartal1/SPARK-30209.

Authored-by: Niranjan Artal 
Signed-off-by: Thomas Graves 
---
 .../spark/sql/execution/metric/SQLMetrics.scala| 52 --
 .../sql/execution/ui/SQLAppStatusListener.scala| 63 +-
 .../sql/execution/metric/SQLMetricsSuite.scala | 39 +-
 .../sql/execution/metric/SQLMetricsTestUtils.scala | 26 ++---
 .../execution/ui/SQLAppStatusListenerSuite.scala   |  5 +-
 5 files changed, 137 insertions(+), 48 deletions(-)

diff --git 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala
 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala
index b7f0ab2..45b1c86 100644
--- 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala
+++ 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala
@@ -111,7 +111,8 @@ object SQLMetrics {
 // data size total (min, med, max):
 // 100GB (100MB, 1GB, 10GB)
 val acc = new SQLMetric(SIZE_METRIC, -1)
-acc.register(sc, name = Some(s"$name total (min, med, max)"), 
countFailedValues = false)
+acc.register(sc, name = Some(s"$name total (min, med, max (stageId 
(attemptId): taskId))"),
+  countFailedValues = false)
 acc
   }
 
@@ -120,14 +121,16 @@ object SQLMetrics {
 // duration(min, med, max):
 // 5s (800ms, 1s, 2s)
 val acc = new SQLMetric(TIMING_METRIC, -1)
-acc.register(sc, name = Some(s"$name total (min, med, max)"), 
countFailedValues = false)
+acc.register(sc, name = Some(s"$name total (min, med, max (stageId 
(attemptId): taskId))"),
+  countFailedValues = false)
 acc
   }
 
   def createNanoTimingMetric(sc: SparkContext, name: String): SQLMetric = {
 // Same with createTimingMetric, just normalize the unit of time to 
millisecond.
 val acc = new SQLMetric(NS_TIMING_METRIC, -1)
-acc.register(sc, name = Some(s"$name total (min, med, max)"), 
countFailedValues = false)
+acc.register(sc, name = Some(s"$name total (min, med, max (stageId 
(attemptId): taskId))"),
+  countFailedValues = false)
 acc
   }
 
@@ -142,31 +145,46 @@ object SQLMetrics {
 // probe avg (min, med, max):
 // (1.2, 2.2, 6.3)
 val acc = new SQLMetric(AVERAGE_METRIC)
-acc.register(sc, name = Some(s"$name (min, med, max)"), countFailedValues 
= false)
+acc.register(sc, name = Some(s"$name (min, med, max (stageId (attemptId): 
taskId))"),
+  countFailedValues = false)
 acc
   }
 
+  private def toNumberFormat(value: Long): String = {
+val numberFormat = NumberFormat.getNumberInstance(Locale.US)
+numberFormat.format(value.toDouble / baseForAvgMetric)
+  }
+
+  def metricNeedsMax(metricsType: String): Boolean = {
+metricsType != SUM_METRIC
+  }
+
   /**
* A function that 

[spark] branch master updated (23b1312 -> b573f23)

2019-12-16 Thread vanzin
This is an automated email from the ASF dual-hosted git repository.

vanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 23b1312  [SPARK-30200][DOCS][FOLLOW-UP] Add documentation for 
explain(mode: String)
 add b573f23  [SPARK-29574][K8S] Add SPARK_DIST_CLASSPATH to the executor 
class path

No new revisions were added by this update.

Summary of changes:
 docs/hadoop-provided.md| 22 ++
 .../src/main/dockerfiles/spark/entrypoint.sh   |  8 +++-
 2 files changed, 29 insertions(+), 1 deletion(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r37240 - in /dev/spark/v3.0.0-preview2-rc1-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apa

2019-12-16 Thread yumwang
Author: yumwang
Date: Mon Dec 16 15:09:53 2019
New Revision: 37240

Log:
Apache Spark v3.0.0-preview2-rc1 docs


[This commit notification would consist of 1912 parts, 
which exceeds the limit of 50, so it was shortened to this summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r37239 - /dev/spark/v3.0.0-preview2-rc1-bin/

2019-12-16 Thread yumwang
Author: yumwang
Date: Mon Dec 16 14:41:02 2019
New Revision: 37239

Log:
Apache Spark v3.0.0-preview2-rc1

Added:
dev/spark/v3.0.0-preview2-rc1-bin/
dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz   (with 
props)
dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.asc
dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.sha512
dev/spark/v3.0.0-preview2-rc1-bin/pyspark-3.0.0.dev02.tar.gz.sha512

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz
   (with props)

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.asc

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.sha512
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz   
(with props)
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz.asc

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7.tgz.sha512
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz   
(with props)
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz.asc

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop3.2.tgz.sha512

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz   
(with props)

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz.asc

dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-without-hadoop.tgz.sha512
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2.tgz   (with props)
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2.tgz.asc
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2.tgz.sha512

Added: dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.asc
==
--- dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.asc (added)
+++ dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.asc Mon Dec 
16 14:41:02 2019
@@ -0,0 +1,17 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQJHBAABCgAxFiEE29RHAQwbT32tP339bhtBIvajozgFAl33hZcTHHl1bXdhbmdA
+YXBhY2hlLm9yZwAKCRBuG0Ei9qOjOKfsEADKxS7K/N1Ol0vBqqJ8xtUU/Gxt50hI
+rqLvgMGitVWDsVWWubz8zs1+6PHTRn8VTxVleo+7SDRvKqPPPY+vqFCd0TujzcCt
+Zb5rcLGGxdX5eVbQcIy+ENsVPtO8aFaQ89Lq7h9X1uuE9ymVm0Rz5EM4SrKBk11T
+73a1cvhyxYIG7gNU4Qr4D5QIsRBojz1L4lw0e06EVZFeO5xi4SKqJx5cy3l5yGaj
+1secJO30ewod7GWLlINb65rGWEhxDsYHt0q/gqibZSS1w/CcpQME2cdckr5CPLWU
+DINJwFjlPaC0bu939lVnIi8terEo2Mf7ocLHfeaOlegzFdWIa7KoV7MyVuj7amrc
+Qiv73erA9GuTjjMEbW8pLy9RBvZp+iVaaxbS0LLQDrARY/Dr1JlQOAQueo87wWBW
+FYiav9vvu9ABcKZ/bHHV/eR0VehyxD0hN+Lsqa0fgwhepQxro6CxAIhLYNeNug8G
+0JYXGxpqAlOOoSamhuDtThqv2Asoom5nQYUkN/PXJU8yuRokwcm7prNEMUugSDWj
+F192+6nYIX1LZQAdzW56FwcYMQ9yF5vKI5jg8OdHX7k5Yyqs7cVognp/0C3hT6+T
+6GWFsqxIRVyxsxVD7TkxfjW/UEMN275PVfhWTQF89a2l0M4dXYDmXiBcM/4MIEX6
+p9LZX6rXNLMmmA==
+=uKQI
+-----END PGP SIGNATURE-----

Added: dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.sha512
==
--- dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.sha512 
(added)
+++ dev/spark/v3.0.0-preview2-rc1-bin/SparkR_3.0.0-preview2.tar.gz.sha512 Mon 
Dec 16 14:41:02 2019
@@ -0,0 +1,4 @@
+SparkR_3.0.0-preview2.tar.gz: 964E49A9 B85AFC5D 9B581802 0F063E73 41A10776
+  E77E5D93 DA3C4C67 EC6933A4 1ABCBBB7 3DA55349
+  E8414356 D9FA96C3 7D665CA6 07549011 FDAA3808
+  A50FDEC0

Added: dev/spark/v3.0.0-preview2-rc1-bin/pyspark-3.0.0.dev02.tar.gz.sha512
==
(empty)

Added: 
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz
==
Binary file - no diff available.

Propchange: 
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz
--
svn:mime-type = application/octet-stream

Added: 
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.asc
==
--- 
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.asc
 (added)
+++ 
dev/spark/v3.0.0-preview2-rc1-bin/spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz.asc
 Mon Dec 16 14:41:02 2019
@@ -0,0 +1,17 @@
+-BEGIN PGP 

[spark] branch master updated (ba0f59b -> 23b1312)

2019-12-16 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from ba0f59b  [SPARK-30265][INFRA] Do not change R version when releasing 
preview versions
 add 23b1312  [SPARK-30200][DOCS][FOLLOW-UP] Add documentation for 
explain(mode: String)

No new revisions were added by this update.

Summary of changes:
 sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala | 12 
 1 file changed, 12 insertions(+)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] tag v3.0.0-preview2-rc1 created (now b668811)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to tag v3.0.0-preview2-rc1
in repository https://gitbox.apache.org/repos/asf/spark.git.


  at b668811  (commit)
This tag includes the following new commits:

 new b668811  Preparing Spark release v3.0.0-preview2-rc1

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] 01/01: Preparing Spark release v3.0.0-preview2-rc1

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to tag v3.0.0-preview2-rc1
in repository https://gitbox.apache.org/repos/asf/spark.git

commit b6688115024194d5b0d83758bd6dc0b0f6ec8cc7
Author: Yuming Wang 
AuthorDate: Mon Dec 16 12:14:54 2019 +

Preparing Spark release v3.0.0-preview2-rc1
---
 R/pkg/R/sparkR.R   | 4 ++--
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graph/api/pom.xml  | 2 +-
 graph/cypher/pom.xml   | 2 +-
 graph/graph/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 42 files changed, 44 insertions(+), 44 deletions(-)

diff --git a/R/pkg/R/sparkR.R b/R/pkg/R/sparkR.R
index cdb5909..b648c51 100644
--- a/R/pkg/R/sparkR.R
+++ b/R/pkg/R/sparkR.R
@@ -336,8 +336,8 @@ sparkR.session <- function(
 
   # Check if version number of SparkSession matches version number of SparkR 
package
   jvmVersion <- callJMethod(sparkSession, "version")
-  # Remove -SNAPSHOT from jvm versions
-  jvmVersionStrip <- gsub("-SNAPSHOT", "", jvmVersion)
+  # Remove -preview2 from jvm versions
+  jvmVersionStrip <- gsub("-preview2", "", jvmVersion)
   rPackageVersion <- paste0(packageVersion("SparkR"))
 
   if (jvmVersionStrip != rPackageVersion) {
diff --git a/assembly/pom.xml b/assembly/pom.xml
index ef916fb..715a112 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index a1c8a8e..965c1f3 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 163c250..557bd7a 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a6d9981..8c718d8 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.12
-3.0.0-SNAPSHOT
+3.0.0-preview2
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 76a402b..65f491c 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 

[spark] branch master updated (fdcd0e7 -> ba0f59b)

2019-12-16 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from fdcd0e7  [SPARK-30192][SQL] support column position in DS v2
 add ba0f59b  [SPARK-30265][INFRA] Do not change R version when releasing 
preview versions

No new revisions were added by this update.

Summary of changes:
 dev/create-release/release-tag.sh | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (72f5597 -> fdcd0e7)

2019-12-16 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 72f5597  [SPARK-30104][SQL][FOLLOWUP] Remove 
LookupCatalog.AsTemporaryViewIdentifier
 add fdcd0e7  [SPARK-30192][SQL] support column position in DS v2

No new revisions were added by this update.

Summary of changes:
 .../apache/spark/sql/catalyst/parser/SqlBase.g4|   2 +-
 .../sql/connector/catalog/IdentifierImpl.java  |  19 +--
 .../spark/sql/connector/catalog/TableChange.java   | 178 +++--
 .../sql/catalyst/analysis/ResolveCatalogs.scala|  23 ++-
 .../spark/sql/catalyst/parser/AstBuilder.scala |  27 ++--
 .../sql/catalyst/plans/logical/statements.scala|  10 +-
 .../sql/connector/catalog/CatalogV2Implicits.scala |   2 +-
 .../sql/connector/catalog/CatalogV2Util.scala  |  60 +--
 .../spark/sql/catalyst/parser/DDLParserSuite.scala |  70 +---
 .../catalyst/analysis/ResolveSessionCatalog.scala  |  27 +++-
 .../sql-tests/results/change-column.sql.out|  20 +--
 .../spark/sql/connector/AlterTableTests.scala  |  98 
 .../spark/sql/connector/DataSourceV2SQLSuite.scala |   4 +-
 .../spark/sql/execution/command/DDLSuite.scala |  10 ++
 14 files changed, 444 insertions(+), 106 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org




[spark] branch master updated (3bf5498 -> 72f5597)

2019-12-16 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 3bf5498  [MINOR][DOCS] Fix documentation for slide function
 add 72f5597  [SPARK-30104][SQL][FOLLOWUP] Remove 
LookupCatalog.AsTemporaryViewIdentifier

No new revisions were added by this update.

Summary of changes:
 .../sql/connector/catalog/LookupCatalog.scala  | 14 --
 .../sql/connector/catalog/LookupCatalogSuite.scala | 56 --
 2 files changed, 70 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org


