[spark] branch branch-3.0 updated: [SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf

2020-02-02 Thread zsxwing
This is an automated email from the ASF dual-hosted git repository.

zsxwing pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new f9b8637  [SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf
f9b8637 is described below

commit f9b86370cb04b72a4f00cbd4d60873960aa2792c
Author: Yuanjian Li 
AuthorDate: Sun Feb 2 23:37:13 2020 -0800

[SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf

### What changes were proposed in this pull request?
Put the configs below needed by Structured Streaming UI into StaticSQLConf:

- spark.sql.streaming.ui.enabled
- spark.sql.streaming.ui.retainedProgressUpdates
- spark.sql.streaming.ui.retainedQueries

### Why are the changes needed?
Make all SS UI configs consistent with other similar configs in usage and naming.

### Does this PR introduce any user-facing change?
Yes, add new static config `spark.sql.streaming.ui.retainedProgressUpdates`.

### How was this patch tested?
Existing UT.
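
Since these confs are now static, they can no longer be changed on a running session; a hedged sketch of setting them at submit time (the flag values other than the defaults of true/100/100 are illustrative, and `your-streaming-app.jar` is a placeholder):

```shell
# Static SQL confs must be set before the SparkSession starts, e.g. via spark-submit:
spark-submit \
  --conf spark.sql.streaming.ui.enabled=true \
  --conf spark.sql.streaming.ui.retainedProgressUpdates=200 \
  --conf spark.sql.streaming.ui.retainedQueries=50 \
  your-streaming-app.jar
```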

Closes #27425 from xuanyuanking/SPARK-29543-follow.

Authored-by: Yuanjian Li 
Signed-off-by: Shixiong Zhu 
(cherry picked from commit a4912cee615314e9578e6ab4eae25f147feacbd5)
Signed-off-by: Shixiong Zhu 
---
 .../org/apache/spark/sql/internal/SQLConf.scala  | 16 
 .../apache/spark/sql/internal/StaticSQLConf.scala| 20 
 .../org/apache/spark/sql/internal/SharedState.scala  | 15 ---
 .../streaming/ui/StreamingQueryStatusListener.scala  | 10 ++
 .../spark/sql/streaming/ui/StreamingQueryTab.scala   |  2 +-
 .../ui/StreamingQueryStatusListenerSuite.scala   |  4 ++--
 6 files changed, 37 insertions(+), 30 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
index 04572c3..3ad3416 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -1150,18 +1150,6 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
-  val STREAMING_UI_ENABLED =
-    buildConf("spark.sql.streaming.ui.enabled")
-      .doc("Whether to run the structured streaming UI for the Spark application.")
-      .booleanConf
-      .createWithDefault(true)
-
-  val STREAMING_UI_INACTIVE_QUERY_RETENTION =
-    buildConf("spark.sql.streaming.ui.numInactiveQueries")
-      .doc("The number of inactive queries to retain for structured streaming ui.")
-      .intConf
-      .createWithDefault(100)
-
   val VARIABLE_SUBSTITUTE_ENABLED =
     buildConf("spark.sql.variable.substitute")
       .doc("This enables substitution using syntax like ${var} ${system:var} and ${env:var}.")
@@ -2284,10 +2272,6 @@ class SQLConf extends Serializable with Logging {
 
   def isUnsupportedOperationCheckEnabled: Boolean = getConf(UNSUPPORTED_OPERATION_CHECK_ENABLED)
 
-  def isStreamingUIEnabled: Boolean = getConf(STREAMING_UI_ENABLED)
-
-  def streamingUIInactiveQueryRetention: Int = getConf(STREAMING_UI_INACTIVE_QUERY_RETENTION)
-
   def streamingFileCommitProtocolClass: String = getConf(STREAMING_FILE_COMMIT_PROTOCOL_CLASS)
 
   def fileSinkLogDeletion: Boolean = getConf(FILE_SINK_LOG_DELETION)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
index 66ac9ddb..6bc7522 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
@@ -176,4 +176,24 @@ object StaticSQLConf {
       .internal()
       .booleanConf
       .createWithDefault(true)
+
+  val STREAMING_UI_ENABLED =
+    buildStaticConf("spark.sql.streaming.ui.enabled")
+      .doc("Whether to run the Structured Streaming Web UI for the Spark application when the " +
+        "Spark Web UI is enabled.")
+      .booleanConf
+      .createWithDefault(true)
+
+  val STREAMING_UI_RETAINED_PROGRESS_UPDATES =
+    buildStaticConf("spark.sql.streaming.ui.retainedProgressUpdates")
+      .doc("The number of progress updates to retain for a streaming query for Structured " +
+        "Streaming UI.")
+      .intConf
+      .createWithDefault(100)
+
+  val STREAMING_UI_RETAINED_QUERIES =
+    buildStaticConf("spark.sql.streaming.ui.retainedQueries")
+      .doc("The number of inactive queries to retain for Structured Streaming UI.")
+      .intConf
+      .createWithDefault(100)
 }
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala b/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala
index fefd72d..5347264 100644
--- 

[spark] branch master updated: [SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf

2020-02-02 Thread zsxwing
This is an automated email from the ASF dual-hosted git repository.

zsxwing pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new a4912ce  [SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf
a4912ce is described below

commit a4912cee615314e9578e6ab4eae25f147feacbd5
Author: Yuanjian Li 
AuthorDate: Sun Feb 2 23:37:13 2020 -0800

[SPARK-29543][SS][FOLLOWUP] Move `spark.sql.streaming.ui.*` configs to StaticSQLConf

### What changes were proposed in this pull request?
Put the configs below needed by Structured Streaming UI into StaticSQLConf:

- spark.sql.streaming.ui.enabled
- spark.sql.streaming.ui.retainedProgressUpdates
- spark.sql.streaming.ui.retainedQueries

### Why are the changes needed?
Make all SS UI configs consistent with other similar configs in usage and naming.

### Does this PR introduce any user-facing change?
Yes, add new static config `spark.sql.streaming.ui.retainedProgressUpdates`.

### How was this patch tested?
Existing UT.

Closes #27425 from xuanyuanking/SPARK-29543-follow.

Authored-by: Yuanjian Li 
Signed-off-by: Shixiong Zhu 
---
 .../org/apache/spark/sql/internal/SQLConf.scala  | 16 
 .../apache/spark/sql/internal/StaticSQLConf.scala| 20 
 .../org/apache/spark/sql/internal/SharedState.scala  | 15 ---
 .../streaming/ui/StreamingQueryStatusListener.scala  | 10 ++
 .../spark/sql/streaming/ui/StreamingQueryTab.scala   |  2 +-
 .../ui/StreamingQueryStatusListenerSuite.scala   |  4 ++--
 6 files changed, 37 insertions(+), 30 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
index 04572c3..3ad3416 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -1150,18 +1150,6 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
-  val STREAMING_UI_ENABLED =
-    buildConf("spark.sql.streaming.ui.enabled")
-      .doc("Whether to run the structured streaming UI for the Spark application.")
-      .booleanConf
-      .createWithDefault(true)
-
-  val STREAMING_UI_INACTIVE_QUERY_RETENTION =
-    buildConf("spark.sql.streaming.ui.numInactiveQueries")
-      .doc("The number of inactive queries to retain for structured streaming ui.")
-      .intConf
-      .createWithDefault(100)
-
   val VARIABLE_SUBSTITUTE_ENABLED =
     buildConf("spark.sql.variable.substitute")
      .doc("This enables substitution using syntax like ${var} ${system:var} and ${env:var}.")
@@ -2284,10 +2272,6 @@ class SQLConf extends Serializable with Logging {
 
   def isUnsupportedOperationCheckEnabled: Boolean = getConf(UNSUPPORTED_OPERATION_CHECK_ENABLED)
 
-  def isStreamingUIEnabled: Boolean = getConf(STREAMING_UI_ENABLED)
-
-  def streamingUIInactiveQueryRetention: Int = getConf(STREAMING_UI_INACTIVE_QUERY_RETENTION)
-
   def streamingFileCommitProtocolClass: String = getConf(STREAMING_FILE_COMMIT_PROTOCOL_CLASS)
 
   def fileSinkLogDeletion: Boolean = getConf(FILE_SINK_LOG_DELETION)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
index 66ac9ddb..6bc7522 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala
@@ -176,4 +176,24 @@ object StaticSQLConf {
       .internal()
       .booleanConf
       .createWithDefault(true)
+
+  val STREAMING_UI_ENABLED =
+    buildStaticConf("spark.sql.streaming.ui.enabled")
+      .doc("Whether to run the Structured Streaming Web UI for the Spark application when the " +
+        "Spark Web UI is enabled.")
+      .booleanConf
+      .createWithDefault(true)
+
+  val STREAMING_UI_RETAINED_PROGRESS_UPDATES =
+    buildStaticConf("spark.sql.streaming.ui.retainedProgressUpdates")
+      .doc("The number of progress updates to retain for a streaming query for Structured " +
+        "Streaming UI.")
+      .intConf
+      .createWithDefault(100)
+
+  val STREAMING_UI_RETAINED_QUERIES =
+    buildStaticConf("spark.sql.streaming.ui.retainedQueries")
+      .doc("The number of inactive queries to retain for Structured Streaming UI.")
+      .intConf
+      .createWithDefault(100)
 }
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala b/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala
index fefd72d..5347264 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala
+++ 

[spark] branch branch-3.0 updated: [SPARK-30697][SQL] Handle database and namespace exceptions in catalog.isView

2020-02-02 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 91f78ae  [SPARK-30697][SQL] Handle database and namespace exceptions in catalog.isView
91f78ae is described below

commit 91f78aee71fad5677445ba21024263d1037a
Author: Burak Yavuz 
AuthorDate: Mon Feb 3 14:08:59 2020 +0800

[SPARK-30697][SQL] Handle database and namespace exceptions in catalog.isView

### What changes were proposed in this pull request?

Adds NoSuchDatabaseException and NoSuchNamespaceException to the `isView` method for SessionCatalog.

### Why are the changes needed?

This method prevents specialized resolutions from kicking in within Analysis when using V2 Catalogs if the identifier is a specialized identifier.

### Does this PR introduce any user-facing change?

No

### How was this patch tested?

Added test to DataSourceV2SessionCatalogSuite

Closes #27423 from brkyvz/isViewF.

Authored-by: Burak Yavuz 
Signed-off-by: Wenchen Fan 
(cherry picked from commit 2eccfd8a73c4afa30a6aa97c2afd38661f29e24b)
Signed-off-by: Wenchen Fan 
---
 .../sql/catalyst/catalog/SessionCatalog.scala  |  2 ++
 .../DataSourceV2DataFrameSessionCatalogSuite.scala | 22 ++
 .../DataSourceV2SQLSessionCatalogSuite.scala   | 14 ++
 3 files changed, 38 insertions(+)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
index 45f0ef6..12f9a61 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
@@ -826,6 +826,8 @@ class SessionCatalog(
         getTempViewOrPermanentTableMetadata(ident).tableType == CatalogTableType.VIEW
       } catch {
         case _: NoSuchTableException => false
+        case _: NoSuchDatabaseException => false
+        case _: NoSuchNamespaceException => false
       }
     }
   }
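
Conceptually, the hunk above widens `isView`'s fallback from one exception type to three, so a lookup failure at the database or namespace level also resolves to `false` instead of propagating. A minimal self-contained sketch with hypothetical stand-in exception classes (Spark's real ones live in `org.apache.spark.sql.catalyst.analysis` and carry more context):

```scala
// Hypothetical stand-ins for Spark's analysis exceptions.
class NoSuchTableException(name: String) extends Exception(name)
class NoSuchDatabaseException(name: String) extends Exception(name)
class NoSuchNamespaceException(name: String) extends Exception(name)

// After SPARK-30697, any of the three lookup failures means "not a view"
// rather than an error surfaced to the analyzer.
def isView(lookupTableType: () => String): Boolean =
  try {
    lookupTableType() == "VIEW"
  } catch {
    case _: NoSuchTableException => false
    case _: NoSuchDatabaseException => false
    case _: NoSuchNamespaceException => false
  }
```

The design choice is that for the purpose of "is this identifier a view?", a missing database or namespace is equivalent to a missing table: the answer is simply "no".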
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2DataFrameSessionCatalogSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2DataFrameSessionCatalogSuite.scala
index 4c67888..01caf8e 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2DataFrameSessionCatalogSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2DataFrameSessionCatalogSuite.scala
@@ -101,6 +101,13 @@ class InMemoryTableSessionCatalog extends TestV2SessionCatalogBase[InMemoryTable
     new InMemoryTable(name, schema, partitions, properties)
   }
 
+  override def loadTable(ident: Identifier): Table = {
+    val identToUse = Option(InMemoryTableSessionCatalog.customIdentifierResolution)
+      .map(_(ident))
+      .getOrElse(ident)
+    super.loadTable(identToUse)
+  }
+
   override def alterTable(ident: Identifier, changes: TableChange*): Table = {
 val fullIdent = fullIdentifier(ident)
 Option(tables.get(fullIdent)) match {
@@ -125,6 +132,21 @@ class InMemoryTableSessionCatalog extends TestV2SessionCatalogBase[InMemoryTable
   }
 }
 
+object InMemoryTableSessionCatalog {
+  private var customIdentifierResolution: Identifier => Identifier = _
+
+  def withCustomIdentifierResolver(
+      resolver: Identifier => Identifier)(
+      f: => Unit): Unit = {
+    try {
+      customIdentifierResolution = resolver
+      f
+    } finally {
+      customIdentifierResolution = null
+    }
+  }
+}
+
 private [connector] trait SessionCatalogTest[T <: Table, Catalog <: TestV2SessionCatalogBase[T]]
   extends QueryTest
   with SharedSparkSession
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSessionCatalogSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSessionCatalogSuite.scala
index 27725bc..b699744 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSessionCatalogSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSessionCatalogSuite.scala
@@ -49,4 +49,18 @@ class DataSourceV2SQLSessionCatalogSuite
 v2Catalog.asInstanceOf[TableCatalog]
   .loadTable(Identifier.of(Array.empty, nameParts.last))
   }
+
+  test("SPARK-30697: catalog.isView doesn't throw an error for specialized identifiers") {
+    val t1 = "tbl"
+    withTable(t1) {
+      sql(s"CREATE TABLE $t1 (id bigint, data string) USING $v2Format")
+
+      def idResolver(id: Identifier): Identifier = Identifier.of(Array.empty, id.name())
+
+      InMemoryTableSessionCatalog.withCustomIdentifierResolver(idResolver) {
+        // The following should not

[spark] branch master updated (fb321b6 -> 2eccfd8)

2020-02-02 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from fb321b6  [MINOR][SPARKR][DOCS] Remove duplicate @name tags from read.df and read.stream
 add 2eccfd8  [SPARK-30697][SQL] Handle database and namespace exceptions in catalog.isView

No new revisions were added by this update.

Summary of changes:
 .../sql/catalyst/catalog/SessionCatalog.scala  |  2 ++
 .../DataSourceV2DataFrameSessionCatalogSuite.scala | 22 ++
 .../DataSourceV2SQLSessionCatalogSuite.scala   | 14 ++
 3 files changed, 38 insertions(+)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (1adf352 -> fb321b6)

2020-02-02 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1adf352  [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest
 add fb321b6  [MINOR][SPARKR][DOCS] Remove duplicate @name tags from read.df and read.stream

No new revisions were added by this update.

Summary of changes:
 R/pkg/R/SQLContext.R | 2 --
 1 file changed, 2 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org




svn commit: r37835 - in /dev/spark/v2.4.5-rc2-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2020-02-02 Thread dongjoon
Author: dongjoon
Date: Sun Feb  2 20:53:31 2020
New Revision: 37835

Log:
Apache Spark v2.4.5-rc2 docs


[This commit notification would consist of 1459 parts, which exceeds the limit of 50 ones, so it was shortened to the summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r37833 - /dev/spark/v2.4.5-rc2-bin/

2020-02-02 Thread dongjoon
Author: dongjoon
Date: Sun Feb  2 20:27:00 2020
New Revision: 37833

Log:
Apache Spark v2.4.5-rc2

Added:
dev/spark/v2.4.5-rc2-bin/
dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz   (with props)
dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.asc
dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.sha512
dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz   (with props)
dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz.asc
dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz.sha512
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.6.tgz.asc
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.6.tgz.sha512
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.7.tgz.asc
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-hadoop2.7.tgz.sha512
    dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop-scala-2.12.tgz   (with props)
    dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop-scala-2.12.tgz.asc
    dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop-scala-2.12.tgz.sha512
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop.tgz   (with props)
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop.tgz.asc
dev/spark/v2.4.5-rc2-bin/spark-2.4.5-bin-without-hadoop.tgz.sha512
dev/spark/v2.4.5-rc2-bin/spark-2.4.5.tgz   (with props)
dev/spark/v2.4.5-rc2-bin/spark-2.4.5.tgz.asc
dev/spark/v2.4.5-rc2-bin/spark-2.4.5.tgz.sha512

Added: dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.asc
==
--- dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.asc (added)
+++ dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.asc Sun Feb  2 20:27:00 2020
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJeNyqPAAoJEO2gDOg08PxcNocP/R7s0Cs5G7n8NMjwQmMel9m1
+IUBfPOz3f/qesjX5+q48jvoB2se99g2OTbeaWXE+rS6BmekGevrlaovRXvJ3tuH2
+kHwMJjvc6H3xS4oEEohlqAV18Sn0Up6NidaeufTXzIi+hP4EYdzpH+zeFOOYSfJj
++lAysESeAXIcTCf5ITFpNjaQHH/pfpxgSfgO7CBaN83CXjdn9dnUZNyXvS4e5eOY
+PXeuhSO/CRWK6yzus17WP1v15okVsOmL9rxssAv+SMFPHBOhY8Cyfpnddj8U7pIx
+3XWqE5HFerMJ3F0bT0QG0GKbS5JPw7N/zoDUzyz4gK3EJGnSehtiuOV6uPtBVRCl
+4EESjGtt6z/Znm3OQEaGVBc3Bh2IisfKgqNYT8oBdm30AtgXieaxMNSv/fxTMcsZ
+8utneIlpZ0znf3LOkJvs6wrJ6SX6HRnRJ7F0zrXj+6+R0t3XEOQ2+MQNzvG0Heur
+0fgJ8ldfM8M0qDBwtjgMAYwhbikDbDh0HXvwBpWwjdbqZz3LjGZB9qhu7WQXzdwv
+POHVGsoGyMOfyNu39F/lEUVU5i6aze2vXQvJbu2NGkjIVF4F7gSJ7owJDWSGKzu+
+umbJAwOcFLmg1RbNpLH8u938sgUUlh/adVDqajyaT6OQ26LFzCIloAn7V94ldzKD
+ikrKLXZW3hqndUv1C47+
+=QUGr
+-END PGP SIGNATURE-

Added: dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.sha512
==
--- dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.sha512 (added)
+++ dev/spark/v2.4.5-rc2-bin/SparkR_2.4.5.tar.gz.sha512 Sun Feb  2 20:27:00 2020
@@ -0,0 +1,3 @@
+SparkR_2.4.5.tar.gz: 6C26F6C4 5914A67A 6CC25A48 6197618A BCFF91FD 0DD820B3
+ FDFCDBFA 46C1121A 262763EC E8B18F5C 85E0B02B FB2BBA76
+ 8F0BE5E6 84E47A89 9A79617E 46BCA1AA

Added: dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz.asc
==
--- dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz.asc (added)
+++ dev/spark/v2.4.5-rc2-bin/pyspark-2.4.5.tar.gz.asc Sun Feb  2 20:27:00 2020
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJeNydkAAoJEO2gDOg08PxcgYEQAJKy44tsQM2Aqm0w+SIlcjc6
+OIJU/IbfETyMwiZJYPuaOG/+Es3eiRD8ZPCbPuEOmoUZlV6wBQtXdcxtVTvj6Dg3
+0GArLDkZkA/vtGsrctCq/YZXMweJG3Gmn8cFJClJqRQWJ/3cczN4MQ2AH2tS2q3/
+tcV20SJ+V7A1WVZxJk08Zu6lXqoQzOR4N1NiBsoWPHRqA/soLvIDoznq9XfcM6uu
+AE2ZwI87FD4KRYlPDgudHuZ+OZI7Bpmz5dtnVIm4Xgds4ghM/fLhqQvI1506D4xQ
+JaTLqHOOV3f+/Saf2eQKzWN841GJRyUIn2h5ukKxPWIRj3Vfl9PXAIAt4gSB1bZk
+oStLuwM2n9kRAQDZzR6vXam0+M3oQXCvx7bWlQPCsWzhi1jcNNR8t9cyMx/R2Hcj
+fFpof1/TbACjY5sT/8EZ8QH6mqkPd5kYD4V/hQae56S+rwEBgtd1ceMbdMRTZKmh
+65HH91x1ZtMdzWbsIoW9V0SS1n3iIGgyKZ2TtxapxIHGLor52BY8XMMhQmIgu7/S
+rTibjsI/jYfKRz1XEuyWu/4F8pY27uPQUtZ+DF1T2G/231lMFCwqaAoiFsB4VWq5
+Ia0zQCMWgsomBdVCYk7350/HK7VB8UK7f/AsBw5yxQ9vqJBtjKVBY1D3w4aJrs+i
+8cDflvQfFnkQc+elKMZX
+=9Uyn
+-END PGP SIGNATURE-

Added: 

[spark] 01/01: Preparing Spark release v2.4.5-rc2

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to tag v2.4.5-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git

commit cee4ecbb16917fa85f02c635925e2687400aa56b
Author: Dongjoon Hyun 
AuthorDate: Sun Feb 2 19:23:09 2020 +

Preparing Spark release v2.4.5-rc2
---
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/flume-assembly/pom.xml| 2 +-
 external/flume-sink/pom.xml| 2 +-
 external/flume/pom.xml | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kafka-0-8-assembly/pom.xml| 2 +-
 external/kafka-0-8/pom.xml | 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 43 files changed, 44 insertions(+), 44 deletions(-)

diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index fffdcd2..57a5d84 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.4.6
+Version: 2.4.5
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' 
.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 9aa868c..bac854c 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.6-SNAPSHOT
+2.4.5
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 3593374..256b9ea 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.6-SNAPSHOT
+2.4.5
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 0aadd49..3be5244 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.6-SNAPSHOT
+2.4.5
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index d862cf8..7132b53 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.6-SNAPSHOT
+2.4.5
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 011bb49..8d478b8 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.6-SNAPSHOT
+2.4.5
 ../../pom.xml
   
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index 02b9ce7..a7e1fae 

[spark] branch branch-2.4 updated (cb4a736 -> 9bf11ed)

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git.


from cb4a736  [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest
 add cee4ecb  Preparing Spark release v2.4.5-rc2
 new 9bf11ed  Preparing development version 2.4.6-SNAPSHOT

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] tag v2.4.5-rc2 created (now cee4ecb)

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to tag v2.4.5-rc2
in repository https://gitbox.apache.org/repos/asf/spark.git.


  at cee4ecb  (commit)
This tag includes the following new commits:

 new cee4ecb  Preparing Spark release v2.4.5-rc2

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] 01/01: Preparing development version 2.4.6-SNAPSHOT

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git

commit 9bf11edf7927f41aa0dc8073c460c7dab2af72cd
Author: Dongjoon Hyun 
AuthorDate: Sun Feb 2 19:23:14 2020 +

Preparing development version 2.4.6-SNAPSHOT
---
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/flume-assembly/pom.xml| 2 +-
 external/flume-sink/pom.xml| 2 +-
 external/flume/pom.xml | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kafka-0-8-assembly/pom.xml| 2 +-
 external/kafka-0-8/pom.xml | 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 python/pyspark/version.py  | 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 43 files changed, 44 insertions(+), 44 deletions(-)

diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 57a5d84..fffdcd2 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.4.5
+Version: 2.4.6
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' 
.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
diff --git a/assembly/pom.xml b/assembly/pom.xml
index bac854c..9aa868c 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.5
+2.4.6-SNAPSHOT
 ../pom.xml
   
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 256b9ea..3593374 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.5
+2.4.6-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 3be5244..0aadd49 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.5
+2.4.6-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 7132b53..d862cf8 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.5
+2.4.6-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 8d478b8..011bb49 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.4.5
+2.4.6-SNAPSHOT
 ../../pom.xml
   
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index 

[spark] branch branch-2.4 updated: [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest
2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new cb4a736  [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest
cb4a736 is described below

commit cb4a736e66d72c29f44e98e19ad0b72343108ef1
Author: Dongjoon Hyun 
AuthorDate: Sun Feb 2 00:44:25 2020 -0800

[SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest

This PR aims to pin the version of `jekyll-redirect-from` to 0.15.0. This is a release blocker for both Apache Spark 3.0.0 and 2.4.5.

`jekyll-redirect-from` released 0.16.0 a few days ago, and it requires Ruby >= 2.4.0.
- https://github.com/jekyll/jekyll-redirect-from/releases/tag/v0.16.0
```
$ cd dev/create-release/spark-rm/
$ docker build -t spark:test .
...
ERROR:  Error installing jekyll-redirect-from:
jekyll-redirect-from requires Ruby version >= 2.4.0.
...
```
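For context, the failure above is a RubyGems `required_ruby_version` check: 0.16.0 of the gem declares it needs Ruby >= 2.4.0, while the release image ships Ruby 2.3, and the fix pins the dependency to the last release without that constraint. A minimal sketch of both checks using the stock `rubygems` API (version strings are taken from the log above; variable names are illustrative):

```ruby
require 'rubygems'

# The gemspec constraint that makes `gem install` abort under Ruby 2.3.
gemspec_requirement = Gem::Requirement.new('>= 2.4.0')
image_ruby          = Gem::Version.new('2.3.0')
puts gemspec_requirement.satisfied_by?(image_ruby)  # => false, so the install fails

# Pinning with `-v 0.15.0` is equivalent to an exact `= 0.15.0` requirement,
# so resolution never considers the incompatible 0.16.0 release.
pin = Gem::Dependency.new('jekyll-redirect-from', '0.15.0')
puts pin.requirement.satisfied_by?(Gem::Version.new('0.16.0'))  # => false
puts pin.requirement.satisfied_by?(Gem::Version.new('0.15.0'))  # => true
```

Because the pin is exact, the build stays reproducible until someone deliberately bumps the version.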

No.

Manually run the above command to build the `spark-rm` Docker image.
```
...
Successfully installed jekyll-redirect-from-0.15.0
Parsing documentation for jekyll-redirect-from-0.15.0
Installing ri documentation for jekyll-redirect-from-0.15.0
Done installing documentation for jekyll-redirect-from after 0 seconds
1 gem installed
Successfully installed rouge-3.15.0
Parsing documentation for rouge-3.15.0
Installing ri documentation for rouge-3.15.0
Done installing documentation for rouge after 4 seconds
1 gem installed
Removing intermediate container e0ec7c77b69f
 ---> 32dec37291c6
```

Closes #27434 from dongjoon-hyun/SPARK-30704.

Authored-by: Dongjoon Hyun 
Signed-off-by: Dongjoon Hyun 
(cherry picked from commit 1adf3520e3c753e6df8dccb752e8239de682a09a)
Signed-off-by: Dongjoon Hyun 
---
 dev/create-release/spark-rm/Dockerfile | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
index 6104a03..992961f 100644
--- a/dev/create-release/spark-rm/Dockerfile
+++ b/dev/create-release/spark-rm/Dockerfile
@@ -77,7 +77,7 @@ RUN apt-get clean && apt-get update && $APT_INSTALL gnupg ca-certificates apt-tr
   # Install tools needed to build the documentation.
   $APT_INSTALL ruby2.3 ruby2.3-dev mkdocs && \
   gem install jekyll --no-rdoc --no-ri -v 3.8.6 && \
-  gem install jekyll-redirect-from && \
+  gem install jekyll-redirect-from -v 0.15.0 && \
   gem install pygments.rb
 
 WORKDIR /opt/spark-rm/output


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (cd5f03a -> 1adf352)

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from cd5f03a  [SPARK-27686][DOC][SQL] Update migration guide for make Hive 2.3 dependency by default
 add 1adf352  [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest

No new revisions were added by this update.

Summary of changes:
 dev/create-release/spark-rm/Dockerfile | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)





[spark] branch branch-3.0 updated: [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest

2020-02-02 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 2f1fb4c  [SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest
2f1fb4c is described below

commit 2f1fb4c01d0d4bfda17b3262e6f586f4f1a25bac
Author: Dongjoon Hyun 
AuthorDate: Sun Feb 2 00:44:25 2020 -0800

[SPARK-30704][INFRA] Use jekyll-redirect-from 0.15.0 instead of the latest

### What changes were proposed in this pull request?

This PR aims to pin the version of `jekyll-redirect-from` to 0.15.0. This is a release blocker for both Apache Spark 3.0.0 and 2.4.5.

### Why are the changes needed?

`jekyll-redirect-from` released 0.16.0 a few days ago, and it requires Ruby >= 2.4.0.
- https://github.com/jekyll/jekyll-redirect-from/releases/tag/v0.16.0
```
$ cd dev/create-release/spark-rm/
$ docker build -t spark:test .
...
ERROR:  Error installing jekyll-redirect-from:
jekyll-redirect-from requires Ruby version >= 2.4.0.
...
```

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Manually run the above command to build the `spark-rm` Docker image.
```
...
Successfully installed jekyll-redirect-from-0.15.0
Parsing documentation for jekyll-redirect-from-0.15.0
Installing ri documentation for jekyll-redirect-from-0.15.0
Done installing documentation for jekyll-redirect-from after 0 seconds
1 gem installed
Successfully installed rouge-3.15.0
Parsing documentation for rouge-3.15.0
Installing ri documentation for rouge-3.15.0
Done installing documentation for rouge after 4 seconds
1 gem installed
Removing intermediate container e0ec7c77b69f
 ---> 32dec37291c6
```

Closes #27434 from dongjoon-hyun/SPARK-30704.

Authored-by: Dongjoon Hyun 
Signed-off-by: Dongjoon Hyun 
(cherry picked from commit 1adf3520e3c753e6df8dccb752e8239de682a09a)
Signed-off-by: Dongjoon Hyun 
---
 dev/create-release/spark-rm/Dockerfile | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
index 3ba8e97..6345168 100644
--- a/dev/create-release/spark-rm/Dockerfile
+++ b/dev/create-release/spark-rm/Dockerfile
@@ -78,7 +78,7 @@ RUN apt-get clean && apt-get update && $APT_INSTALL gnupg ca-certificates && \
   # Install tools needed to build the documentation.
   $APT_INSTALL ruby2.3 ruby2.3-dev mkdocs && \
   gem install jekyll --no-rdoc --no-ri -v 3.8.6 && \
-  gem install jekyll-redirect-from && \
+  gem install jekyll-redirect-from -v 0.15.0 && \
   gem install rouge
 
 WORKDIR /opt/spark-rm/output

