svn commit: r29428 - in /dev/spark/2.4.1-SNAPSHOT-2018_09_16_22_02-fb1539a-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s
Author: pwendell
Date: Mon Sep 17 05:17:03 2018
New Revision: 29428

Log:
Apache Spark 2.4.1-SNAPSHOT-2018_09_16_22_02-fb1539a docs

[This commit notification would consist of 1474 parts, which exceeds the limit of 50, so it was shortened to the summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
spark git commit: [SPARK-22713][CORE][TEST][FOLLOWUP] Fix flaky ExternalAppendOnlyMapSuite due to timeout
Repository: spark
Updated Branches:
  refs/heads/branch-2.4 1cb1e4301 -> fb1539ad8

[SPARK-22713][CORE][TEST][FOLLOWUP] Fix flaky ExternalAppendOnlyMapSuite due to timeout

## What changes were proposed in this pull request?

SPARK-22713 uses [`eventually` with the default timeout `150ms`](https://github.com/apache/spark/pull/21369/files#diff-5bbb6a931b7e4d6a31e4938f51935682R462). This causes flakiness because, when GC is slow, the block is attempted only once before the timeout expires.

```scala
eventually {
  System.gc()
  ...
}
```

**Failures**

```
org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 1 times over 501.22261 milliseconds. Last failure message: tmpIsNull was false.
```

- spark-master-test-sbt-hadoop-2.7: [4916](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4916), [4907](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4907), [4906](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4906)
- spark-master-test-sbt-hadoop-2.6: [4979](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4979), [4974](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4974), [4967](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4967), [4966](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4966)

## How was this patch tested?

Passed Jenkins.

Closes #22432 from dongjoon-hyun/SPARK-22713.
Authored-by: Dongjoon Hyun
Signed-off-by: Wenchen Fan
(cherry picked from commit 538e0478783160d8fab2dc76fd8fc7b469cb4e19)
Signed-off-by: Wenchen Fan

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/fb1539ad
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/fb1539ad
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/fb1539ad

Branch: refs/heads/branch-2.4
Commit: fb1539ad876d0878dde56258af53399dfdf706eb
Parents: 1cb1e43
Author: Dongjoon Hyun
Authored: Mon Sep 17 11:07:51 2018 +0800
Committer: Wenchen Fan
Committed: Mon Sep 17 11:08:46 2018 +0800
--
 .../apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--

http://git-wip-us.apache.org/repos/asf/spark/blob/fb1539ad/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
--
diff --git a/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala b/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
index 8a2f2ff..cd25265 100644
--- a/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
+++ b/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
@@ -18,6 +18,7 @@ package org.apache.spark.util.collection

 import scala.collection.mutable.ArrayBuffer
+import scala.concurrent.duration._
 import scala.ref.WeakReference

 import org.scalatest.Matchers
@@ -457,7 +458,7 @@ class ExternalAppendOnlyMapSuite extends SparkFunSuite
     // https://github.com/scala/scala/blob/2.13.x/test/junit/scala/tools/testing/AssertUtil.scala
     // (lines 69-89)
     // assert(map.currentMap == null)
-    eventually {
+    eventually(timeout(5 seconds), interval(200 milliseconds)) {
       System.gc()
       // direct asserts introduced some macro generated code that held a reference to the map
       val tmpIsNull = null == underlyingMapRef.get.orNull
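The patch widens ScalaTest's patience from the default (150 ms timeout, 15 ms interval) to `timeout(5 seconds), interval(200 milliseconds)`. As a rough illustration of why that matters (a hypothetical minimal retry loop with made-up names, not Spark or ScalaTest code), the following sketch shows that a short budget can allow only a single attempt when each pass through the block is slow, while a long timeout with a coarser interval permits many retries:

```scala
import scala.concurrent.duration._
import scala.util.control.NonFatal

object EventuallySketch {
  // Hypothetical stand-in for ScalaTest's `eventually`: retry `block` until it
  // stops throwing or the timeout elapses. Names and defaults are illustrative.
  def eventually[T](timeout: FiniteDuration = 150.millis,
                    interval: FiniteDuration = 15.millis)(block: => T): T = {
    val deadline = timeout.fromNow
    var attempts = 0
    while (true) {
      attempts += 1
      try {
        return block
      } catch {
        case NonFatal(e) =>
          if (deadline.isOverdue()) {
            // mirrors TestFailedDueToTimeoutException's "never returned normally"
            throw new AssertionError(
              s"The code passed to eventually never returned normally. Attempted $attempts times.", e)
          }
          Thread.sleep(interval.toMillis)
      }
    }
    sys.error("unreachable")
  }

  def main(args: Array[String]): Unit = {
    // A condition that only becomes true on the third check, standing in for a
    // WeakReference that is cleared only after a few GC cycles.
    var checks = 0
    val seen = eventually(timeout = 5.seconds, interval = 200.millis) {
      checks += 1
      require(checks >= 3, "tmpIsNull was false")
      checks
    }
    println(s"returned normally after $seen attempts")
  }
}
```

With the old 150 ms budget and a slow `System.gc()`, the first failed attempt already exhausts the deadline, matching the "Attempted 1 times" failures above; ScalaTest's real `eventually` (from `org.scalatest.concurrent.Eventually`) behaves the same way but wraps the failure in `TestFailedDueToTimeoutException`.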
spark git commit: [SPARK-22713][CORE][TEST][FOLLOWUP] Fix flaky ExternalAppendOnlyMapSuite due to timeout
Repository: spark
Updated Branches:
  refs/heads/master a1dd78255 -> 538e04787

[SPARK-22713][CORE][TEST][FOLLOWUP] Fix flaky ExternalAppendOnlyMapSuite due to timeout

## What changes were proposed in this pull request?

SPARK-22713 uses [`eventually` with the default timeout `150ms`](https://github.com/apache/spark/pull/21369/files#diff-5bbb6a931b7e4d6a31e4938f51935682R462). This causes flakiness because, when GC is slow, the block is attempted only once before the timeout expires.

```scala
eventually {
  System.gc()
  ...
}
```

**Failures**

```
org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 1 times over 501.22261 milliseconds. Last failure message: tmpIsNull was false.
```

- spark-master-test-sbt-hadoop-2.7: [4916](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4916), [4907](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4907), [4906](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/4906)
- spark-master-test-sbt-hadoop-2.6: [4979](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4979), [4974](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4974), [4967](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4967), [4966](https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/4966)

## How was this patch tested?

Passed Jenkins.

Closes #22432 from dongjoon-hyun/SPARK-22713.
Authored-by: Dongjoon Hyun
Signed-off-by: Wenchen Fan

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/538e0478
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/538e0478
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/538e0478

Branch: refs/heads/master
Commit: 538e0478783160d8fab2dc76fd8fc7b469cb4e19
Parents: a1dd782
Author: Dongjoon Hyun
Authored: Mon Sep 17 11:07:51 2018 +0800
Committer: Wenchen Fan
Committed: Mon Sep 17 11:07:51 2018 +0800
--
 .../apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--

http://git-wip-us.apache.org/repos/asf/spark/blob/538e0478/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
--
diff --git a/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala b/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
index 8a2f2ff..cd25265 100644
--- a/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
+++ b/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
@@ -18,6 +18,7 @@ package org.apache.spark.util.collection

 import scala.collection.mutable.ArrayBuffer
+import scala.concurrent.duration._
 import scala.ref.WeakReference

 import org.scalatest.Matchers
@@ -457,7 +458,7 @@ class ExternalAppendOnlyMapSuite extends SparkFunSuite
     // https://github.com/scala/scala/blob/2.13.x/test/junit/scala/tools/testing/AssertUtil.scala
     // (lines 69-89)
     // assert(map.currentMap == null)
-    eventually {
+    eventually(timeout(5 seconds), interval(200 milliseconds)) {
       System.gc()
       // direct asserts introduced some macro generated code that held a reference to the map
       val tmpIsNull = null == underlyingMapRef.get.orNull
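The condition being retried in the suite checks that the spilled map has become unreachable: only a `scala.ref.WeakReference` to it is kept, `System.gc()` is requested inside the loop, and a plain comparison (rather than a test-framework `assert`, whose macro-generated code could itself retain the map) checks that the referent is cleared. A standalone sketch of that pattern, with illustrative names rather than the suite's code:

```scala
import scala.ref.WeakReference

object WeakRefSketch {
  def main(args: Array[String]): Unit = {
    var payload = new Array[Byte](1 << 20) // the object we expect to be collected
    val ref = WeakReference(payload)       // weak reference does not keep it alive
    payload = null                         // drop the only strong reference

    // Poll for collection instead of asserting after a single gc() request:
    // System.gc() is only a hint, so one call may not clear the referent.
    val deadlineNanos = System.nanoTime() + 5L * 1000000000L
    var cleared = false
    while (!cleared && System.nanoTime() < deadlineNanos) {
      System.gc()
      // plain comparison, not a macro-based assert, so nothing captures the map
      cleared = ref.get.isEmpty
      if (!cleared) Thread.sleep(200)
    }
    println(if (cleared) "referent collected" else "referent still reachable")
  }
}
```

Because collection is at the JVM's discretion, the poll-until-deadline shape is what makes the check reliable; that is exactly the role `eventually(timeout(...), interval(...))` plays in the patched test.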
svn commit: r29425 - in /dev/spark/v2.4.0-rc1-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark
Author: wenchen
Date: Mon Sep 17 01:11:13 2018
New Revision: 29425

Log:
Apache Spark v2.4.0-rc1 docs

[This commit notification would consist of 1477 parts, which exceeds the limit of 50, so it was shortened to the summary.]
svn commit: r29424 - in /dev/spark/2.5.0-SNAPSHOT-2018_09_16_16_02-a1dd782-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s
Author: pwendell
Date: Sun Sep 16 23:16:35 2018
New Revision: 29424

Log:
Apache Spark 2.5.0-SNAPSHOT-2018_09_16_16_02-a1dd782 docs

[This commit notification would consist of 1483 parts, which exceeds the limit of 50, so it was shortened to the summary.]
svn commit: r29423 - in /dev/spark/2.4.1-SNAPSHOT-2018_09_16_14_02-1cb1e43-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s
Author: pwendell
Date: Sun Sep 16 21:17:06 2018
New Revision: 29423

Log:
Apache Spark 2.4.1-SNAPSHOT-2018_09_16_14_02-1cb1e43 docs

[This commit notification would consist of 1474 parts, which exceeds the limit of 50, so it was shortened to the summary.]
spark-website git commit: Added TransmogrifAI to Applications Using Spark
Repository: spark-website
Updated Branches:
  refs/heads/asf-site 2f282fa35 -> 370d99580

Added TransmogrifAI to Applications Using Spark

Hello, we [open sourced the TransmogrifAI library](https://medium.com/snabar/4e5d0e098da2) just a few weeks ago and I thought it would be great to have it on the Apache Spark page. Thanks in advance.

Author: Matthew Tovbin

Closes #144 from tovbinm/patch-1.

Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/370d9958
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/370d9958
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/370d9958

Branch: refs/heads/asf-site
Commit: 370d99580fdb96dbc91fd8201419a29aa7f9f425
Parents: 2f282fa
Author: Matthew Tovbin
Authored: Sun Sep 16 15:24:11 2018 -0500
Committer: Sean Owen
Committed: Sun Sep 16 15:24:11 2018 -0500
--
 site/third-party-projects.html | 1 +
 third-party-projects.md        | 1 +
 2 files changed, 2 insertions(+)
--

http://git-wip-us.apache.org/repos/asf/spark-website/blob/370d9958/site/third-party-projects.html
--
diff --git a/site/third-party-projects.html b/site/third-party-projects.html
index 36259b7..c4caf1c 100644
--- a/site/third-party-projects.html
+++ b/site/third-party-projects.html
@@ -260,6 +260,7 @@ implementation for Spark
 Apache Kafka for real-time large scale machine learning
 <a href="https://github.com/bigdatagenomics/adam">ADAM</a> - A framework and CLI for loading, transforming, and analyzing genomic data using Apache Spark
+<a href="https://github.com/salesforce/TransmogrifAI">TransmogrifAI</a> - AutoML library for building modular, reusable, strongly typed machine learning workflows on Spark with minimal hand tuning

 Additional Language Bindings

http://git-wip-us.apache.org/repos/asf/spark-website/blob/370d9958/third-party-projects.md
--
diff --git a/third-party-projects.md b/third-party-projects.md
index d0210cd..d965bce 100644
--- a/third-party-projects.md
+++ b/third-party-projects.md
@@ -64,6 +64,7 @@ implementation for Spark
 Apache Kafka for real-time large scale machine learning
 - <a href="https://github.com/bigdatagenomics/adam">ADAM</a> - A framework and CLI for loading, transforming, and analyzing genomic data using Apache Spark
+- <a href="https://github.com/salesforce/TransmogrifAI">TransmogrifAI</a> - AutoML library for building modular, reusable, strongly typed machine learning workflows on Spark with minimal hand tuning

 Additional Language Bindings
spark git commit: [MINOR][DOCS] Axe deprecated doc refs
Repository: spark
Updated Branches:
  refs/heads/branch-2.4 60af706b4 -> 1cb1e4301

[MINOR][DOCS] Axe deprecated doc refs

Continuation of #22370. Summary of discussion there: there is some inconsistency in the R manual w.r.t. superseding functions linking back to deprecated functions.

- `createOrReplaceTempView` and `createTable` both link back to functions which are deprecated (`registerTempTable` and `createExternalTable`, respectively)
- `sparkR.session` and `dropTempView` do _not_ link back to deprecated functions

This PR takes the view that it is preferable _not_ to link back to deprecated functions, and removes these references from `?createOrReplaceTempView` and `?createTable`. As `registerTempTable` was included in the `SparkDataFrame functions` `family` of functions, other documentation pages which included a link to `?registerTempTable` will similarly be altered.

Author: Michael Chirico

Closes #22393 from MichaelChirico/axe_deprecated_doc_refs.

(cherry picked from commit a1dd78255a3ae023820b2f245cd39f0c57a32fb1)
Signed-off-by: Felix Cheung

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/1cb1e430
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/1cb1e430
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/1cb1e430

Branch: refs/heads/branch-2.4
Commit: 1cb1e43012e57e649d77524f8ff2de231f52c66a
Parents: 60af706
Author: Michael Chirico
Authored: Sun Sep 16 12:57:44 2018 -0700
Committer: Felix Cheung
Committed: Sun Sep 16 12:58:04 2018 -0700
--
 R/pkg/R/DataFrame.R | 1 -
 R/pkg/R/catalog.R   | 1 -
 2 files changed, 2 deletions(-)
--

http://git-wip-us.apache.org/repos/asf/spark/blob/1cb1e430/R/pkg/R/DataFrame.R
--
diff --git a/R/pkg/R/DataFrame.R b/R/pkg/R/DataFrame.R
index 4f2d4c7..458deca 100644
--- a/R/pkg/R/DataFrame.R
+++ b/R/pkg/R/DataFrame.R
@@ -503,7 +503,6 @@ setMethod("createOrReplaceTempView",
 #' @param x A SparkDataFrame
 #' @param tableName A character vector containing the name of the table
 #'
-#' @family SparkDataFrame functions
 #' @seealso \link{createOrReplaceTempView}
 #' @rdname registerTempTable-deprecated
 #' @name registerTempTable

http://git-wip-us.apache.org/repos/asf/spark/blob/1cb1e430/R/pkg/R/catalog.R
--
diff --git a/R/pkg/R/catalog.R b/R/pkg/R/catalog.R
index baf4d86..c2d0fc3 100644
--- a/R/pkg/R/catalog.R
+++ b/R/pkg/R/catalog.R
@@ -69,7 +69,6 @@ createExternalTable <- function(x, ...) {
 #' @param ... additional named parameters as options for the data source.
 #' @return A SparkDataFrame.
 #' @rdname createTable
-#' @seealso \link{createExternalTable}
 #' @examples
 #'\dontrun{
 #' sparkR.session()
spark git commit: [MINOR][DOCS] Axe deprecated doc refs
Repository: spark
Updated Branches:
  refs/heads/master bfcf74260 -> a1dd78255

[MINOR][DOCS] Axe deprecated doc refs

Continuation of #22370. Summary of discussion there: there is some inconsistency in the R manual w.r.t. superseding functions linking back to deprecated functions.

- `createOrReplaceTempView` and `createTable` both link back to functions which are deprecated (`registerTempTable` and `createExternalTable`, respectively)
- `sparkR.session` and `dropTempView` do _not_ link back to deprecated functions

This PR takes the view that it is preferable _not_ to link back to deprecated functions, and removes these references from `?createOrReplaceTempView` and `?createTable`. As `registerTempTable` was included in the `SparkDataFrame functions` `family` of functions, other documentation pages which included a link to `?registerTempTable` will similarly be altered.

Author: Michael Chirico

Closes #22393 from MichaelChirico/axe_deprecated_doc_refs.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a1dd7825
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a1dd7825
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a1dd7825

Branch: refs/heads/master
Commit: a1dd78255a3ae023820b2f245cd39f0c57a32fb1
Parents: bfcf742
Author: Michael Chirico
Authored: Sun Sep 16 12:57:44 2018 -0700
Committer: Felix Cheung
Committed: Sun Sep 16 12:57:44 2018 -0700
--
 R/pkg/R/DataFrame.R | 1 -
 R/pkg/R/catalog.R   | 1 -
 2 files changed, 2 deletions(-)
--

http://git-wip-us.apache.org/repos/asf/spark/blob/a1dd7825/R/pkg/R/DataFrame.R
--
diff --git a/R/pkg/R/DataFrame.R b/R/pkg/R/DataFrame.R
index 4f2d4c7..458deca 100644
--- a/R/pkg/R/DataFrame.R
+++ b/R/pkg/R/DataFrame.R
@@ -503,7 +503,6 @@ setMethod("createOrReplaceTempView",
 #' @param x A SparkDataFrame
 #' @param tableName A character vector containing the name of the table
 #'
-#' @family SparkDataFrame functions
 #' @seealso \link{createOrReplaceTempView}
 #' @rdname registerTempTable-deprecated
 #' @name registerTempTable

http://git-wip-us.apache.org/repos/asf/spark/blob/a1dd7825/R/pkg/R/catalog.R
--
diff --git a/R/pkg/R/catalog.R b/R/pkg/R/catalog.R
index baf4d86..c2d0fc3 100644
--- a/R/pkg/R/catalog.R
+++ b/R/pkg/R/catalog.R
@@ -69,7 +69,6 @@ createExternalTable <- function(x, ...) {
 #' @param ... additional named parameters as options for the data source.
 #' @return A SparkDataFrame.
 #' @rdname createTable
-#' @seealso \link{createExternalTable}
 #' @examples
 #'\dontrun{
 #' sparkR.session()
svn commit: r29421 - /dev/spark/v2.3.2-rc6-bin/
Author: jshao
Date: Sun Sep 16 13:30:43 2018
New Revision: 29421

Log:
Apache Spark v2.3.2-rc6

Added:
    dev/spark/v2.3.2-rc6-bin/
    dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz   (with props)
    dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc
    dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512
    dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz   (with props)
    dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc
    dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz   (with props)
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz.asc
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz.sha512
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz   (with props)
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz.asc
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz.sha512
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz   (with props)
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz.asc
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz.sha512
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz   (with props)
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz.asc
    dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz.sha512

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz
==
Binary file - no diff available.
Propchange: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz
--
    svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc Sun Sep 16 13:30:43 2018
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIcBAABCgAGBQJbnkkHAAoJENsLIaASlz/QGnMP/jKJ2zrZbpjd/ladRk5c6i3h
+DalLIk6+ZnSimBUvH+aZFqM2Xam41KlKJkgrXUS4wVOoHfcu0HxkwkpqhC0E/cTY
+KUTZ2Y2rFm7IVFUtwfwlqdR77v/4MEE0tMkOAxy8ZAumyKV5AAG+1OQ0k+X4q+E9
+Q6E8WicEhzr6Pi+9bOSJuHZE0LP1Vpou7Q9JhRQQC/cT1VbZu7+AeJ3RoiQLV6gp
+uigSK73pMDIPlaHpqyTJAvy9VVyF7DseACTDOGon/FOXMNXg2UZcQ00cViJ5Ykxd
+i/jFrFa3X79hedlLfC9RMI191G5DzePtnh+grqQxk80EK3xizx+Y1ptir7RRuO9V
+KWslgAI7cLxpJ6v8tvpWzqfheUD0HGoZ8JhSXsG02X0/v4ZNIIrzGF8eEKZvc5AW
+NTAHD7ws9myeghp4pcOiZuw64obBG7QIkMHe9a62ZdyfqZjkdpA2BiEhqFi0dI89
+lLf2bjmoz97Y5YuFrjix6XP4057xGUSFGnZuOWsfvjtg6dbTEYaIZxLqcplu6esD
+gBLk4Ct0pXH7wcv4aWEtby20Wq6YGR7GKCIpEnOtXIPkKdPi4iuCIyWZy9WXjwZY
+wJ4z2locysS5bgDahsdNSLQEN9UbxkPqi7GIpGPVvNrR97HXcumOOsmQeaWS2Xx4
+YsZoVDmqlgBu/oyW5Bw1
+=OlcR
+-----END PGP SIGNATURE-----

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512 Sun Sep 16 13:30:43 2018
@@ -0,0 +1,3 @@
+SparkR_2.3.2.tar.gz: BE4B6B28 DC3CB5FC 947E7B21 79AED9DD 55573A05 D0DEBB53
+                     86864B05 C02F32B4 FB997E7A 9643BA61 6BC495E1 A2FE03D9
+                     AE2D2DC2 4D43A48C 39498238 7259F58D

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz
==
Binary file - no diff available.
Propchange: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz
--
    svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc Sun Sep 16 13:30:43 2018
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIcBAABCgAGBQJbnkXVAAoJENsLIaASlz/QaAsP/3MdsTgoz9cfqQTleKT2Kw6M
+P6rzaiFoTq9tlWlBoeWSmqR42TilQzPGSLerzSGNMuEIpdpzENc/aopqd/1vU2qf
+ghmmfGtyCn1Mj2wLHRAIEseaXCViZPOmiH6YpmcUziY7aybNtB0g9aZt/9M9N2ts
+BnCU06zk0esBYkZmnw4f/WYG32v7WQN7Lb/IewgoguhpGKRa0ypad56r24y2Qf0N
+Us1GUfQzu5XXTr+CJI9zukJudLCNnOdIlnUoSv25pePxWodNRw+49ixG+qQvxkvt
+WGsb/lWJh3tTvPeZFJcB5Yg2lU5YWKck0a6WNhIRSlbJgzizhEyQs9YrF3HBtlgC
+bAT6GEjcnwCXxdgUZKUnd0P3POK85Dd1XFxVj+yWwIjKBvdFlqlE50eAgPuKZMZ+
+aptQ3+XPakoukKFA07moywE38yQZrYpULGLn5V4W04PS1g/3DOm0pAvshJuA58Sf
+z76gMJGthcYgL2RmXGJslMyZetUVVjZkvm5GVAIJtxJlGA1vtsEVYUJQyW1M8Vh3
+lCiUBSpyZL/6XHLSObPWLX4NuagjaC0vSUMbfZJYOYMh8SGltWCWJt2/2SdzueJY
+4RdOfmkYmXub9NVn/MgAYCGoq+kx0NGNoG8fF2+x6xnm81pYKJTecQjVrfZUgSkC
+/oriBynvPpnJ0lBRRyw8
+=F1pu
+-----END PGP SIGNATURE-----

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512 Sun Sep 16 13:30:43 2018
@@ -0,0 +1,3 @@
svn commit: r29420 - in /dev/spark/2.5.0-SNAPSHOT-2018_09_16_00_02-bfcf742-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s
Author: pwendell
Date: Sun Sep 16 07:17:08 2018
New Revision: 29420

Log:
Apache Spark 2.5.0-SNAPSHOT-2018_09_16_00_02-bfcf742 docs

[This commit notification would consist of 1484 parts, which exceeds the limit of 50, so it was shortened to the summary.]
svn commit: r29419 - /dev/spark/v2.4.0-rc1-bin/
Author: wenchen
Date: Sun Sep 16 06:05:59 2018
New Revision: 29419

Log:
Apache Spark v2.4.0-rc1

Added:
    dev/spark/v2.4.0-rc1-bin/
    dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz   (with props)
    dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.asc
    dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.sha512
    dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz   (with props)
    dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.asc
    dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.sha512
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.6.tgz   (with props)
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.6.tgz.asc
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.6.tgz.sha512
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.7.tgz   (with props)
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.7.tgz.asc
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-hadoop2.7.tgz.sha512
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-without-hadoop.tgz   (with props)
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-without-hadoop.tgz.asc
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0-bin-without-hadoop.tgz.sha512
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0.tgz   (with props)
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0.tgz.asc
    dev/spark/v2.4.0-rc1-bin/spark-2.4.0.tgz.sha512

Added: dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz
==
Binary file - no diff available.
Propchange: dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz
--
    svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.asc
==
--- dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.asc (added)
+++ dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.asc Sun Sep 16 06:05:59 2018
@@ -0,0 +1,17 @@
+-----BEGIN PGP SIGNATURE-----
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJbneo+AAoJEB6A3KhRgyfrfUYP/3VOKMsFda/+8RIPY5xdrk5D
+zyynGDtuSktdIImFL7k5EG3pI9RteSWFdjBYCWeOYnLws2fH0Pk7uAuGvZCDnDPk
+ehtM9CJ8Tn8niH9F2JaGr0ONrMF4IhvHRZxPd/7OoQzzJnI8oYwXJ4To1OgB5Xr2
+tk8P1+oT7keIkcfmjwNVO70ytK7cUaGGTma/GQArZ5TO2N9YttWP03c84Wcth6Fb
+G5nOP6v/uuysqVbeO29I3x5TSLTmS2iLIpqApCj1YMvohnqM137lDEMwTm+Z7oK/
+2TB/MWy2S5vlP9xmmTSApFTJLka90MDr17oCb9bn3eotBk0/CbQIlWazlOcsdbL4
+q9v2y+kUI7BdkjlnWB3oCmafaEZGkANGMsk1pM9gs0h1CR0FSgnLOueFuVgbjOmQ
+XOOMXigyMg7EC3s415cZTHU3wToiZgJO2CIdyv8cGv6sYjpQqgULfjwwUhQQYaya
+mUgI1vdk8OdBmwpGFrLkCggvWmdKl9iEw++QF7nlGcIWjwF6Fv9mMz+o7WnD9nQn
+3rFOw5eY4nYY84xPPRf29Z9+lBgqUyLX9RaVGFYtvNf0eg8ckSPAB5dpi2uoThcG
+tCU4xekSdBlGTokYSJxTN0lwN5JbVHbHdcpM4hCp9aIsWqYxf2NvcLOsZpX0zBC5
+BmD5wv8p4NQEDhaxqM16
+=YKP/
+-----END PGP SIGNATURE-----

Added: dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.sha512
==
--- dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.sha512 (added)
+++ dev/spark/v2.4.0-rc1-bin/SparkR_2.4.0.tar.gz.sha512 Sun Sep 16 06:05:59 2018
@@ -0,0 +1,3 @@
+SparkR_2.4.0.tar.gz: B581D52E 38185332 31CC8D16 8A681101 EBB833DE E29B944A
+                     B5BB6887 3EFF5743 4929DF42 017063DE 8C2050BA 5A3465B2
+                     FA6710F5 E932FCBB 66873407 60F5F6D4

Added: dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz
==
Binary file - no diff available.
Propchange: dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz
--
    svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.asc
==
--- dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.asc (added)
+++ dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.asc Sun Sep 16 06:05:59 2018
@@ -0,0 +1,17 @@
+-----BEGIN PGP SIGNATURE-----
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJbneczAAoJEB6A3KhRgyfrqdgQAIpzyxxe+1j9N3TFgKAPFeZW
+R40j4Gf4jee9SEqlaGLFozQ8wQ0wwD2TFvpWX59rm8InfbHgesAkCyIL9E3QBXbG
+cdKZFy+lLVJu8bqJkxIV29mo7T7vrXIu0n2NaeoPdNZcRNXYt4WNk0y5ZpjVSyMC
+3w88Pq/B4Zdk3tlVGPAUNL55VG9cPwM2B6wWT9puF2aW0EV/RBF1lV2+nav5PrPg
+qg3RR7M9CcTmyd4Z6I2g1jnIBELgcf2dp4k8dZeK7VIGZOx//cg4bYRzi6zxF5Ry
+4F+I6NsCPRblWHg/Z2E1HYjPtMqUG4uPw3pMMzXunaBOB1DuFlZcePqyQD/pnhFm
+VmIDuAIMLIiA3faylUOe9oKT0MKsUK1o5+OZa9XJoYfQRwWYWAN8uxiEy6koFjNm
+IWbj39rLNWPRZPnKM2SW5dCk+iZMeh7LLyhKw2oSkXbtrPtMvba+Hxjph1TeZKTp
+OE0aOmP5GzzmvZYmyItFSg7MuwlZArsJz02nPQlru1Z4DmNRtiDcg0/ugy/fnM9K
+dc+TQ5aVdI0qfu3mIQ16OxfLjUgsOfOUcPNanaHfxTuYnQaMT478gGsO4fM8CZik
+SXx5+KGsd38ZN8iLWlOaj0Ard33JFeIQWXM0gv9u3X0iP1FKtJpLMDqyn2irijY0
+RXZ5SDrSnwiNst13mheR
+=0EEq
+-----END PGP SIGNATURE-----

Added: dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.sha512
==
--- dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.sha512 (added)
+++ dev/spark/v2.4.0-rc1-bin/pyspark-2.4.0.tar.gz.sha512