spark git commit: [SPARK-23490][SQL] Check storage.locationUri with existing table in CreateTable

2018-02-22 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/master c5abb3c2d -> 049f243c5


[SPARK-23490][SQL] Check storage.locationUri with existing table in CreateTable

## What changes were proposed in this pull request?

For CreateTable with Append mode, we should check whether `storage.locationUri` 
matches that of the existing table in `PreprocessTableCreation`.

In the current code, when the specified `storage.locationUri` differs from the 
existing table's, the only error raised is an uninformative
`org.apache.spark.sql.AnalysisException: Table or view not found:`

which can be improved.
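The new check can be illustrated with a standalone sketch. Note this is not Spark code: plain `java.net.URI` and `IllegalArgumentException` stand in for Spark's catalog types and `AnalysisException`, and `checkLocation` is a hypothetical helper mirroring the logic added to `PreprocessTableCreation`:

```scala
import java.net.URI

// Hypothetical helper mirroring the new check: compare only the path
// component of the specified location against the existing table's location.
def checkLocation(specified: Option[URI], existing: URI): Unit =
  specified match {
    case Some(loc) if loc.getPath != existing.getPath =>
      throw new IllegalArgumentException(
        s"The location of the existing table is `$existing`. " +
          s"It doesn't match the specified location `$loc`.")
    case _ => // no explicit location, or same path: nothing to do
  }

val existing = new URI("file:/tmp/dir1")
checkLocation(None, existing)                            // append without a path: fine
checkLocation(Some(new URI("file:/tmp/dir1")), existing) // same path: fine
val msg =
  try { checkLocation(Some(new URI("file:/tmp/dir2")), existing); "" }
  catch { case e: IllegalArgumentException => e.getMessage }
```

Before this patch, a mismatched path only surfaced later as a generic `Table or view not found` error; the explicit message names both locations.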

## How was this patch tested?

Unit test

Author: Wang Gengliang 

Closes #20660 from gengliangwang/locationUri.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/049f243c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/049f243c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/049f243c

Branch: refs/heads/master
Commit: 049f243c59737699fee54fdc9d65cbd7c788032a
Parents: c5abb3c
Author: Wang Gengliang 
Authored: Thu Feb 22 21:49:25 2018 -0800
Committer: gatorsmile 
Committed: Thu Feb 22 21:49:25 2018 -0800

--
 .../spark/sql/execution/datasources/rules.scala |  8 ++
 .../spark/sql/execution/command/DDLSuite.scala  | 29 
 2 files changed, 37 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/049f243c/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala
--
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala
index 5cc21ee..0dea767 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala
@@ -118,6 +118,14 @@ case class PreprocessTableCreation(sparkSession: SparkSession) extends Rule[Logi
   s"`${existingProvider.getSimpleName}`. It doesn't match the specified format " +
   s"`${specifiedProvider.getSimpleName}`.")
   }
+  tableDesc.storage.locationUri match {
+case Some(location) if location.getPath != existingTable.location.getPath =>
+  throw new AnalysisException(
+s"The location of the existing table ${tableIdentWithDB.quotedString} is " +
+  s"`${existingTable.location}`. It doesn't match the specified location " +
+  s"`${tableDesc.location}`.")
+case _ =>
+  }
 
   if (query.schema.length != existingTable.schema.length) {
 throw new AnalysisException(

http://git-wip-us.apache.org/repos/asf/spark/blob/049f243c/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
--
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
index f76bfd2..b800e6f 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
@@ -536,6 +536,35 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
 }
   }
 
+  test("create table - append to a non-partitioned table created with different paths") {
+import testImplicits._
+withTempDir { dir1 =>
+  withTempDir { dir2 =>
+withTable("path_test") {
+  Seq(1L -> "a").toDF("v1", "v2")
+.write
+.mode(SaveMode.Append)
+.format("json")
+.option("path", dir1.getCanonicalPath)
+.saveAsTable("path_test")
+
+  val ex = intercept[AnalysisException] {
+Seq((3L, "c")).toDF("v1", "v2")
+  .write
+  .mode(SaveMode.Append)
+  .format("json")
+  .option("path", dir2.getCanonicalPath)
+  .saveAsTable("path_test")
+  }.getMessage
+  assert(ex.contains("The location of the existing table `default`.`path_test`"))
+
+  checkAnswer(
+spark.table("path_test"), Row(1L, "a") :: Nil)
+}
+  }
+}
+  }
+
   test("Refresh table after changing the data source table partitioning") {
 import testImplicits._
 


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r25230 - in /dev/spark/2.4.0-SNAPSHOT-2018_02_22_16_01-c5abb3c-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-02-22 Thread pwendell
Author: pwendell
Date: Fri Feb 23 00:15:23 2018
New Revision: 25230

Log:
Apache Spark 2.4.0-SNAPSHOT-2018_02_22_16_01-c5abb3c docs


[This commit notification would consist of 1444 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]




svn commit: r25227 - /dev/spark/v2.3.0-rc5-bin/spark-parent_2.11.iml

2018-02-22 Thread sameerag
Author: sameerag
Date: Thu Feb 22 21:25:59 2018
New Revision: 25227

Log:
remove iml file

Removed:
dev/spark/v2.3.0-rc5-bin/spark-parent_2.11.iml





svn commit: r25226 - in /dev/spark/2.4.0-SNAPSHOT-2018_02_22_12_01-87293c7-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-02-22 Thread pwendell
Author: pwendell
Date: Thu Feb 22 20:15:40 2018
New Revision: 25226

Log:
Apache Spark 2.4.0-SNAPSHOT-2018_02_22_12_01-87293c7 docs


[This commit notification would consist of 1444 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]




svn commit: r25225 - in /dev/spark/v2.3.0-rc5-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-02-22 Thread sameerag
Author: sameerag
Date: Thu Feb 22 20:12:29 2018
New Revision: 25225

Log:
Apache Spark v2.3.0-rc5 docs


[This commit notification would consist of 1446 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]




spark git commit: [SPARK-23476][CORE] Generate secret in local mode when authentication on

2018-02-22 Thread vanzin
Repository: spark
Updated Branches:
  refs/heads/master 87293c746 -> c5abb3c2d


[SPARK-23476][CORE] Generate secret in local mode when authentication on

## What changes were proposed in this pull request?

If Spark is run with `spark.authenticate=true`, it fails to start in local 
mode.

This PR generates a secret in local mode when authentication is on.
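The master-URL dispatch this PR adds can be sketched outside Spark. The regexes below are modeled on Spark's `SparkMasterRegex` patterns, but their names here and the `shouldGenerateSecret` helper are local assumptions, not Spark's API:

```scala
// Regexes modeled on Spark's local-mode master URLs, e.g. "local[4]"
// (N threads) and "local[4, 2]" (N threads, M allowed task failures).
val LOCAL_N_REGEX = """local\[([0-9]+|\*)\]""".r
val LOCAL_N_FAILURES_REGEX = """local\[([0-9]+|\*)\s*,\s*([0-9]+)\]""".r

// Mirrors the new initializeAuth() dispatch: secret generation is allowed
// for YARN and the local modes; every other master requires an explicit
// pre-configured secret.
def shouldGenerateSecret(master: String): Boolean = master match {
  case "yarn" | "local" | LOCAL_N_REGEX(_) | LOCAL_N_FAILURES_REGEX(_, _) => true
  case _ => false
}
```

In a Scala `match`, a `Regex` pattern must match the whole string, so `local-cluster[2, 1, 1024]` falls through to the default case, as the modified unit test below expects.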

## How was this patch tested?

Modified existing unit test.
Manually started spark-shell.

Author: Gabor Somogyi 

Closes #20652 from gaborgsomogyi/SPARK-23476.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c5abb3c2
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c5abb3c2
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c5abb3c2

Branch: refs/heads/master
Commit: c5abb3c2d16f601d507bee3c53663d4e117eb8b5
Parents: 87293c7
Author: Gabor Somogyi 
Authored: Thu Feb 22 12:07:51 2018 -0800
Committer: Marcelo Vanzin 
Committed: Thu Feb 22 12:07:51 2018 -0800

--
 .../org/apache/spark/SecurityManager.scala  | 16 +--
 .../org/apache/spark/SecurityManagerSuite.scala | 50 +---
 docs/security.md|  2 +-
 3 files changed, 46 insertions(+), 22 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/c5abb3c2/core/src/main/scala/org/apache/spark/SecurityManager.scala
--
diff --git a/core/src/main/scala/org/apache/spark/SecurityManager.scala b/core/src/main/scala/org/apache/spark/SecurityManager.scala
index 5b15a1c..2519d26 100644
--- a/core/src/main/scala/org/apache/spark/SecurityManager.scala
+++ b/core/src/main/scala/org/apache/spark/SecurityManager.scala
@@ -520,19 +520,25 @@ private[spark] class SecurityManager(
*
* If authentication is disabled, do nothing.
*
-   * In YARN mode, generate a new secret and store it in the current user's credentials.
+   * In YARN and local mode, generate a new secret and store it in the current user's credentials.
*
* In other modes, assert that the auth secret is set in the configuration.
*/
   def initializeAuth(): Unit = {
+import SparkMasterRegex._
+
 if (!sparkConf.get(NETWORK_AUTH_ENABLED)) {
   return
 }
 
-if (sparkConf.get(SparkLauncher.SPARK_MASTER, null) != "yarn") {
-  require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
-s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
-  return
+val master = sparkConf.get(SparkLauncher.SPARK_MASTER, "")
+master match {
+  case "yarn" | "local" | LOCAL_N_REGEX(_) | LOCAL_N_FAILURES_REGEX(_, _) =>
+// Secret generation allowed here
+  case _ =>
+require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
+  s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
+return
 }
 
 val rnd = new SecureRandom()

http://git-wip-us.apache.org/repos/asf/spark/blob/c5abb3c2/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala
--
diff --git a/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala b/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala
index cf59265..106ece7 100644
--- a/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala
+++ b/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala
@@ -440,23 +440,41 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
 assert(keyFromEnv === new SecurityManager(conf2).getSecretKey())
   }
 
-  test("secret key generation in yarn mode") {
-val conf = new SparkConf()
-  .set(NETWORK_AUTH_ENABLED, true)
-  .set(SparkLauncher.SPARK_MASTER, "yarn")
-val mgr = new SecurityManager(conf)
-
-UserGroupInformation.createUserForTesting("authTest", Array()).doAs(
-  new PrivilegedExceptionAction[Unit]() {
-override def run(): Unit = {
-  mgr.initializeAuth()
-  val creds = UserGroupInformation.getCurrentUser().getCredentials()
-  val secret = creds.getSecretKey(SecurityManager.SECRET_LOOKUP_KEY)
-  assert(secret != null)
-  assert(new String(secret, UTF_8) === mgr.getSecretKey())
+  test("secret key generation") {
+Seq(
+  ("yarn", true),
+  ("local", true),
+  ("local[*]", true),
+  ("local[1, 2]", true),
+  ("local-cluster[2, 1, 1024]", false),
+  ("invalid", false)
+).foreach { case (master, shouldGenerateSecret) =>
+  val conf = new SparkConf()
+.set(NETWORK_AUTH_ENABLED, true)
+.set(SparkLauncher.SPARK_MASTER, master)
+  val mgr = new SecurityManager(conf)
+
+  

svn commit: r25224 - /dev/spark/v2.3.0-rc5-bin/

2018-02-22 Thread sameerag
Author: sameerag
Date: Thu Feb 22 19:54:10 2018
New Revision: 25224

Log:
Apache Spark v2.3.0-rc5

Added:
dev/spark/v2.3.0-rc5-bin/
dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz   (with props)
dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.asc
dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.md5
dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.sha512
dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz   (with props)
dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.asc
dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.md5
dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.sha512
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.6.tgz.asc
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.6.tgz.md5
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.6.tgz.sha512
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.7.tgz.asc
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.7.tgz.md5
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-hadoop2.7.tgz.sha512
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-without-hadoop.tgz   (with props)
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-without-hadoop.tgz.asc
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-without-hadoop.tgz.md5
dev/spark/v2.3.0-rc5-bin/spark-2.3.0-bin-without-hadoop.tgz.sha512
dev/spark/v2.3.0-rc5-bin/spark-2.3.0.tgz   (with props)
dev/spark/v2.3.0-rc5-bin/spark-2.3.0.tgz.asc
dev/spark/v2.3.0-rc5-bin/spark-2.3.0.tgz.md5
dev/spark/v2.3.0-rc5-bin/spark-2.3.0.tgz.sha512
dev/spark/v2.3.0-rc5-bin/spark-parent_2.11.iml

Added: dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.asc
==
--- dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.asc (added)
+++ dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.asc Thu Feb 22 19:54:10 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEE8sZCQuwb7Gnqj7413OS/2AdGHpYFAlqPHhQACgkQ3OS/2AdG
+HpZEhg//UoO4iDZFLKxlxuLQtKg3Vfa4laoY1/8TVZMj7GAOA9TIT1qDYoVHIEFx
+5X6+MrvjskgmWFNJL0cB+KK86n5/ZgmJmM7gV6DKYl4MsDG+EQQI3GOKuXeJbvlh
+7gNtKhM1Gz2nQFyyg/6E6+m4XKDUdlg5MnkEDgHetjgl4zR6PDDAGxrRbJFVaZeJ
+aKhusnXPLMlRdLKZPcRVLN5DN3BLyHbQRyeHUY8OJYhQjIP431gPA+1ULeb9SzKW
+PJ/zX+WcosB1o9fv+rDcaAvYr/1WZkW+r4uUWWWTlivTZPwb0sPuUd1xxzfLtb/M
+MpcraXpNIliIQgAKXKmAm+fAWbRpu7W71saEB5rofO39sXJDY9w6iJ33AqYxcRuh
++IBFcnxViBB5yPOpHMfSPaLXCeeeMoPmxfnYA8+hLYM54yrFK0EQMLWpROSMe4ZT
+V2k3YfI4HwQgWy6rD2Qv9iKEkDb8UXDPbZnElel0qzcYhvjIJ/bfglIhmVUEtRYx
+2ZJ1corXCf6rQ8gP9LQ61WuY3NkNMKRj9N+IhPrO9QxVPve5V0KigAUUb4CvtvkJ
+dJiApsjbvMqc0DbAv4AvXYmlIFCSSTeBBA5aNiPw3zUBcLXofCS52aSgYDhTIJ3c
+iSwCsKEANi8QIeBx4o5uvXclGlPz14STA6D3q7ycl7ACiz5KkCQ=
+=O08n
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.md5
==
--- dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.md5 (added)
+++ dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.md5 Thu Feb 22 19:54:10 2018
@@ -0,0 +1 @@
+SparkR_2.3.0.tar.gz: 65 0D A7 D2 99 32 90 A7  BF 6D 7E 05 C6 B9 5E 7D

Added: dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.sha512
==
--- dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.sha512 (added)
+++ dev/spark/v2.3.0-rc5-bin/SparkR_2.3.0.tar.gz.sha512 Thu Feb 22 19:54:10 2018
@@ -0,0 +1,3 @@
+SparkR_2.3.0.tar.gz: BC8B59FF A0A18B29 92B02794 4A9E21B6 A914D4F2 E01D5D4A
+ FB2A6C01 5B2152C5 C11E8240 5E0E3A02 C8719E99 AF3FC722
+ E3D7AD3A E303BDB1 505DFB84 B265CF22

Added: dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.asc
==
--- dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.asc (added)
+++ dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz.asc Thu Feb 22 19:54:10 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEE8sZCQuwb7Gnqj7413OS/2AdGHpYFAlqPHRIACgkQ3OS/2AdG
+HpZfNRAAkf6SmmtFJ9C5tKIYrOSE47zIfdLe4DTKMaN+mac3iDo+uUM5HQbiE5eE
+vD7tsRWG6fHcObLbPLqQCXAapLwt1m1pHmJXVns7pUhkSoZ+aGcsiqcL0KE7liFW
+Ed+OBGzgurp3ORd01W5nUf/TbRdserxjjUs6rImJIrkYA4Ba8aUuLKgMZVpWKGVO

spark git commit: [SPARK-23475][UI] Show also skipped stages

2018-02-22 Thread vanzin
Repository: spark
Updated Branches:
  refs/heads/master 45cf714ee -> 87293c746


[SPARK-23475][UI] Show also skipped stages

## What changes were proposed in this pull request?

SPARK-20648 introduced the `SKIPPED` status for stages. On the UI, skipped 
stages were previously shown as `PENDING`; after that change, they are not 
shown on the UI at all.

The PR introduces a new section that shows `SKIPPED` stages in a dedicated 
table.
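The change amounts to adding one more status bucket to the page, as a minimal stand-in shows (plain strings replace Spark's `StageStatus` enum and app-status store; the tuple records are an assumption for brevity):

```scala
// Stand-in stage records: (stageId, status).
val allStages = Seq(0 -> "COMPLETE", 1 -> "SKIPPED", 2 -> "ACTIVE", 3 -> "SKIPPED")

// As in AllStagesPage: one filtered bucket per status, now including SKIPPED.
val skippedStages = allStages.filter(_._2 == "SKIPPED").map(_._1)

// Each table section is rendered only when its bucket is non-empty.
val shouldShowSkippedStages = skippedStages.nonEmpty
```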

## How was this patch tested?

manual tests

Author: Marco Gaido 

Closes #20651 from mgaido91/SPARK-23475.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/87293c74
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/87293c74
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/87293c74

Branch: refs/heads/master
Commit: 87293c746e19d66f475d506d0adb43421f496843
Parents: 45cf714
Author: Marco Gaido 
Authored: Thu Feb 22 11:00:12 2018 -0800
Committer: Marcelo Vanzin 
Committed: Thu Feb 22 11:00:12 2018 -0800

--
 .../org/apache/spark/ui/static/webui.js |  1 +
 .../apache/spark/ui/jobs/AllStagesPage.scala| 27 
 .../org/apache/spark/ui/UISeleniumSuite.scala   | 17 
 3 files changed, 45 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/87293c74/core/src/main/resources/org/apache/spark/ui/static/webui.js
--
diff --git a/core/src/main/resources/org/apache/spark/ui/static/webui.js b/core/src/main/resources/org/apache/spark/ui/static/webui.js
index 83009df..f01c567 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/webui.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/webui.js
@@ -72,6 +72,7 @@ $(function() {
   collapseTablePageLoad('collapse-aggregated-allActiveStages','aggregated-allActiveStages');
   collapseTablePageLoad('collapse-aggregated-allPendingStages','aggregated-allPendingStages');
   collapseTablePageLoad('collapse-aggregated-allCompletedStages','aggregated-allCompletedStages');
+  collapseTablePageLoad('collapse-aggregated-allSkippedStages','aggregated-allSkippedStages');
   collapseTablePageLoad('collapse-aggregated-allFailedStages','aggregated-allFailedStages');
   collapseTablePageLoad('collapse-aggregated-activeStages','aggregated-activeStages');
   collapseTablePageLoad('collapse-aggregated-pendingOrSkippedStages','aggregated-pendingOrSkippedStages');

http://git-wip-us.apache.org/repos/asf/spark/blob/87293c74/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
--
diff --git a/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala b/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
index 606dc1e..38450b9 100644
--- a/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
+++ b/core/src/main/scala/org/apache/spark/ui/jobs/AllStagesPage.scala
@@ -36,6 +36,7 @@ private[ui] class AllStagesPage(parent: StagesTab) extends WebUIPage("") {
 
 val activeStages = allStages.filter(_.status == StageStatus.ACTIVE)
 val pendingStages = allStages.filter(_.status == StageStatus.PENDING)
+val skippedStages = allStages.filter(_.status == StageStatus.SKIPPED)
 val completedStages = allStages.filter(_.status == StageStatus.COMPLETE)
 val failedStages = allStages.filter(_.status == StageStatus.FAILED).reverse
 
@@ -51,6 +52,9 @@ private[ui] class AllStagesPage(parent: StagesTab) extends WebUIPage("") {
 val completedStagesTable =
   new StageTableBase(parent.store, request, completedStages, "completed", "completedStage",
 parent.basePath, subPath, parent.isFairScheduler, false, false)
+val skippedStagesTable =
+  new StageTableBase(parent.store, request, skippedStages, "skipped", "skippedStage",
+parent.basePath, subPath, parent.isFairScheduler, false, false)
 val failedStagesTable =
   new StageTableBase(parent.store, request, failedStages, "failed", "failedStage",
 parent.basePath, subPath, parent.isFairScheduler, false, true)
@@ -66,6 +70,7 @@ private[ui] class AllStagesPage(parent: StagesTab) extends WebUIPage("") {
 val shouldShowActiveStages = activeStages.nonEmpty
 val shouldShowPendingStages = pendingStages.nonEmpty
 val shouldShowCompletedStages = completedStages.nonEmpty
+val shouldShowSkippedStages = skippedStages.nonEmpty
 val shouldShowFailedStages = failedStages.nonEmpty
 
 val appSummary = parent.store.appSummary()
@@ -103,6 +108,14 @@ private[ui] class AllStagesPage(parent: StagesTab) extends WebUIPage("") {
 }
   }
   {
+if (shouldShowSkippedStages) {
+  
+   

spark-website git commit: Update committer pages

2018-02-22 Thread cutlerb
Repository: spark-website
Updated Branches:
  refs/heads/asf-site 3f874c90a -> 6853fd7c6


Update committer pages


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/6853fd7c
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/6853fd7c
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/6853fd7c

Branch: refs/heads/asf-site
Commit: 6853fd7c6964389606e64d0a98db204ba5d631de
Parents: 3f874c9
Author: Bryan Cutler 
Authored: Wed Feb 21 15:55:56 2018 -0800
Committer: Bryan Cutler 
Committed: Wed Feb 21 15:55:56 2018 -0800

--
 committers.md| 1 +
 site/committers.html | 4 
 2 files changed, 5 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark-website/blob/6853fd7c/committers.md
--
diff --git a/committers.md b/committers.md
index 6c6b8ab..4098fb2 100644
--- a/committers.md
+++ b/committers.md
@@ -15,6 +15,7 @@ navigation:
 |Joseph Bradley|Databricks|
 |Felix Cheung|Microsoft|
 |Mosharaf Chowdhury|University of Michigan, Ann Arbor|
+|Bryan Cutler|IBM|
 |Jason Dai|Intel|
 |Tathagata Das|Databricks|
 |Ankur Dave|UC Berkeley|

http://git-wip-us.apache.org/repos/asf/spark-website/blob/6853fd7c/site/committers.html
--
diff --git a/site/committers.html b/site/committers.html
index c545cd2..83e275e 100644
--- a/site/committers.html
+++ b/site/committers.html
@@ -225,6 +225,10 @@
   University of Michigan, Ann Arbor
 
 
+  Bryan Cutler
+  IBM
+
+
   Jason Dai
   Intel
 





svn commit: r25220 - in /dev/spark/2.3.1-SNAPSHOT-2018_02_22_10_01-285b841-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-02-22 Thread pwendell
Author: pwendell
Date: Thu Feb 22 18:15:42 2018
New Revision: 25220

Log:
Apache Spark 2.3.1-SNAPSHOT-2018_02_22_10_01-285b841 docs


[This commit notification would consist of 1443 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]




[2/2] spark git commit: Preparing development version 2.3.1-SNAPSHOT

2018-02-22 Thread sameerag
Preparing development version 2.3.1-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/285b841f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/285b841f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/285b841f

Branch: refs/heads/branch-2.3
Commit: 285b841ffbfb21c0af3f83800f7815fb0bfe3627
Parents: 992447f
Author: Sameer Agarwal 
Authored: Thu Feb 22 09:57:03 2018 -0800
Committer: Sameer Agarwal 
Committed: Thu Feb 22 09:57:03 2018 -0800

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/285b841f/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 6d46c31..29a8a00 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.0
+Version: 2.3.1
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/285b841f/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 2ca9ab6..5c5a8e9 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.0
+2.3.1-SNAPSHOT
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/285b841f/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 404c744..2a625da 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.0
+2.3.1-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/285b841f/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 3c0b528..adb1890 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.0
+2.3.1-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/285b841f/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index fe3bcfd..4cdcfa2 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 

[spark] Git Push Summary

2018-02-22 Thread sameerag
Repository: spark
Updated Tags:  refs/tags/v2.3.0-rc5 [created] 992447fb3




[1/2] spark git commit: Preparing Spark release v2.3.0-rc5

2018-02-22 Thread sameerag
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 a0d794989 -> 285b841ff


Preparing Spark release v2.3.0-rc5


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/992447fb
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/992447fb
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/992447fb

Branch: refs/heads/branch-2.3
Commit: 992447fb30ee9ebb3cf794f2d06f4d63a2d792db
Parents: a0d7949
Author: Sameer Agarwal 
Authored: Thu Feb 22 09:56:57 2018 -0800
Committer: Sameer Agarwal 
Committed: Thu Feb 22 09:56:57 2018 -0800

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/992447fb/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 29a8a00..6d46c31 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.1
+Version: 2.3.0
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/992447fb/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 5c5a8e9..2ca9ab6 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.1-SNAPSHOT
+2.3.0
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/992447fb/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 2a625da..404c744 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.1-SNAPSHOT
+2.3.0
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/992447fb/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index adb1890..3c0b528 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.1-SNAPSHOT
+2.3.0
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/992447fb/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 4cdcfa2..fe3bcfd 100644
---