spark git commit: [SPARK-15269][SQL] Removes unexpected empty table directories created while creating external Spark SQL data source tables.

2016-06-01 Thread lian
Repository: spark
Updated Branches:
  refs/heads/branch-2.0 44052a707 -> e033fd50f


[SPARK-15269][SQL] Removes unexpected empty table directories created while 
creating external Spark SQL data source tables.

This PR is an alternative to #13120 authored by xwu0226.

## What changes were proposed in this pull request?

When creating an external Spark SQL data source table and persisting its 
metadata to the Hive metastore, we don't use the standard Hive `Table.dataLocation` 
field, because Hive only allows directory paths as data locations while Spark 
SQL also allows file paths. However, if we don't set `Table.dataLocation`, Hive 
always creates an unexpected empty table directory under the database location, 
and doesn't remove it when the table is dropped (because the table is external).

This PR works around this issue by explicitly setting `Table.dataLocation` and 
then manually removing the created directory after creating the external table.

Please refer to [this JIRA comment][1] for more details about why we chose this 
approach as a workaround.

[1]: 
https://issues.apache.org/jira/browse/SPARK-15269?focusedCommentId=15297408&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15297408
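For illustration, the workaround boils down to the pattern sketched below. This is a minimal sketch, not the actual `HiveExternalCatalog` code: the object, method name, and parameters are hypothetical, and it assumes a Hadoop `Configuration` is in scope.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object ExternalTableWorkaroundSketch {
  // Hypothetical sketch of the workaround: hand Hive an explicit data
  // location so it records that path rather than inventing a directory
  // under the database location, then delete the directory Hive created
  // once the external table is registered in the metastore.
  def createExternalTableWithoutStrayDir(
      defaultTablePath: Path,           // location passed as Table.dataLocation
      hadoopConf: Configuration)(
      createTableInMetastore: => Unit): Unit = {
    // Step 1: create the table with Table.dataLocation explicitly set.
    createTableInMetastore
    // Step 2: remove the directory Hive created at that location; the real
    // data lives at the user-supplied (possibly file) path stored in the
    // table's storage properties.
    val fs: FileSystem = defaultTablePath.getFileSystem(hadoopConf)
    fs.delete(defaultTablePath, true) // recursive delete
  }
}
```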

## How was this patch tested?

1. A new test case is added to `HiveQuerySuite` for this scenario.
2. Updated `ShowCreateTableSuite` to use the same table name in all test cases. 
(This is how I hit this issue in the first place.)

Author: Cheng Lian 

Closes #13270 from liancheng/spark-15269-unpleasant-fix.

(cherry picked from commit 7bb64aae27f670531699f59d3f410e38866609b7)
Signed-off-by: Cheng Lian 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e033fd50
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e033fd50
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e033fd50

Branch: refs/heads/branch-2.0
Commit: e033fd50f0fcefb2a6cffb763ff7e026b0066c07
Parents: 44052a7
Author: Cheng Lian 
Authored: Wed Jun 1 16:02:27 2016 -0700
Committer: Cheng Lian 
Committed: Wed Jun 1 16:02:37 2016 -0700

--
 .../apache/spark/sql/AnalysisException.scala|  5 ++-
 .../sql/catalyst/catalog/SessionCatalog.scala   |  2 +-
 .../org/apache/spark/sql/SparkSession.scala |  6 +--
 .../command/createDataSourceTables.scala|  4 +-
 .../apache/spark/sql/internal/HiveSerDe.scala   |  1 -
 .../spark/sql/hive/HiveExternalCatalog.scala| 45 ++--
 .../apache/spark/sql/hive/HiveSharedState.scala |  4 +-
 .../spark/sql/hive/client/HiveClientImpl.scala  | 22 --
 .../sql/hive/HiveExternalCatalogSuite.scala |  3 +-
 .../sql/hive/MetastoreDataSourcesSuite.scala| 13 ++
 .../spark/sql/hive/ShowCreateTableSuite.scala   | 36 
 11 files changed, 105 insertions(+), 36 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/e033fd50/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
--
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
index d2003fd..6911843 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
@@ -32,8 +32,9 @@ class AnalysisException protected[sql] (
 val message: String,
 val line: Option[Int] = None,
 val startPosition: Option[Int] = None,
-val plan: Option[LogicalPlan] = None)
-  extends Exception with Serializable {
+val plan: Option[LogicalPlan] = None,
+val cause: Option[Throwable] = None)
+  extends Exception(message, cause.orNull) with Serializable {
 
  def withPosition(line: Option[Int], startPosition: Option[Int]): AnalysisException = {
 val newException = new AnalysisException(message, line, startPosition)
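As a side note, the hunk above adds an optional `cause` to `AnalysisException` so the original exception is preserved through `Exception(message, cause.orNull)`. A hedged usage sketch follows; the constructor is `protected[sql]`, so this only compiles inside Spark SQL's own packages, and the surrounding object and message are made up for illustration.

```scala
package org.apache.spark.sql

import scala.util.control.NonFatal

// Illustrative only: rethrow a low-level failure as an AnalysisException
// while keeping the original stack trace via the new `cause` field.
object AnalysisErrorExample {
  def wrap[T](description: String)(action: => T): T = {
    try action catch {
      case NonFatal(e) =>
        throw new AnalysisException(
          message = s"$description failed: ${e.getMessage}",
          cause = Some(e))
    }
  }
}
```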

http://git-wip-us.apache.org/repos/asf/spark/blob/e033fd50/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
--
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
index cf9286e..371c198 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
@@ -22,7 +22,7 @@ import javax.annotation.concurrent.GuardedBy
 import scala.collection.mutable
 
 import org.apache.hadoop.conf.Configuration
-import org.apache.hadoop.fs.Path
+import org.apache.hadoop.fs.{FileSystem, Path}

spark git commit: [SPARK-15269][SQL] Removes unexpected empty table directories created while creating external Spark SQL data source tables.

2016-06-01 Thread lian
Repository: spark
Updated Branches:
  refs/heads/master 9e2643b21 -> 7bb64aae2


[SPARK-15269][SQL] Removes unexpected empty table directories created while 
creating external Spark SQL data source tables.

This PR is an alternative to #13120 authored by xwu0226.

## What changes were proposed in this pull request?

When creating an external Spark SQL data source table and persisting its 
metadata to the Hive metastore, we don't use the standard Hive `Table.dataLocation` 
field, because Hive only allows directory paths as data locations while Spark 
SQL also allows file paths. However, if we don't set `Table.dataLocation`, Hive 
always creates an unexpected empty table directory under the database location, 
and doesn't remove it when the table is dropped (because the table is external).

This PR works around this issue by explicitly setting `Table.dataLocation` and 
then manually removing the created directory after creating the external table.

Please refer to [this JIRA comment][1] for more details about why we chose this 
approach as a workaround.

[1]: 
https://issues.apache.org/jira/browse/SPARK-15269?focusedCommentId=15297408&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15297408

## How was this patch tested?

1. A new test case is added to `HiveQuerySuite` for this scenario.
2. Updated `ShowCreateTableSuite` to use the same table name in all test cases. 
(This is how I hit this issue in the first place.)

Author: Cheng Lian 

Closes #13270 from liancheng/spark-15269-unpleasant-fix.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7bb64aae
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7bb64aae
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7bb64aae

Branch: refs/heads/master
Commit: 7bb64aae27f670531699f59d3f410e38866609b7
Parents: 9e2643b
Author: Cheng Lian 
Authored: Wed Jun 1 16:02:27 2016 -0700
Committer: Cheng Lian 
Committed: Wed Jun 1 16:02:27 2016 -0700

--
 .../apache/spark/sql/AnalysisException.scala|  5 ++-
 .../sql/catalyst/catalog/SessionCatalog.scala   |  2 +-
 .../org/apache/spark/sql/SparkSession.scala |  6 +--
 .../command/createDataSourceTables.scala|  4 +-
 .../apache/spark/sql/internal/HiveSerDe.scala   |  1 -
 .../spark/sql/hive/HiveExternalCatalog.scala| 45 ++--
 .../apache/spark/sql/hive/HiveSharedState.scala |  4 +-
 .../spark/sql/hive/client/HiveClientImpl.scala  | 22 --
 .../sql/hive/HiveExternalCatalogSuite.scala |  3 +-
 .../sql/hive/MetastoreDataSourcesSuite.scala| 13 ++
 .../spark/sql/hive/ShowCreateTableSuite.scala   | 36 
 11 files changed, 105 insertions(+), 36 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/7bb64aae/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
--
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
index d2003fd..6911843 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala
@@ -32,8 +32,9 @@ class AnalysisException protected[sql] (
 val message: String,
 val line: Option[Int] = None,
 val startPosition: Option[Int] = None,
-val plan: Option[LogicalPlan] = None)
-  extends Exception with Serializable {
+val plan: Option[LogicalPlan] = None,
+val cause: Option[Throwable] = None)
+  extends Exception(message, cause.orNull) with Serializable {
 
  def withPosition(line: Option[Int], startPosition: Option[Int]): AnalysisException = {
 val newException = new AnalysisException(message, line, startPosition)

http://git-wip-us.apache.org/repos/asf/spark/blob/7bb64aae/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
--
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
index cf9286e..371c198 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
@@ -22,7 +22,7 @@ import javax.annotation.concurrent.GuardedBy
 import scala.collection.mutable
 
 import org.apache.hadoop.conf.Configuration
-import org.apache.hadoop.fs.Path
+import org.apache.hadoop.fs.{FileSystem, Path}
 
 import org.apache.spark.internal.Logging
 import