[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-12 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/20564


---




[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-10 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20564#discussion_r167392786
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -598,8 +598,22 @@ private[hive] class HiveClientImpl(
       db: String,
       table: String,
       newParts: Seq[CatalogTablePartition]): Unit = withHiveState {
-    val hiveTable = toHiveTable(getTable(db, table), Some(userName))
-    shim.alterPartitions(client, table, newParts.map { p => toHivePartition(p, hiveTable) }.asJava)
+    // Note: Before altering table partitions in Hive, you *must* set the current database
+    // to the one that contains the table of interest. Otherwise you will end up with the
+    // most helpful error message ever: "Unable to alter partition. alter is not possible."
+    // See HIVE-2742 for more detail.
+    val original = state.getCurrentDatabase
+    if (databaseExists(db)) {
+      state.setCurrentDatabase(db)
+    } else {
+      throw new NoSuchDatabaseException(db)
+    }
--- End diff --

Instead of copying the code from `setCurrentDatabase`, why not call it directly?
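
For illustration, a minimal sketch of that suggestion, assuming `setCurrentDatabase` is safe to call from inside `withHiveState` and already performs the `databaseExists` check (throwing `NoSuchDatabaseException` otherwise); the names follow the diff above, and the merged patch may differ:

    // Sketch only: reuse the existing helper instead of duplicating its check.
    val original = state.getCurrentDatabase
    // Switches the session state to `db`, or throws NoSuchDatabaseException.
    setCurrentDatabase(db)
    try {
      val hiveTable = toHiveTable(getTable(db, table), Some(userName))
      shim.alterPartitions(client, table,
        newParts.map { p => toHivePartition(p, hiveTable) }.asJava)
    } finally {
      // Restore whatever database was current before this call.
      state.setCurrentDatabase(original)
    }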


---




[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-10 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20564#discussion_r167392724
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1107,11 +1107,6 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
       }
     }
 
-    // Note: Before altering table partitions in Hive, you *must* set the current database
-    // to the one that contains the table of interest. Otherwise you will end up with the
-    // most helpful error message ever: "Unable to alter partition. alter is not possible."
-    // See HIVE-2742 for more detail.
-    client.setCurrentDatabase(db)
--- End diff --

This does not change the behavior; it just restores the original current database in the Hive client. I do not think the current implementation has any issue, because we do not rely on the current database in the Hive client. However, I am also fine with restoring the original. This change does no harm and just makes the code more robust for future refactoring.
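
As an illustration only, the save-and-restore could be factored into a small helper like the hypothetical one below; the name `withCurrentDatabase` is invented for this sketch and is not part of the patch:

    // Hypothetical helper inside HiveClientImpl: run `body` with `db` as the
    // current database, then restore the previous one even if `body` throws.
    private def withCurrentDatabase[T](db: String)(body: => T): T = {
      val original = state.getCurrentDatabase
      setCurrentDatabase(db)  // validates `db`; throws NoSuchDatabaseException otherwise
      try body finally state.setCurrentDatabase(original)
    }

    // Usage sketch: withCurrentDatabase(db) { shim.alterPartitions(...) }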


---




[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-09 Thread liufengdb
Github user liufengdb commented on a diff in the pull request:

https://github.com/apache/spark/pull/20564#discussion_r167381054
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1107,11 +1107,6 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
       }
     }
 
-    // Note: Before altering table partitions in Hive, you *must* set the current database
-    // to the one that contains the table of interest. Otherwise you will end up with the
-    // most helpful error message ever: "Unable to alter partition. alter is not possible."
-    // See HIVE-2742 for more detail.
-    client.setCurrentDatabase(db)
--- End diff --

Sorry, I meant `a special case`.


---




[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-09 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/20564#discussion_r167380287
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1107,11 +1107,6 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
       }
     }
 
-    // Note: Before altering table partitions in Hive, you *must* set the current database
-    // to the one that contains the table of interest. Otherwise you will end up with the
-    // most helpful error message ever: "Unable to alter partition. alter is not possible."
-    // See HIVE-2742 for more detail.
-    client.setCurrentDatabase(db)
--- End diff --

Can we have a test case for the exception?
> This removes the exception that some calls from HiveExternalCatalog can reset the current database in the hive client as a side effect.
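
A hedged sketch of what such a test could look like, assuming a `HiveClient` instance named `client` is provided by the suite's fixtures and that `NoSuchDatabaseException` (from `org.apache.spark.sql.catalyst.analysis`) is the error surfaced; the test name and setup are placeholders rather than the patch's actual tests:

    // Hypothetical ScalaTest case: altering partitions of a table in a missing
    // database should fail fast instead of hitting Hive's cryptic alter error.
    test("alterPartitions rejects a non-existent database") {
      intercept[NoSuchDatabaseException] {
        client.alterPartitions("db_that_does_not_exist", "tbl", Seq.empty)
      }
    }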


---




[GitHub] spark pull request #20564: [SPARK-23378][SQL] move setCurrentDatabase from H...

2018-02-09 Thread liufengdb
GitHub user liufengdb opened a pull request:

https://github.com/apache/spark/pull/20564

[SPARK-23378][SQL] move setCurrentDatabase from HiveExternalCatalog to 
HiveClientImpl

## What changes were proposed in this pull request?

This enforces the rule that no call from `HiveExternalCatalog` resets the current database in the Hive client, except the `setCurrentDatabase` method itself.
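
In rough outline, the intent is the simplified sketch below, drawn from the diffs quoted in the review comments; the partition-conversion details are elided, the catalog's usual `withClient` wrapper is assumed, and the merged patch may differ. The external catalog no longer touches the client's current database; `HiveClientImpl.alterPartitions` switches to `db` (and restores the previous database) itself.

    // HiveExternalCatalog.alterPartitions (sketch): no client.setCurrentDatabase(db) here.
    override def alterPartitions(
        db: String,
        table: String,
        newParts: Seq[CatalogTablePartition]): Unit = withClient {
      // ... metastore-specific adjustments to `newParts` elided ...
      client.alterPartitions(db, table, newParts)
    }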

## How was this patch tested?

(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise, 
remove this)

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/liufengdb/spark move

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/20564.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #20564


commit f1acb5ad9a12df11faf082144c676082df647ff9
Author: Feng Liu 
Date:   2018-02-10T00:18:44Z

some




---
