Repository: spark
Updated Branches:
  refs/heads/branch-1.1 f17b7957a -> 6cbf83c05


[SPARK-3481] [SQL] Eliminate the error log in local Hive comparison test

Logically, we should remove the Hive tables/databases first and then reset the
Hive configuration, repointing it to the new data warehouse directory, etc.
Otherwise it raises exceptions like "Database does not exist: default" during
local testing.
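
For clarity, a condensed sketch of the reordered reset() flow follows (not the
exact code; see the diff below). runSqlHive, catalog, loadedTables and
configure() are members of TestHiveContext, and the DROP statements here stand
in for the fuller table/index cleanup done in the real file.

  protected def reset() {
    // Drop test tables and databases while the old metastore/warehouse
    // configuration is still in effect. (JavaConversions is in scope in
    // TestHive.scala, so the metastore client's Java lists can be iterated.)
    loadedTables.clear()
    catalog.client.getAllTables("default").foreach { t =>
      runSqlHive(s"DROP TABLE IF EXISTS $t")
    }
    catalog.client.getAllDatabases.filterNot(_ == "default").foreach { db =>
      runSqlHive(s"DROP DATABASE IF EXISTS $db")
    }

    // Only then RESET and repoint Hive at the fresh warehouse directory.
    runSqlHive("RESET")
    runSqlHive("set datanucleus.cache.collections=true")
    runSqlHive("set datanucleus.cache.collections.lazy=true")
    runSqlHive("set hive.metastore.partition.name.whitelist.pattern=.*")
    configure()

    runSqlHive("USE default")
  }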

Author: Cheng Hao <hao.ch...@intel.com>

Closes #2352 from chenghao-intel/test_hive and squashes the following commits:

74fd76b [Cheng Hao] eliminate the error log

(cherry picked from commit 8194fc662c08eb445444c207264e22361def54ea)
Signed-off-by: Michael Armbrust <mich...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6cbf83c0
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/6cbf83c0
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/6cbf83c0

Branch: refs/heads/branch-1.1
Commit: 6cbf83c05c7a073d4df81b59a1663fea38ce65f6
Parents: f17b795
Author: Cheng Hao <hao.ch...@intel.com>
Authored: Fri Sep 12 11:29:30 2014 -0700
Committer: Michael Armbrust <mich...@databricks.com>
Committed: Fri Sep 12 11:29:44 2014 -0700

----------------------------------------------------------------------
 .../scala/org/apache/spark/sql/hive/TestHive.scala | 17 ++++++++---------
 1 file changed, 8 insertions(+), 9 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/6cbf83c0/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
index a013f3f..8bb2216 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
@@ -309,15 +309,6 @@ class TestHiveContext(sc: SparkContext) extends HiveContext(sc) {
         log.asInstanceOf[org.apache.log4j.Logger].setLevel(org.apache.log4j.Level.WARN)
       }
 
-      // It is important that we RESET first as broken hooks that might have been set could break
-      // other sql exec here.
-      runSqlHive("RESET")
-      // For some reason, RESET does not reset the following variables...
-      runSqlHive("set datanucleus.cache.collections=true")
-      runSqlHive("set datanucleus.cache.collections.lazy=true")
-      // Lots of tests fail if we do not change the partition whitelist from the default.
-      runSqlHive("set hive.metastore.partition.name.whitelist.pattern=.*")
-
       loadedTables.clear()
       catalog.client.getAllTables("default").foreach { t =>
         logDebug(s"Deleting table $t")
@@ -343,6 +334,14 @@ class TestHiveContext(sc: SparkContext) extends HiveContext(sc) {
         FunctionRegistry.unregisterTemporaryUDF(udfName)
       }
 
+      // It is important that we RESET first as broken hooks that might have been set could break
+      // other sql exec here.
+      runSqlHive("RESET")
+      // For some reason, RESET does not reset the following variables...
+      runSqlHive("set datanucleus.cache.collections=true")
+      runSqlHive("set datanucleus.cache.collections.lazy=true")
+      // Lots of tests fail if we do not change the partition whitelist from the default.
+      runSqlHive("set hive.metastore.partition.name.whitelist.pattern=.*")
       configure()
 
       runSqlHive("USE default")


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
