This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 3e1795b  [SPARK-30798][SQL][TESTS][FOLLOW-UP] Set the configuration against the current session explicitly in HiveShowCreateTableSuite
3e1795b is described below

commit 3e1795b4aeba5a627dc3e45888cefbcccd1c8941
Author: HyukjinKwon <gurwls...@apache.org>
AuthorDate: Wed Feb 26 20:48:43 2020 +0900

    [SPARK-30798][SQL][TESTS][FOLLOW-UP] Set the configuration against the current session explicitly in HiveShowCreateTableSuite
    
    ### What changes were proposed in this pull request?
    
    After https://github.com/apache/spark/pull/27387 (see https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-hive-2.3/202/), the tests below fail consistently, but only in one Jenkins job, https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-hive-2.3/:
    
    ```
    org.apache.spark.sql.hive.HiveShowCreateTableSuite.simple hive table
    org.apache.spark.sql.hive.HiveShowCreateTableSuite.simple external hive table
    org.apache.spark.sql.hive.HiveShowCreateTableSuite.hive bucketing is supported
    ```
    
    The profile is the same as the PR builder's, but the failure seems specific to this machine. It appears the legacy configuration `spark.sql.legacy.createHiveTableByDefault.enabled` is not being set, due to the inconsistency between `SQLConf.get` and the active Spark session described in https://github.com/apache/spark/pull/27387.
    
    This PR proposes to set the configuration explicitly on the session used by the test suite, instead of via `SQLConf.get`.
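    The contrast above can be sketched as follows. This is an illustrative sketch, not part of the commit: `SQLConf.get` resolves the conf of the session active on the current thread, which may differ from the `spark` session a suite actually holds, whereas `spark.conf.set` targets that session directly.
    
    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.internal.SQLConf
    
    // Hypothetical standalone demo, assuming a local SparkSession.
    object ConfDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[1]")
          .appName("conf-demo")
          .getOrCreate()
    
        // Before: goes through the thread-local active session, which may
        // not be the `spark` session the suite holds.
        SQLConf.get.setConf(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED, true)
    
        // After (this PR): set the conf explicitly on the session under test.
        spark.conf.set(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED.key, true)
    
        spark.stop()
      }
    }
    ```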
    
    ### Why are the changes needed?
    
    To make the `spark-master-test-sbt-hadoop-2.7-hive-2.3` job pass.
    
    ### Does this PR introduce any user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    The failure cannot be reproduced locally, and presumably not in the PR builder either. We should verify that the tests pass in the `spark-master-test-sbt-hadoop-2.7-hive-2.3` job after this PR is merged.
    
    Closes #27703 from HyukjinKwon/SPARK-30798-followup.
    
    Authored-by: HyukjinKwon <gurwls...@apache.org>
    Signed-off-by: HyukjinKwon <gurwls...@apache.org>
    (cherry picked from commit 020b2622e597458b925d7227ed5f9fa269f2d391)
    Signed-off-by: HyukjinKwon <gurwls...@apache.org>
---
 .../org/apache/spark/sql/hive/HiveShowCreateTableSuite.scala     | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveShowCreateTableSuite.scala b/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveShowCreateTableSuite.scala
index 99db1e3..50c9018 100644
--- a/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveShowCreateTableSuite.scala
+++ b/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveShowCreateTableSuite.scala
@@ -19,7 +19,7 @@ package org.apache.spark.sql.hive
 
 import org.apache.spark.sql.{AnalysisException, ShowCreateTableSuite}
 import org.apache.spark.sql.catalyst.TableIdentifier
-import org.apache.spark.sql.catalyst.catalog.{CatalogStorageFormat, CatalogTable}
+import org.apache.spark.sql.catalyst.catalog.CatalogTable
 import org.apache.spark.sql.hive.test.TestHiveSingleton
 import org.apache.spark.sql.internal.{HiveSerDe, SQLConf}
 
@@ -30,12 +30,13 @@ class HiveShowCreateTableSuite extends ShowCreateTableSuite with TestHiveSinglet
   protected override def beforeAll(): Unit = {
     super.beforeAll()
     origCreateHiveTableConfig =
-      SQLConf.get.getConf(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED)
-    SQLConf.get.setConf(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED, true)
+      spark.conf.get(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED)
+    spark.conf.set(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED.key, true)
   }
 
   protected override def afterAll(): Unit = {
-    SQLConf.get.setConf(SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED,
+    spark.conf.set(
+      SQLConf.LEGACY_CREATE_HIVE_TABLE_BY_DEFAULT_ENABLED.key,
       origCreateHiveTableConfig)
     super.afterAll()
   }


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
