This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 3d9049533d8 [SPARK-39181][SQL] `SessionCatalog.reset` should not drop temp functions twice
3d9049533d8 is described below

commit 3d9049533d8bc75cc2c4832dc2fd6f8ac53efe21
Author: Wenchen Fan <wenc...@databricks.com>
AuthorDate: Fri May 13 10:21:15 2022 -0700

    [SPARK-39181][SQL] `SessionCatalog.reset` should not drop temp functions twice
    
    ### What changes were proposed in this pull request?
    
    `SessionCatalog.reset` is a test-only API, and it currently drops temp functions twice:
    1. once with `listFunctions(DEFAULT_DATABASE)...`, which lists temp functions as well
    2. once with `functionRegistry.clear()`
    
    This PR changes `listFunctions` to `externalCatalog.listFunctions`, which only lists permanent functions.
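    
    As an illustration (not part of the commit), here is a minimal Scala sketch of the difference between the two enumerations; the function names below are made up:
    
    ```scala
    import org.apache.spark.sql.catalyst.FunctionIdentifier
    
    // Made-up function names for illustration.
    val permanentFuncs = Seq("perm_udf")  // registered in the external catalog
    val tempFuncs = Seq("temp_udf")       // registered only in the function registry
    
    // Old enumeration: listFunctions(DEFAULT_DATABASE) returned permanent AND temp
    // functions, so temp functions were dropped in the loop and then again by
    // functionRegistry.clear() later in reset().
    val oldEnumeration = permanentFuncs ++ tempFuncs
    
    // New enumeration: externalCatalog.listFunctions returns only permanent function
    // names, which reset() re-qualifies with the default database before dropping;
    // temp functions are left to functionRegistry.clear().
    val newEnumeration = permanentFuncs.map(f => FunctionIdentifier(f, Some("default")))
    
    newEnumeration.foreach(id => println(id.unquotedString))  // prints: default.perm_udf
    ```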
    
    ### Why are the changes needed?
    
    Code simplification, and it probably makes tests run faster.
    
    ### Does this PR introduce _any_ user-facing change?
    
    no
    
    ### How was this patch tested?
    
    N/A
    
    Closes #36542 from cloud-fan/minor.
    
    Authored-by: Wenchen Fan <wenc...@databricks.com>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 .../apache/spark/sql/catalyst/catalog/SessionCatalog.scala    | 11 ++++-------
 1 file changed, 4 insertions(+), 7 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
index 6b7f8a207d6..d6c80f98bf7 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
@@ -1818,13 +1818,10 @@ class SessionCatalog(
     listTables(DEFAULT_DATABASE).foreach { table =>
       dropTable(table, ignoreIfNotExists = false, purge = false)
     }
-    listFunctions(DEFAULT_DATABASE).map(_._1).foreach { func =>
-      if (func.database.isDefined) {
-        dropFunction(func, ignoreIfNotExists = false)
-      } else {
-        dropTempFunction(func.funcName, ignoreIfNotExists = false)
-      }
-    }
+    // Temp functions are dropped below, we only need to drop permanent functions here.
+    externalCatalog.listFunctions(DEFAULT_DATABASE, "*").map { f =>
+      FunctionIdentifier(f, Some(DEFAULT_DATABASE))
+    }.foreach(dropFunction(_, ignoreIfNotExists = false))
     clearTempTables()
     globalTempViewManager.clear()
     functionRegistry.clear()


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
