This is an automated email from the ASF dual-hosted git repository.

ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 1100d75f53c  [SPARK-39579][PYTHON][FOLLOWUP] fix functionExists(functionName, dbName) when dbName is not None
1100d75f53c is described below

commit 1100d75f53c16f44dd414b8a0be477760420507d
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Tue Jul 5 19:53:13 2022 +0800

    [SPARK-39579][PYTHON][FOLLOWUP] fix functionExists(functionName, dbName) when dbName is not None

    ### What changes were proposed in this pull request?
    Fix `functionExists(functionName, dbName)`.

    ### Why are the changes needed?
    https://github.com/apache/spark/pull/36977 introduced a bug in `functionExists(functionName, dbName)`: when dbName is not None, it should call `self._jcatalog.functionExists(dbName, functionName)`.

    ### Does this PR introduce _any_ user-facing change?
    No.

    ### How was this patch tested?
    Existing test suite.

    Closes #37088 from zhengruifeng/py_3l_fix_functionExists.

    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Ruifeng Zheng <ruife...@apache.org>
---
 python/pyspark/sql/catalog.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/pyspark/sql/catalog.py b/python/pyspark/sql/catalog.py
index 42c040c284b..7efaf14eb82 100644
--- a/python/pyspark/sql/catalog.py
+++ b/python/pyspark/sql/catalog.py
@@ -359,7 +359,7 @@ class Catalog:
                 "a future version. Use functionExists(`dbName.tableName`) instead.",
                 FutureWarning,
             )
-            return self._jcatalog.functionExists(self.currentDatabase(), functionName)
+            return self._jcatalog.functionExists(dbName, functionName)
 
     def getFunction(self, functionName: str) -> Function:
         """Get the function with the specified name. This function can be a temporary function or a
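For readers who want the fix in context, below is a minimal, self-contained sketch of the corrected branch of `Catalog.functionExists`. It is not the verbatim Spark source: `FakeJCatalog` is a hypothetical stand-in for the py4j proxy that the real class holds in `self._jcatalog`, and the deprecation text approximates the message quoted in the diff above.

```python
# Sketch of the corrected dispatch in Catalog.functionExists.
# FakeJCatalog is a hypothetical stub standing in for the JVM-side
# catalog proxy; only the branch touched by this commit is shown.
import warnings
from typing import Optional


class FakeJCatalog:
    """Stand-in for the py4j proxy to the JVM catalog (for illustration only)."""

    def functionExists(self, *args: str) -> bool:
        # Accepts (functionName,) or (dbName, functionName), mirroring
        # the two JVM-side overloads.
        return args in {("abs",), ("default", "abs")}


class Catalog:
    def __init__(self) -> None:
        self._jcatalog = FakeJCatalog()

    def functionExists(self, functionName: str, dbName: Optional[str] = None) -> bool:
        if dbName is None:
            # One-argument form: resolved against the current database.
            return self._jcatalog.functionExists(functionName)
        warnings.warn(
            "`dbName` has been deprecated and might be removed in "
            "a future version. Use functionExists(`dbName.tableName`) instead.",
            FutureWarning,
        )
        # The fix: forward the caller's dbName instead of always
        # substituting self.currentDatabase(), so lookups in a
        # non-current database behave as documented.
        return self._jcatalog.functionExists(dbName, functionName)


catalog = Catalog()
assert catalog.functionExists("abs")                    # current database
assert catalog.functionExists("abs", dbName="default")  # explicit database
```

The two asserts exercise both call forms; before this commit, the second form silently ignored `dbName` and checked the current database instead.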