This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new ad5a535  [SPARK-36896][PYTHON] Return boolean for `dropTempView` and `dropGlobalTempView`
ad5a535 is described below

commit ad5a53511e46d4a432f1143cd3d6dee8af1f224a
Author: Xinrong Meng <xinrong.m...@databricks.com>
AuthorDate: Thu Sep 30 14:27:00 2021 +0900

    [SPARK-36896][PYTHON] Return boolean for `dropTempView` and `dropGlobalTempView`
    
    ### What changes were proposed in this pull request?
    Currently `dropTempView` and `dropGlobalTempView` don't return a value, which conflicts with their docstring: `Returns true if this view is dropped successfully, false otherwise.` It is also inconsistent with the same API in other languages.
    
    This PR fixes that by returning the boolean result of the underlying JVM catalog call.
    
    ### Why are the changes needed?
    To be consistent with the same API in other languages.
    
    ### Does this PR introduce _any_ user-facing change?
    Yes.
    #### From
    ```py
    # dropTempView
    >>> spark.createDataFrame([(1, 1)]).createTempView("my_table")
    >>> spark.table("my_table").collect()
    [Row(_1=1, _2=1)]
    >>> spark.catalog.dropTempView("my_table")
    >>> spark.catalog.dropTempView("my_table")
    
    # dropGlobalTempView
    >>> spark.createDataFrame([(1, 1)]).createGlobalTempView("my_table")
    >>> spark.table("global_temp.my_table").collect()
    [Row(_1=1, _2=1)]
    >>> spark.catalog.dropGlobalTempView("my_table")
    >>> spark.catalog.dropGlobalTempView("my_table")
    ```
    
    #### To
    ```py
    # dropTempView
    >>> spark.createDataFrame([(1, 1)]).createTempView("my_table")
    >>> spark.table("my_table").collect()
    [Row(_1=1, _2=1)]
    >>> spark.catalog.dropTempView("my_table")
    True
    >>> spark.catalog.dropTempView("my_table")
    False
    
    # dropGlobalTempView
    >>> spark.createDataFrame([(1, 1)]).createGlobalTempView("my_table")
    >>> spark.table("global_temp.my_table").collect()
    [Row(_1=1, _2=1)]
    >>> spark.catalog.dropGlobalTempView("my_table")
    True
    >>> spark.catalog.dropGlobalTempView("my_table")
    False
    ```
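
    The boolean now lets callers branch on whether a view actually existed. A minimal sketch of the contract below uses a hypothetical dict-backed stand-in catalog rather than a live `SparkSession` (the real call goes through the JVM catalog); `FakeCatalog` and its methods are illustrative only.

```py
# Hypothetical stand-in illustrating the fixed return-value contract of
# dropTempView: True if the view existed and was dropped, False otherwise.
class FakeCatalog:
    def __init__(self):
        self._views = {}

    def createTempView(self, name):
        self._views[name] = object()

    def dropTempView(self, name):
        # Mirrors the fix: propagate a boolean instead of always
        # returning None as the old Python wrapper did.
        return self._views.pop(name, None) is not None

catalog = FakeCatalog()
catalog.createTempView("my_table")
print(catalog.dropTempView("my_table"))  # True  (view existed, dropped)
print(catalog.dropTempView("my_table"))  # False (already gone)
```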
    
    ### How was this patch tested?
    Existing tests.
    
    Closes #34147 from xinrong-databricks/fix_return.
    
    Authored-by: Xinrong Meng <xinrong.m...@databricks.com>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/sql/catalog.py   | 8 +++++---
 python/pyspark/sql/dataframe.py | 6 ++++++
 2 files changed, 11 insertions(+), 3 deletions(-)

diff --git a/python/pyspark/sql/catalog.py b/python/pyspark/sql/catalog.py
index 4990133..3760f96 100644
--- a/python/pyspark/sql/catalog.py
+++ b/python/pyspark/sql/catalog.py
@@ -50,7 +50,7 @@ class Catalog(object):
     @since(2.0)
     def setCurrentDatabase(self, dbName):
         """Sets the current default database in this session."""
-        return self._jcatalog.setCurrentDatabase(dbName)
+        self._jcatalog.setCurrentDatabase(dbName)
 
     @since(2.0)
     def listDatabases(self):
@@ -323,12 +323,13 @@ class Catalog(object):
         >>> spark.table("my_table").collect()
         [Row(_1=1, _2=1)]
         >>> spark.catalog.dropTempView("my_table")
+        True
         >>> spark.table("my_table") # doctest: +IGNORE_EXCEPTION_DETAIL
         Traceback (most recent call last):
             ...
         AnalysisException: ...
         """
-        self._jcatalog.dropTempView(viewName)
+        return self._jcatalog.dropTempView(viewName)
 
     def dropGlobalTempView(self, viewName):
         """Drops the global temporary view with the given view name in the catalog.
@@ -343,12 +344,13 @@ class Catalog(object):
         >>> spark.table("global_temp.my_table").collect()
         [Row(_1=1, _2=1)]
         >>> spark.catalog.dropGlobalTempView("my_table")
+        True
         >>> spark.table("global_temp.my_table") # doctest: +IGNORE_EXCEPTION_DETAIL
         Traceback (most recent call last):
             ...
         AnalysisException: ...
         """
-        self._jcatalog.dropGlobalTempView(viewName)
+        return self._jcatalog.dropGlobalTempView(viewName)
 
     def registerFunction(self, name, f, returnType=None):
         """An alias for :func:`spark.udf.register`.
diff --git a/python/pyspark/sql/dataframe.py b/python/pyspark/sql/dataframe.py
index de289e1..8d4c94f 100644
--- a/python/pyspark/sql/dataframe.py
+++ b/python/pyspark/sql/dataframe.py
@@ -136,6 +136,8 @@ class DataFrame(PandasMapOpsMixin, PandasConversionMixin):
         >>> sorted(df.collect()) == sorted(df2.collect())
         True
         >>> spark.catalog.dropTempView("people")
+        True
+
         """
         warnings.warn(
             "Deprecated in 2.0, use createOrReplaceTempView instead.",
@@ -164,6 +166,7 @@ class DataFrame(PandasMapOpsMixin, PandasConversionMixin):
         ...
         AnalysisException: u"Temporary table 'people' already exists;"
         >>> spark.catalog.dropTempView("people")
+        True
 
         """
         self._jdf.createTempView(name)
@@ -185,6 +188,7 @@ class DataFrame(PandasMapOpsMixin, PandasConversionMixin):
         >>> sorted(df3.collect()) == sorted(df2.collect())
         True
         >>> spark.catalog.dropTempView("people")
+        True
 
         """
         self._jdf.createOrReplaceTempView(name)
@@ -209,6 +213,7 @@ class DataFrame(PandasMapOpsMixin, PandasConversionMixin):
         ...
         AnalysisException: u"Temporary table 'people' already exists;"
         >>> spark.catalog.dropGlobalTempView("people")
+        True
 
         """
         self._jdf.createGlobalTempView(name)
@@ -229,6 +234,7 @@ class DataFrame(PandasMapOpsMixin, PandasConversionMixin):
         >>> sorted(df3.collect()) == sorted(df2.collect())
         True
         >>> spark.catalog.dropGlobalTempView("people")
+        True
 
         """
         self._jdf.createOrReplaceGlobalTempView(name)
