This is an automated email from the ASF dual-hosted git repository.

ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 774440b66f3 [SPARK-43740][PYTHON][CONNECT] Hide unsupported `session` methods from auto-completion
774440b66f3 is described below

commit 774440b66f3378868f38018f4e5fc7b2de950016
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Tue May 23 17:20:54 2023 +0800

    [SPARK-43740][PYTHON][CONNECT] Hide unsupported `session` methods from auto-completion
    
    ### What changes were proposed in this pull request?
    Move unsupported functions to `__getattr__`, except `getActiveSession` (this approach does not work for a `classmethod`).
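
    A minimal standalone sketch of the idea (illustrative only, not the actual PySpark code; the class and error type below are made up for the example):

    ```python
    class Session:
        """Only supported members are defined on the class, so they are the
        only ones offered by dir() and tab-completion."""

        def sql(self, query: str) -> str:
            return f"would run: {query}"

        def __getattr__(self, name: str):
            # __getattr__ is only invoked when normal attribute lookup fails,
            # so unsupported names never show up in auto-completion but still
            # fail with a descriptive error when accessed.
            if name in ("newSession", "sparkContext"):
                raise NotImplementedError(f"{name}() is not supported")
            raise AttributeError(name)


    s = Session()
    print("newSession" in dir(s))   # False -> hidden from completion
    try:
        s.newSession
    except NotImplementedError as e:
        print(e)                    # newSession() is not supported
    ```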
    
    ### Why are the changes needed?
    Hide unsupported functions from auto-completion
    
    before:
    <img width="1464" alt="image" 
src="https://github.com/apache/spark/assets/7322292/6a3efc83-99ed-4b73-b681-13640b3de7a0";>
    
    after:
    <img width="1311" alt="image" 
src="https://github.com/apache/spark/assets/7322292/79366a32-718b-4208-ae1f-3e749971d6d2";>
    
    ### Does this PR introduce _any_ user-facing change?
    yes
    
    ### How was this patch tested?
    Manually checked in `ipython`.
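
    A rough sketch of what such a manual check might look like (assuming a running Spark Connect server; the endpoint below is a placeholder and the snippet is illustrative, not part of the patch):

    ```python
    from pyspark.sql.connect.session import SparkSession

    # Placeholder endpoint; adjust to your environment.
    spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

    # Unsupported methods no longer appear in dir()/tab-completion ...
    print("newSession" in dir(spark))   # expected: False

    # ... but accessing them still raises a descriptive error via __getattr__.
    try:
        spark.newSession
    except Exception as e:
        print(type(e).__name__, e)
    ```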
    
    Closes #41272 from zhengruifeng/session_unsupported.
    
    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Ruifeng Zheng <ruife...@apache.org>
---
 python/pyspark/sql/connect/session.py | 45 ++++++++---------------------------
 1 file changed, 10 insertions(+), 35 deletions(-)

diff --git a/python/pyspark/sql/connect/session.py b/python/pyspark/sql/connect/session.py
index 511e041c535..1d0a5c5d7b6 100644
--- a/python/pyspark/sql/connect/session.py
+++ b/python/pyspark/sql/connect/session.py
@@ -553,49 +553,24 @@ class SparkSession:
             error_class="NOT_IMPLEMENTED", message_parameters={"feature": 
"getActiveSession()"}
         )
 
-    def newSession(self) -> Any:
-        raise PySparkNotImplementedError(
-            error_class="NOT_IMPLEMENTED", message_parameters={"feature": 
"newSession()"}
-        )
-
     @property
     def conf(self) -> RuntimeConf:
         return RuntimeConf(self.client)
 
-    @property
-    def sparkContext(self) -> Any:
-        raise PySparkNotImplementedError(
-            error_class="NOT_IMPLEMENTED", message_parameters={"feature": 
"sparkContext()"}
-        )
-
     @property
     def streams(self) -> "StreamingQueryManager":
         return StreamingQueryManager(self)
 
-    @property
-    def _jsc(self) -> None:
-        raise PySparkAttributeError(
-            error_class="JVM_ATTRIBUTE_NOT_SUPPORTED", 
message_parameters={"attr_name": "_jsc"}
-        )
-
-    @property
-    def _jconf(self) -> None:
-        raise PySparkAttributeError(
-            error_class="JVM_ATTRIBUTE_NOT_SUPPORTED", 
message_parameters={"attr_name": "_jconf"}
-        )
-
-    @property
-    def _jvm(self) -> None:
-        raise PySparkAttributeError(
-            error_class="JVM_ATTRIBUTE_NOT_SUPPORTED", 
message_parameters={"attr_name": "_jvm"}
-        )
-
-    @property
-    def _jsparkSession(self) -> None:
-        raise PySparkAttributeError(
-            error_class="JVM_ATTRIBUTE_NOT_SUPPORTED",
-            message_parameters={"attr_name": "_jsparkSession"},
-        )
+    def __getattr__(self, name: str) -> Any:
+        if name in ["_jsc", "_jconf", "_jvm", "_jsparkSession"]:
+            raise PySparkAttributeError(
+                error_class="JVM_ATTRIBUTE_NOT_SUPPORTED", 
message_parameters={"attr_name": name}
+            )
+        elif name in ["newSession", "sparkContext"]:
+            raise PySparkNotImplementedError(
+                error_class="NOT_IMPLEMENTED", message_parameters={"feature": 
f"{name}()"}
+            )
+        return object.__getattribute__(self, name)
 
     @property
     def udf(self) -> "UDFRegistration":


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
