This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 5ca3467b8653 [SPARK-47729][PYTHON][TESTS] Get the proper default port for pyspark-connect testcases
5ca3467b8653 is described below

commit 5ca3467b86531b971f92c4d9da3ecc2735ae2214
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Thu Apr 4 07:33:24 2024 -0700

    [SPARK-47729][PYTHON][TESTS] Get the proper default port for pyspark-connect testcases
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to get the proper default port for `pyspark-connect` testcases.
    
    ### Why are the changes needed?
    
    `pyspark-connect` cannot access the JVM, so it cannot get the randomized port assigned by the JVM.
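    The guard pattern the patch uses can be sketched as below. This is a simplified, hypothetical illustration, not the actual `pyspark` source: the real `is_remote_only` lives in `pyspark.util` and detects a connect-only install differently, and the environment-variable check here is an assumption for the sketch.

    ```python
    import os

    # Hypothetical stand-in for pyspark.util.is_remote_only(): in a
    # pyspark-connect-only install there is no JVM gateway to talk to.
    # (The real implementation detects this differently; assumption.)
    def is_remote_only() -> bool:
        return os.environ.get("SPARK_CONNECT_MODE_ENABLED") is not None

    # Spark Connect's documented default gRPC port.
    DEFAULT_PORT = 15002

    def default_port() -> int:
        # Only consult the JVM-backed local session for the randomized
        # test port when a JVM is actually reachable, i.e. when we are
        # NOT in a remote-only (pyspark-connect) environment.
        if "SPARK_TESTING" in os.environ and not is_remote_only():
            # ...would query the local JVM-started server for its
            # assigned port here...
            pass
        return DEFAULT_PORT
    ```

    With the extra `not is_remote_only()` condition, test runs under `pyspark-connect` skip the JVM lookup entirely and fall back to the static default port.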
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, `pyspark-connect` is not published yet, and this is a test-only change.
    
    ### How was this patch tested?
    
    Manually tested.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #45875 from HyukjinKwon/SPARK-47729.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 python/pyspark/sql/connect/client/core.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/python/pyspark/sql/connect/client/core.py b/python/pyspark/sql/connect/client/core.py
index dd7fae881aec..b731960bbaf3 100644
--- a/python/pyspark/sql/connect/client/core.py
+++ b/python/pyspark/sql/connect/client/core.py
@@ -56,6 +56,7 @@ import grpc
 from google.protobuf import text_format, any_pb2
 from google.rpc import error_details_pb2
 
+from pyspark.util import is_remote_only
 from pyspark.accumulators import SpecialAccumulatorIds
 from pyspark.loose_version import LooseVersion
 from pyspark.version import __version__
@@ -292,7 +293,7 @@ class DefaultChannelBuilder(ChannelBuilder):
 
     @staticmethod
     def default_port() -> int:
-        if "SPARK_TESTING" in os.environ:
+        if "SPARK_TESTING" in os.environ and not is_remote_only():
             from pyspark.sql.session import SparkSession as PySparkSession
 
             # In the case when Spark Connect uses the local mode, it starts the regular Spark


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
