This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 950adeab4d2 [SPARK-41528][CONNECT][FOLLOW-UP] Do not set null as a string for remote option
950adeab4d2 is described below

commit 950adeab4d2869ccc034f4dc16b9d12fcc810aa4
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Thu Dec 22 12:23:09 2022 +0900

    [SPARK-41528][CONNECT][FOLLOW-UP] Do not set null as a string for remote option
    
    ### What changes were proposed in this pull request?
    
    This PR is a followup of https://github.com/apache/spark/pull/39041 that avoids setting `null` as a string for the remote configuration and option.
    
    ### Why are the changes needed?
    
    To keep the default behavior the same as regular PySpark. Otherwise, it attempts to create a remote SparkSession by default, which later fails because of the `null` Spark Connect URL.
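
    The null guard in the diff below can be sketched in isolation like this (the class and method names here are hypothetical, not Spark's actual launcher API):

    ```java
    import java.util.HashMap;
    import java.util.Map;

    // Minimal sketch of the guard added in the patch. RemoteEnvGuard and
    // childEnv are illustrative names, not part of Spark's launcher.
    public class RemoteEnvGuard {
        // Only export SPARK_REMOTE when a remote URL was actually supplied,
        // so a plain ./bin/pyspark run no longer leaks a null value into
        // the child process environment.
        public static Map<String, String> childEnv(String remote) {
            Map<String, String> env = new HashMap<>();
            if (remote != null) {
                env.put("SPARK_REMOTE", remote);
            }
            return env;
        }

        public static void main(String[] args) {
            // Plain ./bin/pyspark: no remote, so no SPARK_REMOTE variable.
            System.out.println(childEnv(null).containsKey("SPARK_REMOTE")); // false
            // Spark Connect: the URL is forwarded as-is.
            System.out.println(childEnv("sc://localhost").get("SPARK_REMOTE")); // sc://localhost
        }
    }
    ```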
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, the main code has not been released yet.
    It fixes `./bin/pyspark` case.
    
    ### How was this patch tested?
    
    Manually tested via:
    
    ```bash
    ./bin/pyspark
    ```
    
    Closes #39168 from HyukjinKwon/SPARK-41528-followup.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 .../java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java     | 4 +++-
 python/pyspark/testing/connectutils.py                                | 1 -
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java b/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java
index 520a147751d..055123d8a74 100644
--- a/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java
+++ b/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java
@@ -349,7 +349,9 @@ class SparkSubmitCommandBuilder extends AbstractCommandBuilder {
       // pass conf spark.pyspark.python to python by environment variable.
       env.put("PYSPARK_PYTHON", conf.get(SparkLauncher.PYSPARK_PYTHON));
     }
-    env.put("SPARK_REMOTE", remote);
+    if (remote != null) {
+      env.put("SPARK_REMOTE", remote);
+    }
     if (!isEmpty(pyOpts)) {
       pyargs.addAll(parseOptionString(pyOpts));
     }
diff --git a/python/pyspark/testing/connectutils.py b/python/pyspark/testing/connectutils.py
index 21bdf35c6f3..dcbc09f2210 100644
--- a/python/pyspark/testing/connectutils.py
+++ b/python/pyspark/testing/connectutils.py
@@ -46,7 +46,6 @@ if have_pandas and have_pyarrow and have_grpc:
     connect_url = "--remote sc://localhost"
     jars_args = "--jars %s" % connect_jar
     plugin_args = "--conf spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin"
-    os.environ["PYSPARK_SUBMIT_ARGS"] = " ".join([jars_args, plugin_args, existing_args])
     os.environ["PYSPARK_SUBMIT_ARGS"] = " ".join(
         [connect_url, jars_args, plugin_args, existing_args]
     )


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
