GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/20841

    [SPARK-23706][PYTHON] spark.conf.get(value, default=None) should produce None in PySpark

    ## What changes were proposed in this pull request?
    
    Scala:
    
    ```scala
    scala> spark.conf.get("hey", null)
    res1: String = null
    ```
    
    ```scala
    scala> spark.conf.get("spark.sql.sources.partitionOverwriteMode", null)
    res2: String = null
    ```
    
    Python:
    
    **Before**
    
    ```python
    >>> spark.conf.get("hey", None)
    ...
    py4j.protocol.Py4JJavaError: An error occurred while calling o30.get.
    : java.util.NoSuchElementException: hey
    ...
    ```
    
    ```python
    >>> spark.conf.get("spark.sql.sources.partitionOverwriteMode", None)
    u'STATIC'
    ```
    
    **After**
    
    ```python
    >>> spark.conf.get("hey", None) is None
    True
    ```
    
    ```python
    >>> spark.conf.get("spark.sql.sources.partitionOverwriteMode", None) is None
    True
    ```
    
    Note that this PR preserves the existing behavior for the case below, where no default argument is passed at all:
    
    ```python
    >>> spark.conf.get("spark.sql.sources.partitionOverwriteMode")
    u'STATIC'
    ```
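
    For context on the "Before" behavior above: the Python wrapper treated `None` itself as the marker for "no default given", so an explicit `default=None` was indistinguishable from omitting the argument and the call fell through to the one-argument JVM-side `get`, which throws for unset keys. The natural fix is a dedicated sentinel object. Below is a minimal, self-contained sketch of that pattern; the class and names are illustrative stand-ins, not the exact code in this PR:

    ```python
    class _NoValueType(object):
        """Unique marker meaning 'the caller passed no default at all'."""
        def __repr__(self):
            return "<no value>"

    _NoValue = _NoValueType()

    class RuntimeConfigSketch(object):
        """Illustrative stand-in for PySpark's RuntimeConfig wrapper."""

        def __init__(self, jconf):
            self._jconf = jconf  # the JVM-side RuntimeConfig object

        def get(self, key, default=_NoValue):
            if default is _NoValue:
                # No default supplied at all: defer to the one-argument JVM
                # get, which raises NoSuchElementException for unset keys.
                return self._jconf.get(key)
            # An explicit default, including None, is forwarded as-is,
            # mirroring Scala's spark.conf.get(key, null) returning null.
            return self._jconf.get(key, default)
    ```

    Comparing with `is _NoValue` (identity, not equality) keeps `None` usable as a legitimate default value that callers can pass through.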
    
    
    ## How was this patch tested?
    
    Manually tested and unit tests were added.
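
    For reference, a minimal pytest-style sketch of the behavior the added tests should cover; the `spark` fixture and the test name are hypothetical, not the PR's actual test code:

    ```python
    # A pytest-style sketch; `spark` is assumed to be a SparkSession fixture.
    def test_conf_get_with_none_default(spark):
        # An explicit None default for an unset key now yields None ...
        assert spark.conf.get("hey", None) is None
        # ... including keys whose value comes only from a conf default.
        assert spark.conf.get("spark.sql.sources.partitionOverwriteMode", None) is None
        # Omitting the default still falls back to the conf's default value.
        assert spark.conf.get("spark.sql.sources.partitionOverwriteMode") == "STATIC"
    ```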


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark spark-conf-get

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20841.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20841
    
----
commit 1a6cfcea2fd23a2a8b7cd0604507a8eb502962a6
Author: hyukjinkwon <gurwls223@...>
Date:   2018-03-16T04:09:58Z

    spark.conf.get(value, default=None) should produce None in PySpark

----


---
