GitHub user bomeng opened a pull request:

    https://github.com/apache/spark/pull/11568

    [SPARK-13727] [SQL] SparkConf.contains does not consider deprecated keys

    ## What changes were proposed in this pull request?
    The contains() method is not consistent with get() when the key is deprecated. For example:
    import org.apache.spark.SparkConf
    val conf = new SparkConf()
    conf.set("spark.io.compression.lz4.block.size", "12345")  /* display some 
deprecated warning message */
    conf.get("spark.io.compression.lz4.block.size") /* return 12345 */
    conf.get("spark.io.compression.lz4.blockSize") /* return 12345 */
    conf.contains("spark.io.compression.lz4.block.size") /* return true */
    **conf.contains("spark.io.compression.lz4.blockSize") /* return false */**
    
    The fix makes contains() behave consistently with get() for deprecated keys.
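
    A minimal, self-contained sketch of the idea (not the actual patch): contains() performs
    the same deprecated-key fallback that get() already does. The `alternatives` map and the
    DeprecatedKeySketch object below are stand-ins for illustration, not SparkConf's real internals.

    import scala.collection.mutable

    object DeprecatedKeySketch {
      // Hypothetical mapping from a current key to its deprecated spelling;
      // SparkConf keeps a richer structure, this is only for illustration.
      private val alternatives = Map(
        "spark.io.compression.lz4.blockSize" -> "spark.io.compression.lz4.block.size")

      private val settings = mutable.Map[String, String]()

      def set(key: String, value: String): Unit = settings(key) = value

      // get() already falls back to the deprecated alternative of a key.
      def get(key: String): Option[String] =
        settings.get(key).orElse(alternatives.get(key).flatMap(settings.get))

      // The idea of the fix: contains() applies the same fallback, so it agrees with get().
      def contains(key: String): Boolean =
        settings.contains(key) || alternatives.get(key).exists(settings.contains)

      def main(args: Array[String]): Unit = {
        set("spark.io.compression.lz4.block.size", "12345")
        println(get("spark.io.compression.lz4.blockSize"))      // Some(12345)
        println(contains("spark.io.compression.lz4.blockSize")) // true, matching get()
      }
    }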
    
    ## How was this patch tested?
    I've added a test case for this.
    
    Unit tests should be sufficient.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/bomeng/spark SPARK-13727

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/11568.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #11568
    
----
commit 91333f9b1c267a41b71b843b40ebaefc5e60e215
Author: bomeng <bm...@us.ibm.com>
Date:   2016-03-08T00:25:38Z

    fix SPARK-13727 with unit test

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

