[ https://issues.apache.org/jira/browse/SPARK-13727?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-13727:
------------------------------------

    Assignee:     (was: Apache Spark)

> SparkConf.contains does not consider deprecated keys
> ----------------------------------------------------
>
>                 Key: SPARK-13727
>                 URL: https://issues.apache.org/jira/browse/SPARK-13727
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> This makes it somewhat inconsistent with other SparkConf APIs. For example:
> {code}
> scala> import org.apache.spark.SparkConf
> import org.apache.spark.SparkConf
>
> scala> val conf = new SparkConf().set("spark.io.compression.lz4.block.size", "12345")
> 16/03/07 10:55:17 WARN spark.SparkConf: The configuration key 'spark.io.compression.lz4.block.size' has been deprecated as of Spark 1.4 and may be removed in the future. Please use the new key 'spark.io.compression.lz4.blockSize' instead.
> conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@221e8982
>
> scala> conf.get("spark.io.compression.lz4.blockSize")
> res0: String = 12345
>
> scala> conf.contains("spark.io.compression.lz4.blockSize")
> res1: Boolean = false
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
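The inconsistency reported above can be sketched outside Spark with a minimal, hypothetical config holder (the `Conf` class and `deprecatedAliases` map below are illustrative stand-ins, not Spark's actual implementation): `get` translates a new key to its deprecated alias when looking up a value, so `contains` stays consistent only if it performs the same translation, e.g. by delegating to `get`.

```scala
// Minimal sketch of the deprecated-key translation (assumed names, not Spark code).
object DeprecatedKeySketch {
  // Hypothetical mapping: new key -> the deprecated key it supersedes.
  val deprecatedAliases: Map[String, String] = Map(
    "spark.io.compression.lz4.blockSize" -> "spark.io.compression.lz4.block.size")

  class Conf {
    private var settings = Map.empty[String, String]

    def set(key: String, value: String): this.type = {
      settings += key -> value
      this
    }

    // Look up the key directly; fall back to its deprecated alias, if any.
    def get(key: String): Option[String] =
      settings.get(key).orElse(deprecatedAliases.get(key).flatMap(settings.get))

    // The point of the bug report: contains must apply the same
    // translation as get, e.g. by delegating to it.
    def contains(key: String): Boolean = get(key).isDefined
  }

  def main(args: Array[String]): Unit = {
    val conf = new Conf().set("spark.io.compression.lz4.block.size", "12345")
    println(conf.get("spark.io.compression.lz4.blockSize"))      // Some(12345)
    println(conf.contains("spark.io.compression.lz4.blockSize")) // true
  }
}
```

With `contains` defined independently as `settings.contains(key)`, the last call would return `false`, which is the behavior the report demonstrates.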