[ https://issues.apache.org/jira/browse/SPARK-11474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14990335#comment-14990335 ]

Apache Spark commented on SPARK-11474:
--------------------------------------

User 'huaxingao' has created a pull request for this issue:
https://github.com/apache/spark/pull/9473

> Options to jdbc load are lower cased
> ------------------------------------
>
>                 Key: SPARK-11474
>                 URL: https://issues.apache.org/jira/browse/SPARK-11474
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.5.1
>         Environment: Linux & Mac
>            Reporter: Stephen Samuel
>            Priority: Minor
>
> We recently upgraded from Spark 1.3.0 to 1.5.1, and one of the features we 
> wanted to take advantage of was the fetchSize option added to the JDBC 
> DataFrame reader.
> In 1.5.1 there appears to be a bug or regression whereby the options map has 
> its keys lowercased. This means the properties that existed prior to 1.4, 
> such as dbtable, url and driver, still work, but the newer fetchSize gets 
> converted to fetchsize.
> To reproduce:
> val conf = new SparkConf(true).setMaster("local").setAppName("fetchtest")
> val sc = new SparkContext(conf)
> val sql = new SQLContext(sc)
> val options = Map("url" -> ...., "driver" -> ...., "fetchSize" -> ....)
> val df = sql.load("jdbc", options)
> Set a breakpoint at line 371 in JDBCRDD and you'll see the options are all 
> lowercased, so:
> val fetchSize = properties.getProperty("fetchSize", "0").toInt
> results in 0.
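> To illustrate the mechanism, here is a self-contained sketch (the literal 
> values are made up): once the key has been lowercased, the case-sensitive 
> Properties lookup always falls through to the default.
> // key as it arrives in JDBCRDD after the options map is lowercased
> val properties = new java.util.Properties()
> properties.setProperty("fetchsize", "1000")
> // the lookup uses the original camel-cased key, so it misses
> properties.getProperty("fetchSize", "0").toInt   // => 0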
> Now I know sql.load is deprecated, but this might be occurring on other 
> methods too. The workaround is to use the java.util.Properties overload, 
> which keeps the keys case-sensitive, as sketched below.
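> A minimal sketch of that workaround (the URL, driver and table name are 
> placeholders; sql is the SQLContext from the snippet above):
> import java.util.Properties
> // java.util.Properties keys keep their case, so fetchSize survives intact
> val props = new Properties()
> props.setProperty("driver", "org.postgresql.Driver")
> props.setProperty("fetchSize", "1000")
> val df = sql.read.jdbc("jdbc:postgresql://host/db", "mytable", props)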



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
