[ https://issues.apache.org/jira/browse/SPARK-20557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-20557:
------------------------------------

    Assignee: Xiao Li  (was: Apache Spark)

> JdbcUtils doesn't support java.sql.Types.TIMESTAMP_WITH_TIMEZONE
> ----------------------------------------------------------------
>
>                 Key: SPARK-20557
>                 URL: https://issues.apache.org/jira/browse/SPARK-20557
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.1.0, 2.3.0
>            Reporter: Jannik Arndt
>            Assignee: Xiao Li
>              Labels: easyfix, jdbc, oracle, sql, timestamp
>             Fix For: 2.3.0
>
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> Reading from an Oracle DB table with a column of type TIMESTAMP WITH TIME 
> ZONE via JDBC ({{spark.sqlContext.read.format("jdbc").option(...).load()}}) 
> results in an error:
> {{Unsupported type -101}}
> {{org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:209)}}
> {{org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$5.apply(JdbcUtils.scala:246)}}
> {{org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$5.apply(JdbcUtils.scala:246)}}
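> For reference, a minimal reproduction sketch (the connection URL, table, 
> column names, and credentials below are placeholders, not taken from the 
> original report):
> {code:scala}
> // Assumes an Oracle table such as:
> //   CREATE TABLE events (id NUMBER, occurred_at TIMESTAMP WITH TIME ZONE)
> val df = spark.read
>   .format("jdbc")
>   .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL") // placeholder URL
>   .option("dbtable", "events")                           // placeholder table
>   .option("user", "scott")                               // placeholder credentials
>   .option("password", "tiger")
>   .load() // fails in getCatalystType with "Unsupported type -101"
> {code}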
> That is because the type 
> {{[java.sql.Types.TIMESTAMP_WITH_TIMEZONE|https://docs.oracle.com/javase/8/docs/api/java/sql/Types.html#TIMESTAMP_WITH_TIMEZONE]}}
>  (available since Java 1.8) is not handled in 
> {{[JdbcUtils.scala|https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L225]}}.
> This is similar to SPARK-7039.
> I created a pull request with a fix.
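> Until that fix is merged, one possible workaround is to register a custom 
> {{JdbcDialect}} that maps the type to Catalyst's {{TimestampType}}. This is 
> a sketch, not the actual pull request; the mapping to {{TimestampType}} and 
> the handling of Oracle's vendor code -101 are assumptions:
> {code:scala}
> import java.sql.Types
> import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
> import org.apache.spark.sql.types.{DataType, MetadataBuilder, TimestampType}
>
> // Hypothetical dialect: the Oracle driver may report TIMESTAMP WITH TIME ZONE
> // either as the standard java.sql.Types.TIMESTAMP_WITH_TIMEZONE (2014) or as
> // the vendor-specific code -101 seen in the error above.
> object OracleTimestampTzDialect extends JdbcDialect {
>   override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")
>
>   override def getCatalystType(
>       sqlType: Int,
>       typeName: String,
>       size: Int,
>       md: MetadataBuilder): Option[DataType] = sqlType match {
>     case Types.TIMESTAMP_WITH_TIMEZONE | -101 => Some(TimestampType)
>     case _ => None // fall back to Spark's default mapping
>   }
> }
>
> // Registered dialects take precedence over the built-in OracleDialect.
> JdbcDialects.registerDialect(OracleTimestampTzDialect)
> {code}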


