[ https://issues.apache.org/jira/browse/SPARK-26494?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17013093#comment-17013093 ]

Jeff Evans commented on SPARK-26494:
------------------------------------

To be clear, this type represents an instant in time.  From [the 
docs|https://docs.oracle.com/database/121/SUTIL/GUID-CB5D2124-D9AE-4C71-A83D-DFE071FE3542.htm]:

{quote}The TIMESTAMP WITH LOCAL TIME ZONE data type is another variant of 
TIMESTAMP that includes a time zone offset in its value. Data stored in the 
database is normalized to the database time zone, and time zone displacement is 
not stored as part of the column data. When the data is retrieved, it is 
returned in the user's local session time zone.{quote}

So it's really almost the same as a {{TIMESTAMP}}; the only difference is an 
automatic time zone conversion (from the client session's time zone to the DB 
server's time zone on write, and back on read). But that conversion is entirely 
orthogonal to Spark; the type should simply be treated like a {{TIMESTAMP}}.
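That mapping could be sketched as a custom JDBC dialect. A minimal sketch, assuming Oracle's vendor-specific type code -102 (i.e. {{oracle.jdbc.OracleTypes.TIMESTAMPLTZ}}) and Spark's developer API {{JdbcDialects.registerDialect}}; the object name is hypothetical, and this only runs inside a Spark application with the Oracle JDBC driver on the classpath:

{code:scala}
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types._

// Hypothetical dialect that treats Oracle's TIMESTAMP WITH LOCAL TIME ZONE
// (vendor-specific sqlType -102, OracleTypes.TIMESTAMPLTZ) as a plain timestamp.
object OracleTimestampLtzDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:oracle")

  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] = {
    // -102 is OracleTypes.TIMESTAMPLTZ; map it to Catalyst's TimestampType.
    if (sqlType == -102) Some(TimestampType) else None
  }
}

// Register before reading, e.g.:
// JdbcDialects.registerDialect(OracleTimestampLtzDialect)
// spark.read.jdbc(oracleUrl, "my_table", connectionProps)
{code}

Returning {{None}} for every other sqlType falls back to Spark's default mapping, so the dialect only intercepts the one unsupported type.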

> [spark sql] Spark can't map the Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE 
> type when reading
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26494
>                 URL: https://issues.apache.org/jira/browse/SPARK-26494
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: kun'qin 
>            Priority: Minor
>
> When Spark reads an Oracle column of type TIMESTAMP(6) WITH LOCAL TIME ZONE, 
> no matching type can be found.
> For this data type, the sqlType value passed to the getCatalystType function 
> in the JdbcUtils class is -102.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
