[ 
https://issues.apache.org/jira/browse/SPARK-26902?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17085759#comment-17085759
 ] 

Jorge Machado commented on SPARK-26902:
---------------------------------------

What about supporting the java.time.temporal.Temporal interface?
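(Editorial note: a minimal sketch of the wrinkle a generic Temporal-based conversion would hit, using a hypothetical toInstant helper that is not Spark API. Instant.from accepts any TemporalAccessor, but it throws at runtime for temporal types that do not pin down an instant, such as LocalDateTime without a zone, which is presumably why concrete external types are proposed instead.)

    import java.time.{Instant, LocalDateTime, ZonedDateTime}
    import java.time.temporal.TemporalAccessor

    // Hypothetical helper, not Spark API: normalize any temporal value to Instant.
    def toInstant(t: TemporalAccessor): Instant = Instant.from(t)

    toInstant(ZonedDateTime.now())                 // works: carries zone + offset
    toInstant(LocalDateTime.of(2019, 2, 18, 0, 0)) // throws DateTimeException: no zone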

> Support java.time.Instant as an external type of TimestampType
> --------------------------------------------------------------
>
>                 Key: SPARK-26902
>                 URL: https://issues.apache.org/jira/browse/SPARK-26902
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Major
>             Fix For: 3.0.0
>
>
> Currently, Spark supports the java.sql.Date and java.sql.Timestamp types as 
> external types for Catalyst's DateType and TimestampType; it accepts and 
> produces values of these types. Since Java 8, the base classes for dates and 
> timestamps have been java.time.Instant, java.time.LocalDate/LocalDateTime, 
> and java.time.ZonedDateTime, so new converters from/to Instant need to be 
> added. The Instant type holds epoch seconds (and nanoseconds) and maps 
> directly to Catalyst's TimestampType.
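(Editorial note: a sketch of that mapping, an illustration rather than Spark's exact internal code. Catalyst's TimestampType stores microseconds since the epoch, so the conversion boils down to the following.)

    import java.time.Instant

    // Instant -> microseconds since 1970-01-01T00:00:00Z, the internal
    // representation behind Catalyst's TimestampType. Nanoseconds below
    // microsecond precision are truncated.
    def instantToMicros(i: Instant): Long =
      Math.addExact(Math.multiplyExact(i.getEpochSecond, 1000000L), i.getNano / 1000L)

    // And back: split the microseconds into whole seconds and a nano adjustment.
    def microsToInstant(us: Long): Instant =
      Instant.ofEpochSecond(Math.floorDiv(us, 1000000L),
        Math.floorMod(us, 1000000L) * 1000L)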
> Main motivations for the changes:
> - Smoothly support the Java 8 time API.
> - Avoid the inconsistency between the calendar used inside Spark 3.0 
> (Proleptic Gregorian) and the one used by java.sql.Timestamp (the hybrid 
> Julian + Gregorian calendar); see the sketch after this list.
> - Make the conversion independent of the current system time zone.
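(Editorial note: a minimal sketch of the calendar mismatch mentioned above. The two epoch values below disagree by several days, because java.sql.Timestamp interprets the wall-clock fields in the hybrid Julian + Gregorian calendar, while java.time uses the Proleptic Gregorian calendar.)

    import java.time.{LocalDateTime, ZoneId}

    // Same wall-clock fields, two calendars:
    val hybrid = java.sql.Timestamp.valueOf("1000-01-01 00:00:00").getTime
    val proleptic = LocalDateTime.of(1000, 1, 1, 0, 0)
      .atZone(ZoneId.systemDefault())
      .toInstant
      .toEpochMilli

    assert(hybrid != proleptic) // off by several days for year 1000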
> When collecting values of DateType/TimestampType, the following SQL config 
> controls the type of the returned values:
>  - spark.sql.catalyst.timestampType, with the supported values 
> "java.sql.Timestamp" (the default) and "java.time.Instant"



