[ 
https://issues.apache.org/jira/browse/SPARK-18491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15673719#comment-15673719
 ] 

Damian Momot edited comment on SPARK-18491 at 11/17/16 1:25 PM:
----------------------------------------------------------------

You don't have to break compatibility. java.sql.Timestamp can still be the 
internal/default type. At the same time, it could be possible to define the type 
using Joda-Time (it's on the classpath anyway) as:

{code}
case class Test(id: String, timestamp: org.joda.time.Instant)
{code}

And Dataset[Test] would then infer the timestamp field's schema as TimestampType.

As you said, if it were possible to write custom encoders this could easily be 
extended, but from my brief look there doesn't seem to be a way to do that yet?
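
In the meantime, a minimal sketch of a boundary-conversion workaround, assuming no custom encoder support (TestRow and the helper names here are illustrative only, not Spark API):

{code}
// Sketch only: keep java.sql.Timestamp in the case class Spark encodes,
// and convert to an immutable Joda-Time Instant at the driver boundary.
import org.joda.time.Instant

case class TestRow(id: String, timestamp: java.sql.Timestamp) // encodable today
case class Test(id: String, timestamp: Instant)               // immutable view

def toTest(r: TestRow): Test =
  Test(r.id, new Instant(r.timestamp.getTime))

def fromTest(t: Test): TestRow =
  TestRow(t.id, new java.sql.Timestamp(t.timestamp.getMillis))

// Dataset operations stay on TestRow (which has an encoder); convert only
// outside the Dataset, e.g. after collect(), since Dataset[Test] itself
// would need exactly the missing encoder:
// val tests: Array[Test] = ds.collect().map(toTest)
{code}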



> Spark uses mutable classes for date/time types mapping
> ------------------------------------------------------
>
>                 Key: SPARK-18491
>                 URL: https://issues.apache.org/jira/browse/SPARK-18491
>             Project: Spark
>          Issue Type: Improvement
>            Reporter: Damian Momot
>            Priority: Minor
>
> TimestampType is mapped to java.sql.Timestamp
> DateType is mapped to java.sql.Date
> Both of these Java types are mutable, so their use is highly discouraged, 
> especially in distributed computing, which relies on a lazy, functional approach.
> Mapping to immutable Joda-Time types should be enough for now (until Scala 2.12 + 
> JDK 8's java.time is available for Spark).
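
For illustration, a minimal (non-Spark-specific) example of the mutability hazard, using only the standard java.sql.Timestamp API; variable names are illustrative:

{code}
import java.sql.Timestamp

val ts = new Timestamp(0L)   // the epoch
val captured = Seq(ts)       // e.g. a value captured by a lazy closure
ts.setTime(86400000L)        // setTime is inherited from java.util.Date
println(captured.head)       // the "stored" value has silently changed
{code}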


