Spark currently requires only Java 1.7, so adding an encoder that
depends on Java 1.8 APIs is not straightforward without changing that
requirement. I can think of two solutions:

1. add a Java 1.8 build profile which includes such encoders (this may
be useful for Scala 2.12 support in the future as well)
2. expose a custom Encoder API (the current one is not easily extensible)

I would personally favor option 2, as it avoids adding yet another
build configuration to choose from; however, I am not sure how
feasible it is to make custom encoders play nicely with Catalyst.
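
For what it's worth, the closest existing escape hatch is the generic
Kryo encoder, which can serialize arbitrary classes but stores the
whole object as a single binary column that Catalyst cannot inspect or
optimize. A rough sketch (assuming a SparkSession named `spark` as in
the shell; the case class and field names are made up):

import java.time.LocalDate
import org.apache.spark.sql.{Encoder, Encoders}

// Hypothetical case class for illustration.
case class Event(name: String, date: LocalDate)

// Kryo serializes the whole object into one opaque binary column,
// so Catalyst cannot see or optimize the individual fields.
implicit val eventEncoder: Encoder[Event] = Encoders.kryo[Event]

val ds = spark.createDataset(Seq(Event("release", LocalDate.of(2016, 7, 26))))
ds.show()  // prints a single "value" column of serialized bytes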

To get back to your question: I don't think there are any plans at the
moment, and I would recommend working around the issue by converting
to the old Date API:
http://stackoverflow.com/questions/33066904/localdate-to-java-util-date-and-vice-versa-simpliest-conversion
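
Roughly, that workaround looks like this (untested sketch; the case
class and field names are made up, and the bridge methods require
running on Java 8):

import java.time.LocalDate

// Spark already has an encoder for java.sql.Date, so keep that
// type in the case class...
case class Order(id: Long, placedOn: java.sql.Date)

// ...and convert at the boundaries using the bridge methods that
// Java 8 added to java.sql.Date.
def toSql(d: LocalDate): java.sql.Date = java.sql.Date.valueOf(d)
def fromSql(d: java.sql.Date): LocalDate = d.toLocalDate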

On Fri, Sep 2, 2016 at 8:29 AM, Daniel Siegmann
<dsiegm...@securityscorecard.io> wrote:
> It seems Spark can handle case classes with java.sql.Date, but not
> java.time.LocalDate. It complains there's no encoder.
>
> Are there any plans to add an encoder for LocalDate (and other classes in
> the new Java 8 Time and Date API), or is there an existing library I can use
> that provides encoders?
>
> --
> Daniel Siegmann
> Senior Software Engineer
> SecurityScorecard Inc.
> 214 W 29th Street, 5th Floor
> New York, NY 10001
>
