I switched to java.sql.Date and converted milliseconds to days:

    while (it.hasNext()) {
        Row irow = it.next();
        // getTime() returns milliseconds since the epoch; divide by
        // milliseconds per day to get days since 1970-01-01.
        long t_long = irow.<java.sql.Date>getAs("time_col").getTime() / (24L * 60 * 60 * 1000);
        int t_int = Math.toIntExact(t_long);
    }

Though if there is a more efficient way to do it, I would be happy to see it.
Anton
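
As an alternative sketch (not from the thread): `java.sql.Date.toLocalDate().toEpochDay()` yields days since 1970-01-01 directly, without manual millisecond arithmetic. The class name `EpochDays` and the literal date below are illustrative only.

```java
import java.sql.Date;

public class EpochDays {
    public static void main(String[] args) {
        // Illustrative value; in the Spark loop this would come from
        // irow.<java.sql.Date>getAs("time_col").
        Date d = Date.valueOf("2017-06-14");
        // toLocalDate() drops the time-of-day component; toEpochDay()
        // counts days since 1970-01-01, matching Spark's internal
        // DateType representation.
        long days = d.toLocalDate().toEpochDay();
        int daysInt = Math.toIntExact(days);
        System.out.println(daysInt); // 17331
    }
}
```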

On Wed, Jun 14, 2017 at 12:42 AM, Kazuaki Ishizaki <ishiz...@jp.ibm.com>
wrote:

> Does this code help you?
> https://github.com/apache/spark/blob/master/sql/core/
> src/test/java/test/org/apache/spark/sql/JavaDataFrameSuite.java#L156-L194
>
> Kazuaki Ishizaki
>
>
>
> From:        Anton Kravchenko <kravchenko.anto...@gmail.com>
> To:        "user @spark" <user@spark.apache.org>
> Date:        2017/06/14 01:16
> Subject:        Java access to internal representation of
> DataTypes.DateType
> ------------------------------
>
>
>
> How would one access the internal representation of DataTypes.DateType from
> the Spark (2.0.1) Java API?
>
> From
> https://github.com/apache/spark/blob/51b1c1551d3a7147403b9e821fcc7c8f57b4824c/sql/catalyst/src/main/scala/org/apache/spark/sql/types/DateType.scala
> :
> "Internally, this is represented as the number of days from 1970-01-01."
>