Re: [Spark SQL] Nanoseconds in Timestamps are set as Microseconds

2017-06-02 Thread Anton Okolnychyi
Then let me provide a PR so that we can discuss an alternative way.

2017-06-02 8:26 GMT+02:00 Reynold Xin:
> Seems like a bug we should fix? I agree some form of truncation makes more sense.
>
> On Thu, Jun 1, 2017 at 1:17 AM, Anton Okolnychyi < >

Re: [Spark SQL] Nanoseconds in Timestamps are set as Microseconds

2017-06-02 Thread Reynold Xin
Seems like a bug we should fix? I agree some form of truncation makes more sense.

On Thu, Jun 1, 2017 at 1:17 AM, Anton Okolnychyi wrote:
> Hi all,
>
> I would like to ask what the community thinks regarding the way Spark handles nanoseconds in the Timestamp

[Spark SQL] Nanoseconds in Timestamps are set as Microseconds

2017-06-01 Thread Anton Okolnychyi
Hi all,

I would like to ask what the community thinks regarding the way Spark handles nanoseconds in the Timestamp type. As far as I can see in the code, Spark assumes microsecond precision. Therefore, I expect either a timestamp truncated to microseconds or an exception if I specify a
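
For context, a minimal sketch of the precision gap being discussed: Catalyst stores TimestampType values as a Long of microseconds since the epoch, so a java.sql.Timestamp carrying nanosecond precision has to be either truncated or rejected when it is converted. The Scala snippet below is a hypothetical illustration of what truncation to microseconds could look like; the helper toMicrosTruncating is not Spark's actual conversion code.

  import java.sql.Timestamp

  object TimestampPrecision {
    // Convert a java.sql.Timestamp (nanosecond precision) to epoch microseconds,
    // dropping any sub-microsecond digits -- the truncating behaviour argued for above.
    // (Pre-epoch timestamps are ignored here for brevity.)
    def toMicrosTruncating(t: Timestamp): Long = {
      val seconds = t.getTime / 1000           // getTime is in milliseconds; keep whole seconds
      seconds * 1000000L + t.getNanos / 1000   // nanoseconds -> microseconds, remainder discarded
    }

    def main(args: Array[String]): Unit = {
      // Nine fractional digits: the last three (789) exceed microsecond precision.
      val ts = Timestamp.valueOf("2017-06-01 10:00:00.123456789")
      println(toMicrosTruncating(ts)) // the sub-microsecond part is silently dropped
    }
  }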