[ https://issues.apache.org/jira/browse/SPARK-27546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16823801#comment-16823801 ]
Jiatao Tao edited comment on SPARK-27546 at 4/23/19 8:17 AM:
-------------------------------------------------------------

See an example like this:

{code:java}
// configure the Spark session with "spark.sql.session.timeZone" = "UTC"
val df = mockDF
// TimeZone.setDefault(TimeZone.getTimeZone("UTC"))
val date = new Date(1356998400000L) // 2013-01-01 (UTC)
val ts = df.select(lit(date).cast(TimestampType).cast(DateType)).head().getDate(0).getTime
{code}

The ts I got is 1356969600000 (2012-12-31 UTC), which does not match the original date 2013-01-01, so I think this is a problem. When I call TimeZone.setDefault(TimeZone.getTimeZone("UTC")) first, the result matches: 1356998400000.

!image-2019-04-23-08-10-00-475.png!
!image-2019-04-23-08-10-50-247.png!
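The gap between the two values above can be checked outside Spark. This is a minimal sketch (plain JVM time APIs, no Spark) showing that 1356998400000 is 2013-01-01 in UTC, and that the observed 1356969600000 sits exactly 8 hours earlier, consistent with a JVM default zone of GMT+8 (an assumption about the reporter's environment, e.g. Asia/Shanghai):

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneId;

public class TimeZoneGapDemo {
    public static void main(String[] args) {
        long originalMillis = 1356998400000L; // the Date passed to lit(date)
        long observedMillis = 1356969600000L; // the value the cast chain returned

        // In UTC the original instant is midnight on 2013-01-01.
        LocalDate utcDate = Instant.ofEpochMilli(originalMillis)
                .atZone(ZoneId.of("UTC"))
                .toLocalDate();
        System.out.println(utcDate); // 2013-01-01

        // The discrepancy is exactly 8 hours, i.e. the default-zone offset
        // leaked into the cast instead of spark.sql.session.timeZone (UTC).
        long diffHours = (originalMillis - observedMillis) / 3_600_000L;
        System.out.println(diffHours); // 8
    }
}
```

This supports the report: the cast chain appears to consult the JVM default time zone (DateTimeUtils#defaultTimeZone) rather than the session-local time zone, which is why setting TimeZone.setDefault to UTC makes the results agree.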
> Should replace DateTimeUtils#defaultTimeZone use with sessionLocalTimeZone
> --------------------------------------------------------------------------
>
>                 Key: SPARK-27546
>                 URL: https://issues.apache.org/jira/browse/SPARK-27546
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.1
>            Reporter: Jiatao Tao
>            Priority: Minor
>         Attachments: image-2019-04-23-08-10-00-475.png, image-2019-04-23-08-10-50-247.png

-- This message was sent by Atlassian JIRA (v7.6.3#76005)