GitHub user ckadner commented on the issue:

    https://github.com/apache/spark/pull/14398

@holdenk @davies I agree that date, time, and timezone handling is best left to the experts, and that leveraging [JSR-310](http://www.threeten.org/threetenbp/) ([SPARK-16788](https://issues.apache.org/jira/browse/SPARK-16788)) in the long run would probably solve most of the problems that have been painstakingly fixed, or attempted to be fixed, in `DateTimeUtils.scala`.

The change proposed in this PR makes for cleaner code, but it also regresses the fix for [SPARK-15613](https://issues.apache.org/jira/browse/SPARK-15613). With this PR applied there are 4 incorrect `day -> millis -> day` conversions since 1970-01-01 just for `Europe/Moscow`:

```scala
import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

import org.apache.spark.sql.catalyst.util.DateTimeTestUtils
import org.apache.spark.sql.catalyst.util.DateTimeUtils._

test("SPARK-15613: incorrect days to millis conversion in Europe/Moscow") {
  DateTimeTestUtils.withDefaultTimeZone(TimeZone.getTimeZone("Europe/Moscow")) {
    val badDays = (0 to 20000).filterNot { d => d == millisToDays(daysToMillis(d)) }
    val df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z")
    badDays.foreach { d =>
      println(s"day: ${d}, date: ${df.format(new Date(daysToMillis(d)))}")
    }
    assert(badDays.isEmpty)
  }
}
```

```console
TestFailedException: Vector(4108, 4473, 4838, 5204) was not empty
day: 4108, date: 1981-03-31 23:00:00 MSK
day: 4473, date: 1982-03-31 23:00:00 MSK
day: 4838, date: 1983-03-31 23:00:00 MSK
day: 5204, date: 1984-03-31 23:00:00 MSK
```

...and it increases the number of incorrectly converted days from 52 to 4036 when testing all time zones and dates between 1900 and 2020 ([see test code snippet here](https://github.com/apache/spark/pull/13652#issuecomment-226877084)).
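For context on why exactly those days fail: in 1981-1984 `Europe/Moscow` sprang forward to DST at local midnight on April 1, so 00:00 of that local day does not exist, and computing the day's start by subtracting the offset taken at 00:00 UTC lands an hour early, inside local March 31. The sketch below is a hypothetical stand-in for that single-offset approach, not the actual `DateTimeUtils` code; `naiveDaysToMillis`/`naiveMillisToDays` are made-up helpers, the epoch days are taken from the failing test output above, and it assumes the JVM's tz database carries the historical 1981-1984 Moscow transitions (modern JDKs do):

```scala
import java.util.TimeZone

// Hypothetical, minimal reproduction of a single-offset day<->millis round trip.
// These helpers are NOT the real DateTimeUtils code, only an illustration.
object NaiveMoscowRoundTrip {
  private val MillisPerDay = 24L * 60 * 60 * 1000

  // Interpret an epoch day as "00:00 UTC of that date" and shift it into the
  // time zone using the offset taken at that single instant.
  def naiveDaysToMillis(days: Int, tz: TimeZone): Long = {
    val millisUtc = days.toLong * MillisPerDay
    millisUtc - tz.getOffset(millisUtc)
  }

  // Reverse direction: shift by the offset at the given instant, floor to days.
  def naiveMillisToDays(millis: Long, tz: TimeZone): Int =
    Math.floorDiv(millis + tz.getOffset(millis), MillisPerDay).toInt

  def main(args: Array[String]): Unit = {
    val msk = TimeZone.getTimeZone("Europe/Moscow")
    // Epoch days 4108/4473/4838/5204 are 1981-04-01 .. 1984-04-01, the days
    // flagged by the test above; Moscow moved clocks forward at local midnight
    // on those dates, so the naive round trip falls back to the previous day.
    Seq(4108, 4473, 4838, 5204).foreach { day =>
      val back = naiveMillisToDays(naiveDaysToMillis(day, msk), msk)
      println(s"day: $day -> millis -> day: $back (lossless: ${day == back})")
    }
  }
}
```

Under these assumptions the round trip loses a day for all four dates, matching the `Vector(4108, 4473, 4838, 5204)` in the test failure above.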