[ https://issues.apache.org/jira/browse/SPARK-29328?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-29328:
----------------------------------
    Description: 

The existing implementation assumes 31 days per month, or 372 days per year, which is far from the correct number. Spark uses the proleptic Gregorian calendar by default (SPARK-26651), in which the average year is 365.2425 days long: https://en.wikipedia.org/wiki/Gregorian_calendar . The calculation needs to be fixed in at least 3 places:
- GroupStateImpl.scala:167: val millisPerMonth = TimeUnit.MICROSECONDS.toMillis(CalendarInterval.MICROS_PER_DAY) * 31
- EventTimeWatermark.scala:32: val millisPerMonth = TimeUnit.MICROSECONDS.toMillis(CalendarInterval.MICROS_PER_DAY) * 31
- DateTimeUtils.scala:610: val secondsInMonth = DAYS.toSeconds(31)

*BEFORE*
{code}
spark-sql> select months_between('2019-09-15', '1970-01-01');
596.4516129
{code}

*AFTER*
{code}
spark-sql> select months_between('2019-09-15', '1970-01-01');
596.45996838
{code}

  was:
The existing implementation assumes 31 days per month, or 372 days per year, which is far from the correct number. Spark uses the proleptic Gregorian calendar by default (SPARK-26651), in which the average year is 365.2425 days long: https://en.wikipedia.org/wiki/Gregorian_calendar . The calculation needs to be fixed in at least 3 places:
- GroupStateImpl.scala:167: val millisPerMonth = TimeUnit.MICROSECONDS.toMillis(CalendarInterval.MICROS_PER_DAY) * 31
- EventTimeWatermark.scala:32: val millisPerMonth = TimeUnit.MICROSECONDS.toMillis(CalendarInterval.MICROS_PER_DAY) * 31
- DateTimeUtils.scala:610: val secondsInMonth = DAYS.toSeconds(31)

> Incorrect calculation mean seconds per month
> --------------------------------------------
>
>                 Key: SPARK-29328
>                 URL: https://issues.apache.org/jira/browse/SPARK-29328
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.3, 2.3.4, 2.4.4
>            Reporter: Maxim Gekk
>            Priority: Minor
>
> The existing implementation assumes 31 days per month, or 372 days per year, which is far from the correct number.
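The BEFORE/AFTER numbers can be reproduced with simple arithmetic. A minimal Scala sketch, not the actual Spark patch: it assumes the fractional part of months_between is (remaining seconds) / (seconds per month), and the 596 whole months and 14 remaining days between 1970-01-01 and 2019-09-15 are worked out by hand.

```scala
// Hedged sketch (not the actual Spark patch): reproduce the BEFORE/AFTER
// values of months_between('2019-09-15', '1970-01-01'), assuming the
// fractional part is (remaining seconds) / (seconds per month).

val secondsPerDay: Long = 24L * 60 * 60                         // 86400

// Old assumption: every month has 31 days (372 days per year).
val oldSecondsInMonth: Long = 31 * secondsPerDay                // 2678400

// Gregorian mean year of 365.2425 days => mean month of 365.2425 / 12 days.
val meanSecondsInMonth: Double = 365.2425 * secondsPerDay / 12  // 2629746.0

// 1970-01-01 -> 2019-09-15 spans 596 whole months (49 years, 8 months)
// plus 14 remaining days.
val wholeMonths = 596
val remainingSeconds = 14 * secondsPerDay

val before = wholeMonths + remainingSeconds / oldSecondsInMonth.toDouble
val after = wholeMonths + remainingSeconds / meanSecondsInMonth

println(f"BEFORE: $before%.7f") // 596.4516129
println(f"AFTER:  $after%.8f")  // 596.45996838
```

Switching the constant from 31 * 86400 to 365.2425 * 86400 / 12 seconds is exactly the change the three listed code locations need.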
-- 
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org