[https://issues.apache.org/jira/browse/SPARK-30740?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17032922#comment-17032922]
Maxim Gekk commented on SPARK-30740:
This happens because of a special *if* check
[https://github.com/apache/spark/blob/a3e3cfa03a18d31370acd9a10562ff5312bb/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala#L603-L605]
which was implemented for compatibility with Hive:
[https://github.com/apache/hive/blob/287e5d5e4c43beb2bc84a80e342f897494e32c6c/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFMonthsBetween.java#L133-L138]
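The special case can be sketched as follows. This is a simplified Python illustration of the Hive-compatible rule, not the actual Spark implementation (assumptions: time-of-day and the `roundOff` parameter are ignored): when the two dates fall on the same day of month, or both fall on the last day of their respective months, a whole number of months is returned; otherwise the day difference is divided by a fixed 31.

```python
from datetime import date
import calendar

def months_between(d1, d2):
    # Whole months between the two dates' year/month components.
    months = (d1.year - d2.year) * 12 + (d1.month - d2.month)

    # Is each date the last day of its month?
    last1 = d1.day == calendar.monthrange(d1.year, d1.month)[1]
    last2 = d2.day == calendar.monthrange(d2.year, d2.month)[1]

    # Special case (Hive-compatible): same day of month, or both dates
    # on the last day of their months -> integral number of months.
    if d1.day == d2.day or (last1 and last2):
        return float(months)

    # Otherwise the day difference is divided by 31, regardless of the
    # actual length of the months involved.
    return months + (d1.day - d2.day) / 31.0
```

Under this rule, 2019-12-29 matches the "same day of month" branch (day 29 vs. day 29) and 2019-12-31 matches the "both last days" branch, so both yield exactly 2; only 2019-12-30 falls through to the fractional formula, giving 2 - 1/31 ≈ 1.96774194.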
> months_between wrong calculation
>
>
> Key: SPARK-30740
> URL: https://issues.apache.org/jira/browse/SPARK-30740
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.4
> Reporter: nhufas
> Priority: Critical
>
> months_between does not calculate correctly for February.
> For example:
>
> {{select}}
> {{ months_between('2020-02-29','2019-12-29')}}
> {{ ,months_between('2020-02-29','2019-12-30')}}
> {{ ,months_between('2020-02-29','2019-12-31')}}
>
> generates this result:
> |2|1.96774194|2|
>
> The result for 2019-12-30 is calculated incorrectly.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)