Serge Rielau created SPARK-47571:
------------------------------------

             Summary: date_format() java.lang.ArithmeticException: long overflow for large dates
                 Key: SPARK-47571
                 URL: https://issues.apache.org/jira/browse/SPARK-47571
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.4.0
            Reporter: Serge Rielau


The following works for CAST, but not for DATE_FORMAT():

select  cast(cast('5881580' AS DATE) AS STRING);
+5881580-01-01

spark-sql (default)> select date_format(cast('5881580' AS DATE), 'yyyyyyy-mm-dd');

24/03/26 11:08:23 ERROR SparkSQLDriver: Failed in [select date_format(cast('5881580' AS DATE), 'yyyyyyy-mm-dd')]
java.lang.ArithmeticException: long overflow
	at java.base/java.lang.Math.multiplyExact(Math.java:1004)
	at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.instantToMicros(SparkDateTimeUtils.scala:122)
	at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.instantToMicros$(SparkDateTimeUtils.scala:116)
	at org.apache.spark.sql.catalyst.util.DateTimeUtils$.instantToMicros(DateTimeUtils.scala:41)
	at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.daysToMicros(SparkDateTimeUtils.scala:174)
	at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.daysToMicros$(SparkDateTimeUtils.scala:172)
	at org.apache.spark.sql.catalyst.util.DateTimeUtils$.daysToMicros(DateTimeUtils.scala:41)
	at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castToTimestamp$14(Cast.scala:642)
	at scala.runtime.java8.JFunction1$mcJI$sp.apply(JFunction1$mcJI$sp.scala:17)
	at org.apache.spark.sql.catalyst.expressions.Cast.buildCast(Cast.scala:557)
	at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castToTimestamp$13(Cast.scala:642)
	at org.apache.spark.sql.catalyst.expressions.Cast.nullSafeEval(Cast.scala:1170)
	at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:558)
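My reading of the trace (an assumption, not a confirmed root-cause analysis): the DATE value is stored as a day count that fits in an Int, so year 5881580 is representable, but date_format() goes through a cast to TIMESTAMP, i.e. microseconds since the epoch in a Long, and the multiplication to microseconds overflows. Below is a minimal standalone Scala sketch of the same arithmetic shape; it is not Spark's exact code path (it uses java.time directly rather than Spark's internal helpers) and is only meant to show where the Long runs out of range.

import java.time.LocalDate

// Hypothetical standalone illustration of the overflow reported above.
object DateOverflowSketch {
  def main(args: Array[String]): Unit = {
    val date = LocalDate.of(5881580, 1, 1)
    val days = date.toEpochDay                          // ~2.15e9 days, fits in a Long (and in an Int day count)
    val seconds = Math.multiplyExact(days, 86400L)      // ~1.86e14 seconds since epoch, still fits in a Long
    val micros = Math.multiplyExact(seconds, 1000000L)  // ~1.86e20 > Long.MaxValue (~9.22e18)
                                                        // => java.lang.ArithmeticException: long overflow
    println(micros)
  }
}

By contrast, CAST(... AS STRING) only formats the day count and never needs the microsecond representation, which would explain why the first query succeeds while date_format() fails.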


