[ https://issues.apache.org/jira/browse/SPARK-34532?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ted Yu updated SPARK-34532:
---------------------------

Description:

I noticed the following when running the test suite:

build/sbt "sql/testOnly *SQLQueryTestSuite"

{code}
19:10:17.977 ERROR org.apache.spark.scheduler.TaskSetManager: Task 1 in stage 6416.0 failed 1 times; aborting job
[info] - postgreSQL/int4.sql (2 seconds, 543 milliseconds)
19:10:20.994 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 6476.0 (TID 7789)
java.lang.ArithmeticException: long overflow
        at java.lang.Math.multiplyExact(Math.java:892)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.project_doConsume_0$(Unknown Source)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:755)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:898)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:898)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
{code}

{code}
19:15:38.255 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14744.0 (TID 16705)
java.lang.ArithmeticException: long overflow
        at java.lang.Math.addExact(Math.java:809)
        at org.apache.spark.sql.types.LongExactNumeric$.plus(numerics.scala:105)
        at org.apache.spark.sql.types.LongExactNumeric$.plus(numerics.scala:104)
        at org.apache.spark.sql.catalyst.expressions.Add.nullSafeEval(arithmetic.scala:268)
        at org.apache.spark.sql.catalyst.expressions.BinaryExpression.eval(Expression.scala:573)
        at org.apache.spark.sql.catalyst.expressions.InterpretedMutableProjection.apply(InterpretedMutableProjection.scala:97)
{code}

This was likely caused by the following line:

{code}
val microseconds = left.microseconds + right.microseconds
{code}

We should check whether the addition would overflow before performing it.
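To illustrate the kind of check the description asks for, here is a minimal, hypothetical sketch of overflow-checked addition of microsecond counts. The class and method names are illustrative only and are not Spark's actual IntervalUtils API; the point is that `Math.addExact` (the method in the stack trace above) already detects the overflow, so the fix is to intercept its bare "long overflow" rather than let it escape to the executor:

```java
// Hypothetical sketch; names are illustrative, not Spark's actual API.
public class IntervalAddSketch {
    // Math.addExact throws ArithmeticException("long overflow") when the sum
    // does not fit in a long -- the exact exception seen in the stack traces.
    // Wrapping it lets the caller fail with an interval-specific message
    // instead of an opaque executor error.
    static long addMicrosExact(long left, long right) {
        try {
            return Math.addExact(left, right);
        } catch (ArithmeticException e) {
            throw new ArithmeticException(
                "integer overflow adding interval microseconds: "
                    + left + " + " + right);
        }
    }

    public static void main(String[] args) {
        // Normal case: sums that fit in a long pass through unchanged.
        System.out.println(addMicrosExact(1_000_000L, 2_000_000L));
        // Overflow case: caught and rethrown with a descriptive message.
        try {
            addMicrosExact(Long.MAX_VALUE, 1L);
        } catch (ArithmeticException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

An alternative to try/catch would be a pre-check such as `(right > 0 && left > Long.MAX_VALUE - right)`, but delegating to `Math.addExact` keeps the overflow semantics identical to what the JVM already enforces.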
> IntervalUtils.add() may result in 'long overflow'
> -------------------------------------------------
>
>                 Key: SPARK-34532
>                 URL: https://issues.apache.org/jira/browse/SPARK-34532
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.2
>            Reporter: Ted Yu
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org