David Vogelbacher created SPARK-24957:
-----------------------------------------
             Summary: Decimal arithmetic can lead to wrong values using codegen
                 Key: SPARK-24957
                 URL: https://issues.apache.org/jira/browse/SPARK-24957
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.3.1
            Reporter: David Vogelbacher

I noticed a bug when doing arithmetic on a dataframe containing decimal values with codegen enabled. I narrowed it down to a small repro (executed in spark-shell):
{noformat}
scala> val df = Seq(
     |   ("a", BigDecimal("12.0")),
     |   ("a", BigDecimal("12.0")),
     |   ("a", BigDecimal("11.9999999988")),
     |   ("a", BigDecimal("12.0")),
     |   ("a", BigDecimal("12.0")),
     |   ("a", BigDecimal("11.9999999988")),
     |   ("a", BigDecimal("11.9999999988"))
     | ).toDF("text", "number")
df: org.apache.spark.sql.DataFrame = [text: string, number: decimal(38,18)]

scala> val df_grouped_1 = df.groupBy(df.col("text")).agg(functions.avg(df.col("number")).as("number"))
df_grouped_1: org.apache.spark.sql.DataFrame = [text: string, number: decimal(38,22)]

scala> df_grouped_1.collect()
res0: Array[org.apache.spark.sql.Row] = Array([a,11.9999999994857142857143])

scala> val df_grouped_2 = df_grouped_1.groupBy(df_grouped_1.col("text")).agg(functions.sum(df_grouped_1.col("number")).as("number"))
df_grouped_2: org.apache.spark.sql.DataFrame = [text: string, number: decimal(38,22)]

scala> df_grouped_2.collect()
res1: Array[org.apache.spark.sql.Row] = Array([a,1199999999948571.4285714285714285714286])

scala> val df_total_sum = df_grouped_1.agg(functions.sum(df_grouped_1.col("number")).as("number"))
df_total_sum: org.apache.spark.sql.DataFrame = [number: decimal(38,22)]

scala> df_total_sum.collect()
res2: Array[org.apache.spark.sql.Row] = Array([11.9999999994857142857143])
{noformat}
The results of {{df_grouped_1}} and {{df_total_sum}} are correct, whereas the result of {{df_grouped_2}} is clearly incorrect (it is the correct result multiplied by {{10^14}}).
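As a cross-check, here is a minimal sketch under the assumption (not yet verified in isolation) that the wrong value comes from the generated code path: rerunning the second aggregation with whole-stage codegen disabled via {{spark.sql.codegen.wholeStage}} should then fall back to the interpreted path and produce the correct sum:
{noformat}
scala> // Assumption: disabling whole-stage codegen forces the interpreted evaluation path
scala> spark.conf.set("spark.sql.codegen.wholeStage", false)

scala> df_grouped_1.groupBy(df_grouped_1.col("text"))
     |   .agg(functions.sum(df_grouped_1.col("number")).as("number"))
     |   .collect()
{noformat}
If the result matches {{df_total_sum}} rather than {{df_grouped_2}} above, that would point at the codegen path for decimal sums rather than the aggregation logic itself.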