cloud-fan commented on issue #27627: [WIP][SPARK-28067][SQL] Fix incorrect results for decimal aggregate sum by returning null on decimal overflow
URL: https://github.com/apache/spark/pull/27627#issuecomment-595600181
 
 
   > Addition of two decimal values (expression coming from sum) that results in a value that cannot be contained.
   
   You can try `Decimal.+(Decimal)` locally; it does return a value that is not null. We can't hold an overflowed decimal value in an unsafe row, but a `Decimal` object can be temporarily overflowed.
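   A minimal local sketch of this behavior (the values are hypothetical, chosen to sit at the `decimal(38, 0)` precision ceiling):

   ```scala
   import org.apache.spark.sql.types.Decimal

   // Two decimals at the maximum supported precision (38 digits, scale 0).
   val a = Decimal(BigDecimal("9" * 38), 38, 0)
   val b = Decimal(BigDecimal("9" * 38), 38, 0)

   // Decimal.+(Decimal) returns a non-null Decimal even though the sum no
   // longer fits in 38 digits -- the object is backed by an unbounded
   // BigDecimal, so it can temporarily hold the overflowed value.
   val sum = a + b

   // The overflow only surfaces when the value is forced back into a bounded
   // precision, e.g. when it is written into an unsafe row:
   val fits = sum.changePrecision(38, 0)  // false: value cannot be represented
   ```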
   
   In my repro, `spark.range(0, 12, 1, 1)` works fine while `spark.range(0, 1, 1, 1).union(spark.range(0, 11, 1, 1))` gives a wrong result. I looked at the code again, and whole-stage codegen also stores the partial aggregate result in an unsafe row. Can someone investigate further and see why `spark.range(0, 12, 1, 1)` works?
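
   For reference, a sketch of a repro in the spirit described above (the column name `d` and the literal are my own assumptions, not the exact repro from the JIRA): sum values near the `decimal(38, 0)` ceiling so that the partial sums overflow during aggregation.

   ```scala
   import org.apache.spark.sql.SparkSession
   import org.apache.spark.sql.functions.sum

   val spark = SparkSession.builder()
     .master("local[1]")
     .appName("decimal-sum-overflow-sketch")
     .getOrCreate()
   import spark.implicits._

   // Hypothetical large literal near the decimal(38, 0) maximum.
   val big = "9" * 37

   // Same shape as the repro above: a 1-row range unioned with an 11-row
   // range, each value cast to decimal(38, 0).
   val df1 = spark.range(0, 1, 1, 1).selectExpr(s"cast('$big' as decimal(38, 0)) as d")
   val df2 = spark.range(0, 11, 1, 1).selectExpr(s"cast('$big' as decimal(38, 0)) as d")

   // Before the fix, the overflowed partial sum stored in an unsafe row can
   // surface as a wrong (wrapped) result instead of null.
   df1.union(df2).agg(sum($"d")).show(truncate = false)
   ```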
