Robert Joseph Evans created SPARK-40089:
-------------------------------------------

             Summary: Sorting of at least Decimal(20, 2) fails for some values near the max.
                 Key: SPARK-40089
                 URL: https://issues.apache.org/jira/browse/SPARK-40089
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.3.0, 3.2.0, 3.4.0
            Reporter: Robert Joseph Evans


I have been doing some testing with Decimal values for the RAPIDS Accelerator 
for Apache Spark. While adding new corner cases, I tried to enable the maximum 
supported value for a sort and started to get failures.  On closer inspection 
it looks like the CPU is sorting the data incorrectly.  Specifically, anything 
that is "999999999999999999.50" or above is placed as a contiguous chunk in the 
wrong location in the output.
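
For reference, a minimal sketch to produce an input.parquet for the repro below. The specific values here are ones I picked for illustration around that boundary; the actual failing data came from the RAPIDS Accelerator corner-case tests.
{code:java}
// Hypothetical generator for input.parquet: a few Decimal(20, 2) values
// around the boundary where the mis-ordering shows up.
import spark.implicits._
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

Seq(
  "-999999999999999999.99",
  "-1.00",
  "0.00",
  "999999999999999999.49",
  "999999999999999999.50",
  "999999999999999999.99"
).toDF("a")
  .select(col("a").cast(DecimalType(20, 2)).as("a"))
  .write.mode("overwrite").parquet("input.parquet")
{code}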

In local mode with 12 tasks:
{code:java}
import org.apache.spark.sql.functions.col

// Sort by column "a" and print every row in order.
spark.read.parquet("input.parquet").orderBy(col("a")).collect.foreach(System.err.println)
{code}

Here you will notice that the last entry printed is 
{{[999999999999999999.49]}}, while {{[999999999999999999.99]}} appears near the 
top of the output, close to {{[-999999999999999999.99]}}.
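
A quick way to confirm the mis-ordering without reading through all of the printed rows is a check like this (just a sketch, assuming the same spark-shell session and the input.parquet from above); it should print false whenever the rows come back out of order.
{code:java}
import org.apache.spark.sql.functions.col

// Collect the sorted rows and verify the decimals are non-decreasing.
val rows = spark.read.parquet("input.parquet").orderBy(col("a")).collect()
val decimals = rows.map(_.getDecimal(0))
val isSorted = decimals.zip(decimals.tail).forall { case (x, y) => x.compareTo(y) <= 0 }
println(s"sorted correctly: $isSorted")
{code}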




