andygrove opened a new issue, #4124:
URL: https://github.com/apache/datafusion-comet/issues/4124

   ## Describe the bug
   
   The new Spark 4.1 test `SPARK-53968 reading the view after 
allowPrecisionLoss is changed` in `SQLViewSuite` fails with Comet enabled. The 
test stores `DECIMAL(38, 18)` values and computes `unit_price + 
COALESCE(shipping_price, 0)` through a CTE wrapped in a view. With Comet enabled, the results are exactly 10x smaller than expected:
   
   | Row | Expected | Actual (with Comet) |
   |---|---|---|
   | part1 | 100.00000000000000000 | 10.00000000000000000 |
   | part2 | 100.00000000000000000 | 10.00000000000000000 |
   | part3 | 300.23000000000000000 | 30.02300000000000000 |
   
   The plan involves `CometBroadcastHashJoin → CometProject(unit_price + 
COALESCE(shipping_price, 0E-18)) → CometExchange(rangepartitioning) → 
CometSort`.
   
   ## Steps to reproduce
   
   Run Spark 4.1.1's `SQLViewSuite` with Comet enabled. The reproducer is the 
test body itself (see 
`sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala` 
`SPARK-53968 reading the view after allowPrecisionLoss is changed`).
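The test body lives in the Spark source; a hedged sketch of the query shape it exercises (table and view names here are illustrative, not the actual test body):

```sql
-- Hypothetical repro shape: DECIMAL(38, 18) columns, addition with
-- COALESCE inside a CTE, wrapped in a view, read back with a sort.
CREATE TABLE prices (
  part STRING,
  unit_price DECIMAL(38, 18),
  shipping_price DECIMAL(38, 18)
);

CREATE VIEW total_prices AS
WITH t AS (
  SELECT part, unit_price + COALESCE(shipping_price, 0) AS total_price
  FROM prices
)
SELECT * FROM t;

-- The test reads the view after toggling
-- spark.sql.decimalOperations.allowPrecisionLoss.
SELECT * FROM total_prices ORDER BY part;
```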
   
   ## Expected behavior
   
`unit_price + COALESCE(shipping_price, 0)` should match Spark's results (100, 100, 300.23).
   
   ## Workaround
   
   The test is currently tagged `IgnoreComet(...)` in `dev/diffs/4.1.1.diff`.
   
   ## Additional context
   
PR #4093 enables Spark 4.1.1 in the `Spark SQL Tests` workflow. The exact 10x discrepancy and the `DECIMAL(38, 18)` schema strongly suggest an off-by-one precision/scale bug in Comet's decimal addition or in the view-resolution path.
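A minimal sketch (plain Python `decimal`, independent of Spark/Comet) of how an off-by-one in scale interpretation would produce results exactly 10x too small, matching the table above; where the bug actually lives is only hypothesized:

```python
from decimal import Decimal

# A DECIMAL(38, 18) value is stored as an unscaled integer plus a scale.
# 100.000000000000000000 at scale 18 has unscaled value 100 * 10**18.
unscaled = 100 * 10**18

# Correct interpretation: divide the unscaled value by 10**18.
correct = Decimal(unscaled).scaleb(-18)

# Hypothetical off-by-one: interpreting the same unscaled value with
# scale 19 shifts the result down by exactly one decimal place.
shifted = Decimal(unscaled).scaleb(-19)

print(correct)  # 100.000000000000000000
print(shifted)  # 10.0000000000000000000  (the value Comet returns)
assert shifted * 10 == correct
```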


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

