coderfender commented on PR #1996: URL: https://github.com/apache/datafusion-comet/pull/1996#issuecomment-3122030778

@andygrove, @parthchandra The cast to DecimalType (when the operands are not already decimal types) is failing the `"test integral divide"` test. I think we should update the test to reflect that there will always be a Spark operator (cast / project) in the plan when the operands are not already decimal. Also, I was wondering whether we should stick to the maximum precision and scale when casting the operands to decimal, or cap the precision at what is needed to represent Long.MAX_VALUE (19 digits)?

Finally, below is the Rust code that helped me understand the inconsistency, a proof of concept of why we can't cast the operands to DoubleType (like the native Spark operation does):

```rust
fn main() {
    // i64::MIN is exactly representable as f64, but the quotient 2^63 is not
    // representable as i64, so the float-to-int cast saturates to i64::MAX.
    let a: f64 = i64::MIN as f64;
    assert!(((a / -1.0) as i64) == i64::MAX);
}
```

https://play.rust-lang.org/?version=stable&mode=debug&edition=2024&gist=68515908b492b787be6973861504924c
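For contrast, here is a minimal sketch of the same division routed through an exact wide intermediate. I'm using `i128` purely as a stand-in for a decimal (or any exact, wider) representation, not as the actual implementation in this PR: the quotient 2^63 is computed exactly, and narrowing it back to `i64` wraps to `i64::MIN` instead of saturating to `i64::MAX`, which I believe matches Spark's legacy (non-ANSI) result for `Long.MinValue div -1`.

```rust
fn main() {
    // Through f64: the exact quotient 2^63 overflows i64, and the
    // float-to-int cast saturates, producing i64::MAX.
    let via_double = ((i64::MIN as f64) / -1.0) as i64;
    assert_eq!(via_double, i64::MAX);

    // Through an exact wide intermediate (i128 standing in for a decimal):
    // the quotient 2^63 is exact, and the narrowing cast back to i64
    // wraps to i64::MIN rather than saturating.
    let via_wide = ((i64::MIN as i128) / -1) as i64;
    assert_eq!(via_wide, i64::MIN);
}
```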
@andygrove , @parthchandra The casting operation to DecimalType ( in case the operands are arent already decimal types) is failing the test `"test integral divide"` . I think we should update the test to reflect the fact that there will surely be a spark operator (cast / project) in case the operands arent already decimal . Also, I was wondering if we should stick to max precision and scale while casting operands to decimal or should we perhaps cap it at precison of Long.MAX_VALUE (10 digits) ? Finally, below is the rust code which helped me understand the inconsistency / proof of concept of why we couldnt cast the operands to DoubleType (like native Spark operation does) ` fn main() { let a:f64 = i64::MIN as f64; assert!(((a / -1.0) as i64) == i64::MAX); }` https://play.rust-lang.org/?version=stable&mode=debug&edition=2024&gist=68515908b492b787be6973861504924c -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org --------------------------------------------------------------------- To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org For additional commands, e-mail: github-h...@datafusion.apache.org