ulysses-you commented on code in PR #38739:
URL: https://github.com/apache/spark/pull/38739#discussion_r1031054191


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecisionSuite.scala:
##########
@@ -276,9 +276,9 @@ class DecimalPrecisionSuite extends AnalysisTest with BeforeAndAfter {
       val a = AttributeReference("a", DecimalType(3, -10))()
       val b = AttributeReference("b", DecimalType(1, -1))()
       val c = AttributeReference("c", DecimalType(35, 1))()
-      checkType(Multiply(a, b), DecimalType(5, -11))
-      checkType(Multiply(a, c), DecimalType(38, -9))
-      checkType(Multiply(b, c), DecimalType(37, 0))
+      checkType(Multiply(a, b), DecimalType(16, 0))

Review Comment:
   Not sure what you found. I think the reason why Divide and IntegralDivide fail is simple: the SQL standard does not allow a negative scale, but we still use its formula to compute the result precision and scale. The computed result precision can then be negative, which is unexpected. So I think the other binary arithmetic operators should likewise not follow the formula when the scale is negative.
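To make the failure mode concrete, here is a minimal standalone sketch (not Spark code) of the result-type formulas being discussed. The Multiply rule (`p1 + p2 + 1`, `s1 + s2`) matches the SQL-standard rule referenced in the test; the Divide rule below follows the commonly cited Hive/MS-SQL variant, so treat it as an illustration rather than the exact `DecimalPrecision` implementation. It shows that once a negative scale is fed into the Divide formula, the nominal result precision can itself go negative:

```scala
// Sketch of the decimal result-type formulas discussed in the review.
// The Divide formula is the commonly cited variant and is an assumption
// here, not a verbatim copy of Spark's DecimalPrecision rule.
object DecimalScaleSketch {
  // Multiply: precision = p1 + p2 + 1, scale = s1 + s2
  def multiplyType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) =
    (p1 + p2 + 1, s1 + s2)

  // Divide: scale = max(6, s1 + p2 + 1), precision = p1 - s1 + s2 + scale
  def divideType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
    val scale = math.max(6, s1 + p2 + 1)
    (p1 - s1 + s2 + scale, scale)
  }

  def main(args: Array[String]): Unit = {
    // The attributes from the test above: a = Decimal(3, -10), b = Decimal(1, -1).
    // Multiply stays representable: (5, -11), the previously expected type.
    println(multiplyType(3, -10, 1, -1))

    // With negative scales, the Divide formula can produce a negative
    // "precision", which no DecimalType can represent:
    // p1 - s1 + s2 + max(6, s1 + p2 + 1) = 1 + 1 - 11 + 6 = -3
    println(divideType(1, -1, 1, -11))
  }
}
```

Running this prints `(5,-11)` for the Multiply case and a precision of `-3` for the Divide case, which is the "result precision can be negative" situation the comment describes.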



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

