ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r884620911


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##########
@@ -323,11 +389,24 @@ case class Add(
 
   override def decimalMethod: String = "$plus"
 
+  override def decimalType(p1: Int, s1: Int, p2: Int, s2: Int): DecimalType = {

Review Comment:
   ```
    *   Operation    Result Precision                        Result Scale
    *   ------------------------------------------------------------------------
    *   e1 + e2      max(s1, s2) + max(p1-s1, p2-s2) + 1     max(s1, s2)
    *   e1 - e2      max(s1, s2) + max(p1-s1, p2-s2) + 1     max(s1, s2)
    *   e1 * e2      p1 + p2 + 1                             s1 + s2
    *   e1 / e2      p1 - s1 + s2 + max(6, s1 + p2 + 1)      max(6, s1 + p2 + 1)
    *   e1 % e2      min(p1-s1, p2-s2) + max(s1, s2)         max(s1, s2)
    *   e1 union e2  max(s1, s2) + max(p1-s1, p2-s2)         max(s1, s2)
   ```
   
   do you mean we should copy these docs into each operator?
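   (Editorial aside: the `e1 + e2` row of the table above can be sketched as a small standalone computation. `DecimalType` here is a hypothetical stand-in for Spark's class, not the real API.)
   
   ```scala
   object AddPrecisionSketch {
     // Hypothetical stand-in for org.apache.spark.sql.types.DecimalType.
     case class DecimalType(precision: Int, scale: Int)
   
     // e1 + e2 row of the table:
     //   precision = max(s1, s2) + max(p1 - s1, p2 - s2) + 1
     //   scale     = max(s1, s2)
     def addResultType(p1: Int, s1: Int, p2: Int, s2: Int): DecimalType = {
       val resultScale = math.max(s1, s2)
       DecimalType(resultScale + math.max(p1 - s1, p2 - s2) + 1, resultScale)
     }
   
     def main(args: Array[String]): Unit = {
       // DECIMAL(10, 2) + DECIMAL(5, 3): scale = max(2, 3) = 3,
       // precision = 3 + max(10 - 2, 5 - 3) + 1 = 12
       println(addResultType(10, 2, 5, 3))
     }
   }
   ```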



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##########
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boolean = SQLConf.get.ansiEnabled
   override protected def withNewChildInternal(newChild: Expression): Abs = copy(child = newChild)
 }
 
+/**
+ * The child class should override the `decimalType` method to report the result data type,
+ * which must follow the description of `DecimalPrecision`.
+ *
+ * When `spark.sql.decimalOperations.allowPrecisionLoss` is set to true, if the precision / scale
+ * needed are out of the range of available values, the scale is reduced up to 6, in order to
+ * prevent the truncation of the integer part of the decimals.
+ *
+ * Rounds the decimal to the given scale and checks whether the decimal can fit in the provided
+ * precision. If it cannot: if `nullOnOverflow` is `true`, the result is `null`; otherwise an
+ * `ArithmeticException` is thrown.
+ */
+abstract class DecimalArithmetic extends BinaryArithmetic {
+  protected val nullOnOverflow: Boolean = !failOnError
+  protected val allowPrecisionLoss: Boolean = SQLConf.get.decimalOperationsAllowPrecisionLoss
+
+  override def checkInputDataTypes(): TypeCheckResult = (left.dataType, right.dataType) match {
+    case (_: DecimalType, _: DecimalType) =>
+      // We allow evaluating decimal inputs with different precision and scale, and adjust the
+      // precision and scale before returning the result.
+      TypeCheckResult.TypeCheckSuccess
+    case _ => super.checkInputDataTypes()
+  }
+
+  /** Name of the function for this expression on a [[Decimal]] type. */
+  def decimalMethod: String =

Review Comment:
   yes, it is pulled out from `BinaryArithmetic`. I will change the method to protected.
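
   (Editorial aside: the scale-reduction behavior described in the quoted scaladoc can be sketched as below. This is modeled on Spark's `DecimalType.adjustPrecisionScale` but simplified; `DecimalType` and the constants are local stand-ins, not the real API.)

   ```scala
   object PrecisionLossSketch {
     // Stand-ins for Spark's DecimalType.MAX_PRECISION and MINIMUM_ADJUSTED_SCALE.
     val MaxPrecision = 38
     val MinimumAdjustedScale = 6

     case class DecimalType(precision: Int, scale: Int)

     // When the needed precision exceeds the maximum, keep the integer digits
     // and shrink the scale, but never below 6, so the integer part of the
     // decimal is not truncated.
     def adjust(precision: Int, scale: Int): DecimalType = {
       if (precision <= MaxPrecision) {
         DecimalType(precision, scale)
       } else {
         val intDigits = precision - scale
         val minScale = math.min(scale, MinimumAdjustedScale)
         val adjustedScale = math.max(MaxPrecision - intDigits, minScale)
         DecimalType(MaxPrecision, adjustedScale)
       }
     }

     def main(args: Array[String]): Unit = {
       // Needed DECIMAL(40, 10) does not fit: 30 integer digits are kept,
       // scale shrinks from 10 to 38 - 30 = 8.
       println(adjust(40, 10))
     }
   }
   ```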



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

