Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20023#discussion_r161593215
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala ---
    @@ -93,41 +97,76 @@ object DecimalPrecision extends TypeCoercionRule {
         case e: BinaryArithmetic if e.left.isInstanceOf[PromotePrecision] => e
     
         case Add(e1 @ DecimalType.Expression(p1, s1), e2 @ DecimalType.Expression(p2, s2)) =>
    -      val dt = DecimalType.bounded(max(s1, s2) + max(p1 - s1, p2 - s2) + 1, max(s1, s2))
    -      CheckOverflow(Add(promotePrecision(e1, dt), promotePrecision(e2, dt)), dt)
    +      val resultScale = max(s1, s2)
    +      val resultType = if (SQLConf.get.decimalOperationsAllowPrecisionLoss) {
    +        DecimalType.adjustPrecisionScale(max(p1 - s1, p2 - s2) + resultScale + 1,
    --- End diff --
    
    This is an example: `adjustPrecisionScale` is applied to all the operations (addition here, and likewise the others). However, the doc says the adjustment is only applicable to multiplication and division.
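To make the comparison concrete, here is a minimal sketch of the two result-type computations for decimal addition that the diff above switches between. The constants `MaxPrecision = 38` and `MinAdjustedScale = 6` mirror Spark's `DecimalType.MAX_PRECISION` and `MINIMUM_ADJUSTED_SCALE`; the helper names `boundedAdd` and `adjustedAdd` are hypothetical, standing in for `DecimalType.bounded` and `DecimalType.adjustPrecisionScale`.

```scala
// Sketch of the precision/scale rules for Add(p1,s1) + (p2,s2).
// Constants mirror Spark's DecimalType; helper names are illustrative only.
object DecimalAddSketch {
  val MaxPrecision = 38
  val MinAdjustedScale = 6

  // Legacy behavior (DecimalType.bounded): cap precision at 38 but
  // keep the full scale, which can silently drop integral digits.
  def boundedAdd(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
    val scale = math.max(s1, s2)
    val precision = math.max(p1 - s1, p2 - s2) + scale + 1
    (math.min(precision, MaxPrecision), math.min(scale, MaxPrecision))
  }

  // With spark.sql.decimalOperations.allowPrecisionLoss=true
  // (adjustPrecisionScale): when the required precision exceeds 38,
  // sacrifice fractional digits (down to MinAdjustedScale) so the
  // integral part still fits.
  def adjustedAdd(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
    val scale = math.max(s1, s2)
    val precision = math.max(p1 - s1, p2 - s2) + scale + 1
    if (precision <= MaxPrecision) {
      (precision, scale)
    } else {
      val intDigits = precision - scale
      val minScale = math.min(scale, MinAdjustedScale)
      val adjScale = math.max(MaxPrecision - intDigits, minScale)
      (MaxPrecision, adjScale)
    }
  }

  def main(args: Array[String]): Unit = {
    // Adding two decimal(38,10) values needs precision 39:
    println(boundedAdd(38, 10, 38, 10))  // legacy keeps scale 10
    println(adjustedAdd(38, 10, 38, 10)) // adjusted gives up one scale digit
  }
}
```

The point of the review comment is that this adjusted path is taken for addition (and the other operations) too, not just for multiplication and division as the documentation suggests.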


---
