Github user ueshin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22200#discussion_r232608144
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala ---
    @@ -154,6 +154,15 @@ object Cast {
         fromPrecedence >= 0 && fromPrecedence < toPrecedence
       }
     
    +  def canNullSafeCastToDecimal(from: DataType, to: DecimalType): Boolean = from match {
    +    case from: BooleanType if to.isWiderThan(DecimalType.BooleanDecimal) => true
    +    case from: NumericType if to.isWiderThan(from) => true
    +    case from: DecimalType =>
    +      // truncation or precision loss
    +      (to.precision - to.scale) > (from.precision - from.scale)
    --- End diff ---
    
    In this case we need rounding, so we need one extra digit of precision to avoid overflow.
    E.g., casting 99.95 of Decimal(4, 2) to Decimal(3, 1) rounds to 100.0, which overflows Decimal(3, 1) and ends up as null. We need Decimal(4, 1) to be null-safe.
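    The overflow above can be reproduced directly with `java.math.BigDecimal` (which backs Spark's `Decimal`); a minimal sketch, independent of Spark itself:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalOverflowDemo {
    public static void main(String[] args) {
        // 99.95 as Decimal(4, 2): precision 4, scale 2
        BigDecimal v = new BigDecimal("99.95");

        // Casting to scale 1 requires rounding: 99.95 -> 100.0
        BigDecimal rounded = v.setScale(1, RoundingMode.HALF_UP);

        System.out.println(rounded);             // 100.0
        System.out.println(rounded.precision()); // 4, so it no longer fits Decimal(3, 1)
    }
}
```

    Since the rounded value has precision 4, a target of Decimal(3, 1) cannot hold it, while Decimal(4, 1) can.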

