Github user gczsjdy commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20010#discussion_r157928910
  
    --- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala
 ---
    @@ -99,6 +102,33 @@ object TypeCoercion {
    case (_: TimestampType, _: DateType) | (_: DateType, _: TimestampType) =>
          Some(TimestampType)
     
    +    case (ArrayType(pointType1, nullable1), ArrayType(pointType2, nullable2)) =>
    +      val dataType = if (withStringPromotion) {
    +        findWiderTypeForTwo(pointType1, pointType2)
    --- End diff --
    
    I think we break the `findTightest` semantics with the `withStringPromotion` 
check and the call to `findWiderTypeForTwo`; that behavior is what we should add 
under `Case 2 type widening` -> `findWiderTypeForTwo`. I suggest moving this 
logic into another function, `findWiderTypeForDecimal`. Will mention this in the 
other thread.
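
    To make the distinction concrete, here is a minimal sketch (simplified 
types and rules, not Spark's actual `TypeCoercion` code) of the idea: the 
tight rule recurses into array element types with the same tight rule and 
never promotes to string, while the wider rule falls back to string 
promotion. All names below mirror the real ones only loosely.

    ```scala
    // Simplified model of the tight-vs-wider coercion distinction
    // (illustrative only; not Spark's actual implementation).
    sealed trait DataType
    case object IntType extends DataType
    case object LongType extends DataType
    case object StringType extends DataType
    case class ArrayType(elementType: DataType, containsNull: Boolean) extends DataType

    object Coercion {
      // Tightest common type: numeric promotion only, no string fallback.
      def findTightestCommonType(t1: DataType, t2: DataType): Option[DataType] =
        (t1, t2) match {
          case (a, b) if a == b => Some(a)
          case (IntType, LongType) | (LongType, IntType) => Some(LongType)
          // Recurse into arrays with the SAME tight rule, so the tight
          // semantics are preserved for nested element types.
          case (ArrayType(e1, n1), ArrayType(e2, n2)) =>
            findTightestCommonType(e1, e2).map(e => ArrayType(e, n1 || n2))
          case _ => None
        }

      // Wider common type: tight rule first, then string promotion as a
      // fallback; this is where the string-promotion logic belongs.
      def findWiderTypeForTwo(t1: DataType, t2: DataType): Option[DataType] =
        findTightestCommonType(t1, t2).orElse {
          (t1, t2) match {
            case (StringType, _) | (_, StringType) => Some(StringType)
            case (ArrayType(e1, n1), ArrayType(e2, n2)) =>
              findWiderTypeForTwo(e1, e2).map(e => ArrayType(e, n1 || n2))
            case _ => None
          }
        }
    }
    ```

    With this split, `findTightestCommonType(ArrayType(IntType, ...), 
ArrayType(StringType, ...))` returns `None`, while `findWiderTypeForTwo` on 
the same inputs promotes the element type to string, which is the separation 
the comment above is asking for.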


---
