GitHub user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21028#discussion_r185783290
  
    --- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
 ---
    @@ -19,14 +19,41 @@ package org.apache.spark.sql.catalyst.expressions
     import java.util.Comparator
     
     import org.apache.spark.sql.catalyst.InternalRow
    -import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
    +import org.apache.spark.sql.catalyst.analysis.{TypeCheckResult, TypeCoercion}
     import org.apache.spark.sql.catalyst.expressions.codegen._
     import org.apache.spark.sql.catalyst.util.{ArrayData, GenericArrayData, MapData, TypeUtils}
     import org.apache.spark.sql.types._
     import org.apache.spark.unsafe.Platform
     import org.apache.spark.unsafe.array.ByteArrayMethods
     import org.apache.spark.unsafe.types.{ByteArray, UTF8String}
     
    +/**
    + * Base trait for [[BinaryExpression]]s with two arrays of the same element type and
    + * implicit casting.
    + */
    +trait BinaryArrayExpressionWithImplicitCast extends BinaryExpression
    +  with ImplicitCastInputTypes {
    +
    +  protected lazy val elementType: DataType =
    +    inputTypes.head.asInstanceOf[ArrayType].elementType
    +
    +  override def inputTypes: Seq[AbstractDataType] = {
    +    TypeCoercion.findWiderTypeForTwo(left.dataType, right.dataType) match {
    +      case Some(arrayType) => Seq(arrayType, arrayType)
    +      case _ => Seq.empty
    +    }
    +  }
    +
    +  override def checkInputDataTypes(): TypeCheckResult = {
    +    TypeCoercion.findWiderTypeForTwo(left.dataType, right.dataType) match {
    --- End diff --
    
    With your suggestion, if we have two arrays with different element types (where one can be cast to the other), we would throw an exception. With this approach, that use case is valid: we perform an implicit cast of both arrays to the wider type.
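    To illustrate the behavior being discussed, here is a simplified, self-contained sketch of a wider-type lookup for two array types. This is not Spark's actual `TypeCoercion.findWiderTypeForTwo` implementation (which covers many more types and promotion rules); the `WiderTypeSketch` object, its type hierarchy, and the numeric ordering below are all hypothetical stand-ins for illustration only.
    
    ```scala
    // Simplified, illustrative stand-in for TypeCoercion.findWiderTypeForTwo.
    // The real Spark implementation handles far more types and coercion rules.
    object WiderTypeSketch {
      sealed trait DataType
      case object IntType extends DataType
      case object LongType extends DataType
      case object DoubleType extends DataType
      case class ArrayType(elementType: DataType) extends DataType
    
      // A toy numeric widening order: Int < Long < Double.
      private val order: Map[DataType, Int] =
        Map(IntType -> 0, LongType -> 1, DoubleType -> 2)
    
      def findWiderTypeForTwo(t1: DataType, t2: DataType): Option[DataType] =
        (t1, t2) match {
          // Identical types are trivially compatible.
          case (a, b) if a == b => Some(a)
          // Two arrays: recurse on the element types, as in the trait above.
          case (ArrayType(e1), ArrayType(e2)) =>
            findWiderTypeForTwo(e1, e2).map(ArrayType(_))
          // Two numeric types: pick the wider one.
          case (a, b) if order.contains(a) && order.contains(b) =>
            Some(if (order(a) >= order(b)) a else b)
          // No common wider type: the caller reports a type-check failure.
          case _ => None
        }
    }
    ```
    
    Under this sketch, `ArrayType(IntType)` and `ArrayType(LongType)` widen to `ArrayType(LongType)` rather than failing, which is the use case the comment above defends: both inputs are implicitly cast to the wider array type instead of raising an exception.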

