Github user maropu commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19977#discussion_r157367539
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala ---
    @@ -50,15 +51,23 @@ import org.apache.spark.unsafe.types.{ByteArray, UTF8String}
       """)
     case class Concat(children: Seq[Expression]) extends Expression with ImplicitCastInputTypes {
     
    -  override def inputTypes: Seq[AbstractDataType] = Seq.fill(children.size)(StringType)
    -  override def dataType: DataType = StringType
    +  private lazy val isBinaryMode = children.nonEmpty && children.forall(_.dataType == BinaryType)
    --- End diff ---
    
    aha, thanks for the info!
    I checked the DB2 behaviour and found that DB2 seems to have a slightly different casting rule:
    https://www.ibm.com/support/knowledgecenter/SSEPGG_11.1.0/com.ibm.db2.luw.sql.ref.doc/doc/r0000736.html?view=kc
    IIUC, in DB2, the result type of concat(binary, string) is binary?
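    
    A rough, hypothetical sketch of the two result-type rules being compared (the names
    ConcatTypeRules, sparkResultType and db2ResultType are mine, not from this PR, and the
    DB2 rule is only my reading of the linked doc):
    
        // Minimal stand-ins for the SQL types involved; not Catalyst's classes.
        sealed trait SqlType
        case object StringType extends SqlType
        case object BinaryType extends SqlType
    
        object ConcatTypeRules {
          // Rule in this PR: the result is binary only when every argument is binary.
          def sparkResultType(argTypes: Seq[SqlType]): SqlType =
            if (argTypes.nonEmpty && argTypes.forall(_ == BinaryType)) BinaryType
            else StringType
    
          // DB2-like rule, as I read the linked page: any binary argument makes the result binary.
          def db2ResultType(argTypes: Seq[SqlType]): SqlType =
            if (argTypes.contains(BinaryType)) BinaryType
            else StringType
        }
    
        // concat(binary, string):
        //   sparkResultType -> StringType, db2ResultType -> BinaryType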


---
