Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22732#discussion_r225952267
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -39,29 +40,29 @@ import org.apache.spark.sql.types.DataType
      * @param nullable  True if the UDF can return null value.
      * @param udfDeterministic  True if the UDF is deterministic. Deterministic UDF returns same result
      *                          each time it is invoked with a particular input.
    - * @param nullableTypes which of the inputTypes are nullable (i.e. not primitive)
      */
     case class ScalaUDF(
         function: AnyRef,
         dataType: DataType,
         children: Seq[Expression],
    +    handleNullForInputs: Seq[Boolean],
         inputTypes: Seq[DataType] = Nil,
         udfName: Option[String] = None,
         nullable: Boolean = true,
    -    udfDeterministic: Boolean = true,
    -    nullableTypes: Seq[Boolean] = Nil)
    +    udfDeterministic: Boolean = true)
       extends Expression with ImplicitCastInputTypes with NonSQLExpression with UserDefinedExpression {
     
       // The constructor for SPARK 2.1 and 2.2
       def this(
           function: AnyRef,
           dataType: DataType,
           children: Seq[Expression],
    +      handleNullForInputs: Seq[Boolean],
    --- End diff --
    
    I think we should just remove this constructor. It's weird to keep backward compatibility for a private class, and I don't think it can work anymore: it's not OK to omit the nullable info.
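    A minimal, self-contained sketch (hypothetical names, not Spark's actual planner code) of why the per-input null-handling flags can't just be defaulted away: in Scala, a `null` unboxed into a primitive `Int` silently becomes `0`, so without the flag the UDF produces a wrong value instead of `null`.

    ```scala
    // Hypothetical illustration only; ScalaUDF's real invocation path differs.
    object NullHandlingSketch {
      // Stand-in for a user function over a primitive Int.
      val f: Int => Int = _ + 1

      // `handleNull` plays the role of one entry of handleNullForInputs:
      // when true, the caller guards null before unboxing.
      def invoke(fn: Int => Int, arg: Any, handleNull: Boolean): Any = {
        if (handleNull && arg == null) null        // guard: propagate null
        else fn(arg.asInstanceOf[Int])             // null unboxes to 0 here!
      }

      def main(args: Array[String]): Unit = {
        println(invoke(f, 2, handleNull = true))    // 3
        println(invoke(f, null, handleNull = true)) // null (correctly guarded)
        // Without the flag, null silently becomes 0 and we get a wrong result:
        println(invoke(f, null, handleNull = false)) // 1
      }
    }
    ```

    This is the sense in which omitting the nullable info "can't work": the old constructor has no way to recover which inputs need the guard, so wrong (not just crashing) results are possible.
    
    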


---
