Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20163#discussion_r160345387
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/python/EvaluatePython.scala ---
    @@ -120,10 +121,18 @@ object EvaluatePython {
         case (c: java.math.BigDecimal, dt: DecimalType) => Decimal(c, dt.precision, dt.scale)
     
         case (c: Int, DateType) => c
    +    // Pyrolite will unpickle a Python datetime.date to a java.util.Calendar
    +    case (c: Calendar, DateType) => DateTimeUtils.fromJavaCalendarForDate(c)
    --- End diff --
    
    How about we return `null` in this case? The other cases also seem to return `null` when the value fails to be converted:
    
    ```
    >>> from pyspark.sql.functions import udf
    >>> f = udf(lambda x: x, "double")
    >>> spark.range(1).select(f("id")).show()
    +------------+
    |<lambda>(id)|
    +------------+
    |        null|
    +------------+
    ```
    
    It seems we can do it like this:
    
    ```scala
    
        case StringType => (obj: Any) => nullSafeConvert(obj) {
          case c: Calendar => null
          case _ => UTF8String.fromString(obj.toString)
        }
    ```
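    
    For the `DateType` branch this diff touches, a rough sketch of the same idea could look like the following (assuming the `nullSafeConvert` helper shown above; the exact surrounding match structure in `EvaluatePython` may differ):
    
    ```scala
        case DateType => (obj: Any) => nullSafeConvert(obj) {
          // Days-since-epoch ints pass through unchanged, as before.
          case c: Int => c
          // A Python datetime.date unpickled by Pyrolite arrives as a
          // java.util.Calendar; treat it as an unsupported input and fall
          // back to null instead of converting it.
          case _: Calendar => null
        }
    ```
    
    That would keep the behavior consistent with the `double` UDF example above, where an unconvertible value silently becomes `null`.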

