Kris Mok created SPARK-22966:
--------------------------------

             Summary: Spark SQL should handle Python UDFs that return a datetime.date or datetime.datetime
                 Key: SPARK-22966
                 URL: https://issues.apache.org/jira/browse/SPARK-22966
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.2.1, 2.2.0
            Reporter: Kris Mok


Currently, in Spark SQL, if a Python UDF returns a {{datetime.date}} (which should map to the Spark SQL {{date}} type) or a {{datetime.datetime}} (which should map to the Spark SQL {{timestamp}} type), the returned value gets unpickled into a {{java.util.Calendar}}, which Spark SQL doesn't understand internally, so the query produces incorrect results.

e.g.
{code:python}
>>> import datetime
>>> from pyspark.sql.functions import udf, lit
>>> py_date = udf(datetime.date)
>>> spark.range(1).select(py_date(lit(2017), lit(10), lit(30)) == lit(datetime.date(2017, 10, 30))).show()
+----------------------------------------+
|(date(2017, 10, 30) = DATE '2017-10-30')|
+----------------------------------------+
|                                   false|
+----------------------------------------+
{code}
(Changing the definition of {{py_date}} from {{udf(datetime.date)}} to {{udf(datetime.date, 'date')}} doesn't help either.)
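A possible stop-gap, sketched below, is to return an ISO-format string from the UDF and convert it to a {{date}} column with {{to_date}} on the SQL side, which sidesteps the {{java.util.Calendar}} unpickling path entirely. The {{py_date_str}} name is only illustrative, and this is a workaround sketch rather than a fix:
{code:python}
>>> import datetime
>>> from pyspark.sql.functions import udf, lit, to_date
>>> # Return a plain ISO string from Python, then convert it on the SQL side.
>>> py_date_str = udf(lambda y, m, d: datetime.date(y, m, d).isoformat())
>>> spark.range(1).select(
...     to_date(py_date_str(lit(2017), lit(10), lit(30))) == lit(datetime.date(2017, 10, 30))
... ).show()
{code}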

We should correctly handle Python UDFs that return objects of such types.
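For completeness, the {{timestamp}} case looks analogous; a minimal sketch of the equivalent repro (assuming the same shell session as above, not re-run here) would be:
{code:python}
>>> import datetime
>>> from pyspark.sql.functions import udf, lit
>>> # Same shape as the date repro, but the UDF returns a datetime.datetime (timestamp).
>>> py_ts = udf(datetime.datetime)
>>> spark.range(1).select(
...     py_ts(lit(2017), lit(10), lit(30)) == lit(datetime.datetime(2017, 10, 30))
... ).show()
{code}
Once the unpickling is handled correctly, both this and the {{date}} example above should compare equal.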


