Michael Nazario created SPARK-6289:
--------------------------------------

             Summary: PySpark doesn't maintain SQL Types
                 Key: SPARK-6289
                 URL: https://issues.apache.org/jira/browse/SPARK-6289
             Project: Spark
          Issue Type: Bug
          Components: PySpark, SQL
    Affects Versions: 1.2.1
            Reporter: Michael Nazario


For the DateType, Spark SQL requires a datetime.date in Python. However, if 
you collect a row containing a value of that type, the returned value is a 
datetime.datetime instead.
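
A minimal sketch of the kind of code that should surface the mismatch 
(hypothetical, not from the report; assumes the 1.2.x pyspark shell, which 
provides a SparkContext named sc):

{code:python}
import datetime
from pyspark.sql import SQLContext, Row, StructType, StructField, DateType

sqlContext = SQLContext(sc)

# Declare a single DateType column and feed it datetime.date values,
# as the schema requires.
schema = StructType([StructField("day", DateType(), True)])
rdd = sc.parallelize([Row(day=datetime.date(2015, 3, 11))])
schema_rdd = sqlContext.applySchema(rdd, schema)

# Collecting the rows goes back through Pyrolite; the reported symptom is
# that the value comes back as datetime.datetime instead of datetime.date.
value = schema_rdd.collect()[0].day
print(type(value))  # expected datetime.date, observed datetime.datetime
{code}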

I have tried to reproduce this using the pyspark shell, but have been unable 
to. This is definitely a problem coming from Pyrolite, though:

https://github.com/irmen/Pyrolite/

Pyrolite is used for datetime and date serialization, but it appears to map 
date values to datetime objects rather than date objects.
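
Not part of the original report, but one possible interim workaround while the 
Pyrolite mapping is wrong would be to coerce collected values back to 
datetime.date on the Python side (continuing the sketch above):

{code:python}
import datetime

def as_date(value):
    # Coerce values that should be dates back to datetime.date;
    # leave anything that is already a plain date (or None) untouched.
    if isinstance(value, datetime.datetime):
        return value.date()
    return value

days = [as_date(row.day) for row in schema_rdd.collect()]
{code}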


