[ https://issues.apache.org/jira/browse/SPARK-6289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14541235#comment-14541235 ]
Davies Liu commented on SPARK-6289:
-----------------------------------

[~mnazario] Is this still a problem after we upgraded Pyrolite to 4.4? Could you help verify that? Thanks!

> PySpark doesn't maintain SQL date Types
> ---------------------------------------
>
>                 Key: SPARK-6289
>                 URL: https://issues.apache.org/jira/browse/SPARK-6289
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 1.2.1
>            Reporter: Michael Nazario
>            Assignee: Davies Liu
>
> For the DateType, Spark SQL requires a datetime.date in Python. However,
> if you collect a row based on that type, you'll end up with a returned value
> which is of type datetime.datetime.
> I have tried to reproduce this using the pyspark shell, but have been unable
> to. This is definitely a problem coming from Pyrolite, though:
> https://github.com/irmen/Pyrolite/
> Pyrolite is being used for datetime and date serialization, but it appears
> not to map to date objects; instead it maps to datetime objects.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
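The type mismatch described in the issue can be sketched in plain Python. This is a hypothetical illustration only (the actual round trip goes through Pyrolite and the JVM, not this code); the `to_date` helper is an invented name showing a caller-side workaround:

```python
import datetime

# What the user stored: a SQL DateType value, i.e. a datetime.date.
original = datetime.date(2015, 3, 11)

# What collect() hypothetically hands back after the Pyrolite round trip
# described in the issue: a datetime.datetime at midnight for the same day.
returned = datetime.datetime(2015, 3, 11, 0, 0)

# Subtlety: datetime.datetime is a subclass of datetime.date, so an
# isinstance check against date still passes -- the exact type differs.
assert isinstance(returned, datetime.date)
assert type(returned) is not datetime.date

def to_date(value):
    """Normalize a value to datetime.date, collapsing a datetime.datetime.

    A caller-side workaround (not part of PySpark) for the behavior
    described in the issue.
    """
    if isinstance(value, datetime.datetime):
        return value.date()
    return value

assert to_date(returned) == original
```

With such a normalization in place, downstream code comparing collected values against `datetime.date` objects behaves as expected regardless of which concrete type the serializer produced.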