Michael Styles created SPARK-17035:
--------------------------------------

             Summary: Conversion of datetime.max to microseconds produces incorrect value
                 Key: SPARK-17035
                 URL: https://issues.apache.org/jira/browse/SPARK-17035
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.0.0
            Reporter: Michael Styles


Conversion of datetime.max to microseconds produces an incorrect value. For example:

from datetime import datetime
from pyspark.sql.types import StructType, StructField, TimestampType

schema = StructType([StructField("dt", TimestampType(), False)])
data = [{"dt": datetime.max}]

# Convert Python objects to SQL data.
sql_data = [schema.toInternal(row) for row in data]

# The resulting value is wrong:
sql_data
[(2.534023188e+17,)]

This value should be [(253402318799999999,)].
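The root cause appears to be float arithmetic in the scaling step (something like `seconds * 1e6`): integers above 2**53 cannot all be represented exactly as doubles, so the microsecond component is rounded away. A minimal sketch of the problem and a pure-integer alternative, using calendar.timegm for a timezone-independent value (the figure in this report differs because the naive datetime goes through local-time conversion, e.g. time.mktime):

```python
import calendar
from datetime import datetime

dt = datetime.max  # 9999-12-31 23:59:59.999999
seconds = calendar.timegm(dt.utctimetuple())  # 253402300799

# Float scaling: the result exceeds 2**53, so the low-order
# microseconds are rounded away in the double representation.
wrong = seconds * 1e6 + dt.microsecond
print(wrong)   # 2.534023008e+17

# Pure integer arithmetic preserves every digit.
right = seconds * 1000000 + dt.microsecond
print(right)   # 253402300799999999

print(int(wrong) == right)  # False
```

Converting to an integer only after the float multiplication cannot recover the lost digits; the multiplication itself must be done in integer arithmetic.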



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
