Kevin Cox created SPARK-10162:
---------------------------------

             Summary: PySpark filters with datetimes mess up when datetimes have timezones.
                 Key: SPARK-10162
                 URL: https://issues.apache.org/jira/browse/SPARK-10162
             Project: Spark
          Issue Type: Bug
          Components: PySpark
            Reporter: Kevin Cox
PySpark appears to ignore timezone information when filtering on (and generally working with) datetimes. Please see the example below. The generated filter in the query plan is 5 hours off (my computer is EST).

{code}
In [1]: df = sc.sql.createDataFrame([], StructType([StructField("dt", TimestampType())]))

In [2]: df.filter(df.dt > datetime(2000, 01, 01, tzinfo=UTC)).explain()
Filter (dt#9 > 946702800000000)
 Scan PhysicalRDD[dt#9]
{code}

Note that 946702800000000 == Sat 1 Jan 2000 05:00:00 UTC
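
For reference, here is a minimal sketch (not part of the original report) of the arithmetic behind the plan literal, assuming a UTC-5 (EST) local offset: the value is exactly 5 hours after midnight UTC, consistent with the tzinfo being dropped and the wall-clock fields being interpreted in the local zone. The workaround at the end is a hypothetical suggestion, not a confirmed fix.

{code}
from datetime import datetime, timezone

# Literal that appears in the generated filter, in microseconds since the epoch.
plan_micros = 946702800000000

# Midnight 2000-01-01 UTC expressed the same way.
utc_midnight = datetime(2000, 1, 1, tzinfo=timezone.utc)
utc_micros = int(utc_midnight.timestamp()) * 1000000
print(utc_micros)                                 # 946684800000000

# The plan value is exactly 5 hours later, matching an EST (UTC-5) machine that
# ignored tzinfo and treated the wall-clock fields as local time.
print((plan_micros - utc_micros) / 3600000000.0)  # 5.0

# Hypothetical workaround: convert the aware datetime to naive UTC before filtering,
# e.g. df.filter(df.dt > utc_midnight.astimezone(timezone.utc).replace(tzinfo=None))
{code}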