These are relevant:
JIRA: https://issues.apache.org/jira/browse/SPARK-6411
PR: https://github.com/apache/spark/pull/6250
On Thu, May 21, 2015 at 3:16 PM, Def_Os <njde...@gmail.com> wrote:
> After deserialization, something seems to be wrong with my pandas
> DataFrames. It looks like the timezone information is lost, and
> subsequent errors ensue.
>
> Serializing and deserializing a timezone-aware DataFrame tests just
> fine, so it must be Spark that somehow changes the data.
>
> My program runs timezone-unaware data without problems.
>
> Anybody have any ideas on what causes this, or how to solve it?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Pandas-timezone-problems-tp22985.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------
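Until the linked fix lands, one common workaround is to keep timestamps timezone-naive while they pass through Spark: convert to UTC and drop the tz info before the round trip, then re-localize afterwards. Below is a minimal pandas-only sketch of that pattern (the column name `ts`, the example timezone, and the round-trip placeholder are illustrative assumptions, not from the thread):

```python
import pandas as pd

# Illustrative tz-aware frame (column name and timezone are made up).
df = pd.DataFrame({
    "ts": pd.to_datetime(
        ["2015-05-21 12:00", "2015-05-21 13:00", "2015-05-21 14:00"]
    ).tz_localize("America/Los_Angeles"),
    "value": [1, 2, 3],
})

# Before handing the data to Spark: convert to UTC and strip the tz info,
# so only naive timestamps cross the serialization boundary.
df_naive = df.copy()
df_naive["ts"] = df_naive["ts"].dt.tz_convert("UTC").dt.tz_localize(None)

# ... round-trip through Spark here (e.g. createDataFrame / toPandas) ...

# After deserialization: re-attach UTC, then convert back to the original zone.
df_restored = df_naive.copy()
df_restored["ts"] = (
    df_restored["ts"].dt.tz_localize("UTC").dt.tz_convert("America/Los_Angeles")
)

# The restored column matches the original tz-aware column.
assert df_restored["ts"].equals(df["ts"])
```

This sidesteps the issue because naive timestamps carry no tz metadata for Spark to mangle; the cost is that you must remember (or store alongside the data) which timezone to re-apply.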