[ https://issues.apache.org/jira/browse/SPARK-14057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15214334#comment-15214334 ]
Andrew Davidson commented on SPARK-14057:
-----------------------------------------

Hi Vijay,

Here is some more info from the email thread mentioned above. Russell is very involved in the development of Cassandra and the Spark Cassandra connector. He has a suggestion for how to fix this bug.

kind regards

Andy

http://www.slideshare.net/RussellSpitzer

On Fri, Mar 18, 2016 at 11:35 AM Russell Spitzer <russ...@datastax.com> wrote:
> Unfortunately this is part of Spark SQL. They have based their type on
> java.sql.Timestamp (and Date), which adjusts to the client time zone when
> displaying and storing.
> See the discussion:
> http://stackoverflow.com/questions/9202857/timezones-in-sql-date-vs-java-sql-date
> And the code:
> https://github.com/apache/spark/blob/bb3b3627ac3fcd18be7fb07b6d0ba5eae0342fc3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala#L81-L93

> sql time stamps do not respect time zones
> -----------------------------------------
>
>                 Key: SPARK-14057
>                 URL: https://issues.apache.org/jira/browse/SPARK-14057
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Andrew Davidson
>            Priority: Minor
>
> We have timestamp data. The timestamp data is UTC; however, when we load the
> data into Spark data frames, the system assumes the timestamps are in the
> local time zone. This causes problems for our data scientists. Often they
> pull data from our data center onto their local Macs. The data centers run
> UTC; their computers are typically in PST or EST.
> It is possible to hack around this problem.
> This causes a lot of errors in their analysis.
> A complete description of this issue can be found in the following mail message:
> https://www.mail-archive.com/user@spark.apache.org/msg48121.html

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
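The behaviour Russell describes can be reproduced outside Spark, since it comes from java.sql.Timestamp itself: the class stores an absolute instant (epoch milliseconds), and any rendering goes through the JVM's default time zone. The sketch below is a minimal illustration, not Spark code; the class name, the sample epoch value (2016-03-18 00:00:00 UTC), and the `TimeZone.setDefault` workaround (equivalent to launching with `-Duser.timezone=UTC`, one of the hacks commonly suggested in the linked thread) are all illustrative assumptions.

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampZoneDemo {
    public static void main(String[] args) {
        // One fixed instant: 2016-03-18 00:00:00 UTC, as epoch milliseconds.
        long epochMillis = 1458259200000L;
        Timestamp ts = new Timestamp(epochMillis);

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        // Rendered in UTC, the wall-clock value matches what was stored.
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println("UTC                : " + fmt.format(ts));
        // -> 2016-03-18 00:00:00

        // Rendered on a Pacific-time laptop, the same instant shifts back
        // seven hours (PDT): this is the "adjusts to the client time zone"
        // behaviour the comment describes.
        fmt.setTimeZone(TimeZone.getTimeZone("America/Los_Angeles"));
        System.out.println("America/Los_Angeles: " + fmt.format(ts));
        // -> 2016-03-17 17:00:00

        // Illustrative workaround: force the whole JVM to UTC before any
        // timestamp handling (same effect as -Duser.timezone=UTC).
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        SimpleDateFormat defaultFmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        System.out.println("Default-zone render: " + defaultFmt.format(ts));
        // -> 2016-03-18 00:00:00
    }
}
```

The point of the sketch is that nothing about the stored value changes between the three prints; only the zone used for display does, which is why the same DataFrame looks different on a UTC server and a PST/EST laptop.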