https://issues.apache.org/jira/browse/SPARK-36590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17404600#comment-17404600
Apache Spark commented on SPARK-36590:
--------------------------------------

User 'MaxGekk' has created a pull request for this issue:
https://github.com/apache/spark/pull/33838

> Special timestamp_ntz value should be converted in the session tz
> -----------------------------------------------------------------
>
>                 Key: SPARK-36590
>                 URL: https://issues.apache.org/jira/browse/SPARK-36590
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>
> Currently, special timestamp_ntz strings such as *today*, *tomorrow*, and
> *yesterday* are converted to timestamp_ntz values using the JVM time zone,
> which is incorrect. The conversion should be based on the session time zone.
> The example below demonstrates the problem:
> {code:sql}
> $ export TZ="Europe/Amsterdam"
> $ ./bin/spark-sql -S
> spark-sql> select timestamp_ntz'now';
> 2021-08-25 18:12:36.233
> spark-sql> set spark.sql.session.timeZone=America/Los_Angeles;
> spark.sql.session.timeZone	America/Los_Angeles
> spark-sql> select timestamp_ntz'now';
> 2021-08-25 18:14:40.547
> {code}
> The special timestamp_ntz strings are converted in the "Europe/Amsterdam"
> time zone; the converter does not respect the Spark session setting.
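For illustration only (this is not the actual Spark code, just a minimal Scala sketch of the underlying issue): the difference comes down to which ZoneId is used when resolving a wall-clock value for a special string like 'now'. Resolving with the JVM default zone ignores spark.sql.session.timeZone; the fix is to resolve with the session zone instead. The "America/Los_Angeles" zone below stands in for whatever the session is configured with.

{code:scala}
import java.time.{LocalDateTime, ZoneId}

// Buggy behavior: 'now' resolved with the JVM default time zone
// (e.g. Europe/Amsterdam picked up from the TZ environment variable).
val jvmLocalNow: LocalDateTime = LocalDateTime.now() // uses ZoneId.systemDefault()

// Expected behavior: 'now' resolved with the session time zone,
// i.e. the value of spark.sql.session.timeZone.
val sessionZone: ZoneId = ZoneId.of("America/Los_Angeles")
val sessionLocalNow: LocalDateTime = LocalDateTime.now(sessionZone)
{code}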