Kent Yao created SPARK-31868:
--------------------------------

             Summary: Wrong results for week-based-year
                 Key: SPARK-31868
                 URL: https://issues.apache.org/jira/browse/SPARK-31868
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.0.0, 3.1.0
            Reporter: Kent Yao
{code:sql}
spark-sql> set spark.sql.legacy.timeParserPolicy=exception;
spark.sql.legacy.timeParserPolicy	exception
spark-sql> explain select to_timestamp('1969-01-01', 'YYYY-MM-dd')
         > ;
== Physical Plan ==
*(1) Project [-28800000000 AS to_timestamp(1969-01-01, YYYY-MM-dd)#37]
+- *(1) Scan OneRowRelation[]

spark-sql> set spark.sql.legacy.timeParserPolicy=legacy;
spark.sql.legacy.timeParserPolicy	legacy
spark-sql> explain select to_timestamp('1969-01-01', 'YYYY-MM-dd');
== Physical Plan ==
*(1) Project [-31824000000000 AS to_timestamp(1969-01-01, YYYY-MM-dd)#53]
+- *(1) Scan OneRowRelation[]

spark-sql> set spark.sql.legacy.timeParserPolicy=exception;
spark.sql.legacy.timeParserPolicy	exception
spark-sql> explain select to_timestamp('1969-01-01', 'YYYY-MM-dd');
== Physical Plan ==
*(1) Project [-28800000000 AS to_timestamp(1969-01-01, YYYY-MM-dd)#69]
+- *(1) Scan OneRowRelation[]

spark-sql> select to_timestamp('1969-01-01', 'YYYY-MM-dd');
1970-01-01 00:00:00

spark-sql> set spark.sql.legacy.timeParserPolicy=legacy;
spark.sql.legacy.timeParserPolicy	legacy
spark-sql> explain select to_timestamp('1969-01-01', 'YYYY-MM-dd');
== Physical Plan ==
*(1) Project [-31824000000000 AS to_timestamp(1969-01-01, YYYY-MM-dd)#87]
+- *(1) Scan OneRowRelation[]

spark-sql> select to_timestamp('1969-01-01', 'YYYY-MM-dd');
1968-12-29 00:00:00

spark-sql> select to_timestamp('1969-01-01', 'yyyy-MM-dd');
1969-01-01 00:00:00

spark-sql> set spark.sql.legacy.timeParserPolicy=exception;
spark.sql.legacy.timeParserPolicy	exception
spark-sql> select to_timestamp('1969-01-01', 'yyyy-MM-dd');
1969-01-01 00:00:00
{code}
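For context, here is a minimal standalone Java sketch (not Spark code; the class name is made up, and the comments about Spark's fallback are inferences from the output above, assuming a US locale and an America/Los_Angeles session time zone) showing why the two parsers disagree on 'YYYY': the legacy path uses java.text.SimpleDateFormat, where 'Y' is the week year and missing week fields get defaults, while the new path uses java.time.DateTimeFormatter, where a week-based year without week-of-week-based-year and day-of-week never resolves to a date.

{code:java}
import java.text.SimpleDateFormat;
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoField;
import java.time.temporal.TemporalAccessor;

public class WeekYearDemo {  // hypothetical name, for illustration only
  public static void main(String[] args) throws Exception {
    // Legacy parser: 'Y' is the week year. "1969-01-01" sets week year 1969,
    // and the missing week-of-year/day-of-week default to week 1 and the
    // first day of the week (Sunday in the US locale), overriding MM-dd:
    SimpleDateFormat legacy = new SimpleDateFormat("YYYY-MM-dd");
    System.out.println(legacy.parse("1969-01-01"));
    // -> Sun Dec 29 00:00:00 ... 1968, matching legacy mode above

    // New parser: 'Y' is also the week-based year, but java.time only
    // resolves it together with week-of-week-based-year and day-of-week,
    // so the parse succeeds without ever producing a date:
    DateTimeFormatter dtf = DateTimeFormatter.ofPattern("YYYY-MM-dd");
    TemporalAccessor parsed = dtf.parse("1969-01-01");
    System.out.println(parsed.isSupported(ChronoField.EPOCH_DAY));
    // -> false; Spark then appears to fill the missing year with the epoch
    //    default 1970, hence the 1970-01-01 result above
  }
}
{code}

Either way, a user writing 'YYYY' almost certainly meant 'yyyy', so both results are surprising.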