gengliangwang edited a comment on pull request #30493: URL: https://github.com/apache/spark/pull/30493#issuecomment-733467240
> According to the report of SPARK-31710, it was a correctness issue.

I just tried the following in Spark 3.0.0:
```sql
create table test(id bigint);
insert into test select 1586318188000;
create table test1(id bigint) partitioned by (year string);
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table test1 partition(year) select 234, cast(id as TIMESTAMP) from test;
```
and there is no exception as reported in https://issues.apache.org/jira/browse/SPARK-31710 (the affected version there is 2.4.5).

Casting from Long type to Timestamp type can be ambiguous. By default, Spark interprets the Long value as seconds since the Unix epoch, and that interpretation is what https://issues.apache.org/jira/browse/SPARK-31710 reports as a "wrong result". That's why we should disallow this cast in ANSI mode.

> Are you using ANSI mode in production really?

It is possible when it is ready.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
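To make the ambiguity concrete, here is a quick sketch (plain Python, not Spark) showing how the same literal from the repro above lands in 2020 when read as milliseconds since the Unix epoch, but tens of thousands of years in the future when read as seconds, which is the interpretation Spark's `cast(bigint as TIMESTAMP)` uses by default:

```python
from datetime import datetime, timezone

v = 1586318188000  # the bigint literal from the repro above

# Interpreted as *milliseconds* since the Unix epoch (what the reporter
# of SPARK-31710 apparently expected), the value is a date in April 2020:
as_ms = datetime.fromtimestamp(v / 1000, tz=timezone.utc)
print(as_ms)  # 2020-04-08 ... UTC

# Interpreted as *seconds* since the epoch (Spark's default for this cast),
# the same value lands roughly in the year 52238:
seconds_per_year = 365.2425 * 86400
print(1970 + v / seconds_per_year)
```

Neither reading is wrong in the abstract, which is exactly why a silent implicit conversion is dangerous and why rejecting the cast under ANSI mode is the safer behavior.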