gengliangwang edited a comment on pull request #30493:
URL: https://github.com/apache/spark/pull/30493#issuecomment-733467240
> According to the report of SPARK-31710, it was a correctness issue.
I just tried the following in Spark 3.0.0:
```
create table test(id bigint);
inse
```
gengliangwang edited a comment on pull request #30493:
URL: https://github.com/apache/spark/pull/30493#issuecomment-733456524
@dongjoon-hyun I can understand your concern. But if we are going to have
this configuration, then it is reasonable to add a feature flag for each
disallowed conversion.