[ https://issues.apache.org/jira/browse/SPARK-21774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16631568#comment-16631568 ]
ice bai commented on SPARK-21774:
---------------------------------

I met the same problem in Spark 2.3.0. The following are some tests:

```
spark-sql> select '11111111'>0;
true
Time taken: 0.078 seconds, Fetched 1 row(s)
spark-sql> select '111111111111'>0;
NULL
Time taken: 0.065 seconds, Fetched 1 row(s)
spark-sql> select '1.0'=1;
true
Time taken: 0.054 seconds, Fetched 1 row(s)
spark-sql> select '1.2'=1;
true
Time taken: 0.07 seconds, Fetched 1 row(s)
```

When I set the log level to trace, I found this:

```
=== Applying Rule org.apache.spark.sql.catalyst.analysis.TypeCoercion$PromoteStrings ===
!'Project [unresolvedalias((111111111111 > 0), None)]   'Project [unresolvedalias((cast(111111111111 as int) > 0), None)]
 +- OneRowRelation                                      +- OneRowRelation
```

> The rule PromoteStrings cast string to a wrong data type
> --------------------------------------------------------
>
>                 Key: SPARK-21774
>                 URL: https://issues.apache.org/jira/browse/SPARK-21774
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: StanZhai
>            Priority: Critical
>              Labels: correctness
>
> Data:
> {code}
> create temporary view tb as select * from values
>   ("0", 1),
>   ("-0.1", 2),
>   ("1", 3)
> as grouping(a, b)
> {code}
> SQL:
> {code}
> select a, b from tb where a=0
> {code}
> The result, which is wrong:
> {code}
> +----+---+
> |   a|  b|
> +----+---+
> |   0|  1|
> |-0.1|  2|
> +----+---+
> {code}
> Logical Plan:
> {code}
> == Parsed Logical Plan ==
> 'Project ['a]
> +- 'Filter ('a = 0)
>    +- 'UnresolvedRelation `src`
>
> == Analyzed Logical Plan ==
> a: string
> Project [a#8528]
> +- Filter (cast(a#8528 as int) = 0)
>    +- SubqueryAlias src
>       +- Project [_1#8525 AS a#8528, _2#8526 AS b#8529]
>          +- LocalRelation [_1#8525, _2#8526]
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
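Both symptoms above come from the same root cause: PromoteStrings coerces the string side to the numeric side's type (here `int`), so the value is truncated toward zero ('-0.1' and '1.2' collapse to 0 and 1) or overflows the 32-bit range and becomes NULL ('111111111111'). The following is a minimal sketch, not Spark's actual code, that approximates this cast behavior to show why the comparisons go wrong; the helper name `cast_string_to_int` is hypothetical.

```python
def cast_string_to_int(s):
    """Rough model of Spark's string-to-int cast applied by PromoteStrings.

    Parses the string as a decimal number, truncates toward zero, and
    returns None (SQL NULL) when parsing fails or the value does not fit
    in a signed 32-bit int. This is an illustrative approximation, not
    the real Catalyst implementation.
    """
    try:
        v = int(float(s))  # truncates toward zero: '-0.1' -> 0, '1.2' -> 1
    except ValueError:
        return None
    if not (-2**31 <= v <= 2**31 - 1):
        # overflow -> NULL: this is why '111111111111' > 0 yields NULL
        return None
    return v


# Why `where a=0` matched the rows "0" AND "-0.1" in the issue:
for a in ["0", "-0.1", "1"]:
    c = cast_string_to_int(a)
    print(a, "->", c, "| matches a=0:", c == 0)
```

Casting the literal to string (or both sides to double) instead of casting the column to int would preserve the distinction between '0' and '-0.1', which is essentially what the correctness complaint in this ticket is about.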