[ https://issues.apache.org/jira/browse/SPARK-25452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16628433#comment-16628433 ]
Meethu Mathew commented on SPARK-25452:
---------------------------------------

This is not a duplicate of -SPARK-24829-.

!image-2018-09-26-14-14-47-504.png!

> Query with where clause is giving unexpected result in case of float column
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-25452
>                 URL: https://issues.apache.org/jira/browse/SPARK-25452
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>        Environment: Spark 2.3.1
>                     Hadoop 2.7.2
>            Reporter: Ayush Anubhava
>            Priority: Major
>        Attachments: image-2018-09-26-14-14-47-504.png
>
> *Description*: A query with a WHERE clause on a float column gives an unexpected result.
>
> A filter with greater-than-or-equal behaves as expected, but a filter with less-than-or-equal gives an inappropriate result:
> {code}
> 0: jdbc:hive2://10.18.18.214:23040/default> create table k2 (a int, b float);
> +---------+--+
> | Result  |
> +---------+--+
> +---------+--+
> 0: jdbc:hive2://10.18.18.214:23040/default> insert into table k2 values (0,0.0);
> +---------+--+
> | Result  |
> +---------+--+
> +---------+--+
> 0: jdbc:hive2://10.18.18.214:23040/default> insert into table k2 values (1,1.1);
> +---------+--+
> | Result  |
> +---------+--+
> +---------+--+
> 0: jdbc:hive2://10.18.18.214:23040/default> select * from k2 where b >= 0.0;
> +----+--------------------+--+
> | a  |         b          |
> +----+--------------------+--+
> | 0  | 0.0                |
> | 1  | 1.100000023841858  |
> +----+--------------------+--+
>
> Query with filter less than or equal to is giving an inappropriate result:
>
> 0: jdbc:hive2://10.18.18.214:23040/default> select * from k2 where b <= 1.1;
> +----+------+--+
> | a  |  b   |
> +----+------+--+
> | 0  | 0.0  |
> +----+------+--+
> 1 row selected (0.299 seconds)
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
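[Editor's note] A minimal sketch, in plain Python rather than Spark, of the likely mechanism behind the result above: the value 1.1 cannot be represented exactly as a 32-bit float, so the FLOAT column stores a value slightly above 1.1. Assuming Spark widens the stored float and the literal `1.1` to double precision before comparing (the usual behavior for mixed-precision comparisons), the widened stored value exceeds the double 1.1, so `b <= 1.1` excludes the row. This is an illustration of IEEE 754 rounding, not a reproduction of Spark's actual analyzer rules.

```python
import struct

def to_float32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float, then widen it back
    to a double -- mimicking what happens when a FLOAT column value is
    compared against a double-precision literal."""
    return struct.unpack('f', struct.pack('f', x))[0]

b = to_float32(1.1)      # the value actually stored in the FLOAT column
print(b)                 # prints 1.100000023841858, matching the query output
print(b <= 1.1)          # False: the widened float32 is slightly above the double 1.1
print(to_float32(0.0) >= 0.0)  # True: 0.0 is exactly representable, so >= 0.0 matches
```

Under this reading the query is arguably working as designed for binary floats; comparing against `cast(1.1 as float)` or using a DECIMAL column would avoid the surprise.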