[ https://issues.apache.org/jira/browse/SPARK-26821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16759381#comment-16759381 ]
Sujith commented on SPARK-26821:
--------------------------------

As per the initial analysis, this happens because the declared char data type length is 5, whereas the inserted value has length 2. Since the column is a char type, the system pads the remaining part of the stored value with spaces. When a filter is applied, the predicate value is compared against the padded table data, e.g. 'ds' == 'ds   ', which leads to a wrong (empty) result. I am analyzing this issue further; please let me know if you have any suggestions or guidance. Thanks.

> filters not working with char datatype when querying against hive table
> -----------------------------------------------------------------------
>
>                 Key: SPARK-26821
>                 URL: https://issues.apache.org/jira/browse/SPARK-26821
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Sujith
>            Priority: Major
>
> Create a table with a char type field. When inserting data into the char
> column, if the data string length is less than the declared datatype
> length, Spark 2.x does not process filter queries properly, leading to an
> incorrect (empty) result.
>
> 0: jdbc:hive2://10.19.89.222:22550/default> create table jj(id int, name char(5));
> +---------+
> | Result  |
> +---------+
> +---------+
> No rows selected (0.894 seconds)
> 0: jdbc:hive2://10.19.89.222:22550/default> insert into table jj values(232,'ds');
> +---------+
> | Result  |
> +---------+
> +---------+
> No rows selected (1.815 seconds)
> 0: jdbc:hive2://10.19.89.222:22550/default> select * from jj where name='ds';
> +-----+-------+
> | id  | name  |
> +-----+-------+
> +-----+-------+
>
> The above query returns no result, although the inserted row should match.
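The comparison mismatch described above can be illustrated with a minimal sketch in plain Python (not Spark internals): CHAR(5) storage right-pads the value with spaces, but the filter literal is compared unpadded. The helper names below (char_pad, char_equals) are hypothetical, chosen only for illustration; the pad-then-compare fix mirrors ANSI PAD SPACE comparison semantics, not any specific Spark patch.

```python
def char_pad(value: str, length: int) -> str:
    """Simulate CHAR(n) storage: right-pad the value with spaces
    to the declared column length."""
    return value.ljust(length)

# What the table actually stores for name CHAR(5) after inserting 'ds':
stored = char_pad("ds", 5)   # 'ds   ' (three trailing spaces)
predicate = "ds"             # the unpadded filter literal

# Naive equality fails, so the row is filtered out:
print(stored == predicate)   # False

def char_equals(a: str, b: str) -> bool:
    """Pad-space comparison: right-pad the shorter operand before
    comparing, so trailing spaces do not affect equality."""
    width = max(len(a), len(b))
    return a.ljust(width) == b.ljust(width)

# With pad-space semantics the row matches again:
print(char_equals(stored, predicate))  # True
```

An equivalent SQL-side workaround would be to trim or pad one side explicitly, e.g. filtering on rtrim(name)='ds', at the cost of hiding genuinely significant trailing spaces.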
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org