[ https://issues.apache.org/jira/browse/SPARK-17198?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15444781#comment-15444781 ]
Dongjoon Hyun commented on SPARK-17198:
---------------------------------------

Yes. Spark 2.0 greatly improves the SQL features, and I expect Spark 2.0.1 to add more stability fixes.

> ORC fixed char literal filter does not work
> -------------------------------------------
>
>                 Key: SPARK-17198
>                 URL: https://issues.apache.org/jira/browse/SPARK-17198
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>            Reporter: tuming
>
> I get a wrong result when I run the following query in Spark SQL:
>
> select * from orc_table where char_col = '5LZS';
>
> Table orc_table is an ORC-format table, and column char_col is defined as char(6).
> The Hive record reader returns a char(6) string to Spark, but Spark has no fixed-char type, so all fixed-char attributes are converted to String by default. Meanwhile, the constant literal is parsed as a string Literal, so the equality comparison never returns true. For instance: '5LZS' == '5LZS  '.
> However, I get the correct result in Hive using the same data and SQL string, because Hive pads those constant literals with spaces. Please refer to:
> https://issues.apache.org/jira/browse/HIVE-11312
> I found no such patch for Spark.
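
As a minimal sketch of the mismatch and of possible workarounds (not part of the original report): it assumes a Hive-backed orc_table as described above, with char_col declared as CHAR(6) and stored as ORC. The SparkSession API shown is the Spark 2.0 one; on the affected 1.5.x line the equivalent calls would go through HiveContext.sql.

{code:scala}
import org.apache.spark.sql.SparkSession

// Hypothetical session setup; requires a Hive metastore containing orc_table.
val spark = SparkSession.builder()
  .appName("SPARK-17198 sketch")
  .enableHiveSupport()
  .getOrCreate()

// Returns no rows: the ORC/Hive reader yields the CHAR(6) value space-padded
// ("5LZS  "), while the literal stays "5LZS", so the equality never holds.
spark.sql("select * from orc_table where char_col = '5LZS'").show()

// Possible workarounds until Spark pads the literal the way Hive does (HIVE-11312):
// 1) pad the literal to the declared column width:
spark.sql("select * from orc_table where char_col = '5LZS  '").show()
// 2) trim the column before comparing:
spark.sql("select * from orc_table where rtrim(char_col) = '5LZS'").show()
{code}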