[ 
https://issues.apache.org/jira/browse/SPARK-29634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16962791#comment-16962791
 ] 

Zhaoyang Qin commented on SPARK-29634:
--------------------------------------

I didn't find any similar issues. Is this a duplicate? Does anyone have any
suggestions?

Also, does anyone know whether spark-sql queries run faster than Hive SQL when
the tables are created in Spark's warehouse?

> spark-sql can't query hive table values with schema Char by equality.
> ---------------------------------------------------------------------
>
>                 Key: SPARK-29634
>                 URL: https://issues.apache.org/jira/browse/SPARK-29634
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 2.2.1, 2.3.1
>         Environment: Spark 2.3.1
> Hive 3.0.0
> TPCDS Data & Tables
>            Reporter: Zhaoyang Qin
>            Priority: Major
>              Labels: spark-sql, spark-sql-perf
>
> spark-sql can't query Hive table values with a char schema by equality.
> When I use the spark-sql CLI to run a query against Hive tables, the
> expected results are not returned: the result set is empty. Some equality
> conditions did not work as expected. I checked and found that the affected
> columns had one thing in common: they were created as char (sometimes as
> varchar). Then I executed the following statement and got an empty result:
> select * from foo where bar = 'something'. In fact, the data does exist,
> and running the same query through Hive SQL returns the correct results.
> Simulation steps (using the spark-sql CLI):
> {code}
> ./spark-sql
> > create table test1 (name char(10), age int);
> > insert into test1 values('erya',15);
> > select * from test1 where name = 'erya';        -- no results
> > select * from test1 where name = 'erya      ';  -- 6 trailing spaces pad the literal to char(10); the row is returned
> {code}
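
A minimal workaround sketch for the behaviour shown above, assuming the empty
result comes from Hive's char(10) space padding (rtrim and rpad are built-in
Spark SQL functions; this is an editor's suggestion, not part of the original
report):

{code:sql}
-- Trim the space-padded char column before comparing:
select * from test1 where rtrim(name) = 'erya';

-- Or pad the literal to the declared char(10) length instead:
select * from test1 where name = rpad('erya', 10, ' ');
{code}

Declaring the column as string instead of char should also avoid the padding
on the Spark side.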



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
