[ https://issues.apache.org/jira/browse/SPARK-36093?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-36093:
--------------------------------
    Labels: correctness  (was: )

> The result is incorrect if the partition path case is inconsistent
> ------------------------------------------------------------------
>
>                 Key: SPARK-36093
>                 URL: https://issues.apache.org/jira/browse/SPARK-36093
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Yuming Wang
>            Priority: Major
>              Labels: correctness
>
> Please reproduce this issue using HDFS; it cannot be reproduced with a local HDFS.
> {code:scala}
> sql("create table t1(cal_dt date) using parquet")
> sql("insert into t1 values (date'2021-06-27'),(date'2021-06-28'),(date'2021-06-29'),(date'2021-06-30')")
> sql("create view t1_v as select * from t1")
> sql("CREATE TABLE t2 USING PARQUET PARTITIONED BY (CAL_DT) AS SELECT 1 AS FLAG,CAL_DT FROM t1_v WHERE CAL_DT BETWEEN '2021-06-27' AND '2021-06-28'")
> sql("INSERT INTO t2 SELECT 2 AS FLAG,CAL_DT FROM t1_v WHERE CAL_DT BETWEEN '2021-06-29' AND '2021-06-30'")
> sql("SELECT * FROM t2 WHERE CAL_DT BETWEEN '2021-06-29' AND '2021-06-30'").show
> sql("SELECT * FROM t2 ").show
> {code}
> {noformat}
> // It should not be empty.
> scala> sql("SELECT * FROM t2 WHERE CAL_DT BETWEEN '2021-06-29' AND '2021-06-30'").show
> +----+------+
> |FLAG|CAL_DT|
> +----+------+
> +----+------+
>
> scala> sql("SELECT * FROM t2 ").show
> +----+----------+
> |FLAG|    CAL_DT|
> +----+----------+
> |   1|2021-06-27|
> |   1|2021-06-28|
> +----+----------+
>
> scala> sql("SELECT 2 AS FLAG,CAL_DT FROM t1_v WHERE CAL_DT BETWEEN '2021-06-29' AND '2021-06-30'").show
> +----+----------+
> |FLAG|    CAL_DT|
> +----+----------+
> |   2|2021-06-29|
> |   2|2021-06-30|
> +----+----------+
> {noformat}
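A minimal sketch, not part of the original report, for checking the suspected cause: the CTAS statement spells the partition column as CAL_DT while the later INSERT resolves it through the table schema, so on HDFS (a case-sensitive file system) the two writes may land in differently-cased partition directories. It assumes only the t2 table created by the reproduction above; the statements are standard Spark SQL.

{code:scala}
// Hedged sketch: assumes the t2 table from the reproduction above exists in the
// current session. SHOW PARTITIONS prints the partitions Spark resolves for t2,
// and DESCRIBE FORMATTED exposes the table Location so the on-disk directory
// names (e.g. CAL_DT=2021-06-27 vs cal_dt=2021-06-29) can be compared by hand.
sql("SHOW PARTITIONS t2").show(false)
sql("DESCRIBE FORMATTED t2").filter("col_name = 'Location'").show(false)
{code}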