Yuming Wang created SPARK-29231:
-----------------------------------

             Summary: Cannot infer an additional set of constraints if it contains CAST
                 Key: SPARK-29231
                 URL: https://issues.apache.org/jira/browse/SPARK-29231
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Yuming Wang
How to reproduce:
{code:scala}
scala> spark.sql("create table t1(c11 int, c12 decimal) ")
res0: org.apache.spark.sql.DataFrame = []

scala> spark.sql("create table t2(c21 bigint, c22 decimal) ")
res1: org.apache.spark.sql.DataFrame = []

scala> spark.sql("select t1.*, t2.* from t1 left join t2 on t1.c11=t2.c21 where t1.c11=1").explain
== Physical Plan ==
SortMergeJoin [cast(c11#0 as bigint)], [c21#2L], LeftOuter
:- *(2) Sort [cast(c11#0 as bigint) ASC NULLS FIRST], false, 0
:  +- Exchange hashpartitioning(cast(c11#0 as bigint), 200), true, [id=#30]
:     +- *(1) Filter (isnotnull(c11#0) AND (c11#0 = 1))
:        +- Scan hive default.t1 [c11#0, c12#1], HiveTableRelation `default`.`t1`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [c11#0, c12#1], Statistics(sizeInBytes=8.0 EiB)
+- *(4) Sort [c21#2L ASC NULLS FIRST], false, 0
   +- Exchange hashpartitioning(c21#2L, 200), true, [id=#37]
      +- *(3) Filter isnotnull(c21#2L)
         +- Scan hive default.t2 [c21#2L, c22#3], HiveTableRelation `default`.`t2`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [c21#2L, c22#3], Statistics(sizeInBytes=8.0 EiB)
{code}

Spark only pushes {{c11 = 1}} down to {{t1}}; because the join key is wrapped in a cast, no {{c21 = 1}} filter is inferred for {{t2}}. PostgreSQL supports this inference:
{code:sql}
postgres=# create table t1(c11 int4, c12 decimal);
CREATE TABLE
postgres=# create table t2(c21 int8, c22 decimal);
CREATE TABLE
postgres=# explain select t1.*, t2.* from t1 left join t2 on t1.c11=t2.c21 where t1.c11=1;
                           QUERY PLAN
----------------------------------------------------------------
 Nested Loop Left Join  (cost=0.00..51.43 rows=36 width=76)
   Join Filter: (t1.c11 = t2.c21)
   ->  Seq Scan on t1  (cost=0.00..25.88 rows=6 width=36)
         Filter: (c11 = 1)
   ->  Materialize  (cost=0.00..25.03 rows=6 width=40)
         ->  Seq Scan on t2  (cost=0.00..25.00 rows=6 width=40)
               Filter: (c21 = 1)
(7 rows)
{code}
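In effect, the improvement asks constraint inference to look through the Cast, so that the filter {{t1.c11 = 1}} together with {{cast(t1.c11 as bigint) = t2.c21}} also yields {{t2.c21 = 1}}. Below is a minimal sketch of the idea, using a toy expression tree rather than Spark's Catalyst classes; {{Attr}}, {{Cast}}, {{Lit}}, {{EqualTo}}, {{stripCast}}, {{InferThroughCast}} and {{Demo}} are all hypothetical names, not Spark APIs:
{code:scala}
sealed trait Expr
case class Attr(name: String)               extends Expr
case class Cast(child: Expr, to: String)    extends Expr
case class Lit(value: Any)                  extends Expr
case class EqualTo(left: Expr, right: Expr) extends Expr

object InferThroughCast {
  // Look through casts so that cast(c11 as bigint) and c11 refer to the same attribute.
  private def stripCast(e: Expr): Expr = e match {
    case Cast(child, _) => stripCast(child)
    case other          => other
  }

  // From "a = b" (the left side possibly wrapped in casts) and "a = literal",
  // derive the extra constraint "b = literal". Only this one orientation of the
  // equality is handled, to keep the sketch short.
  def infer(constraints: Set[EqualTo]): Set[EqualTo] = {
    val extra = for {
      EqualTo(l, r)        <- constraints
      EqualTo(a, lit: Lit) <- constraints
      other                 = stripCast(r)
      if stripCast(l) == stripCast(a) && other.isInstanceOf[Attr] && other != stripCast(a)
    } yield EqualTo(other, lit)
    constraints ++ extra
  }
}

object Demo extends App {
  // t1.c11 = t2.c21 becomes cast(t1.c11 as bigint) = t2.c21 after type coercion.
  val joinCond = EqualTo(Cast(Attr("t1.c11"), "bigint"), Attr("t2.c21"))
  val filter   = EqualTo(Attr("t1.c11"), Lit(1))
  // Expected to also contain EqualTo(Attr("t2.c21"), Lit(1)).
  println(InferThroughCast.infer(Set(joinCond, filter)))
}
{code}
In Spark itself this would correspond to teaching the constraint-inference step to treat {{cast(c11 as bigint)}} and {{c11}} as the same column whenever the cast does not change the comparison semantics, so the derived filter can be pushed to the other side of the join as PostgreSQL does above.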