[jira] [Commented] (SPARK-42513) Push down topK through join

2023-03-15 Thread Yuming Wang (Jira)


[ https://issues.apache.org/jira/browse/SPARK-42513?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17700945#comment-17700945 ]

Yuming Wang commented on SPARK-42513:
-------------------------------------

TiDB already pushes the TopN (top-K) below the joins in this case:

{code:sql}
create table t1(a int, b int);
create table t2(c int, d int);
create table t3(e int, f int);

desc select * from t1 left join t2 on a = c left join t3 on a = e order by b limit 1000;
{code}

{noformat}
mysql> desc select * from t1 left join t2 on a = c left join t3 on a = e order by b limit 1000;
+----------------------------------+----------+-----------+---------------+------------------------------------------------------+
| id                               | estRows  | task      | access object | operator info                                        |
+----------------------------------+----------+-----------+---------------+------------------------------------------------------+
| TopN_17                          | 1000.00  | root      |               | test1.t1.b, offset:0, count:1000                     |
| └─HashJoin_23                    | 1250.00  | root      |               | left outer join, equal:[eq(test1.t1.a, test1.t3.e)]  |
|   ├─TopN_26(Build)               | 1000.00  | root      |               | test1.t1.b, offset:0, count:1000                     |
|   │ └─HashJoin_32                | 1250.00  | root      |               | left outer join, equal:[eq(test1.t1.a, test1.t2.c)]  |
|   │   ├─TopN_33(Build)           | 1000.00  | root      |               | test1.t1.b, offset:0, count:1000                     |
|   │   │ └─TableReader_42         | 1000.00  | root      |               | data:TopN_41                                         |
|   │   │   └─TopN_41              | 1000.00  | cop[tikv] |               | test1.t1.b, offset:0, count:1000                     |
|   │   │     └─TableFullScan_40   | 1.00     | cop[tikv] | table:t1      | keep order:false, stats:pseudo                       |
|   │   └─TableReader_47(Probe)    | 9990.00  | root      |               | data:Selection_46                                    |
|   │     └─Selection_46           | 9990.00  | cop[tikv] |               | not(isnull(test1.t2.c))                              |
|   │       └─TableFullScan_45     | 1.00     | cop[tikv] | table:t2      | keep order:false, stats:pseudo                       |
|   └─TableReader_50(Probe)        | 9990.00  | root      |               | data:Selection_49                                    |
|     └─Selection_49               | 9990.00  | cop[tikv] |               | not(isnull(test1.t3.e))                              |
|       └─TableFullScan_48         | 1.00     | cop[tikv] | table:t3      | keep order:false, stats:pseudo                       |
+----------------------------------+----------+-----------+---------------+------------------------------------------------------+
14 rows in set (0.00 sec)
{noformat}
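
In the plan above, TopN_26 and TopN_33 are copies of the final TopN_17 pushed below the two hash joins: both joins are left outer joins and the sort key b comes only from t1, so ordering and limiting t1 before joining cannot lose any row of the final top 1000 (up to ties in b), while the outermost TopN still trims the result because join matches may duplicate t1 rows. A hand-written SQL equivalent of that plan shape, shown here only as an illustration, would be:

{code:sql}
-- Illustration only: the shape TiDB derives automatically for the query above.
select *
from (select * from t1 order by b limit 1000) t1  -- top-K pushed to the preserved side
left join t2 on a = c
left join t3 on a = e
order by b
limit 1000;  -- final top-K is still required, since matches can duplicate t1 rows
{code}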



> Push down topK through join
> ---
>
> Key: SPARK-42513
> URL: https://issues.apache.org/jira/browse/SPARK-42513
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Yuming Wang
>Priority: Major
> Attachments: after-UI.png, before-UI.png
>
>
> {code:scala}
> spark.range(1).selectExpr("id % 1 as a", "id as b").write.saveAsTable("t1")
> spark.range(1).selectExpr("id % 1 as x", "id as y").write.saveAsTable("t2")
> sql("select * from t1 left join t2 on a = x order by b limit 5").collect()
> spark.sql("set spark.sql.optimizer.excludedRules=org.apache.spark.sql.catalyst.optimizer.LimitPushDown")
> sql("select * from t1 left join t2 on a = x order by b limit 5").collect()
> {code}
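
For reference, the plan change behind the before/after UI screenshots can also be inspected by comparing EXPLAIN output for the two configurations. A minimal sketch, not part of the original report, using the tables created in the snippet above and the same excludedRules value:

{code:sql}
-- Plan with the default rule set (LimitPushDown is enabled by default).
explain select * from t1 left join t2 on a = x order by b limit 5;

-- Exclude LimitPushDown, then re-check the plan of the same query.
set spark.sql.optimizer.excludedRules=org.apache.spark.sql.catalyst.optimizer.LimitPushDown;
explain select * from t1 left join t2 on a = x order by b limit 5;
{code}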





[jira] [Commented] (SPARK-42513) Push down topK through join

2023-02-21 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-42513?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17691896#comment-17691896 ]

Apache Spark commented on SPARK-42513:
--------------------------------------

User 'wangyum' has created a pull request for this issue:
https://github.com/apache/spark/pull/40114

> Push down topK through join
> ---
>
> Key: SPARK-42513
> URL: https://issues.apache.org/jira/browse/SPARK-42513
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Yuming Wang
>Priority: Major
> Attachments: after-UI.png, before-UI.png
>
>
> {code:scala}
> spark.range(1).selectExpr("id % 1 as a", "id as b").write.saveAsTable("t1")
> spark.range(1).selectExpr("id % 1 as x", "id as y").write.saveAsTable("t2")
> sql("select * from t1 left join t2 on a = x order by b limit 5").collect()
> spark.sql("set spark.sql.optimizer.excludedRules=org.apache.spark.sql.catalyst.optimizer.LimitPushDown")
> sql("select * from t1 left join t2 on a = x order by b limit 5").collect()
> {code}


