[ 
https://issues.apache.org/jira/browse/SPARK-32765?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Leanken.Lin updated SPARK-32765:
--------------------------------
    Environment:     (was: Currently, the EliminateJoinToEmptyRelation rule 
converts a Join into an EmptyRelation in some cases when AQE is on. But if 
either sub-plan of the Join contains a ShuffleQueryStage(canChangeNumPartitions 
== false), which means the Exchange was produced by a repartition or a 
singlePartition requirement, converting it into an EmptyRelation would lose the 
user-specified partition-number information for downstream operators, which is 
not right.

So in this patch, the conversion is not performed if either sub-plan of the 
Join contains a ShuffleQueryStage(canChangeNumPartitions == false))

> EliminateJoinToEmptyRelation should respect exchange behavior when 
> canChangeNumPartitions == false
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32765
>                 URL: https://issues.apache.org/jira/browse/SPARK-32765
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Leanken.Lin
>            Priority: Major
>
> Currently, the EliminateJoinToEmptyRelation rule converts a Join into an 
> EmptyRelation in some cases when AQE is on. But if either sub-plan of the 
> Join contains a ShuffleQueryStage(canChangeNumPartitions == false), which 
> means the Exchange was produced by a repartition or a singlePartition 
> requirement, converting it into an EmptyRelation would lose the 
> user-specified partition-number information for downstream operators, which 
> is not right.
> So in this patch, the conversion is not performed if either sub-plan of the 
> Join contains a ShuffleQueryStage(canChangeNumPartitions == false).
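> A minimal sketch of the intended guard, assuming the Spark 3.0-era AQE API 
> (a ShuffleQueryStageExec wrapping a ShuffleExchangeExec that exposes a 
> canChangeNumPartitions flag); the helper name hasFixedPartitioning is 
> hypothetical, not the actual patch:
>
>   import org.apache.spark.sql.execution.SparkPlan
>   import org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec
>
>   // Returns true if the sub-plan contains a shuffle whose output
>   // partitioning was requested by the user (repartition / singlePartition)
>   // and therefore must not be discarded by rewriting to an EmptyRelation.
>   private def hasFixedPartitioning(plan: SparkPlan): Boolean = plan match {
>     case stage: ShuffleQueryStageExec => !stage.shuffle.canChangeNumPartitions
>     case _ => plan.children.exists(hasFixedPartitioning)
>   }
>
>   // Inside EliminateJoinToEmptyRelation, the rewrite would be skipped when
>   // either side of the join has such a stage:
>   //   if (hasFixedPartitioning(join.left) || hasFixedPartitioning(join.right))
>   //     join          // keep the join, preserving the partitioning
>   //   else
>   //     emptyRelation // safe to eliminate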



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
