[ https://issues.apache.org/jira/browse/SPARK-33038?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17204958#comment-17204958 ]
Apache Spark commented on SPARK-33038:
--------------------------------------

User 'allisonwang-db' has created a pull request for this issue:
https://github.com/apache/spark/pull/29915

> AQE plan string should only display one plan when the initial and the current plan are the same
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-33038
>                 URL: https://issues.apache.org/jira/browse/SPARK-33038
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Allison Wang
>            Assignee: Apache Spark
>            Priority: Minor
>
> Currently, the AQE plan string displays both the initial plan and the current or the final plan. This is redundant when the initial plan and the current physical plan are exactly the same. For instance, the `EXPLAIN` command does not actually execute the query, so the plan string never changes, yet it still shows both the current and the initial plan:
>
> {code:java}
> AdaptiveSparkPlan (8)
> +- == Current Plan ==
>    Sort (7)
>    +- Exchange (6)
>       +- HashAggregate (5)
>          +- Exchange (4)
>             +- HashAggregate (3)
>                +- Filter (2)
>                   +- Scan parquet default.explain_temp1 (1)
> +- == Initial Plan ==
>    Sort (7)
>    +- Exchange (6)
>       +- HashAggregate (5)
>          +- Exchange (4)
>             +- HashAggregate (3)
>                +- Filter (2)
>                   +- Scan parquet default.explain_temp1 (1)
> {code}
>
> When the initial and the current plan are the same, only one plan string should be displayed. For example:
>
> {code:java}
> AdaptiveSparkPlan (8)
> +- Sort (7)
>    +- Exchange (6)
>       +- HashAggregate (5)
>          +- Exchange (4)
>             +- HashAggregate (3)
>                +- Filter (2)
>                   +- Scan parquet default.explain_temp1 (1)
> {code}
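
For context, a minimal sketch of how a plan string like the one above can be reproduced with AQE enabled. This is not part of the ticket: the table-creation step, the local master, the app name, and the exact query are assumptions chosen to roughly match the operators shown in the plan, and the printed output varies by Spark version.

{code:scala}
// Minimal sketch (assumptions: local session, synthetic table, illustrative query).
import org.apache.spark.sql.SparkSession

object AqeExplainSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("aqe-explain-sketch")
      .master("local[*]")
      // Enable AQE so the physical plan is wrapped in AdaptiveSparkPlan.
      .config("spark.sql.adaptive.enabled", "true")
      .getOrCreate()

    // Hypothetical table standing in for default.explain_temp1 from the plan above.
    spark.range(100)
      .selectExpr("id AS key", "id % 10 AS value")
      .write.mode("overwrite").saveAsTable("explain_temp1")

    // EXPLAIN does not execute the query, so the AQE plan is never re-optimized:
    // the "current" plan stays identical to the initial plan.
    spark.sql(
      """EXPLAIN FORMATTED
        |SELECT key, count(*)
        |FROM explain_temp1
        |WHERE key > 10
        |GROUP BY key
        |ORDER BY key""".stripMargin
    ).show(truncate = false)

    spark.stop()
  }
}
{code}

With the change proposed in this issue, the same EXPLAIN would print the single-plan form instead of repeating identical "Current Plan" and "Initial Plan" sections.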