Takeshi Yamamuro created SPARK-32704:
----------------------------------------

             Summary: Logging plan changes for execution
                 Key: SPARK-32704
                 URL: https://issues.apache.org/jira/browse/SPARK-32704
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.1.0
            Reporter: Takeshi Yamamuro


We currently only log plan changes for the analyzer and the optimizer, so this ticket targets adding code to also log plan changes made in the preparation phase of QueryExecution, i.e., when the executed plan is built. Example output:

{code}
scala> spark.sql("SET spark.sql.optimizer.planChangeLog.level=WARN")
scala> spark.range(10).groupBy("id").count().queryExecution.executedPlan
...
20/08/26 09:32:36 WARN PlanChangeLogger: 
=== Applying Rule org.apache.spark.sql.execution.CollapseCodegenStages ===
!HashAggregate(keys=[id#19L], functions=[count(1)], output=[id#19L, count#23L])                *(1) HashAggregate(keys=[id#19L], functions=[count(1)], output=[id#19L, count#23L])
!+- HashAggregate(keys=[id#19L], functions=[partial_count(1)], output=[id#19L, count#27L])     +- *(1) HashAggregate(keys=[id#19L], functions=[partial_count(1)], output=[id#19L, count#27L])
!   +- Range (0, 10, step=1, splits=4)                                                            +- *(1) Range (0, 10, step=1, splits=4)

20/08/26 09:32:36 WARN PlanChangeLogger: 
=== Result of Batch Preparations ===
!HashAggregate(keys=[id#19L], functions=[count(1)], output=[id#19L, count#23L])                *(1) HashAggregate(keys=[id#19L], functions=[count(1)], output=[id#19L, count#23L])
!+- HashAggregate(keys=[id#19L], functions=[partial_count(1)], output=[id#19L, count#27L])     +- *(1) HashAggregate(keys=[id#19L], functions=[partial_count(1)], output=[id#19L, count#27L])
!   +- Range (0, 10, step=1, splits=4)                                                            +- *(1) Range (0, 10, step=1, splits=4)
{code}
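
A minimal sketch of one way this could be wired up, assuming the existing Catalyst PlanChangeLogger is reused inside QueryExecution when the preparation rules are applied. The hook point, field, and method names below are assumptions for illustration, not the final patch:

{code}
// Sketch only (not the final implementation): log each preparation rule's
// effect and the overall batch result, mirroring what RuleExecutor already
// does for analyzer/optimizer rules.
import org.apache.spark.sql.catalyst.rules.PlanChangeLogger
import org.apache.spark.sql.execution.SparkPlan

// Assumed to live inside QueryExecution, next to the existing `preparations` rules.
private val planChangeLogger = new PlanChangeLogger[SparkPlan]()

private def prepareForExecution(plan: SparkPlan): SparkPlan = {
  val preparedPlan = preparations.foldLeft(plan) { case (currentPlan, rule) =>
    val result = rule.apply(currentPlan)
    // Logged at the level set by spark.sql.optimizer.planChangeLog.level
    planChangeLogger.logRule(rule.ruleName, currentPlan, result)
    result
  }
  planChangeLogger.logBatch("Preparations", plan, preparedPlan)
  preparedPlan
}
{code}

With spark.sql.optimizer.planChangeLog.level=WARN set, this would produce the "Applying Rule ..." and "Result of Batch Preparations" messages shown above whenever a preparation rule changes the physical plan.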


