[ https://issues.apache.org/jira/browse/SPARK-27439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16826542#comment-16826542 ]
Huon Wilson commented on SPARK-27439:
-------------------------------------

I think this (partially) broke {{df.explain(extended = true)}}, as the "Parsed Logical Plan" section is now the same as the analyzed one:

Before (2.4):

{code:none}
scala> spark.range(100).select(col("id")).explain(true)
== Parsed Logical Plan ==
'Project [unresolvedalias('id, None)]
+- Range (0, 100, step=1, splits=Some(12))

== Analyzed Logical Plan ==
id: bigint
Project [id#113L]
+- Range (0, 100, step=1, splits=Some(12))
...
{code}

After (master):

{code:none}
== Parsed Logical Plan ==
Project [id#0L]
+- Range (0, 100, step=1, splits=Some(12))

== Analyzed Logical Plan ==
id: bigint
Project [id#0L]
+- Range (0, 100, step=1, splits=Some(12))
...
{code}

> Use analyzed plan when explaining Dataset
> -----------------------------------------
>
>                 Key: SPARK-27439
>                 URL: https://issues.apache.org/jira/browse/SPARK-27439
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.1
>            Reporter: xjl
>            Assignee: Liang-Chi Hsieh
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> {code}
> scala> spark.range(10).createOrReplaceTempView("test")
> scala> spark.range(5).createOrReplaceTempView("test2")
> scala> spark.sql("select * from test").createOrReplaceTempView("tmp001")
> scala> val df = spark.sql("select * from tmp001")
> scala> spark.sql("select * from test2").createOrReplaceTempView("tmp001")
> scala> df.show
> +---+
> | id|
> +---+
> |  0|
> |  1|
> |  2|
> |  3|
> |  4|
> |  5|
> |  6|
> |  7|
> |  8|
> |  9|
> +---+
> scala> df.explain
> {code}
> Before:
> {code}
> == Physical Plan ==
> *(1) Range (0, 5, step=1, splits=12)
> {code}
> After:
> {code}
> == Physical Plan ==
> *(1) Range (0, 10, step=1, splits=12)
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
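The discrepancy in the issue description can be sketched with a toy model (plain Python, not Spark's actual code; all names here are hypothetical): if a Dataset resolves its view references once, at creation time, then replacing the temp view afterwards can no longer change what {{explain}} reports, which is the behavior the fix establishes.

```python
# Toy model of SPARK-27439 (hypothetical, not Spark internals).
# A "view registry" maps view names to plans; a Dataset captures
# its analyzed (resolved) plan eagerly, at creation.

views = {}

class Dataset:
    def __init__(self, view_name):
        # Parsed plan: still an unresolved reference to the view.
        self.raw_plan = ("view_ref", view_name)
        # Analyzed plan: the view is resolved immediately.
        self.analyzed_plan = views[view_name]

    def explain_old(self):
        # Pre-fix behavior: explain re-resolved the raw plan,
        # so it could disagree with the data the Dataset returns.
        return views[self.raw_plan[1]]

    def explain_new(self):
        # Post-fix behavior: explain reports the plan captured
        # when the Dataset was analyzed.
        return self.analyzed_plan

views["tmp001"] = "Range (0, 10)"
df = Dataset("tmp001")
views["tmp001"] = "Range (0, 5)"   # view replaced after df was created

print(df.explain_old())  # Range (0, 5)  -- stale, disagrees with df.show
print(df.explain_new())  # Range (0, 10) -- matches the data df returns
```

This also illustrates the trade-off Huon Wilson's comment points out: once the analyzed plan is the one being printed, the "Parsed Logical Plan" section of an extended explain no longer shows the unresolved form.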