[ https://issues.apache.org/jira/browse/SPARK-41824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-41824:
------------------------------------

    Assignee:     (was: Apache Spark)

> Implement DataFrame.explain format to be similar to PySpark
> -----------------------------------------------------------
>
>                 Key: SPARK-41824
>                 URL: https://issues.apache.org/jira/browse/SPARK-41824
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Priority: Major
>
> {code:python}
> df = spark.createDataFrame([(14, "Tom"), (23, "Alice"), (16, "Bob")], ["age", "name"])
> df.explain()
> df.explain(True)
> df.explain(mode="formatted")
> df.explain("cost")
> {code}
> {code}
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1296, in pyspark.sql.connect.dataframe.DataFrame.explain
> Failed example:
>     df.explain()
> Expected:
>     == Physical Plan ==
>     *(1) Scan ExistingRDD[age...,name...]
> Got:
>     == Physical Plan ==
>     LocalTableScan [age#1148L, name#1149]
>     <BLANKLINE>
>     <BLANKLINE>
> **********************************************************************
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1314, in pyspark.sql.connect.dataframe.DataFrame.explain
> Failed example:
>     df.explain(mode="formatted")
> Expected:
>     == Physical Plan ==
>     * Scan ExistingRDD (...)
>     (1) Scan ExistingRDD [codegen id : ...]
>     Output [2]: [age..., name...]
>     ...
> Got:
>     == Physical Plan ==
>     LocalTableScan (1)
>     <BLANKLINE>
>     <BLANKLINE>
>     (1) LocalTableScan
>     Output [2]: [age#1170L, name#1171]
>     Arguments: [age#1170L, name#1171]
>     <BLANKLINE>
>     <BLANKLINE>{code}
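> For reference, a minimal sketch (not part of the failing doctest) of what each call in the repro requests from {{explain()}}, run against a classic (non-Connect) SparkSession whose output format is what the Connect doctest expects to match; the session setup here is illustrative, only the {{explain}} calls mirror the repro above:
> {code:python}
> from pyspark.sql import SparkSession
>
> # Classic (non-Connect) session: its explain() output is the format the
> # Connect doctest currently expects.
> spark = SparkSession.builder.getOrCreate()
> df = spark.createDataFrame([(14, "Tom"), (23, "Alice"), (16, "Bob")], ["age", "name"])
>
> df.explain()                  # "simple" mode: physical plan only
> df.explain(True)              # "extended": parsed, analyzed, optimized and physical plans
> df.explain(mode="formatted")  # numbered operators plus a per-operator details section
> df.explain("cost")            # optimized logical plan with statistics, plus physical plan
> {code}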


