[ https://issues.apache.org/jira/browse/SPARK-34034?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17260095#comment-17260095 ]

Shixiong Zhu commented on SPARK-34034:
--------------------------------------

NVM. I just realized `show create table` didn't return the correct create table
command for a v2 table in Spark 3.0.1 (the table schema was missing). Yep, it
makes sense to disallow it since it returns an incorrect answer.
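
For reference, here is a sketch of what a correct result for the `foo` table
from the repro below should contain (the exact formatting may vary across Spark
versions; the point is that the column list has to be present):

{code:java}
scala> spark.sql("show create table foo").show(false)
// A correct createtab_stmt should include the table schema, e.g.:
// CREATE TABLE `default`.`foo` (
//   `i` INT)
// USING delta
{code}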

> "show create table" doesn't work for v2 table
> ---------------------------------------------
>
>                 Key: SPARK-34034
>                 URL: https://issues.apache.org/jira/browse/SPARK-34034
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Shixiong Zhu
>            Priority: Blocker
>              Labels: regression
>
> I was QAing Spark 3.1.0 RC1 and found one regression: "show create table"
> doesn't work for a v2 table.
> But when using Spark 3.0.1, "show create table" works for a v2 table.
> Steps to test:
> {code:java}
> /bin/spark-shell --packages io.delta:delta-core_2.12:0.7.0 --conf 
> "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf 
> "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
> scala> spark.sql("create table foo(i INT) using delta")
> res0: org.apache.spark.sql.DataFrame = []
> scala> spark.sql("show create table foo").show(false)
> +-----------------------------------------------+
> |createtab_stmt                                 |
> +-----------------------------------------------+
> |CREATE TABLE `default`.`foo` (
>   )
> USING delta
> |
> +-----------------------------------------------+
> {code}
> Looks like it's caused by [https://github.com/apache/spark/pull/30321],
> which blocks "show create table" for v2 tables.


