This is an automated email from the ASF dual-hosted git repository.

joshrosen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8765eea1c08 [SPARK-39422][SQL] Improve error message for 'SHOW CREATE TABLE' with unsupported serdes
8765eea1c08 is described below

commit 8765eea1c08bc58a0cfc22b7cfbc0b5645cc81f9
Author: Josh Rosen <joshro...@databricks.com>
AuthorDate: Thu Jun 9 12:34:27 2022 -0700

    [SPARK-39422][SQL] Improve error message for 'SHOW CREATE TABLE' with unsupported serdes
    
    ### What changes were proposed in this pull request?
    
    This PR improves the error message that is thrown when trying to run `SHOW CREATE TABLE` on a Hive table with an unsupported serde. Currently this results in an error like
    
    ```
    org.apache.spark.sql.AnalysisException: Failed to execute SHOW CREATE TABLE against table rcFileTable, which is created by Hive and uses the following unsupported serde configuration
     SERDE: org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe INPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileInputFormat OUTPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
    ```
    
    This patch improves this error message by adding a suggestion to use `SHOW CREATE TABLE ... AS SERDE`:
    
    ```
    org.apache.spark.sql.AnalysisException: Failed to execute SHOW CREATE TABLE against table rcFileTable, which is created by Hive and uses the following unsupported serde configuration
     SERDE: org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe INPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileInputFormat OUTPUTFORMAT: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
    Please use `SHOW CREATE TABLE rcFileTable AS SERDE` to show Hive DDL instead.
    ```
    
    The suggestion's wording is consistent with other error messages thrown by SHOW CREATE TABLE.
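
    The shape of the patched message can be sketched as a standalone function. This is an illustrative sketch only: the object and parameter names here are hypothetical, while the real logic lives in `QueryCompilationErrors` and works from the table's `CatalogTable` and a pre-built `StringBuilder` of serde properties.

    ```scala
    // Hypothetical standalone sketch of the error-message format after this
    // patch. `tableIdentifier` stands in for table.identifier and `serdeConfig`
    // for builder.toString() in the real Spark code.
    object ShowCreateTableErrorSketch {
      def message(tableIdentifier: String, serdeConfig: String): String =
        s"Failed to execute SHOW CREATE TABLE against table $tableIdentifier, " +
          "which is created by Hive and uses the " +
          "following unsupported serde configuration\n" +
          serdeConfig + "\n" +
          s"Please use `SHOW CREATE TABLE $tableIdentifier AS SERDE` to show Hive DDL instead."
    }
    ```

    For the `rcFileTable` example above, the last line of the produced message is the new suggestion, while the lines before it are unchanged from the old message.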
    
    ### Why are the changes needed?
    
    The existing error message is confusing.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, it improves a user-facing error message.
    
    ### How was this patch tested?
    
    Manually tested with
    
    ```
    CREATE TABLE rcFileTable(i INT)
        ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe'
        STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.RCFileInputFormat'
                  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.RCFileOutputFormat'
    
    SHOW CREATE TABLE rcFileTable
    ```
    
    to trigger the error. Confirmed that the `AS SERDE` suggestion actually works.
    
    Closes #36814 from JoshRosen/suggest-show-create-table-as-serde-in-error-message.
    
    Authored-by: Josh Rosen <joshro...@databricks.com>
    Signed-off-by: Josh Rosen <joshro...@databricks.com>
---
 .../scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala     | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
index 68f4320ff67..2a8692efd0d 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
@@ -1992,7 +1992,8 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase {
     new AnalysisException("Failed to execute SHOW CREATE TABLE against table " +
         s"${table.identifier}, which is created by Hive and uses the " +
         "following unsupported serde configuration\n" +
-        builder.toString()
+        builder.toString() + "\n" +
+        s"Please use `SHOW CREATE TABLE ${table.identifier} AS SERDE` to show Hive DDL instead."
     )
   }
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
