[ 
https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Description: 
When I execute a DDL in spark-sql, an AnalysisException is thrown as follows:
{code:java}
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code}
I found that the message of this exception contains two typos.

One is "persistent", which should be "persist".

What does the method named verifyTableProperties do? I checked its comment, which reads:
{code:java}
/**
 * If the given table properties contains datasource properties, throw an exception. We will do
 * this check when create or alter a table, i.e. when we try to write table metadata to Hive
 * metastore.
 */
{code}
So I think this AnalysisException message should change from
{code:java}
throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]")){code}
to
{code:java}
throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
  s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]")){code}
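For context, the behavior described by that method comment can be sketched as a small standalone check. This is a hypothetical simplification for illustration only, not the actual HiveExternalCatalog implementation; the real SPARK_SQL_PREFIX constant in Spark is "spark.sql.".
{code:java}
// Minimal sketch of a verifyTableProperties-style check (hypothetical,
// simplified; not the real HiveExternalCatalog code).
object TablePropertyCheck {
  val SPARK_SQL_PREFIX = "spark.sql."

  // Throws if any property key uses the reserved "spark.sql." prefix,
  // which Spark manages internally and refuses to persist to the metastore.
  def verifyTableProperties(qualifiedName: String,
                            properties: Map[String, String]): Unit = {
    val invalidKeys = properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX)).toSeq
    if (invalidKeys.nonEmpty) {
      throw new IllegalArgumentException(
        s"Cannot persist $qualifiedName into Hive metastore: reserved " +
        "table property keys found: " + invalidKeys.mkString("[", ", ", "]"))
    }
  }
}
{code}
With this sketch, a plain key such as 'test' passes, while a key such as spark.sql.partitionProvider is rejected, matching the error shown above.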

  was:
When I execute a DDL in spark-sql, an AnalysisException is thrown as follows:
{code:java}
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code}
I found that the message of this exception is not correct.

What does the method named verifyTableProperties do? I checked its comment, which reads:
{code:java}
/**
 * If the given table properties contains datasource properties, throw an exception. We will do
 * this check when create or alter a table, i.e. when we try to write table metadata to Hive
 * metastore.
 */
{code}
So I think this AnalysisException message should change from
{code:java}
throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]")){code}
to
{code:java}
throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
  s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]")){code}


> Spark Hive throws an AnalysisException when setting table properties, but this 
> AnalysisException contains two typos.
> -------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26643
>                 URL: https://issues.apache.org/jira/browse/SPARK-26643
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0, 3.0.0
>            Reporter: jiaan.geng
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
