[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Description: 
When I execute a DDL in spark-sql, it throws an AnalysisException as follows:
{code:java}
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code}
I found that the message of this exception is not correct.

What is the purpose of the method named verifyTableProperties? I checked the comment of this method, which says:
{code:java}
/**
 * If the given table properties contains datasource properties, throw an exception. We will do
 * this check when create or alter a table, i.e. when we try to write table metadata to Hive
 * metastore.
 */
{code}
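Judging from the error message and this comment, the check rejects any table property key that starts with the internal 'spark.sql.' prefix. Below is only a rough sketch of that check, not the actual Spark source; the object name, method signature and sample property values are invented for illustration. It shows why the message reads oddly: the only key supplied by the user is 'test', but the key that trips the check is spark.sql.partitionProvider, which Spark manages internally.
{code:java}
// Rough sketch only, NOT the actual Spark source: an approximation of the check in
// HiveExternalCatalog.verifyTableProperties, reconstructed from the error message quoted above.
// The object name and the sample property values are hypothetical.
object VerifyTablePropertiesSketch {
  private val SparkSqlPrefix = "spark.sql."

  def verifyTableProperties(qualifiedName: String, properties: Map[String, String]): Unit = {
    val invalidKeys = properties.keys.filter(_.startsWith(SparkSqlPrefix)).toSeq
    if (invalidKeys.nonEmpty) {
      // Spark throws org.apache.spark.sql.AnalysisException at this point; a plain exception
      // is used here so the sketch compiles outside the Spark source tree.
      throw new IllegalArgumentException(
        s"Cannot persistent $qualifiedName into hive metastore as table property keys " +
          s"may not start with '$SparkSqlPrefix': " + invalidKeys.mkString("[", ", ", "]"))
    }
  }

  def main(args: Array[String]): Unit = {
    // The only key set by the ALTER TABLE above is 'test'; spark.sql.partitionProvider is a
    // property Spark itself writes for datasource tables (the value below is illustrative),
    // yet it is the key that makes the check fail.
    val storedProperties = Map(
      "test" -> "test",
      "spark.sql.partitionProvider" -> "catalog")
    verifyTableProperties("work.gja_test3", storedProperties)
  }
}
{code}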
So I think the 

 

 

> Spark hive throws an incorrect AnalysisException when setting table properties.
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-26643
>                 URL: https://issues.apache.org/jira/browse/SPARK-26643
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0, 3.0.0
>            Reporter: jiaan.geng
>            Priority: Minor
>


