[ 
https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-26643.
-------------------------------
    Resolution: Not A Problem

> Spark Hive throws an AnalysisException when table properties are set, but the 
> exception message contains one typo and one ill-suited word.
> -----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26643
>                 URL: https://issues.apache.org/jira/browse/SPARK-26643
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0, 3.0.0
>            Reporter: jiaan.geng
>            Priority: Minor
>
> When I execute a DDL statement in spark-sql, it throws an AnalysisException:
> {code:java}
> spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
> org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into 
> hive metastore as table property keys may not start with 'spark.sql.': 
> [spark.sql.partitionProvider];
> at 
> org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code}
> The exception message contains one typo ("persistent" should be "persist") and 
> one ill-suited word ("hive" should be capitalized as "Hive").
> So I think this AnalysisException message should change from
> {code:java}
> throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into 
> hive metastore " +
>  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
>  invalidKeys.mkString("[", ", ", "]")){code}
> to
> {code:java}
> throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive 
> metastore " +
>  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
>  invalidKeys.mkString("[", ", ", "]")){code}
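For context, the validation that produces this message can be sketched as follows. This is a simplified, hypothetical Java rendition of the check in HiveExternalCatalog.verifyTableProperties (class and method names here are illustrative, not the actual Scala source), using the corrected wording:

```java
import java.util.List;
import java.util.stream.Collectors;

public class TablePropertyCheck {
    // Reserved prefix, as in the exception message above.
    static final String SPARK_SQL_PREFIX = "spark.sql.";

    // Collects the property keys that use the reserved prefix.
    static List<String> invalidKeys(List<String> keys) {
        return keys.stream()
                   .filter(k -> k.startsWith(SPARK_SQL_PREFIX))
                   .collect(Collectors.toList());
    }

    // Rejects the table if any key is reserved; the real code throws an
    // AnalysisException, here approximated by IllegalArgumentException.
    static void verifyTableProperties(String qualifiedName, List<String> keys) {
        List<String> invalid = invalidKeys(keys);
        if (!invalid.isEmpty()) {
            throw new IllegalArgumentException(
                "Cannot persist " + qualifiedName + " into Hive metastore "
                + "as table property keys may not start with '" + SPARK_SQL_PREFIX + "': "
                + invalid);
        }
    }
}
```

With this check, a table carrying `spark.sql.partitionProvider` in its properties is rejected with the corrected "Cannot persist ... into Hive metastore" message.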



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
