[jira] [Updated] (SPARK-26643) Spark Hive throws an AnalysisException when setting table properties, but this AnalysisException contains one typo and one unsuited word.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Description:

When I execute a DDL in spark-sql, an AnalysisException is thrown, as follows:

{code:java}
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129)
{code}

The message of this exception contains one typo ("persistent" should be "persist") and one unsuited word ("hive" should be capitalized as "Hive"). So I think this AnalysisException should change from

{code:java}
throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]"))
{code}

to

{code:java}
throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]"))
{code}

  was:

When I execute a DDL in spark-sql, an AnalysisException is thrown, as follows:

{code:java}
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129)
{code}

I found that the message of this exception contains two typos; one is "persistent".

What is the function of the method named verifyTableProperties? I checked the comment of this method, which contains:

{code:java}
/**
 * If the given table properties contains datasource properties, throw an exception. We will do
 * this check when create or alter a table, i.e. when we try to write table metadata to Hive
 * metastore.
 */
{code}

So I think this AnalysisException should change from

{code:java}
throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]"))
{code}

to

{code:java}
throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
  invalidKeys.mkString("[", ", ", "]"))
{code}

> Spark Hive throws an AnalysisException when setting table properties, but this
> AnalysisException contains one typo and one unsuited word.
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-26643
>                 URL: https://issues.apache.org/jira/browse/SPARK-26643
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0, 3.0.0
>            Reporter: jiaan.geng
>            Priority: Minor

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
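For context, the check that produces this message, HiveExternalCatalog.verifyTableProperties, rejects any table property key that starts with the reserved 'spark.sql.' prefix before writing table metadata to the Hive metastore. A minimal standalone sketch of that validation, with the corrected wording, might look as follows (names are simplified, and a plain IllegalArgumentException stands in for Spark's AnalysisException so the sketch has no Spark dependency):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class VerifyTableProperties {
    static final String SPARK_SQL_PREFIX = "spark.sql.";

    // Simplified stand-in for HiveExternalCatalog.verifyTableProperties:
    // reject any table property key under the reserved "spark.sql." prefix.
    // The real method throws org.apache.spark.sql.AnalysisException.
    static void verifyTableProperties(String qualifiedName, Map<String, String> properties) {
        List<String> invalidKeys = properties.keySet().stream()
                .filter(k -> k.startsWith(SPARK_SQL_PREFIX))
                .sorted()
                .collect(Collectors.toList());
        if (!invalidKeys.isEmpty()) {
            // Corrected wording: "persist" (not "persistent"), "Hive" capitalized.
            throw new IllegalArgumentException(
                    "Cannot persist " + qualifiedName + " into Hive metastore "
                            + "as table property keys may not start with '" + SPARK_SQL_PREFIX + "': "
                            + invalidKeys.stream().collect(Collectors.joining(", ", "[", "]")));
        }
    }

    public static void main(String[] args) {
        verifyTableProperties("work.ok", Map.of("test", "test")); // passes silently
        try {
            verifyTableProperties("work.gja_test3", Map.of("spark.sql.partitionProvider", "catalog"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With a user-supplied key such as 'test' the check passes, while the internal key 'spark.sql.partitionProvider' (present on the table in the bug report because Spark stores its partition-provider marker there) trips the error shown in the stack trace above.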
[jira] [Updated] (SPARK-26643) Spark Hive throws an AnalysisException when setting table properties, but this AnalysisException contains one typo and one unsuited word.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Summary: Spark Hive throw an AnalysisException,when set table properties.But this AnalysisException contains one typo and one unsuited word.  (was: Spark Hive throw an AnalysisException,when set table properties.But this AnalysisException contains two typo.)
[jira] [Updated] (SPARK-26643) Spark Hive throw an AnalysisException,when set table properties.But this AnalysisException contains two typo.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Summary: Spark Hive throw an AnalysisException,when set table properties.But this AnalysisException contains two typo.  (was: Spark Hive throw an analysis exception,when set table properties.But this )
[jira] [Updated] (SPARK-26643) Spark Hive throw an analysis exception,when set table properties.But this
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Summary: Spark Hive throw an analysis exception,when set table properties.But this (was: Spark hive throwing an incorrect analysis exception,when set table properties.)
> -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,throwing a AnalysisException as follows: > {code:java} > spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); > org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into > hive metastore as table property keys may not start with 'spark.sql.': > [spark.sql.partitionProvider]; > at > org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code} > I found the message of this exception is not orrect. > What is the function of the method named verifyTableProperties ?I check the -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive throwing an incorrect analysis exception,when set table properties.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Description: When I execute a DDL in spark-sql,throwing a AnalysisException as follows: {code:java} spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider]; at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code} I found the message of this exception is not orrect. was: When I execute a DDL in spark-sql,throwing a AnalysisException as follows: I found the message of this exception {code:java} spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider]; at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code} > Spark hive throwing an incorrect analysis exception,when set table properties. > -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,throwing a AnalysisException as follows: > {code:java} > spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); > org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into > hive metastore as table property keys may not start with 'spark.sql.': > [spark.sql.partitionProvider]; > at > org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code} > I found the message of this exception is not orrect. 
-- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive throwing an incorrect analysis exception,when set table properties.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Description: When I execute a DDL in spark-sql,throwing a AnalysisException as follows: {code:java} spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); {code} was:When I execute a DDL in spark-sql,throwing a AnalysisException as follows: > Spark hive throwing an incorrect analysis exception,when set table properties. > -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,throwing a AnalysisException as follows: > {code:java} > spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test'); > {code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive throwing an incorrect analysis exception,when set table properties.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Description: When I execute a DDL in spark-sql,throwing a AnalysisException as follows: (was: When I execute a DDL in spark-sql,throwing a AnalysisException ) > Spark hive throwing an incorrect analysis exception,when set table properties. > -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,throwing a AnalysisException as follows: -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive throwing an incorrect analysis exception,when set table properties.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Description: When I execute a DDL in spark-sql,a AnalysisException (was: When I execute a DDL in spark-sql,a AnalysisException) > Spark hive throwing an incorrect analysis exception,when set table properties. > -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,a AnalysisException -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive throwing an incorrect analysis exception,when set table properties.
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Summary: Spark hive throwing an incorrect analysis exception,when set table properties. (was: Spark hive ) > Spark hive throwing an incorrect analysis exception,when set table properties. > -- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,a AnalysisException -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-26643) Spark hive
[ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] jiaan.geng updated SPARK-26643: --- Description: When I execute a DDL in spark-sql,a AnalysisException > Spark hive > --- > > Key: SPARK-26643 > URL: https://issues.apache.org/jira/browse/SPARK-26643 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 2.3.0, 2.4.0, 3.0.0 >Reporter: jiaan.geng >Priority: Minor > > When I execute a DDL in spark-sql,a AnalysisException -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org