Hi, I am trying to connect to the Spark Thrift Server to create an external table. In my table DDL, I have a table property 'spark.sql.sources.provider' = 'parquet', but I am getting an error: "Cannot persist into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.sources.provider]".
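For context, the failing statement is roughly of this shape (table name, columns, and location are placeholders, not from the original post):

```sql
-- Hypothetical DDL submitted over JDBC to the Spark Thrift Server.
-- The spark.sql.* entry in TBLPROPERTIES is what triggers the error.
CREATE EXTERNAL TABLE my_table (
  id INT,
  name STRING
)
STORED AS PARQUET
LOCATION '/data/my_table'
TBLPROPERTIES (
  'spark.sql.sources.provider' = 'parquet'
);
```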
However, when I create an external table in spark-shell using the spark.catalog.createExternalTable() API and then look at the table definition via beeline using "show create table", I see these table properties:

TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'numFiles'='0',
  'numRows'='-1',
  'rawDataSize'='-1',
  'spark.sql.sources.provider'='parquet',
  'spark.sql.sources.schema.numParts'='1',

Can someone explain why creating the external table via JDBC to the Spark Thrift Server complains about the spark.sql table properties?

Thanks.
Antonio.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark2-create-hive-external-table-tp29118.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
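For reference, a minimal sketch of the spark-shell approach described above (table name and path are placeholders; spark.catalog.createExternalTable is the Spark 2.x Catalog API, later deprecated in favor of createTable):

```scala
// In spark-shell (Spark 2.x); "my_table" and the path are placeholders.
// This registers an external Parquet table in the Hive metastore and,
// as a side effect, Spark itself writes the spark.sql.sources.* table
// properties (provider, schema parts) into the metastore.
spark.catalog.createExternalTable(
  "my_table",        // table name in the metastore
  "/data/my_table",  // location of the Parquet files
  "parquet"          // data source provider
)

// Inspect what was stored, analogous to "show create table" in beeline:
spark.sql("SHOW CREATE TABLE my_table").show(truncate = false)
```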