GitHub user zhengsg created a discussion: How to configure SparkSQL to access 
the Hive catalog in Gravitino?

I have already configured the Hive Thrift URI in the UI. Following the official 
documentation, starting SparkSQL and running show databases and show tables 
returns results normally. However, select * from xxx and insert into xxx 
operations throw errors with the following information:
WARN ObjectStore:Failed to get database db, returning NoSuchObjectException
[SCHEMA_NOT_FOUND] The schema `db` can not found. Verify the spelling and 
correctness of the schema and catalog.

I have copied the HDFS configuration files to the SparkSQL conf directory. 

The SparkSQL command:
./bin/spark-sql \
  --conf spark.plugins="org.apache.gravitino.spark.connector.plugin.GravitinoSparkPlugin" \
  --conf spark.sql.gravitino.uri=http://127.0.0.1:8090 \
  --conf spark.sql.gravitino.metalake=test \
  --conf spark.sql.gravitino.client.socketTimeoutMs=60000 \
  --conf spark.sql.gravitino.client.connectionTimeoutMs=60000 \
  --conf spark.sql.warehouse.dir=hdfs://nameservice/hive
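A possible reading of the SCHEMA_NOT_FOUND error is that the unqualified name `db` is being resolved against the wrong catalog (e.g. Spark's built-in session catalog instead of the Gravitino-managed Hive catalog). One thing worth trying in the spark-sql session is to switch to the Gravitino catalog explicitly, or to fully qualify the table name. This is only a sketch: the catalog name hive_catalog is an assumption and should be replaced with the actual catalog name registered in the test metalake.

-- Switch to the Gravitino-managed catalog first (name is assumed here)
USE hive_catalog;
USE db;
SELECT * FROM xxx LIMIT 10;

-- Or fully qualify the table with catalog.schema.table
SELECT * FROM hive_catalog.db.xxx LIMIT 10;

If the fully qualified form works while the unqualified form fails, the problem is catalog resolution rather than the Hive/HDFS configuration itself.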

GitHub link: https://github.com/apache/gravitino/discussions/9161
