roryqi commented on issue #10698:
URL: https://github.com/apache/gravitino/issues/10698#issuecomment-4197330487

   > That does not work because Spark isn't recognizing iceberg_nyc as a 
catalog — it's trying to find it as a schema within spark_catalog.
   > 
   > ```
   > spark-sql (public)> use spark_catalog;
   > Time taken: 0.011 seconds
   > spark-sql (default)> use iceberg_nyc;
   > 26/04/07 07:27:59 WARN ObjectStore: Failed to get database iceberg_nyc, returning NoSuchObjectException
   > [SCHEMA_NOT_FOUND] The schema `iceberg_nyc` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog. To tolerate the error on drop use DROP SCHEMA IF EXISTS.
   > spark-sql (default)> use nyc_taxi;
   > 26/04/07 07:28:12 WARN ObjectStore: Failed to get database nyc_taxi, returning NoSuchObjectException
   > [SCHEMA_NOT_FOUND] The schema `nyc_taxi` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog. To tolerate the error on drop use DROP SCHEMA IF EXISTS.
   > spark-sql (default)> show tables;
   > Time taken: 0.036 seconds
   > spark-sql (default)>
   > ```
   > 
   > It does not work the same way with postgres.
   > 
   > ```
   > spark-sql (default)> USE postgres_demo.public;
   > Time taken: 0.176 seconds
   > spark-sql (public)> show tables;
   > customers
   > transactions
   > Time taken: 0.096 seconds, Fetched 2 row(s)
   > spark-sql (public)> SELECT COUNT(*) FROM postgres_demo.public.customers;
   > 10
   > Time taken: 0.906 seconds, Fetched 1 row(s)
   > ```
   
   Could you use these commands instead?
   ```
   USE CATALOG catalog_name
   USE SCHEMA schema_name
   ```
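   Alternatively, the fully qualified `USE` form that already works for the Postgres catalog in the log above could be tried against the Iceberg catalog too. This is only a sketch: `iceberg_nyc` and `nyc_taxi` are the names taken from your session log, and it assumes the Gravitino Spark connector has actually registered `iceberg_nyc` as a Spark catalog (rather than leaving `spark_catalog` as the only one):

   ```sql
   -- Switch catalog and schema in one statement, mirroring the
   -- `USE postgres_demo.public` call that succeeded in your log.
   -- Assumed names: catalog `iceberg_nyc`, schema `nyc_taxi`.
   USE iceberg_nyc.nyc_taxi;
   SHOW TABLES;
   ```

   If that also raises `SCHEMA_NOT_FOUND`, the catalog is likely not registered with Spark at all, which would point at the connector configuration rather than the `USE` syntax.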


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
