When I execute the command below in beeline or pyspark, the table metadata is stored successfully in the Hive metastore, but with the following warning:

CREATE EXTERNAL TABLE testtable USING DELTA LOCATION 
's3a://path/to/delta/delta-folder/'

WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data 
source provider delta. Persisting data source table `testdb`.`testtable` into 
Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.

Is there any way to save the external table in a format that the Hive metastore 
supports, so that the warning above is not thrown when using beeline/pyspark?

I can't use STORED BY 'io.delta.hive.DeltaStorageHandler', because I create the 
external table using pyspark rather than PyHive, and I would like the table to 
remain compatible with Spark.
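For context, this is roughly how the pyspark session is configured when creating the table (a sketch; the delta-core version number is a placeholder, and the S3 path is the one from the question):

```shell
# Launch pyspark with the Delta Lake package and the Delta SQL extensions
# (version number below is a placeholder - match it to your Spark version)
pyspark --packages io.delta:delta-core_2.12:1.0.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"

# Then, inside the session:
# spark.sql("CREATE EXTERNAL TABLE testtable USING DELTA LOCATION 's3a://path/to/delta/delta-folder/'")
```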
