delta-io/connectors

external table using delta format

cometta opened this issue · 1 comment

When I execute the command below in Beeline or PySpark, the table metadata is stored successfully in the Hive metastore, but with the following warning:

CREATE EXTERNAL TABLE testtable USING DELTA LOCATION 's3a://path/to/delta/delta-folder/'
WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider delta. Persisting data source table `testdb`.`testtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
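For context, a minimal PySpark sketch of how this DDL is typically issued. The session settings are the standard Delta Lake extension and catalog options from the Delta docs; the application name and S3 path are placeholders.

from pyspark.sql import SparkSession

# Standard Delta Lake session configuration (SQL extension + catalog);
# the app name and the table location are placeholders.
spark = (
    SparkSession.builder
    .appName("delta-external-table")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# The statement succeeds, but HiveExternalCatalog logs the warning above:
# the metastore entry is persisted in a Spark-specific format that Hive
# itself cannot read.
spark.sql(
    "CREATE EXTERNAL TABLE testtable USING DELTA "
    "LOCATION 's3a://path/to/delta/delta-folder/'"
)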

Is there any way to save the external table in a format that the Hive metastore supports, so that this warning is not thrown when working from Beeline/PySpark?

I can't use STORED BY 'io.delta.hive.DeltaStorageHandler' because I create the external table using PySpark, not PyHive. I would like the table to be compatible with Spark and also queryable through PyHive.
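For reference, the STORED BY route mentioned above would look roughly like this through PyHive, assuming the Delta Hive connector jar is available on the Hive server. The host, port, and column list are hypothetical; the DeltaStorageHandler requires the columns to be declared and to match the Delta table's schema.

from pyhive import hive

# Hypothetical HiveServer2 connection parameters.
conn = hive.connect(host="hiveserver2.example.com", port=10000)
cursor = conn.cursor()

# DDL using the Delta Hive connector's storage handler. The column
# list is a made-up example; it must match the Delta table's schema.
cursor.execute("""
    CREATE EXTERNAL TABLE testtable (
      id INT,
      name STRING
    )
    STORED BY 'io.delta.hive.DeltaStorageHandler'
    LOCATION 's3a://path/to/delta/delta-folder/'
""")

A table created this way is queryable from Hive/PyHive, but it is not the same registration Spark produces with USING DELTA, which is exactly the compatibility gap the question is about.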

This is not supported today. We have an issue tracking this: delta-io/delta#1045. Please add new comments there; I'm going to close this one.