Never mind, I got the point: Spark replaces the Hive Parquet SerDe with its own implementation.
We should set spark.sql.hive.convertMetastoreParquet=false to use Hive's.
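For example, the property can be passed when launching spark-shell (a minimal sketch; it can equally go in spark-defaults.conf or be set on the SparkSession builder):

```shell
# Tell Spark not to convert Hive metastore Parquet tables to its
# built-in Parquet data source, so the table's own SerDe/InputFormat
# (here Hudi's) is used instead.
spark-shell --conf spark.sql.hive.convertMetastoreParquet=false
```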
Thanks
On Thu, Apr 25, 2019 at 5:00 PM Jun Zhu wrote:
> Hi,
> We are using plugins from Apache Hudi, which defines a custom InputFormat
> for Hive external tables
Hi,
We are using plugins from Apache Hudi, which defines a custom InputFormat
for Hive external tables, created with:
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
WITH SERDEPROPERTIES (
  'serialization.format' = '1'
)
STORED AS
  INPUTFORMAT 'com.uber.hoodie.hadoop.HoodieInp