The partition value extractor is configurable — which partition field are you using?
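For example, if the table is partitioned by `tenant_id`, the default `SlashEncodedDayPartitionValueExtractor` (which expects `yyyy/mm/dd` paths) can be swapped for `MultiPartKeysValueExtractor` via the `hive_sync.partition_extractor_class` option. A minimal sketch — the table name, columns, path, and metastore URI below are placeholders; only the extractor option is the point:

```sql
CREATE TABLE hudi_tbl (         -- hypothetical table
  id BIGINT,
  name STRING,
  tenant_id STRING
) PARTITIONED BY (tenant_id) WITH (
  'connector' = 'hudi',
  'path' = 'hdfs:///tmp/hudi_tbl',                         -- placeholder path
  'hive_sync.enable' = 'true',
  'hive_sync.mode' = 'hms',
  'hive_sync.metastore.uris' = 'thrift://localhost:9083',  -- placeholder URI
  'hive_sync.partition_fields' = 'tenant_id',
  -- default is SlashEncodedDayPartitionValueExtractor, which expects yyyy/mm/dd;
  -- MultiPartKeysValueExtractor handles key=value style partition paths
  'hive_sync.partition_extractor_class' = 'org.apache.hudi.hive.MultiPartKeysValueExtractor'
);
```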

Danny

仙剑……情动人间 <[email protected]> wrote on Sat, Dec 25, 2021 at 4:41 PM:

>
> Dear All,
>
> *I'm using the HMS template for Flink hive sync. The template is as
> follows:*
>
>
>
> *The application is deployed in Flink-on-YARN mode, but the Flink hive
> sync failed with the following error message:*
>
> org.apache.hudi.exception.HoodieException: Got runtime exception when hive syncing unknown
>     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:120) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.sink.StreamWriteOperatorCoordinator.syncHive(StreamWriteOperatorCoordinator.java:302) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:93) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_261]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_261]
>     at java.lang.Thread.run(Thread.java:748) [?:1.8.0_261]
> Caused by: org.apache.hudi.hive.HoodieHiveSyncException: Failed to sync partitions for table unknown
>     at org.apache.hudi.hive.HiveSyncTool.syncPartitions(HiveSyncTool.java:348) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:195) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:131) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:117) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     ... 5 more
> Caused by: java.lang.IllegalArgumentException: Partition path tenant_id=******************** is not in the form yyyy/mm/dd
>     at org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor.extractPartitionValuesInPath(SlashEncodedDayPartitionValueExtractor.java:55) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HoodieHiveClient.getPartitionEvents(HoodieHiveClient.java:163) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.syncPartitions(HiveSyncTool.java:339) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:195) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:131) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:117) ~[hudi-flink-bundle_2.11-0.10.0.jar:0.10.0]
>     ... 5 more
>
> *The error seems to indicate that the partition field of a Hive table
> synced from Flink must be of date type. Is that right? Any help would be
> appreciated.*
>
> *best regards*
>
> *Luke Yan*
>