Hi, I have a couple of questions from trying to use Hudi in my company's production environment:
1. When migrating existing history tables on HDFS, I tried the hudi-cli and the HDFSParquetImporter tool. How can I pass Spark parameters to this tool, such as the YARN queue? (A rough sketch of the kind of settings I mean follows my sign-off.)

2. Hudi needs to write table metadata to Hive, and it does so through HiveMetastoreClient and Hive JDBC. What should I do if the Hive cluster requires Kerberos authentication? (A second sketch is below as well.)

Thanks.

Best,
Qian
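P.S. To make question 1 concrete, here is a minimal sketch of the kind of Spark settings I would like the import job to run with. The queue name, memory sizes, and app name are all made up for illustration; I am only using standard Spark-on-YARN properties (spark.yarn.queue etc.), and I am not sure how, or whether, hudi-cli / HDFSParquetImporter lets me pass them through.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ImporterSparkConfSketch {
    public static void main(String[] args) {
        // Standard Spark-on-YARN settings I would like the import job to use.
        // "root.etl" is a made-up queue name, purely for illustration.
        SparkConf conf = new SparkConf()
                .setAppName("hudi-history-table-import")
                .setMaster("yarn")
                .set("spark.yarn.queue", "root.etl")
                .set("spark.executor.memory", "4g")
                .set("spark.executor.instances", "10");

        // If I were driving the import from my own Spark application,
        // I would build the context like this and hand it to the tool.
        JavaSparkContext jsc = new JavaSparkContext(conf);
        try {
            // ... invoke the import logic here ...
        } finally {
            jsc.stop();
        }
    }
}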
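And for question 2, this is roughly how I connect to our Kerberized HiveServer2 from plain Java today (the principal, keytab path, host, and database are placeholders). What I am unsure about is how to get Hudi's Hive sync, which uses HiveMetastoreClient and Hive JDBC internally, to authenticate in an equivalent way.

import java.sql.Connection;
import java.sql.DriverManager;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Tell the Hadoop security layer that the cluster uses Kerberos.
        Configuration hadoopConf = new Configuration();
        hadoopConf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(hadoopConf);

        // Log in from a keytab; principal and keytab path are placeholders.
        UserGroupInformation.loginUserFromKeytab(
                "etl_user@EXAMPLE.COM", "/etc/security/keytabs/etl_user.keytab");

        // HiveServer2 JDBC URL carrying the server principal, as Hive JDBC
        // expects for Kerberos; host, port, and principal are placeholders.
        String url = "jdbc:hive2://hive-server.example.com:10000/default;"
                + "principal=hive/_HOST@EXAMPLE.COM";

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}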
