JoshuaZhuCN opened a new issue, #7351: URL: https://github.com/apache/hudi/issues/7351
**Describe the problem**

The `hoodie.datasource.write.keygenerator.class` value set when creating a table with Spark SQL does not take effect in `hoodie.properties`. For example, we set the value to `ComplexKeyGenerator`, but it ends up as `SimpleKeyGenerator` in `hoodie.properties`.

**To Reproduce**

Steps to reproduce the behavior:

1. Execute a CREATE TABLE DDL like:

```scala
spark.sql("drop table if exists `default`.`spark_0_12_1_test`")
spark.sql(
  s"""|
      |CREATE TABLE IF NOT EXISTS `default`.`spark_0_12_1_test` (
      |  `id` INT
      |  ,`name` STRING
      |  ,`age` INT
      |  ,`sync_time` TIMESTAMP
      |) USING HUDI
      |TBLPROPERTIES (
      |  `type` = 'mor'
      |  ,`primaryKey` = 'id'
      |  ,`preCombineField` = 'sync_time'
      |  ,`hoodie.index.bucket.engine` = 'CONSISTENT_HASHING'
      |  ,`hoodie.bucket.index.hash.field` = 'id'
      |  ,`hoodie.index.type` = 'BUCKET'
      |  ,`hoodie.datasource.write.hive_style_partitioning` = 'false'
      |  ,`hoodie.storage.layout.type` = 'BUCKET'
      |  ,`hoodie.storage.layout.partitioner.class` = 'org.apache.hudi.table.action.commit.SparkBucketIndexPartitioner'
      |  ,`hoodie.datasource.write.keygenerator.class` = 'org.apache.hudi.keygen.ComplexKeyGenerator'
      |  ,`hoodie.compaction.payload.class` = 'org.apache.hudi.common.model.OverwriteWithLatestAvroPayload'
      |)
      |COMMENT 'test_0.12.1'
      |PARTITIONED BY (id)
      |""".stripMargin
)
```

2. View the `hoodie.properties` file and find that its value is not the value specified in the DDL:

![image](https://user-images.githubusercontent.com/62231347/204973200-fc8ecc9b-c2bc-41aa-ba60-152f85234d6e.png)

**Environment Description**

* Hudi version : 0.12.1
* Spark version : 3.1.3
* Hive version : 3.1.1
* Hadoop version : 3.1.0
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : no
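For reference, `hoodie.properties` is a plain Java properties file stored under the table's `.hoodie` directory, so the persisted value can also be checked programmatically instead of eyeballing the file. A minimal sketch, assuming the persisted key name is `hoodie.table.keygenerator.class` (as seen in Hudi 0.12-era tables); the sample content below stands in for the real file read from `<base path>/.hoodie/hoodie.properties`:

```scala
import java.io.StringReader
import java.util.Properties

// Sample content standing in for the real hoodie.properties file; in
// practice one would read <base path>/.hoodie/hoodie.properties from
// HDFS instead (path and key name are assumptions).
val sample =
  """hoodie.table.name=spark_0_12_1_test
    |hoodie.table.keygenerator.class=org.apache.hudi.keygen.SimpleKeyGenerator
    |""".stripMargin

val props = new Properties()
props.load(new StringReader(sample))

// With the reported bug, this yields SimpleKeyGenerator even though the
// DDL requested ComplexKeyGenerator.
val persisted = props.getProperty("hoodie.table.keygenerator.class")
println(persisted)
```

Comparing `persisted` against the value passed in `TBLPROPERTIES` makes the mismatch easy to assert in a regression test.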