chenbodeng719 commented on issue #8279:
URL: https://github.com/apache/hudi/issues/8279#issuecomment-1492970849

   I used the configuration below to test bulk insert, but only one parquet file is produced. Did I miss something? I expected 5 parquet files, since `hoodie.bucket.index.num.buckets` is set to 5. My dataset is about 120 GB.
   ```
   
           CREATE TABLE hbase2hudi_sink(
               uid STRING PRIMARY KEY NOT ENFORCED,
               oridata STRING,
               update_time TIMESTAMP_LTZ(3)
           ) WITH (
               'table.type' = 'MERGE_ON_READ',
               'connector' = 'hudi',
               'path' = '%s',
               'write.operation' = 'bulk_insert',
               'precombine.field' = 'update_time',
               'write.tasks' = '2',
               'index.type' = 'BUCKET',
               'hoodie.bucket.index.hash.field' = 'uid',
               'hoodie.bucket.index.num.buckets' = '5'
           )
   
   ```
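   For context, the write itself is just a plain `INSERT INTO` the sink table, along these lines (the source table `hbase_source` below is a placeholder for my actual upstream table, not the real job definition):
   ```
   -- placeholder source; the real job reads the HBase-derived stream
   INSERT INTO hbase2hudi_sink
   SELECT uid, oridata, update_time
   FROM hbase_source;
   ```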
   <img width="835" alt="image" src="https://user-images.githubusercontent.com/104059106/229291867-c6c4f9fa-1183-4adb-838b-c72684868b6f.png">
   

