Hi team,

1. When I run a LOAD DATA command on the source MySQL against a CSV file with 1 million records, I see SeaTunnel replay it as 1 million individual insert operations on the destination MySQL after the load completes. Is there an option to have SeaTunnel perform batch inserts instead of row-by-row inserts for better efficiency?
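For context, a sketch of what I would expect the sink side to look like: the SeaTunnel JDBC sink documents a batch_size option that buffers rows and flushes them in batches. The connection details below are placeholders, and exact option names may differ between SeaTunnel versions:

```hocon
sink {
  Jdbc {
    # Placeholder connection settings for the destination MySQL
    url = "jdbc:mysql://localhost:3306/sink_db"
    driver = "com.mysql.cj.jdbc.Driver"
    user = "root"
    password = "******"
    # Buffer rows and flush them in batches of this size
    # instead of issuing one INSERT per row
    batch_size = 1000
  }
}
```

Does setting batch_size like this apply to the CDC replay path as well, or only to batch-mode jobs?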
2. When the SeaTunnel application goes down during a bulk load or insert workload, the source and sink MySQL databases end up out of sync after restarting SeaTunnel, and the following error is reported:

Caused by: io.debezium.DebeziumException: A replica with the same server_uuid/server_id as this replica has connected to the source; the first event 'binlog.000003' at 324102180, the last event read from './binlog.000003' at 126, the last byte read from './binlog.000003' at 324102180. Error code: 1236; SQLSTATE: HY000.

The configuration file is attached below.
mysql-cdc-exactly-once-test.conf
Description: Binary data
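In case it is relevant, the MySQL-CDC source exposes a server-id option, and my understanding is that the error above means two replication clients registered with the same server id. A sketch of pinning a unique id (or range) on the source, with placeholder connection values; option names are taken from the MySQL-CDC connector docs and may vary by version:

```hocon
source {
  MySQL-CDC {
    # Placeholder connection settings for the source MySQL
    base-url = "jdbc:mysql://localhost:3306/source_db"
    username = "root"
    password = "******"
    table-names = ["source_db.orders"]
    # Assign a unique server id (or range, one id per parallel reader)
    # so this client does not collide with another replica that
    # registered the same server_uuid/server_id
    server-id = "5400-5408"
  }
}
```

Should I be setting server-id explicitly like this to avoid the conflict after a restart, or is the restart expected to reuse the previous registration cleanly?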
