Hi,

1. Do you have the full exception stack trace? How did you restore from the checkpoint, and with what command?
2. Yes — the CDC source can only run with parallelism 1. Writing the change stream to Kafka first, and then syncing from Kafka, will work.
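A minimal sketch of that decoupling, in Flink SQL. All table names, the topic name, and the connection options below are illustrative assumptions, not taken from this thread:

```sql
-- Hypothetical DDL: land the MySQL change stream in Kafka first,
-- then run the (parallelizable) join job against the Kafka topic.

-- CDC source: its parallelism is fixed at 1.
CREATE TABLE order_src (
  id BIGINT,
  sku_id BIGINT
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',        -- assumed
  'username' = 'flink',            -- assumed
  'password' = '...',
  'database-name' = 'shop',        -- assumed
  'table-name' = 'order'
);

-- Kafka staging table carrying the changelog.
CREATE TABLE order_kafka (
  id BIGINT,
  sku_id BIGINT
) WITH (
  'connector' = 'kafka',
  'topic' = 'order-changelog',     -- assumed topic name
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'changelog-json'      -- format that can encode CDC updates
);

-- Job 1 (parallelism 1): copy CDC stream into Kafka.
INSERT INTO order_kafka SELECT id, sku_id FROM order_src;

-- Job 2: read from Kafka with parallelism.default = 5 and do the
-- temporal join / dimension-table lookup there.
```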
Best,
Jark

On Fri, 11 Sep 2020 at 17:56, 引领 <yrx73...@163.com> wrote:

> 1. After taking a checkpoint, restoring from it fails with:
> org.apache.kafka.connect.errors.ConnectException:
> com.github.shyiko.mysql.binlog.event.deserialization.EventDataDeserializationException:
> Failed to deserialize data of EventHeaderV4{timestamp=1599815908000,
> eventType=EXT_UPDATE_ROWS, serverId=501, headerLength=19, dataLength=25879,
> nextPosition=721073164, flags=0}
>
> 2. Flink CDC reads the data, performs a join (a dimension-table lookup),
> and writes the result to MySQL. The parallelism cannot be raised; it
> stays at 1. I have already made the corresponding settings in the
> configuration, including for sql-client:
>
> taskmanager.numberOfTaskSlots: 5
> parallelism.default: 5
>
> My SQL is:
>
> INSERT INTO orders
> SELECT * FROM order o
> JOIN sku FOR SYSTEM_TIME AS OF o.proc_time AS s
> ON o.sku_id = s.id
>
> Thanks in advance for any replies.