Hi Lei

To my understanding, from the GitHub page you provided, the 
kafka-connect-jdbc project is licensed under the Confluent Community 
License<https://github.com/confluentinc/kafka-connect-jdbc/blob/master/LICENSE>.

The project has its own support channels:

"For more information, check the documentation for the JDBC connector on the 
confluent.io<https://docs.confluent.io/current/connect/kafka-connect-jdbc/index.html>
 website. Questions related to the connector can be asked on Community 
Slack<https://launchpass.com/confluentcommunity> or the Confluent Platform 
Google Group<https://groups.google.com/forum/#!topic/confluent-platform/>."


________________________________
From: wangl...@geekplus.com.cn <wangl...@geekplus.com.cn>
Sent: Monday, May 11, 2020 6:10 AM
To: users <users@kafka.apache.org>
Subject: How the jdbc sink connector actually runs

doc:  
https://docs.confluent.io/3.1.1/connect/connect-jdbc/docs/sink_connector.html
github code:  https://github.com/confluentinc/kafka-connect-jdbc

I glanced over the code.
It seems the actual worker is JdbcSinkTask: after put(Collection<SinkRecord> 
records) is called, it generates SQL and executes it.

I want to implement this in Java code instead of through Kafka Connect configuration:
receive Avro-serialized Kafka records and send them to JdbcSinkTask.

Is there any document that gives more details, or some example code?
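Not an official example, but based on the public SinkTask lifecycle (start / put / flush / stop), driving JdbcSinkTask by hand might look roughly like the sketch below. The connection.url, table/topic names, and field names are made up for illustration; whether JdbcSinkTask works fully outside the Connect framework (e.g. without a SinkTaskContext) is an assumption you would need to verify:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;

import io.confluent.connect.jdbc.sink.JdbcSinkTask;

public class ManualJdbcSink {
    public static void main(String[] args) {
        // Connector config, same keys as in the documented sink connector
        // properties file (URL/credentials here are placeholders)
        Map<String, String> props = new HashMap<>();
        props.put("connection.url", "jdbc:mysql://localhost:3306/test");
        props.put("connection.user", "user");
        props.put("connection.password", "password");
        props.put("auto.create", "true");

        JdbcSinkTask task = new JdbcSinkTask();
        task.start(props); // SinkTask lifecycle: start with the connector config

        // A Connect Struct, i.e. what the AvroConverter would hand you after
        // deserializing an Avro-encoded record from Kafka
        Schema valueSchema = SchemaBuilder.struct().name("demo")
                .field("id", Schema.INT32_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .build();
        Struct value = new Struct(valueSchema)
                .put("id", 1)
                .put("name", "lei");

        SinkRecord record = new SinkRecord(
                "demo-topic", 0,     // topic, partition
                null, null,          // key schema, key (none)
                valueSchema, value,  // value schema, value
                42L);                // offset

        task.put(Collections.singletonList(record)); // builds and executes the SQL
        task.flush(new HashMap<>());                 // force the batched write
        task.stop();
    }
}
```

For the Avro part, the usual path in Connect is that io.confluent.connect.avro.AvroConverter turns the raw bytes from the consumer into the schema-and-value pair used above, so you would likely call its toConnectData(topic, bytes) before constructing the SinkRecord rather than parsing Avro yourself.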

Thanks,
Lei



wangl...@geekplus.com.cn
