Hi All,

I was going through the K-Connect stubs created by Chris in the kafka
feature branch.

Here are a few observations (let me know if they are valid or not):

1)
https://github.com/apache/incubator-plc4x/blob/feature/apache-kafka/integrations/apache-kafka/src/main/java/org/apache/plc4x/kafka/source/Plc4xSourceTask.java#L98

Should this block of code be inside an infinite loop like while(true)? I'm
not entirely sure of the semantics of the PlcReader, hence the question.
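For what it's worth, my understanding is that the Connect worker already
calls poll() in its own loop, so something like the rough sketch below might
be enough. readOneBatch() is just a placeholder here, not actual PLC4X API:

import java.util.Collections;
import java.util.List;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

// Rough sketch only: the Connect worker drives poll() in its own loop,
// so the task just returns one batch per invocation.
public abstract class Plc4xPollSketch extends SourceTask {

    @Override
    public List<SourceRecord> poll() throws InterruptedException {
        // readOneBatch() is a placeholder for the actual PlcReader interaction.
        List<SourceRecord> batch = readOneBatch();
        if (batch == null || batch.isEmpty()) {
            // Back off briefly instead of busy-looping; the worker will call poll() again.
            Thread.sleep(500);
            return Collections.emptyList();
        }
        return batch;
    }

    protected abstract List<SourceRecord> readOneBatch() throws InterruptedException;
}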

2) Another question: how many maxTasks do we envision here?
https://github.com/apache/incubator-plc4x/blob/feature/apache-kafka/integrations/apache-kafka/src/main/java/org/apache/plc4x/kafka/Plc4xSourceConnector.java#L46

Also, per the documentation, there's a utility called ConnectorUtils that is
typically used to create the task configs (not a hard and fast rule,
though):

https://docs.confluent.io/current/connect/javadocs/index.html?org/apache/kafka/connect/util/ConnectorUtils.html

If we go that route, we also need to specify how the offsets would be stored
in the offsets topic (e.g. keyed per task). So it would be helpful to figure
out how the connectors are going to be set up; a rough sketch of what I mean
is below.
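Here is a hypothetical sketch of taskConfigs() that splits the configured PLC
queries across maxTasks with ConnectorUtils.groupPartitions. The "queries"
config key and the surrounding class are made up for illustration only:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.kafka.connect.util.ConnectorUtils;

// Hypothetical sketch: one task config per group of PLC queries.
// Assumes at least one query is configured.
public class TaskConfigSketch {

    public static List<Map<String, String>> taskConfigs(List<String> queries,
                                                        Map<String, String> baseConfig,
                                                        int maxTasks) {
        int numGroups = Math.min(queries.size(), maxTasks);
        List<List<String>> grouped = ConnectorUtils.groupPartitions(queries, numGroups);

        List<Map<String, String>> taskConfigs = new ArrayList<>();
        for (List<String> group : grouped) {
            Map<String, String> taskConfig = new HashMap<>(baseConfig); // shared connector settings
            taskConfig.put("queries", String.join(",", group));         // invented key name
            taskConfigs.add(taskConfig);
        }
        return taskConfigs;
    }
}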

3) While building the SourceRecord ->

https://github.com/apache/incubator-plc4x/blob/feature/apache-kafka/integrations/apache-kafka/src/main/java/org/apache/plc4x/kafka/source/Plc4xSourceTask.java#L109

we would also need some kind of DataConverter layer to map the PLC values to
Connect types. Also, which message formats would be supported? JSON, or
binary formats like Avro/Protobuf, or something else? Those things might
also need to be factored in.
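As a starting point, here is a hypothetical sketch of mapping a single PLC
reading onto Connect types; the field names, schema and topic handling are
invented, and the real mapping would come from the DataConverter layer we'd
need to design:

import java.util.Collections;
import java.util.Map;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.source.SourceRecord;

// Hypothetical sketch: wrap one PLC reading in a Connect Struct and SourceRecord.
public class SourceRecordSketch {

    private static final Schema VALUE_SCHEMA = SchemaBuilder.struct()
            .name("org.apache.plc4x.kafka.PlcReading")   // invented schema name
            .field("address", Schema.STRING_SCHEMA)
            .field("value", Schema.INT32_SCHEMA)
            .field("timestamp", Schema.INT64_SCHEMA)
            .build();

    public static SourceRecord toSourceRecord(String topic, String address,
                                              int value, long timestamp) {
        // Source partition/offset maps are what Connect stores in the offsets topic.
        Map<String, ?> sourcePartition = Collections.singletonMap("address", address);
        Map<String, ?> sourceOffset = Collections.singletonMap("timestamp", timestamp);

        Struct struct = new Struct(VALUE_SCHEMA)
                .put("address", address)
                .put("value", value)
                .put("timestamp", timestamp);

        return new SourceRecord(sourcePartition, sourceOffset, topic, VALUE_SCHEMA, struct);
    }
}

Note that the actual wire format (JSON, Avro, etc.) is decided by the key/value
converters configured on the Connect worker, so the task itself only deals
with Connect Schemas and Structs.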

4) Lastly, we need to remove the JdbcSourceTask reference from the catch block here :) ->

https://github.com/apache/incubator-plc4x/blob/feature/apache-kafka/integrations/apache-kafka/src/main/java/org/apache/plc4x/kafka/source/Plc4xSourceTask.java#L67

Thanks!
Sagar.
