Hi Javier,

As far as I understand, the reason for this separation is that Connect has separate producers and consumers for its internal mechanics (like Kafka-backed configuration/status/offset storage) and for tasks. It uses some non-prefixed configurations for both the tasks' consumers and producers (like "bootstrap.servers"), but the rest should pretty much be prefixed with "consumer." or "producer." (code [1] <https://github.com/apache/kafka/blob/89f331eac3aaeab53a3b36bc437eba5f6213ca91/connect/runtime/src/main/java/org/apache/kafka/connect/runtime/Worker.java#L538> and [2] <https://github.com/apache/kafka/blob/89f331eac3aaeab53a3b36bc437eba5f6213ca91/connect/runtime/src/main/java/org/apache/kafka/connect/runtime/Worker.java#L569>, if somebody is interested).
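For completeness, a minimal worker properties sketch for a SASL_SSL cluster (hostnames and credentials below are placeholders, not values from this thread) would repeat the security settings three times: once un-prefixed for the worker's internal clients, and once each under the "consumer." and "producer." prefixes for the task clients:

bootstrap.servers=broker1.example.com:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="secret";
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="secret";
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="secret";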
Br,
Ivan

On Tue, 4 Jun 2019 at 13:26, Javier Arias Losada <[email protected]> wrote:

> Hello Andrew,
> yes, it fixed it! Thank you! I was missing the "consumer." configuration...
> So here we seem to be configuring different Kafka clients... What is the
> reason for this? The configs prefixed with "consumer." are for the consumers
> I'm trying to connect with sinks, and the others are for internal
> management of Kafka Connect, such as storing offsets, configs, etc.?
>
> Thanks for your help,
> Javier Arias
>
> On Tue, 4 Jun 2019 at 11:50, Andrew Schofield (<[email protected]>) wrote:
>
> > Hi,
> > The thing that always seems to catch people out with this is that it's
> > necessary to repeat the SSL/SASL configuration.
> >
> > For a sink connector, you need something like:
> > security.protocol=SASL_SSL
> > ssl.protocol=TLSv1.2
> > sasl.mechanism=PLAIN
> > sasl.jaas.config=...
> >
> > And you also need the same with "consumer." prefixed on each of the
> > configuration items:
> > consumer.security.protocol=SASL_SSL
> > consumer.ssl.protocol=TLSv1.2
> > consumer.sasl.mechanism=PLAIN
> > consumer.sasl.jaas.config=...
> >
> > Hope this helps.
> >
> > Andrew Schofield
> >
> > From: Javier Arias Losada <[email protected]>
> > Reply-To: "[email protected]" <[email protected]>
> > Date: Tuesday, 4 June 2019 at 10:36
> > To: "[email protected]" <[email protected]>
> > Subject: KafkaConnect not consuming from SSL/SASL cluster
> >
> > Hello there,
> >
> > We are trying to use Kafka Connect, but it isn't consuming any messages
> > after switching to an SSL and authenticated Kafka cluster.
> >
> > With a cluster without SSL or authentication it works perfectly fine
> > with the same configuration, except for the SASL/SSL settings.
> >
> > I think it is probably a small config error, but I've been struggling to
> > fix it, so your help will be very much appreciated.
> >
> > No errors appear in the logs, and it seems to connect properly, since if I
> > intentionally change connection parameters to wrong values
> > (usr/pass/truststore or IPs) I see errors.
> >
> > In order to check the SSL configuration, I ran
> > kafka-console-producer.sh and kafka-console-consumer.sh with the very same
> > .properties and hosts, successfully sending/receiving messages.
> >
> > In order to isolate the problem from the Kafka Connect connector, I
> > developed my own simple connector that just prints calls to the console,
> > but the behavior is the same: it runs OK when connecting to non-SSL
> > brokers, but does not receive messages or print errors when connecting
> > to an SSL-enabled broker.
> >
> > The plugin I was trying to use is aiven-kafka-s3-connector, and I tried
> > to run it with connect-distributed.sh.
> >
> > Thank you very much.
> >
> > Best,
> > Javier Arias
