From: Awadhesh Gupta
Sent: Friday, September 29, 2017 7:05 AM
To: users@kafka.apache.org; manme...@gmail.com
Subject: Re: SSL setup in Kafka 2.10.0.10.2.1 for keystore and truststore files
Thanks M Manna.
I followed the steps to recreate the keystore & truststore
Writing to a file is much slower than reading from a Kafka topic, and I would
like to know if there is an existing connector to help do this. I need to write
each message to a text file almost as soon as it is read from the Kafka topic,
but after a while the writing falls well behind. Any advice or insight would be
appreciated.
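For what it's worth, Apache Kafka already ships a FileStreamSink connector for
exactly this kind of topic-to-text-file copy. A minimal sketch of a standalone
setup (file, topic, and connector names are placeholders):

# file-sink.properties (placeholder values)
name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=/tmp/output.txt
topics=my-input-topic

# run it with the standalone worker
bin/connect-standalone.sh config/connect-standalone.properties file-sink.properties

Whether it keeps up with the topic's throughput is a separate question, but it
covers the "existing connector" part.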
That's correct: If EOS is enabled, we enforce some producer configs:
https://github.com/apache/kafka/blob/0.11.0.1/streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java#L678-L688
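For context, a minimal sketch of how exactly-once gets turned on in a Streams
app (application id and bootstrap servers are placeholders); it is this setting
that makes Streams enforce the producer configs linked above:

import java.util.Properties;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");    // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
// enabling EOS; idempotence, retries, etc. are then set internally
props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE);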
Hi,
I have the FileStreamSinkConnector working perfectly fine in distributed mode
when only good messages are being sent to the input event topic.
However, if I send a message that is bad - for example, not in correct JSON
format - and I am using the JSON converter for keys/values as follows:
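The converter settings referred to above would typically look something like
this in the worker or connector config (a sketch; whether schemas.enable is
true or false depends on the setup):

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

As far as I know, in these versions a record that isn't valid JSON makes the
converter throw and the task fail; there is no built-in skip/dead-letter option
yet.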
See the instructions at https://kafka.apache.org/contact
On Fri, Sep 29, 2017 at 7:01 AM, Alex.Chen wrote:
> subscription
>
subscription
Ah, I see - yes, that's one option: just point Connect at an already
existing and running Kafka cluster.
Would there be issues with a potential version mismatch of dependencies between
Kafka Connect (which uses, say, newer libs from the Confluent 3.3 distribution)
and an Apache Kafka cluster of an earlier version?
You can choose to run just Kafka Connect from the Confluent platform (just
run the Kafka Connect shell script(s)) and configure the connectors to
point towards your Kafka installation. The Confluent platform uses vanilla
Kafka under the covers, but there isn't anything requiring you to run the
Kafka found in that distribution.
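As a sketch of that setup (host names, topics, and paths are placeholders), the
distributed Connect worker only needs its bootstrap.servers pointed at the
existing cluster:

# connect-distributed.properties (excerpt, placeholder values)
bootstrap.servers=existing-kafka-1:9092,existing-kafka-2:9092
group.id=connect-cluster
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status

# start only the Connect worker
bin/connect-distributed etc/kafka/connect-distributed.properties
# (Confluent layout; in Apache Kafka it is
#  bin/connect-distributed.sh config/connect-distributed.properties)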
Thanks, Stephen,
Yes, I have no problem getting the Confluent distribution working.
But my goal right now is to use the vanilla Apache Kafka distribution which we
already have set up and fully automated in our environments. Switching to
another distribution would invalidate most of that work, and
You can set ProducerConfig.RETRIES_CONFIG in your StreamsConfig properties, e.g.:

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

Properties props = new Properties();
props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE); // forwarded to the internal producer
...
On Fri, 29 Sep 2017 at 13:17 Sameer Kumar wrote:
> I guess once stream app are enabled exactly-once, producer idempotence g
The Confluent platform download comes pre-configured so that everything works
right after unzipping, including any dependencies it needs to run. The shell
script that starts up the Kafka Connect worker ensures that everything that
needs to be on the classpath is on the classpath.
Thanks, Matthias,
The "download" part is what I am not clear about.
I don't see a separate download for just a connector, only for the whole
Confluent platform.
Is it just a matter of taking one jar, say, the
kafka-connect-elasticsearch-3.3.0.jar from the
.../confluent-3.3.0/share/java/kafka-
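If it helps, one approach that should work (a sketch; paths are placeholders,
and newer Connect versions also have plugin.path for this) is to copy the
connector's whole directory out of the Confluent download and put it on the
classpath of the vanilla Kafka Connect worker:

# take the connector jars plus their dependencies from the Confluent download
cp -r confluent-3.3.0/share/java/kafka-connect-elasticsearch /opt/connect-plugins/

# make them visible to the Apache Kafka Connect worker and start it
# (the * is a JVM wildcard classpath entry matching all jars in that directory)
export CLASSPATH=/opt/connect-plugins/kafka-connect-elasticsearch/*
bin/connect-distributed.sh config/connect-distributed.properties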
This normally means that the truststore in your producer doesn't contain a)
the public key of your broker or b) the public keys of the CA which signed
the broker key. With this error, it didn't even get to verifying the
client certificate yet. Looking at the blog post, it looks like there i
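A typical fix, following the standard Kafka SSL setup steps (file names and
alias are placeholders), is to import the CA certificate that signed the broker
key into the client's truststore:

# import the CA cert (or the broker's own cert, if self-signed)
keytool -keystore client.truststore.jks -alias CARoot -import -file ca-cert

and then point ssl.truststore.location in the producer config at that file.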
I guess once a streams app has exactly-once enabled, producer idempotence gets
enabled by default and so do the retries. I guess producer retries are
managed internally and not exposed through StreamsConfig.
https://kafka.apache.org/0110/documentation/#streamsconfigs
-Sameer.
On Thu, Sep 28, 2017 at
Looks like Ralph logged KAFKA-4946 for this already.
On Fri, Sep 29, 2017 at 12:40 AM, Dong Lin wrote:
> Hi Kafka users,
>
> I am wondering if anyone is currently using feature from MX4J loader. This
> feature is currently enabled by default. But if kafka_mx4jenable is
> explicitly set to true i
Thanks M Manna.
I followed the steps to recreate the keystore & truststore for SSL setup on
both the client and server machines, and it is working fine if I run the client
and broker on the same Linux host.
The problem starts when I publish messages from a Kafka client deployed on a
different Linux machine.
I en
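For reference, the client on the other machine needs its own copy of the
truststore and SSL settings along these lines (a sketch; host, port, path and
password are placeholders):

# producer.properties on the client machine
bootstrap.servers=broker-host:9093
security.protocol=SSL
ssl.truststore.location=/var/ssl/client.truststore.jks
ssl.truststore.password=changeit

The broker also needs an SSL listener reachable from that machine and, if
hostname verification is enabled, advertised with a hostname the certificate
covers.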
Hi Kafka users,
I am wondering if anyone is currently using the feature from the MX4J loader.
This feature is currently enabled by default. But if kafka_mx4jenable is
explicitly set to true in the broker config, then the broker will disable the
MX4J load. And if kafka_mx4jenable is explicitly set to false in the