Multiple Leaders on single Partition in Kafka Cluster

2018-10-29 Thread Weichu Liu
Hi, We recently saw split-brain behavior happen in production. There was a controller switch, and then unclean leader elections. This led to 24 partitions (out of 70) having 2 leaders for 2 days until I restarted the broker. Producers and consumers were talking to different "leaders", so the recor…

Get count of messages

2018-10-29 Thread Sachit Murarka
Hi All, Could you please help me get a count of all messages stored in Kafka from a particular offset? I have tried the GetOffsetShell command, but it did not give me what I need. Kind Regards, Sachit Murarka

Re: Converting a Stream to a Table - groupBy/reduce vs. stream.to/builder.table

2018-10-29 Thread Patrik Kleindl
Hi John and Matthias, thanks for the questions; maybe explaining our use case helps a bit: We are receiving CDC records (row-level insert/update/delete) in one topic per table. The key is derived from the DB records, and the value is null in case of deletes. Those would be the immutable facts, I guess. …

How to create kafka producer by .Net framework in Kerberos-enabled Kafka cluster?

2018-10-29 Thread 張雅涵
Hi everyone, As the subject says, I'm dealing with the problem "How to create a Kafka producer with the .NET Framework in a Kerberos-enabled Kafka cluster?" and I need some help. Does anyone have similar experience and can kindly provide some suggestions? Best, Yvonne Chang 張雅涵

Having issue with Kafka partition

2018-10-29 Thread Karthick Kumar
Hi All, I recently faced the below issue: WARN Fetcher:679 - Received unknown topic or partition error in ListOffset request for partition messages-2-1. The topic/partition may not exist or the user may not have Describe access to it. My consumer is not working for the particular topic. When I ch…

Problem with kafka-streams aggregate windowedBy

2018-10-29 Thread Pavel Koroliov
Hi everyone! I use Kafka Streams, and I have a problem when I use windowedBy. Everything works well until I restart the application. After restarting, my aggregation starts from the beginning. Code below: > StreamsBuilder builder = new StreamsBuilder() > KStream stream = builder.stream(topic, …
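The truncated snippet above looks like a windowed aggregation; a minimal sketch of that pattern against the 2.0-era Streams API follows (topic, store name, and value types are placeholders, not the poster's actual code):

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.TimeWindows;

public class WindowedAggExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, Long> stream = builder.stream("input-topic");
        stream.groupByKey()
              // 1-minute tumbling windows (TimeWindows.of(long) in the 2.0-era API)
              .windowedBy(TimeWindows.of(60_000L))
              // sum the values per key and window, backed by a named state store
              .aggregate(() -> 0L,
                         (key, value, agg) -> agg + value,
                         Materialized.as("windowed-agg-store"));
    }
}
```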

Kubernetes: Using Load Balancers

2018-10-29 Thread Phillip Neumann
Hi all! I was trying to deploy a little Kafka cluster on Kubernetes (k8s) and noticed the wonderful Helm charts here: https://github.com/confluentinc/cp-helm-charts The Kafka chart included there will try to expose the brokers so they can be accessible not only from within the k8s cluster, …

Re: Problem with kafka-streams aggregate windowedBy

2018-10-29 Thread Patrik Kleindl
Hi, Does your applicationId change? Best regards, Patrik > Am 29.10.2018 um 13:28 schrieb Pavel Koroliov: > > Hi everyone! I use kafka-streams, and i have a problem when i use > windowedBy. Everything works well until I restart the application. After > restarting my aggregation starts from the beginn…

Re: Problem with kafka-streams aggregate windowedBy

2018-10-29 Thread Pavel Koroliov
Hi, No, my application id doesn't change. Mon, 29 Oct 2018 at 19:11, Patrik Kleindl: > Hi > Does your applicationId change? > Best regards > Patrik > > > Am 29.10.2018 um 13:28 schrieb Pavel Koroliov: > > > > Hi everyone! I use kafka-streams, and i have a problem when i use > > windowedBy. Ever…

Re: Problem with kafka-streams aggregate windowedBy

2018-10-29 Thread Patrik Kleindl
Hi, How long does your application run? More than the 60 seconds you set for the commit interval? Have a look at https://sematext.com/opensee/m/Kafka/uyzND1SdDRMgROjn?subj=Re+Kafka+Streams+why+aren+t+offsets+being+committed+ and check whether your offsets are really committed. Best regards, Patrik > Am 29.…

Error when using mockSchemaregistry

2018-10-29 Thread chinchu chinchu
Hey folks, I am getting the below exception when using a MockSchemaRegistry in a JUnit test. I appreciate your help. I am using Confluent 4.0. private SchemaRegistryClient schemaRegistry; private KafkaAvroSerializer avroSerializer; Properties defaultConfig = new Properties(); defaultConfi…

Re: Get count of messages

2018-10-29 Thread Burton Williams
You can use kafkacat, starting at that offset and reading to the head, and pipe the output to "wc -l" (line count). -BW On Mon, Oct 29, 2018 at 3:39 AM Sachit Murarka wrote: > Hi All, > > Could you please help me in getting count of all messages stored in kafka > from a particular offset? > I have tried GetOffs…
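A sketch of that kafkacat pipeline (broker address, topic, partition, and offset are placeholders; assumes kafkacat is installed, and that message values contain no embedded newlines, since `wc -l` counts lines, one per message):

```shell
# Consume (-C) partition 0 of "my-topic" starting at offset 42,
# exit when the end of the partition is reached (-e),
# and count the emitted lines.
kafkacat -C -b localhost:9092 -t my-topic -p 0 -o 42 -e | wc -l
```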

Re: Problem with kafka-streams aggregate windowedBy

2018-10-29 Thread Matthias J. Sax
Make sure to call `KafkaStreams#close()` to get the latest offsets committed. Besides this, you can check the consumer and Streams logs in DEBUG mode to see what offset is picked up (or not). -Matthias On 10/29/18 11:43 AM, Patrik Kleindl wrote: > Hi > How long does your application run? More t…
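A minimal sketch of that clean-shutdown advice (class name and config are placeholders): registering a shutdown hook guarantees `close()` runs even on Ctrl+C or SIGTERM, so the latest offsets get committed and state stores are flushed.

```java
import java.util.Properties;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;

public class CleanShutdown {
    public static void main(String[] args) {
        Properties props = new Properties();        // streams config elided
        StreamsBuilder builder = new StreamsBuilder(); // topology elided
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Ensure close() runs when the JVM is asked to exit, so offsets
        // are committed instead of the app being killed mid-flight.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```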

Re: Error when using mockSchemaregistry

2018-10-29 Thread Matthias J. Sax
You set: > senderProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, avroSerializer.getClass()); This tells the Producer to create a new AvroSerializer object, and this object expects "schema.registry.url" to be set during initialization, i.e., you need to add the config to `senderProps`. H…
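A sketch of the two ways out (registry URL and types are placeholders; this assumes the thread's `avroSerializer` was built around a `MockSchemaRegistryClient`): either add the URL to the producer config so the class-instantiated serializer can configure itself, or bypass class-based instantiation and pass the already-configured instance. This is a config fragment, not a complete program.

```java
// Option A: keep the class-based config, but supply the registry URL
// the Producer-created serializer needs at initialization.
senderProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
senderProps.put("schema.registry.url", "http://localhost:8081"); // placeholder URL

// Option B: pass the pre-built serializer instance (the one backed by
// the MockSchemaRegistryClient) directly, so no new instance is created.
KafkaProducer<String, Object> producer =
    new KafkaProducer<>(senderProps, new StringSerializer(), avroSerializer);
```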

Re: Get count of messages

2018-10-29 Thread Matthias J. Sax
That is quite expensive to do... It might be best to write a short Java program that uses Consumer#endOffsets() and Consumer#beginningOffsets(). -Matthias On 10/29/18 3:02 PM, Burton Williams wrote: > you can use kafkacat starting at that offset to head and pipe the output > to "wc -l" (word count)…
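A sketch of that approach (class and method names are made up; the broker interaction is left in comments so the arithmetic is the focus): fetch the partition's beginning and end offsets, then the count from a given offset is the end offset minus the later of the given offset and the log's beginning offset, since retention may already have deleted older records.

```java
// Hypothetical helper: counts messages from startOffset to the log end for
// one partition, given the offsets a consumer would report. In a real
// program the two bounds come from
//   consumer.beginningOffsets(partitions) and consumer.endOffsets(partitions).
public class MessageCount {
    public static long countFrom(long beginningOffset, long endOffset, long startOffset) {
        // Records before the log start may have been removed by retention,
        // so never count below beginningOffset.
        long effectiveStart = Math.max(startOffset, beginningOffset);
        return Math.max(0L, endOffset - effectiveStart);
    }

    public static void main(String[] args) {
        // Example: log spans offsets [0, 100), counting from offset 42
        System.out.println(countFrom(0L, 100L, 42L));
    }
}
```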

Re: Error when using mockSchemaregistry

2018-10-29 Thread chinchu chinchu
Thanks Matthias. How do I deal with this in the case of a consumer that expects a specific record type? I am trying to write an integration test for the scenario below. All of these use an Avro object as the value of the producer record. I am kind of able to write this…

Re: Error when using mockSchemaregistry

2018-10-29 Thread Matthias J. Sax
You can use `ByteArrayDeserializer` to get `byte[]` key and value from the consumer and deserialize both "manually" using an AvroDeserializer that is created with the MockSchemaRegistry. `ConsumerRecord` also gives you the topic name for each record via #topic(). -Matthias On 10/29/18 4:03 PM, chinchu ch…
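A sketch of that pattern (broker, group id, and topic are placeholders; assumes the Confluent `kafka-avro-serializer` artifact on the classpath): consume raw bytes, then deserialize each record with a `KafkaAvroDeserializer` built on the `MockSchemaRegistryClient`, using `#topic()` to know which topic each record came from.

```java
import java.util.Collections;
import java.util.Properties;
import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class ManualAvroConsume {
    public static void main(String[] args) {
        SchemaRegistryClient schemaRegistry = new MockSchemaRegistryClient();
        KafkaAvroDeserializer avroDeserializer = new KafkaAvroDeserializer(schemaRegistry);

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "test-consumer");
        try (KafkaConsumer<byte[], byte[]> consumer =
                 new KafkaConsumer<>(props, new ByteArrayDeserializer(), new ByteArrayDeserializer())) {
            consumer.subscribe(Collections.singleton("my-topic"));
            for (ConsumerRecord<byte[], byte[]> record : consumer.poll(1000)) {
                // #topic() identifies the source topic, so one loop can
                // dispatch to type-specific handling per topic if needed.
                Object value = avroDeserializer.deserialize(record.topic(), record.value());
                System.out.println(record.topic() + " -> " + value);
            }
        }
    }
}
```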