Kafka Mirror Maker 2 - source topic keeps getting created

2021-08-10 Thread AlR
I have two Kafka setups, A and B. A is a cluster with 3 instances running on the same machine, while B is a standalone instance on another machine. I tried to use Mirror Maker to replicate from A to B. The config file is as follows: clusters = A, B; A.bootstrap.servers = host1:9091, host1:9092,
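For reference, a minimal one-way A -> B MirrorMaker 2 properties file usually looks something like the sketch below; the host names, topic pattern, and replication factors are placeholders rather than the poster's actual values.

    # Both clusters and how to reach them (host names are placeholders)
    clusters = A, B
    A.bootstrap.servers = host1:9091,host1:9092,host1:9093
    B.bootstrap.servers = host2:9092

    # Enable only the A -> B flow for one-way replication
    A->B.enabled = true
    A->B.topics = .*
    B->A.enabled = false

    # Single-broker target, so mirrored and internal topics use replication factor 1
    replication.factor = 1
    checkpoints.topic.replication.factor = 1
    heartbeats.topic.replication.factor = 1
    offset-syncs.topic.replication.factor = 1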

Re: LeftJoin after Map operation

2021-08-10 Thread Matthias J. Sax
Glad you figured it out. If you think the docs should be improved, we take PRs :) -- In the end, it's a broader community effort to have good docs... -Matthias On 6/11/21 12:10 PM, Richard Rossel wrote: > Thanks Matthias for your help, > I was able to find the issue, it was actually my Serdes

Re: KSQLdb Stream, Getting Topic Key

2021-08-10 Thread Daniel Hinojosa
I had to verify this myself. I wrote an example that uses a key that is not in the payload, declared it as the KEY, and now it is available as a field. Notice the state column in the statement below: CREATE STREAM my_avro_orders (total BIGINT, shipping VARCHAR, state VARCHAR KEY, discount DOUBLE, gender VARCHAR) WITH
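A complete statement along those lines might look like the sketch below; the topic name and value format in the WITH clause are illustrative, not the poster's actual settings.

    CREATE STREAM my_avro_orders (
        total BIGINT,
        shipping VARCHAR,
        state VARCHAR KEY,   -- read from the Kafka message key, exposed as a column
        discount DOUBLE,
        gender VARCHAR
    ) WITH (
        KAFKA_TOPIC = 'orders',   -- hypothetical topic name
        VALUE_FORMAT = 'AVRO'
    );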

Re: KSQLdb Stream, Getting Topic Key

2021-08-10 Thread Daniel Hinojosa
Ah, never mind, I just saw from the title that this is ksqlDB. On Tue, Aug 10, 2021 at 12:09 PM Daniel Hinojosa < dhinoj...@evolutionnext.com> wrote: > The keys are already part of the stream. When you run builder.stream or > builder.table it returns a KStream or a KTable. From there every > operation has

Re: KSQLdb Stream, Getting Topic Key

2021-08-10 Thread Daniel Hinojosa
The keys are already part of the stream. When you run builder.stream or builder.table it returns a KStream or a KTable. From there, every operation has a lambda that accepts both key and value. You can use map, for example, to receive the key and do something with it. Let me know if you have any other
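A minimal Kafka Streams sketch of that idea, with hypothetical topic names and String serdes, might look like this:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KStream;

    public class SensorKeyExample {
        public static void main(String[] args) {
            StreamsBuilder builder = new StreamsBuilder();

            // builder.stream() returns a KStream; the record key is the first lambda argument
            KStream<String, String> readings =
                builder.stream("sensor-readings", Consumed.with(Serdes.String(), Serdes.String()));

            // map() sees both key and value, so the key (e.g. a sensor id)
            // can be copied into the value or used directly
            KStream<String, String> withSensorId =
                readings.map((key, value) -> KeyValue.pair(key, key + ":" + value));

            withSensorId.to("sensor-readings-with-id");
        }
    }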

KSQLdb Stream, Getting Topic Key

2021-08-10 Thread Greer, Andrew C
Hello, I am trying to create a stream that accepts the data from my topic and uses the message keys as unique identifiers for the sensors the data originated from. The data in the messages does not contain anything that would identify which sensor it came

High disk read with Kafka streams

2021-08-10 Thread mangat rai
Hey All, We are using the low-level Processor API to create Kafka Streams applications. Each app has one or more in-memory state stores with caching disabled and the changelog enabled. Some of the apps also have global stores. We noticed from the node metrics (Kubernetes) that the stream applications
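For context, a store configured the way described above (in-memory, caching disabled, changelog enabled) is typically built roughly like the sketch below; the store name and serdes are placeholders, not the poster's actual setup.

    import java.util.Collections;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.state.KeyValueStore;
    import org.apache.kafka.streams.state.StoreBuilder;
    import org.apache.kafka.streams.state.Stores;

    public class StoreConfigExample {
        public static void main(String[] args) {
            // In-memory key-value store with caching disabled and a changelog topic enabled
            StoreBuilder<KeyValueStore<String, String>> storeBuilder =
                Stores.keyValueStoreBuilder(
                        Stores.inMemoryKeyValueStore("example-store"),
                        Serdes.String(),
                        Serdes.String())
                    .withCachingDisabled()
                    .withLoggingEnabled(Collections.emptyMap());

            // The builder is then registered with Topology#addStateStore(...) in the app
            System.out.println("Configured store: " + storeBuilder.name());
        }
    }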

Re: New To Kafka - How to Start

2021-08-10 Thread Yu Watanabe
Hello. How about starting with the 'quick start'? https://kafka.apache.org/quickstart Thanks, Yu Watanabe On Tue, Aug 10, 2021 at 10:44 AM Gilbert Flores < gilbert.flo...@primergrp.com> wrote: > Hi, > > Good day to you. We are at the stage of exploring integration systems and > we have read and

Re: Kafka Streams Handling uncaught exceptions REPLACE_THREAD

2021-08-10 Thread Yoda Jedi Master
Hi Bruno, thank you for your answer. I mean that the message that caused the exception was consumed, and the replacement thread will continue from the next message. How then does it handle uncaught exceptions if it fails again? On Tue, Aug 10, 2021 at 12:33 PM Bruno Cadonna wrote: > Hi Yoda, > >

Re: Kafka Streams Handling uncaught exceptions REPLACE_THREAD

2021-08-10 Thread Yoda Jedi Master
Hi Luke, thank you for your answer. I will try it; I think I will set an alert if there are too many failed messages. To ignore the message, should I simply return REPLACE_THREAD from the handler? On Tue, Aug 10, 2021 at 12:16 PM Luke Chen wrote: > Hi Yoda, > For your question: > > If an application

Re: Kafka Streams Handling uncaught exceptions REPLACE_THREAD

2021-08-10 Thread Bruno Cadonna
Hi Yoda, What exactly do you mean by "skipping that failed message"? Do you mean a record consumed from a topic that caused an exception that killed the stream thread? If the record killed the stream thread due to an exception, for example a deserialization exception, it will probably

Re: Kafka Streams Handling uncaught exceptions REPLACE_THREAD

2021-08-10 Thread Luke Chen
Hi Yoda, For your question: > If an application gets an uncaught exception, then the failed thread will be replaced with another thread and it will continue processing messages, skipping that failed message? --> Yes, if everything goes well after `replace thread`, you can ignore this failed

Kafka Streams Handling uncaught exceptions REPLACE_THREAD

2021-08-10 Thread Yoda Jedi Master
"REPLACE_THREAD - Replaces the thread receiving the exception and processing continues with the same number of configured threads. (Note: this can result in duplicate records depending on the application’s processing mode determined by the PROCESSING_GUARANTEE_CONFIG value)" If an application

Kafka metrics to calculate number of messages in a topic

2021-08-10 Thread Dhirendra Singh
Hi All, I have a requirement to display the total number of messages in a topic on a Grafana dashboard. I am looking at the metrics exposed by the Kafka broker and came across the following: kafka_log_log_logendoffset and kafka_log_log_logstartoffset. My understanding is that if I take the
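Assuming the intended calculation is log end offset minus log start offset, summed over the topic's partitions, the same number can also be checked directly with the AdminClient. The sketch below uses a placeholder topic name and bootstrap server; the result is the count of records currently retained in the topic (retention and compaction shrink it), not the number ever produced.

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.ListOffsetsResult.ListOffsetsResultInfo;
    import org.apache.kafka.clients.admin.OffsetSpec;
    import org.apache.kafka.clients.admin.TopicDescription;
    import org.apache.kafka.common.TopicPartition;

    public class TopicMessageCount {
        public static void main(String[] args) throws Exception {
            String topic = "my-topic"; // placeholder
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

            try (Admin admin = Admin.create(props)) {
                TopicDescription desc =
                    admin.describeTopics(Collections.singletonList(topic)).all().get().get(topic);

                Map<TopicPartition, OffsetSpec> earliestSpec = new HashMap<>();
                Map<TopicPartition, OffsetSpec> latestSpec = new HashMap<>();
                desc.partitions().forEach(p -> {
                    TopicPartition tp = new TopicPartition(topic, p.partition());
                    earliestSpec.put(tp, OffsetSpec.earliest());
                    latestSpec.put(tp, OffsetSpec.latest());
                });

                Map<TopicPartition, ListOffsetsResultInfo> earliest = admin.listOffsets(earliestSpec).all().get();
                Map<TopicPartition, ListOffsetsResultInfo> latest = admin.listOffsets(latestSpec).all().get();

                // Retained records = sum over partitions of (log end offset - log start offset)
                long total = latest.entrySet().stream()
                    .mapToLong(e -> e.getValue().offset() - earliest.get(e.getKey()).offset())
                    .sum();

                System.out.println("Retained records in " + topic + ": " + total);
            }
        }
    }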