Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Hans Jespersen
I believe that the new topics are picked up at the next metadata refresh, which is controlled by the metadata.max.age.ms parameter. The default value is 300000 ms (which is 5 minutes). -hans /** * Hans Jespersen, Principal Systems Engineer, Confluent Inc. * h...@confluent.io (650)924-2670 */ On
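
For reference, a minimal sketch (not part of the original message) of lowering that refresh interval in a Streams configuration; the application id, broker address, and the 30-second value are illustrative assumptions:

import java.util.Properties;
import org.apache.kafka.streams.StreamsConfig;

public class MetadataRefreshConfig {
    public static Properties streamsProps() {
        Properties props = new Properties();
        // Hypothetical application id and broker address.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topic-discovery-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Default is 300000 ms (5 minutes); a lower value makes newly created
        // topics that match a regex subscription show up sooner.
        props.put("metadata.max.age.ms", "30000");
        return props;
    }
}

Lowering metadata.max.age.ms trades a little extra metadata traffic for faster discovery of new topics.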

RE: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Haopu Wang
From the document, the new topics are picked up after some period of time. My question is: how long is the duration before the new topics are detected and is the duration configurable? Much thanks! -Original Message- From: Matthias J. Sax [mailto:matth...@confluent.io] Sent:

Upgrading from kafka-0.8.1.1 to kafka-0.9.0.1

2016-11-09 Thread Divyajothi Baskaran
Hi, For the past 6 months, I have been the dev for our solution written on top of kafka-0.8.1.1. It has been stable for us. We thought we would upgrade to kafka-0.9.0.1. With the server upgrade, we did not face any issues. We have our own solution built to extract the messages and write to different

Re: Leader became -1 and no ISR for all topics/partitions

2016-11-09 Thread Gwen Shapira
Mind sharing how you got into this situation? Did you restart brokers? Did the replicas drop off the ISR one by one or all together? Do you have unclean leader election enabled? What are the errors you are seeing in the logs? Once none of the partitions are available and lacking any additional

Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Matthias J. Sax
-BEGIN PGP SIGNED MESSAGE- Hash: SHA512 Yes. If new topics are created that match the regex, Streams will automatically subscribe to them. - -Matthias On 11/9/16 3:11 PM, Haopu Wang wrote: > Hi, do you mean that the new matched topics should be consumed > after the regex subscription

RE: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Haopu Wang
Hi, do you mean that the new matched topics should be consumed after the regex subscription has been established? Thanks! -Original Message- From: Guozhang Wang [mailto:wangg...@gmail.com] Sent: November 10, 2016 3:41 To: users@kafka.apache.org Subject: Re: Is it possible to resubscribe

RE: Kafka UTF 8 encoding problem

2016-11-09 Thread Radoslaw Gruchalski
It’s rather difficult to diagnose without having a minimum viable example. It’s either that the encoding used is not what the data is encoded as, or the data is actually UTF-8 but the output (what you see after reading out of Kafka) for whatever reason is not UTF-8. Can you provide a simple unit test

Leader became -1 and no ISR for all topics/partitions

2016-11-09 Thread Karthi SsiSamsung
Hi, I am facing a similar issue where a couple of my partitions have a leader of -1 and no ISR. I tried to use the kafka-reassign-partitions.sh and kafka-preferred-replica-election.sh tools and this did not help as the ISR was empty. Other users who faced this issue suggested a broker restart. Does any

Re: Kafka performance on an ordinary machine

2016-11-09 Thread Kevin A
> As to what procedures I'd propose is spinning up a cluster and running some tests on it. This is really good advice. Capacity planning for your Kafka cluster is not particularly straightforward - especially if you intend to run a multi-tenant cluster. (If someone has a general model, please

Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Guozhang Wang
Timur, As Michael said, one thing you can consider is the regex subscription if those topics have some common prefix / suffix / pattern. Another idea I can think of is to have two Streams apps, one for reading the input topics and piping them into repartitioned intermediate topics that can be
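
A minimal sketch of that two-app layout, assuming the 0.10.x KStreamBuilder API; all topic names and application ids below are hypothetical:

import java.util.Properties;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class TwoAppPipeline {
    public static void main(String[] args) {
        // App 1: only pipes the (changing) set of input topics into one intermediate topic.
        KStreamBuilder pipeBuilder = new KStreamBuilder();
        pipeBuilder.stream("input-a", "input-b").to("intermediate-topic");

        // App 2: the actual processing, always reading the fixed intermediate topic.
        KStreamBuilder processBuilder = new KStreamBuilder();
        processBuilder.stream("intermediate-topic")
                      .mapValues(value -> value)   // real processing logic would go here
                      .to("output-topic");

        new KafkaStreams(pipeBuilder, propsFor("pipe-app")).start();
        new KafkaStreams(processBuilder, propsFor("process-app")).start();
    }

    private static Properties propsFor(String appId) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, appId);
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        return props;
    }
}

Only the first (piping) app needs to be restarted when topics are added or removed; the processing app keeps running against the fixed intermediate topic.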

Re: Understanding the topology of high level kafka stream

2016-11-09 Thread Matthias J. Sax
-BEGIN PGP SIGNED MESSAGE- Hash: SHA512 Hey, changelog topics are compacted topics and no retention time is applied (one exception is window-changelog topics, which have both compaction and retention policy enabled). If an input message is purged via retention time (and this

Re: replica fetch error and shuabing

2016-11-09 Thread Guozhang Wang
Which version of Kafka are you using? On Mon, Nov 7, 2016 at 10:35 AM, Json Tu wrote: > Hi, when I move __consumer_offsets from the old broker to the new broker, we > encounter the following error and it keeps repeating (flooding the log). > server.log.2016-11-07-19:[2016-11-07 19:17:15,392] ERROR Found

Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Michael Noll
I am not aware of any short-term plans to support that, but perhaps others in the community / mailing list are. On Wed, Nov 9, 2016 at 11:15 AM, Timur Yusupov wrote: > Are there any nearest plans to support that? > > On Wed, Nov 9, 2016 at 1:11 PM, Michael Noll

Re: Understanding the topology of high level kafka stream

2016-11-09 Thread Sachin Mittal
Hi, What happens when the message itself has been purged by Kafka via the retention time setting (or something else), even though its offset is later than the last offset stored by the stream consumer? I am asking this because I am planning to keep the retention time for internal changelog topics also small, so no message

Re: Understanding the topology of high level kafka stream

2016-11-09 Thread Eno Thereska
Hi Sachin, Kafka Streams is built on top of standard Kafka consumers. For every topic it consumes from (whether changelog topic or source topic, it doesn't matter), the consumer stores the offset it last consumed from. Upon restart, by default it starts consuming from where it left off
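
A minimal plain-consumer sketch of the restart behaviour described here (Streams uses the same consumer mechanism underneath); the broker address, group id, and topic name are hypothetical:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OffsetRestartExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // hypothetical broker
        props.put("group.id", "restart-demo");               // hypothetical group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // Only used when the group has NO committed offset for a partition
        // (e.g. first run, or the offset has been purged); otherwise the
        // consumer resumes from the last committed offset.
        props.put("auto.offset.reset", "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("source-topic")); // hypothetical topic
            ConsumerRecords<String, String> records = consumer.poll(1000);
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}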

RE: Kafka UTF 8 encoding problem

2016-11-09 Thread Baris Akgun (Garanti Teknoloji)
Hi, I tried running with the parameters below but I face the same issue again. Properties props = new Properties(); props.put("metadata.broker.list", brokerList); props.put("serializer.class", encoder); //"kafka.serializer.StringEncoder" //props.put("partitioner.class",

Understanding the topology of high level kafka stream

2016-11-09 Thread Sachin Mittal
Hi, I had some basic questions on the sequence of tasks when a streaming application restarts, in case of failure or otherwise. Say my stream is structured this way: source-topic branched into 2 kstreams (source-topic-1, source-topic-2), each mapped to 2 new kstreams (new key/value pairs) backed

RE: Kafka UTF 8 encoding problem

2016-11-09 Thread Baris Akgun (Garanti Teknoloji)
Hi @Ali, I tried base64 but it did not work. In my original case, I collect tweets that are in JSON format, and the tweet text includes Turkish characters. I will try the key.serializer.encoding properties and I will inform you. Thanks, -Original Message- From: Radoslaw Gruchalski

Re: Kafka UTF 8 encoding problem

2016-11-09 Thread Radoslaw Gruchalski
Yes, understandable, however, the OP mentions the data is in UTF-8. If it’s not UTF-8, it needs to be converted to UTF-8. Or consider using value.serializer.encoding https://github.com/apache/kafka/blob/0.9.0/clients/src/main/java/org/apache/kafka/common/serialization/StringSerializer.java#L29 – Best
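
A minimal sketch of pinning the encoding via those serializer properties, assuming the 0.9+ Java producer with StringSerializer; the broker address, topic name, and sample text are hypothetical:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Utf8ProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // StringSerializer defaults to UTF-8, but the encoding can be pinned explicitly:
        props.put("key.serializer.encoding", "UTF-8");
        props.put("value.serializer.encoding", "UTF-8");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Turkish characters survive the round trip as long as both the
            // serializer and the deserializer use UTF-8.
            producer.send(new ProducerRecord<>("tweets", "şğüöçıİ merhaba dünya")); // hypothetical topic
        }
    }
}

On the consumer side, StringDeserializer supports the analogous key.deserializer.encoding / value.deserializer.encoding (or deserializer.encoding) properties.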

Re: Kafka UTF 8 encoding problem

2016-11-09 Thread Ali Akhtar
It's probably not UTF-8 if it contains Turkish characters. That's why base64 encoding / decoding it might help. On Wed, Nov 9, 2016 at 4:22 PM, Radoslaw Gruchalski wrote: > Are you sure your string is in utf-8 in the first place? > What if you pass your string via something

RE: Kafka UTF 8 encoding problem

2016-11-09 Thread Radoslaw Gruchalski
Are you sure your string is in UTF-8 in the first place? What if you pass your string via something like: System.out.println( new String( args[0].getBytes(StandardCharsets.UTF_8), StandardCharsets.UTF_8) ) – Best regards, Radek Gruchalski ra...@gruchalski.com On November 9, 2016 at 12:14:03 PM,

RE: Kafka UTF 8 encoding problem

2016-11-09 Thread Baris Akgun (Garanti Teknoloji)
Hi, Producer Side// Properties props = new Properties(); props.put("metadata.broker.list", brokerList); props.put("serializer.class", "kafka.serializer.StringEncoder"); props.put("request.required.acks", "1"); Consumer side// I am using the Spark Streaming Kafka API, and I also tried with the Kafka CLI and

Re: Kafka performance on an ordinary machine

2016-11-09 Thread Majid Golshadi
Thank you. I have read these articles. Do you have any other documentation? On Wed, Nov 9, 2016 at 12:07 PM Karolis Pocius wrote: > There is no 'best' configuration, it all depends on your use cases, > expected load, etc. > > Documentation is a good place to start >

Re: Kafka UTF 8 encoding problem

2016-11-09 Thread Radoslaw Gruchalski
Baris, Kafka does not care about encoding, everything is transported as bytes. What’s the configuration of your producer / consumer? Are you using Java / JVM? – Best regards, Radek Gruchalski ra...@gruchalski.com On November 9, 2016 at 11:42:02 AM, Baris Akgun (Garanti Teknoloji) (

Re: Kafka UTF 8 encoding problem

2016-11-09 Thread Ali Akhtar
I would recommend base64 encoding the message on the producer side, and decoding it on the consumer side. On Wed, Nov 9, 2016 at 3:40 PM, Baris Akgun (Garanti Teknoloji) < barisa...@garanti.com.tr> wrote: > Hi All, > > We are using Kafka 0,9.0.0 and we want to send our messages to topic in >
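
A minimal sketch of that base64 workaround using java.util.Base64 (Java 8+); the sample string is illustrative. Note this is only a workaround: if producer and consumer agree on UTF-8 end to end, no base64 step is needed.

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        String original = "şğüöçıİ merhaba"; // sample text with Turkish characters

        // Producer side: base64-encode the UTF-8 bytes before sending.
        String encoded = Base64.getEncoder()
                               .encodeToString(original.getBytes(StandardCharsets.UTF_8));

        // Consumer side: decode the base64 string back into UTF-8 text.
        String decoded = new String(Base64.getDecoder().decode(encoded), StandardCharsets.UTF_8);

        System.out.println(decoded.equals(original)); // prints: true
    }
}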

Kafka UTF 8 encoding problem

2016-11-09 Thread Baris Akgun (Garanti Teknoloji)
Hi All, We are using Kafka 0.9.0.0 and we want to send our messages to the topic in UTF-8 format, but when we consume the messages from the topic we see that Kafka does not keep the original UTF-8 format and we do not see the messages exactly. For example, our message that includes Turkish characters

Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Timur Yusupov
Are there any near-term plans to support that? On Wed, Nov 9, 2016 at 1:11 PM, Michael Noll wrote: > This is not possible at the moment. However, depending on your use case, > you might be able to leverage regex topic subscriptions (think: "b*" to > read from all topics

Re: Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Michael Noll
This is not possible at the moment. However, depending on your use case, you might be able to leverage regex topic subscriptions (think: "b*" to read from all topics starting with letter `b`). On Wed, Nov 9, 2016 at 10:56 AM, Timur Yusupov wrote: > Hello, > > In our system
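
A minimal sketch of such a regex subscription, assuming a Kafka Streams version that supports pattern subscriptions (0.10.1+); the application id, broker address, and output topic are hypothetical:

import java.util.Properties;
import java.util.regex.Pattern;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class RegexSubscriptionExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "regex-subscription-app"); // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");       // hypothetical

        KStreamBuilder builder = new KStreamBuilder();
        // Subscribe to every topic whose name starts with "b".
        KStream<byte[], byte[]> input = builder.stream(Pattern.compile("b.*"));
        input.to("output-topic"); // hypothetical downstream topic

        new KafkaStreams(builder, props).start();
    }
}

Topics created later that match the pattern are then picked up at a subsequent metadata refresh.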

Is it possible to resubscribe KafkaStreams in runtime to different set of topics?

2016-11-09 Thread Timur Yusupov
Hello, In our system it is possible to add/remove topics at runtime and we are trying to use KafkaStreams for incoming message processing. Is it possible to resubscribe a KafkaStreams instance to an updated set of topics? For now I see the only way is to shut down the existing KafkaStreams instance and
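
Absent built-in support, a minimal sketch of that shutdown-and-recreate approach (not from the thread); the application id, broker address, and processing logic are placeholders:

import java.util.Properties;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class RestartWithNewTopics {
    private KafkaStreams streams;
    private final Properties props = new Properties();

    public RestartWithNewTopics() {
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "dynamic-topics-app"); // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // hypothetical
    }

    /** Shut down the running instance and start a new one over the updated topic set. */
    public synchronized void resubscribe(String... topics) {
        if (streams != null) {
            streams.close(); // blocks until the existing instance has shut down
        }
        KStreamBuilder builder = new KStreamBuilder();
        builder.stream(topics).to("output-topic"); // placeholder processing
        streams = new KafkaStreams(builder, props);
        streams.start();
    }
}

There is a processing gap during each resubscription while the old instance shuts down and the new one rebalances.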

Re: Kafka performance on an ordinary machine

2016-11-09 Thread Karolis Pocius
There is no 'best' configuration, it all depends on your use cases, expected load, etc. Documentation is a good place to start https://kafka.apache.org/documentation.html#hwandos There are a few good benchmark articles. They're a bit dated by now, but still hold true in most cases: *