Re: To find the Lag of consumer offset using kafka client library

2016-10-01 Thread Anish Mashankar
Yes, kafka.admin helps. You can create an application that resembles
ConsumerGroupCommand.scala to get consumer offsets for both old and new
consumers.
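
For the new consumer, where group offsets are stored in Kafka, a rough
sketch of the idea (not ConsumerGroupCommand's actual logic) is below: lag
is the log-end offset minus the group's committed offset. The topic name is
a placeholder, and note that old consumers committing offsets to ZooKeeper
(as in the --zookeeper commands quoted below) will not be visible this way.

import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.PartitionInfo;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLagCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "DemoConsumer");    // the group whose lag we want
        props.put("enable.auto.commit", "false"); // never touch the group's offsets
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        String topic = "DemoTopic"; // placeholder topic name
        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            List<TopicPartition> partitions = new ArrayList<>();
            for (PartitionInfo p : consumer.partitionsFor(topic)) {
                partitions.add(new TopicPartition(topic, p.partition()));
            }
            consumer.assign(partitions);    // manual assignment, no group rebalance
            consumer.seekToEnd(partitions); // jump to the log-end offsets
            for (TopicPartition tp : partitions) {
                long logEnd = consumer.position(tp);
                OffsetAndMetadata committed = consumer.committed(tp);
                long lag = (committed == null) ? logEnd : logEnd - committed.offset();
                System.out.println(tp + " lag=" + lag);
            }
        }
    }
}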

On Thu, 29 Sep 2016, 8:17 p.m., Gourab Chowdhury wrote:

> Thanks for your suggestion; I had previously read about the Yahoo Kafka
> monitor, as suggested somewhere.
>
> What I actually need is a function/class in the Kafka Java library (if
> any) that helps find the lag and other details. Can kafka.admin help in
> this matter?
>
> A code snippet equivalent to:
> bin/kafka-consumer-groups.sh --zookeeper localhost:2182 --describe --group
> DemoConsumer
> bin/kafka-run-class.sh kafka.admin.ConsumerGroupCommand --zookeeper
> localhost:2182 --describe --group DemoConsumer
>
> Thanks
> Gourab
>
> On Thu, Sep 29, 2016 at 7:19 PM, Jan Omar  wrote:
>
> >
> > Hi Gourab,
> >
> > Check this out:
> >
> > https://github.com/linkedin/Burrow 
> >
> > Regards
> >
> > Jan
> >
> > > On 29 Sep 2016, at 15:47, Gourab Chowdhury wrote:
> > >
> > > I can get the *Lag* of offsets with the following command:
> > >
> > > bin/kafka-run-class.sh kafka.admin.ConsumerGroupCommand --zookeeper
> > > localhost:2182 --describe --group DemoConsumer
> > >
> > > I am trying to find code that uses the Kafka library to find the *Lag*
> > > of offsets for a consumer.
> > >
> > > Also, is there any other documentation besides
> > > https://github.com/apache/kafka/tree/trunk/docs? I can't find much
> > > documentation for Kafka.
> > >
> > > Thanks,
> > > Gourab Chowdhury,
> > > Software Engg. JunctionTV Inc.
> >
> >
>


Kafka Streams Processor life cycle: behavior of close()

2016-10-01 Thread Srikanth
Hello,

I'm testing a WriteToSinkProcessor() that batches records before writing
them to a sink.
The actual commit to the sink happens in punctuate(). I also wanted to
commit in close().
The idea is that during a regular shutdown we commit all records and
ideally stop with an empty state.
My commit() process has three steps: 1) read from the KV store, 2) write to
the sink, 3) delete the written keys from the KV store.

I get this exception when closing, though. It looks like the Kafka producer
is closed before the changelog topic is updated from close().
Should the producer be closed after all tasks and processors are closed?
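
For reference, a minimal sketch of the processor shape described above,
assuming a hypothetical store named "sink-buffer" and a stand-in
writeToSink() call (the real sink write is elided):

import java.util.ArrayList;
import java.util.List;

import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.KeyValueStore;

public class WriteToSinkProcessor implements Processor<String, String> {
    private KeyValueStore<String, String> buffer;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        buffer = (KeyValueStore<String, String>) context.getStateStore("sink-buffer");
        context.schedule(60 * 1000L); // punctuate roughly once a minute
    }

    @Override
    public void process(String key, String value) {
        buffer.put(key, value); // batch the record in the KV store
    }

    @Override
    public void punctuate(long timestamp) {
        commitBatch();
    }

    @Override
    public void close() {
        commitBatch(); // on shutdown, drain whatever is still buffered
    }

    // 1) read from the KV store, 2) write to the sink, 3) delete written keys
    private void commitBatch() {
        List<String> written = new ArrayList<>();
        try (KeyValueIterator<String, String> it = buffer.all()) {
            while (it.hasNext()) {
                KeyValue<String, String> entry = it.next();
                writeToSink(entry.key, entry.value); // stand-in external write
                written.add(entry.key);
            }
        }
        for (String key : written) {
            buffer.delete(key); // delete after the scan, not while iterating
        }
    }

    private void writeToSink(String key, String value) { /* external sink write */ }
}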

16/10/01 17:01:15 INFO StreamThread-1 WriteToSinkProcessor: Closing
processor instance
16/10/01 17:01:16 ERROR StreamThread-1 StreamThread: Failed to remove
stream tasks in thread [StreamThread-1]:
java.lang.IllegalStateException: Cannot send after the producer is closed.
at org.apache.kafka.clients.producer.internals.RecordAccumulator.append(RecordAccumulator.java:173)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:467)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:430)
at org.apache.kafka.streams.processor.internals.RecordCollector.send(RecordCollector.java:84)
at org.apache.kafka.streams.processor.internals.RecordCollector.send(RecordCollector.java:71)
at org.apache.kafka.streams.state.internals.StoreChangeLogger.logChange(StoreChangeLogger.java:108)
at org.apache.kafka.streams.state.internals.InMemoryKeyValueLoggedStore.flush(InMemoryKeyValueLoggedStore.java:161)
at org.apache.kafka.streams.state.internals.MeteredKeyValueStore.flush(MeteredKeyValueStore.java:165)
at org.apache.kafka.streams.processor.internals.ProcessorStateManager.close(ProcessorStateManager.java:343)
at org.apache.kafka.streams.processor.internals.AbstractTask.close(AbstractTask.java:112)
at org.apache.kafka.streams.processor.internals.StreamTask.close(StreamTask.java:317)

Srikanth


Re: Kafka streaming and topic filter whitelist

2016-10-01 Thread Damian Guy
That is correct.
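
For example, with 0.10.1 something like this should give one KStream over
every matching topic (the pattern and output topic here are hypothetical):

import java.util.regex.Pattern;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class RegexStreamExample {
    public static void main(String[] args) {
        KStreamBuilder builder = new KStreamBuilder();
        Serde<String> stringSerde = Serdes.String();

        // One stream fed by all topics matching the pattern.
        KStream<String, String> textLines =
                builder.stream(stringSerde, stringSerde, Pattern.compile("system-.*"));

        textLines.to(stringSerde, stringSerde, "merged-output");
    }
}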

On Fri, 30 Sep 2016 at 18:00 Gary Ogden  wrote:

> So how exactly would that work? For example, I can currently do this:
>
> KStream<String, String> textLines =
>     builder.stream(stringSerde, stringSerde, SYSTEM_TOPIC);
>
> Are you saying that I could put a regex in place of SYSTEM_TOPIC, and that
> one KStream would be streaming from multiple topics that match that regex?
>
> If so, that could be useful.
>
> Gary
>
>
> On 30 September 2016 at 13:35, Damian Guy  wrote:
>
> > Hi Gary,
> >
> > In the upcoming 0.10.1 release you can do regex subscription - will that
> > help?
> >
> > Thanks,
> > Damian
> >
> > On Fri, 30 Sep 2016 at 14:57 Gary Ogden  wrote:
> >
> > > Is it possible to use the topic filter whitelist within a Kafka Streams
> > > application? Or can it only be done in a consumer job?
> > >
> >
>


Re: list of stream or consumer objects per topic

2016-10-01 Thread Matthias J. Sax
Please see my SO answer:
https://stackoverflow.com/questions/39799293/kafka-new-api-0-10-doesnt-provide-a-list-of-stream-and-consumer-objects-per-top/39803689#39803689

-Matthias
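
One pattern along those lines: keep the single consumer on its own polling
thread and split each batch by topic with ConsumerRecords.records(topic),
handing each slice to a worker. A rough sketch (topic names and the
processing are placeholders; offset handling is simplified, since
auto-commit may commit records a worker has not finished yet):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PerTopicDispatch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "per-topic-demo");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        List<String> topics = Arrays.asList("topic1", "topic2");
        ExecutorService pool = Executors.newFixedThreadPool(topics.size());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(topics);
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (String topic : topics) {
                    // Copy this topic's slice so the worker owns its records.
                    final List<ConsumerRecord<String, String>> slice = new ArrayList<>();
                    for (ConsumerRecord<String, String> r : records.records(topic)) {
                        slice.add(r);
                    }
                    if (!slice.isEmpty()) {
                        pool.submit(() -> slice.forEach(r ->
                                System.out.println(r.topic() + ": " + r.value())));
                    }
                }
            }
        }
    }
}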

On 09/30/2016 12:13 PM, Mudassir Maredia wrote:
> I am using the Kafka 0.10 API.
> 
> //Sample code
>  List<String> topicsList = new ArrayList<>();
>  topicsList.add("topic1");
>  topicsList.add("topic2");
> 
>  KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
>  consumer.subscribe(topicsList);
> 
> Problem:
> 
> For each topic, I want to spawn a separate thread that handles the data on
> it. It seems that to achieve this I would have to create multiple
> KafkaConsumer instances. I don't want to do that. Does anyone have an idea
> how to achieve this simply?
> 
> Previously, in version 0.8, I used the createMessageStreams method, which
> returns a collection of KafkaStream objects (one for each topic). I want
> something similar to that.
> 
> //0.8 code sample
> Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap =
>     consumer.createMessageStreams(topicCountMap);
> 
> 
> Thanks,
> 
> Moody
> 


