Hi Robin,

It would be helpful if you posted the full code you were trying to use. How
to seek largely depends on whether you are using the new consumer in "simple"
or "group" mode. In simple mode, when you know the partitions you want to
consume, you should just be able to do something like the following:

consumer.assign(Arrays.asList(partition));
consumer.seek(partition, 500);

Then you can call poll() in a loop until you hit offset 1000 and stop. Does
that make sense?
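
For example, something like this rough sketch (assuming a topic named
"topic1" with a single partition, String keys/values, and 1000 messages at
offsets 0-999; adjust the bootstrap servers, deserializers, and names to
match your setup):

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
TopicPartition partition = new TopicPartition("topic1", 0);

consumer.assign(Arrays.asList(partition)); // "simple" mode: no group management
consumer.seek(partition, 500);             // start reading at offset 500

long lastOffset = 499;
while (lastOffset < 999) {                 // stop after consuming offset 999
    ConsumerRecords<String, String> records = consumer.poll(1000);
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
        lastOffset = record.offset();
    }
}
consumer.close();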

-Jason


On Wed, Feb 17, 2016 at 11:39 AM, Alex Loddengaard <a...@confluent.io>
wrote:

> Hi Robin,
>
> I believe seek() needs to be called after the consumer gets its partition
> assignments. Try calling poll() before you call seek(), then poll() again
> and process the records from the latter poll().
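>
> For example, a rough sketch of that flow (assuming a topic named "topic1"
> and Integer keys / String values, as in your snippet):
>
> consumer.subscribe(Arrays.asList("topic1"));
> consumer.poll(0);                           // joins the group and gets an assignment
> for (TopicPartition tp : consumer.assignment()) {
>     consumer.seek(tp, 500);                 // safe now that the assignment is known
> }
> ConsumerRecords<Integer, String> records = consumer.poll(1000);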
>
> There may be a better way to do this -- let's see if anyone else has a
> suggestion.
>
> Alex
>
> On Wed, Feb 17, 2016 at 9:13 AM, Péricé Robin <perice.ro...@gmail.com>
> wrote:
>
> > Hi,
> >
> > I'm trying to use the new Consumer API with this example:
> >
> >
> > https://github.com/apache/kafka/tree/trunk/examples/src/main/java/kafka/examples
> >
> > With a Producer I sent 1000 messages to my Kafka broker. I need to know
> > if it's possible, for example, to read messages from offset 500 to 1000.
> >
> > What I did:
> >
> >
> >    - consumer.seek(new TopicPartition("topic1", 0), 500);
> >
> >    - final ConsumerRecords<Integer, String> records = consumer.poll(1000);
> >
> >
> > But this did nothing (when I don't use the seek() method I consume all
> > the messages without any problems).
> >
> > Any help on this will be greatly appreciated!
> >
> > Regards,
> >
> > Robin
> >
>
>
>
> --
> Alex Loddengaard | Solutions Architect | Confluent
> Download Apache Kafka and Confluent Platform: www.confluent.io/download
>
