seek() will do the trick. Just make sure that when you run it, it only runs
on partitions the current consumer is assigned (call assignment() and filter
to only the ones assigned to you now).
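A minimal sketch of that rewind, assuming the plain Java consumer API (the helper class/method names here are hypothetical, and the offset itself has to be estimated some other way, since 0.9.0.0 has no offsetsForTimes() to look up offsets by timestamp):

```java
import java.util.Set;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.common.TopicPartition;

public class Replay {

    // Rewind every partition currently assigned to this consumer to the
    // given (approximate) offset. The next poll() will then re-deliver
    // records from that offset onward on each of those partitions.
    public static void rewindAssigned(Consumer<?, ?> consumer, long offset) {
        Set<TopicPartition> assigned = consumer.assignment();
        for (TopicPartition tp : assigned) {
            consumer.seek(tp, offset);
        }
    }
}
```

Note that assignment() is only populated after the group rebalance, so with subscribe() you'd call poll() once (or use a ConsumerRebalanceListener) before rewinding; with auto-commit enabled the rewound position will be committed again as you re-process.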

On Tue, Dec 6, 2016 at 12:30 PM Amit K <amitk....@gmail.com> wrote:

> Sorry for not providing complete information.
>
> I use the auto-commit. Most of the other properties are more or less the
> default one.
>
> Actually, further analysis revealed that the records were consumed by the
> consumer, but a dependent component was down (unfortunately it went
> completely undetected :( ). Hence I now need to re-consume them all for the
> last 2 days.
>
> Will seek() be helpful, e.g. having another application subscribed to the
> same topic and starting to consume from the approximate offset that was
> current 2 days ago?
>
> Thanks for help in advance!
>
> On Tue, Dec 6, 2016 at 3:35 PM, Asaf Mesika <asaf.mes...@gmail.com> wrote:
>
> > Do you use auto-commit or committing your self? I'm trying to figure out
> > how the offset moved if it was stuck.
> >
> > On Tue, Dec 6, 2016 at 10:28 AM Amit K <amitk....@gmail.com> wrote:
> >
> > > Hi,
> > >
> > > Is there any way to re-consume older records from the Kafka broker
> > > with the Kafka consumer?
> > >
> > > I am using Kafka 0.9.0.0. In one scenario, I saw that records from the
> > > last 2 days were not consumed because the consumer was stuck. When the
> > > consumer restarted, it started processing records from today, but the
> > > older records from the last 2 days were not processed.
> > >
> > > Is there any way to achieve the same?
> > > Any help will be highly appreciated.
> > >
> > > Thanks,
> > > Amit
> > >
> >
>