Not that I know of.
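
That said, the pattern described in the quoted thread (accumulate a predefined number of messages in memory, then bulk-write them to the DB instead of inserting row by row) can be sketched in plain Java. This is only an illustration of the batching logic, not a ready-made Kafka consumer: `BatchBuffer` and `bulkWriter` are hypothetical names, and the actual DB write is left as a pluggable callback (e.g. a JDBC batch insert or a PostgreSQL COPY).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of the buffer-then-flush pattern: collect records in memory and
// hand them to a bulk writer once a predefined batch size is reached.
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> bulkWriter; // e.g. JDBC batch insert / COPY
    private final List<T> buffer = new ArrayList<>();

    public BatchBuffer(int batchSize, Consumer<List<T>> bulkWriter) {
        this.batchSize = batchSize;
        this.bulkWriter = bulkWriter;
    }

    // Call once per consumed message; flushes when the buffer is full.
    public void add(T record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Writes the buffered records in one batch and clears the buffer.
    // Consumer offsets should only be committed after this succeeds.
    public void flush() {
        if (!buffer.isEmpty()) {
            bulkWriter.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        List<List<String>> batches = new ArrayList<>();
        BatchBuffer<String> buf = new BatchBuffer<>(3, batches::add);
        for (int i = 1; i <= 7; i++) {
            buf.add("msg-" + i);
        }
        buf.flush(); // flush the final partial batch before committing offsets
        System.out.println(batches.size()); // 3 batches: 3 + 3 + 1
    }
}
```

One caveat with this approach: only commit (ack) offsets after a flush succeeds, otherwise a crash between flushes can silently drop everything still sitting in the buffer.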

On Fri, Dec 5, 2014 at 9:44 AM, Sa Li <sal...@gmail.com> wrote:

> Thanks, Neha. Is there a Java version of the batch consumer?
>
> thanks
>
>
>
> On Fri, Dec 5, 2014 at 9:41 AM, Scott Clasen <sc...@heroku.com> wrote:
>
> > If you are using Scala/Akka, this will handle the batching and acks for
> > you:
> >
> > https://github.com/sclasen/akka-kafka#akkabatchconsumer
> >
> > On Fri, Dec 5, 2014 at 9:21 AM, Sa Li <sal...@gmail.com> wrote:
> >
> > > Thank you very much for the reply, Neha. I have a question about the
> > > consumer: I consume data from Kafka and write it into a DB. Rather than
> > > inserting into the DB row by row, I create a hash map in memory, load
> > > the data into it, and bulk-copy it to the DB. Does that mean I need to
> > > ack each message as it is loaded into memory?
> > >
> > > thanks
> > >
> > >
> > >
> > > On Thu, Dec 4, 2014 at 1:21 PM, Neha Narkhede <n...@confluent.io>
> > > wrote:
> > >
> > > > This is specific to Pentaho but may be useful:
> > > > https://github.com/RuckusWirelessIL/pentaho-kafka-consumer
> > > >
> > > > On Thu, Dec 4, 2014 at 12:58 PM, Sa Li <sal...@gmail.com> wrote:
> > > >
> > > > > Hello, all
> > > > >
> > > > > I have never developed a Kafka consumer before. I want to build an
> > > > > advanced Kafka consumer in Java that consumes data and continuously
> > > > > writes it into a PostgreSQL DB. My plan is to create a map in
> > > > > memory, accumulate a predefined number of messages there, and then
> > > > > write them to the DB in a batch. Is there an API or sample code
> > > > > that would let me do this?
> > > > >
> > > > >
> > > > > thanks
> > > > >
> > > > >
> > > > > --
> > > > >
> > > > > Alec Li
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > Thanks,
> > > > Neha
> > > >
> > >
> > >
> > >
> > > --
> > >
> > > Alec Li
> > >
> >
>
>
>
> --
>
> Alec Li
>



-- 
Thanks,
Neha
