We've talked about it a little bit, but part of the problem is that it is
pretty well integrated into our infrastructure, and as such it's hard to
pull it out. In my latest blog post
(http://engineering.linkedin.com/kafka/running-kafka-scale) I illustrated
this a little differently than Jon did: the producer (and consumer) bits
that handle audit are integrated into our internal libraries that wrap the
open source libraries. Between the
schema-registry, the publishing of the audit data back into Kafka, the
audit consumers, and the database that is needed for storing the audit
data, it gets woven in pretty tightly.
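
The producer-side piece can be sketched roughly like this (a minimal Python sketch of the idea, not our actual code: the wrapper counts messages per topic per time bucket and publishes those counts back into Kafka so audit consumers can compare tiers; the `send` callable, tier names, and the `__audit` topic name are all placeholders I made up for illustration):

```python
import json
import time
from collections import defaultdict

AUDIT_TOPIC = "__audit"  # hypothetical name for the audit topic


class AuditingProducer:
    """Wraps an underlying send(topic, value) callable. Counts messages
    per (topic, time bucket) and periodically publishes those counts
    back into Kafka as audit records."""

    def __init__(self, send, tier, bucket_seconds=600, clock=time.time):
        self._send = send            # e.g. the real producer's send method
        self._tier = tier            # e.g. "producer", "local", "aggregate"
        self._bucket = bucket_seconds
        self._clock = clock
        self._counts = defaultdict(int)  # (topic, bucket_start) -> count

    def send(self, topic, value):
        # Tally the message in its time bucket, then pass it through.
        bucket = int(self._clock()) // self._bucket * self._bucket
        self._counts[(topic, bucket)] += 1
        self._send(topic, value)

    def flush_audit(self):
        # Emit one audit record per (topic, bucket), then reset the tallies.
        for (topic, bucket), count in sorted(self._counts.items()):
            record = {"tier": self._tier, "topic": topic,
                      "bucket_start": bucket, "count": count}
            self._send(AUDIT_TOPIC, json.dumps(record))
        self._counts.clear()
```

An audit consumer then reads the audit topic, stores the records in a database, and flags any (topic, bucket) where counts diverge between tiers, which indicates loss or duplication.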

Confluent has made a start on this by releasing a stack with schemas
integrated. That is probably a good starting point for building an open
source audit service.

-Todd


On Mon, Mar 23, 2015 at 12:47 AM, Navneet Gupta (Tech - BLR) <
navneet.gu...@flipkart.com> wrote:

> Are there any plans to open source it? What alternatives do we have
> here?
>
> We are building an internal auditing framework for our entire big data
> pipeline. Kafka is one of the data sources we have (ingested data).
>
> On Mon, Mar 23, 2015 at 1:03 PM, tao xiao <xiaotao...@gmail.com> wrote:
>
> > LinkedIn has an excellent tool that monitors lag, data loss, data
> > duplication, etc. Here is the reference
> >
> >
> >
> http://www.slideshare.net/JonBringhurst/kafka-audit-kafka-meetup-january-27th-2015
> >
> > It is not open sourced, though.
> >
> > On Mon, Mar 23, 2015 at 3:26 PM, sunil kalva <kalva.ka...@gmail.com>
> > wrote:
> >
> > > Hi
> > > What is the best practice for adding an audit feature in Kafka? Is
> > > there any framework available for enabling auditing at the producer
> > > and consumer level, and any UI frameworks for monitoring?
> > >
> > > tx
> > > SunilKalva
> > >
> >
> >
> >
> > --
> > Regards,
> > Tao
> >
>
>
>
> --
> Thanks & Regards,
> Navneet Gupta
>
