Or introduce an app layer between the producers and Kafka that does the
processing, without adding changes or load to the producers themselves.
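
A rough sketch of what such a relay could look like, assuming the new Java
producer client and an AES key provisioned to the relay host out of band
(the class name, key handling, and how the relay receives data from the
producers are all made up here for illustration):

import java.security.SecureRandom;
import java.util.Properties;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Stand-alone relay: producers send plaintext to this process over the
// internal network (which is acceptable per the original constraint), and
// only AES-GCM ciphertext ever reaches the broker and its log segments.
public class EncryptingRelay {

    private final KafkaProducer<byte[], byte[]> producer;
    private final SecretKey key;                 // provisioned out of band
    private final SecureRandom random = new SecureRandom();

    public EncryptingRelay(String bootstrapServers, SecretKey key) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        this.producer = new KafkaProducer<byte[], byte[]>(props);
        this.key = key;
    }

    // Called for every message received from an upstream producer.
    public void forward(String topic, byte[] plaintext) throws Exception {
        byte[] iv = new byte[12];
        random.nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Prepend the IV so consumers can decrypt each message independently.
        byte[] value = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, value, 0, iv.length);
        System.arraycopy(ciphertext, 0, value, iv.length, ciphertext.length);

        producer.send(new ProducerRecord<byte[], byte[]>(topic, value));
    }

    public void close() {
        producer.close();
    }
}

The transport between the producers and the relay (HTTP, a plain socket,
etc.) is left out above; the point is only that nothing unencrypted ever
reaches the broker, and the producers stay untouched.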


On Thu, Mar 13, 2014 at 1:18 PM, Neha Narkhede <neha.narkh...@gmail.com> wrote:

> In general, the preference has been to avoid having user code run on the
> brokers, since that just opens a can of worms where the broker logic gets
> complicated trying to deal with errors that the user code can throw. The
> suggestion is to push any user-specific processing to the client side. In
> this case, you can imagine a producer that encrypts sensitive data before
> sending it to a topic on the broker.
>
> Thanks,
> Neha
>
>
> On Thu, Mar 13, 2014 at 11:03 AM, Johan Lundahl <johan.lund...@gmail.com>
> wrote:
>
> > Hi,
> >
> > I have a use case for which pluggable processing functions in the broker
> > would be useful.
> >
> > We have some data containing sensitive information that is legally OK to
> > transmit over the internal network to the Kafka brokers and to keep in
> > volatile memory, but not to flush to disk unconcealed/unencrypted. The
> > application server resources are too scarce and critical to handle this
> > processing, so we have to do it elsewhere.
> >
> > To cope with this, I'm looking for a way to plug a "concealer" in
> > somewhere near KafkaApis.handleProducerRequest, before anything has been
> > flushed to disk, but I imagine that other people might come up with other
> > areas where plugging in custom functions would be interesting as well. My
> > case might be relatively specific, but has the general idea of user
> > plugins in different areas of the broker ever been discussed?
> >
>
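
For the client-side approach Neha describes above, one way to keep the
existing producing code unchanged would be to do the encryption inside a
custom value serializer, assuming the new Java producer client and its
Serializer interface (the class name and the "encryption.key.hex" config
property below are hypothetical; in practice the key would come from a
key-management service, not producer config):

import java.security.SecureRandom;
import java.util.Map;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

import org.apache.kafka.common.serialization.Serializer;

// Encrypts every record value just before it leaves the producer, so the
// broker only ever sees (and flushes) ciphertext. Plugged in via the
// producer's value.serializer config, so the sending code itself is untouched.
public class EncryptingByteArraySerializer implements Serializer<byte[]> {

    private SecretKey key;
    private final SecureRandom random = new SecureRandom();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // Hypothetical pre-shared key, hex-encoded; must be 16, 24 or 32 bytes.
        byte[] rawKey = hexToBytes((String) configs.get("encryption.key.hex"));
        key = new SecretKeySpec(rawKey, "AES");
    }

    @Override
    public byte[] serialize(String topic, byte[] plaintext) {
        try {
            // Same AES-GCM framing as the relay sketch above: random IV
            // prepended to the ciphertext so consumers can decrypt each
            // message on its own.
            byte[] iv = new byte[12];
            random.nextBytes(iv);
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
            byte[] ciphertext = cipher.doFinal(plaintext);
            byte[] out = new byte[iv.length + ciphertext.length];
            System.arraycopy(iv, 0, out, 0, iv.length);
            System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
            return out;
        } catch (Exception e) {
            throw new RuntimeException("Encrypting record value failed", e);
        }
    }

    @Override
    public void close() {}

    private static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }
}

The producer would then point value.serializer at this class (plus the key
property) and otherwise send byte[] values exactly as before.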
