Thanks Guozhang.

So there's no way we could also use InternalTopicManager to specify the
number of partitions and the replication factor?

https://github.com/apache/kafka/blob/0.10.1/streams/src/main/java/org/apache/kafka/streams/processor/internals/InternalTopicManager.java
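
(For context: the only public knob I can see in 0.10.1 is the
"replication.factor" streams config, which as far as I can tell applies to
the internal changelog/repartition topics, whose partition counts are
derived from the source topics. A minimal sketch, with made-up
application/broker names:

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;

    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-metrics");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
    // Replication factor used when Streams creates its internal
    // changelog/repartition topics; there is no equivalent setting for
    // their partition counts, which follow the source topics.
    props.put(StreamsConfig.REPLICATION_FACTOR_CONFIG, 3);

Since InternalTopicManager sits in an "internals" package, I'm assuming it
isn't intended to be used directly.)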



On 4 October 2016 at 19:25, Guozhang Wang <wangg...@gmail.com> wrote:

> Hello Gary,
>
> What you described should be workable with the lower-level Processor
> interface of Kafka Streams, i.e. dynamic aggregations based on the input
> data indicating changes to the JSON schemas. For detailed examples of how
> the Processor API works, please read the corresponding sections of the web
> docs:
>
> http://docs.confluent.io/3.0.1/streams/developer-guide.html#processor-api
>
>
> Guozhang
>
> On Mon, Oct 3, 2016 at 6:51 AM, Gary Ogden <gog...@gmail.com> wrote:
>
> > I have a use case, and I'm wondering if it's possible to do this with
> > Kafka.
> >
> > Let's say we will have customers that will be uploading JSON to our
> > system, but the JSON layout will be different for each customer. They
> > are able to define the schema of the JSON being uploaded.
> >
> > They will then be able to define the fields in that JSON they want to
> > gather metrics on (sums, counts, etc.).
> >
> > Is there a way with Kafka Streams to dynamically read the configuration
> > for that customer and process the JSON, doing counts and sums for the
> > fields they've defined?
> >
> > It's possible that at any time they may want to modify the configuration
> > for their JSON as well: stop counting one field, start counting another.
> >
> > They will also want to do some inference. E.g., if a particular JSON
> > document is uploaded with a certain field in it, check whether another
> > JSON document was uploaded within 8 hours.
> >
> > Is it possible for Kafka Streams to be this dynamic?
> >
>
>
>
> --
> -- Guozhang
>
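
To make what I'm after concrete, here's roughly what I'm picturing with the
Processor API from those docs. Every name here (the topics, the
"aggregates" store, the shape of the per-customer config) is invented, it
targets the 0.10.1 API, and it uses Jackson for the JSON:

    import java.util.HashMap;
    import java.util.Map;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.streams.processor.Processor;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.apache.kafka.streams.state.KeyValueStore;

    public class CustomerMetricsProcessor implements Processor<String, String> {

        private static final ObjectMapper MAPPER = new ObjectMapper();

        // customerId -> (field name -> "count" or "sum"); empty here, but
        // it would be (re)loaded from a config topic or store so customers
        // can change it at runtime without a redeploy
        private final Map<String, Map<String, String>> configs = new HashMap<>();

        private ProcessorContext context;
        private KeyValueStore<String, Long> aggregates;

        @SuppressWarnings("unchecked")
        @Override
        public void init(ProcessorContext context) {
            this.context = context;
            this.aggregates = (KeyValueStore<String, Long>) context.getStateStore("aggregates");
            context.schedule(60 * 1000L); // punctuate about once a minute
        }

        @Override
        public void process(String customerId, String json) {
            Map<String, String> fields = configs.get(customerId);
            if (fields == null) {
                return; // nothing configured for this customer yet
            }
            try {
                JsonNode doc = MAPPER.readTree(json);
                for (Map.Entry<String, String> metric : fields.entrySet()) {
                    JsonNode field = doc.get(metric.getKey());
                    if (field == null) {
                        continue; // this upload doesn't carry the field
                    }
                    String storeKey = customerId + ":" + metric.getKey();
                    Long current = aggregates.get(storeKey);
                    long delta = "sum".equals(metric.getValue()) ? field.asLong() : 1L;
                    aggregates.put(storeKey, (current == null ? 0L : current) + delta);
                }
            } catch (Exception e) {
                // malformed JSON: skip it (or forward to a dead-letter topic)
            }
        }

        @Override
        public void punctuate(long timestamp) {
            // periodically forward the current aggregates downstream here
        }

        @Override
        public void close() {}
    }

wired up along these lines (again, invented names):

    import org.apache.kafka.streams.processor.TopologyBuilder;
    import org.apache.kafka.streams.state.Stores;

    TopologyBuilder builder = new TopologyBuilder();
    builder.addSource("events", "customer-events")
           .addProcessor("metrics", CustomerMetricsProcessor::new, "events")
           .addStateStore(
               Stores.create("aggregates").withStringKeys().withLongValues()
                     .persistent().build(),
               "metrics")
           .addSink("out", "customer-metrics", "metrics");

This assumes the incoming messages are keyed by customer id, so each task
only ever sees its own customers' data.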
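
And for the 8-hour inference, I imagine the same processor could keep a
second KeyValueStore<String, Long> of last-seen timestamps (registered the
same way as "aggregates" above) and sweep it from punctuate(). A fragment
of what I mean, with the store name and output message invented:

    // in process(), when the trigger field shows up:
    lastSeen.put(customerId + ":triggerField", context.timestamp());

    // in punctuate(long timestamp): flag entries that never got a
    // follow-up within 8 hours, then forget them
    long cutoff = timestamp - 8 * 60 * 60 * 1000L;
    List<String> expired = new ArrayList<>();
    KeyValueIterator<String, Long> it = lastSeen.all();
    while (it.hasNext()) {
        KeyValue<String, Long> entry = it.next();
        if (entry.value < cutoff) {
            expired.add(entry.key);
        }
    }
    it.close();
    for (String key : expired) {
        context.forward(key, "no follow-up JSON within 8 hours");
        lastSeen.delete(key);
    }

(KeyValue and KeyValueIterator come from org.apache.kafka.streams; a record
that does carry the matching follow-up field would just call
lastSeen.delete(...) in process() instead.)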
