Thanks a lot, Michael. I used WallclockTimestampExtractor for now.

Thanks,
Vivek
> On Jul 8, 2016, at 1:25 AM, Michael Noll <mich...@confluent.io> wrote:
>
> Vivek,
>
> in this case you should manually embed a timestamp within the payload of
> the produced messages (e.g. as a Long field in an Avro-encoded message
> value). This would need to be done by the producer.
>
> Then, in Kafka Streams, you'd need to implement a custom
> TimestampExtractor that can retrieve this timestamp from the message
> payload. And you need to configure your StreamsConfig to use that custom
> timestamp extractor.
>
> Hope this helps,
> Michael
>
>
>> On Thursday, July 7, 2016, vivek thakre <vivek.tha...@gmail.com> wrote:
>>
>> That's right, Ismael. I am looking for workarounds either on the 0.9.0.1
>> producer side or on the Kafka Streams side, so that I can process messages
>> produced by a 0.9.0.1 producer using the Kafka Streams library.
>>
>> Thanks,
>> Vivek
>>
>> On Thu, Jul 7, 2016 at 9:05 AM, Ismael Juma <ism...@juma.me.uk> wrote:
>>
>>> Hi,
>>>
>>> Matthias, I think Vivek's question is not whether Kafka Streams can be
>>> used with a Kafka 0.9 broker (which it cannot). The question is whether
>>> Kafka Streams can process messages produced with a 0.9.0.1 producer into
>>> a 0.10.0.0 broker. Is that right? If so, would a custom
>>> TimestampExtractor work?
>>>
>>> Ismael
>>>
>>> On Thu, Jul 7, 2016 at 12:29 PM, Matthias J. Sax <matth...@confluent.io> wrote:
>>>
>>>> Hi Vivek,
>>>>
>>>> Kafka Streams works only with Kafka 0.10 (not with 0.9).
>>>>
>>>> I am not aware of any workaround to allow for 0.9 usage.
>>>>
>>>> -Matthias
>>>>
>>>>> On 07/07/2016 05:37 AM, vivek thakre wrote:
>>>>> Can the Kafka Streams library work with messages produced by a
>>>>> 0.9.0.1 producer?
>>>>> I guess not, since the old producer would not add a timestamp. (I am
>>>>> getting an invalid timestamp exception.)
>>>>>
>>>>> As I cannot change our producer application setup, I have to use the
>>>>> 0.9.0.1 producer.
>>>>> Is there a workaround that I can try to use Kafka Streams?
>>>>>
>>>>> Thanks,
>>>>> Vivek
>
>
> --
> Best regards,
> Michael Noll
>
> Michael G. Noll | Product Manager | Confluent | +1 650.453.5860
> Download Apache Kafka and Confluent Platform: www.confluent.io/download
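For anyone landing on this thread later: the approach Michael describes has two halves, the producer embedding a timestamp in the message value and a custom `TimestampExtractor` pulling it back out. Here is a minimal, self-contained sketch of the embed/extract logic; the class name `PayloadTimestamp` and the fixed layout (an 8-byte big-endian epoch-millis prefix before the actual payload) are assumptions for illustration, not anything from the thread.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class PayloadTimestamp {

    // Producer side: prepend an 8-byte epoch-millis timestamp to the payload.
    // (Alternatively, as Michael suggests, a Long field in an Avro schema.)
    static byte[] encode(long timestampMs, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(8 + payload.length);
        buf.putLong(timestampMs).put(payload);
        return buf.array();
    }

    // Extractor side: read the timestamp back from the first 8 bytes.
    static long extractTimestamp(byte[] value) {
        return ByteBuffer.wrap(value).getLong();
    }

    // The payload itself is everything after the timestamp prefix.
    static byte[] extractPayload(byte[] value) {
        byte[] payload = new byte[value.length - 8];
        ByteBuffer.wrap(value, 8, payload.length).get(payload);
        return payload;
    }

    public static void main(String[] args) {
        byte[] record = encode(1467900000000L, "hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(extractTimestamp(record));
        System.out.println(new String(extractPayload(record), StandardCharsets.UTF_8));
    }
}
```

In an actual Kafka Streams application, the `extractTimestamp` logic would go inside a class implementing `org.apache.kafka.streams.processor.TimestampExtractor`, which is then registered through the timestamp-extractor setting on `StreamsConfig` (`TIMESTAMP_EXTRACTOR_CLASS_CONFIG` in the 0.10 API).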