Hi Shekar,
Alternatively, you could have each stage of your pipeline write its status to
Cassandra (or another DB), and have your API read from it. With a Cassandra TTL, the
row is deleted automatically once the TTL expires, so no manual cleanup is required.
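To make the idea concrete, here is a minimal sketch in plain Java of the pattern being suggested: an in-memory stand-in for the Cassandra status table, where rows expire on their own. The class and method names are illustrative; in Cassandra itself each stage would run an INSERT with USING TTL instead.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative in-memory stand-in for the Cassandra status table.
// In Cassandra itself, each pipeline stage would run something like:
//   INSERT INTO pipeline.status (session_id, stage) VALUES (?, ?) USING TTL 3600;
// and the row disappears on its own once the TTL elapses.
public class PipelineStatusStore {
    private static final class Entry {
        final String status;
        final long expiresAtMillis;
        Entry(String status, long expiresAtMillis) {
            this.status = status;
            this.expiresAtMillis = expiresAtMillis;
        }
    }

    private final Map<String, Entry> rows = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public PipelineStatusStore(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    // Each stage overwrites the row for its session ID, refreshing the TTL.
    public void put(String sessionId, String status) {
        rows.put(sessionId, new Entry(status, System.currentTimeMillis() + ttlMillis));
    }

    // The API reads the latest status; expired rows read as absent,
    // mirroring Cassandra's TTL-based deletion.
    public String get(String sessionId) {
        Entry e = rows.get(sessionId);
        if (e == null || System.currentTimeMillis() > e.expiresAtMillis) {
            rows.remove(sessionId);
            return null;
        }
        return e.status;
    }
}
```

The API endpoint would simply look up the session ID in this table and return whatever status the most recent stage wrote.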
Best regards / Mit freundlichen Grüßen / Sincères salutations
M. Lohith Samaga
-----Original Message-----
From: Shekar Tippur [mailto:[email protected]]
Sent: Wednesday, June 29, 2016 12.10
To: users
Subject: Building API to make Kafka reactive
I am looking at building a reactive API on top of Kafka.
This API produces events to a Kafka topic. I want to add a unique session ID to
the payload.
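One way to sketch the session-ID tagging (names and JSON shape here are my own assumptions, not anything Kafka mandates; with kafka-clients 0.11+ the ID could instead travel in a record header):

```java
import java.util.UUID;

// Hypothetical sketch: tag each produced payload with a unique session ID
// before sending it to the ingest topic.
public class SessionTagger {
    public static String newSessionId() {
        return UUID.randomUUID().toString();
    }

    // Wraps the original payload together with its session ID
    // (the JSON envelope shape is purely illustrative).
    public static String tag(String sessionId, String payload) {
        return "{\"sessionId\":\"" + sessionId + "\",\"payload\":" + payload + "}";
    }
}
```

Every downstream stage would carry this ID through unchanged, so the final topic's records can be correlated back to the original request.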
The data gets transformed as it passes through the different stages of the
pipeline. I want to designate a final topic from which the API can tell that
processing was successful.
The API should report a different status at each part of the pipeline:
At ingestion, the API responds with "submitted".
While the pipeline is running, the API returns "in progress".
After successful completion, the API returns "success".
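The three-state lifecycle above can be sketched as a small state machine (a hedged sketch; the enum and transition rules are my reading of the description, not an established API):

```java
import java.util.EnumSet;
import java.util.Map;

// Illustrative state machine for the request lifecycle described above.
public enum RequestStatus {
    SUBMITTED, IN_PROGRESS, SUCCESS;

    // Forward transitions only; IN_PROGRESS may repeat across stages.
    private static final Map<RequestStatus, EnumSet<RequestStatus>> NEXT = Map.of(
        SUBMITTED, EnumSet.of(IN_PROGRESS),
        IN_PROGRESS, EnumSet.of(IN_PROGRESS, SUCCESS),
        SUCCESS, EnumSet.noneOf(RequestStatus.class)
    );

    public boolean canTransitionTo(RequestStatus next) {
        return NEXT.get(this).contains(next);
    }
}
```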
A couple of questions:
1. Is this feasible?
2. I was looking at Project Reactor (https://projectreactor.io), whose docs
describe an event bus. I wanted to see whether I can implement a consumer that
subscribes to the "end" topic and publishes an event onto the event bus.
Since I would know the session ID, I can then process the request accordingly.
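The session-ID correlation described here can be sketched without any Kafka or Reactor dependency (class and method names are hypothetical): the API parks a future per session ID, and a consumer on the final topic completes it when the matching record arrives. A Reactor event bus or Flux sink could play the same role.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: correlate records on the "end" topic back to
// waiting API requests by session ID, using plain CompletableFutures
// to stay self-contained.
public class CompletionRegistry {
    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    // Called by the API when a request is submitted.
    public CompletableFuture<String> register(String sessionId) {
        return pending.computeIfAbsent(sessionId, id -> new CompletableFuture<>());
    }

    // Called from the consumer's poll loop for each record on the end topic.
    public void onEndTopicRecord(String sessionId, String payload) {
        CompletableFuture<String> f = pending.remove(sessionId);
        if (f != null) {
            f.complete(payload);
        }
    }
}
```

The API can then respond reactively: return the future (or a Mono wrapping it) and let the end-topic consumer complete it whenever the pipeline finishes.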
Appreciate your inputs.
- Shekar