Hello All,

This is my first question on this mailing list, and I am looking for some 
insights from more experienced Kafka users.
We are currently setting up an Apache Kafka cluster to create an 
event-based/stream-processing system.
In our system we have a web-based UI for performing certain operations manually, 
similar to the standard automated processing.

We were thinking of tackling the UI like this:
1. UI HTTP request comes in, and gets written to a ‘request’ topic.
2. The standard processing gets the request from the ‘request’ topic and 
processes it asynchronously.
3. Once the standard processing is done, the response is written to a 
‘response’ topic.
4. We only return an HTTP response once we see the corresponding response on 
the ‘response’ topic.

Basically:

        normal automated producer ——> request topic ——> processor (kafka 
consumer) ——> response topic

        request HTTP ——> request topic 
                <—— wait for corresponding event on response topic
        <—— respond HTTP

Mapping a request to its response would be done using a unique id (maybe the offset).
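To make the idea concrete, here is a minimal sketch of the correlation mechanism I have in mind. All names in it are hypothetical, and two in-memory BlockingQueues stand in for the 'request' and 'response' topics (real code would use KafkaProducer/KafkaConsumer against a broker). One assumption worth flagging: the offset may not work well as the correlation id, because the response record gets its own offset on the 'response' topic, so this sketch attaches a generated UUID to the record instead.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.*;

// Sketch only: BlockingQueues simulate the two Kafka topics so the
// correlation logic can be shown self-contained.
public class CorrelationSketch {
    // Hypothetical record type: a correlation id plus a payload.
    record Event(String correlationId, String payload) {}

    static final BlockingQueue<Event> requestTopic = new LinkedBlockingQueue<>();
    static final BlockingQueue<Event> responseTopic = new LinkedBlockingQueue<>();

    // Pending HTTP requests, keyed by correlation id.
    static final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    // Steps 1 and 4: the HTTP handler publishes the request, then
    // blocks until the matching event appears on the response topic.
    static String handleHttpRequest(String body) throws Exception {
        String id = UUID.randomUUID().toString();
        CompletableFuture<String> future = new CompletableFuture<>();
        pending.put(id, future);
        requestTopic.put(new Event(id, body));      // step 1: write to 'request'
        return future.get(5, TimeUnit.SECONDS);     // step 4: wait, with a timeout
    }

    // Steps 2 and 3: the standard processor consumes requests and
    // writes responses carrying the same correlation id.
    static void runProcessor() {
        while (true) {
            try {
                Event req = requestTopic.take();
                responseTopic.put(new Event(req.correlationId(), "processed:" + req.payload()));
            } catch (InterruptedException e) { return; }
        }
    }

    // Response listener: completes the future whose id matches.
    static void runResponseListener() {
        while (true) {
            try {
                Event resp = responseTopic.take();
                CompletableFuture<String> f = pending.remove(resp.correlationId());
                if (f != null) f.complete(resp.payload());
            } catch (InterruptedException e) { return; }
        }
    }

    public static void main(String[] args) throws Exception {
        Thread p = new Thread(CorrelationSketch::runProcessor);
        Thread l = new Thread(CorrelationSketch::runResponseListener);
        p.setDaemon(true); l.setDaemon(true);
        p.start(); l.start();
        System.out.println(handleHttpRequest("order-42"));  // processed:order-42
    }
}
```

In the real system the response listener would be a consumer loop on the 'response' topic, and the timeout on `future.get` decides what HTTP status to return when the processing takes too long.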

I am not sure whether this is a good approach or the right way of doing things.
All thoughts are welcome.

Regards,
Bruno Rassaerts | Freelance Java Developer

Novazone, Edingsesteenweg 302, B-1755 Gooik, Belgium
T: +32(0)54/26.02.03 - M: +32(0)477/39.01.15
[email protected] - www.novazone.be


