Hi all,

I am considering adopting an "event sourcing" architecture for a system I
am developing and Kafka seems like a good choice of store for events.

For those who aren't aware, this architectural style consists of storing
every state change of the system as an ordered log of events and building
derivative views as needed for easier querying (in a SQL database, for
example). Those views must be derived entirely from the event log so that
the log effectively becomes the single source of truth.
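To make that concrete, here's a minimal in-memory sketch of the idea (no
Kafka involved, and all the names are made up): events are only ever
appended, and the derived view is rebuilt by folding over the log.

```javascript
// The append-only log is the single source of truth.
const log = [];

function append(event) {
  log.push(event); // events are immutable and only ever appended
}

// Rebuild the derived view from scratch by replaying every event.
function deriveView(events) {
  const balances = {};
  for (const e of events) {
    if (e.type === 'deposited') {
      balances[e.account] = (balances[e.account] || 0) + e.amount;
    } else if (e.type === 'withdrawn') {
      balances[e.account] = (balances[e.account] || 0) - e.amount;
    }
  }
  return balances;
}

append({ type: 'deposited', account: 'a1', amount: 100 });
append({ type: 'withdrawn', account: 'a1', amount: 30 });
// deriveView(log) → { a1: 70 }
```

The view can be thrown away and regenerated at any time, which is exactly
what makes the log the single source of truth.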

I was wondering if anyone else is using Kafka for that purpose and more
specifically:

1) Can Kafka retain messages permanently, i.e. with no retention limit?

2) Let's say I throw away my derived view and want to rebuild it from
scratch: can a consumer read a topic from its very first message and then,
once it has caught up, keep listening for new messages as it normally
would?
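Here's an in-memory sketch of the "replay, then tail" behavior I'm after
(invented names, nothing Kafka-specific): a subscriber starting from the
beginning first receives the full history, then keeps receiving live
messages.

```javascript
// A toy topic that supports subscribing from the very first message.
class Topic {
  constructor() {
    this.messages = [];
    this.subscribers = [];
  }
  publish(msg) {
    this.messages.push(msg);
    for (const fn of this.subscribers) fn(msg); // fan out to live listeners
  }
  subscribeFromBeginning(fn) {
    for (const msg of this.messages) fn(msg); // 1. catch up on history
    this.subscribers.push(fn);                // 2. then tail new messages
  }
}

const topic = new Topic();
topic.publish('event-1');
topic.publish('event-2');

const seen = [];
topic.subscribeFromBeginning((m) => seen.push(m));
topic.publish('event-3');
// seen → ['event-1', 'event-2', 'event-3']
```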

3) Does it support transactions? Let's say I want to publish 3 messages
atomically but the producer process crashes after sending only 2 of them:
is it possible to "roll back" the first 2 messages (i.e. "all or nothing"
semantics)?
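In other words, I'd like something like the following sketch, where
messages are staged and only become visible to consumers on commit (again,
purely illustrative, in memory):

```javascript
// A toy log with "all or nothing" publishing: staged messages only
// become visible to consumers once the whole batch is committed.
class TransactionalLog {
  constructor() {
    this.committed = [];
    this.staged = [];
  }
  send(msg) {
    this.staged.push(msg); // not visible to consumers yet
  }
  commit() {
    this.committed.push(...this.staged); // whole batch becomes visible
    this.staged = [];
  }
  abort() {
    this.staged = []; // producer crashed mid-batch: drop the partial batch
  }
}

const txLog = new TransactionalLog();
txLog.send('a');
txLog.send('b');
txLog.abort();   // crash after 2 of 3 messages: nothing becomes visible
txLog.send('x');
txLog.send('y');
txLog.send('z');
txLog.commit();  // all 3 become visible together
// txLog.committed → ['x', 'y', 'z']
```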

4) Does it support request/response semantics, or can they be simulated?
My system's primary interface with the outside world is an HTTP API, so it
would be nice if I could publish an event and wait for all the internal
services that need to process the event to be "done" before returning a
response.
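Roughly this flow, sketched in memory with invented names (the real
services would of course consume from the log rather than be called
directly):

```javascript
// The HTTP handler publishes an event, then waits until every interested
// service has reported "done" before responding.
function publishAndWait(event, services) {
  // Each service handles the event and resolves once it is done.
  return Promise.all(services.map((handle) => handle(event)));
}

// Two fake internal services standing in for real consumers.
const indexer = async (e) => `indexed:${e.id}`;
const mailer  = async (e) => `mailed:${e.id}`;

publishAndWait({ id: 42 }, [indexer, mailer]).then((results) => {
  // results → ['indexed:42', 'mailed:42']
  // ...now it's safe to send the HTTP response.
});
```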

PS: I'm a Node.js/Go developer, so where possible please avoid Java-centric
terminology.

Thanks!

- Oli

-- 

Olivier Lalonde
http://www.syskall.com <-- connect with me!
