Hello guys,

I’m trying to understand stream processing in a real-life scenario.
Yes, we know the story: systems produce data, huge amounts of it are sent to Kafka
(or another platform), Kafka distributes it into topics (sometimes enriching it
with other information), and ‘consumers’ use it in some way. Reports, fraud
detection, e-commerce recommendations… blah, blah, blah…
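
Just to make that one-way flow concrete, something like this minimal sketch
(Python with kafka-python; the topic name, servers and event fields are
placeholders I made up):

    import json
    from kafka import KafkaProducer, KafkaConsumer

    # Producer side: some system emits order events into a topic.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("orders", value={"order_id": 42, "amount": 99.90})
    producer.flush()

    # Consumer side: another system (reports, fraud detection, ...) reads them.
    consumer = KafkaConsumer(
        "orders",
        bootstrap_servers="localhost:9092",
        group_id="reporting",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        print("got event:", message.value)  # e.g. feed a report or a model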

But this is a fairly simple scenario – one system sends, another receives. No
interaction. Of course we get the usual benefits: ‘security’, ‘no single
database’, ‘scalability’… blah, blah, blah

But what if we need to design some interaction? For instance, hotel booking. We
can have several booking systems (several operators) – a reservation made by one
system for a room is visible to the others (they read the content of the topics).
But when a system places a reservation, it needs to know whether the room is
available or not, and it has to block other potential reservations to avoid
conflicts.
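
To make the conflict concrete, this is roughly what I imagine would be needed –
one process that serializes reservation requests per room and publishes
accept/reject decisions (a minimal Python sketch with kafka-python; the topic
names, keys and the in-memory state are my own assumptions, not a real design):

    import json
    from kafka import KafkaConsumer, KafkaProducer

    # Single 'arbiter' process: assuming requests are keyed by room_id, all
    # requests for the same room land in the same partition and are handled
    # one by one, so two operators cannot both win the same room.
    consumer = KafkaConsumer(
        "reservation-requests",
        bootstrap_servers="localhost:9092",
        group_id="reservation-arbiter",
        key_deserializer=lambda b: b.decode("utf-8"),
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    booked = set()  # rooms already reserved (in-memory state, sketch only)

    for msg in consumer:
        room_id, request = msg.key, msg.value
        if room_id in booked:
            result = {"room_id": room_id, "operator": request["operator"],
                      "status": "rejected"}
        else:
            booked.add(room_id)
            result = {"room_id": room_id, "operator": request["operator"],
                      "status": "accepted"}
        # All booking systems read this topic to see which reservations won.
        producer.send("reservation-results", value=result)

(Of course the booked set would have to live somewhere durable in reality – a
state store or a compacted topic – not in memory.)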

It looks like a single, central place with a ‘traditional’ API is still
required? At least to manage the essential information.
The same applies to e-commerce orders, stock availability and so on.
What’s the best system structure for ‘stream processing’ in such situations? 😊

Regards,

Mike
