Hi Apache Kafka developers!

Recently I've become more interested in the difference between APIs and messaging 
in a distributed backend. To me, it seems like event streaming is very fast and 
is ideal for hooking up all of the non-public-facing internal systems of a 
project, allowing for great decoupling as well, whereas APIs excel at providing 
interfaces for external-facing systems but are a bit slower.

However, in my experience I've worked with tools like GraphQL, which also 
provide great decoupling through the supergraph. GraphQL also provides 
end-to-end type safety, something that seems to be notably lacking from all 
major messaging systems (as far as I know). It might be slower, but it's 
really hard to ignore the thriving ecosystem around API client generation (even 
for REST/OpenAPI) and tooling that messaging seems to lack.
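
To make concrete what I mean by end-to-end type safety on the API side, here is 
a minimal Java sketch. The types below (CreateOrderRequest, Order, OrdersApi) 
are purely illustrative stand-ins for what an OpenAPI or GraphQL code generator 
typically produces, not any particular project's API:

    // Illustrative stand-ins for generated client code: the request and
    // response shapes are checked by the compiler, not by convention.
    record CreateOrderRequest(String customerId, long totalCents) {}
    record Order(String id, String customerId, long totalCents) {}

    interface OrdersApi {
        Order createOrder(CreateOrderRequest request); // typed request, typed response
    }

    class Demo {
        static void placeOrder(OrdersApi ordersApi) {
            // Renaming a field in the schema regenerates these types and breaks
            // this call at compile time, rather than failing at runtime.
            Order created = ordersApi.createOrder(new CreateOrderRequest("c-42", 1999L));
            System.out.println("Created order " + created.id());
        }
    }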

What happened? Are end-to-end type safety and client generation tooling just not 
much of a problem in messaging land? I somehow doubt it. Or have APIs won the 
backend mindshare because, when you don't need high throughput, it's easier for 
most SMBs/SMEs to just use APIs than messaging? That seems... lackluster. I 
don't see much of a technical challenge there.

I wanted to get some perspective from the developers of Kafka on how they see 
this ecosystem and these trends. No major players in messaging seem to offer 
end-to-end message type safety; MassTransit, SignalR, and Kafka are a few I've 
looked at for solutions, but I haven't found anything.
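
To show the gap I mean, here is a minimal sketch with the plain Kafka Java 
client (topic name and payload are made up for illustration). The generics on 
the producer only reflect the serializers I configured locally; nothing ties 
the bytes on the topic to a contract that a consumer's compiler can check:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class UntypedOrderProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            // <String, String> only describes the local serializer choice.
            // Nothing stops me from publishing JSON that downstream consumers
            // can't parse; the contract lives in docs and convention, not in
            // the compiler.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("orders", "c-42",
                        "{\"customerId\":\"c-42\",\"totalCents\":1999}"));
            }
        }
    }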

And sharing a common code base for message types doesn't work when the 
distributed services aren't all in the same language.

I'd love to hear some feedback on this,

Sincerely puzzled and interested,

Liam Gryphon
