Hello,

I was hoping I could get some input from the experts.
We are currently discussing an architecture in which Raspberry Pi-sized
embedded computers, connected via LTE (southbound), collect data that we
then need to forward from our system to different customer systems
(northbound).

A colleague suggested using Kafka as the interface for both the consumers
and producers in this scenario. In the proposed design we would have 6+
Kafka topics per app instance, and a small number of customers, each with
access to a subset of those topics. The customers must not be able to see
each other's data. The data volume on those topics would probably be well
below 100 MB per month across all topics combined.
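Just to put that figure in perspective, here is a quick sanity check of the
sustained throughput it implies (a 30-day month is an approximation):

```python
# Back-of-the-envelope throughput for the ~100 MB/month figure above.
MB = 1024 * 1024
seconds_per_month = 30 * 24 * 3600  # approx. one month

bytes_per_sec = 100 * MB / seconds_per_month
print(round(bytes_per_sec))  # roughly 40 bytes/s across all topics combined
```

In other words, the sustained rate is on the order of tens of bytes per
second, spread over all topics.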

I have three points of concern:
1. It seems to me that although Kafka can authenticate clients via SASL
(and restrict them with ACLs), this is not really a use case Kafka is
intended for, and that it would make sense to put a facade in front of
Kafka rather than have the apps and the customers connect to the Kafka
cluster directly (similar to Kafka Connect).
2. I had a quick look at the Kafka protocol, and it doesn't seem to be
designed with unreliable mobile networks in mind; rather, it assumes the
Kafka brokers and the producers sit in the same data center.
3. The topic structure, with many topics each carrying only a small amount
of low-velocity data, also seems like a bad fit to me.
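For context on point 1: as far as I understand, the per-customer isolation
in the proposed design could in principle be expressed with Kafka's
built-in ACLs, e.g. one prefixed read ACL per customer (the topic prefix
and principal names below are made up for illustration):

```shell
# Grant customerA read access only to topics prefixed with its own name.
# Assumes SASL authentication is configured, so the client maps to
# principal User:customerA; repeat per customer.
bin/kafka-acls.sh --bootstrap-server broker:9092 \
  --add \
  --allow-principal User:customerA \
  --operation Read \
  --topic app1.customerA. \
  --resource-pattern-type prefixed
```

So the isolation itself seems technically possible; my concern is more
whether managing this directly on the cluster, instead of behind a facade,
is what Kafka is intended for.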

It would be great to get some feedback from you to know whether my
concerns are justified.

Best Regards and thanks for your support,
Chris
