Hello.

First, thanks so much to the devs for making such a great tool available as
open-source software.

I'm considering applying Kafka's new feature, Kafka Streams, in my simple
handler application, which receives monitoring data from Collectd and
produces transformed messages to the Kafka broker(s). For example, I'd like
to transform a Collectd message like,

[{"values":[1901474177],"dstypes":["counter"],"dsnames":["value"],"time":1280959128,"interval":10,"host":"leeloo.octo.it","plugin":"cpu","plugin_instance":"0","type":"cpu","type_instance":"idle"}]

to my customized alarm message like,

{"alarmMsgType":"threshold","time":1459436400000,"host":"leeloo.octo.it","category":"CPU","type":"IDLE","detail":"0","alarmLevel":"critical","message":"cpu error","value":"1901474177"}

Of course, the transformed message must be sent back to the Kafka broker(s).

The problem is that the messages from Collectd are JSON-formatted, so the
Kafka Streams processing seems like it would become complicated, i.e., each
message would need to be parsed from a String into JSON, and serialized back
to a String after the transformation.
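To make the transform concrete, here is the round trip I have in mind,
sketched in Python (in Kafka Streams itself this would live inside a
mapValues step on the JVM). The field mapping and the threshold rule for
alarmLevel are just my assumptions for illustration, not anything Collectd
or Kafka defines:

```python
import json

# Hypothetical threshold: above this, the CPU counter is flagged critical.
CPU_COUNTER_THRESHOLD = 1_000_000_000

def transform(raw: str) -> str:
    """Parse one Collectd JSON message and emit the alarm message as JSON.

    Collectd sends a JSON array of value lists; this takes the first entry.
    """
    record = json.loads(raw)[0]                       # String -> dict
    value = record["values"][0]
    alarm = {
        "alarmMsgType": "threshold",
        "time": record["time"] * 1000,                # seconds -> milliseconds
        "host": record["host"],
        "category": record["plugin"].upper(),         # "cpu"  -> "CPU"
        "type": record["type_instance"].upper(),      # "idle" -> "IDLE"
        "detail": record["plugin_instance"],
        "alarmLevel": "critical" if value > CPU_COUNTER_THRESHOLD else "normal",
        "message": f"{record['plugin']} error",
        "value": str(value),
    }
    return json.dumps(alarm)                          # dict -> String
```

So the parse/serialize step is only a few lines; the same shape would hold
with a JSON library like Jackson on the JVM side.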

Is Kafka Streams suitable for this kind of application?

Any better ideas or comments would also be really helpful. Thanks in
advance!

Best regards

KIM
