Hi, all. It's looking like some of the machines I wanted to feed into Heka are too old to run Heka. Most of the data I wanted from them is in /var/log. One of the reasons I wanted to use Heka is that it's supposed to do an exceptional job with log file handling, so that's particularly tragic. ;-) Now I'm trying to figure out how to get the log file data off those machines and into Heka.
Is there a tool comparably robust to Heka for watching log files and shipping them elsewhere? Most likely we'll be standing up Kafka as a standard buffer for (probably) all of our log and other event data before it reaches Heka, so as long as the solution delivers to either Heka or Kafka, it should be fine.

I was looking at the stdin/stdout <https://cwiki.apache.org/confluence/display/KAFKA/Clients#Clients-stdin/stdout> Kafka client as a simple way to send data from Linux machines to Kafka, but that doesn't address the log file watching problem. There's rsyslog <http://www.rsyslog.com/doc/master/configuration/modules/imfile.html>, though its imfile module doesn't claim to be infallible when log files are rotated. Since rsyslog is already standard on all our RHEL boxes, it seems like the solution to beat. Any other ideas?

FYI, I also looked at deploying Heka to those older RHEL boxes in a Docker container, but the machines that are too old to run Heka are also too old to run Docker.

TIA, Ali
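And since rsyslog on older RHEL means rsyslog 5, here's a minimal imfile sketch in the legacy directive format that version understands. The file name, tag, and forwarding target are all placeholders for whatever host feeds Heka or Kafka:

```
$ModLoad imfile

# watch a hypothetical application log
$InputFileName /var/log/myapp.log
$InputFileTag myapp:
$InputFileStateFile stat-myapp
$InputFileSeverity info
$InputFileFacility local7
$InputRunFileMonitor

# forward everything over TCP (@@) to a placeholder gateway host
*.* @@log-gateway.example.com:514
```

The state file is what lets imfile remember its read position across restarts; rotation handling is where the docs hedge, since a file that is rotated and refilled between polls can lose lines.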
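For what it's worth, the usual bridge for the stdin/stdout client is tail -F (follow by name) piped into the console producer, since tail -F reopens the file after logrotate moves it aside. Here's a quick self-contained sanity check of that rotation behavior; the kafka-console-producer.sh line is commented out, and its broker and topic names are placeholders:

```shell
# In production this pipeline would feed the Kafka console producer, e.g.:
#   tail -F /var/log/myapp.log | kafka-console-producer.sh \
#       --broker-list kafka01:9092 --topic syslog   # placeholder names
# Below: verify that tail -F keeps reading across a simulated logrotate.
tmp=$(mktemp -d)
echo one > "$tmp/app.log"
tail -F "$tmp/app.log" > "$tmp/out" 2>/dev/null &
tailpid=$!
sleep 1
mv "$tmp/app.log" "$tmp/app.log.1"   # logrotate-style rename
echo two > "$tmp/app.log"            # the app writes to a fresh file
sleep 2
kill "$tailpid"
cat "$tmp/out"                       # both lines survive the rotation
```

The key is -F rather than -f: -f follows the file descriptor (which goes with the renamed file), while -F re-checks the name and reopens the new file.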
_______________________________________________ Heka mailing list [email protected] https://mail.mozilla.org/listinfo/heka

