Take a look at kafka-connect-spooldir and see if it meets your needs.

https://www.confluent.io/connector/kafka-connect-spooldir/

This connector monitors a directory and picks up any new files that are
created. It's great for picking up batch files, parsing them, and publishing
each line as if it were being published in real time.
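
As a rough sketch, a standalone worker connector config for the
line-delimited spooldir connector might look something like this (property
names follow the kafka-connect-spooldir docs; the paths, topic name, and
file pattern below are just placeholders, so check the connector
documentation for your version):

  name=spooldir-logs-source
  connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirLineDelimitedSourceConnector
  tasks.max=1
  # directory the connector watches for new files
  input.path=/var/logs/incoming
  # where files are moved after being processed, or on failure
  finished.path=/var/logs/finished
  error.path=/var/logs/error
  # each rotated file (log_Day0, log_Day1, ...) is picked up as a new file
  input.file.pattern=log_Day.*
  topic=logs

Because every rotated file matching the pattern is treated as a brand-new
file, the daily rename you describe below shouldn't be a problem.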

-hans

> On Mar 15, 2019, at 7:52 AM, Pulkit Manchanda <pulkit....@gmail.com> wrote:
> 
> Hi All,
> 
> I am building a data pipeline to send logs from one data source to the
> other node.
> I am using Kafka Connect standalone for this integration.
> Everything works fine, but the problem is that on Day 1 the log file is
> renamed to log_Day0 and a new log file log_Day1 is created.
> My Kafka Connect doesn't process the new log file.
> Looking for a solution. Any help is appreciated.
> 
> Thanks
> Pulkit
