[
https://issues.apache.org/jira/browse/FLUME-2940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15356753#comment-15356753
]
Attila Simon commented on FLUME-2940:
-------------------------------------
Hi [~yogeshbelur],
Please correct me if I'm wrong, but it seems to me that this request would
violate Flume's at-least-once delivery guarantee. If that is the case, I would
recommend implementing this code as a plugin so that it doesn't have to conform
to Flume's original design.
Removing old events from a channel sounds like a perfect task for a sink, which
is basically what a sink is responsible for. The only modification you need is
to drop the events read from the channel instead of forwarding/persisting them
when the channel is full and the source wants to add more. In theory this
requires shared knowledge between the source and the sink, but again that can
be part of a plugin.
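The drop-instead-of-forward idea could be sketched roughly like this against
Flume's public Sink API. This is a minimal sketch, not a shipped component: the
class name, the "batchSize" property, and the static drop-mode flag standing in
for the shared source/sink knowledge are all assumptions.

```java
import org.apache.flume.Channel;
import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.Transaction;
import org.apache.flume.conf.Configurable;
import org.apache.flume.sink.AbstractSink;

/**
 * Hypothetical sink that discards events taken from the channel whenever
 * "drop mode" is on, freeing channel capacity for new events from the source.
 * How the source signals channel fullness (shared flag, metric, etc.) is left
 * open here; a simple static flag stands in for that shared knowledge.
 */
public class DroppingSink extends AbstractSink implements Configurable {

    /** Shared flag a cooperating source could flip when the channel is full. */
    public static volatile boolean dropMode = false;

    private int batchSize = 100;

    @Override
    public void configure(Context context) {
        // "batchSize" is a made-up property name for this sketch.
        batchSize = context.getInteger("batchSize", 100);
    }

    @Override
    public Status process() throws EventDeliveryException {
        Channel channel = getChannel();
        Transaction txn = channel.getTransaction();
        try {
            txn.begin();
            int taken = 0;
            for (int i = 0; i < batchSize; i++) {
                Event event = channel.take();
                if (event == null) {
                    break;  // channel drained for now
                }
                taken++;
                if (!dropMode) {
                    // Normal path: forward/persist the event here.
                }
                // In drop mode we simply let the event go: committing the
                // transaction removes it from the channel for good.
            }
            txn.commit();
            return taken == 0 ? Status.BACKOFF : Status.READY;
        } catch (Throwable t) {
            txn.rollback();
            throw new EventDeliveryException("Failed to drain channel", t);
        } finally {
            txn.close();
        }
    }
}
```

Note that the at-least-once concern applies here too: once the transaction
commits, the dropped events are gone, which is exactly why this belongs in an
opt-in plugin rather than in the file channel itself.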
> File channel - delete old events making queue available for new events
> ----------------------------------------------------------------------
>
> Key: FLUME-2940
> URL: https://issues.apache.org/jira/browse/FLUME-2940
> Project: Flume
> Issue Type: Question
> Reporter: Yogesh BG
>
> I am using a Flume agent with an Avro source and a custom sink.
> I want to keep sending events to the agent through the Avro source while the
> sink is unavailable.
> Eventually the file-size limit or the capacity is reached; in both cases I
> have log-1, log-2, etc.
> At that point, when I send the next event, I get an exception. Instead, can I
> do something to delete old events and add new ones? Basically I want to clean
> up some old data to make room for new data.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)