Recently we switched over from Memory Channel to File Channel, as Memory
Channel has some GC issues.
Occasionally with File Channel I see this exception:
org.apache.flume.ChannelException: Put queue for FileBackedTransaction of
capacity 5000 full, consider committing more frequently, increasing
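For context, the "capacity 5000" in that message refers to the channel's transactionCapacity setting, not its total capacity. A minimal sketch of the relevant File Channel configuration — the agent/channel names and paths here are placeholders, not taken from the thread:

```properties
# Hedged example: File Channel sizing knobs (names/paths are placeholders)
agent.channels.fileCh.type = file
agent.channels.fileCh.checkpointDir = /var/flume/checkpoint
agent.channels.fileCh.dataDirs = /var/flume/data
# Total events the channel may hold on disk
agent.channels.fileCh.capacity = 1000000
# Max events per transaction -- the limit the exception above is hitting
agent.channels.fileCh.transactionCapacity = 10000
```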
What source are you using? It looks like the source is writing 5K events in one
transaction.
Thanks,
Hari
On Tuesday, October 15, 2013 at 12:24 PM, Bhaskar V. Karambelkar wrote:
Recently we switched over from Memory Channel to File Channel, as Memory
Channel has some GC issues.
The source is an Avro Source, which gets events fed by a custom JVM application
using the Flume client SDK.
So, regarding the client SDK: if the batchSize property has been set to
1,000, but I pass, say, 10,000 events in the client.appendBatch(List<Event>) call,
what happens?
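For illustration, my understanding is that the RPC client splits an oversized list into sub-batches of at most batchSize, each sent as its own RPC call (and thus its own transaction at the source). A minimal, self-contained sketch of that splitting behavior — this is not the SDK's actual code, so verify against your Flume version:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplit {
    /** Split events into chunks of at most batchSize, as appendBatch is
     *  understood to do internally before each RPC call. */
    static <T> List<List<T>> split(List<T> events, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < events.size(); i += batchSize) {
            batches.add(events.subList(i, Math.min(i + batchSize, events.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 10,000 events with batchSize 1,000 -> ten sub-batches of 1,000
        List<Integer> events = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) events.add(i);
        List<List<Integer>> batches = split(events, 1_000);
        System.out.println(batches.size() + " batches of " + batches.get(0).size());
    }
}
```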
On Tue, Oct 15, 2013 at 3:54 PM,
Hi,
We are using CDH Flume 1.3 (the version that ships with CDH 4.2.1). We see this error in our
Flume logs in our production system, and restarting Flume did not help. Looking
at the Flume code, it appears to be expecting the byte to be an OPERATION, but
it is not. Any ideas on what happened?
Thanks,
~Rahul.
Looks like the file may have been corrupted. Can you verify whether you are out
of disk space, or whether you can see anything else that might have caused the data
to be corrupted?
Hari
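A quick way to check the disk-space condition mentioned above — a hedged shell sketch; the checkpoint and data paths are placeholders for whatever checkpointDir/dataDirs your agent is configured with:

```shell
# Hedged sketch: check free space on the filesystems backing the File Channel.
# These paths are placeholders; substitute your agent's configured directories.
CHECKPOINT_DIR=/var/flume/checkpoint
DATA_DIR=/var/flume/data

# Fall back to listing all filesystems if the placeholder dirs do not exist.
df -h "$CHECKPOINT_DIR" "$DATA_DIR" 2>/dev/null || df -h
# A filesystem at 100% use here is a likely cause of a truncated/corrupted log file.
```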
On Thu, Jun 27, 2013 at 6:41 AM, Rahul Ravindran rahu...@yahoo.com wrote:
From: Rahul Ravindran rahu...@yahoo.com
Sent: Thursday, June 27, 2013 11:24 AM
Subject: Re: Flume error in FileChannel