Hi All,
I have just implemented a Flume agent with the configuration below.
Configuration
# example.conf: A single-node Flume configuration
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44440

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.fileSuffix = .txt
a1.sinks.k1.hdfs.rollSize = 1048576
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.rollInterval = 0
a1.sinks.k1.hdfs.batchSize = 1000
a1.sinks.k1.hdfs.minBlockReplicas = 1
a1.sinks.k1.hdfs.path = hdfs://localhost:9000/flume/MemoryChannel/Avro

# Using the file channel
a1.channels.c1.type = file
a1.channels.c1.capacity = 1000000
a1.channels.c1.transactionCapacity = 10000

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
Now I am sending batches of 1000 events to the Flume Avro source, with the UID in each event incremented by one. As per my configuration, the HDFS sink creates text files of 1 MB each, plus a file with a .tmp extension (the file currently being written to).
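(For reference, the sender is roughly like the sketch below. This is not my exact client, just a minimal example built on Flume's RpcClient API; the class name, the number of batches, and the event body format are illustrative, while the host/port come from the config above.)

import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.List;

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

// Illustrative sender: pushes batches of 1000 events, each body carrying one
// incrementing UID, to the Avro source configured above (localhost:44440).
// The agent itself is started separately, typically with
// "flume-ng agent --conf conf --conf-file example.conf --name a1".
public class UidSender {
    public static void main(String[] args) throws EventDeliveryException {
        RpcClient client = RpcClientFactory.getDefaultInstance("localhost", 44440);
        try {
            long uid = 1;
            for (int batch = 0; batch < 20; batch++) {          // 20 batches = 20,000 events (illustrative)
                List<Event> events = new ArrayList<Event>(1000);
                for (int i = 0; i < 1000; i++) {
                    events.add(EventBuilder.withBody(
                            Long.toString(uid++), Charset.forName("UTF-8")));
                }
                client.appendBatch(events);   // returns only after the source acknowledges the batch
            }
        } finally {
            client.close();
        }
    }
}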
Now I stop the Flume agent and start it again. Below are my two expectations when starting the Flume agent again:
1. The agent will resend events starting from the one after the last successfully received event (in my case the .tmp file has the event with UID 12000 as its last entry, so the next event should be the one with UID 12001). But what actually happens is that it starts with the event with UID 12500; the events from 12001 to 12499 are completely lost.
2. The agent will resume appending events to the file where it last left off, i.e. the file that was not completed (the one with the .tmp extension). But the agent did not resume appending to that file; it created a new text file and started appending to it.
Can anyone explain why my two expectations failed?
Also, the files keep their .tmp extension: once I stop the agent, it does not remove the extension. Does anyone know why this is happening?
Regards,
Mahendran