Not sure if this is the issue, but I've found it's easy for custom Flume
components like interceptors, sinks, sources, and serializers to swallow
exceptions.  You won't see any errors on standard out, and the Flume agent
will look like it's working even though it isn't doing anything.  When you
kill the agent it will then close your files in HDFS, and any data that was
in the channel before the error will get written out.  Do the Avro files
have all the data you expect them to have after you kill the Flume agent?
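
For what it's worth, the fix is usually just to log and rethrow inside the
component instead of catching and dropping.  Here's a rough sketch of a
sink's process() method doing that - the class name and messages are made
up, but the Channel/Transaction pattern is the standard Flume sink
skeleton:

    import org.apache.flume.Channel;
    import org.apache.flume.Event;
    import org.apache.flume.EventDeliveryException;
    import org.apache.flume.Transaction;
    import org.apache.flume.sink.AbstractSink;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class MyAvroHdfsSink extends AbstractSink {
        private static final Logger LOG =
            LoggerFactory.getLogger(MyAvroHdfsSink.class);

        @Override
        public Status process() throws EventDeliveryException {
            Channel channel = getChannel();
            Transaction txn = channel.getTransaction();
            txn.begin();
            try {
                Event event = channel.take();
                if (event == null) {
                    txn.commit();
                    return Status.BACKOFF;
                }
                // parse the binary body and append it to the Avro file here
                txn.commit();
                return Status.READY;
            } catch (Throwable t) {
                txn.rollback();
                // log before rethrowing so the failure shows up in the
                // agent's log instead of vanishing silently
                LOG.error("Failed to deliver event", t);
                throw new EventDeliveryException("Failed to deliver event", t);
            } finally {
                txn.close();
            }
        }
    }

The rollback puts the event back in the channel, and rethrowing makes the
SinkRunner back off and retry instead of silently spinning.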

Best Regards,

Ed


On Mon, Mar 3, 2014 at 4:11 AM, Himanshu Patidar <
[email protected]> wrote:

> I have a custom HDFS Sink which takes events, parses them (converts
> binary data to .avro files), and then writes these files to different
> directories in HDFS. When I do this I see a strange problem - only the
> last Avro file gets written to HDFS, and the rest of the files show a
> size of 0 KB until I kill my Flume agent (Ctrl + C). As soon as I kill
> the agent, I can see the data in the rest of the .avro files. I am using
> Flume 1.4 (CDH 4.0.0) with HDFS 2.0.0.
>
> Can anyone suggest a solution?
>
> Thanks,
> Himanshu
>
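
P.S. The 0 KB sizes are what HDFS reports for any file that is still open -
the NameNode doesn't update the length until the file is closed (or the
writer syncs), which is why everything appears the moment the agent dies
and its files get closed.  If your sink keeps its Avro writers open
forever, you'll see exactly this.  A rough sketch (the helper class is
made up, not Flume API) of flushing and closing a writer per batch so the
data lands right away:

    import java.io.IOException;

    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AvroBatchWriter {
        /** Write one batch to its own file and close it so HDFS
            reports the real size. */
        public static void writeBatch(FileSystem fs, Path path, Schema schema,
                                      Iterable<GenericRecord> records)
                throws IOException {
            FSDataOutputStream out = fs.create(path);
            DataFileWriter<GenericRecord> writer =
                new DataFileWriter<GenericRecord>(
                    new GenericDatumWriter<GenericRecord>(schema));
            writer.create(schema, out);
            try {
                for (GenericRecord record : records) {
                    writer.append(record);
                }
                writer.flush(); // push Avro's buffer into the HDFS stream
                out.hflush();   // make bytes readable by other clients now
            } finally {
                writer.close(); // only after close does -ls show the size
            }
        }
    }

If you'd rather keep files open across batches, you still have to roll
them eventually - that's what the stock HDFS sink's hdfs.rollInterval /
hdfs.rollCount / hdfs.rollSize settings are for.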
