Can you send the input to a logger sink and pipe that through snappy to unpack it?
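Before unpacking anything, it can help to confirm that the files on HDFS really are snappy-compressed SequenceFiles. A minimal sketch, assuming the standard Hadoop SequenceFile header layout ("SEQ" magic, version byte, key/value class names, compression flags, codec class name) and class names short enough that the length fits in one byte — the synthetic header below is illustrative, not taken from the thread:

```python
# Sketch: parse a Hadoop SequenceFile header to check whether the file is
# snappy-compressed. Header layout (per Hadoop's SequenceFile format):
#   "SEQ" magic, 1-byte version, key class name, value class name,
#   compression flag, block-compression flag, codec class name if compressed.
# Class names are length-prefixed UTF-8; for names under 128 bytes the
# Hadoop vint length is a single byte, which is all this sketch handles.
import io
import struct

def read_short_string(buf):
    """Read a length-prefixed UTF-8 string (single-byte vint length only)."""
    (length,) = struct.unpack("B", buf.read(1))
    return buf.read(length).decode("utf-8")

def parse_seqfile_header(data):
    buf = io.BytesIO(data)
    if buf.read(3) != b"SEQ":
        raise ValueError("not a SequenceFile")
    version = buf.read(1)[0]
    key_class = read_short_string(buf)
    value_class = read_short_string(buf)
    compressed = buf.read(1) == b"\x01"
    block_compressed = buf.read(1) == b"\x01"
    codec = read_short_string(buf) if compressed else None
    return {
        "version": version,
        "key_class": key_class,
        "value_class": value_class,
        "compressed": compressed,
        "block_compressed": block_compressed,
        "codec": codec,
    }

def text(s):
    """Encode a short string the way the header stores class names."""
    b = s.encode("utf-8")
    return bytes([len(b)]) + b

# Synthetic header for a record-compressed snappy file, just to exercise the
# parser; in practice you would read the first few hundred bytes of one of
# the *.snappy* files from HDFS instead.
header = (b"SEQ\x06"
          + text("org.apache.hadoop.io.LongWritable")
          + text("org.apache.hadoop.io.BytesWritable")
          + b"\x01\x00"
          + text("org.apache.hadoop.io.compress.SnappyCodec"))

info = parse_seqfile_header(header)
print(info["codec"])  # org.apache.hadoop.io.compress.SnappyCodec
```

If the codec in the real header is the SnappyCodec, the loader side is the place to look; if the file is not compressed at all, the sink configuration is.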
Another task could be to disable snappy and check the plain files to see whether the values are missing there too, because snappy only compresses the data.

regards,
Alex

On Sep 8, 2012, at 3:11 AM, Mohit Anchlia <[email protected]> wrote:

> I am using an HDFS sink with snappy compression. After I write data and read
> from the HDFS file I only see the keys but not the data. How can I debug this
> with Flume?
>
> DEFINE SequenceFileLoader
>     org.apache.pig.piggybank.storage.SequenceFileLoader();
>
> A = LOAD '/flume_vol/flume/2012/09/07/17/dslg1/*.snappy*' USING
>     SequenceFileLoader AS (key:long, document:bytearray);
>
> DUMP A;
>
> (1347065726395,)
> (1347065726395,)
> (1347065726395,)
> (1347065726395,)
> (1347065726395,)
> (1347065726395,)
> (1347065726396,)
> (1347065726396,)
> (1347065726396,)
> (1347065726396,)
> (1347065726396,)

--
Alexander Alten-Lorenz
http://mapredit.blogspot.com
German Hadoop LinkedIn Group: http://goo.gl/N8pCF
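To try the disable-snappy test, the HDFS sink can be configured without a codec. A sketch of the relevant Flume 1.x properties, assuming an agent named "agent1", a sink named "hdfs1", and a channel "ch1" (these names are placeholders, not from the thread):

```
# HDFS sink writing uncompressed SequenceFiles; the hdfs.codeC line is
# simply left out (or commented) to disable compression.
agent1.sinks.hdfs1.type = hdfs
agent1.sinks.hdfs1.channel = ch1
agent1.sinks.hdfs1.hdfs.path = /flume_vol/flume/%Y/%m/%d/%H/dslg1
agent1.sinks.hdfs1.hdfs.fileType = SequenceFile
# To re-enable compression later:
# agent1.sinks.hdfs1.hdfs.codeC = snappy
```

Once a file is written uncompressed, `hadoop fs -text <file>` (or the Pig script above) should show the values; if the `document` field is still empty, the problem is upstream of the compression step.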
