The Avro Sink is used for communication between Flume agents. To write Avro
directly into HDFS, you simply use an Avro serializer with the HDFS sink.
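For example, something along these lines (the agent, sink, and path names are
just placeholders; note that hdfs.fileType must be DataStream so the
serializer, rather than the default SequenceFile container, controls the
on-disk format):

```
tier1.sinks.sink1.type = hdfs
tier1.sinks.sink1.hdfs.path = /flume/events
# DataStream hands the raw event stream to the serializer instead of
# wrapping each event in the default SequenceFile container
tier1.sinks.sink1.hdfs.fileType = DataStream
tier1.sinks.sink1.serializer = avro_event
```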
Thanks,
Hari
On Sunday, October 6, 2013 at 3:38 PM, Deepak Subhramanian wrote:
Hi Hari,
I tried using an Avro sink after the HTTPSource, followed by an Avro source
and an HDFS sink, and it seems to be working. Do we have to use an Avro sink
first, or can we convert to Avro directly with the HDFS sink?
Thanks, Deepak
On Sun, Oct 6, 2013 at 11:27 PM, Deepak Subhramanian <deepak.subhraman
There was a mistake in my configuration: I had the hdfs prefix in front of
serializer.
Changed
tier1.sinks.sink1.hdfs.serializer = avro_event
to tier1.sinks.sink1.serializer = avro_event
But it is still generating a sequence file. This is what I get.
SEQ!org.apache.hadoop.io.LongWritableorg.apache.had
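Maybe the remaining piece is hdfs.fileType? That SEQ magic looks like the
sink's default SequenceFile container, so perhaps something like this is
needed as well (untested guess):

```
# hdfs.fileType defaults to SequenceFile; DataStream lets the
# serializer (avro_event here) control the output format
tier1.sinks.sink1.hdfs.fileType = DataStream
tier1.sinks.sink1.serializer = avro_event
```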
I see that we use the legacy Thrift in 1.4, so it will only work with OG as
Hari stated.
When we are ready to update, I will get NG Thrift working. If anyone is
interested and would like to contribute NG code, I'll happily accept PRs :-)
cheers,
Lars
On Oct 6, 2013, at 6:07 PM, Hari Shreedharan wrote:
Thanks Hari for the feedback! It's already working against flume-ng (1.4), but
I'll regenerate the py-files from the Flume master source to stay current.
cheers.
On Oct 6, 2013, at 6:07 PM, Hari Shreedharan wrote:
Hi Lars,
Is this for Flume OG (0.9.x) or NG (1.x)? If this is for Flume 0.9.x,
please be aware that that version of Flume is no longer under development;
Flume 1.x is what is actively developed and supported. You can find the source
code here: https://github.com/apache/flume/tree/trunk, and the thrift IDL
Hello fellow Flume users!
I'd like to announce a Python log handler that sends events to Flume.
https://github.com/lsjostro/flumelogger
It can also be installed with pip:
$ pip install flumelogger
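If you just want the general shape of such a handler without pulling in
Thrift, here is a minimal stdlib-only sketch along the same lines that posts
JSON events to a Flume 1.x HTTPSource (the class name, port, and header keys
below are illustrative assumptions, not flumelogger's actual API):

```python
import json
import logging
import urllib.request

def to_flume_events(messages, headers=None):
    """Build the JSON payload Flume's default HTTPSource JSONHandler
    expects: a JSON array of {"headers": ..., "body": ...} objects."""
    headers = headers or {}
    return json.dumps([{"headers": headers, "body": m} for m in messages])

class FlumeHTTPHandler(logging.Handler):
    """Send each log record to a Flume HTTPSource as a single event."""

    def __init__(self, url="http://localhost:5140"):
        super().__init__()
        self.url = url  # HTTPSource bind address/port (assumption)

    def emit(self, record):
        payload = to_flume_events([self.format(record)],
                                  headers={"logger": record.name})
        req = urllib.request.Request(
            self.url, data=payload.encode("utf-8"),
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=2)
        except OSError:
            self.handleError(record)
```

flumelogger itself speaks Thrift, so see its README for the real usage.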
Have fun!
Cheers,
Lars