Inder, when did you build?

You may be running into https://issues.apache.org/jira/browse/FLUME-1098 which 
was checked-in yesterday.

You will know that is your problem if you have any Hadoop jars in your 
distribution's lib directory (there should not be any).
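
A quick way to check (run from the root of your Flume distribution; adjust the
path if your layout differs):

ls lib/ | grep -i hadoop

That should come back empty.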

On startup, Flume searches for Hadoop and uses the jars from the local Hadoop 
installation. So Hadoop must be installed, and the flume-ng start script has to 
be able to find it, in order to use the HDFS sink.
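
To confirm the start script can actually locate it, something along these lines
should work (assuming the hadoop binary is on your PATH, or HADOOP_HOME is set):

which hadoop
hadoop version

If neither of those resolves, the HDFS sink won't have the Hadoop client jars
on its classpath.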

Best,
Mike

On Apr 17, 2012, at 8:23 AM, Inder Pall wrote:

> Thanks, I didn't know that. Here is the log - http://pastebin.com/TQQc9TWU
> 
> Thanks,
> - Inder
> 
> On Tue, Apr 17, 2012 at 8:37 PM, Brock Noland <[email protected]> wrote:
> 
>> The mailing list doesn't like attachments, so yours was not included. I
>> would use pastebin or something similar.
>> 
>> On Tue, Apr 17, 2012 at 3:04 PM, Inder Pall <[email protected]> wrote:
>>> Brock,
>>> 
>>> Looking at the code, and from past usage, that wasn't an issue; however, I
>>> tried the suggested change,
>>> i.e. agent1.sinks.log-sink1.hdfs.path = /tmp/flume-data/, with no luck.
>>> 
>>> I'm attaching flume.log to this email for reference. The server does start up;
>>> however, connecting via telnet fails.
>>> Oddly, if I change the sink type to logger, it does work.
>>> 
>>> Not sure if i am missing something obvious here.
>>> 
>>> Thanks,
>>> - Inder
>>> 
>>> 
>>> 
>>> On Tue, Apr 17, 2012 at 8:17 PM, Brock Noland <[email protected]> wrote:
>>>> 
>>>> Hi,
>>>> 
>>>> inline
>>>> 
>>>>> On Tue, Apr 17, 2012 at 2:41 PM, Inder Pall <[email protected]> wrote:
>>>>> folks,
>>>>> 
>>>>> I am not sure whether anyone else is facing this or whether I am missing
>>>>> something here. Here are some observations -
>>>>> 
>>>>> 1. checkout flume trunk
>>>>> 2. provide the following config
>>>>> 
>>>>> # Define a memory channel called ch1 on agent1
>>>>> agent1.channels.ch1.type = memory
>>>>> 
>>>>> # Define an Avro source called avro-source1 on agent1 and tell it
>>>>> # to bind to 0.0.0.0:41414. Connect it to channel ch1.
>>>>> agent1.sources.avro-source1.type = avro
>>>>> agent1.sources.avro-source1.bind = 0.0.0.0
>>>>> agent1.sources.avro-source1.port = 41414
>>>> 
>>>> 1) Does localhost work?
>>>> 2) Can you paste the output of the following?
>>>> 
>>>> sudo /usr/sbin/lsof -i tcp | grep 41414
>>>> 
>>>>> agent1.sources.avro-source1.channels = ch1
>>>>> 
>>>>> # Define an HDFS sink that writes the events it receives to HDFS
>>>>> # and connect it to the other end of the same channel.
>>>>> agent1.sinks.log-sink1.type = hdfs
>>>>> agent1.sinks.log-sink1.channel = ch1
>>>>> agent1.sinks.log-sink1.hdfs.path = hdfs://localhost
>>>> 
>>>> I think this should be something like /user/noland/
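>>>>
>>>> e.g. something along these lines (the namenode host/port is only an example,
>>>> match it to whatever your fs.default.name points at):
>>>>
>>>> agent1.sinks.log-sink1.hdfs.path = hdfs://localhost:8020/user/noland/flume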
>>>> 
>>>>> 
>>>>> 
>>>>> # Finally, now that we've defined all of our components, tell
>>>>> # agent1 which ones we want to activate.
>>>>> agent1.sources = avro-source1
>>>>> agent1.sinks = log-sink1
>>>>> agent1.channels = ch1
>>>>> flume.cfg (END)
>>>>> 
>>>>> 3. start agent - ./flume-ng agent --conf ../conf/ -f flume.cfg -n agent1
>>>> 
>>>> Does flume.log tell you it's getting started up?
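>>>>
>>>> If the log isn't telling you much, running with console logging can help,
>>>> e.g. the same command with a log4j override added:
>>>>
>>>> ./flume-ng agent --conf ../conf/ -f flume.cfg -n agent1 -Dflume.root.logger=DEBUG,console
>>>>
>>>> so any startup errors show up directly on the console.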
>>>> 
>>>> --
>>>> Apache MRUnit - Unit testing MapReduce -
>>>> http://incubator.apache.org/mrunit/
>>> 
>>> 
>>> 
>>> 
>>> --
>>> Thanks,
>>> - Inder
>>>  Tech Platforms @Inmobi
>>>  Linkedin - http://goo.gl/eR4Ub
>> 
>> 
>> 
>> --
>> Apache MRUnit - Unit testing MapReduce -
>> http://incubator.apache.org/mrunit/
>> 
> 
> 
> 
> -- 
> Thanks,
> - Inder
>  Tech Platforms @Inmobi
>  Linkedin - http://goo.gl/eR4Ub
