Replace 127.0.1.1 with 127.0.0.1 in /etc/hosts. Also copy the
hadoop-core*.jar from your HADOOP_HOME and the commons-configuration jar
from HADOOP_HOME/lib into your HBASE_HOME/lib folder.
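The two steps above might look like this; a minimal sketch, assuming HADOOP_HOME and HBASE_HOME are set in your environment and that your distribution ships the jars under those names (adjust the glob patterns to match your Hadoop version):

```shell
# 1. Point the hostname entry at 127.0.0.1 instead of 127.0.1.1.
#    (Back up /etc/hosts first; this edits it in place.)
sudo sed -i 's/^127\.0\.1\.1/127.0.0.1/' /etc/hosts

# 2. Copy the Hadoop core jar and the commons-configuration jar into
#    HBase's lib directory so HBase uses the same Hadoop client classes.
#    Paths/globs are assumptions -- check what your install actually contains.
cp "$HADOOP_HOME"/hadoop-core*.jar "$HBASE_HOME/lib/"
cp "$HADOOP_HOME"/lib/commons-configuration-*.jar "$HBASE_HOME/lib/"
```

Restart HBase after copying the jars so the new classpath takes effect.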

Regards,
    Mohammad Tariq



On Tue, Apr 10, 2012 at 8:18 PM, shashwat shriparv
<dwivedishash...@gmail.com> wrote:
> Comment out 127.0.1.1, if present, in the /etc/hosts file. Check whether
> you can ssh to localhost.
>
> On Tue, Apr 10, 2012 at 6:38 PM, Dave Wang <d...@cloudera.com> wrote:
>
>> Shaharyar,
>>
>> Did you format the namenode ("hadoop namenode -format")?
>>
>> What do the namenode logs say?
>>
>> - Dave
>>
>> On Tue, Apr 10, 2012 at 6:00 AM, shaharyar khan <
>> shaharyar.khan...@gmail.com
>> > wrote:
>>
>> >
>> > When I try to start Hadoop, all of its services (TaskTracker,
>> > JobTracker, DataNode, SecondaryNameNode) are running except the
>> > NameNode. Since the NameNode is not up, HBase is unable to find
>> > Hadoop. Please guide me on why this is happening. I have checked all
>> > the configuration files, as well as the iptables / firewall
>> > configuration for that port; everything looks OK, but I cannot
>> > figure out why this is happening.
>> > --
>> > View this message in context:
>> > http://old.nabble.com/namenode-not-starting-tp33661242p33661242.html
>> > Sent from the HBase User mailing list archive at Nabble.com.
>> >
>> >
>>
>
>
>
> --
>
>
> ∞
> Shashwat Shriparv
