I suggest mapping localhost to the actual IP in your /etc/hosts file
and rerunning the command.
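
For example, assuming the machine's address is 192.168.1.10 (a
placeholder; substitute your box's real IP), the /etc/hosts entry
might look something like this:

    # map the name used in fs.default.name to the real interface
    192.168.1.10    localhost

If the daemons were already running, you may also need to restart
them (bin/stop-all.sh, then bin/start-all.sh) so they pick up the
new address before you retry.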

Akshar



On Wed, May 14, 2008 at 9:13 AM, Shimon <[EMAIL PROTECTED]> wrote:

> Hi all,
>
> I've set up a standalone Hadoop server, and when I run
> bin/hadoop dfs namenode -format
>
> I get the following message (repeated 10 times):
>
> ipc.Client: Retrying connect to server: localhost/127.0.0.1:50000
>
>
>
>
> My hadoop-site.xml file is as follows:
>
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
>
>
> <configuration>
>
> <property>
>  <name>fs.default.name</name>
>  <value>localhost:50000</value>
> </property>
>
> <property>
>  <name>mapred.job.tracker</name>
>  <value>localhost:50001</value>
> </property>
>
>
> <property>
>  <name>hadoop.tmp.dir</name>
>  <value>/tmp/hadoop_storage</value>
> </property>
>
> <property>
>  <name>dfs.replication</name>
>  <value>1</value>
> </property>
>
>
> <property>
>  <name>mapred.map.tasks.speculative.execution</name>
>  <value>false</value>
> </property>
>
> <property>
>  <name>mapred.reduce.tasks.speculative.execution</name>
>  <value>false</value>
> </property>
>
> </configuration>
>
>
> Any help would be appreciated.
>
> Thanks a lot,
> Shimon
>
