Hello Shagun,

        Make sure you have set the required config parameters correctly.
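For reference, the minimal pseudo-distributed configuration from the Hadoop 0.20.x quickstart looks roughly like the sketch below (file names and property values as in those docs; adjust the host and ports if your setup differs):

```xml
<!-- conf/core-site.xml: address the NameNode listens on -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/hdfs-site.xml: single node, so keep one replica -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml: JobTracker address -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```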
Also, change the line containing "127.0.1.1" to "127.0.0.1" in your
/etc/hosts file, and if you are using an FQDN, add your VM's hostname
along with its IP to /etc/hosts as well.
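For example, the resulting /etc/hosts entries might look like this (the hostname "hadoop-vm" and the IP are placeholders; use your VM's actual values):

```
127.0.0.1      localhost
192.168.1.10   hadoop-vm.example.com   hadoop-vm
```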

If you still face any issue, have a look at this link:
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html

HTH

Warm Regards,
Tariq
https://mtariq.jux.com/


On Tue, Jan 15, 2013 at 1:12 PM, Yuva Raj raghunapu
<monkey2c...@gmail.com> wrote:

> Check this out, it may be helpful. It illustrates the setup on Ubuntu, but
> the steps should be almost the same.
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/
>
> Shagun Bhardwaj <shagun...@gmail.com> wrote:
>
> Hi,
>
>
> I am not able to install Apache Hadoop in pseudo-distributed mode on a
> CentOS operating system (which I have installed in VMware Player).
>
> I am following the official Hadoop installation guide (
> http://hadoop.apache.org/docs/r0.20.2/quickstart.html ) and have been
> able to run it in standalone mode. The next step I tried was to run it in
> pseudo-distributed mode. I followed the steps from the above docs,
> but the statement below resulted in an exception:
>    $ bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
>
> The above statement keeps running until I issue a bin/stop-all.sh command
> from another terminal to kill the process, and in the output I see a
> *java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
> exception.*
>
>  Please advise on what could be causing this exception.
>
>  Regards,
>
> Shagun Bhardwaj
>
