Jeff:

Thank you for your explanation.
The problem was indeed basic, but it is solved now.
I asked the University where my wife works for permission to use a static IP 
address. But next Sunday I will be away for two weeks, and I want to continue 
learning Hadoop from the hotel, where only wireless DHCP addresses are 
available.

Thanks to Mohammad and Harsh for their time too.

Cristián.


Date: Tue, 14 Aug 2012 16:24:39 -0700
Subject: Re: Hello! - Hadoop: System Requirements.
From: jeffsilver...@google.com
To: user@hadoop.apache.org

Cristian,
You have a basic network problem.  You have a single name, RHEL, which points 
to two IP addresses, 10.9.6.160 and 10.9.0.188.  That won't work.  The 
/etc/hosts file is searched sequentially, so a lookup always finds the first 
occurrence of RHEL.
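The first-match behavior Jeff describes can be sketched like this (the file contents here are hypothetical stand-ins, not your actual /etc/hosts):

```python
# Sketch of how a resolver scans a hosts file top to bottom and
# stops at the first line whose name matches (hypothetical entries).
def lookup(hosts_text, name):
    for line in hosts_text.splitlines():
        line = line.split("#")[0].strip()  # drop comments and blanks
        if not line:
            continue
        ip, *names = line.split()
        if name in names:
            return ip                      # first match wins
    return None

hosts = """\
10.9.6.160 RHEL
10.9.0.188 RHEL
"""
print(lookup(hosts, "RHEL"))  # always 10.9.6.160; the second entry is never reached
```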

By default, any process that listens on all interfaces will also listen on the 
loopback interface (127.0.0.1).
You have an additional problem: wherever you go, your IP address is going to 
change.  There is a document on the subject, RFC 1918.  Basically, any IP 
address that begins with 10., 172.16 through 172.31, or 192.168 is a private 
address.  You're getting the 10.9.6.160 and 10.9.0.188 addresses from the 
network, and that's unusual but perfectly legitimate.
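For reference, Python's standard ipaddress module knows the RFC 1918 ranges (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16), so you can check an address quickly:

```python
import ipaddress

# is_private is True for the RFC 1918 ranges, False for public addresses.
for addr in ["10.9.6.160", "10.9.0.188", "172.16.0.1", "8.8.8.8"]:
    print(addr, ipaddress.ip_address(addr).is_private)
```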

If you are only going to use these two addresses, then what you can do is add 
the following to your /etc/hosts file:
# wireless
10.9.6.160 RHEL6_wireless
# wired
10.9.0.188 RHEL6_wired


When your systems attempt to connect to the wired IP address while you are 
running in wireless mode, the connection attempt will fail and the map/reduce 
software won't send any work to that node.  Similarly, the connection will 
fail if you attempt to connect to the wireless IP address while you are wired.
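You can see that failure mode with a plain socket connect. In this sketch a deliberately closed local port stands in for the unreachable wired/wireless address (the port is chosen at runtime and is purely illustrative):

```python
import socket

# Find a port that is closed by binding and releasing it, then try to
# connect to it -- standing in for the IP of the interface that is down.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
dead_port = probe.getsockname()[1]
probe.close()

s = socket.socket()
s.settimeout(2)
try:
    s.connect(("127.0.0.1", dead_port))
    result = "connected"
except OSError as e:
    result = "connection failed: %s" % e  # what Hadoop sees on the wrong network
finally:
    s.close()
print(result)
```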


Jeff Silverman
Google


                                          
