Thanks Harsh J!

But I have never heard about binding services to localhost... How can I do that? Could 
you be so kind as to point me to a website or the like so I can learn “how to”?
As you already know... I am still a “newbie”... 

Thanks!


From: Harsh J 
Sent: Tuesday, August 14, 2012 7:57 PM
To: user@hadoop.apache.org 
Subject: RE: Hello! - Hadoop: System Requirements.

Can you not bind all your services, including Hadoop, to the localhost 
interface? That usually works for my pseudo instances.
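
In case a concrete example helps: for a pseudo-distributed Hadoop 1.x setup (current at the time of this thread), "binding to localhost" mostly comes down to pointing the NameNode and JobTracker addresses at localhost in the conf files. A minimal sketch, assuming the default ports:

```xml
<!-- conf/core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```

Since 127.0.0.1 never changes when you move between networks, the daemons stay reachable no matter what address DHCP hands out.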

On Aug 15, 2012 4:22 AM, "Cristian Carranza" <cristiancarranz...@hotmail.com> 
wrote:

  Thanks Mohammad...

  Yep, the problem is when the IP gets changed... And this will be the rule, since I 
travel a lot for business reasons (I am a quality consultant, hotels here and 
there...). Also, I am testing/learning now in order to give a recommendation to 
my BI manager in the near future... a single node is enough for now...

  I've added the wlan IP address to the /etc/hosts file, but it is still 
impossible to ping myself. It seems that I am "almost there", though:
  if I ping RHEL6 (the hostname), I get an error message (Destination 
host unreachable),
  but if I ping the wlan IP address directly, it works (although Hadoop refuses to 
start: "Failed to retrieve hostname/IP from RHEL6.ccet.ufrn.br"...) 

  I guess that an association between the RHEL6 hostname and these two IP 
addresses is needed, but I do not know whether that is possible...

  Here is the hosts file:

  Hosts:
  ---------------------------------------------------
  127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
  ::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
  # wireless
  10.9.6.160 RHEL6
  # wired
  10.9.0.188 RHEL6
  ------------------------------------------------------
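
One note on the file above: the resolver uses the first matching entry, so with both lines present, RHEL6 always resolves to 10.9.6.160, even on the wired network. For a single-node setup where the hostname only needs to resolve locally, one possible workaround (a sketch, not the only option) is to pin the hostname, including the FQDN from the error message, to loopback:

```
127.0.0.1   localhost localhost.localdomain RHEL6 RHEL6.ccet.ufrn.br
::1         localhost localhost.localdomain
```

That way "ping RHEL6" and Hadoop's hostname lookup both succeed regardless of which interface is up.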


  Thanks again!





------------------------------------------------------------------------------
  From: donta...@gmail.com
  Date: Wed, 15 Aug 2012 00:26:01 +0530
  Subject: Re: Hello! - Hadoop: System Requirements.
  To: user@hadoop.apache.org

  Hello Cristian, 


        No question is "dull"... I also do the same thing when I am stuck. Now, the 
reason behind your problem is that when you switch from wired to wireless, the IP 
gets changed. Just use the "ifconfig" command to get the IP and paste it into your 
hosts file along with your hostname. BTW, are you using Hadoop in pseudo or fully 
distributed mode? Also, if you need to do this switching quite often, just keep two 
sets of configuration files.
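
The ifconfig-and-paste step above can be sketched as a small script. All the names here are mine (hypothetical), it assumes RHEL 6's `ifconfig` output format ("inet addr:..."), and the final copy into /etc/hosts needs root:

```shell
#!/bin/sh
# Hypothetical sketch: keep the RHEL6 line of a hosts file in sync with
# the current DHCP address of an interface. Written as pure text filters
# so the same logic works for wlan0 or eth0.

# Extract the first IPv4 address from `ifconfig <iface>` output.
# RHEL 6's ifconfig prints lines like "inet addr:10.9.6.160  Bcast:...".
extract_ip() {
    sed -n 's/.*inet addr:\([0-9.]*\).*/\1/p' | head -n 1
}

# Read a hosts file on stdin, drop any old RHEL6 lines, and append a
# fresh entry for the address given as $1.
rewrite_hosts() {
    grep -Ev '[[:space:]]RHEL6([[:space:]]|$)' -
    printf '%s RHEL6 RHEL6.ccet.ufrn.br\n' "$1"
}

# Typical use as root, after switching networks:
#   ip=$(ifconfig wlan0 | extract_ip)
#   rewrite_hosts "$ip" < /etc/hosts > /tmp/hosts.new
#   cp /tmp/hosts.new /etc/hosts
```

With two sets of Hadoop configuration files, the same idea applies: keep a conf.wired and a conf.wireless directory and switch between them when you change networks.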


  Regards, 
      Mohammad Tariq




  On Wed, Aug 15, 2012 at 12:16 AM, Cristian Carranza 
<cristiancarranz...@hotmail.com> wrote:

    Thanks again Mohammad!

    Please help me to go further with your advice, since I am not good at dealing 
with hostnames and network configuration...

    I am using a wired connection to the internet right now and can "ping 
myself" {ping <myhostname>} and IBI/Hadoop are running fine. 

    But if I change to a wireless connection, I can no longer ping myself, and 
Hadoop does not work either. 

    Having said that: what changes in the network configuration files (hostname) 
and in /etc/hosts are necessary in order to ping myself again, this time with a 
wireless DHCP IP address?

    Really hope that this is not a dull question... but any help will be much 
appreciated.


    Cristián.





----------------------------------------------------------------------------
    From: donta...@gmail.com
    Date: Tue, 14 Aug 2012 19:50:28 +0530 

    Subject: Re: Hello! - Hadoop: System Requirements.

    To: user@hadoop.apache.org 


    If you don't want to use a static IP, use the hostname everywhere in your 
configuration. But you need to modify the /etc/hosts file every time to reflect 
the changes. 
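
"Use the hostname everywhere" means the Hadoop conf files reference RHEL6 rather than a literal IP, so only /etc/hosts has to change when the address does. A sketch for Hadoop 1.x, assuming the default port:

```xml
<!-- conf/core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://RHEL6:9000</value>
  </property>
</configuration>
```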

    Regards, 
        Mohammad Tariq




    On Tue, Aug 14, 2012 at 7:43 PM, Cristian Carranza 
<cristiancarranz...@hotmail.com> wrote:

      Julien:

      Thanks for your prompt response. But...
      Is there a way to use Hadoop without a static IP address with a VM?

      Thanks!


      From: Julien Muller 
      Sent: Tuesday, August 14, 2012 11:05 AM
      To: user@hadoop.apache.org 
      Subject: Re: Hello! - Hadoop: System Requirements.

      If your purpose is learning / dev / demo, it would be a good idea to use a 
VM. 
      You will not only be able to use a static IP, but also keep copies of 
working systems, run some tests, and learn how to set up a cluster with only one 
physical machine. And much more.

      Julien

      2012/8/14 Cristian Carranza <cristiancarranz...@hotmail.com>

        Hello all in this list!

        Thank you Harsh J for your help! I decided to post it again in order to 
properly identify myself in the archive list.




        My name is Cristián and I am trying to learn Hadoop and to use it in the 
near future...

        I’ve started to learn Hadoop via Infosphere BigInsights (IBI), Basic 
Edition, from IBM.
        But I am facing problems now that make me wonder if there are 
alternative ways to learn Hadoop.

        The biggest problem is that IBI demands a static IP address, and this 
is a tough requirement for me, since I travel a lot for business and always use 
an IP address assigned via DHCP.
        Is there a way to use Hadoop without a static IP address?

        A more general question: what are the system requirements for 
installing Hadoop?

        Thank you in advance!

        Cristián Carranza.



