The NameNode is stopping automatically!

On Tue, Jun 23, 2009 at 10:29 PM, bharath vissapragada <
bharathvissapragada1...@gmail.com> wrote:

> It worked fine when I updated the /etc/hosts file (on all the slaves) and
> wrote the fully qualified domain name in hadoop-site.xml.
>
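For reference, a minimal sketch of what that /etc/hosts change might look like. The name "master.example.com" is hypothetical; only the IP 10.2.24.21 comes from the logs in this thread. The same entry would go on the master and on every slave:

    # /etc/hosts (the FQDN shown here is made up)
    10.2.24.21   master.example.com   master

fs.default.name in hadoop-site.xml would then use that same name; a sketch of that property appears further down, after Matt's suggestion.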
> It worked fine for some time, then started giving a new error:
>
> 09/06/23 22:21:49 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 0 time(s).
> 09/06/23 22:21:50 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 1 time(s).
> 09/06/23 22:21:51 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 2 time(s).
> 09/06/23 22:21:52 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 3 time(s).
> 09/06/23 22:21:53 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 4 time(s).
> 09/06/23 22:21:54 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 5 time(s).
> 09/06/23 22:21:55 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 6 time(s).
> 09/06/23 22:21:56 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 7 time(s).
> 09/06/23 22:21:57 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 8 time(s).
> 09/06/23 22:21:58 INFO ipc.Client: Retrying connect to server: master/10.2.24.21:54310. Already tried 9 time(s).
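These retries only say that the client cannot reach anything listening on 10.2.24.21:54310, which fits the "namenode is stopping" symptom: the NameNode process has probably died or never came up. A rough way to check, assuming the default layout where logs sit under $HADOOP_HOME/logs (the exact file name depends on your user and host name):

    # on the master (10.2.24.21)
    jps                                        # a running NameNode shows up as "NameNode"
    tail -n 100 logs/hadoop-*-namenode-*.log   # the reason it stopped is usually near the end of this log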
>
>
>
> On Tue, Jun 23, 2009 at 8:33 PM, Raghu Angadi <rang...@yahoo-inc.com>wrote:
>
>> Raghu Angadi wrote:
>>
>>>
>>> This is at RPC client level and there is requirement for fully qualified
>>>
>>
>> I meant to say "there is NO requirement ..."
>>
>>> hostname. Maybe the "." at the end of "10.2.24.21" is causing the problem?
>>>
>>> btw, in 0.21 even fs.default.name does not need to be fully qualified
>>>
>>
>> that fix is probably in 0.20 too.
>>
>> Raghu.
>>
>>
>>> name... anything that resolves to an IP address is fine (at least for common/FS and HDFS).
>>>
>>> Raghu.
>>>
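A rough way to check Raghu's point about what resolves and what does not, from any of the nodes. Shell-level resolution is not exactly what Java's resolver does, so treat this only as a sanity check:

    ping -c 1 10.2.24.21     # a plain IP literal needs no lookup at all
    ping -c 1 10.2.24.21.    # with the trailing dot this is treated as a DNS name and will usually fail
    getent hosts master      # shows what the short name "master" maps to (/etc/hosts or DNS)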
>>> Matt Massie wrote:
>>>
>>>> fs.default.name in your hadoop-site.xml needs to be set to a
>>>> fully-qualified domain name (instead of an IP address)
>>>>
>>>> -Matt
>>>>
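A sketch of how that property might look in hadoop-site.xml; the host name is hypothetical, and the port 54310 is taken from the log messages earlier in the thread:

    <property>
      <name>fs.default.name</name>
      <value>hdfs://master.example.com:54310</value>
    </property>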
>>>> On Jun 23, 2009, at 6:42 AM, bharath vissapragada wrote:
>>>>
>>>>> When I try to execute the command bin/start-dfs.sh, I get the following error. I have checked the hadoop-site.xml file on all the nodes, and they are fine.
>>>>> Can someone help me out?
>>>>>
>>>>> 10.2.24.21: Exception in thread "main" java.net.UnknownHostException: unknown host: 10.2.24.21.
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.Client.getConnection(Client.java:779)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.Client.call(Client.java:704)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
>>>>> 10.2.24.21:     at org.apache.hadoop.dfs.$Proxy4.getProtocolVersion(Unknown Source)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
>>>>> 10.2.24.21:     at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
