Mark, just add the following lines to your /etc/hosts file:

<ip-address> <hostname>

e.g.  192.168.0.111 ws-test
      192.168.0.165 rahul

Same for all nodes.
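Putting the pieces of this thread together, a hosts file along these lines should work (the addresses and hostnames below are examples, not taken from Mark's cluster). The important detail, per Mark's own finding further down, is that the master's hostname must not appear on the 127.0.0.1 line:

    # /etc/hosts -- identical on every node (example addresses)
    127.0.0.1       localhost.localdomain localhost
    192.168.0.111   ws-test
    192.168.0.165   rahul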



Mark wrote:
I am getting the following errors from my datanodes when I start the namenode.

2010-09-08 14:17:40,690 INFO org.apache.hadoop.ipc.RPC: Server at hadoop1/10.XXX.XXX.XX:9000 not available yet, Zzzzz...
2010-09-08 14:17:42,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: hadoop1/10.XXX.XXX.XX:9000. Already tried 0 time(s).
2010-09-08 14:17:43,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: hadoop1/10.XXX.XXX.XX:9000. Already tried 1 time(s).
2010-09-08 14:17:44,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: hadoop1/10.XXX.XXX.XX:9000. Already tried 2 time(s).
2010-09-08 14:17:45,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: hadoop1/10.XXX.XXX.XX:9000. Already tried 3 time(s).
....

If I remove the following line from /etc/hosts then the datanodes will be able to connect:

127.0.0.1               localhost.localdomain localhost hadoop1

Can someone explain why this is happening? I feel like I shouldn't have to alter my hosts file to get this up and running.
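The mechanism behind this is visible in the NameNode log below: "Namenode up at: localhost.localdomain/127.0.0.1:9000". The NameNode binds its RPC port to the first address its hostname resolves to, so when /etc/hosts maps hadoop1 onto the 127.0.0.1 line, the server listens on loopback only and remote datanodes can never reach it. This small sketch (not Hadoop's actual resolution code; the hosts entries and the 10.0.0.5 address are illustrative) mimics how /etc/hosts lookup picks that first match:

```python
# Illustration of /etc/hosts resolution order: the first matching line wins,
# so listing hadoop1 on the 127.0.0.1 line makes the NameNode bind loopback.

def resolve_from_hosts(hosts_text, hostname):
    """Return the first address that a hosts-file text maps the hostname to."""
    for line in hosts_text.splitlines():
        fields = line.split("#", 1)[0].split()  # strip comments, split columns
        if len(fields) >= 2 and hostname in fields[1:]:
            return fields[0]
    return None

# Mirrors the problematic file quoted in this thread (example address).
broken = ("127.0.0.1  localhost.localdomain localhost hadoop1\n"
          "10.0.0.5   hadoop1.mycompany.com hadoop1\n")
# Same file with hadoop1 removed from the loopback line.
fixed  = ("127.0.0.1  localhost.localdomain localhost\n"
          "10.0.0.5   hadoop1.mycompany.com hadoop1\n")

print(resolve_from_hosts(broken, "hadoop1"))  # 127.0.0.1 -> RPC bound to loopback
print(resolve_from_hosts(fixed, "hadoop1"))   # 10.0.0.5  -> reachable by datanodes
```

With the broken file, datanodes retry against hadoop1's real address forever while the NameNode only listens on 127.0.0.1:9000, which matches the retry loop above.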

Here is the log from the NameNode server:

2010-09-08 14:16:45,179 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = hadoop1.mycompany.com/10.XXX.XXX.XX
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
2010-09-08 14:16:45,309 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000
2010-09-08 14:16:45,315 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Namenode up at: localhost.localdomain/127.0.0.1:9000
2010-09-08 14:16:45,318 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null
2010-09-08 14:16:45,319 INFO org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext
2010-09-08 14:16:45,386 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner=root,root,bin,daemon,sys,adm,disk,wheel
2010-09-08 14:16:45,387 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup=supergroup
2010-09-08 14:16:45,387 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled=true
2010-09-08 14:16:45,396 INFO org.apache.hadoop.hdfs.server.namenode.metrics.FSNamesystemMetrics: Initializing FSNamesystemMetrics using context object:org.apache.hadoop.metrics.spi.NullContext
2010-09-08 14:16:45,398 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemStatusMBean
2010-09-08 14:16:45,440 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files = 5
2010-09-08 14:16:45,448 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files under construction = 0
2010-09-08 14:16:45,449 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 419 loaded in 0 seconds.
2010-09-08 14:16:45,449 INFO org.apache.hadoop.hdfs.server.common.Storage: Edits file /home/hadoop/dfs/name/current/edits of size 4 edits # 0 loaded in 0 seconds.
2010-09-08 14:16:45,488 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 419 saved in 0 seconds.
2010-09-08 14:16:45,606 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 248 msecs
2010-09-08 14:16:45,612 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode ON. The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
2010-09-08 14:16:45,767 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2010-09-08 14:16:45,847 INFO org.apache.hadoop.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50070
2010-09-08 14:16:45,848 INFO org.apache.hadoop.http.HttpServer: listener.getLocalPort() returned 50070 webServer.getConnectors()[0].getLocalPort() returned 50070
2010-09-08 14:16:45,848 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50070
2010-09-08 14:16:45,848 INFO org.mortbay.log: jetty-6.1.14
2010-09-08 14:17:15,274 INFO org.mortbay.log: Started selectchannelconnec...@0.0.0.0:50070
2010-09-08 14:17:15,274 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Web-server up at: 0.0.0.0:50070
2010-09-08 14:17:15,275 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2010-09-08 14:17:15,275 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
2010-09-08 14:17:15,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9000: starting
2010-09-08 14:17:15,276 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000: starting
2010-09-08 14:17:15,277 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000: starting
2010-09-08 14:17:15,277 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9000: starting
2010-09-08 14:17:15,277 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9000: starting
2010-09-08 14:17:15,277 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9000: starting
2010-09-08 14:17:15,277 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9000: starting
2010-09-08 14:17:15,278 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000: starting
2010-09-08 14:17:15,278 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9000: starting
2010-09-08 14:17:15,278 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9000: starting
2010-09-08 14:17:15,323 INFO org.apache.hadoop.ipc.Server: Error register getProtocolVersion
java.lang.IllegalArgumentException: Duplicate metricsName:getProtocolVersion
    at org.apache.hadoop.metrics.util.MetricsRegistry.add(MetricsRegistry.java:53)
    at org.apache.hadoop.metrics.util.MetricsTimeVaryingRate.<init>(MetricsTimeVaryingRate.java:89)
    at org.apache.hadoop.metrics.util.MetricsTimeVaryingRate.<init>(MetricsTimeVaryingRate.java:99)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:523)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)


Thanks
