There is no other information in the log (although when I run it on my PC, where 
it works, the log does contain more information), and the NameNode's web page 
doesn't show any live datanodes as it should.
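
(To be concrete, this is the check I am doing; the hostname below is a 
placeholder, 50070 is just the default web UI port, and dfsadmin -report is the 
command-line equivalent of the same check:)

    # NameNode status page: the "Live Nodes" count stays at 0
    $ curl -s http://namenode-host:50070/dfshealth.jsp | grep -i "Live Nodes"

    # the same information from the command line
    $ hadoop dfsadmin -report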

That's why I said it freezes... I have no idea what is going on...

Please, if anyone can help: it is really important for me to get this working as 
soon as possible (it is for my diploma thesis), and I really have no idea what 
might be going wrong.


> Subject: Re: Datanode doesn't start but there is no exception in the log
> From: highpoint...@gmail.com
> Date: Fri, 13 May 2011 08:22:53 -0600
> To: common-user@hadoop.apache.org
> 
> When you say "freeze", do you mean there is nothing rolling in the log?
> 
> Sent from my iPhone
> 
> On May 13, 2011, at 2:28 AM, Panayotis Antonopoulos 
> <antonopoulos...@hotmail.com> wrote:
> 
> > 
> > There is no shutdown message until I shut down the DataNode.
> > 
> > I had used the hostname of the machine that will run the DataNode, and I have 
> > now used its IP instead, but there is no difference.
> > Again the DataNode seems to freeze, and the log output is the one I 
> > mentioned before.
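> > 
> > (For reference, conf/slaves on the master now contains just the DataNode's 
> > IP, the same address that shows up in the DataNode log:)
> > 
> >     $ cat conf/slaves
> >     147.102.4.129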
> > 
> > 
> > 
> >> Subject: Re: Datanode doesn't start but there is no exception in the log
> >> From: highpoint...@gmail.com
> >> Date: Thu, 12 May 2011 23:59:02 -0600
> >> To: common-user@hadoop.apache.org
> >> 
> >> Have you defined the IP of the DN in the slaves file?
> >> 
> >> 
> >> 
> >> Sent from my iPhone
> >> 
> >> On May 12, 2011, at 7:27 PM, Bharath Mundlapudi <bharathw...@yahoo.com> 
> >> wrote:
> >> 
> >>> Are those all the messages in the datanode log? Do you also see a SHUTDOWN 
> >>> message?
> >>> 
> >>> -Bharath
> >>> 
> >>> 
> >>> 
> >>> ________________________________
> >>> From: Panayotis Antonopoulos <antonopoulos...@hotmail.com>
> >>> To: common-user@hadoop.apache.org
> >>> Sent: Thursday, May 12, 2011 6:07 PM
> >>> Subject: Datanode doesn't start but there is no exception in the log
> >>> 
> >>> 
> >>> Hello,
> >>> I am trying to set up Hadoop HDFS in a cluster for the first time. Until now 
> >>> I have been using pseudo-distributed mode on my PC at home, and everything 
> >>> was working perfectly.
> >>> The NameNode starts, but the DataNode doesn't, and its log contains the 
> >>> following:
> >>> 
> >>> 2011-05-13 04:01:13,663 INFO 
> >>> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
> >>> /************************************************************
> >>> STARTUP_MSG: Starting DataNode
> >>> STARTUP_MSG:   host = clone1/147.102.4.129
> >>> STARTUP_MSG:   args = []
> >>> STARTUP_MSG:   version = 0.20.2-cdh3u0
> >>> STARTUP_MSG:   build =  -r 81256ad0f2e4ab2bd34b04f53d25a6c23686dd14; 
> >>> compiled by 'hudson' on Fri Mar 25 19:56:23 PDT 2011
> >>> ************************************************************/
> >>> 2011-05-13 04:01:14,019 INFO 
> >>> org.apache.hadoop.security.UserGroupInformation: JAAS Configuration 
> >>> already set up for Hadoop, not re-installing.
> >>> 2011-05-13 04:01:14,143 INFO 
> >>> org.apache.hadoop.hdfs.server.datanode.DataNode: Registered 
> >>> FSDatasetStatusMBean
> >>> 2011-05-13 04:01:14,152 INFO 
> >>> org.apache.hadoop.hdfs.server.datanode.DataNode: Opened info server at 
> >>> 50010
> >>> 2011-05-13 04:01:14,154 INFO 
> >>> org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 
> >>> 1048576 bytes/s
> >>> 2011-05-13 04:01:14,206 INFO org.mortbay.log: Logging to 
> >>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via 
> >>> org.mortbay.log.Slf4jLog
> >>> 2011-05-13 04:01:14,272 INFO org.apache.hadoop.http.HttpServer: Added 
> >>> global filtersafety 
> >>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> >>> 2011-05-13 04:01:14,278 INFO org.apache.hadoop.http.HttpServer: Port 
> >>> returned by webServer.getConnectors()[0].getLocalPort() before open() is 
> >>> -1. Opening the listener on 50075
> >>> 2011-05-13 04:01:14,278 INFO org.apache.hadoop.http.HttpServer: 
> >>> listener.getLocalPort() returned 50075 
> >>> webServer.getConnectors()[0].getLocalPort() returned 50075
> >>> 2011-05-13 04:01:14,278 INFO org.apache.hadoop.http.HttpServer: Jetty 
> >>> bound to port 50075
> >>> 2011-05-13 04:01:14,278 INFO org.mortbay.log: jetty-6.1.26
> >>> 2011-05-13 04:01:14,567 INFO org.mortbay.log: Started 
> >>> SelectChannelConnector@0.0.0.0:50075
> >>> 2011-05-13 04:01:14,570 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: 
> >>> Initializing JVM Metrics with processName=DataNode, sessionId=null
> >>> 2011-05-13 04:01:14,976 INFO org.apache.hadoop.ipc.Server: Starting 
> >>> Socket Reader #1 for port 50020
> >>> 2011-05-13 04:01:14,978 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: 
> >>> Initializing RPC Metrics with hostName=DataNode, port=50020
> >>> 2011-05-13 04:01:14,981 INFO 
> >>> org.apache.hadoop.ipc.metrics.RpcDetailedMetrics: Initializing RPC 
> >>> Metrics with hostName=DataNode, port=50020
> >>> 2011-05-13 04:01:14,984 INFO 
> >>> org.apache.hadoop.hdfs.server.datanode.DataNode: dnRegistration = 
> >>> DatanodeRegistration(clone1:50010, storageID=, infoPort=50075, 
> >>> ipcPort=50020)
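> >>> 
> >>> (In case the configuration matters: fs.default.name in core-site.xml is what 
> >>> points the DataNode at the NameNode; the host and port below are just 
> >>> placeholders, not my real values:)
> >>> 
> >>>     $ grep -A1 "fs.default.name" conf/core-site.xml
> >>>       <name>fs.default.name</name>
> >>>       <value>hdfs://namenode-host:8020</value>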
> >>> 
> >>> Does anyone know what might be wrong??
> >>> 
> >>> Thank you in advance!
> >>> Panagiotis