So it's retrying to connect... have you looked at the Namenode's log? The Namenode is probably not running, and the problem will likely be easy to fix.
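A quick way to check (just a sketch, assuming a standard Hadoop 0.20-style layout; adjust HADOOP_HOME and the log file name to your install):

  jps | grep NameNode                              # is the NameNode process running at all?
  less $HADOOP_HOME/logs/hadoop-*-namenode-*.log   # if not, the reason it failed to start is usually in here
  $HADOOP_HOME/bin/start-dfs.sh                    # (re)start HDFS
  netstat -lnt | grep 54310                        # confirm something is actually listening on fs.default.name's port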
J-D

On Tue, Jan 5, 2010 at 3:37 AM, Muhammad Mudassar <[email protected]> wrote:
> hi
> The problem is still there. I found there was an error in the Hadoop site configuration
> file: the port was not mentioned. I have now set the port, but the problem remains.
> When I try to check the Hadoop file system with *./bin/hadoop fs -ls* it says
> *Retrying connect to server: localhost/127.0.0.1:54310.* Now that I have configured
> the port number, I am pasting my Hadoop core site configuration file contents here:
>
> <configuration>
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://localhost:54310</value>
>   </property>
> </configuration>
>
> Any help regarding the issue?
>
> thanks
>
> On Tue, Jan 5, 2010 at 11:50 AM, stack <[email protected]> wrote:
>
>> Check your configuration. The master is shutting down because it can't connect
>> to the hdfs running at:
>>
>> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
>> connection exception: java.net.ConnectException: Connection refused
>>
>> Is 127.0.0.1 where your hdfs is running? Can you connect to your hdfs using
>> ./bin/hadoop fs -lsr / before you run hbase?
>>
>> St.Ack
>>
>> On Fri, Jan 1, 2010 at 12:02 AM, Muhammad Mudassar <[email protected]> wrote:
>>
>> > I am starting hbase and then checking with jps; it shows that HMaster is
>> > running, but when I check again with jps it has gone off.
>> >
>> > Here is the log of HMaster:
>> >
>> > Wed Dec 30 14:54:41 PKT 2009 Starting master on mudassar-desktop
>> > ulimit -n 1024
>> > 2009-12-30 14:54:42,014 INFO org.apache.hadoop.hbase.master.HMaster: vmName=Java HotSpot(TM) Server VM, vmVendor=Sun Microsystems Inc., vmVersion=14.0-b16
>> > 2009-12-30 14:54:42,015 INFO org.apache.hadoop.hbase.master.HMaster: vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError, -XX:+UseConcMarkSweepGC, -XX:+CMSIncrementalMode, -Dhbase.log.dir=/home/hadoop/Desktop/hbase-0.20.2/bin/../logs, -Dhbase.log.file=hbase-hadoop-master-mudassar-desktop.log, -Dhbase.home.dir=/home/hadoop/Desktop/hbase-0.20.2/bin/.., -Dhbase.id.str=hadoop, -Dhbase.root.logger=INFO,DRFA, -Djava.library.path=/home/hadoop/Desktop/hbase-0.20.2/bin/../lib/native/Linux-i386-32]
>> > 2009-12-30 14:54:42,054 INFO org.apache.hadoop.hbase.master.HMaster: My address is mudassar-desktop:60000
>> > 2009-12-30 14:54:43,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
>> > 2009-12-30 14:54:44,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
>> > 2009-12-30 14:54:45,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
>> > 2009-12-30 14:54:46,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
>> > 2009-12-30 14:54:47,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
>> > 2009-12-30 14:54:48,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).
>> > 2009-12-30 14:54:49,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).
>> > 2009-12-30 14:54:50,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).
>> > 2009-12-30 14:54:51,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).
>> > 2009-12-30 14:54:52,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
>> > 2009-12-30 14:54:52,234 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master
>> > java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>> >     at org.apache.hadoop.ipc.Client.wrapException(Client.java:766)
>> >     at org.apache.hadoop.ipc.Client.call(Client.java:742)
>> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>> >     at $Proxy0.getProtocolVersion(Unknown Source)
>> >     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>> >     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:105)
>> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
>> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:169)
>> >     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>> >     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1373)
>> >     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>> >     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1385)
>> >     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:191)
>> >     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>> >     at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:194)
>> >     at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:94)
>> >     at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
>> >     at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1214)
>> >     at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1259)
>> > Caused by: java.net.ConnectException: Connection refused
>> >     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> >     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
>> >     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>> >     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>> >     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>> >     at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>> >     at org.apache.hadoop.ipc.Client.getConnection(Client.java:859)
>> >     at org.apache.hadoop.ipc.Client.call(Client.java:719)
>> >     ... 17 more
>> >
>> > On Fri, Jan 1, 2010 at 12:07 AM, Jean-Daniel Cryans <[email protected]> wrote:
>> >
>> > > If you started the master and it gives you this message, look at the
>> > > master's log in hbase_folder/logs to see if there's anything obviously
>> > > wrong in there.
>> > >
>> > > J-D
>> > >
>> > > On Thu, Dec 31, 2009 at 3:31 AM, Muhammad Mudassar <[email protected]> wrote:
>> > > > hi
>> > > >
>> > > > I am trying to start hbase in pseudo-distributed mode but it gives me
>> > > > MasterNotRunning Exception. My hbase-site configurations are:
>> > > >
>> > > > <configuration>
>> > > >   <property>
>> > > >     <name>hbase.rootdir</name>
>> > > >     <value>hdfs://localhost:54310/home/hadoop/Desktop/hbasedata</value>
>> > > >     <description>The directory shared by region servers.
>> > > >     </description>
>> > > >   </property>
>> > > > </configuration>
>> > > >
>> > > > any help
>> > > >
>> > > > Mudassar
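A minimal sketch of the checks implied above, assuming HADOOP_HOME and HBASE_HOME point at the Hadoop/HBase 0.20.x installs used in this thread:

  # 1. HDFS must answer on the address given in fs.default.name (hdfs://localhost:54310 here)
  $HADOOP_HOME/bin/hadoop fs -ls /    # should list the HDFS root, not loop on "Retrying connect to server..."

  # 2. hbase.rootdir is a directory *inside* that HDFS, so its host:port must match
  #    fs.default.name; a short HDFS path is typical, e.g. (illustrative value):
  #    <value>hdfs://localhost:54310/hbase</value>

  # 3. Only then start HBase and confirm the master stays up
  $HBASE_HOME/bin/start-hbase.sh
  jps                                 # HMaster should still be listed after a minute or so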
