I think you might need to change the IP itself. "Cannot assign requested address" means the address that "master" resolves to is not assigned to any interface on that machine; your shutdown message shows it resolving to 1.1.1.1. Map "master" to the node's real address in /etc/hosts and try something similar to 192.168.1.20. Changing the port alone will not help while the address is wrong.
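The namenode resolves the configured hostname at startup and binds its RPC listener to whatever address comes back, which is exactly the step that fails here. You can check the same thing outside Hadoop; a minimal sketch in Python (the can_bind helper is hypothetical, purely for illustration):

```python
import socket

def can_bind(host, port):
    """Return True if this machine can bind a listening socket to host:port.

    Mirrors what the namenode does at startup: resolve the configured
    hostname, then bind a listener to the resulting address.
    """
    try:
        addr = socket.gethostbyname(host)  # what the name resolves to
    except socket.gaierror:
        return False
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((addr, port))  # raises EADDRNOTAVAIL for a non-local address
        return True
    except OSError:
        return False
    finally:
        s.close()

# localhost always resolves to a local address, so this binds fine:
print(can_bind("localhost", 0))  # True
```

If can_bind("master", 51150) prints False on the master node itself, the entry for "master" in /etc/hosts (or DNS) is pointing at an address the machine does not own.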
-Vinay

On 27 Apr 2017 8:20 pm, "Bhushan Pathak" <bhushan.patha...@gmail.com> wrote:
> Hello
>
> I have a 3-node cluster where I have installed hadoop 2.7.3. I have
> updated core-site.xml, mapred-site.xml, slaves, hdfs-site.xml,
> yarn-site.xml, hadoop-env.sh files with basic settings on all 3 nodes.
>
> When I execute start-dfs.sh on the master node, the namenode does not
> start. The logs contain the following error -
>
> 2017-04-27 14:17:57,166 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
> java.net.BindException: Problem binding to [master:51150] java.net.BindException: Cannot assign requested address; For more details see: http://wiki.apache.org/hadoop/BindException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:721)
>         at org.apache.hadoop.ipc.Server.bind(Server.java:425)
>         at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:574)
>         at org.apache.hadoop.ipc.Server.<init>(Server.java:2215)
>         at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:951)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:534)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:509)
>         at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:796)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:345)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:674)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:647)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:796)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1493)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
> Caused by: java.net.BindException: Cannot assign requested address
>         at sun.nio.ch.Net.bind0(Native Method)
>         at sun.nio.ch.Net.bind(Net.java:433)
>         at sun.nio.ch.Net.bind(Net.java:425)
>         at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>         at org.apache.hadoop.ipc.Server.bind(Server.java:408)
>         ... 13 more
> 2017-04-27 14:17:57,171 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
> 2017-04-27 14:17:57,176 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at master/1.1.1.1
> ************************************************************/
>
> I have changed the port number multiple times, every time I get the same
> error. How do I get past this?
>
> Thanks
> Bhushan Pathak
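For reference: the address the namenode tries to bind comes from fs.defaultFS in core-site.xml (hdfs://master:51150 in your setup), and "master" is normally resolved through /etc/hosts on each node. A hypothetical /etc/hosts layout, assuming the master's real address is 192.168.1.20 (the slave names and all addresses below are made-up placeholders; substitute your nodes' actual IPs and hostnames):

```
192.168.1.20   master
192.168.1.21   slave1
192.168.1.22   slave2
```

The same file should be consistent on all 3 nodes, and the address next to "master" must be one that actually appears on the master's interfaces (e.g. in the output of "ip addr").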