[ https://issues.apache.org/jira/browse/HDFS-6725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shashang Sheth updated HDFS-6725:
---------------------------------

    Description: 
I am unable to start the DataNode and TaskTracker daemons on one of my slave 
nodes. I have two slave nodes in my test environment.
The error has been reported in JIRA before; however, the solutions provided in 
those issues are not working for me.

Below are the errors.

DataNode log file:

2014-07-22 07:17:54,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = Hadslave1/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.0
STARTUP_MSG:   build = 
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1479473; 
compiled by 'hortonfo' on Mon May  6 
06:59:37 UTC 2013
STARTUP_MSG:   java = 1.6.0_31
************************************************************/
2014-07-22 07:17:55,691 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: 
loaded properties from hadoop-metrics2.properties
2014-07-22 07:17:55,703 INFO 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source 
MetricsSystem,sub=Stats registered.
2014-07-22 07:17:55,732 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Scheduled snapshot period at 10 second(s).
2014-07-22 07:17:55,732 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
DataNode metrics system started
2014-07-22 07:17:56,265 INFO 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi 
registered.
2014-07-22 07:17:56,275 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Source name ugi already exists!
2014-07-22 07:17:57,536 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:212)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:244)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:236)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:357)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:319)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1698)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1637)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1655)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1781)
        at 
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1798)

2014-07-22 07:17:57,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadslave1/127.0.1.1
************************************************************/

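For what it's worth, "file:///" is Hadoop's built-in default for fs.default.name, so the DataNode on Hadslave1 does not appear to be reading any fs.default.name value from its core-site.xml. As a rough sketch (assuming the slave should use the same master address 192.168.111.131:8020 shown in the configuration further below), the slave-side core-site.xml would need to contain:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Point the slave's default filesystem at the NameNode on the master. -->
<configuration>

<property>
<name>fs.default.name</name>
<value>hdfs://192.168.111.131:8020</value>
</property>

</configuration>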

TaskTracker log file:

2014-07-22 07:17:59,297 INFO org.apache.hadoop.mapred.TaskTracker: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting TaskTracker
STARTUP_MSG:   host = Hadslave1/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.0
STARTUP_MSG:   build = 
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1479473; 
compiled by 'hortonfo' on Mon May  6 
06:59:37 UTC 2013
STARTUP_MSG:   java = 1.6.0_31
************************************************************/
2014-07-22 07:17:59,671 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: 
loaded properties from hadoop-metrics2.properties
2014-07-22 07:17:59,814 INFO 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source 
MetricsSystem,sub=Stats registered.
2014-07-22 07:17:59,815 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Scheduled snapshot period at 10 second(s).
2014-07-22 07:17:59,815 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
TaskTracker metrics system started
2014-07-22 07:18:00,028 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded 
the native-hadoop library
2014-07-22 07:18:00,158 INFO 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi 
registered.
2014-07-22 07:18:00,160 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Source name ugi already exists!
2014-07-22 07:18:00,265 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
        at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2121)
        at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1540)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3937)

2014-07-22 07:18:00,265 INFO org.apache.hadoop.mapred.TaskTracker: 
SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down TaskTracker at Hadslave1/127.0.1.1
************************************************************/

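Similarly, "local" is the built-in default for mapred.job.tracker, so the TaskTracker on Hadslave1 does not seem to pick up any mapred.job.tracker setting either. A minimal slave-side mapred-site.xml pointing at the JobTracker on the master (assuming the JobTracker should be reached at the master address 192.168.111.131 on port 8021, the port used in the master's mapred-site.xml below) would look like:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Point the slave at the JobTracker on the master instead of the "local" default. -->
<configuration>

<property>
<name>mapred.job.tracker</name>
<value>192.168.111.131:8021</value>
</property>

</configuration>
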
Below are my configuration files:
user@Hadmast:/opt/hadoop-1.2.0/conf$ cat core-site.xml 
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>

<name>fs.default.name</name>

<value>hdfs://192.168.111.131:8020</value>

</property>

</configuration>
user@Hadmast:/opt/hadoop-1.2.0/conf$ cat hdfs-site.xml 
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>
<name>dfs.replication</name>
<value>2</value>
</property>

<property>
<name>dfs.permissions</name>
<value>false</value>
</property>

</configuration>
user@Hadmast:/opt/hadoop-1.2.0/conf$ cat mapred-site.xml 
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

        <property>

                <name>mapred.job.tracker</name>

                <value>localhost:8021</value>

        </property>

</configuration>
user@Hadmast:/opt/hadoop-1.2.0/conf$ 
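
One detail that stands out above: mapred.job.tracker is set to localhost:8021 on the master. Even if this same mapred-site.xml were copied to the slaves, "localhost" would resolve to each slave itself rather than to Hadmast, so a slave's TaskTracker could not reach the JobTracker; with "localhost" the JobTracker on the master may also bind only to the loopback interface. A value using the master's address (the same 192.168.111.131 already used for fs.default.name) would look like:

        <property>
                <name>mapred.job.tracker</name>
                <value>192.168.111.131:8021</value>
        </property>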

Both daemons start on the Hadmast node but do not start on the other node (Hadslave1).

user@Hadmast:~$ jps
7947 Jps
6421 NameNode
7661 DataNode
6941 JobTracker
6866 SecondaryNameNode
7172 TaskTracker
user@Hadmast:~$ 

user@Hadslave1:~$ jps
4826 Jps
user@Hadslave1:~$

I have formatted the namenode multiple times and have also rebuilt Hadslave1 
once, but there is no change.


> Datanode and Task Tracker not starting
> --------------------------------------
>
>                 Key: HDFS-6725
>                 URL: https://issues.apache.org/jira/browse/HDFS-6725
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: datanode
>         Environment: user@Hadslave1:~$ uname -a
> Linux Hadslave1 3.2.0-23-generic-pae #36-Ubuntu SMP Tue Apr 10 22:19:09 UTC 
> 2012 i686 i686 i386 GNU/Linux
> user@Hadslave1:~$
> user@Hadslave1:~$ /opt/hadoop-1.2.0/bin/hadoop -version
> java version "1.6.0_31"
> OpenJDK Runtime Environment (IcedTea6 1.13.3) (6b31-1.13.3-1ubuntu1~0.12.04.2)
> OpenJDK Client VM (build 23.25-b01, mixed mode, sharing)
> user@Hadslave1:~$ java -version
> java version "1.6.0_31"
> OpenJDK Runtime Environment (IcedTea6 1.13.3) (6b31-1.13.3-1ubuntu1~0.12.04.2)
> OpenJDK Client VM (build 23.25-b01, mixed mode, sharing)
> user@Hadslave1:~$
>            Reporter: Shashang Sheth
>            Priority: Minor



--
This message was sent by Atlassian JIRA
(v6.2#6252)
