Hi, I am trying to create Har files in Hadoop 0.19.0, but MapReduce jobs are not running on my cluster. They fail with a well-known exception:
09/04/12 09:54:07 INFO mapred.FileInputFormat: Total input paths to process : 1
09/04/12 09:54:08 INFO mapred.JobClient: Running job: job_200904051339_0016
09/04/12 09:54:09 INFO mapred.JobClient: map 0% reduce 0%
09/04/12 09:54:18 INFO mapred.JobClient: Task Id : attempt_200904051339_0016_m_000003_0, Status : FAILED
Error initializing attempt_200904051339_0016_m_000003_0:
*java.lang.IllegalArgumentException: Wrong FS: hdfs://172.16.6.102:21011/tmp/hadoop-root/mapred/system/job_200904051339_0016/job.xml, expected: hdfs://namenodemc:21011*
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:322)
    at org.apache.hadoop.hdfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:91)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:129)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:390)
    at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:699)
    at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1636)
    at org.apache.hadoop.mapred.TaskTracker.access$1200(TaskTracker.java:102)
    at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:1602)
09/04/12 09:54:18 WARN mapred.JobClient: Error reading task output http://cachenode1:50060/tasklog?plaintext=true&taskid=attempt_200904051339_0016_m_000003_0&filter=stdout
09/04/12 09:54:18 WARN mapred.JobClient: Error reading task output http://cachenode1:50060/tasklog?plaintext=true&taskid=attempt_200904051339_0016_m_000003_0&filter=stderr
09/04/12 09:54:23 INFO mapred.JobClient: Task Id : attempt_200904051339_0016_m_000003_1, Status : FAILED
Error initializing attempt_200904051339_0016_m_000003_1:
java.lang.IllegalArgumentException: Wrong FS: hdfs://172.16.6.102:21011/tmp/hadoop-root/mapred/system/job_200904051339_0016/job.xml, expected: hdfs://namenodemc:21011

I searched the forum and *applied the patch* from https://issues.apache.org/jira/browse/HADOOP-5191, even though I am not using AIX or Solaris, but it still does not work.

The problem arises when IP addresses are used in *hadoop*-site.xml. For example, on the NameNode machine I have the proper respective entries in */etc/hosts*:

    127.0.0.1       localhost.localdomain localhost
    ::1             localhost6.localdomain6 localhost6
    172.16.19.125   cachenode1

*PS: cachenode1 is my datanode.*

*/etc/sysconfig/network*:

    NETWORKING=yes
    HOSTNAME=172.16.6.102

The configuration on the datanode is similar. Can you please help?

Thanks in advance.

Regards,
Snehal Nagmote
IIIT Hyderabad
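P.S. To make concrete what I mean by "IP addresses are used": the "Wrong FS" message shows the cluster expecting hdfs://namenodemc:21011 while the job path was built with the IP 172.16.6.102. My understanding (this is an assumption on my part, not verified) is that the relevant setting is fs.default.name in hadoop-site.xml, and that it would need to use the same hostname form on every node, roughly like:

    <!-- Illustrative hadoop-site.xml fragment; the hostname/port are taken
         from the error message above, not from my actual config file. -->
    <property>
      <name>fs.default.name</name>
      <value>hdfs://namenodemc:21011</value>
    </property>

If the clients or datanodes instead carry hdfs://172.16.6.102:21011 here, that would match the mismatch shown in the exception.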