Hi Shawn, 

Look at the core-site.xml config file and find the fs.default.name property. That is where you specify the hostname and port of the HDFS NameNode.
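
For example, a minimal sketch of that property (the hostname below is just the masked one from your error log, so substitute your NameNode's actual FQDN; the port stays 54310):

  <property>
    <name>fs.default.name</name>
    <!-- must match on the NameNode, JobTracker, TaskTrackers, and the client submitting the Pig job -->
    <value>hdfs://dev-hadoop-01.***.com:54310</value>
  </property>

If it currently reads hdfs://10.2.0.135:54310 anywhere, switching the value to the hostname on every node (and restarting HDFS and MapReduce) should clear the "Wrong FS" error, since the TaskTracker compares the job's staging path against this setting.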


Rohit Bakhshi
www.hortonworks.com

On Tuesday, February 7, 2012 at 3:01 PM, Xiaomeng Wan wrote:

> Hi,
> I got the following error when running a Pig script:
> 
> Error initializing attempt_201201031543_0083_m_000000_1:
> java.lang.IllegalArgumentException: Wrong FS:
> hdfs://10.2.0.135:54310/app/datastore/hadoop-hadoop/mapred/staging/hadoop/.staging/job_201201031543_0083/job.xml,
> expected: hdfs://dev-hadoop-01.***.com:54310
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:410)
> at org.apache.hadoop.hdfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:106)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:162)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:542)
> at org.apache.hadoop.mapred.TaskTracker.localizeJobConfFile(TaskTracker.java:1280)
> at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1174)
> at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1098)
> at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:2271)
> at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:2235)
> 
> It seems some IPs in the config files need to be changed to hostnames. Any hints?
> 
> Shawn 
