Try exporting HADOOP_HOME to the right path in the launching user's environment on every slave node where there is no such symlink, then try starting everything again.
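A minimal sketch of what that could look like (assuming /local/trad/hadoop/cluster/hadoop-1.0.0.dfs is the real install path on the slaves, and that the file shown below is one your non-interactive SSH logins actually source):

    # On slave001, slave002 and slave003, in the launching user's
    # ~/.bashrc (or whichever file non-interactive SSH sessions read),
    # point HADOOP_HOME at the path that really exists on that node:
    export HADOOP_HOME=/local/trad/hadoop/cluster/hadoop-1.0.0.dfs

Alternatively, assuming /home/local does not already exist on the slaves, recreating the master's symlink there (ln -s /local /home/local, run as root) would make the /home/local/... path that the master resolves valid on every node.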
On Tue, Jan 31, 2012 at 2:57 AM, Mohamed Riadh Trad <mohamed.t...@inria.fr> wrote:
> Hi,
>
> I upgraded my cluster to Hadoop 1.0.0; however, HDFS fails to start and I
> get the following message:
>
> ###################################
>
> starting namenode, logging to
> /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/../logs/hadoop-trad-namenode-master_dfs.out
> slave001: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such
> file or directory.
> slave002: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such
> file or directory.
> slave003: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such
> file or directory.
> master_dfs: starting secondarynamenode, logging to
> /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/../logs/hadoop-trad-secondarynamenode-master_dfs.out
>
> ###################
>
> Hadoop is installed as follows:
>
> on master_dfs: /local/trad/hadoop/cluster/hadoop-1.0.0.dfs/. However,
> /local/ is actually a symlink to /home/local, so the actual path is
> /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/.
> on slave001, slave002, slave003: /local/trad/hadoop/cluster/hadoop-1.0.0.dfs/
>
> How can I force Hadoop 1.0.0 to bypass this redirection?
>
> Kind regards
>
>
> Trad Mohamed Riadh, M.Sc, Ing.
> PhD student
> INRIA-TELECOM PARISTECH - ENPC School of International Management
>
> Office: 11-15
> Phone: (33)-1 39 63 59 33
> Fax: (33)-1 39 63 56 74
> Email: riadh.t...@inria.fr
> Home page: http://www-rocq.inria.fr/who/Mohamed.Trad/

--
Harsh J
Customer Ops. Engineer, Cloudera