Hi,

Here are the steps I had to take to get `sbin/hadoop-daemons.sh stop datanode` (note the plural "daemons") to work:

1. Copy hadoop-env.sh from ./share/hadoop/common/templates/conf to the conf folder.
2. Add JAVA_HOME to hadoop-env.sh.
3. Comment out the below line in hadoop-env.sh:

   export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER
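As a rough shell sketch of those three steps (the install directory is the one from my prompt, the JAVA_HOME path is only an example, and the sed line is just one way to comment out the log dir override):

# 1. copy the template hadoop-env.sh into conf/
cd ~/Installations/hadoop-0.23.0
mkdir -p conf   # in case the conf folder does not exist yet
cp ./share/hadoop/common/templates/conf/hadoop-env.sh conf/

# 2. point JAVA_HOME at the local JDK (example path; use your own JDK location)
echo 'export JAVA_HOME=/usr/lib/jvm/java-6-sun' >> conf/hadoop-env.sh

# 3. comment out the per-user log dir line
sed -i 's|^export HADOOP_LOG_DIR=|# export HADOOP_LOG_DIR=|' conf/hadoop-env.sh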
I still need to figure out the problem below with starting the NodeManager on the cluster:

praveensripati@praveen-laptop:~/Installations/hadoop-0.23.0$ *bin/yarn-daemons.sh start nodemanager*
cat: ~/home/praveensripati/Installations/hadoop-0.23.0/conf/slaves: No such file or directory

Regards,
Praveen

On Sat, Jan 7, 2012 at 3:23 PM, Praveen Sripati <praveensrip...@gmail.com> wrote:

> Ronald,
>
> Here is the output:
>
> uid=1000(praveensripati) gid=1000(praveensripati)
> groups=1000(praveensripati),4(adm),20(dialout),24(cdrom),46(plugdev),116(lpadmin),118(admin),124(sambashare)
>
> Regards,
> Praveen
>
> On Sat, Jan 7, 2012 at 12:45 PM, Ronald Petty <ronald.pe...@gmail.com> wrote:
>
>> Praveen,
>>
>> What does 'id' output?
>>
>> Kindest regards.
>>
>> Ron
>>
>> On Fri, Jan 6, 2012 at 9:51 AM, Praveen Sripati <praveensrip...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am able to run 0.23 on a single node and am now trying to set it up on a cluster, but I am getting errors.
>>>
>>> When I try to start the data nodes, I get the errors below. I have also tried adding `export HADOOP_LOG_DIR=/home/praveensripati/Installations/hadoop-0.23.0/logs` to .bashrc, and there was no change.
>>>
>>> praveensripati@praveen-laptop:~/Installations/hadoop-0.23.0$ *sbin/hadoop-daemons.sh start datanode*
>>> praveen-laptop: mkdir: cannot create directory `/praveensripati': Permission denied
>>> praveen-laptop: chown: cannot access `/praveensripati/praveensripati': No such file or directory
>>> praveen-laptop: starting datanode, logging to /praveensripati/praveensripati/hadoop-praveensripati-datanode-praveen-laptop.out
>>> praveen-laptop: /home/praveensripati/Installations/hadoop-0.23.0/sbin/hadoop-daemon.sh: line 144: /praveensripati/praveensripati/hadoop-praveensripati-datanode-praveen-laptop.out: No such file or directory
>>> ubuntu-guest: mkdir: cannot create directory `/praveensripati': Permission denied
>>> ubuntu-guest: chown: cannot access `/praveensripati/praveensripati': No such file or directory
>>> ubuntu-guest: starting datanode, logging to /praveensripati/praveensripati/hadoop-praveensripati-datanode-ubuntu-guest.out
>>> ubuntu-guest: /home/praveensripati/Installations/hadoop-0.23.0/sbin/hadoop-daemon.sh: line 143: /praveensripati/praveensripati/hadoop-praveensripati-datanode-ubuntu-guest.out: No such file or directory
>>> praveen-laptop: head: cannot open `/praveensripati/praveensripati/hadoop-praveensripati-datanode-praveen-laptop.out' for reading: No such file or directory
>>> ubuntu-guest: head: cannot open `/praveensripati/praveensripati/hadoop-praveensripati-datanode-ubuntu-guest.out' for reading: No such file or directory
>>>
>>> When I try to start the node manager, I get the error below:
>>>
>>> praveensripati@praveen-laptop:~/Installations/hadoop-0.23.0$ *bin/yarn-daemons.sh start nodemanager*
>>> cat: ~/home/praveensripati/Installations/hadoop-0.23.0/conf/slaves: No such file or directory
>>>
>>> Is there a bug in the code or am I missing some settings?
>>>
>>> Regards,
>>> Praveen