Re: hadoop installation problem(single-node)
Hi everybody,

Thanks for your help. I configured Hadoop on a single node, but there is a problem: every time I issue the command bin/start-all.sh, it asks me for a password. Here is the command and the result of what I'm doing:

hadoop@ws40-man-lin:~/project/hadoop-0.20.0$ bin/start-all.sh
starting namenode, logging to /var/log/hadoop/hadoop-hadoop-namenode-ws40-man-lin.out
hadoop@localhost's password:
localhost: starting datanode, logging to /var/log/hadoop/hadoop-hadoop-datanode-ws40-man-lin.out
hadoop@localhost's password:
localhost: starting secondarynamenode, logging to /var/log/hadoop/hadoop-hadoop-secondarynamenode-ws40-man-lin.out
starting jobtracker, logging to /var/log/hadoop/hadoop-hadoop-jobtracker-ws40-man-lin.out
hadoop@localhost's password:
localhost: starting tasktracker, logging to /var/log/hadoop/hadoop-hadoop-tasktracker-ws40-man-lin.out

Can anyone tell me how to avoid having to enter the password again and again? And can anyone tell me how to make all the Hadoop processes start automatically when I turn on my machine, so that I don't have to issue bin/start-all.sh every time I reboot my computer?

--
View this message in context: http://lucene.472066.n3.nabble.com/hadoop-installation-problem-single-node-tp2613742p2649448.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.
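The second question (starting the daemons at boot) goes unanswered in the thread. One common approach is a small wrapper script that runs start-all.sh as the hadoop user, hooked into the init system or a @reboot crontab entry. A minimal sketch, assuming the HADOOP_HOME path from the thread; the /tmp location here is purely illustrative (a real script would live in /etc/init.d):

```shell
# Write a boot-time wrapper script (illustrative location; a real one
# would go in /etc/init.d and be registered with the init system).
cat > /tmp/hadoop-autostart.sh <<'EOF'
#!/bin/sh
# Run start-all.sh as the hadoop user at boot.
# HADOOP_HOME path taken from the thread; adjust to your install.
su - hadoop -c '/home/hadoop/project/hadoop-0.20.0/bin/start-all.sh'
EOF
chmod +x /tmp/hadoop-autostart.sh
```

To install it for real, one option on Ubuntu is to copy it to /etc/init.d/hadoop and run `sudo update-rc.d hadoop defaults`; another is to add an `@reboot /path/to/hadoop-autostart.sh` line to the hadoop user's crontab.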
Re: hadoop installation problem(single-node)
It's because you have not configured password-free SSH. The master node should be able to SSH into all slaves (including localhost, on a single-node setup) without a password. Read the SSH access section here: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/

On Tue, Mar 8, 2011 at 9:46 AM, manish.yadav <manish.ya...@orkash.com> wrote:
> [quoted text of the previous message trimmed]
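For a single-node setup, password-free SSH to localhost can be set up along these lines — a sketch assuming OpenSSH with the usual default key paths:

```shell
# Create ~/.ssh if needed, with the permissions sshd requires.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Generate an RSA key pair with an empty passphrase,
# but only if no key exists yet, so an existing key is not overwritten.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa -q
# Authorize the public key for logins to this same machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Verify with: ssh localhost
# (it should no longer prompt for hadoop@localhost's password)
```

Run this as the hadoop user; afterwards start-all.sh should be able to reach localhost without prompting.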
Re: hadoop installation problem(single-node)
Sorry for responding late, and thanks for your help. I tried your command, but the result is the same. Can you tell me what I should do?
Re: hadoop installation problem(single-node)
Have you configured your SSH service?

2011/3/5 Tanping Wang <tanp...@yahoo-inc.com>:
> [quoted text of the previous messages trimmed]
RE: hadoop installation problem(single-node)
Try

$HADOOP_HOME/bin/hadoop namenode -format

or maybe consider

export PATH=$HADOOP_HOME/bin:$PATH

Regards,
Tanping

-----Original Message-----
From: Manish Yadav [mailto:manish.ya...@orkash.com]
Sent: Wednesday, March 02, 2011 1:00 AM
To: core-u...@hadoop.apache.org
Subject: hadoop installation problem(single-node)

> [original message trimmed; see the full copy below]
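To make that PATH change survive new shells, the export lines can go in the hadoop user's ~/.bashrc — a sketch assuming the install path shown in the original post:

```shell
# Append to ~/.bashrc so `hadoop` is found without the bin/ prefix.
# HADOOP_HOME path taken from the original post; adjust to your install.
export HADOOP_HOME=/home/hadoop/project/hadoop-0.20.0
export PATH="$HADOOP_HOME/bin:$PATH"
# After editing ~/.bashrc, reload it with: source ~/.bashrc
```

With this in place, `hadoop namenode -format` works from any directory instead of only as `bin/hadoop` from inside HADOOP_HOME.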
hadoop installation problem(single-node)
Dear Sir/Madam,

I'm very new to Hadoop. I'm trying to install Hadoop on my computer as a single-node cluster, following a web tutorial. I'm using Ubuntu 10.04 64-bit as my operating system, and I have installed Java in /usr/java/jdk1.6.0_24. The steps I took to install Hadoop are the following:

1. Made a group hadoop and a user hadoop, with the home directory in the hadoop directory. I have a directory called projects; I downloaded the Hadoop binary there and extracted it there.
2. Configured SSH.
3. Made changes to some files, which I'm attaching to this mail. Please check them:
   1. hadoop-env.sh
   2. core-site.xml
   3. mapred-site.xml
   4. hdfs-site.xml
   5. the hadoop user's .bashrc
   6. the hadoop user's .profile

After making changes to these files, I logged in to the hadoop account and entered a few commands, and the following happened:

hadoop@ws40-man-lin:~$ echo $HADOOP_HOME
/home/hadoop/project/hadoop-0.20.0
hadoop@ws40-man-lin:~$ hadoop namenode -format
hadoop: command not found
hadoop@ws40-man-lin:~$ namenode -format
namenode: command not found
hadoop@ws40-man-lin:~$

Now I'm completely stuck and don't know what to do. Please help me, as there is no more help around the net. I'm also attaching the files which I changed. Can you tell me the exact configuration I should use to install Hadoop?

core-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.0.133:54310</value>
    <description>The name of the default file system. A URI whose scheme and authority determine the FileSystem implementation. The uri's scheme determines the config property (fs.SCHEME.impl) naming the FileSystem implementation class. The uri's authority is used to determine the host, port, etc. for a filesystem.</description>
  </property>
  <property>
    <name>fs.checkpoint.dir</name>
    <value>/home/hadoop/project/hadoop-0.20.2/check</value>
    <description>Determines where on the local filesystem the DFS secondary name node should store the temporary images to merge. If this is a comma-delimited list of directories then the image is replicated in all of the directories for redundancy.</description>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/htemp/hadoop-${user.name}</value>
    <description>A base for other temporary directories.</description>
  </property>
</configuration>

hadoop-env.sh
Description: Bourne shell script

hdfs-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/project/hadoop-0.20.2/name</value>
    <description>Determines where on the local filesystem the DFS name node should store the name table (fsimage). If this is a comma-delimited list of directories then the name table is replicated in all of the directories, for redundancy.</description>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/hdd4/data</value>
    <description>Determines where on the local filesystem a DFS data node should store its blocks. If this is a comma-delimited list of directories, then data will be stored in all named directories, typically on different devices. Directories that do not exist are ignored.</description>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
    <description>Default block replication. The actual number of replications can be specified when the file is created. The default is used if replication is not specified at create time.</description>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
    <description>If "true", enable permission checking in HDFS. If "false", permission checking is turned off, but all other behavior is unchanged. Switching from one parameter value to the other does not change the mode, owner or group of files or directories.</description>
  </property>
  <property>
    <name>dfs.permissions.supergroup</name>
    <value>hadoop,root</value>
    <description>The name of the group of super-users.</description>
  </property>
</configuration>

mapred-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>192.168.0.133:54311</value>
    <description>The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.</description>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/hdd4/mapred/local</value>
    <description>The local directory where MapReduce stores intermediate data files. May be a comma-separated list of directories on different devices in order to spread disk i/o. Directories that do not exist are ignored.</description>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/home/mapred/system</value>
    <description>The shared
RE: hadoop installation problem(single-node)
If you are interested in a quick-start Hadoop and don't mind that HBase is included, take a look at the dashboard application at www.habermaas.com. It is a free packaged Hadoop setup: just unzip it and run it.

Bill

-----Original Message-----
From: Manish Yadav [mailto:manish.ya...@orkash.com]
Sent: Wednesday, March 02, 2011 4:00 AM
To: core-u...@hadoop.apache.org
Subject: hadoop installation problem(single-node)

> [original message trimmed; see the full copy below]
Re: hadoop installation problem(single-node)
Hey Manish,

Are you giving the commands in the HADOOP_HOME directory? If yes, please give

bin/hadoop namenode -format

Don't forget to prepend bin/ to your commands, because all the scripts reside in the bin directory.

Matthew

On Wed, Mar 2, 2011 at 2:29 PM, Manish Yadav <manish.ya...@orkash.com> wrote:
> [quoted text of the original message trimmed]
Re: hadoop installation problem(single-node)
The instructions at http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html should be what you need.

Cheers,
Tom

On Wed, Mar 2, 2011 at 12:59 AM, Manish Yadav <manish.ya...@orkash.com> wrote:
> [quoted text of the original message trimmed]
Re: hadoop installation problem(single-node)
Matthew, thanks for the help. Now the command is working, but I got the following errors. Will you help me solve them? I'm giving you the error list. I just used the command

hadoop@ws40-man-lin:~/project/hadoop-0.20.0$ bin/hadoop namenode -format

and I get the following result:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.

Now what am I doing wrong?
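A NoClassDefFoundError at this point usually means the launcher script cannot find the Hadoop core jar under HADOOP_HOME — one plausible cause here is a version mismatch, since the attached configs reference /home/hadoop/project/hadoop-0.20.2 while HADOOP_HOME is /home/hadoop/project/hadoop-0.20.0. A quick sanity check, sketched with the path from the thread:

```shell
# The bin/hadoop launcher builds its classpath from jars under HADOOP_HOME,
# so the hadoop-*-core.jar must actually exist there.
# Path taken from the thread; adjust to your install.
HADOOP_HOME=/home/hadoop/project/hadoop-0.20.0
if ls "$HADOOP_HOME"/hadoop-*-core.jar >/dev/null 2>&1; then
  echo "core jar found"
else
  echo "core jar missing: HADOOP_HOME may not match the extracted version"
fi
```

If the jar is missing, re-check that HADOOP_HOME points at the directory where the tarball was actually extracted, and that the version numbers in the config paths agree with it.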
Re: hadoop installation problem(single-node)
Thanks for the help. Now the command is working, but I got the following errors. Will you help me solve them? I'm giving you the error list I faced while installing Hadoop on a single-node cluster; all the configuration files are attached to the earlier post. I just used the command

hadoop@ws40-man-lin:~/project/hadoop-0.20.0$ bin/hadoop namenode -format

and I get the following result:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.

Now what am I doing wrong?
Re: hadoop installation problem(single-node)
Hey Manish,

I am not very sure that you have got your configuration correct, including the Java path. Can you try re-installing Hadoop following the guidelines in the link below, step by step? That would take care of any possible glitches.

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Thanks,
Matthew

On Thu, Mar 3, 2011 at 10:42 AM, manish.yadav <manish.ya...@orkash.com> wrote:
> [quoted text of the previous message trimmed]
Re: hadoop installation problem(single-node)
Hi, thanks for replying. I already attached the configuration files in my earlier post. Please check them and tell me what I'm doing wrong.