User accounts in Master and Slaves

2008-04-23 Thread Sridhar Raman
After trying out Hadoop on a single machine, I decided to run a MapReduce job across multiple machines. This is the approach I followed:
1 Master
1 Slave
(A doubt here: can my Master also be used to execute the Map/Reduce functions?) To do this, I set up the masters and slaves files in the conf dir
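The masters and slaves files mentioned above are plain lists of hostnames, one per line. A minimal sketch for the 1-master/1-slave setup described (hostnames `master` and `slave1` are placeholders; listing the master in the slaves file is what lets it run Map/Reduce tasks as well, which addresses the doubt raised):

```text
# conf/masters — host(s) that run the secondary namenode
master

# conf/slaves — hosts that run the DataNode/TaskTracker daemons;
# include the master here too if it should also execute tasks
master
slave1
```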

Re: User accounts in Master and Slaves

2008-04-23 Thread Harish Mallipeddi
On Wed, Apr 23, 2008 at 3:03 PM, Sridhar Raman <[EMAIL PROTECTED]> wrote:
> After trying out Hadoop in a single machine, I decided to run a MapReduce
> across multiple machines. This is the approach I followed:
> 1 Master
> 1 Slave
>
> (A doubt here: Can my Master also be used to execute the Map

Re: User accounts in Master and Slaves

2008-04-23 Thread Sridhar Raman
Ok, what about the issue regarding the users? Do all the machines need to be under the same user?

On Wed, Apr 23, 2008 at 12:43 PM, Harish Mallipeddi <[EMAIL PROTECTED]> wrote:
> On Wed, Apr 23, 2008 at 3:03 PM, Sridhar Raman <[EMAIL PROTECTED]> wrote:
>
> > After trying out Hadoop in a singl

Re: User accounts in Master and Slaves

2008-04-23 Thread Norbert Burger
Yes, this is the suggested configuration. Hadoop relies on password-less SSH to be able to start tasks on slave machines. You can find instructions on creating/transferring the SSH keys here: http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_%28Multi-Node_Cluster%29

On Wed, Apr 23
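The password-less SSH setup the linked guide walks through can be sketched roughly as below. This is a minimal sketch assuming OpenSSH and the same user account on every node (the `hadoop` user name and `slave1` hostname are placeholders, not from the thread):

```shell
# Run as the hadoop user on the master node.
mkdir -p ~/.ssh
# Generate an RSA keypair with an empty passphrase (if none exists yet),
# so ssh to the slaves needs no interactive prompt
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the key locally too, so the master can ssh to itself
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Copy the public key to each slave (run once per slave):
# ssh-copy-id hadoop@slave1
```

After this, `ssh slave1` from the master should log in without a password prompt, which is what the start/stop scripts rely on.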

Re: User accounts in Master and Slaves

2008-04-24 Thread Sridhar Raman
I tried following the instructions for a single-node cluster (as mentioned in the link). I am facing a strange roadblock. In the hadoop-site.xml, I have set the value of hadoop.tmp.dir to /WORK/temp/hadoop/workspace/hadoop-${user.name}. After doing this, I run bin/hadoop namenode -format, and th
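The property described above would look roughly like this in hadoop-site.xml (the path and the `${user.name}` substitution are the poster's own; Hadoop expands `${user.name}` to the login name of the user running the daemon):

```xml
<property>
  <name>hadoop.tmp.dir</name>
  <value>/WORK/temp/hadoop/workspace/hadoop-${user.name}</value>
</property>
```

With this in place, `bin/hadoop namenode -format` creates the filesystem image under that per-user directory.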

Re: User accounts in Master and Slaves

2008-05-01 Thread Sridhar Raman
Though I am able to run MapReduce tasks without errors, I am still not able to get stop-all to work. It still says, "no tasktracker to stop, no datanode to stop, ...". And also, there are a lot of java processes running in my Task Manager which I need to forcibly shut down. Are these two problem
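One plausible cause for the "no tasktracker to stop" symptom: the stop scripts locate daemons through pid files written under HADOOP_PID_DIR, which defaults to /tmp; if those files are removed (for example by periodic tmp cleanup) the scripts report nothing to stop even though the JVMs are still alive, leaving them to be killed by hand. A sketch of the workaround, assuming this is the cause (the path below is an example, not from the thread):

```shell
# conf/hadoop-env.sh — keep pid files somewhere that is not
# periodically cleaned, so stop-all.sh can find the daemons
export HADOOP_PID_DIR=/WORK/temp/hadoop/pids
```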