Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-19 Thread Niels Basjes
I never configure the ssh feature.
Not for running on a single node, and not for a full-size cluster.
I simply start all the required daemons (name/data/job/task) directly and
configure the ports on which each of them can be reached.
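For example, a minimal sketch of that kind of setup (assuming a Hadoop 1.x
layout; the port numbers are just the conventional ones I would pick for a
single box):

# conf/core-site.xml   -> fs.default.name    = hdfs://localhost:9000   (NameNode RPC address)
# conf/mapred-site.xml -> mapred.job.tracker = localhost:9001          (JobTracker RPC address)
# then start each daemon directly, no ssh involved:
bin/hadoop-daemon.sh start namenode
bin/hadoop-daemon.sh start datanode
bin/hadoop-daemon.sh start jobtracker
bin/hadoop-daemon.sh start tasktracker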

Niels Basjes


Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-18 Thread Amal G Jose
You can start Hadoop in pseudo-distributed mode without passwordless ssh.
Instead of start-all.sh, start-dfs.sh, etc., use

hadoop-daemon.sh (start|stop) <daemon name>

e.g. to start the jobtracker:

hadoop-daemon.sh start jobtracker
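After starting them one by one, a quick sanity check (just a sketch; jps
ships with the JDK):

jps                                  # should list NameNode, DataNode, JobTracker and TaskTracker
hadoop-daemon.sh stop tasktracker    # stop them individually in the same way
hadoop-daemon.sh stop jobtracker
hadoop-daemon.sh stop datanode
hadoop-daemon.sh stop namenode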




Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-17 Thread Bertrand Dechoux
The scripts themselves will use ssh to connect to every machine (even
localhost).
It's up to you whether you want to type the password every time. For a
pseudo-distributed system, I don't see an issue with configuring local
ssh access.

BUT Hadoop in itself does not require ssh. If you have a more appropriate
way to start/stop the processes, you can write your own scripts.
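For instance, a local wrapper could look roughly like this (only a sketch,
assuming the Hadoop 1.x hadoop-daemon.sh script and the usual daemon names;
the script name is made up):

#!/bin/sh
# hadoop-local.sh start|stop -- drive the local daemons without any ssh
for daemon in namenode datanode jobtracker tasktracker; do
  "$HADOOP_HOME/bin/hadoop-daemon.sh" "$1" "$daemon"
done

You would run it as "sh hadoop-local.sh start" or "sh hadoop-local.sh stop".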

Regards

Bertrand




Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread Mohammad Tariq
Hello Raj,

ssh is actually two things:
1- ssh: the command we use to connect to remote machines - the client.
2- sshd: the daemon that runs on the server and allows clients to connect
to it.
The ssh client is usually preinstalled on Linux, but the sshd daemon may
need to be installed before it can be started.

To start the Hadoop daemons via bin/start-dfs.sh and bin/start-mapred.sh,
you have to make ssh passwordless.
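A minimal sketch of the whole sequence on a single machine (assuming
OpenSSH and a Hadoop 1.x layout; adjust the paths to your install):

ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa         # key with an empty passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
ssh localhost                                    # should log in without a password prompt
exit
bin/start-dfs.sh                                 # namenode, datanode, secondarynamenode
bin/start-mapred.sh                              # jobtracker, tasktracker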

You might find this link useful:
http://cloudfront.blogspot.in/2012/07/how-to-setup-and-configure-ssh-on-ubuntu.html#.UZUCkUAW38s

Warm Regards,
Tariq
cloudfront.blogspot.com




Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread Raj Hadoop
Hi,

I am a bit confused here. I am planning to run on a single machine.

So what should I do to start the Hadoop processes? How should I set up SSH?
Can you please briefly explain what SSH is?

Thanks,
Raj



Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread Jay Vyas
Actually, I should amend my statement -- SSH is required, but passwordless
ssh (I guess) you can live without if you are willing to enter your
password for each process that gets started.

But why wouldn't you want to implement passwordless ssh in a
pseudo-distributed cluster? It's very easy to implement on a single node:

cat ~/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys







-- 
Jay Vyas
http://jayunit100.blogspot.com


Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread Jay Vyas
Yes, it is required -- in pseudo-distributed mode the jobtracker is not
necessarily aware that the tasktrackers / datanodes are on the same
machine, and the start scripts will thus attempt to ssh into them when
starting the respective daemons (e.g. via start-all.sh).
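Roughly speaking, the start scripts do something like this for every host
listed in conf/slaves -- which is just localhost in a pseudo-distributed
setup (a simplified sketch, not the actual slaves.sh):

for slave in $(cat "$HADOOP_HOME/conf/slaves"); do
  # one ssh connection per host and daemon, hence the password prompt (or key) each time
  ssh "$slave" "$HADOOP_HOME/bin/hadoop-daemon.sh start datanode"
  ssh "$slave" "$HADOOP_HOME/bin/hadoop-daemon.sh start tasktracker"
done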




-- 
Jay Vyas
http://jayunit100.blogspot.com


Re: Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread kishore alajangi
When you start the Hadoop processes, each process will ask for a password
to start. To avoid this we configure passwordless SSH, whether you use a
single node or multiple nodes. It is not mandatory, even if you use
multiple systems, as long as you are willing to enter the password for
each process.

Thanks,
Kishore.




Configuring SSH - is it required for pseudo-distributed mode?

2013-05-16 Thread Raj Hadoop
Hi,

I have a dedicated user on a Linux server for Hadoop. I am installing it in
pseudo-distributed mode on this box. I want to test my programs on this
machine. But I see that the installation steps mention that SSH needs to be
configured. If it is a single node, I don't require it, right? Please advise.
 
I was looking at this site
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
 
It mentioned this:
"Hadoop requires SSH access to manage its nodes, i.e. remote machines plus your 
local machine if you want to use Hadoop on it (which is what we want to do in 
this short tutorial). For our single-node setup of Hadoop, we therefore need to 
configure SSH access to localhost for the hduser user we created in the 
previous section.
"
 
Thanks,
Raj