Hi,

I started with the "ssh localhost" command.
Is anything else needed to check SSH?

Then I stopped all the running services with "stop-all.sh"
and started them again with "start-all.sh".

I have copied the terminal output for some of the commands below.

I don't understand why start-all.sh says it is starting the namenode and
shows no failure, yet
when I check with jps it does not list the namenode.

I also tried opening the namenode web UI in a browser, but it does not load.
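For what it's worth, when start-all.sh reports no failure but jps shows no NameNode, the actual error is usually in the namenode .log file (the script only prints the .out path). A diagnostic sketch, assuming the log directory and file name that match the paths printed in the transcript below:

```shell
# Show the last lines of the namenode log; the exact path is an
# assumption based on the paths start-all.sh printed in this transcript.
tail -n 50 /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.log

# Double-check whether a NameNode JVM is actually running.
jps | grep -i namenode
```

If the log shows an exception about the name directory being missing or in an inconsistent state, that points at the format step rather than SSH.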

----------------------------------------------------------------------------------------------------------------------------------------

This is how it executed on the terminal:

hduser@ubuntu:~$ ssh localhost
hduser@localhost's password:
Welcome to Ubuntu 12.04.2 LTS

 * Documentation:  https://help.ubuntu.com/

459 packages can be updated.
209 updates are security updates.

Last login: Sun Feb  2 00:28:46 2014 from localhost




hduser@ubuntu:~$ /usr/local/hadoop/bin/hadoop namenode -format
14/04/07 01:44:20 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ubuntu/127.0.0.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.0.3
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1335192; compiled by 'hortonfo' on Tue May  8 20:31:25 UTC 2012
************************************************************/
Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) y
Format aborted in /app/hadoop/tmp/dfs/name
14/04/07 01:44:27 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.0.1
************************************************************/
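One thing stands out in this transcript: the re-format prompt was answered with a lowercase "y", and Hadoop 1.x treats anything other than an uppercase "Y" as a "no", so the format was aborted ("Format aborted in /app/hadoop/tmp/dfs/name"). If the name directory was never formatted successfully, the namenode cannot start. A possible fix, assuming this is a fresh install (re-formatting wipes all HDFS metadata):

```shell
# Stop everything first, then re-run the format and answer the
# prompt with an uppercase Y (a lowercase y aborts in Hadoop 1.x).
/usr/local/hadoop/bin/stop-all.sh
/usr/local/hadoop/bin/hadoop namenode -format
```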


hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh
starting namenode, logging to /usr/local/hadoop/libexec/../
logs/hadoop-hduser-namenode-ubuntu.out
ehduser@localhost's password:
hduser@localhost's password: localhost: Permission denied, please try again.
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../
logs/hadoop-hduser-datanode-ubuntu.out
hduser@
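The password prompts and the "Permission denied" line in the transcript above suggest that passwordless SSH to localhost is not set up, which the start/stop scripts rely on to launch each daemon. A common sketch, assuming the hduser account and the default OpenSSH key paths:

```shell
# Generate a key with an empty passphrase
# (skip this if ~/.ssh/id_rsa already exists).
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Authorize the key for logins to localhost.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# This should now log in without asking for a password.
ssh localhost exit && echo "passwordless ssh OK"
```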





On Sun, Apr 13, 2014 at 9:14 PM, Mahesh Khandewal <mahesh.k....@gmail.com>wrote:

> Ekta, it may be an SSH problem. First check SSH.
>
>
> On Sun, Apr 13, 2014 at 8:46 PM, Ekta Agrawal <ektacloudst...@gmail.com>wrote:
>
>> I already used the same guide to install hadoop.
>>
>> If HDFS does not require anything beyond the single-node Hadoop
>> installation, then the installation part is complete.
>>
>> I tried running:
>>     bin/hadoop dfs -mkdir /foodir
>>     bin/hadoop dfsadmin -safemode enter
>>
>> These commands give the following exception:
>>
>> 14/04/07 00:23:09 INFO ipc.Client: Retrying connect to server:localhost/
>> 127.0.0.1:54310. Already tried 9 time(s).
>> Bad connection to FS. command aborted. exception: Call to localhost/
>> 127.0.0.1:54310 failed on connection exception:
>> java.net.ConnectException: Connection refused
>>
>> Can somebody help me understand why this is happening?
>>
>>
>>
>>
>>
>> On Sun, Apr 13, 2014 at 10:33 AM, Mahesh Khandewal <
>> mahesh.k....@gmail.com> wrote:
>>
>>> I think HDFS comes with the Hadoop installation itself.
>>> You just need to run a script like
>>> bin/start-dfs.sh from the $HADOOP_HOME path.
>>>
>>>
>>> On Sun, Apr 13, 2014 at 10:27 AM, Ekta Agrawal <ektacloudst...@gmail.com
>>> > wrote:
>>>
>>>> Can anybody suggest a good tutorial for installing and working with
>>>> HDFS?
>>>>
>>>> I installed Hadoop on Ubuntu as a single node. I can see the services
>>>> running.
>>>>
>>>> But how do I install and work with HDFS? Please give some guidance.
>>>>
>>>
>>>
>>
>
