Dear Wellington:
Many thanks for your help. Deeply appreciate it. It seems to work. I have tried
shutting down and starting up twice and tested hdfs dfs -ls /, and it connects
to hdfs.
Once again many thanks.
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004,
India. Ph: (044)- 28474593/ 43526162 (voicemail)
Because you are probably not defining "dfs.namenode.name.dir", the NN metadata
directory is being created under /tmp and getting deleted once the process is
restarted.
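As a sketch, the property could be pinned in hdfs-site.xml so the metadata survives restarts (the local directory path below is illustrative, not taken from this thread):

```xml
<!-- hdfs-site.xml: keep NameNode metadata out of /tmp -->
<!-- the directory path here is an illustrative example -->
<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///home/anand_vihar/hdfs/name</value>
  </property>
</configuration>
```

After changing this, the new directory has to be formatted once (the "hadoop namenode -format" step mentioned earlier in this thread) before running start-dfs.sh.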
On 27 Apr 2015, at 11:50, Anand Murali wrote:
Wellington:
I have done it at installation time. I shall try once again. However, I request
you to look at this URL and let me know your views/suggestions. BTW, if I
uninstall and re-install, this error goes away for that session.
Thanks.
Hello Anand,
This error means the NN could not find its metadata directory. You probably need
to run the "hadoop namenode -format" command before trying to start hdfs.
…
2015-04-27 15:21:42,696 WARN
org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Encountered exception
loading fsimage
org.apache
Dear Wellington:
You were right. There is an error with respect to temp files. Find attached the
log file. Appreciate your help.
Thanks
On Monday, April 27, 2015 2:46 PM, Wellington Chevreuil wrote:
There might be some FATAL/ERROR/WARN or Exception messages in this log file
that can explain why the NN process is dying. Can you paste some of the last
lines of the log file?
On 27 Apr 2015, at 09:37, Susheel Kumar Gadalay wrote:
Many thanks, Wellington, but what should I look for?
Regards
Anand
Sent from my iPhone
> On 27-Apr-2015, at 2:34 pm, Wellington Chevreuil wrote:
Hello Anand,
Per your original email, this would be:
/home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
Cheers.
On 27 Apr 2015, at 09:41, Anand Murali wrote:
Susheel:
Since I am new to this, what log file should I look for in the log dir, and
what should I be looking for?
Thanks
Sent from my iPhone
> On 27-Apr-2015, at 2:07 pm, Susheel Kumar Gadalay wrote:
The jps listing is not showing the namenode daemon.
Verify from the logs why the namenode is not up.
On 4/27/15, Anand Murali wrote:
Dear All:
Please find below.
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/sbin$ start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode,
Can you send me your hosts file?
On Thu, Apr 23, 2015 at 11:55 AM, Anand Murali wrote:
Hi:
I tried and was successful in changing /etc/hosts. I shut down and restarted
and get the same error.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
* Documentation: https://help.ubuntu.com/
Last login: Thu Apr 23 11:18:43 2015 from l
Many thanks my friend. Shall try it right away.
On Thursday, April 23, 2015 10:51 AM, sandeep vura wrote:
Run this command in the terminal:
$ sudo nano /etc/hosts (it will prompt for your password)
Then comment out the 127.0.1.1 line in the hosts file: #127.0.1.1
Add this line: 127.0.0.1 localhost
Save the hosts file and exit.
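For reference, a minimal sketch of what /etc/hosts could look like after that edit (the hostname Latitude-E5540 is taken from the shell prompts in this thread; the exact layout is illustrative):

```
127.0.0.1   localhost
# 127.0.1.1   Latitude-E5540   <- commented out so Hadoop binds to 127.0.0.1

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
```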
On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali wrote:
Sudo what, my friend? There are so many options to sudo.
Sent from my iPhone
> On 23-Apr-2015, at 8:20 am, sandeep vura wrote:
Anand,
Try sudo, it will work.
On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus wrote:
Can you try sudo?
https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
Regards,
Shahab
On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali wrote:
Dear Sandeep:
Many thanks. I did find hosts, but I do not have write privileges, even though
I am administrator. This is strange. Can you please advise?
Thanks
Hi Anand,
You should search the /etc directory in root, not the Hadoop directory.
On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali wrote:
Dear All:
I don't see an etc/hosts. Find below.
anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
total 76
drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 .
drwxrwxr-x 26 anand_vihar anand_vihar 4096 Apr 22 14:05 ..
drwxr-xr-x 2 anand_vihar ana
Ok thanks will do
Sent from my iPhone
> On 22-Apr-2015, at 2:39 pm, sandeep vura wrote:
The hosts file will be available in the /etc directory; please check once.
On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali wrote:
I don't seem to have /etc/hosts
Sent from my iPhone
> On 22-Apr-2015, at 2:30 pm, sandeep vura wrote:
Hi Anand,
Comment out the IP address 127.0.1.1 in /etc/hosts
and add the following line: 127.0.0.1 localhost
Restart your Hadoop cluster after making changes in /etc/hosts.
Regards,
Sandeep.v
On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali wrote:
Dear All:
Has anyone encountered this error, and if so, how have you fixed it, other than
re-installing Hadoop or re-running start-dfs.sh when you have already started
after boot? Find below.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
*
Himawan:
jps fails on my laptop although JDK 1.7.0 is installed. There is no etc/hosts,
and the slaves file has one entry, localhost.
On Monday, April 20, 2015, Himawan Mahardianto wrote:
Are you sure the namenode is running well, from the output of the jps command?
Have you tried giving your PC an IP other than 127.0.0.1?
And could you paste your /etc/hosts and hadoop_folder/etc/hadoop/slaves
file configuration in your reply?
On Mon, Apr 20, 2015 at 8:10 PM, Anand Murali wrote:
Hi
But the Hadoop wiki says this is a network issue, especially with Ubuntu.
Please look at my paste and follow through the link.
As regards my temporary solution: I have to remove all Hadoop files, re-extract
them, and start over, and then it works for a couple of runs before it starts
all over again.
Just run "jps" in your terminal; here is the jps output on my
namenode:
hadoop@node-17:~$ jps
18487 Jps
18150 NameNode
18385 SecondaryNameNode
hadoop@node-17:~$
From that output I can make sure that my namenode is running well. How
about your namenode, are you sure it's running well?
No. I shall try. Can you point me to jps resources?
Thanks
On Monday, April 20, 2015 5:50 PM, Himawan Mahardianto wrote:
Have you tried the jps command to see which Hadoop services are running?
On Mon, Apr 20, 2015 at 6:45 PM, Anand Murali wrote:
Yes, all Hadoop commands. The error message is linked to the IP address, and I
checked the Hadoop wiki; this is a network issue on Ubuntu. Unfortunately, I
don't know much about networks.
Have you tried:
hdfs dfs -ls /
with a slash at the end of the command?
On Mon, Apr 20, 2015 at 5:50 PM, Anand Murali wrote:
Hi All:
I am using Ubuntu 14.10 desktop and Hadoop-2.6 in pseudo-distributed mode.
start-dfs/stop-dfs is normal. However, after a couple of uses, when I try to
connect to HDFS, I am refused connection. Find below.
anand_vihar@Latitude-E5540:~$ ssh localhost
Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34
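For context, a connection-refused error on hdfs commands generally means nothing is listening at the address named by fs.defaultFS in core-site.xml (here because the NameNode had died). A typical pseudo-distributed setting is sketched below; the hdfs://localhost:9000 value is a common illustrative choice, not taken from this thread:

```xml
<!-- core-site.xml: clients connect to the NameNode at this address -->
<!-- the host/port value is an illustrative example -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```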
Many thanks. I had to reinstall Hadoop all over and then start the daemons. It
works now.
On Saturday, April 18, 2015 11:43 AM, sreebalineni wrote:
If nothing has changed, make sure all daemons are actually running; although
they look started, they might not stay running for long.
On Apr 18, 2015 7:59 AM, "Anand Murali" wrote:
Yes
Sent from my iPhone
> On 17-Apr-2015, at 9:01 pm, madhav krish wrote:
Did you start your name node using start-dfs.sh?
On Apr 17, 2015 1:52 AM, "Anand Murali" wrote:
Dear All:
I installed Hadoop-2.6 on Ubuntu 14.10 desktop yesterday and was able to
connect to hdfs and run a mapreduce job on a single-node yarn setup. Today I am
unable to connect, with the following:
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/sbin$ start-yarn.sh
starting yarn daemons
starting resourcemanag
All,
I have a 3-node Hadoop cluster (CDH 4.4), and every few days, or whenever I
load some data through Sqoop or query through Hive, I sometimes get the
following error:
Call From <> to <> failed on connection exception:
java.net.ConnectException: Connection refused
This has become so freque