Hi Indrashish,
Can you please check whether your DN is accessible from the NN, and whether the NN's IP is given in the DN's hdfs-site.xml? Because if the DN is up and running, the issue is that the DN is not able to connect to the NN to get registered.
You can add the DN to the include file as well.
thanks
Vikas
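The include file Vikas mentions is controlled by the dfs.hosts property on the NameNode. A minimal sketch of the relevant hdfs-site.xml entry follows; the file path below is illustrative, not taken from the thread:

```xml
<!-- hdfs-site.xml on the NameNode: dfs.hosts points to a file listing
     the hostnames of DataNodes permitted to register. The path below
     is an example, not from the thread. -->
<property>
  <name>dfs.hosts</name>
  <value>/usr/local/hadoop/conf/include</value>
</property>
```

Each permitted DataNode hostname goes on its own line in that include file; a DataNode not listed there will be refused registration.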
Hello,
My name is Indrashish Basu and I am a Masters student in the Department
of Electrical and Computer Engineering.
Currently I am doing my research project on a Hadoop implementation on an ARM processor, and I am facing an issue while trying to run sample Hadoop source code on it. Every
As per your dfs report, the available DataNode count is ZERO in your cluster.
Please check your data node logs.
Regards
Jitendra
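The live-node count Jitendra refers to can be read straight off the dfsadmin report. A small sketch using a mocked report line; on a live cluster you would pipe the output of bin/hadoop dfsadmin -report into the same grep instead of the printf:

```shell
# Mocked header line from 'bin/hadoop dfsadmin -report'; on a live
# cluster, pipe the real command's output into the same grep.
report='Datanodes available: 0 (0 total, 0 dead)'
printf '%s\n' "$report" | grep -o 'Datanodes available: [0-9]*'
```

A result of "Datanodes available: 0" means no DataNode has registered with the NameNode, matching the symptom in this thread.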
On 10/8/13, Basu,Indrashish indrash...@ufl.edu wrote:
Hi Jitendra,
This is what I am getting in the datanode logs :
2013-10-07 11:27:41,960 INFO
org.apache.hadoop.hdfs.server.common.Storage: Storage directory
/app/hadoop/tmp/dfs/data is not formatted.
2013-10-07 11:27:41,961 INFO
org.apache.hadoop.hdfs.server.common.Storage: Formatting ...
You don't have any more space left in your HDFS. Delete some old data or
add additional storage.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Oct 8, 2013 at 11:47 PM, Basu,Indrashish indrash...@ufl.edu wrote:
Hi,
Just to update on this: I have deleted all the old logs and files from
the /tmp and /app/hadoop directories and restarted all the nodes. I now
have 1 datanode available, as per the information below:
Configured Capacity: 3665985536 (3.41 GB)
Present Capacity: 24576 (24 KB)
DFS
Yes
Thanks
Jitendra
On 10/8/13, Basu,Indrashish indrash...@ufl.edu wrote:
Hi Tariq,
Thanks a lot for your help.
Can you please let me
know the path where I can check the old files in the HDFS and remove
them accordingly. I am sorry to bother with these questions, I am
absolutely new to Hadoop.
Thanks again for your time and pateince.
Regards,
Indrashish
You are welcome Basu.
Not a problem. You can use bin/hadoop fs -lsr / to list all the HDFS
files and directories. See which files are no longer required and delete
them using bin/hadoop fs -rm /path/to/the/file
Warm Regards,
Tariq
cloudfront.blogspot.com
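Before deleting anything, it can help to see which files are largest. In Hadoop releases of this vintage, the fifth column of fs -lsr output is the file size in bytes, so the listing can be sorted on it. The two listing lines below are mocked for illustration; a live check would pipe bin/hadoop fs -lsr / in instead:

```shell
# Mocked 'bin/hadoop fs -lsr /' lines; with a live cluster, pipe the
# real command's output in instead. Column 5 holds the size in bytes.
lsr_output='-rw-r--r--   1 hduser supergroup  1048576 2013-10-08 11:27 /user/hduser/big.dat
-rw-r--r--   1 hduser supergroup      512 2013-10-08 11:27 /user/hduser/small.dat'
printf '%s\n' "$lsr_output" | sort -k5 -n -r | head -1
```

The largest file sorts to the top, making it the first candidate for removal.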
On Tue, Oct 8, 2013 at 11:59
Hi Tariq,
Thanks for your help again.
I tried deleting the old
HDFS files and directories as you suggested, then reformatted and
restarted all the nodes. However, after running the dfsadmin report I am
again seeing that no datanode is listed.