Take a look at the HDFS Quotas Guide:
http://hadoop.apache.org/core/docs/r0.20.0/hdfs_quota_admin_guide.html
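A minimal sketch of the quota commands that guide covers (the directory name and limits here are illustrative, not from the thread):

```shell
# Set a name quota: at most 10000 files/directories under /user/alice
hadoop dfsadmin -setQuota 10000 /user/alice

# Set a space quota in bytes (1 GB here)
hadoop dfsadmin -setSpaceQuota 1073741824 /user/alice

# Inspect current quotas and usage (if your version supports -count -q)
hadoop fs -count -q /user/alice

# Remove the quotas again
hadoop dfsadmin -clrQuota /user/alice
hadoop dfsadmin -clrSpaceQuota /user/alice
```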
Thanks,
-
Ravi
On 6/15/09 11:16 PM, "Palleti, Pallavi" wrote:
Hi all,
We have the chown command in hadoop dfs to make a particular directory owned
by a person. Do we have something similar t
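For reference, the chown usage mentioned above looks like this (user, group, and path are illustrative):

```shell
# Make the directory owned by user "alice" and group "hadoop"
hadoop dfs -chown alice:hadoop /user/alice/data

# The same, recursively for everything under the directory
hadoop dfs -chown -R alice:hadoop /user/alice/data
```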
re is some problem with the path,
> classpath etc. Could someone kindly explain
>
> Thanks
>
> On Sun, Jun 14, 2009 at 1:36 AM, Ravi Phulari wrote:
>
>> > You can find answers to all your questions on this web page:
>> >
>> > http://hadoop.apache.org/core/docs/cu
You can find answers to all your questions on this web page:
http://hadoop.apache.org/core/docs/current/mapred_tutorial.html
-
Ravi
On 6/13/09 3:30 AM, "bharath vissapragada" wrote:
Also please tell me how to set the path and related settings, and the
directory in which the code must be compiled, etc.
Thanks
On
From the logs it looks like your Hadoop cluster is facing two different issues.
At Slave
1. exception: java.net.NoRouteToHostException: No route to host in your logs
Diagnosis - one of your nodes cannot be reached. Make sure you can
ssh to your master and slave nodes and that passwordless ssh keys are set up.
If you have hadoop superuser/administrative permissions, you can use fsck with
the appropriate options to view the block report and the locations of every block.
For further information please refer -
http://hadoop.apache.org/core/docs/r0.20.0/commands_manual.html#fsck
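A sketch of the fsck invocation described above (the path is illustrative):

```shell
# Block report and locations for every file under /
hadoop fsck / -files -blocks -locations

# Limit the check to one directory
hadoop fsck /user/alice -files -blocks
```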
On 5/19/09 12:13 AM, "Foss User" wrote:
Looks like your NameNode is down.
Verify that the hadoop processes are running (jps should show you all running
Java processes).
If your hadoop processes are running, try restarting them.
I guess this problem is due to your fsimage not being correct.
You might have to format your namenode.
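The steps above can be sketched as follows (paths assume a standard tarball install; note that formatting the namenode erases all HDFS metadata, so treat it as a last resort):

```shell
# Check which Hadoop daemons are running
jps

# Restart the DFS daemons
bin/stop-dfs.sh
bin/start-dfs.sh

# Last resort: reformat the namenode (destroys filesystem metadata!)
bin/stop-dfs.sh
bin/hadoop namenode -format
bin/start-dfs.sh
```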
Add your second disk's path to dfs.data.dir.
Refer - http://hadoop.apache.org/core/docs/r0.19.1/cluster_setup.html
dfs.data.dir = Comma separated list of paths on the local filesystem of a
DataNode where it should store its blocks. If this is a comma-delimited list of
directories, then data wil
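In hadoop-site.xml the property might look like this (the disk paths are illustrative):

```xml
<property>
  <name>dfs.data.dir</name>
  <value>/disk1/hdfs/data,/disk2/hdfs/data</value>
</property>
```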
Try using hadoop fs -cat
E.g.
hadoop fs -cat
http://hadoop.apache.org/core/docs/r0.19.1/hdfs_shell.html#cat
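A sketch of reading HDFS output in place (the output path is illustrative):

```shell
# Print an HDFS file to stdout without copying it to the local filesystem
hadoop fs -cat /user/alice/output/part-00000

# Or pipe it straight into another command
hadoop fs -cat /user/alice/output/part-00000 | grep ERROR | head
```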
--
Ravi
On 4/9/09 8:56 PM, "Sid123" wrote:
I need to reuse the output of my DFS file without copying it to local. Is
there a way?
--
View this message in context:
http://www.nabble.com/C
Hello Steve.
Assuming you are using *nix.
To apply a patch:
patch -p0 -E < HADOOP-X.patch
To remove a patch:
patch -p0 --reverse -E < HADOOP-X.patch
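Run these from the top of the Hadoop source tree. With GNU patch, a --dry-run first is a cheap way to check the patch applies cleanly before touching any files:

```shell
# Preview whether the patch applies cleanly, without changing anything
patch -p0 --dry-run -E < HADOOP-X.patch

# Then apply for real
patch -p0 -E < HADOOP-X.patch
```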
Hope this helps.
Regards,
Ravi
On 3/17/09 4:48 PM, "Steve Gao" wrote:
I want to apply this patch https://issues.apache.org/jira/browse/HADOOP-17