You may want to look at
http://hadoop.apache.org/hdfs/docs/current/hdfs_quota_admin_guide.html
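
In case it's useful, the space quota commands from that guide look roughly
like this (the path and size below are just placeholders, and the space
quota counts bytes after replication):

  # cap the space used under a directory (value is in bytes, ~100 GB here)
  hadoop dfsadmin -setSpaceQuota 107374182400 /user/hamed
  # show the quota and how much of it is consumed
  hadoop fs -count -q /user/hamed
  # remove the space quota again
  hadoop dfsadmin -clrSpaceQuota /user/hamed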

Thanks,
Anirudh

On Sat, Dec 31, 2011 at 10:03 PM, Hamed Ghavamnia <ghavamni...@gmail.com> wrote:

> Hi,
> I've been searching for how to configure the maximum capacity of a
> datanode. I've added big volumes to one of my datanodes, but the configured
> capacity doesn't get bigger than the default 5GB. If I want a datanode with
> 100GB of capacity, I have to add 20 directories of 5GB each so that the
> capacity reaches 100GB. Is there anywhere this can be set? Can
> different datanodes have different capacities?
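
The datanode's configured capacity normally comes from the size of the
filesystems backing its data directories, minus the reserved space, rather
than from a separate size setting, and each datanode reads its own
configuration. A minimal hdfs-site.xml sketch for one datanode, assuming the
0.20/1.x property name dfs.data.dir (dfs.datanode.data.dir in later
releases) and made-up mount points:

  <property>
    <!-- one entry per mounted disk; the reported capacity is roughly the
         sum of these volumes' filesystem sizes minus the reserved space -->
    <name>dfs.data.dir</name>
    <value>/data/disk1/dfs/data,/data/disk2/dfs/data</value>
  </property>
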
>
> Also, it seems like *dfs.datanode.du.reserved* doesn't work either,
> because I've set it to zero, but it still leaves 50% of the free space for
> non-dfs usage.
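
For what it's worth, dfs.datanode.du.reserved is specified in bytes per
volume and, as far as I know, only takes effect after the datanode is
restarted; one way to check the result (the exact report fields vary a bit
by version):

  <property>
    <!-- bytes per volume reserved for non-DFS use; 0 reserves nothing -->
    <name>dfs.datanode.du.reserved</name>
    <value>0</value>
  </property>

  # per-datanode Configured Capacity, DFS Used and Non DFS Used
  hadoop dfsadmin -report
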
>
> Thanks,
> Hamed
>
> P.S. This is my first message on the mailing list, so if there are any
> rules I should follow when sending emails, I'd be thankful if you let me
> know. :)
>
