Hey Marc:

You are still seeing 'too many open files'? What does your schema look like? I added a rough formula to http://wiki.apache.org/hadoop/Hbase/FAQ#5 for counting how many open mapfiles a running regionserver will have.
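For a rough back-of-envelope (a sketch only, not the FAQ's formula; all counts below are assumed, illustrative values): each store file is a Hadoop MapFile, and a MapFile holds two files open, a data file and an index file, so open files scale with regions times column families times store files per family.

```python
# Illustrative estimate of open files on one regionserver.
# All inputs are assumptions for the sake of the example.
regions = 100       # regions hosted by this regionserver (assumed)
families = 3        # column families per table (assumed)
storefiles = 4      # store files per family between compactions (assumed)

# Each MapFile keeps a data file and an index file open.
open_files = regions * families * storefiles * 2
print(open_files)   # 2400 with these example numbers
```

With numbers like these it is easy to blow past a default ulimit of 1024, which is why the limit has to be raised.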

Currently, your only recourse is upping the ulimit. Addressing this scaling barrier will be a focus of the next hbase release.
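For what it's worth, raising the limit usually looks something like this (a sketch; the exact mechanism depends on your distro, and 32768 is just an illustrative value, not an hbase recommendation):

```shell
# Show the current per-process open-file limit for this shell
ulimit -n

# Raise it for the current session (needs the hard limit to allow it,
# or root); make it permanent via /etc/security/limits.conf on most
# Linux systems, e.g. a line like:  hadoop  -  nofile  32768
ulimit -n 32768 2>/dev/null || echo "hard limit too low; edit limits.conf"
```

Remember the limit must be raised for the user that actually runs the regionserver process, and the process must be restarted to pick it up.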

St.Ack



Marc Harris wrote:
I have seen that hbase can cause "too many open files" errors. I increased
my limit to 10240 (10 times the previous limit) but still get errors.

Is there a recommended value that I should set my open files limit to?
Is there something else I can do to reduce the number of files, perhaps
with some other trade-off?

Thanks
- Marc
