Hi all,

I recently ran into a problem where HdfsBroker throws an out-of-memory
exception because too many CellStore files in HDFS are kept open - I
have over 600 ranges per range server, and with a maximum of 10 cell
stores per range, that's 6,000 files open at the same time, causing
HdfsBroker to consume gigabytes of memory.

If we open the CellStore file on demand, i.e. when a scanner is
created on it, the problem goes away. However, random-read performance
may drop due to the overhead of opening a file in HDFS. Any better
solution?

Donald