Sorry, but your initial problem statement doesn’t seem to parse … 

Are you saying that you have a single row with approximately 100,000 elements, where 
each element is roughly 1-5 KB in size, and in addition there are ~5 elements 
which will be between one and five MB in size? 

And you then mention a coprocessor? 

Just looking at the numbers… 100K * 5KB means that each row would end up being 
500MB in size. 
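To make that back-of-the-envelope estimate concrete, here's a quick sketch using the numbers from the question (the exact sizes are of course approximations, not measured values):

```python
# Rough per-row size estimate, assuming the figures from the question:
# ~100,000 qualifiers at ~5 KB each, plus ~5 large values at ~5 MB each.
small_cells = 100_000 * 5 * 1024       # ~100K small qualifiers
large_cells = 5 * 5 * 1024 * 1024      # ~5 large qualifiers
total_bytes = small_cells + large_cells
print(f"~{total_bytes / 1024**2:.0f} MiB per row")  # ~513 MiB per row
```

And that ignores per-cell overhead (key, family, qualifier, and timestamp are stored with every cell), so the physical row is even larger.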

That’s a pretty fat row.

I would suggest rethinking your strategy. 
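One common way to rethink it is to split the one fat logical row into many narrower physical rows, e.g. by hashing each qualifier into a bucket that becomes part of the row key. The sketch below is purely illustrative (the key format and bucket count are my assumptions, not anything from your schema), but it shows the idea:

```python
import hashlib

def bucketed_row_key(base_key: str, qualifier: str, buckets: int = 64) -> str:
    """Spread one logical wide row across `buckets` physical rows by
    hashing the qualifier into a bucket suffix on the row key.
    Illustrative sketch only; the right key design depends on your
    access patterns and scan requirements."""
    h = int(hashlib.md5(qualifier.encode("utf-8")).hexdigest(), 16)
    return f"{base_key}:{h % buckets:02d}"

# The same qualifier always maps to the same physical row, so point
# reads stay cheap, and each physical row is ~1/64th the size.
print(bucketed_row_key("user42", "some-qualifier"))
```

With something like this, a coprocessor touching 1-10 qualifiers only has to deal with a handful of small rows instead of one 500 MB monster.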

> On Apr 7, 2015, at 11:13 AM, Kristoffer Sjögren <sto...@gmail.com> wrote:
> 
> Hi
> 
> I have a row with around 100,000 qualifiers with mostly small values around
> 1-5 KB and maybe 5 larger ones around 1-5 MB. A coprocessor does random
> access of 1-10 qualifiers per row.
> 
> I would like to understand how HBase loads the data into memory. Will the
> entire row be loaded or only the qualifiers I ask for (like pointer access
> into a direct ByteBuffer) ?
> 
> Cheers,
> -Kristoffer

The opinions expressed here are mine, while they may reflect a cognitive 
thought, that is purely accidental. 
Use at your own risk. 
Michael Segel
michael_segel (AT) hotmail.com

