I'm trying to create a time series table that will contain a couple of billion rows: daily values for several million items. The table will be visible to the outside world, so it needs to sustain a high read load at all times. My plan is to run a nightly map/reduce job that batch-loads the day's values for each item into the table. The problem seems to be that read performance is dismal while I'm writing data to the table.
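For reference, here's roughly what my current load path looks like (the table, column family, and column names are simplified stand-ins for my real schema):

-- Hive table backed by the HBase table via the HBase storage handler.
CREATE EXTERNAL TABLE daily_values_hbase (
  row_key STRING,   -- e.g. item id plus date
  val    DOUBLE
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:val')
TBLPROPERTIES ('hbase.table.name' = 'daily_values');

-- Nightly batch load: this runs as a map/reduce job that issues Puts
-- against the live region servers, which is when reads fall over.
INSERT INTO TABLE daily_values_hbase
SELECT concat(item_id, '_', ds), val
FROM staging_daily_values
WHERE ds = '${hiveconf:load_date}';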
Is there any way to accomplish what I'm trying to do? FWIW, I'm currently using the Hive HBase integration to load data into the HBase table, as sketched above.

Thank you,
Paul Nickerson
