Hello,

I need to build a large database with roughly 500,000 keys and a
variable amount of data per key. The data for each key could range
from 100 bytes to megabytes, and the data under each key will grow
over time as the database is built. Are there flags I should be
setting when opening the database to handle large amounts of data per
key? Is hash or btree recommended for this type of job? I'll be
building the database from scratch, so there will be lots of lookups
and appending of data. Testing shows btree to be faster, so I'm
leaning towards that. The estimated build time is around 10-12 hours
on my machine, so I want to make sure nothing gets messed up in the
10th hour.
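For what it's worth, the workload I have in mind looks roughly like the
sketch below. dbm-style databases (Berkeley DB hash and btree included)
have no native "append" operation, so growing a value means
read-modify-write. This uses the stdlib `dbm` module just to illustrate
the access pattern; `append_value` is a helper of my own, not part of
any library API, and the backend is whatever `dbm.open` picks:

```python
import dbm
import os
import tempfile

def append_value(db, key, chunk):
    """Grow the value under key by read-modify-write (no native append)."""
    if key in db:
        db[key] = db[key] + chunk
    else:
        db[key] = chunk

# Open (creating if missing) a database in a temp directory.
path = os.path.join(tempfile.mkdtemp(), "bigdb")
with dbm.open(path, "c") as db:  # "c" = create if it doesn't exist
    append_value(db, b"key1", b"100 bytes of data...")
    # Later passes keep appending under the same key.
    append_value(db, b"key1", b"...and more later")
```

The worry is whether this pattern stays safe and fast once individual
values reach megabytes and the run stretches past ten hours.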

TIA,
JM

-- 
http://mail.python.org/mailman/listinfo/python-list
