On Thu, Aug 02, 2007 at 07:43:58PM -0000, lazy wrote:
> I have a Berkeley DB and I'm using the bsddb module to access it. The DB
> is quite huge (anywhere from 2-30GB). I want to iterate over the keys
> serially.
> I tried using something basic like
>
>     for key in db.keys():
>
> but this takes a lot of time. I guess Python is trying to get the list
> of all keys first and probably keep it in memory. Is there a way to
> avoid this, since I just want to access keys serially.
Does db.iterkeys() work better?

Christoph
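
For what it's worth, a minimal sketch of what I have in mind, assuming
the legacy bsddb interface (hashopen/btopen) on a reasonably recent
Python 2.x, where plain iteration and iterkeys() walk the database with
a cursor instead of building the whole key list up front; 'huge.db' is
just a placeholder file name:

    import bsddb

    # Open the existing database read-only ('r' requires the file to exist).
    db = bsddb.hashopen('huge.db', 'r')

    # iterkeys() yields one key at a time through a cursor, unlike db.keys(),
    # which materialises every key in a list before the loop even starts.
    for key in db.iterkeys():
        print key

    db.close()

If the file was created with btopen() rather than hashopen(), open it
with bsddb.btopen() instead; the iteration works the same either way.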