Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Steve Holden
per wrote: > hi all, > > i have a very large dictionary object that is built from a text file > that is about 800 MB -- it contains several million keys. ideally i > would like to pickle this object so that i wouldnt have to parse this > large file to compute the dictionary every time i run my pr

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Tim Chase
i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldnt have to parse this large file to compute the dictionary every time i run my program. however currently the pi

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Dave Angel
As others have said, a database is probably the right answer. There, the data is kept on disk, and only a few records at a time are read for each access, with modification transactions usually being synchronous. However, there are use cases where your approach makes more sense. And it should
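
The disk-backed behaviour Dave describes (data kept on disk, only the records you touch being read) is exactly what the standard-library `shelve` module provides: a persistent, dict-like object backed by a dbm file. A minimal sketch, assuming the file name and keys (they are invented for illustration, not taken from the thread):

```python
import shelve

# shelve gives a dict-like object whose entries live on disk (via dbm);
# only the records you actually access are read into memory.
with shelve.open("big_mapping.db") as db:
    db["some_key"] = [1.5, 2.5]   # written through to disk
    db["other_key"] = [3.5]

# Reopening later: no need to rebuild or unpickle the whole mapping.
with shelve.open("big_mapping.db") as db:
    print(db["some_key"])         # only this record is loaded
```

Note that values are pickled individually per key, so writes are incremental rather than one 800 MB pickle.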

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread John Machin
On Mar 23, 1:32 pm, per wrote: > hi all, > > i have a very large dictionary object that is built from a text file > that is about 800 MB -- it contains several million keys.  ideally i > would like to pickle this object so that i wouldnt have to parse this > large file to compute the dictionary ev

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Steven D'Aprano
On Sun, 22 Mar 2009 23:10:21 -0400, Terry Reedy wrote: > Searching for a key in, say, 10 dicts will be slower than searching for > it in just one. The only reason I would do this would be if the dict > had to be split, say over several machines. But then, you could query > them in parallel. Tha
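
Terry's objection (searching 10 dicts is slower than searching one) only applies if every lookup has to probe all the sub-dicts. If the dictionary really must be split, each key can be routed to a fixed shard by hashing it, so a lookup still touches exactly one dict. A sketch under invented names (shard count and keys are illustrative, not from the thread):

```python
NUM_SHARDS = 10

shards = [dict() for _ in range(NUM_SHARDS)]

def shard_for(key):
    # The same key always hashes to the same shard, so both puts and
    # gets touch exactly one sub-dictionary, never all ten.
    return shards[hash(key) % len(shards)]

def put(key, value):
    shard_for(key)[key] = value

def get(key):
    return shard_for(key)[key]

put("gene_0001", 3.7)
print(get("gene_0001"))  # 3.7
```

This keeps lookups O(1); the cost of splitting is organisational, not algorithmic.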

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Terry Reedy
per wrote: hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldnt have to parse this large file to compute the dictionary every time i run my program. how

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Paul Rubin
per writes: > fair enough - what native python database would you recommend? i > prefer not to install anything commercial or anything other than > python modules I think sqlite is the preferred one these days. -- http://mail.python.org/mailman/listinfo/python-list
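
The `sqlite3` module Paul is referring to ships with Python, so it satisfies the "nothing other than python modules" constraint. A minimal key-value sketch (the table name, file name, and sample rows are invented for illustration):

```python
import sqlite3

# sqlite3 is in the standard library; the database is a single file on disk.
conn = sqlite3.connect("big_mapping.sqlite")
conn.execute("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)")
conn.executemany("INSERT OR REPLACE INTO kv VALUES (?, ?)",
                 [("gene_0001", "3.7"), ("gene_0002", "1.2")])
conn.commit()

# The PRIMARY KEY index means only the requested row is read from disk.
row = conn.execute("SELECT value FROM kv WHERE key = ?", ("gene_0001",)).fetchone()
print(row[0])  # 3.7
```

Unlike a pickled dict, the file never has to be loaded wholesale: queries pull individual rows via the index.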

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Armin
On Monday 23 March 2009 00:01:40 per wrote: > On Mar 22, 10:51 pm, Paul Rubin wrote: > > per writes: > > > i would like to split the dictionary into smaller ones, containing > > > only hundreds of thousands of keys, and then try to pickle them. > > > > That already s

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread per
On Mar 22, 10:51 pm, Paul Rubin wrote: > per writes: > > i would like to split the dictionary into smaller ones, containing > > only hundreds of thousands of keys, and then try to pickle them. > > That already sounds like the wrong approach.  You want a database. fa

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread odeits
On Mar 22, 7:32 pm, per wrote: > hi all, > > i have a very large dictionary object that is built from a text file > that is about 800 MB -- it contains several million keys.  ideally i > would like to pickle this object so that i wouldnt have to parse this > large file to compute the dictionary ev

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Paul Rubin
per writes: > i would like to split the dictionary into smaller ones, containing > only hundreds of thousands of keys, and then try to pickle them. That already sounds like the wrong approach. You want a database.

splitting a large dictionary into smaller ones

2009-03-22 Thread per
hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldnt have to parse this large file to compute the dictionary every time i run my program. however currentl
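
For completeness, what the original poster asked about (splitting the dict into smaller ones and pickling each) can be sketched as below. The thread's consensus was that a database is the better answer, but chunked pickling does avoid building one giant pickle. The file-name prefix and sample data are invented for illustration:

```python
import pickle
from itertools import islice

def pickle_in_chunks(big_dict, chunk_size, prefix="dict_part"):
    """Pickle a large dict as several smaller dicts of at most
    chunk_size entries each; returns the number of files written."""
    it = iter(big_dict.items())
    part = 0
    while True:
        chunk = dict(islice(it, chunk_size))  # next chunk_size items
        if not chunk:
            break
        with open(f"{prefix}_{part}.pkl", "wb") as f:
            pickle.dump(chunk, f, protocol=pickle.HIGHEST_PROTOCOL)
        part += 1
    return part

# 10 entries in chunks of 4 -> 3 files (4 + 4 + 2 entries).
n = pickle_in_chunks({f"key_{i}": i for i in range(10)}, chunk_size=4)
print(n)  # 3
```

Each chunk pickles and unpickles independently, which sidesteps the memory spike of pickling the whole mapping at once, though every lookup then needs to know (or probe for) which chunk holds its key.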