On Jul 4, 3:29 pm, MRAB wrote:
> mclovin wrote:
>
> [snip]
>
> > like I said, I need to do this 480,000 times, so to get this done
> > realistically I need to analyse about 5 per second. It appears that
> > the average matrix contains about 15 million elements.
>
On Jul 4, 12:51 pm, Scott David Daniels wrote:
> mclovin wrote:
> > OK then. I will try some of the strategies here but I guess things
> > aren't looking too good. I need to run this over a dataset that someone
> > pickled. I need to run this 480,000 times so you can see my problem.
> >> 2009/7/4 Andre Engels :
> >>> On Sat, Jul 4, 2009 at 9:33 AM, mclovin wrote:
> >>>> Currently I need to find the most common elements in thousands of
> >>>> arrays within one large array (around 2 million instances with ~70k
> >>>> unique elements)
> >>>> [snip]
Currently I need to find the most common elements in thousands of
arrays within one large array (around 2 million instances with ~70k
unique elements), so I set up a dictionary to handle the counting:
when I am iterating I increment the count on the corresponding
dictionary element. I then iterate through the dictionary to find the
most common elements.
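
In outline, that counting approach looks like this (a simplified,
untested sketch; `arrays` and the cutoff `n` are placeholders for the
real data):

    from collections import defaultdict
    import heapq

    def most_common(arrays, n=10):
        counts = defaultdict(int)        # element -> occurrence count
        for arr in arrays:               # thousands of arrays
            for elem in arr:
                counts[elem] += 1        # bump the count for this element
        # pull out the n highest counts without sorting the whole dict
        return heapq.nlargest(n, counts.items(), key=lambda kv: kv[1])

With roughly 15 million elements per matrix the Python-level inner loop
dominates; if these are NumPy arrays of small non-negative integers,
counting each array with numpy.bincount and merging the per-array
results would avoid most of that overhead.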
Hello all,
I need to have a dictionary of about 8 gigs (well, the data it is
processing is around 4 GB), so naturally I am running into memory
errors.
So I looked around and found bsddb, which acts like a dictionary object
but offloads the data from RAM to the HDD; however, it only supports
strings as keys and values.
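
One stdlib workaround would be shelve, which wraps a dbm-style database
and pickles the values for you, so arbitrary values work as long as the
keys are strings. A minimal sketch (untested at this scale; the
filename and `elements` are placeholders):

    import shelve

    elements = [1, 2, 2, 3, 3, 3]        # placeholder for the real data
    counts = shelve.open('counts.db')    # dict-like object backed by a file
    for elem in elements:
        key = str(elem)                  # shelve/bsddb keys must be strings
        counts[key] = counts.get(key, 0) + 1
    counts.close()

Note that every assignment re-pickles the value and hits the disk, so
at this volume you would want to accumulate counts in an ordinary
in-memory dict and flush them to the shelf in batches.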