Hello Ludwig,

What exactly is the problem you are encountering right now? Is there a
specific issue you want to address? Maybe it can be approached from a
different angle; as the saying goes, there's more than one way to skin a
cat (poor cat).
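
In case it helps while you figure that out: DBM::Deep does let you tie a
hash to a file on disk with very few code changes. Here's a minimal sketch
(the file name, options, and record fields are made up for illustration):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBM::Deep;

  # Tie the lookup hash to a file on disk instead of keeping it in RAM.
  tie my %lookup, 'DBM::Deep', {
      file      => 'lookup.db',   # hypothetical file name
      locking   => 1,             # safe if more than one process touches it
      autoflush => 1,
  };

  # Populate it once from the large input file...
  $lookup{'some_key'} = { field1 => 'a', field2 => 'b' };

  # ...then use it exactly like the in-memory hash of hashes.
  print $lookup{'some_key'}{field1}, "\n";

Bear in mind that every access becomes disk I/O, so expect it to be much
slower than the in-memory hash; whether that matters depends on your
access pattern.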



On Tue, Mar 29, 2011 at 6:22 PM, Ludwig Isaac Lim <[email protected]> wrote:
> Hi Guys:
>
>         Here's the scenario:
>         A program written in Perl reads a large file and creates a very
> large hash (actually a hash of hashes, i.e. a hash of records) and then
> uses that hash as a lookup table for other data processing. The hash has
> about 3,096,000 entries and grows day by day. Right now the Perl process
> consumes 1 GB of RAM under 32-bit Perl, with no problems so far (i.e. no
> out-of-memory errors).
>
>       Is there a way to put it into an on-disk hash using something
> written in pure Perl (no DBMS or tools such as Berkeley DB)? I saw on
> Google that people recommend DBM::Deep. Does anyone here use DBM::Deep?
> I'd like to hear from people who use DBM::Deep for large on-disk hashes
> and learn how it performs.
>
>       Thanks in advance.
>
> Regards,
> Ludwig
>
_________________________________________________
Philippine Linux Users' Group (PLUG) Mailing List
http://lists.linux.org.ph/mailman/listinfo/plug
Searchable Archives: http://archives.free.net.ph
