I am writing some database extraction utilities at the company I work for, using Perl on Windows 2000/XP. There are several static tables that I download via ODBC and then perform lookups into... an ideal application for a hash.
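Roughly speaking, each table gets slurped into a hash along the lines of the sketch below (the DSN, table, and column names are made-up stand-ins; this assumes DBI with DBD::ODBC):

  use strict;
  use warnings;
  use DBI;

  # Hypothetical DSN; the real connection details differ.
  my $dbh = DBI->connect('dbi:ODBC:StaticTables', '', '',
                         { RaiseError => 1 });

  # Pull a (made-up) two-column static table into a lookup hash.
  my %part_desc;
  my $sth = $dbh->prepare('SELECT part_no, description FROM parts');
  $sth->execute;
  while (my ($part_no, $desc) = $sth->fetchrow_array) {
      $part_desc{$part_no} = $desc;
  }
  $dbh->disconnect;

  # Later lookups are then constant-time hash fetches:
  # my $desc = $part_desc{$some_part_no};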
After the 5th table, I started to worry about how much memory this was using, so I checked the combined size of these hashes; it came to about 64KB, and that is without factoring in the hash overhead.

The app runs well right now, but I'm concerned about breakage going forward. Since this is used in-house, I can guarantee that the app will always run on a machine with at least 2 GB of RAM and a 4 GB swap file. The user might also be running a couple of Office apps, but that would be all.

Granted that the answer to this question involves *many* variables I have not addressed, is there still some sort of guideline I can use to determine how much memory tied up in hashes is "too much"?

Also, if I reinitialize an existing hash to (), will that return the memory to Windows?
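To make both questions concrete, here is a minimal sketch using Devel::Size from CPAN (assuming it is installed) to measure a hash, followed by the two reinitialization idioms I am asking about:

  use strict;
  use warnings;
  use Devel::Size qw(total_size);   # CPAN module, assumed installed

  my %lookup = map { $_ => "value $_" } 1 .. 1000;   # stand-in table

  # total_size() walks the whole structure, so this includes keys,
  # values, and the hash's internal overhead, not just the payload.
  printf "hash uses %d bytes\n", total_size(\%lookup);

  %lookup = ();     # empties the hash; buckets may stay allocated
  undef %lookup;    # also releases the hash's internal storage to perl

Whether either form actually hands anything back to Windows, as opposed to perl's own allocator, is exactly the part I am unsure about.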
Barry Brevik