On Thu, 12 Mar 2009 07:43:40 +0700, Bembi Prima <[email protected]> wrote:
>Thank you for giving me a number of solutions. I have implemented the sparse
>data structure on my array. It needs a lot more memory at run time, but it is
>definitely a quicker way to find an element in a big array.
>
>I have also found the cause of my insufficient-memory problem. After a long
>debugging session I found that the SetLength statement is the culprit.
>It seems there are cases where SetLength cannot find a large enough contiguous
>block of unused memory, so it returns an out-of-memory error.
>I'm still not quite sure that this is the real problem, since I have 2GB of
>RAM and the program always crashes at around 200MB of memory usage. But I
>already have a workaround, and it runs smoothly now.

The way that SetLength works (IIRC) is that it allocates space for the new,
larger copy, copies the existing data over, and then frees the old copy.
This means that as time goes by the heap gets more and more fragmented and
the available pieces become smaller. Increasing the size of the increment by
which memory is allocated should reduce heap fragmentation, and if you know
in advance how much space the data will take and can allocate it all up front,
you should find that the problem goes away entirely.

A

>On Tue, Mar 10, 2009 at 9:19 PM, Thomas Hruska <[email protected]> wrote:
>>
>> I'll pass on the myriad of available jokes about "memory problems".
>> Tempting though.
>>
>> The maximum amount of virtual memory any 32-bit Windows program can use is
>> 2GB by default (virtual memory includes RAM). If you plan on using more
>> than 500MB of RAM, you should start looking into hard drive caching,
>> because getting your program kicked to swap will be far more expensive
>> than implementing your own cache.
>>
>> As to your problem with the array lookup, you have a couple of options.
>> Searching the array now is O(N), and your overall algorithm could be
>> O(N^2) or worse. (Big-O notation is a way to measure how good an
>> algorithm is - when N becomes large, O(N^2)*C algorithms are very slow.
>> You also have to watch out for large C - O(log N)*C could be slower
>> than O(N)*C depending on the algorithm. C is the time it takes to
>> execute a single operation.)
>>
>> If the index positions DO NOT matter, you can pre-sort the array (which
>> sounds like a one-time O(N log N) operation) and then use a binary
>> search on it, which is O(log N).
>>
>> If the index positions DO matter, things get trickier. You have a
>> couple of options:
>>
>> If you DO have a fairly densely packed set of data whose range you know,
>> you can create an array of arrays of positions. If you want to know where
>> all the 5's are, you look at position 5 in the first array to get the
>> position list. This is O(1) (the fastest possible lookup: an instant
>> lookup with little to no overhead).
>>
>> If you DO NOT have a fairly densely packed set of data (say you have 50
>> numbers and the range is 1 to 50000), then you will need to use a sparse
>> data structure, like a map where a number maps to an array of positions.
>> Lookup time is O(log N).
>>
>> There are variations on the above, but it depends heavily on the patterns
>> you see in your data. The above are good rules of thumb to go by.
>>
>> --
>> Thomas Hruska
>> CubicleSoft President
>> Ph: 517-803-4197
>>
>> *NEW* MyTaskFocus 1.1
>> Get on task. Stay on task.
>>
>> http://www.CubicleSoft.com/MyTaskFocus/
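PS: here is a rough Free Pascal sketch of the "allocate in larger increments"
idea above. It is only an illustration - the names (TIntList, Append) are made
up, and a real program would pick a growth factor to suit its data - but it
shows how keeping a separate Count and doubling the capacity means SetLength
runs far less often, so the heap fragments much less.

program GrowSketch;
{$mode objfpc}

type
  TIntList = record
    Items: array of Integer;  // allocated capacity = Length(Items)
    Count: Integer;           // number of elements actually in use
  end;

procedure Append(var L: TIntList; Value: Integer);
begin
  // grow geometrically instead of one element at a time
  if L.Count = Length(L.Items) then
    SetLength(L.Items, Length(L.Items) * 2 + 16);
  L.Items[L.Count] := Value;
  Inc(L.Count);
end;

var
  List: TIntList;
  i: Integer;
begin
  List.Count := 0;
  // if the final size is known, reserve it once up front instead:
  //   SetLength(List.Items, KnownFinalSize);
  for i := 1 to 1000000 do
    Append(List, i);
  WriteLn('elements: ', List.Count, ', capacity: ', Length(List.Items));
end.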

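And, since it came up in Thomas's message, a small sketch of the pre-sort plus
binary search lookup he described (one O(N log N) sort, then O(log N) per
lookup). Again this is only an illustration with made-up data; if the original
index positions matter you would sort an array of (value, position) pairs
instead.

program SearchSketch;
{$mode objfpc}

// returns the index of Value in the ascending array A, or -1 if it is absent
function BinarySearch(const A: array of Integer; Value: Integer): Integer;
var
  Lo, Hi, Mid: Integer;
begin
  Lo := 0;
  Hi := High(A);
  while Lo <= Hi do
  begin
    Mid := (Lo + Hi) div 2;
    if A[Mid] = Value then
      Exit(Mid)
    else if A[Mid] < Value then
      Lo := Mid + 1
    else
      Hi := Mid - 1;
  end;
  Result := -1;
end;

var
  Data: array[0..6] of Integer = (2, 3, 5, 7, 11, 13, 17);  // already sorted
begin
  WriteLn('index of 11: ', BinarySearch(Data, 11));  // prints 4
  WriteLn('index of 4:  ', BinarySearch(Data, 4));   // prints -1
end.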
