Bembi Prima wrote:
> Hi all, I have a weird problem.
> I am trying to make a program with big arrays, but it always crashes with
> an out-of-memory error even though the computer has 2GB of memory and I've
> maximized the pagefile. From Task Manager I notice that total physical
> memory usage never exceeds 60% when my program is running. Why does this
> keep happening? I don't know what else to do.
> 
> One more question: is there a faster method than a simple loop to find the
> indexes of a certain number in an array?
> For example, if I have the array [2, 3, 5, 7, 9, 5, 11, 5, 5] and I want to
> find 5 in that array, the method should return the indexes where the element
> is 5, which are [2, 5, 7, 8]. I still use a simple check-every-element
> method, and I think it takes a significant amount of time if the array is
> so big.
> 
> 
> Thank you.

I'll pass on the myriad of available jokes about "memory problems". 
Tempting though.

The maximum amount of virtual address space any 32-bit Windows program can 
use is 2GB by default (virtual memory includes RAM).  If you plan on using 
more than 500MB of RAM, you should start looking into hard drive caching, 
because getting your program kicked to swap will be far more expensive 
than implementing your own cache.

As to your problem with the array lookup, you have a couple of options. 
Searching the array now is O(N), and your overall algorithm could be 
O(N^2) or worse.  (Big-O notation is a way to measure how an algorithm's 
running time grows as the input gets large - when N becomes large, 
O(N^2)*C algorithms are very slow.  You also have to watch out for large 
C - O(log N)*C could be slower than O(N)*C depending on the algorithm. 
C is the time it takes to execute a single operation.)
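For reference, the check-every-element approach from the original question is the O(N) baseline the options below try to beat.  A minimal sketch in Python (the function name is mine, not from the post):

```python
def find_indexes(arr, target):
    # O(N) linear scan: visit every element and collect matching positions.
    return [i for i, value in enumerate(arr) if value == target]

find_indexes([2, 3, 5, 7, 9, 5, 11, 5, 5], 5)  # [2, 5, 7, 8]
```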

If the index positions DO NOT matter, you can pre-sort the array (a 
one-time O(N log N) operation) and then use a binary search on it, which 
is O(log N) per lookup.
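A sketch of that sort-then-binary-search approach, using Python's standard `bisect` module (this only answers "is the value present?" - sorting destroys the original index positions):

```python
import bisect

def contains(sorted_arr, target):
    # Binary search: O(log N) per lookup after the one-time O(N log N) sort.
    i = bisect.bisect_left(sorted_arr, target)
    return i < len(sorted_arr) and sorted_arr[i] == target

data = sorted([2, 3, 5, 7, 9, 5, 11, 5, 5])
contains(data, 5)  # True
contains(data, 4)  # False
```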

If the index positions DO matter, things get trickier.  You have a 
couple of options:

If you DO have a fairly densely packed set of data whose range you know, 
you can create an array indexed by value, where each slot holds a list of 
positions.  If you want to know where all the 5's are, you look at slot 5 
in that array to get the position list.  This is O(1) (the fastest 
possible lookup time - an instant lookup with little to no overhead).
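A sketch of that dense position table, assuming you know the maximum value in advance (the function name is mine):

```python
def build_position_table(arr, max_value):
    # One slot per possible value; each slot is a list of index positions.
    # Build is O(N); each subsequent lookup is O(1) array indexing.
    table = [[] for _ in range(max_value + 1)]
    for i, value in enumerate(arr):
        table[value].append(i)
    return table

table = build_position_table([2, 3, 5, 7, 9, 5, 11, 5, 5], max_value=11)
table[5]  # [2, 5, 7, 8]
```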

If you DO NOT have a fairly densely packed set of data (say you have 50 
numbers and the range is 1 to 50000), then you will need to use a sparse 
data structure like a map, where each number maps to an array of 
positions.  Lookup time is O(log N) for a tree-based map.
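A sketch of the sparse variant using a Python dict (note Python's dict is a hash map, so lookups are O(1) on average rather than the O(log N) of a tree-based map such as C++'s std::map):

```python
def build_position_map(arr):
    # Only values that actually occur get an entry, so 50 numbers
    # spread over 1..50000 cost 50 entries, not 50000 slots.
    positions = {}
    for i, value in enumerate(arr):
        positions.setdefault(value, []).append(i)
    return positions

pos = build_position_map([7, 49999, 7, 12345])
pos.get(7)   # [0, 2]
pos.get(99)  # None (value not present)
```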


There are variations on the above, but it depends heavily on the patterns 
you see in your data.  The above are good rules of thumb to go by.

-- 
Thomas Hruska
CubicleSoft President
Ph: 517-803-4197

*NEW* MyTaskFocus 1.1
Get on task.  Stay on task.

http://www.CubicleSoft.com/MyTaskFocus/
