Our app has occasional memory peaks that let the heap grow to 13M. After
a peak, there's 9M of free heap. Everything's fine until Bitmap
objects come into play: despite the 9M of free heap, creating
Bitmaps of only some 100k now fails with an
OutOfMemoryError!
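To illustrate what I'm seeing, here's roughly how I log the heap figures (a minimal sketch using only plain JDK Runtime calls; on a device one would additionally log android.os.Debug's native/external heap figures, which I've left out so this compiles anywhere):

```java
// Logs the current heap footprint vs. free space -- exactly the two
// numbers that look contradictory when the Bitmap allocation fails.
public class HeapStats {

    // Current size of the (Java) heap, in bytes.
    public static long heapSize() {
        return Runtime.getRuntime().totalMemory();
    }

    // Free space inside that heap, in bytes.
    public static long heapFree() {
        return Runtime.getRuntime().freeMemory();
    }

    public static void main(String[] args) {
        System.out.printf("heap size: %d KB, free: %d KB%n",
                heapSize() / 1024, heapFree() / 1024);
    }
}
```

In our case this prints a large free figure right before the ~100k Bitmap allocation throws.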

My theory: an allocation fails if the (Java) heap *size* plus external
allocations (e.g. Bitmaps) exceeds 16M, no matter how much free
memory there is in the heap. The VM could shrink the heap, but apparently
does not. That would be quite a flaw in the VM.
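Written down, the theory amounts to this check (my model of the behavior, not actual VM code; the 16M limit and the 3M of assumed existing external allocations are illustrative numbers):

```java
// Models the suspected rule: heap *size* + external allocations + the
// new request are checked against a fixed per-process budget, and free
// heap space is never consulted.
public class BudgetModel {

    static final long LIMIT = 16L * 1024 * 1024; // assumed 16M budget

    static boolean allocationFails(long heapSize, long externalAllocated,
                                   long request) {
        return heapSize + externalAllocated + request > LIMIT;
    }

    public static void main(String[] args) {
        long heapSize = 13L * 1024 * 1024; // heap grown during the peak
        long external = 3L * 1024 * 1024;  // assumed existing bitmaps etc.
        long bitmap   = 100L * 1024;       // the ~100k Bitmap request

        // 13M + 3M + 100k > 16M, so the allocation fails under the
        // theory -- even though 9M of the 13M heap are free.
        System.out.println(allocationFails(heapSize, external, bitmap));
    }
}
```

The free 9M never enters the check, which is exactly the symptom above.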

Does anyone share this theory?

On the downside of this theory: I do not see anything appropriate an
app developer could do to prevent these OutOfMemoryErrors (killing the
process probably doesn't count), given that the memory peaks are
legitimate and unavoidable.

Please share your knowledge & thoughts!

Thanks,
Markus

-- 
You received this message because you are subscribed to the Google
Groups "Android Developers" group.