If too much data is put into a Java caching system too quickly, the JVM may
run out of memory (an OutOfMemoryError).

One idea: on every "put" operation (the only operation that can increase the
caching system's memory usage), the caching system checks whether the current
JVM memory usage is above a particular threshold, and if so temporarily
disables the "put" by making it a no-op.

Something like:

        static final float MEMORY_USAGE_THRESHOLD_PERCENT = 70.0F;
        ...
        // Method for putting stuff into the cache
        public Serializable put(Serializable stuff) {
                Runtime rt = Runtime.getRuntime();
                double mFree = rt.freeMemory();    // free memory within the currently allocated heap
                double mMax = rt.maxMemory();      // maximum size the heap is allowed to grow to
                double mTotal = rt.totalMemory();  // memory currently allocated to the heap
                double usage = (mTotal - mFree) / mMax * 100;   // percentage of the maximum heap in use

                if (usage > MEMORY_USAGE_THRESHOLD_PERCENT)
                        return null;    // cache "put" temporarily disabled
                // normal put

                ...
        }


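For reference, a minimal self-contained sketch of the same idea; the class
name MemoryGuardedCache and the ConcurrentHashMap backing store are made up
here just so the example compiles, and stand in for whatever the real cache
uses internally:

        import java.io.Serializable;
        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;

        public class MemoryGuardedCache {

                static final float MEMORY_USAGE_THRESHOLD_PERCENT = 70.0F;

                // Hypothetical backing store; each object is cached keyed by itself
                // purely to keep the original single-argument signature.
                private final Map<Serializable, Serializable> store =
                                new ConcurrentHashMap<>();

                public Serializable put(Serializable stuff) {
                        Runtime rt = Runtime.getRuntime();
                        double used = rt.totalMemory() - rt.freeMemory();
                        double usage = used / rt.maxMemory() * 100;

                        if (usage > MEMORY_USAGE_THRESHOLD_PERCENT)
                                return null;    // cache "put" temporarily disabled

                        return store.put(stuff, stuff);   // normal put
                }
        }

One consequence of this signature is that a skipped put and a put of a
previously absent value both return null, so callers cannot tell whether the
object was actually cached.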
With such a measure in place, if an out-of-memory problem still occurred, it
at least wouldn't be caused directly by the caching system.

Good idea?

Regards,
Hanson
