Hi,

Is there an upper bound on the memory required to run syscs_util.syscs_compress_table? What determines how much memory is needed (for example, is it fixed or proportional to table size, and is it affected by page size or page cache size)? We're occasionally getting OutOfMemoryErrors when calling this procedure.

For example, when testing our app (which is not small, but is fairly quiescent during my testing), heap usage is stable at approximately 116M with a 256M max heap, and running a series of table compresses (strictly one at a time) eventually produces an OutOfMemoryError. So it seems the remaining ~140M of headroom is not enough to accommodate this procedure. All tables did compress successfully with a 512M max heap, but I need to understand how much memory Derby will require, and whether the requirement grows with larger tables.

I'm using Derby 10.4.2.  The "sequential" parameter is set to false.
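
In case it helps, this is roughly how we invoke the compresses, a minimal JDBC sketch: the connection URL, schema name, and table names below are placeholders for illustration, not our actual values.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class CompressTables {
        public static void main(String[] args) throws SQLException {
            // Placeholder connection URL and table list, just to show the call pattern.
            try (Connection conn = DriverManager.getConnection("jdbc:derby:myDB")) {
                String[] tables = { "ORDERS", "ORDER_LINES" }; // hypothetical table names
                for (String table : tables) {
                    // SYSCS_COMPRESS_TABLE(schema, table, sequential); sequential = 0 (false)
                    try (CallableStatement cs = conn.prepareCall(
                            "CALL SYSCS_UTIL.SYSCS_COMPRESS_TABLE(?, ?, ?)")) {
                        cs.setString(1, "APP");        // schema name (placeholder)
                        cs.setString(2, table);
                        cs.setShort(3, (short) 0);     // "sequential" parameter set to false
                        cs.execute();                  // compresses run strictly one at a time
                    }
                }
            }
        }
    }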

Thanks,
Jim
