Hello,

I’m working with a database that has a full-text index. I have found that if I 
iteratively add XML documents, then optimize, add more documents, optimize 
again, and so on, eventually the “optimize” command will fail with “Out of Main 
Memory.” I edited the BaseX startup script to change the memory allocation from 
-Xmx2g to -Xmx12g. My computer has 16 GB of memory, though of course the OS uses 
up some of it. Even when I exit memory-hungry programs (web browser, Oxygen) 
before starting BaseX, the “optimize” command still fails with “Out of 
Main Memory.” I’m wondering if there are any known workarounds or strategies 
for this situation. If I understand the documentation about indexes correctly, 
index data is periodically written to disk during optimization. Does this mean 
that running optimize again will pick up where the previous attempt left off, 
such that running optimize repeatedly will eventually succeed?
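
In case it’s useful, the command sequence I’m running looks roughly like this 
(the database name and paths are just placeholders; I enable the full-text 
index before creating the database):

    SET FTINDEX true
    CREATE DB mydb
    ADD /path/to/batch1
    OPTIMIZE
    ADD /path/to/batch2
    OPTIMIZE

...and so on, adding and optimizing one batch at a time, until OPTIMIZE 
eventually fails.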

Thanks,
Greg


Gregory Murray
Director of Digital Initiatives
Wright Library
Princeton Theological Seminary

