On 11/14/2019 1:46 AM, Hongxu Ma wrote:
Thank you @Shawn Heisey<mailto:apa...@elyograg.org>, you have helped me many times.

My -Xms is 1G.
When I restart Solr, I can watch memory usage increase (from 1G to 9G, taking
nearly 10 seconds).

I have a guess: maybe Solr is loading some needed files into heap memory, e.g.
*.tip (the term index files). What are your thoughts?

Solr's basic operation involves quite a lot of Java memory allocation. Most of what gets allocated turns into garbage almost immediately, but Java does not reuse that memory right away ... it can only be reused after garbage collection runs on the appropriate memory region.

The algorithms in Java that decide between grabbing more memory (up to the configured heap limit) and running garbage collection are beyond my understanding. For programs with heavy memory allocation, like Solr, the preference does seem to lean toward allocating more memory, when it is available, rather than performing garbage collection.
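This behavior can be observed from any JVM process. The sketch below (a hypothetical standalone class, not part of Solr) uses the standard `Runtime` API to print heap figures before and after a burst of short-lived allocations; on most JVMs you will see the reserved heap ("total") grow toward the -Xmx ceiling ("max") before any garbage is actually collected:

```java
// HeapWatch.java -- a minimal sketch of watching JVM heap growth under
// allocation churn. Class and variable names are illustrative only.
public class HeapWatch {

    static void printHeap(String label) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory();           // heap currently reserved from the OS
        long free  = rt.freeMemory();            // unused portion of 'total'
        long max   = rt.maxMemory();             // ceiling set by -Xmx
        System.out.printf("%s: used=%dMB total=%dMB max=%dMB%n",
                label, (total - free) >> 20, total >> 20, max >> 20);
    }

    public static void main(String[] args) {
        printHeap("before");

        // Allocate short-lived garbage: each buffer becomes unreachable on
        // the next loop iteration. The JVM typically grows the heap rather
        // than collecting immediately, which is the behavior described above.
        long sum = 0;
        for (int i = 0; i < 64; i++) {
            byte[] junk = new byte[1 << 20];     // 1 MB, garbage almost at once
            sum += junk.length;                  // use it so it isn't optimized away
        }

        printHeap("after churn");
        System.out.println("allocated bytes: " + sum);
    }
}
```

The exact numbers depend on the collector and heap flags in use, so treat the output as a trend (total creeping toward max) rather than fixed values.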

I can imagine that initial loading of indexes containing billions of documents will require quite a bit of heap. I do not know what data is stored in that memory.

Thanks,
Shawn
