In a word, "no", there are simply too many variables. It's like asking "how much memory will a Java program need?"....
But Solr does like memory, both the Java heap and the OS memory. Here's a
long blog on how to scope this out:
https://lucidworks.com/blog/2012/07/23/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/

Best,
Erick

On Thu, Apr 14, 2016 at 12:25 PM, Betsey Benagh
<betsey.ben...@stresearch.com> wrote:
> bin/solr status shows the memory usage increasing, as does the admin UI.
>
> I'm running this on a shared machine that is supporting several other
> applications, so I can't be particularly greedy with memory usage. Is
> there anything out there that gives guidelines on what an appropriate
> amount of heap is based on number of documents or whatever? We're just
> playing around with it right now, but it sounds like we may need a
> different machine in order to load in all of the data we want to have
> available.
>
> Thanks,
> betsey
>
> On 4/14/16, 3:08 PM, "Shawn Heisey" <apa...@elyograg.org> wrote:
>
>> On 4/14/2016 12:45 PM, Betsey Benagh wrote:
>>> I'm running Solr 6.0.0 in server mode. I have one core. I loaded about
>>> 2000 documents in, and it was using about 54 MB of memory. No problem.
>>> Nobody was issuing queries or doing anything else, but over the course
>>> of about 4 hours, the memory usage had tripled to 152 MB. I shut Solr
>>> down and restarted it, and saw the memory usage back at 54 MB. Again,
>>> with no queries or anything being executed against the core, the memory
>>> usage is creeping up - after 17 minutes, it was up to 60 MB. I've looked
>>> at the documentation for how to limit memory usage, but I want to
>>> understand why it's creeping up when nothing is happening, lest it run
>>> out of memory when I limit the usage. The machine is running CentOS 6.6,
>>> if that matters, with Java 1.8.0_65.
>>
>> When you start Solr 5.0 or later directly from the download or directly
>> after installing it with the service installer script (on *NIX
>> platforms), Solr starts with a 512 MB Java heap. You can change this if
>> you need to -- most Solr users do need to increase the heap size to a
>> few gigabytes.
>>
>> Java uses a garbage-collection memory model. It's perfectly normal
>> during the operation of a Java program, even one that is not doing
>> anything you can see, for the memory utilization to rise up to the
>> configured heap size. This is simply how things work in systems using a
>> garbage-collection memory model.
>>
>> Where exactly are you looking to find the memory utilization? In the
>> admin UI, that number will go up over time, until one of the memory
>> pools gets full and Java does a garbage collection, and then it will
>> likely go down again. From the operating system's point of view, the
>> resident memory usage will increase up to a point (when the entire heap
>> has been allocated) and probably never go back down -- but it shouldn't
>> keep going up, either.
>>
>> Thanks,
>> Shawn
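A quick way to see the sawtooth Shawn describes is to watch the heap pools
with jstat, which ships with the JDK. A minimal sketch, assuming you have the
Solr process ID (bin/solr status prints it); <pid> here is a placeholder:

  # Print GC/heap pool utilization every 5000 ms. Expect the E (eden) column
  # to climb toward 100% and drop back after each young collection, even
  # while Solr is "idle" -- that is the creep the admin UI is reporting.
  jstat -gcutil <pid> 5000

Resident memory reported by the OS, by contrast, should level off around the
configured heap size plus JVM overhead and stay there.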