On 8/28/2019 12:55 AM, Vignan Malyala wrote:
> I'm planning to create a separate core for each of my clients in Solr.
> Can I create around 500 cores in Solr? Is that a good idea?
> For each client I currently have around 100,000 records on average.

There is no limit that I know of on the number of cores; you're only limited by system resources. That many cores will keep a lot of files open and use a lot of threads, so you would definitely need to raise the OS limits on open file handles and processes.
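
On Linux, for example, those limits are commonly raised in /etc/security/limits.conf. A sketch, assuming Solr runs as a user named "solr" (the values are placeholders, not a recommendation):

    # Hypothetical limits for the "solr" user; size them to your core count.
    solr  soft  nofile  65000
    solr  hard  nofile  65000
    solr  soft  nproc   65000
    solr  hard  nproc   65000

You can check the effective limits with "ulimit -n" (open files) and "ulimit -u" (processes) in the shell that starts Solr.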

Solr startup with that many cores could take a very long time. If you run SolrCloud, I would say that you should find a way to run fewer indexes -- SolrCloud begins to have scalability problems at only a few hundred cores.
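
One common way to run fewer indexes is a single shared collection with a per-client filter query. A minimal SolrJ sketch, assuming a hypothetical collection named "shared_clients" and a client_id field (neither is from this thread):

    import java.io.IOException;

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class SharedCollectionQuery {
        public static void main(String[] args)
                throws SolrServerException, IOException {
            // One collection holds every client's documents.
            try (HttpSolrClient client = new HttpSolrClient.Builder(
                    "http://localhost:8983/solr/shared_clients").build()) {
                SolrQuery query = new SolrQuery("*:*");
                // Restrict the search to a single client. Filter queries
                // are cached, so the per-client restriction stays cheap.
                query.addFilterQuery("client_id:client_042");
                query.setRows(10);
                QueryResponse response = client.query(query);
                System.out.println("Matches for client_042: "
                        + response.getResults().getNumFound());
            }
        }
    }

With this layout, adding a client is a new field value instead of a new core, at the cost of enforcing tenant isolation in your own query layer.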

> How much physical memory might it consume? Please help with this.
> Thank you.

500 cores with 100,000 documents each is only 50 million documents in total. That isn't very big, but you will still need plenty of resources.

The most important resource for good performance will be memory, and we can't tell you how much you'll need -- that will depend on exactly how you use Solr and on the nature of your data. I've personally handled several cores totaling about 80 million documents with an 8GB heap and 64GB of total system memory, which left only enough memory to cache about a third of the total index size. Other indexes can struggle with only a few million documents on the same hardware.
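
To put rough numbers on that example: 64GB of total memory minus the 8GB heap leaves something like 56GB for the OS disk cache, and if that covered about a third of the index, the on-disk index was somewhere around 170GB. Those are my figures, not a sizing formula; the links below explain why the disk cache matters so much.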

https://cwiki.apache.org/confluence/display/solr/SolrPerformanceProblems#RAM

https://lucidworks.com/post/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/

Thanks,
Shawn
