On 3/24/2016 4:02 AM, Robert Brown wrote:
> If my index data directory size is 70G, and I don't have 70G (plus
> heap, etc) in the system, this will occasionally affect search speed
> right?  When Solr has to resort to reading from disk?
>
> Before I go out and throw more RAM into the system, in the above
> example, what would you recommend?

Having enough memory available to cache all your index data offers the
best possible performance.

You may be able to achieve acceptable performance when you don't have
that much memory, but I would try to make sure there's at least enough
memory available to cache *half* the index data.  Depending on the
nature of your queries and your index, this might not be enough, but
chances are good that it would work well.
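That rule of thumb is easy to check mechanically. Here's a minimal sketch, assuming a Linux host (it reads /proc/meminfo) and a hypothetical index path -- substitute your own Solr data directory:

```python
import os

def dir_size_bytes(path):
    """Total size of all files under path (e.g. a Solr index directory)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

def mem_available_bytes():
    """Linux-only: memory the OS could use for the page cache,
    parsed from the MemAvailable line of /proc/meminfo."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) * 1024  # value is in kB
    raise RuntimeError("MemAvailable not found in /proc/meminfo")

# Hypothetical index location -- adjust for your install.
index = dir_size_bytes("/var/solr/data")
avail = mem_available_bytes()
if avail < index / 2:
    print("Less than half the index fits in available memory; "
          "consider adding RAM.")
```

MemAvailable is a better signal than free memory alone, because memory already holding cached index data still counts toward caching the index.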

I have a dev server where there's only enough memory available to cache
about a tenth of the index -- it's got full copies of all three of my
large indexes on ONE machine, while production runs two copies of these
same indexes on ten machines.  Performance of any single query is not
very good on the dev server, but if I absolutely had to use that server
for production with one of my indexes, it would be slow, but workable.
I don't think it would have enough performance to handle running
all three indexes for production, though.

Thanks,
Shawn