Hoss,

What about the case where there's only a small number of fields (a dozen or 
two) but each field has hundreds of thousands or millions of values? Would Solr 
be able to handle that?



________________________________
 From: Chris Hostetter <hossman_luc...@fucit.org>
To: solr-user@lucene.apache.org 
Sent: Tuesday, March 19, 2013 6:09 PM
Subject: Re: Facets with 5000 facet fields
 

: In order to support faceting, Solr maintains a cache of the faceted
: field. You need one cache for each field you are faceting on, meaning
: your memory requirements will be substantial, unless, I guess, your

1) you can consider trading RAM for time by using "facet.method=enum" (and 
disabling your filterCache) ... it will prevent the need for the 
FieldCaches, but will probably be slower since it computes the docset per 
value per field instead of generating the FieldCaches once and re-using 
them.
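
For illustration, a minimal sketch of that approach (the collection name, 
field names, and exact cache settings below are made up for the example, 
not taken from this thread) might look like:

  Per request, force the enum method for the faceted fields:

    http://localhost:8983/solr/collection1/select?q=*:*&rows=0
        &facet=true&facet.field=manu&facet.field=cat
        &facet.method=enum

  In solrconfig.xml, shrink the filterCache to (near) zero so the enum 
  method doesn't just shift the memory cost into cached filter entries:

    <filterCache class="solr.FastLRUCache" size="0"
                 initialSize="0" autowarmCount="0"/>

  Raising facet.enum.cache.minDf is another knob worth knowing about: 
  terms with a document frequency below that value skip the filterCache 
  entirely.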

2) the entire question seems suspicious...

: > We have configured solr for 5000 facet fields as part of request
: > handler. We
: > have 10811177 docs in the index.

...i have lots of experience dealing with indexes that had thousands of 
fields that were faceted on, but i've never seen any realistic usecase for 
faceting on more than a few hundred fields per search.  Can you please 
elaborate on your goals and usecases so we can offer better advice...

https://people.apache.org/~hossman/#xyproblem
XY Problem

Your question appears to be an "XY Problem" ... that is: you are dealing
with "X", you are assuming "Y" will help you, and you are asking about "Y"
without giving more details about the "X" so that we can understand the
full issue.  Perhaps the best solution doesn't involve "Y" at all?
See Also: http://www.perlmonks.org/index.pl?node_id=542341


-Hoss
