Are you using shards, or do you have everything in the same index?

What problem did you experience with the StatsComponent? How did you use it? I
think the right approach would be to optimize StatsComponent to do a quick sum().
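
For reference, a rough sketch of what that could look like from PHP, assuming a hypothetical core at http://localhost:8983/solr and a double field named myDoubleField (substitute your own core and schema). With rows=0, Solr computes the sum server-side and only the aggregate crosses the wire, instead of ~30,000 individual values:

<?php
// Hypothetical URL and field name - adjust to your own setup.
$url = 'http://localhost:8983/solr/select'
     . '?q=*:*&rows=0&wt=json'
     . '&stats=true&stats.field=myDoubleField';
$response = json_decode(file_get_contents($url), true);
// StatsComponent reports sum (plus min, max, count, mean, ...) per stats.field.
$sum = $response['stats']['stats_fields']['myDoubleField']['sum'];
echo "sum = $sum\n";

If a request like this is what you tried and it was still too slow, please post the exact query and response times.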

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

On 8 March 2011, at 16:52, stockii wrote:

> Hello.
> 
> I have 34,000,000 documents in my index, and each doc has a field with a
> double value. I want the sum of these fields. I tested it with the
> StatsComponent, but it was not usable!! So instead I get all the values
> directly from Solr, from the index, and sum them in PHP.
> 
> That works fine, but when a user's search matches a lot of documents
> (~30,000), my script takes longer than 30 seconds and PHP times out.
> 
> 
> How can I tune Solr to fetch these double values from the index much
> faster?
> 
> -----
> ------------------------------- System ----------------------------------------
> 
> One server, 12 GB RAM, 2 Solr instances, 7 cores,
> 1 core with 31 million documents, other cores < 100,000
> 
> - Solr1 for search requests - commit every minute - 4 GB Xmx
> - Solr2 for update requests - delta every 2 minutes - 4 GB Xmx
