Hi,

If this is how you configure the field collapsing cache, then I don't
have it set up:


<fieldCollapsing>
  <fieldCollapseCache
      class="solr.FastLRUCache"
      size="512"
      initialSize="512"
      autowarmCount="128"/>
</fieldCollapsing>


I didn't add that part to solrconfig.xml.

The way I set up field collapsing was by adding this tag:

<searchComponent name="collapse"
    class="org.apache.solr.handler.component.CollapseComponent" />

Then I modified the default request handler (for standard queries) with this:

<requestHandler name="standard" class="solr.SearchHandler" default="true">
  <!-- default values for query parameters -->
  <lst name="defaults">
    <str name="echoParams">explicit</str>
  </lst>
  <arr name="components">
    <str>collapse</str>
    <str>facet</str>
    <str>highlight</str>
    <str>debug</str>
  </arr>
</requestHandler>
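
For completeness, this is roughly how I invoke it (assuming the
collapse.field parameter from the SOLR-236 patch and the default
example port; adjust for your setup):

http://localhost:8983/solr/select?q=*:*&collapse.field=AdvisorID&rows=10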

On Wed, Sep 1, 2010 at 4:11 PM, Jean-Sebastien Vachon
<js.vac...@videotron.ca> wrote:
> Can you tell us what your current settings are regarding the
> fieldCollapseCache?
>
> I had similar issues with field collapsing and I found out that this cache 
> was responsible for
> most of the OOM exceptions.
>
> Reduce or even remove this cache from your configuration and it should help.
>
>
> On 2010-09-01, at 1:10 PM, Moazzam Khan wrote:
>
>> Hi guys,
>>
>> I have about 20k documents in the Solr index (and there's a lot of
>> text in each of them). I have field collapsing enabled on a specific
>> field (AdvisorID).
>>
>> The thing is, if I have field collapsing enabled in the search
>> request I don't get the correct count for the total number of records
>> that matched. It always reports the number of "rows" I asked to get
>> back as the total number of records it found.
>>
>> And when I run a query with the search criteria *:* (to get the total
>> number of advisors in the index), Solr runs out of memory and gives me
>> an error saying:
>>
>> SEVERE: java.lang.OutOfMemoryError: Java heap space
>>        at java.nio.CharBuffer.wrap(CharBuffer.java:350)
>>        at java.nio.CharBuffer.wrap(CharBuffer.java:373)
>>        at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:138)
>>        at java.lang.StringCoding.decode(StringCoding.java:173)
>>
>>
>> This is going to be a huge problem later on when we index 50k
>> documents.
>>
>> These are the options I am running Solr with:
>>
>> java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
>> -XX:MaxPermSize=1024m -jar start.jar
>>
>>
>> Is there any way I can get the counts and not run out of memory?
>>
>> Thanks in advance,
>> Moazzam
>
>
