Re: Need help with field collapsing and out of memory error

2010-09-02 Thread Moazzam Khan
Oh, I don't know if this matters, but I store text fields in Solr and
never retrieve them from the index (I only get the ID field from the
index; everything else is pulled from the DB cache). I store all the
fields just in case I need to debug search queries and want to see
the data.
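
For what it's worth, a typical query from our side looks roughly like
this (the field list and the stock host/port here are just
illustrative):

http://localhost:8983/solr/select?q=*:*&fl=ID&rows=10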

Regards,

Moazzam



On Wed, Sep 1, 2010 at 5:13 PM, Moazzam Khan  wrote:
> Hi,
>
>
> If this is how you configure the field collapsing cache, then I don't
> have it set up:
>
>
>  <fieldCollapsing>
>    <fieldCollapseCache
>        class="solr.FastLRUCache"
>        size="512"
>        initialSize="512"
>        autowarmCount="128"/>
>  </fieldCollapsing>
>
>
> I didn't add that part to solrconfig.xml.
>
> The way I set up field collapsing was to add this tag:
>
>  <searchComponent name="collapse"
>      class="org.apache.solr.handler.component.CollapseComponent" />
>
> Then I modified the default request handler (for standard queries) with this:
>
>  <requestHandler name="standard" class="solr.SearchHandler" default="true">
>    <lst name="defaults">
>      <str name="echoParams">explicit</str>
>    </lst>
>    <arr name="components">
>      <str>collapse</str>
>      <str>facet</str>
>      <str>highlight</str>
>      <str>debug</str>
>    </arr>
>  </requestHandler>
>
>
>
>
> On Wed, Sep 1, 2010 at 4:11 PM, Jean-Sebastien Vachon
>  wrote:
>> Can you tell us what your current settings are for the
>> fieldCollapseCache?
>>
>> I had similar issues with field collapsing, and I found out that
>> this cache was responsible for most of the OOM exceptions.
>>
>> Reduce or even remove this cache from your configuration and it should help.
>>
>>
>> On 2010-09-01, at 1:10 PM, Moazzam Khan wrote:
>>
>>> Hi guys,
>>>
>>> I have about 20k documents in the Solr index (and there's a lot of
>>> text in each of them). I have field collapsing enabled on a specific
>>> field (AdvisorID).
>>>
>>> The thing is, if I have field collapsing enabled in the search
>>> request, I don't get the correct count for the total number of
>>> records that matched. It always reports the number of "rows" I
>>> asked for as the total number of records found.
>>>
>>> And when I run a query with the search criteria *:* (to get the
>>> total number of advisors in the index), Solr runs out of memory
>>> and gives me an error saying:
>>>
>>> SEVERE: java.lang.OutOfMemoryError: Java heap space
>>>        at java.nio.CharBuffer.wrap(CharBuffer.java:350)
>>>        at java.nio.CharBuffer.wrap(CharBuffer.java:373)
>>>        at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:138)
>>>        at java.lang.StringCoding.decode(StringCoding.java:173)
>>>
>>>
>>> This is going to be a huge problem when we index 50k documents
>>> later on.
>>>
>>> These are the options I am running Solr with:
>>>
>>> java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
>>> -XX:MaxPermSize=1024m -jar start.jar
>>>
>>>
>>> Is there any way I can get the counts and not run out of memory?
>>>
>>> Thanks in advance,
>>> Moazzam
>>
>>
>


Re: Need help with field collapsing and out of memory error

2010-09-01 Thread Moazzam Khan
Hi,


If this is how you configure the field collapsing cache, then I don't
have it set up:

<fieldCollapsing>
  <fieldCollapseCache
      class="solr.FastLRUCache"
      size="512"
      initialSize="512"
      autowarmCount="128"/>
</fieldCollapsing>

I didn't add that part to solrconfig.xml.

The way I set up field collapsing was to add this tag:

<searchComponent name="collapse"
    class="org.apache.solr.handler.component.CollapseComponent" />

Then I modified the default request handler (for standard queries) with this:

<requestHandler name="standard" class="solr.SearchHandler" default="true">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
  </lst>
  <arr name="components">
    <str>collapse</str>
    <str>facet</str>
    <str>highlight</str>
    <str>debug</str>
  </arr>
</requestHandler>
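
With that in place, collapsing is triggered per request via the
collapse.field parameter (this CollapseComponent comes from the
SOLR-236 field collapsing patch, which is where that parameter is
defined), so a query looks roughly like this; the host and port are
just the stock Jetty defaults:

http://localhost:8983/solr/select?q=*:*&collapse.field=AdvisorID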




On Wed, Sep 1, 2010 at 4:11 PM, Jean-Sebastien Vachon
 wrote:
> Can you tell us what your current settings are for the
> fieldCollapseCache?
>
> I had similar issues with field collapsing, and I found out that
> this cache was responsible for most of the OOM exceptions.
>
> Reduce or even remove this cache from your configuration and it should help.
>
>
> On 2010-09-01, at 1:10 PM, Moazzam Khan wrote:
>
>> Hi guys,
>>
>> I have about 20k documents in the Solr index (and there's a lot of
>> text in each of them). I have field collapsing enabled on a specific
>> field (AdvisorID).
>>
>> The thing is, if I have field collapsing enabled in the search
>> request, I don't get the correct count for the total number of
>> records that matched. It always reports the number of "rows" I
>> asked for as the total number of records found.
>>
>> And when I run a query with the search criteria *:* (to get the
>> total number of advisors in the index), Solr runs out of memory
>> and gives me an error saying:
>>
>> SEVERE: java.lang.OutOfMemoryError: Java heap space
>>        at java.nio.CharBuffer.wrap(CharBuffer.java:350)
>>        at java.nio.CharBuffer.wrap(CharBuffer.java:373)
>>        at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:138)
>>        at java.lang.StringCoding.decode(StringCoding.java:173)
>>
>>
>> This is going to be a huge problem when we index 50k documents
>> later on.
>>
>> These are the options I am running Solr with:
>>
>> java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
>> -XX:MaxPermSize=1024m -jar start.jar
>>
>>
>> Is there any way I can get the counts and not run out of memory?
>>
>> Thanks in advance,
>> Moazzam
>
>


Re: Need help with field collapsing and out of memory error

2010-09-01 Thread Jean-Sebastien Vachon
Can you tell us what your current settings are for the
fieldCollapseCache?

I had similar issues with field collapsing, and I found out that this
cache was responsible for most of the OOM exceptions.

Reduce or even remove this cache from your configuration and it should help.
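
If you want to keep a cache at all, shrink it drastically. As a
sketch (the sizes here are purely illustrative, not recommendations):

<fieldCollapsing>
  <fieldCollapseCache
      class="solr.FastLRUCache"
      size="32"
      initialSize="32"
      autowarmCount="0"/>
</fieldCollapsing>

Otherwise, drop the <fieldCollapsing> section from solrconfig.xml
entirely.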


On 2010-09-01, at 1:10 PM, Moazzam Khan wrote:

> Hi guys,
> 
> I have about 20k documents in the Solr index (and there's a lot of
> text in each of them). I have field collapsing enabled on a specific
> field (AdvisorID).
> 
> The thing is, if I have field collapsing enabled in the search
> request, I don't get the correct count for the total number of
> records that matched. It always reports the number of "rows" I
> asked for as the total number of records found.
> 
> And when I run a query with the search criteria *:* (to get the
> total number of advisors in the index), Solr runs out of memory
> and gives me an error saying:
> 
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>        at java.nio.CharBuffer.wrap(CharBuffer.java:350)
>        at java.nio.CharBuffer.wrap(CharBuffer.java:373)
>        at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:138)
>        at java.lang.StringCoding.decode(StringCoding.java:173)
> 
> 
> This is going to be a huge problem when we index 50k documents
> later on.
> 
> These are the options I am running Solr with:
> 
> java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
> -XX:MaxPermSize=1024m -jar start.jar
> 
> 
> Is there any way I can get the counts and not run out of memory?
> 
> Thanks in advance,
> Moazzam



Need help with field collapsing and out of memory error

2010-09-01 Thread Moazzam Khan
Hi guys,

I have about 20k documents in the Solr index (and there's a lot of
text in each of them). I have field collapsing enabled on a specific
field (AdvisorID).

The thing is, if I have field collapsing enabled in the search
request, I don't get the correct count for the total number of
records that matched. It always reports the number of "rows" I asked
for as the total number of records found.
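
In other words, with rows=10 the result element in the response comes
back looking roughly like this (the numbers are just illustrative),
even though far more documents actually match:

<result name="response" numFound="10" start="0">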

And when I run a query with the search criteria *:* (to get the
total number of advisors in the index), Solr runs out of memory and
gives me an error saying:

SEVERE: java.lang.OutOfMemoryError: Java heap space
        at java.nio.CharBuffer.wrap(CharBuffer.java:350)
        at java.nio.CharBuffer.wrap(CharBuffer.java:373)
        at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:138)
        at java.lang.StringCoding.decode(StringCoding.java:173)


This is going to be a huge problem when we index 50k documents
later on.

These are the options I am running Solr with:

java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
-XX:MaxPermSize=1024m -jar start.jar


Is there any way I can get the counts and not run out of memory?

Thanks in advance,
Moazzam