>> at ...SocketServerInstanceImpl$1.run(SocketServerInstanceImpl.java:578)
>> at java.lang.Thread.run(Unknown Source)
>> Mar 13, 2010 3:52:09 PM org.apache.solr.update.DirectUpdateHandler2
>> rollback
>> INFO: end_rollback
>>
>> Thanks,
>> Barani
>>
> >
> >
> > Entity 2 returns more than 2 million rows, and entity1 returns around
> > 30 rows.
> >
> > I have set the heap size to 1 GB, but even then I always get an
> > out-of-memory error. I am not sure how to flush the buffered documents
> > once a certain threshold is reached. I tried enabling autocommit and
> > reducing maxDocBufferSize, but neither helped. Can someone let me know
> > the best way to overcome this issue?
> >
> > Thanks,
> > Barani
> >
>
> --
> View this message in context:
> http://old.nabble.com/DIH---Out-of-Memory-error-when-using-CachedsqlEntityProcessor-tp27889623p27890751.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>
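For context, CachedSqlEntityProcessor holds the child entity's entire result set in memory so it can be joined against the parent without re-querying; with over 2 million child rows, that cache alone can exhaust a 1 GB heap, and autocommit or document-buffer settings will not help because the cache is separate from the document buffer. One alternative is to drop the cache and let the child query run once per parent row, so rows stream through instead of accumulating. A minimal data-config sketch (driver, URL, table, and column names here are made up for illustration, not taken from the poster's setup):

```xml
<!-- Sketch only: entity and column names are hypothetical. -->
<dataConfig>
  <!-- batchSize="-1" asks JdbcDataSource to stream rows from the driver
       rather than buffering the whole result set. -->
  <dataSource driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              batchSize="-1"/>
  <document>
    <entity name="parent" query="SELECT id, name FROM parent">
      <!-- Plain (uncached) sub-entity: one lookup per parent row,
           nothing held in the JVM between rows. -->
      <entity name="child"
              query="SELECT description FROM child WHERE parent_id = '${parent.id}'"/>
    </entity>
  </document>
</dataConfig>
```

With only around 30 parent rows, the per-row sub-query runs about 30 times, which is usually far cheaper than caching 2 million child rows in the heap; if the cache is genuinely needed, the remaining lever is raising the JVM heap beyond 1 GB.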