Hello - you need to increase the heap to work around the out of memory
exception. There is not much you can do to increase the indexing speed using
OpenNLP.
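
For example, assuming you start Solr with the bin/solr script, you can raise
the heap either on the command line or in solr.in.sh (the 4g value below is
illustrative; size it to your machine and index):

    # One-off: start Solr with a 4 GB heap
    bin/solr start -m 4g

    # Persistent: set the heap in solr.in.sh (solr.in.cmd on Windows)
    SOLR_HEAP="4g"

If you run Solr embedded or under another servlet container, pass the usual
JVM flags instead, e.g. -Xms4g -Xmx4g.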

Regards,
Markus
 
-----Original message-----
> From:aruninfo100 <arunabraham...@gmail.com>
> Sent: Wednesday 22nd March 2017 12:27
> To: solr-user@lucene.apache.org
> Subject: RE: Exception while integrating openNLP with Solr
> 
> Hi,
> 
> I was able to resolve the issue. But when I run the indexing process, it
> takes too long to index bigger documents, and sometimes I get a Java heap
> memory exception.
> How can I improve the performance while using dictionary lemmatizers?
> 
> Thanks and Regards,
> Arun
> 
> 
> 
