How'd you figure it out? I have the same problem and I can't find a solution.
--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
Hi,
I applied the LUCENE-2899.patch, which provides OpenNLP capabilities to
Solr. One such feature is lemmatization, which helps match the root form of a
word, but integrating it was too time-consuming (indexing). It also provides
POS tagging, sentence detection, and named-entity recognition.
-Original message-
> From:aruninfo100
> Sent: Wednesday 22nd March 2017 19:15
> To: solr-user@lucene.apache.org
> Subject: RE: Exception while integrating openNLP with Solr
>
> Hi,
> Thanks for the reply.
> Kindly find the field type schema I am using:
>
Is the *opennlp_text* field indexed="true"?
Here the en-lemmatizer.txt is 7 MB in size. Without lemmatization, the
whole indexing process usually takes on average
-Original message-
> From:aruninfo100
> Sent: Wednesday 22nd March 2017 18:30
> To: solr-user@lucene.apache.org
> Subject: RE: Exception while integrating openNLP with Solr
Hi,
I am really finding it difficult to index documents using the OpenNLP
lemmatizer. The indexing is taking too much time (including commit). Is there a
way to optimize or improve the performance?
It would also be helpful to know of other OpenNLP lemmatizer
implementations that perform well
> To: solr-user@lucene.apache.org
> Subject: RE: Exception while integrating openNLP with Solr
>
Hi,
I was able to resolve the issue, but when I run the indexing process it
takes too long to index bigger documents, and sometimes I get a Java heap
memory exception.
How can I improve performance while using dictionary lemmatizers?
Thanks and Regards,
Arun
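
One way to keep dictionary lemmatization cheap during indexing is to parse the lemma dictionary once into an in-memory map and reuse it for every document, instead of re-reading the 7 MB file repeatedly. The sketch below is a minimal, hypothetical illustration, not the patch's actual classes: `LemmaCache` is an invented name, and the tab-separated word/POS/lemma column layout is an assumption about how en-lemmatizer.txt is commonly distributed.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch: cache a tab-separated lemma dictionary (word<TAB>postag<TAB>lemma)
// in a HashMap built once at startup and shared across all documents, rather than
// re-parsing the dictionary file per document or per field.
public class LemmaCache {
    private final Map<String, String> lemmas = new HashMap<>();

    // Parse one dictionary line; the key is "word|postag" so the same surface
    // form can map to different lemmas depending on part of speech.
    public void addLine(String line) {
        String[] parts = line.split("\t");
        if (parts.length < 3) {
            return; // skip malformed lines instead of failing the whole load
        }
        lemmas.put(parts[0] + "|" + parts[1], parts[2]);
    }

    // Fall back to the surface form when the dictionary has no entry.
    public String lemmatize(String word, String postag) {
        return lemmas.getOrDefault(word + "|" + postag, word);
    }

    public static void main(String[] args) {
        LemmaCache cache = new LemmaCache();
        cache.addLine("houses\tNNS\thouse");
        cache.addLine("better\tJJR\tgood");
        cache.addLine("broken line without tabs"); // silently skipped
        System.out.println(cache.lemmatize("houses", "NNS")); // house
        System.out.println(cache.lemmatize("unknown", "NN")); // unknown
    }
}
```

Building the map once also bounds the heap cost to a single copy of the dictionary, which matters when many indexing threads would otherwise each load their own.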
Hello - there is an underlying ArrayIndexOutOfBoundsException causing you trouble:

at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
at opennlp.tools.lemmatizer.SimpleLemmatizer.<init>(SimpleLemmatizer.java:46)
Regards,
Marks
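
Since the exception comes from the dictionary-loading constructor, one pragmatic step is to scan en-lemmatizer.txt for lines that do not have the expected number of columns before Solr tries to load it. The validator below is a hypothetical sketch: `DictValidator` is an invented name, and it assumes a three-column tab-separated layout where a line with too few fields is what triggers the ArrayIndexOutOfBoundsException: 1 at index time.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

// Hedged sketch: report dictionary lines that would make a loader indexing
// fixed columns (e.g. parts[1]) throw ArrayIndexOutOfBoundsException -
// blank lines, stray headers, or spaces where tabs were expected.
public class DictValidator {

    // Returns 1-based line numbers that do not have exactly 3 tab-separated fields.
    public static List<Integer> findBadLines(Reader in) throws IOException {
        List<Integer> bad = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(in)) {
            String line;
            int lineNo = 0;
            while ((line = reader.readLine()) != null) {
                lineNo++;
                if (line.split("\t", -1).length != 3) {
                    bad.add(lineNo);
                }
            }
        }
        return bad;
    }

    public static void main(String[] args) throws IOException {
        String sample = "houses\tNNS\thouse\n"
                      + "\n"                 // blank line: would break loading
                      + "better JJR good\n"  // spaces instead of tabs
                      + "ran\tVBD\trun\n";
        System.out.println(findBadLines(new StringReader(sample))); // [2, 3]
    }
}
```

Running such a check over the real 7 MB file and deleting or fixing the reported lines should let the dictionary load cleanly.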