Thanks Andrew!
In parallel I also found this thread:
http://grokbase.com/t/lucene/solr-user/117m8e9n8t/solr-3-3-exception-in-thread-lucene-merge-thread-1
They are discussing the same problem.
We just started the importer again, with the unlimited flag (ulimit -v
unlimited); we will see.
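Since the thread above points at the virtual-memory limit, here is a minimal sketch of that fix, assuming a Linux shell from which Tomcat/Solr is later started (MMapDirectory maps index segment files into the process's virtual address space, so a large index needs far more virtual memory than physical RAM):

```shell
# Show the current virtual-memory limit; a finite value here can make
# MMapDirectory's mmap() of large index files fail with "Map failed".
ulimit -v

# Lift the cap for this shell and every process it starts (e.g. Tomcat/Solr).
ulimit -v unlimited

# Verify the new limit.
ulimit -v    # prints "unlimited"
```

Note that the limit only applies to processes started from this shell (or from an init script that runs the same command), so it must be set before launching the servlet container.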
To: solr-user@lucene.apache.org
Subject: Re: java.io.IOException: Map failed :: OutOfMemory
Today the same exception:
INFO: [] webapp=/solr path=/update
params={waitSearcher=true&commit=true&wt=javabin&waitFlush=true&version=2}
status=0 QTime=10
INFO: [] webapp=/solr path=/update
params={waitSearcher=true&commit=true&wt=javabin&waitFlush=true&version=2}
status=0 QTime=1009
Nov 13, 2012 2:02:27 PM org.apache.solr.core.SolrDeletionPolicy onInit
INFO: SolrDeletionPolicy.onInit: commits:num=1
commit{dir=/net/smtcax
Kernel: 2.6.32.29-0.3-default #1 SMP 2011-02-25 13:36:59 +0100 x86_64
x86_64 x86_64 GNU/Linux
SUSE Linux Enterprise Server 11 SP1 (x86_64)
Physical memory: 4 GB
portadm@smtcax0033:/srv/connect/tomcat/instances/SYSTEST_Portal_01/bin>
java -version
java version "1.6.0_33"
Java(TM) SE Runtime Envi
Thanks Erick. We are using:
export JAVA_OPTS="-XX:MaxPermSize=400m -Xmx2000m -Xms200M
-Dsolr.solr.home=/home/connect/ConnectPORTAL/preview/solr-home"
We have around 5 million documents. The index size is around 50 GB.
Before we add a document we delete the same id in the cache, doesn't matter
i
Have you tried the really simple solution of giving your JVM more memory
(-Xmx option)?
Best
Erick
On Tue, Nov 13, 2012 at 2:38 AM, uwe72 wrote:
> Version is 3.6.1 of solr
>
>
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/java-io-IOException-Map-failed-OutOfMemory
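Erick's suggestion can be tried by editing the JAVA_OPTS shown earlier; a hedged sketch follows (the -Xmx3g value is an example, not a tested recommendation — with only 4 GB of physical RAM the heap cannot grow much, and a mmap "Map failed" is usually a virtual-memory limit rather than a heap-size problem):

```shell
# Example only: raise the maximum heap from 2000m; everything else unchanged.
export JAVA_OPTS="-XX:MaxPermSize=400m -Xms200m -Xmx3g \
  -Dsolr.solr.home=/home/connect/ConnectPORTAL/preview/solr-home"
echo "$JAVA_OPTS"
```

After changing JAVA_OPTS the Tomcat instance has to be restarted for the new heap size to take effect.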
While adding a Lucene document we got this problem. What can we do here?
Nov 12, 2012 3:25:09 PM org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start
commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
Exception in thread "Lucene Merge Thread #0"
org.apache.lucene