On Fri, Nov 13, 2009 at 11:17 AM, Ian Lea <ian....@gmail.com> wrote:
>> I got OutOfMemoryError at
>> org.apache.lucene.search.Searcher.search(Searcher.java:183)
>> My index is 43G bytes. Is that too big for Lucene?
>> Luke can see the index has over 1800M docs, but the search is also out
>> of memory.
>> I use -Xmx1024M to specify 1G java heap space.
>
> 43Gb is not too big for Lucene, but it certainly isn't small and that
> is a lot of docs. Just give it more memory.

I would strongly recommend giving it more memory. What version of Lucene
are you using? Depending on your setup, you could run into a JVM bug if
you use a Lucene version < 2.9. Your index is big enough (document-wise)
that your norms file grows beyond 100MB; depending on your -Xmx settings
this could trigger a false OOM during index open. So if you are using
< 2.9, check out this issue:
https://issues.apache.org/jira/browse/LUCENE-1566
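As a quick sanity check that the -Xmx flag actually took effect, you can print the heap limit the JVM sees before opening the index. A minimal stdlib-only sketch (nothing Lucene-specific; the 2 GB threshold below is just an illustrative suggestion, not a hard requirement):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use, i.e. the -Xmx limit.
        long maxBytes = Runtime.getRuntime().maxMemory();
        long maxMb = maxBytes / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
        // With -Xmx1024M this reports roughly 1024 MB (minus some JVM
        // overhead). For an index with 1800M docs, per-document structures
        // like the norms array alone can eat a large slice of that.
        if (maxMb < 2048) {
            System.out.println("Heap below 2 GB - consider raising -Xmx"
                    + " before searching a 43 GB index.");
        }
    }
}
```

Run it with the same -Xmx value you pass to your search process, e.g. `java -Xmx1024M HeapCheck`.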
>> One abnormal thing is that I broke a running optimize of this index.
>> Can that be a problem?
>
> Possibly ...

In general, this should not be a problem. The optimize will not destroy
the index you are optimizing, since segments are write-once.

>> If so, how can I fix an index after the optimize process is broken?
>
> Probably depends on what you mean by broken. Start with running
> org.apache.lucene.index.CheckIndex. That can also fix some things -
> but see the warning in the javadocs.

100% recommended to make sure nothing is wrong! :)

>
> --
> Ian.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
> For additional commands, e-mail: java-user-h...@lucene.apache.org
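Besides running CheckIndex from the command line, it can also be driven programmatically. A hedged sketch against the 2.9-era API (the index path is hypothetical; heed the javadoc warning, fixIndex drops segments it cannot read, losing their documents, so back up the index first):

```java
import java.io.File;
import org.apache.lucene.index.CheckIndex;
import org.apache.lucene.store.FSDirectory;

public class FixBrokenIndex {
    public static void main(String[] args) throws Exception {
        // Hypothetical index location - substitute your own path.
        FSDirectory dir = FSDirectory.open(new File("/path/to/index"));
        CheckIndex checker = new CheckIndex(dir);

        // Walks every segment and reports corruption, if any.
        CheckIndex.Status status = checker.checkIndex();
        if (!status.clean) {
            // WARNING (per the javadocs): this repairs the index by
            // removing unreadable segments - their documents are lost.
            checker.fixIndex(status);
        }
        dir.close();
    }
}
```

The command-line equivalent is `java org.apache.lucene.index.CheckIndex /path/to/index` (add `-fix` only once you have a backup).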