Yes, I optimized, but with SOLR. I don't know why, but when you optimize
an index with SOLR, it leaves you with about 15 files instead of the 3...
I'll try to optimize directly with Lucene and see what happens; if nothing
changes I'll try your suggestion. Thanks a lot, Mark!!
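For reference, optimizing directly with Lucene (as opposed to sending Solr an optimize command) is just a call to IndexWriter.optimize() on the index directory. A minimal sketch against the Lucene 2.x API of the time follows; the index path is a placeholder, and opening the writer with create=false assumes no other process holds the write lock:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;

public class OptimizeIndex {
    public static void main(String[] args) throws Exception {
        // "/path/to/index" is a placeholder for the actual index directory.
        // create=false opens the existing index instead of overwriting it.
        IndexWriter writer =
            new IndexWriter("/path/to/index", new StandardAnalyzer(), false);
        writer.optimize(); // merges all segments down to a single segment
        writer.close();
    }
}
```

After this completes, the index directory should contain a single segment's files rather than the ~15 left behind by the Solr-side optimize.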

On 7/26/07, Mark Miller <[EMAIL PROTECTED]> wrote:

You know, on second thought, a merge shouldn't even try to access a doc >
maxDoc (I think). Have you just tried an optimize?

On 7/25/07, Rafael Rossini <[EMAIL PROTECTED]> wrote:
>
> Hi guys,
>
>     Is there a way of deleting a document that, because of some
> corruption,
> got a docID larger than maxDoc()? I'm trying to do this but I get
> this Exception:
>
> Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Array index out of range: 106577
>    at org.apache.lucene.util.BitVector.set(BitVector.java:53)
>    at org.apache.lucene.index.SegmentReader.doDelete(SegmentReader.java:301)
>    at org.apache.lucene.index.IndexReader.deleteDocument(IndexReader.java:674)
>    at org.apache.lucene.index.MultiReader.doDelete(MultiReader.java:125)
>    at org.apache.lucene.index.IndexReader.deleteDocument(IndexReader.java:674)
>    at teste.DeleteError.main(DeleteError.java:9)
>
> Thanks
>
