On Oct 18, 2009, at 1:47 PM, GlenAbbeyDrive wrote:

I commit the IndexWriter every 200 documents in a batch as follows, and you
can see that I reopened the reader after the commit.

private void commit(IndexWriter writer) throws CorruptIndexException, IOException {
    writer.commit();
}
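Assuming the reopen happens alongside the commit, the pattern in Lucene 2.9 would look roughly like the sketch below (`searcherReader` is an illustrative field name, not from the original post). The easy-to-miss detail is that `IndexReader.reopen()` returns a new reader when the index has changed, and the old reader must still be closed explicitly; otherwise its segment files stay open, which is exactly the kind of slow descriptor leak this thread is chasing.

```java
import java.io.IOException;
import org.apache.lucene.index.CorruptIndexException;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;

// Sketch only: commit the writer, then refresh the searching reader.
class BatchCommitter {
    private IndexReader searcherReader; // illustrative: the reader used for searching

    private void commit(IndexWriter writer) throws CorruptIndexException, IOException {
        writer.commit();
        IndexReader newReader = searcherReader.reopen();
        if (newReader != searcherReader) {
            searcherReader.close();   // without this, the old reader's files leak
            searcherReader = newReader;
        }
    }
}
```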
>> [...] closing and opening a new IndexReader? Or, by getting a near
>> real-time reader?
>>
>> Are you closing the IndexWriter after each batch, or calling commit?
>>
>> Mike
>>
>> On Sun, Oct 18, 2009 at 11:14 AM, GlenAbbeyDrive wrote:
>>>
>>> I switched from Lucene 2.4.0 to the latest 2.9.0 version and got too many
>>> files open within a few hours from my indexing process. Our indexing Java
>>> process adds about 2000 documents/minute.
I know that this is a hack, but it was needed ASAP, and I couldn't find
anywhere in my code where I wasn't closing the opened IndexReaders or
IndexWriters.

Aaron
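One way such a leak hides is a close() that only runs on the success path: if indexing a batch throws, the reader or writer opened for that batch is never closed. A try/finally (the idiom available in 2009-era Java) guarantees the close runs either way. This is an illustration, not anyone's actual code from the thread; a plain `FileReader` stands in for an IndexReader.

```java
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class CloseInFinally {
    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("demo", ".txt");
        f.deleteOnExit();
        FileReader reader = new FileReader(f);
        try {
            // ... per-batch work that might throw goes here ...
        } finally {
            reader.close(); // runs on both the success and the exception path
        }
        System.out.println("reader closed");
    }
}
```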
On Oct 18, 2009, at 11:14 AM, GlenAbbeyDrive wrote:

> I switched from Lucene 2.4.0 to the latest 2.9.0 version and got too many
> files open within a few hours from my indexing process. Our indexing Java
> process adds about 2000 documents/minute.
On Sun, Oct 18, 2009 at 11:14 AM, GlenAbbeyDrive wrote:

I switched from Lucene 2.4.0 to the latest 2.9.0 version and got too many
files open within a few hours from my indexing process. Our indexing Java
process adds about 2000 documents/minute.
The IndexWriter (iw) has the following settings:
iw.setMaxFieldLength(1024*1024*1024); // 1G
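"Too many files open" means the process hit the OS per-process file-descriptor limit, so it helps to watch the raw numbers while the indexer runs. A quick Linux check (the pid used below is this shell's own, purely for illustration; point `/proc/<pid>/fd` at the indexing JVM instead):

```shell
# Per-process limit on open file descriptors (often 1024 by default).
ulimit -n

# Count descriptors currently open in a process; $$ is this shell's pid.
# Substitute the indexing JVM's pid to watch the count climb over time.
ls /proc/$$/fd | wc -l
```

A steadily climbing count across batches points at a leak; a count that spikes only during merges or searches points at segment proliferation.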
I open an index writer, write each document, optimize the index, then close
the index writer. The problem comes about in the second 1000: somewhere in
writing the second 1000 I start to get a "Too many files open" exception.
What I don't understand is why I can write the first 1000 with no problem,
optimize and close the writer, and then have a problem the next time a [...]
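That delayed failure, first batch clean and a crash partway through a later one, is the classic signature of a slow leak rather than a single bad call: each cycle leaves a few descriptors open, and the running total only crosses the limit later. The arithmetic below uses invented numbers (a 1024-descriptor default limit, one leaked descriptor per document) purely to show the shape of the failure:

```java
// Toy model of a slow descriptor leak; nothing here touches Lucene.
public class LeakMath {
    public static void main(String[] args) {
        int limit = 1024;       // common default for `ulimit -n`
        int leakedPerDoc = 1;   // hypothetical: one descriptor leaked per document
        int firstFailingDoc = limit / leakedPerDoc;
        // The first 1000 documents fit under the limit; the failure lands
        // in the second thousand, matching the symptom described above.
        System.out.println("first failure around document " + firstFailingDoc);
    }
}
```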
> [...] is 10,000. I got a "too many files open" error when I index
> 1,000,000 documents. Then, my index maxMergeDocs is 10,000, and mergeFactor
> is 10,000. I can index that successfully. But when I search it, I got
> "too many files open" [...]
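A mergeFactor of 10,000 is very likely the cause in that last report: Lucene lets up to mergeFactor segments accumulate per level before merging, and a search must hold every segment's files open at once. Rough numbers (the ~8 files per non-compound segment and the 1024 limit are ballpark assumptions, not measurements):

```java
// Back-of-the-envelope estimate; not a Lucene API, just arithmetic.
public class SegmentFiles {
    public static void main(String[] args) {
        int mergeFactor = 10000;    // segments allowed to accumulate before a merge
        int filesPerSegment = 8;    // rough figure for the non-compound file format
        int descriptorLimit = 1024; // common default `ulimit -n`
        int worstCase = mergeFactor * filesPerSegment;
        System.out.println("worst-case open files during search: " + worstCase);
        System.out.println("exceeds descriptor limit: " + (worstCase > descriptorLimit));
    }
}
```

Dropping mergeFactor back toward the default of 10, or enabling the compound file format with `setUseCompoundFile(true)`, brings the count down by orders of magnitude.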