I don't know; I do it as a matter of course. But if it fixes the problem,
then at least you know why you are getting the error and can work out a
scheme (using counters, maybe) to commit regularly after every 10, 20, or
100 documents.

But you can't fix it until you know why it happens, and this would confirm
or eliminate one possible cause.
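The counter scheme suggested above can be sketched as a small helper that fires a commit action after every N added documents. This is a minimal illustration, not code from the thread; `BatchCommitter`, the interval, and the `Runnable` commit hook are all hypothetical names (in real use the hook would wrap something like `writer.commit()`):

```java
import java.util.concurrent.atomic.AtomicInteger;

/** Sketch: runs a commit action after every N added documents. */
class BatchCommitter {
    private final int interval;
    private final Runnable commitAction;   // e.g. () -> writer.commit()
    private final AtomicInteger pending = new AtomicInteger();

    BatchCommitter(int interval, Runnable commitAction) {
        this.interval = interval;
        this.commitAction = commitAction;
    }

    /** Call once per added document; commits when the counter hits the interval. */
    void documentAdded() {
        if (pending.incrementAndGet() >= interval) {
            commitAction.run();
            pending.set(0);   // start counting toward the next batch
        }
    }
}
```

With an interval of 10, adding 25 documents triggers two commits and leaves five documents pending for the next batch, which bounds how many uncommitted segments (and hence open files) can pile up between commits.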

> -----Original Message-----
> From: Michel Blase [mailto:mblas...@gmail.com]
> Sent: Friday, May 18, 2012 1:49 PM
> To: java-user@lucene.apache.org
> Subject: Re: old fashioned....."Too many open files"!
> 
> but committing after each insert should be really expensive and
> unnecessary, no?
> 
> On Fri, May 18, 2012 at 10:31 AM, Edward W. Rouse
> <ero...@comsquared.com>wrote:
> 
> > Have you tried adding im.commit() after adding a document? Could be
> all of
> > the uncommitted documents are leaving files open.
> >
> > > -----Original Message-----
> > > From: Michel Blase [mailto:mblas...@gmail.com]
> > > Sent: Friday, May 18, 2012 1:24 PM
> > > To: java-user@lucene.apache.org
> > > Subject: Re: old fashioned....."Too many open files"!
> > >
> > > also.....my problem is indexing!
> > >
> > > Preparation:
> > >
> > > private void SetUpWriters() throws Exception {
> > >         // One IndexWriter per index path, keyed by index id.
> > >         for (Map.Entry<Integer, String> index : IndexesPaths.entrySet()) {
> > >             int id = index.getKey();
> > >             String path = index.getValue();
> > >
> > >             Directory dir = FSDirectory.open(new File(path));
> > >             IndexWriterConfig config = new IndexWriterConfig(
> > >                     LuceneVersion.CurrentVersion,
> > >                     new StandardAnalyzer(LuceneVersion.CurrentVersion));
> > >
> > >             //config.setMaxBufferedDocs(50);
> > >             config.setRAMBufferSizeMB(400);
> > >             // Compound files keep the per-segment file count down.
> > >             TieredMergePolicy mp =
> > >                     (TieredMergePolicy) config.getMergePolicy();
> > >             mp.setUseCompoundFile(true);
> > >             config.setMergePolicy(mp);
> > >
> > >             /*
> > >             LogMergePolicy lmp =
> > >                     (LogMergePolicy) config.getMergePolicy();
> > >             lmp.setUseCompoundFile(true);
> > >             lmp.setMaxMergeDocs(10000);
> > >             config.setMergePolicy(lmp);
> > >             */
> > >
> > >             Writers.put(id, new IndexWriter(dir, config));
> > >         }
> > >     }
> > >
> > >
> > > adding document:
> > >
> > > public void AddDocument(Document doc,Analyzer analyzer) throws
> > > CorruptIndexException, IOException {
> > >         IndexWriter im = Writers.get(this.CurrentOpenIndex_ID);
> > >         im.addDocument(doc, analyzer);
> > >     }
> > >
> > >
> > > There's not much more that I'm doing!
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
> > For additional commands, e-mail: java-user-h...@lucene.apache.org
> >
> >
