From the IndexSearcher Lucene 3.6 API docs:
public void close()
throws IOException
Note that the underlying IndexReader is not closed if IndexSearcher was
constructed with IndexSearcher(IndexReader r). If the IndexReader was
supplied implicitly by specifying a directory, then the IndexReader is
closed.
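
In other words, when the searcher is built over a reader you opened
yourself, you own that reader's lifecycle. A minimal sketch of the
contract (the index path is illustrative):

Directory dir = FSDirectory.open(new File("/path/to/index")); // illustrative
IndexReader reader = IndexReader.open(dir);  // we opened the reader...
IndexSearcher searcher = new IndexSearcher(reader);
try {
    // ... run searches ...
} finally {
    searcher.close();  // does NOT close 'reader'
    reader.close();    // ...so we must close it ourselves
}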
I just added:

private void CloseIndexSearcher(IndexSearcher is) throws IOException {
    // IndexSearcher.close() will not close a reader we supplied
    // ourselves, so explicitly close every sub-reader before
    // closing the searcher.
    for (IndexReader r : is.getSubReaders()) {
        r.close();
    }
    is.close();
}
and everything seems fine now!
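(For what it's worth, Lucene 3.5 also added SearcherManager, which
handles reopening and reference-counting NRT searchers over a single
writer; with multiple writers like here you would need one manager per
index.)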
Sorry for wasting your time! Hope that my stupidity will help someone else!
Michel
On Fri, May 18, 2012 at 11:30 AM, Michel Blase <[email protected]> wrote:
> Ian was right! I didn't notice that before each insert the code was
> performing a search!
>
> But I'm not sure how to solve the problem. This is how I changed the code:
> after each search I close the IndexSearcher, but I still get too many
> open files!
>
>
> private IndexSearcher getSearcher() throws CorruptIndexException,
>         IOException {
>     // Open a near-real-time reader against every writer and search
>     // them all through a single MultiReader (assuming Writers is a
>     // Map<Integer, IndexWriter>).
>     List<IndexReader> readers = new ArrayList<IndexReader>();
>     for (IndexWriter iw : Writers.values()) {
>         readers.add(IndexReader.open(iw, true));
>     }
>
>     MultiReader mr = new MultiReader(
>             readers.toArray(new IndexReader[readers.size()]));
>     return new IndexSearcher(mr);
> }
>
> public TopDocs Search(String q, Analyzer analyzer, int NumberOfResults)
>         throws Exception {
>     ExtendedQueryParser parser = new
>         ExtendedQueryParser(LuceneVersion.CurrentVersion, "ID", analyzer);
>     Query query = parser.parse(q);
>
>     IndexSearcher is = getSearcher();
>     TopDocs res = is.search(query, NumberOfResults);
>     is.close(); // closes only the searcher; the readers opened in
>                 // getSearcher() stay open (see the sketch below)
>
>     return res;
> }
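>
> (A sketch of a variant that should not leak: the MultiReader built in
> getSearcher() is closed in a finally block, and MultiReader closes its
> sub-readers by default, so the per-search NRT readers are released
> even if search() throws.)
>
> public TopDocs Search(String q, Analyzer analyzer, int NumberOfResults)
>         throws Exception {
>     ExtendedQueryParser parser = new
>         ExtendedQueryParser(LuceneVersion.CurrentVersion, "ID", analyzer);
>     Query query = parser.parse(q);
>
>     IndexSearcher is = getSearcher();
>     try {
>         return is.search(query, NumberOfResults);
>     } finally {
>         is.close();
>         is.getIndexReader().close(); // the MultiReader; closes the
>                                      // per-search NRT readers too
>     }
> }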
>
> On Fri, May 18, 2012 at 11:04 AM, Edward W. Rouse
> <[email protected]> wrote:
>
>> I don't know. I do it as a matter of course. But if it fixes the problem,
>> then at least you know why you are getting the error and can work on a
>> scheme (using counters, maybe) to do regular commits after every
>> 10/20/100 documents, as sketched below.
>>
>> But you can't fix it until you know why it happens, and this would
>> confirm or eliminate one possible cause.
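>>
>> A rough sketch of that scheme, reusing the AddDocument method from the
>> earlier mail (the counter field and batch size are made-up names):
>>
>> private int uncommitted = 0;                 // hypothetical counter
>> private static final int COMMIT_EVERY = 100; // hypothetical batch size
>>
>> public void AddDocument(Document doc, Analyzer analyzer)
>>         throws CorruptIndexException, IOException {
>>     IndexWriter im = Writers.get(this.CurrentOpenIndex_ID);
>>     im.addDocument(doc, analyzer);
>>     if (++uncommitted >= COMMIT_EVERY) {
>>         im.commit();     // flush and fsync the whole batch
>>         uncommitted = 0;
>>     }
>> }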
>>
>> > -----Original Message-----
>> > From: Michel Blase [mailto:[email protected]]
>> > Sent: Friday, May 18, 2012 1:49 PM
>> > To: [email protected]
>> > Subject: Re: old fashioned....."Too many open files"!
>> >
>> > But committing after each insert should be really expensive and
>> > unnecessary, no?
>> >
>> > On Fri, May 18, 2012 at 10:31 AM, Edward W. Rouse
>> > <[email protected]> wrote:
>> >
>> > > Have you tried adding im.commit() after adding a document? It could
>> > > be that all of the uncommitted documents are leaving files open.
>> > >
>> > > > -----Original Message-----
>> > > > From: Michel Blase [mailto:[email protected]]
>> > > > Sent: Friday, May 18, 2012 1:24 PM
>> > > > To: [email protected]
>> > > > Subject: Re: old fashioned....."Too many open files"!
>> > > >
>> > > > Also... my problem is indexing!
>> > > >
>> > > > Preparation:
>> > > >
>> > > > private void SetUpWriters() throws Exception {
>> > > >     // One IndexWriter per index path, keyed by index id
>> > > >     // (assuming IndexesPaths is a Map<Integer, String>).
>> > > >     for (Map.Entry<Integer, String> index : IndexesPaths.entrySet()) {
>> > > >         int id = index.getKey();
>> > > >         String path = index.getValue();
>> > > >
>> > > >         Directory dir = FSDirectory.open(new File(path));
>> > > >         IndexWriterConfig config = new
>> > > >             IndexWriterConfig(LuceneVersion.CurrentVersion,
>> > > >                 new StandardAnalyzer(LuceneVersion.CurrentVersion));
>> > > >
>> > > >         //config.setMaxBufferedDocs(50);
>> > > >         config.setRAMBufferSizeMB(400);
>> > > >         TieredMergePolicy mp =
>> > > >             (TieredMergePolicy) config.getMergePolicy();
>> > > >         mp.setUseCompoundFile(true);
>> > > >         config.setMergePolicy(mp);
>> > > >
>> > > >         /*
>> > > >         LogMergePolicy lmp =
>> > > >             (LogMergePolicy) config.getMergePolicy();
>> > > >         lmp.setUseCompoundFile(true);
>> > > >         lmp.setMaxMergeDocs(10000);
>> > > >         config.setMergePolicy(lmp);
>> > > >         */
>> > > >
>> > > >         Writers.put(id, new IndexWriter(dir, config));
>> > > >     }
>> > > > }
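>> > > >
>> > > > (Side note: mp.setUseCompoundFile(true) should help with the
>> > > > descriptor count by itself, since the compound format packs each
>> > > > segment's many files into a single compound file, and the large
>> > > > RAM buffer means fewer, larger flushes and hence fewer segments.)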
>> > > >
>> > > >
>> > > > adding document:
>> > > >
>> > > > public void AddDocument(Document doc, Analyzer analyzer) throws
>> > > >         CorruptIndexException, IOException {
>> > > >     IndexWriter im = Writers.get(this.CurrentOpenIndex_ID);
>> > > >     im.addDocument(doc, analyzer); // nothing is committed here
>> > > > }
>> > > >
>> > > >
>> > > > there's not much more I'm doing!
>> > >
>