I think you should close your files in a finally clause so they are released
even if a file-system exception occurs, and you should also print out the
exception instead of swallowing it.

You could be running out of file handles.
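
Something along these lines would do it. This is just a sketch based on your
snippet (your variable names and Lucene calls unchanged; only the try/finally
structure and the printout are new):

===============================================

    BufferedReader reader = null;

    try {
        // the reader wraps the FileInputStream, so closing it also closes the stream
        reader = new BufferedReader(new InputStreamReader(new FileInputStream(f)));

        StringBuffer stringBuffer = new StringBuffer();
        String line;
        while ((line = reader.readLine()) != null) {
            stringBuffer.append(line);
        }
        doc.add(Field.Text("contents", stringBuffer.toString()));

    } catch (java.io.IOException e) {
        e.printStackTrace();      // don't swallow the exception

    } finally {
        if (reader != null) {
            try {
                reader.close();   // release the file handle even when something fails
            } catch (java.io.IOException e) {
                e.printStackTrace();
            }
        }
    }

=================================================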

-----Original Message-----
From: Jin, Ying [mailto:[EMAIL PROTECTED] 
Sent: Friday, December 10, 2004 11:15 AM
To: [EMAIL PROTECTED]
Subject: OutOfMemoryError with Lucene 1.4 final

Hi, Everyone,

 

We're trying to index ~1500 archives but get an OutOfMemoryError about
halfway through the indexing process. I've tried running the program on two
different Red Hat Linux servers: one with 256M memory and 365M swap
space, the other with 512M memory and 1G swap space. However, both
got the OutOfMemoryError at the same place (at record 898).

 

Here is my code for indexing:

===============================================

    Document doc = new Document();

    doc.add(Field.UnIndexed("path", f.getPath()));
    doc.add(Field.Keyword("modified",
                          DateField.timeToString(f.lastModified())));
    doc.add(Field.UnIndexed("eprintid", id));
    doc.add(Field.Text("metadata", metadata));

    FileInputStream is = new FileInputStream(f);  // the text file
    BufferedReader reader = new BufferedReader(new InputStreamReader(is));

    StringBuffer stringBuffer = new StringBuffer();
    String line = "";

    try{
      while((line = reader.readLine()) != null){
        stringBuffer.append(line);
      }
      doc.add(Field.Text("contents", stringBuffer.toString()));

      // release the resources
      is.close();
      reader.close();
    }catch(java.io.IOException e){}

=================================================

Is there anything wrong with my code, or do I need more memory?

 

Thanks for any help!

Ying

