Hello,
I have a problem and I have tried everything I could think of to solve it. To 
understand my situation: I create indexes on several computers on our network 
and they are copied to one server. There, once a day, they are merged into one 
masterIndex, which is then searched. The problem is in the merging. I use the 
following code:
 
// open each partial index directory
Directory[] ar = new Directory[fileList.length];
for (int i = 0; i < fileList.length; i++) {
    ar[i] = FSDirectory.getDirectory(fileList[i], false);
}
// merge them into the master index, then clean up
writer.addIndexes(ar);
for (int i = 0; i < fileList.length; i++) {
    ar[i].close();
}
writer.optimize();
writer.close();
 
I also tried a longer way of opening every index separately and adding its 
contents document by document (see the sketch below). The problem is that I get 
OutOfMemoryError in both cases. With the per-document approach it happens on the 
IndexReader.open call, and only on indexes of approximately 100 MB or more (the 
largest index I have is only about 150 MB). When I run it on a Windows machine 
with JDK 1.5, I get the following:
    Exception in thread "main" java.lang.OutOfMemoryError: Requested array size 
exceeds VM limit
On Linux, where I am running JDK 1.4, I get the same message without the array 
size information.
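 
For reference, the document-by-document variant I tried looks roughly like this 
(a simplified sketch, not the exact code; it only illustrates the approach):
 
for (int i = 0; i < fileList.length; i++) {
    IndexReader reader = IndexReader.open(fileList[i]);
    for (int d = 0; d < reader.maxDoc(); d++) {
        // skip deleted documents; note this only copies stored fields
        if (!reader.isDeleted(d)) {
            writer.addDocument(reader.document(d));
        }
    }
    reader.close();
}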
 
I also tried it on a test index built from 11,359 files (1.59 GB of source 
data); the resulting index was 120 MB, and I got this error too. In my opinion, 
a 120 MB index is not that big. The machine it runs on is a Xeon 3.2 GHz with 
2 GB of RAM, so that should be enough. Can you please help me?
 
Thank you in advance,
 
Michael Trezzi
