Hey,

I ran a small test: there are 12,055,022 terms in the index, and I have a
strong feeling that the OS is not allowing the

new Term[12055022]

allocation.

JVM: 64-bit
Linux, 16 GB RAM

Any ideas?
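
For scale, here is a back-of-the-envelope estimate of what that single array
costs (assuming 8 bytes per reference on a 64-bit JVM; the Term objects
themselves come on top of this):

public class TermArrayEstimate {
    public static void main(String[] args) {
        long termCount = 12055022L;          // term count from the test above
        long refBytes = termCount * 8L;      // 8 bytes per object reference on a 64-bit JVM
        System.out.println(refBytes / (1024 * 1024) + " MB for the reference array alone");
        // Each Term also carries an object header plus its field and text Strings,
        // so the real footprint is several times larger than the array itself.
    }
}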


Andrew Schetinin <[EMAIL PROTECTED]> wrote:
  Hi,

That's somewhat strange; if I remember correctly, the index size was 6
GB, wasn't it?
I have seen posts from people working with indexes of tens of GB, and we
worked with an 8 GB index in a 32-bit JVM (on Windows 2000) with as
little as 700 MB of maximum memory allowed to the JVM.

How many documents/terms/segments does the index contain?
Did you try opening the same index with Luke? Give Luke 1-2 GB of heap
and give it a try...
Is it possible that the index is corrupted for some reason and the
dictionary contains an invalid number of term entries, like 0x7effffff?

Best Regards,

Andrew






-----Original Message-----
From: zzzzz shalev [mailto:[EMAIL PROTECTED] 
Sent: Thursday, March 09, 2006 12:02 AM
To: java-user@lucene.apache.org
Subject: Re: 1.4.3 and 64bit support? out of memory??

Hey Chris,

I will check and let you know, just to make sure.

Basically, I see the OS allocating memory (up to about 4 GB) while the
indexes are loaded into memory, and then the process crashes in the
TermInfosReader class. What I noticed is that the crash occurred when
Lucene tried to create a Term array with the following code:

new Term[indexSize]

I assume that, since this is an array, Java was trying to allocate a
consecutive block of memory, and that is hard to find even on a 16 GB
RAM machine, especially since (if I'm not mistaken) indexSize here is
the termEnum size (which in my case is rather large).
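
One quick check I can run to separate a heap-ceiling problem from a true
allocation-size problem (assuming the test runs with the same JVM options as
the real process) is to print the JVM's own limits, since the ~4 GB I see
handed out may simply be the default or configured -Xmx:

public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reports the ceiling the heap can ever grow to (the -Xmx value).
        System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}

If the maximum heap reported there is around 4 GB, the OutOfMemoryError is hit
at the heap ceiling rather than because the OS cannot find a contiguous block.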

I will get back to you about the one-liner. If you have any other
thoughts, I'd be extremely happy to hear them, as this problem is a
major roadblock.

Thanks a million.



Chris Hostetter wrote:

: I am receiving the following stack trace:
:
: JVMDUMP013I Processed Dump Event "uncaught", detail "java/lang/OutOfMemoryError".
: Exception in thread "main" java.lang.OutOfMemoryError
:   at org.apache.lucene.index.TermInfosReader.readIndex(TermInfosReader.java:82)

Is it possible that parts of your application are eating up all of the
heap in your JVM before this exception is encountered? Possibly by
opening the index many times without closing it?

More specifically, if you write a four-line app that does nothing but
open your index and then close it again, do you get an OOM? ...

import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Searcher;

public class Main {
    public static void main(String[] args) throws Exception {
        // Open the index and close it immediately; nothing else touches the heap.
        Searcher s = new IndexSearcher("/your/index/path");
        s.close();
    }
}
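
If even this minimal program hits the OutOfMemoryError, the limit is probably
the JVM heap ceiling rather than your application; running it with a larger
-Xmx (for example "java -Xmx2g Main", assuming that much memory is free for
the heap) should change the behaviour.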



-Hoss











                