Great! It works perfectly after I set up the -Xms and -Xmx JVM command-line
parameters with:

java -Xms128m -Xmx128m

It turns out that my JVM was running out of memory. And Otis is right on
my reader closing too:
reader.close() will close the reader and release any system resources
associated with it.
If I expand the VM's memory, everything then appears OK.
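As a quick sanity check that the flags took effect, one can print the heap the JVM actually granted; a minimal sketch, with an illustrative class name:

```java
// Run with e.g. `java -Xms128m -Xmx128m HeapCheck` to confirm the flags
// were picked up. The class name HeapCheck is illustrative.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);     // upper bound set by -Xmx
        long totalMb = rt.totalMemory() / (1024 * 1024); // heap committed so far
        System.out.println("max heap:  " + maxMb + " MB");
        System.out.println("committed: " + totalMb + " MB");
    }
}
```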
max mem, if there are 1M docs under the same dir (a stupid mistake I made).
But if I expand the VM's memory, it then appears OK.
:)
Sent: Friday, December 10, 2004 11:15 AM
To: [EMAIL PROTECTED]
Subject: OutOfMemoryError with Lucene 1.4 final

Hi, Everyone,

We're trying to index ~1500 archives but get OutOfMemoryError about
halfway through the index process. I've tried to run the program under two
different Redhat Linux servers: one with 256M memory and 365M swap
space, the other with 512M memory and 1G swap space. However
application server is shut down. Every time the EJB object is
re-activated, a new IndexSearcher is opened. If the resources allocated to the previous
IndexSearcher cannot be fully released, the system will use up more memory.
Eventually, it may run into the OutOfMemoryError.
I am not very familiar with EJB
> I tried to reuse the IndexSearcher, but I have another question. What
> happens if an application server unloads the class after it is idle for a
> while, and then re-instantiates the object back when it receives a new
> request?

The EJB spec takes this into account, as there are hook methods you

instance will be created. If the IndexSearcher.close() method does not release all the
memory and the server keeps unloading and re-instantiating the class, it will
eventually hit the OutOfMemoryError issue. The test program from my previous email is
simulating this condition. The reason why I
Reuse your IndexSearcher! :)
Also, I think somebody has written some EJB stuff to work with Lucene.
The project is on SF.net.
Otis
--- Terence Lai <[EMAIL PROTECTED]> wrote:
> Hi All,
>
> I am getting a OutOfMemoryError when I deploy my EJB application. To
> debug the pr
> executing the Lucene search. At the beginning, the session bean is
> working fine. It returns the correct search results to me. As more and more search
> requests are processed, the server ends up with the OutOfMemoryError. If I
> restart the server, everything works fine again.
>
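Otis's advice above (open one searcher and share it across requests) can be sketched as below; `Searcher` here is a stand-in for Lucene's IndexSearcher, not the real class, so the sketch is self-contained:

```java
import java.util.function.Supplier;

// Open the searcher once and reuse it, instead of opening and closing
// one per request (which is what leaks memory in the EJB scenario).
public class SearcherHolder {
    // Stand-in for an expensive-to-open, reusable resource like IndexSearcher.
    static class Searcher {
        final String indexPath;
        Searcher(String indexPath) { this.indexPath = indexPath; }
        int search(String query) { return query.length(); } // dummy result
    }

    private static volatile Searcher shared;

    // Lazily open one shared instance; double-checked locking keeps it thread safe.
    static Searcher get(Supplier<Searcher> opener) {
        Searcher s = shared;
        if (s == null) {
            synchronized (SearcherHolder.class) {
                if (shared == null) shared = opener.get();
                s = shared;
            }
        }
        return s;
    }

    public static void main(String[] args) {
        Searcher a = get(() -> new Searcher("/tmp/index"));
        Searcher b = get(() -> new Searcher("/tmp/index"));
        System.out.println(a == b); // prints true: the same instance is reused
    }
}
```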
Thanks for pointing this out. Even after I fixed the code to close the "fsDir" and also
added the ex.printStackTrace(System.out), I am still hitting the OutOfMemoryError.

Terence
On Wednesday 18 August 2004 00:30, Terence Lai wrote:
>             if (fsDir != null) {
>                 try {
>                     is.close();
>                 } catch (Exception ex) {
>                 }
>             }
You close is here again, not fsDir. Also, it's a good idea to never ignore exceptions.
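The fix being discussed (close both resources, and don't swallow exceptions silently) might look like this sketch; the searcher and directory are replaced by stand-in Closeables so the example is self-contained, rather than Lucene's real IndexSearcher/FSDirectory:

```java
import java.io.Closeable;
import java.io.IOException;

public class CloseBoth {
    // Close a resource, logging (not silently ignoring) any failure.
    static void closeQuietly(Closeable c, String name) {
        if (c == null) return;
        try {
            c.close();
        } catch (IOException ex) {
            // An empty catch block hides the real failure; at least report it.
            System.err.println("failed to close " + name + ": " + ex);
        }
    }

    public static void main(String[] args) {
        final boolean[] closed = new boolean[2];
        Closeable is = () -> closed[0] = true;    // stand-in searcher
        Closeable fsDir = () -> closed[1] = true; // stand-in directory
        try {
            // ... run the search here ...
        } finally {
            // Close BOTH resources, not `is` twice.
            closeQuietly(is, "searcher");
            closeQuietly(fsDir, "directory");
        }
        System.out.println(closed[0] && closed[1]); // prints true: both released
    }
}
```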
Hi All,

I am getting an OutOfMemoryError when I deploy my EJB application. To debug the
problem, I wrote the following test program:

public static void main(String[] args) {
    try {
        Query query = getQuery();
        for (int i = 0; i < 1000; i++) {
            sea
petite_abeille wrote:
On Apr 13, 2004, at 02:45, Kevin A. Burton wrote:
He mentioned that I might be able to squeeze 5-10% out of index
merges this way.
Talking of which... what strategy(ies) do people use to minimize
downtime when updating an index?
This should probably be a wiki page.
Any
I'm actually pretty lazy about index updates, and haven't had the need for
efficiency, since my requirement is that new documents should be
available on a next-working-day basis.
I reindex everything from scratch every night (400,000 docs) and store it
in a timestamped index. When the reindexin
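The rebuild-into-a-fresh-directory approach described above can be sketched with a plain directory rename; the paths and the placeholder segments file are illustrative, not a real Lucene index:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Build the new index in a staging directory, then atomically rename it
// into place, so searchers reopening the live path never see a half-built index.
public class IndexSwap {
    public static void main(String[] args) throws IOException {
        Path base = Files.createTempDirectory("indexes");
        Path building = Files.createDirectory(base.resolve("index.new"));
        Files.write(building.resolve("segments"), new byte[0]); // placeholder file

        Path live = base.resolve("index.live");
        // A rename within one filesystem is atomic: readers see either the
        // complete old index or the complete new one.
        Files.move(building, live, StandardCopyOption.ATOMIC_MOVE);

        System.out.println(Files.exists(live.resolve("segments"))); // prints true
    }
}
```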
My current "strategy" is as follows:
(1) use a temporary RAMDirect
Not sure if this is a bug or expected behavior.
I took Doug's suggestion and migrated to a large BUFFER_SIZE of 1024^2.
He mentioned that I might be able to squeeze 5-10% out of index
merges this way.
I'm not sure if this is expected behavior, but this requires a LOT of
memory. Without this
From: Akila [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, October 15, 2003 9:15 AM
To: Lucene Users List
Subject: OutOfMemoryError when using wildcard queries

Hi,

Am using Lucene 1.2 and getting OutOfMemoryError when searching using
some wildcard queries.
Is there some provision that restricts the number of terms for wildcard
queries?

Thanks,
Akila
<[EMAIL PROTECTED]>
Sent: Wednesday, May 28, 2003 10:21 PM
Subject: Re: too many hits - OutOfMemoryError

> Unfortunately, no.
> The modifications are not very extreme, though.
> If you're interested in seeing our approach, let me know.
>
> DaveB
> We ran into this problem and decided to put a check
> on the number of expanded terms and abort the query
> if the number got too high.
Is it possible to perform this check without having to modify Lucene's
source code?
--
Eric Jain
---
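One way to sketch the check Eric asks about, with an in-memory sorted term list standing in for the index's term dictionary (a real implementation would walk the terms via Lucene rather than a TreeSet); the cap of 3 is illustrative:

```java
import java.util.Arrays;
import java.util.NavigableSet;
import java.util.TreeSet;

// Before running a wildcard query, count how many index terms the prefix
// would expand to, and refuse the query if the count exceeds a cap.
public class WildcardGuard {
    static final int MAX_EXPANSIONS = 3; // illustrative cap

    // Count terms starting with `prefix` in a sorted term dictionary:
    // all such terms lie in the range [prefix, prefix + '\uffff').
    static int expansions(NavigableSet<String> terms, String prefix) {
        return terms.subSet(prefix, prefix + '\uffff').size();
    }

    static boolean allow(NavigableSet<String> terms, String prefix) {
        return expansions(terms, prefix) <= MAX_EXPANSIONS;
    }

    public static void main(String[] args) {
        NavigableSet<String> terms = new TreeSet<>(Arrays.asList(
            "apple", "apply", "apt", "ardent", "banana"));
        System.out.println(allow(terms, "ap")); // prints true: 3 expansions
        System.out.println(allow(terms, "a"));  // prints false: 4 expansions
    }
}
```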
> Yes. Is that the problem?

I believe a term with a wildcard is expanded into all possible terms in
memory before searching for it, so if the term is 'a*', and you have a
million different terms starting with 'a' occurring in your documents,
it's quite possible to run out of memory.
Does anyone kn
At 05:13 PM 5/28/2003 +0200, you wrote:
> When I search with a query I know will hit most of the 1.8 million
> records, the "collect" print
> does not even print, it eats up the 700+MB I allocated and then
> throws an OutOfMemoryError.

Are you using wildcard queries?
Thanks for the info, but unfortunately I am still getting an OutOfMemoryError.
Here's my code:
--
final BitSet bits = new BitSet();
HitCollector hc = new HitCollector() {
    public void collect(int doc, float score) {
        System.out.pr
> Hits hits = searcher.search(myQuery);

BitSet results = new BitSet();
searcher.search(myQuery, new HitCollector()
{
    public void collect(int doc, float score)
    {
        if (score > THRESHOLD)
            results.set(doc);
    }
});
--
Eric Jain
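Eric's pattern can be tried in isolation. In the sketch below the search driver is a dummy that hands every doc id and a fake score to the collector, standing in for searcher.search(); the point is that matches land in a BitSet (one bit per document) instead of a per-hit object:

```java
import java.util.BitSet;

public class BitSetCollect {
    // Minimal analogue of Lucene's HitCollector callback.
    interface Collector { void collect(int doc, float score); }

    // Dummy "search": scores every doc and hands each one to the collector.
    static void search(int numDocs, Collector hc) {
        for (int doc = 0; doc < numDocs; doc++) {
            float score = (doc % 2 == 0) ? 0.9f : 0.1f; // fake scores
            hc.collect(doc, score);
        }
    }

    public static void main(String[] args) {
        final float THRESHOLD = 0.5f;
        final BitSet results = new BitSet();
        search(1_000_000, (doc, score) -> {
            if (score > THRESHOLD) results.set(doc);
        });
        // 1M docs fit in ~125 KB of bits, no matter how many of them match.
        System.out.println(results.cardinality()); // prints 500000 (the even doc ids)
    }
}
```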
The computer is a 1.7GHz P4 with 1.25GB RAM. I tried the JVM
arg: -Xmx700M (as I had a little over 700MB free).

Cory Albright
Out of curiosity, how much free RAM does the computer normally have? And
have you tried increasing the amount available to the JVM?
David Medinets
http://www.codebits.com
Hi -

I have created an index of 1.8 million documents, each document containing
5-10 fields. When I run a search that I know has a small number of hits,
it works great. However, if I run a search that I know will hit most of
the documents, I get an OutOfMemoryError. I am using the basic
From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]]
Sent: 19 March 2003 16:19
To: [EMAIL PROTECTED]
Subject: Re: OutOfMemoryError with boolean queries

Robert,

I'm moving this to lucene-user, which is a more appropriate list for
this type of problem.
You are not saying whether you are using some of those handy -X (-Xms,
-Xmx) command-line switches when you invoke your application that dies
with OutOfMemoryError.
If you are not, try that, i
From: Ian Lea [mailto:[EMAIL PROTECTED]]
Sent: Thursday, November 29, 2001 6:11 AM
To: Lucene Users List
Subject: Re: OutOfMemoryError

Doug sent the message below to the list on 3-Nov in response to
a query about file size limits. There may have been more
related stuff on the thread as well.
I wrote:
> > Java often has misleading error messages. For example, on
> > solaris machines the default ulimit used to be 24 - that's 24 open
> > file handles! Yeesh. This will cause an OutOfMemoryError. So don't
Jeff Trent replied:
> Wow. I did not know
<[EMAIL PROTECTED]>
Sent: Thursday, November 29, 2001 11:46 AM
Subject: Re: OutOfMemoryError

> Chantal,
>
> > For what I know now, I had a bug in my own code. Still I don't understand
> > where these OutOfMemoryErrors came from. I will try to index again in one
> > thread wit
For example, on
Solaris machines the default ulimit used to be 24 - that's 24 open
file handles! Yeesh. This will cause an OutOfMemoryError. So don't
assume it's actually a memory problem, particularly if a memory
problem doesn't particularly make sense. Just a thought.

Steven
> *** Anyway, is there any way to control how big the indexes
> grow?

The easiest thing is to set Ind
hi Ian, hi Winton, hi all,

sorry, I meant a heap size of 100Mb. I'm starting java with -Xmx100m. I'm not
setting -Xms.
For what I know now, I had a bug in my own code. Still I don't understand
where these OutOfMemoryErrors came from. I will try to index again in one
thread without RAMDirectory
I tried to use RAMDirectory (as mentioned in the mailing list) and just use
IndexWriter.addDocument(). At the moment it seems not to make any difference.
After a while _all_ the threads exit one after another (not all at once!)
with an OutOfMemoryError. The priority of all of them is at the minimum.
Even if the multithreading doesn't increase performance I would be glad if I
could just once get it running again.