Great!!! It works perfectly after I set the -Xms and -Xmx JVM command-line
parameters with:
java -Xms128m -Xmx128m
It turns out that my JVM is running out of memory. And Otis is right about
my reader closing too.
reader.close() will close the reader and release any system resources
associated with it.
Ok, I see. It seems most people think it is the third possibility.
On Fri, 10 Dec 2004, Xiangyu Jin wrote:
>
> I am not sure. But I guess there are three possibilities,
>
> (1). I see that you use
> Field.Text("contents", stringBuffer.toString())
> This will store your entire string of text in the Document object.
>
I am not sure. But I guess there are three possibilities,
(1). I see that you use
Field.Text("contents", stringBuffer.toString())
This will store your entire string of text in the Document object.
And it might be long ...
I do not know the details of how Lucene implements this.
I think you can try use unstored fir
You probably need to increase the amount of RAM available to your JVM.
See the parameters:
-Xmx :Maximum memory usable by the JVM
-Xms :Initial memory allocated to JVM
My params are: -Xmx2048m -Xms128m (2G max, 128M initial)
On Fri, 10 Dec 2004 11:17:29 -0600, Sildy Augustine
<[EMAIL PROTECTED]>
Ying,
You should follow this finally block advice below. In addition, I
think you can just close the reader, and it will close the underlying
stream (I'm not sure about that, double-check it).
You are not running out of file handles, though. Your JVM is running
out of memory. You can play with
I think you should close your files in a finally clause in case of
exceptions with file system and also print out the exception.
You could be running out of file handles.
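A minimal sketch of that finally-clause advice, assuming the indexing loop
reads each file through a plain java.io reader (the file name is a
placeholder, not from the original code):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    BufferedReader reader = null;
    try {
        reader = new BufferedReader(new FileReader("doc.txt"));
        // ... append lines to the StringBuffer that feeds Field.Text() ...
    } catch (IOException ex) {
        ex.printStackTrace(); // print the exception instead of swallowing it
    } finally {
        if (reader != null) {
            try {
                reader.close(); // runs even if reading threw, so the handle is freed
            } catch (IOException ex) {
                ex.printStackTrace();
            }
        }
    }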
-Original Message-
From: Jin, Ying [mailto:[EMAIL PROTECTED]
Sent: Friday, December 10, 2004 11:15 AM
To: [EMAIL PROTECTED]
Terence,
> 2) I have a background process to update the index files. If I keep
> the IndexSearcher opened, I am not sure whether it will pick up the
> changes from the index updates done in the background process.
This is a frequently asked question. Basically, you have to make use
of IndexReader
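One common version of that answer is to cache the searcher and reopen it
only when the index version on disk changes. A sketch, assuming the
1.4-era static IndexReader.getCurrentVersion() (this is my illustration,
not necessarily what the truncated message went on to say):

    import java.io.IOException;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.search.IndexSearcher;

    public class SearcherCache {
        private IndexSearcher searcher;
        private long version = -1;
        private final String indexDir;

        public SearcherCache(String indexDir) { this.indexDir = indexDir; }

        // Reopen the searcher only when the index version has changed on disk.
        public synchronized IndexSearcher getSearcher() throws IOException {
            long current = IndexReader.getCurrentVersion(indexDir);
            if (searcher == null || current != version) {
                if (searcher != null) {
                    searcher.close(); // release the stale searcher first
                }
                searcher = new IndexSearcher(indexDir);
                version = current;
            }
            return searcher;
        }
    }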
Use the life-cycle hooks mentioned in another email
(activate/passivate) and when you detect that the server is about to
unload your class, call close() on IndexSearcher. I haven't used
Lucene in an EJB environment, so I don't know the details,
unfortunately. :(
Your simulation may be too fast fo
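For illustration, those activate/passivate hooks might look like this in a
stateful session bean (a sketch only -- as noted above, untested in a real
container; `searcher` is an instance field and the index path is a
placeholder):

    import java.io.IOException;
    import javax.ejb.EJBException;
    import org.apache.lucene.search.IndexSearcher;

    // Close the searcher before the container passivates the bean.
    public void ejbPassivate() {
        try {
            if (searcher != null) {
                searcher.close();
                searcher = null;
            }
        } catch (IOException ex) {
            throw new EJBException(ex);
        }
    }

    // Reopen it when the bean is activated again.
    public void ejbActivate() {
        try {
            searcher = new IndexSearcher("/path/to/index");
        } catch (IOException ex) {
            throw new EJBException(ex);
        }
    }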
Terence,
Calling close() on IndexSearcher will not release the memory
immediately. It will only release resources (e.g. other Java objects
used by IndexSearcher), and it is up to the JVM's garbage collector to
actually reclaim/release the previously used memory. There are
command-line parameters
Hi David,
In my test program, I invoke the IndexSearcher.close() method at the end of the loop.
However, it doesn't seem to release the memory. My concern is that even though I put
the IndexSearcher.close() statement in the hook methods, it may not release all the
memory until the application
> I tried to reuse the IndexSearcher, but I have another question. What
> happens if an application server unloads the class after it is idle for a
> while, and then re-instantiates the object when it receives a new
> request?
The EJB spec takes this into account, as there are hook methods you
Hi,
I tried to reuse the IndexSearcher, but I have another question. What happens if an
application server unloads the class after it is idle for a while, and then
re-instantiates the object when it receives a new request?
Every time the server re-instantiates the class, a new IndexSearcher i
Hi Otis,
The reason why I ran into this problem is that I partition my search documents into
multiple index directories ordered by document modified date. My application only
returns the latest 500 documents that match the criteria. By partitioning the
documents into different directories, w
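For what it's worth, one way to run a single query across date-partitioned
directories like that, while still sharing one searcher, is MultiSearcher.
A sketch (the partition paths are invented, and this is not necessarily how
Terence's code works):

    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.MultiSearcher;
    import org.apache.lucene.search.Searchable;

    // One sub-searcher per date partition, combined behind one searcher.
    Searchable[] partitions = {
        new IndexSearcher("/indexes/2004-10"),
        new IndexSearcher("/indexes/2004-11"),
        new IndexSearcher("/indexes/2004-12")
    };
    MultiSearcher searcher = new MultiSearcher(partitions);
    // Hits hits = searcher.search(query); // then keep only the latest 500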
Reuse your IndexSearcher! :)
Also, I think somebody has written some EJB stuff to work with Lucene.
The project is on SF.net.
Otis
--- Terence Lai <[EMAIL PROTECTED]> wrote:
> Hi All,
>
> I am getting an OutOfMemoryError when I deploy my EJB application. To
> debug the problem, I wrote the fol
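A minimal sketch of that reuse pattern (one searcher shared by every
request; the holder class and index path are made up for illustration):

    import java.io.IOException;
    import org.apache.lucene.search.IndexSearcher;

    public class SearcherHolder {
        private static IndexSearcher searcher;

        // Open once, return the same instance to every caller.
        public static synchronized IndexSearcher get() throws IOException {
            if (searcher == null) {
                searcher = new IndexSearcher("/path/to/index");
            }
            return searcher;
        }
    }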
Terence,
This may help:
http://issues.apache.org/bugzilla/show_bug.cgi?id=30628
I had the problem above... but I managed to resolve it by not closing
the IndexSearcher. Instead I now reuse the same IndexSearcher all of the
time within my JSP code as an application variable. GC keeps memory in
check
Thanks for pointing this out. Even after I fixed the code to close the "fsDir" and added
the ex.printStackTrace(System.out), I am still hitting the OutOfMemoryError.
Terence
> On Wednesday 18 August 2004 00:30, Terence Lai wrote:
> >     if (fsDir != null) {
> >         try {
On Wednesday 18 August 2004 00:30, Terence Lai wrote:
>     if (fsDir != null) {
>         try {
>             is.close();
>         } catch (Exception ex) {
>         }
>     }
You close is here again, not fsDir. Also, it's a good idea to never ignore exceptions.
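For comparison, a corrected finally block would look something like this,
closing fsDir itself and printing whatever is thrown (a sketch, continuing
the variables from the snippet above):

    } finally {
        if (is != null) {
            try {
                is.close();
            } catch (Exception ex) {
                ex.printStackTrace(System.out); // don't ignore it
            }
        }
        if (fsDir != null) {
            try {
                fsDir.close(); // close the Directory, not is a second time
            } catch (Exception ex) {
                ex.printStackTrace(System.out);
            }
        }
    }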
Sorry. I should have made it clearer in my last email. I have implemented an EJB Session
Bean executing the Lucene search. At the beginning, the session bean works fine.
It returns the correct search results to me. As more and more search requests are
processed, the server ends up having th
Yes, it was implemented to limit the number of clauses in a
BooleanQuery, which is what a WildcardQuery is rewritten to after
enumerating all the terms matching the wildcard expression:
BooleanQuery:
private static int maxClauseCount = 1024;
WildcardQuerys, as you've experienced, have
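For anyone hitting that limit, the usual options are to raise it or to
catch the overflow. A sketch against the 1.4-era API (`searcher` and
`query` are assumed to exist already):

    import org.apache.lucene.search.BooleanQuery;
    import org.apache.lucene.search.Hits;

    // Raise the rewrite limit before running wildcard-heavy queries...
    BooleanQuery.setMaxClauseCount(10000);

    // ...or catch the overflow and ask for a narrower pattern instead.
    try {
        Hits hits = searcher.search(query);
    } catch (BooleanQuery.TooManyClauses e) {
        // the wildcard expanded to more clauses than maxClauseCount allows
    }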
> No, but the JVM does have a memory limit. By default it's 64 megs, I
> believe. To increase it, use the -Xmx option when you run java.
Dan & Akila,
I may be wrong. But I remember a while back there was a discussion about
limiting the number of terms expanded using the wildcard query. I'm not
No, but the JVM does have a memory limit. By default it's 64 megs, I
believe. To increase it, use the -Xmx option when you run java.
For example, to give the JVM 100 megs of ram, you would write:
java -Xmx100m YourClassHere
-Original Message-
From: Akila [mailto:[EMAIL PROTECTED]
Sent:
> From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]
> Sent: 19 March 2003 16:19
> To: [EMAIL PROTECTED]
> Subject: Re: OutOfMemoryError with boolean queries
>
>
> Robert,
>
> I'm moving this to lucene-user, which is a more appropriate list for
> this type of a pro
Robert,
I'm moving this to lucene-user, which is a more appropriate list for
this type of a problem.
You are not saying whether you are using some of those handy -X (-Xms
-Xmx) command line switches when you invoke your application that dies
with OutOfMemoryError.
If you are not, try that, it may
I wrote:
> > Java often has misleading error messages. For example, on
> > solaris machines the default ulimit used to be 24 - that's 24 open
> > file handles! Yeesh. This will cause an OutOfMemoryError. So don't
Jeff Trent replied:
> Wow. I did not know that!
>
> I also don't see an op
<[EMAIL PROTECTED]>
Sent: Thursday, November 29, 2001 11:46 AM
Subject: Re: OutOfMemoryError
> Chantal,
> > For what I know now, I had a bug in my own code. Still I don't understand
> > where these OutOfMemoryErrors came from. I will try to index again in one
> > thread wit
Chantal,
> For what I know now, I had a bug in my own code. Still I don't understand
> where these OutOfMemoryErrors came from. I will try to index again in one
> thread without RAMDirectory just to check if the program is sane.
Java often has misleading error messages. For example, on
solaris machines the default ulimit used to be 24 - that's 24 open
file handles! Yeesh. This will cause an OutOfMemoryError. So don't
Doug sent the message below to the list on 3-Nov in response to
a query about file size limits. There may have been more
related stuff on the thread as well.
--
Ian.
> *** Anyway, is there any way to control how big the indexes
> grow ?
The easiest thing is to set Ind
hi Ian, hi Winton, hi all,
sorry I meant heap size of 100Mb. I'm starting java with -Xmx100m. I'm not
setting -Xms.
For what I know now, I had a bug in my own code. Still I don't understand
where these OutOfMemoryErrors came from. I will try to index again in one
thread without RAMDirectory
Watch out for merge sizes > 100 -- you'll run out of file descriptors
-- ? Also, does mergeFactor have any effect in RAMDirectory?
Winton
>I've loaded a large (but not as large as yours) index with mergeFactor
>set to 1000. Was substantially faster than with default setting.
>Making it highe
Were you using -mx and -ms (setting heap size)?
Cheers,
Winton
>hi to all,
>
>please help! I think I mixed my brain up already with this stuff...
>
>I'm trying to index about 29 text files where the biggest one is ~700Mb and
>the smallest ~300Mb. I once managed to run the whole index, with
I've loaded a large (but not as large as yours) index with mergeFactor
set to 1000. Was substantially faster than with default setting.
Making it higher didn't seem to make things much faster but did cause
it to use more memory. In addition I loaded the data in chunks in
separate processes and o
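For reference, mergeFactor in that era's API is just a public field on
IndexWriter, so the tuning described above is a one-liner. A sketch (the
analyzer choice and index path are placeholders):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;

    IndexWriter writer = new IndexWriter("/path/to/index", new StandardAnalyzer(), true);
    writer.mergeFactor = 1000; // fewer merges: faster bulk load, more RAM and file handles
    // ... writer.addDocument(...) for each document ...
    writer.optimize();
    writer.close();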