I have also just seen this exception in a production system (Lucene
1.4.3).
Any tips on what might be causing it? I'll be attempting to reproduce
it later, but I'm quite confident that while I'm using that
IndexWriter, no other readers are open, and no other Writers, either.
java.lang.Illeg
Hello Monsur,
Since the index is not being searched while it is being built, I'd use
a higher mergeFactor.
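For what it's worth, in the Lucene 1.4.x API mergeFactor is a public field on IndexWriter, so raising it for a build-only index is a one-liner. A minimal sketch under that assumption (the index path and analyzer choice are placeholders):

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;

public class BuildIndex {
    public static void main(String[] args) throws Exception {
        // true = create a fresh index; "/tmp/index" is a placeholder path
        IndexWriter writer = new IndexWriter("/tmp/index", new StandardAnalyzer(), true);
        // A higher mergeFactor defers segment merges, which speeds up a bulk
        // build at the cost of more open segments while indexing.
        writer.mergeFactor = 50;
        // ... addDocument() calls go here ...
        writer.optimize();  // merge down to one segment once the build is done
        writer.close();
    }
}
```

Calling optimize() at the end brings segment count back down before the index is put in front of searchers.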
As for deployment, I am not sure about the Windows environment, but you
could try imitating this:
http://mail-archives.eu.apache.org/mod_mbox/lucene-java-user/200503.mbox/[EMAIL
PROTECTED]
Quite feasible and you'll be very glad you picked Lucene :)
Erik
On Mar 23, 2005, at 8:02 PM, [EMAIL PROTECTED] wrote:
Hi,
We are evaluating Lucene as a search engine for a collection of almost
100,000 documents (each under 10,000 words). Is it feasible, or does it
make no sense at all?
Hi,
We are evaluating Lucene as a search engine for a collection of almost 100,000
documents (each under 10,000 words). Is it feasible, or does it make no sense
at all? Does anyone have similar experience?
Many thanks for suggestions,
George
The setup: Using Lucene.NET in a web environment on Win2k3 servers. One
process runs every 5 minutes, grabbing new rows from the database, and
adding them to a Lucene index. Only additions are made to the index, no
deletions. The mergeFactor is set to 2 to minimize the number of segments.
This
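A hedged sketch of that kind of add-only incremental pass, assuming the Lucene 1.4-era API; the field names and the `String[] {id, body}` row shape are made up for illustration:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class IncrementalPass {
    // newRows: a List of String[] {id, body} fetched from the database (assumed shape)
    public static void indexNewRows(String indexPath, java.util.List newRows) throws Exception {
        // false = open the existing index rather than creating a new one
        IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), false);
        writer.mergeFactor = 2;  // very few segments, at the cost of frequent merges
        for (java.util.Iterator it = newRows.iterator(); it.hasNext();) {
            String[] row = (String[]) it.next();
            Document doc = new Document();
            doc.add(Field.Keyword("id", row[0]));     // stored, not tokenized
            doc.add(Field.Text("contents", row[1]));  // tokenized and indexed
            writer.addDocument(doc);
        }
        writer.close();
    }
}
```

Note the trade-off: mergeFactor 2 keeps the segment count small for searchers, but makes each 5-minute pass pay for merges it could otherwise defer.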
Hello, we have an issue with Lucene and haven't found a solution so far.
So, I'm asking for your help if that's possible.
The scenario is as follows:
We are using 2 lucene FS directories, a main one and a small temporary one
where all our documents are added. After adding 100 documents, that small
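If it helps, the usual way to fold a small temporary index into a main one in Lucene 1.4 is IndexWriter.addIndexes(Directory[]). A sketch with placeholder paths:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class MergeTempIndex {
    public static void merge(String mainPath, String tempPath) throws Exception {
        // false = open existing directory, don't create
        Directory temp = FSDirectory.getDirectory(tempPath, false);
        IndexWriter writer = new IndexWriter(mainPath, new StandardAnalyzer(), false);
        // addIndexes copies the temp index's segments into the main index
        // (and optimizes); afterwards the temp index can be emptied and reused.
        writer.addIndexes(new Directory[] { temp });
        writer.close();
        temp.close();
    }
}
```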
That sounds a little low. I'll assume that you profiled your whole
application, which leaves room for something else slowing things down,
and I'll suggest you write a standalone application whose only task is
to index documents as quickly as possible. Hm, this reminds me that
I've written stuff l
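The standalone test can be as small as a loop that times addDocument() calls. A sketch assuming the 1.4-era API, with synthetic document bodies standing in for real data:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class IndexSpeedTest {
    public static void main(String[] args) throws Exception {
        // placeholder path; true = create a fresh throwaway index
        IndexWriter writer = new IndexWriter("/tmp/speedtest", new StandardAnalyzer(), true);
        int count = 10000;
        long start = System.currentTimeMillis();
        for (int i = 0; i < count; i++) {
            Document doc = new Document();
            doc.add(Field.Text("contents", "sample text for document " + i));
            writer.addDocument(doc);
        }
        writer.close();
        long elapsed = System.currentTimeMillis() - start;
        System.out.println(count * 1000L / Math.max(elapsed, 1) + " docs/sec");
    }
}
```

If this loop is fast and your application is slow, the bottleneck is in document preparation (database fetch, parsing, analysis), not in Lucene itself.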
Daniel Naber wrote:
If that doesn't help: are you sure
you're using Lucene the right way, e.g. having only one
IndexReader/Searcher and using it for all searches?
That's my first suggestion too. Memory consumption should not primarily
grow per query, rather per IndexSearcher. You're seeing 80M
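Daniel's suggestion, sketched: open one IndexSearcher and share it across all request threads (searching on a single IndexSearcher is thread-safe). The index path is a placeholder:

```java
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;

public class SharedSearcher {
    private static IndexSearcher searcher;

    // Open the searcher once; doing "new IndexSearcher(...)" per query makes
    // each request load its own copy of the index structures, which is the
    // usual cause of memory growing with every search.
    public static synchronized IndexSearcher get() throws java.io.IOException {
        if (searcher == null) {
            searcher = new IndexSearcher("/path/to/index");  // placeholder path
        }
        return searcher;
    }

    public static Hits search(Query q) throws java.io.IOException {
        return get().search(q);
    }
}
```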
On Wednesday 23 March 2005 19:30, Jochen Franke wrote:
> 2. Are there possibilities to restrict or reduce the memory consumption
> of Lucene?
I think Doug made a fix for indexes with many fields. So you could try with
the development version from SVN. If that doesn't help: are you sure
you're u
Hello all,
our web application is currently executing queries on a Lucene index
with 6 million records. The memory used by the virtual machine grows to
80 MB when a search is executed. With eight parallel searches we hit the
400 MB mark.
Because we had some "out of memory" exceptions in the appli
Pasha, in short, that is all I'm trying to do. It really wasn't an issue before.
Otis, I'm not sure what Luke is (a GUI tool for inspecting Lucene indexes, I
gather). But the documents do appear after we optimize.
Roy.
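On documents only "appearing after we optimize": an open IndexReader/Searcher sees a point-in-time snapshot of the index, so newly added documents only become visible once the searcher is reopened (the optimize step likely just coincides with a reopen). A hedged sketch of the reopen pattern:

```java
import org.apache.lucene.search.IndexSearcher;

public class SearcherReopener {
    private IndexSearcher searcher;
    private final String indexPath;

    public SearcherReopener(String indexPath) throws java.io.IOException {
        this.indexPath = indexPath;
        this.searcher = new IndexSearcher(indexPath);
    }

    // Call after the IndexWriter that added documents has been closed;
    // the fresh searcher sees the new segments, optimized or not.
    public synchronized void reopen() throws java.io.IOException {
        searcher.close();
        searcher = new IndexSearcher(indexPath);
    }

    public synchronized IndexSearcher current() {
        return searcher;
    }
}
```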
Oops. Wow, that's a first for me. That message was of course
meant for the tapestry-user list.
Erik
On Mar 22, 2005, at 9:36 PM, Erik Hatcher wrote:
A common pattern I've been implementing is to throw a
RedirectException in my listener methods from form submits or
DirectLink's.