break the index up into smaller sub-indexes so that I can distribute them across separate physical disks for better disk I/O.
Thanks for your help!
Jim
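The split-and-distribute idea above can be sketched with the Lucene 1.x MultiSearcher API. This is a minimal sketch, not code from the thread: the index paths, field name, and query string are all hypothetical, and it assumes each sub-index directory sits on its own physical disk.

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.MultiSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.Searchable;

public class SubIndexSearch {
    public static void main(String[] args) throws Exception {
        // Hypothetical paths: one sub-index per physical disk.
        Searchable[] subSearchers = {
            new IndexSearcher("/disk1/index-part1"),
            new IndexSearcher("/disk2/index-part2")
        };
        // MultiSearcher merges hits from all sub-indexes transparently.
        MultiSearcher searcher = new MultiSearcher(subSearchers);
        Query q = QueryParser.parse("lucene", "contents", new StandardAnalyzer());
        Hits hits = searcher.search(q);
        System.out.println(hits.length() + " hits");
        searcher.close();
    }
}
```

Searching through a MultiSearcher looks identical to searching a single IndexSearcher, so the split is invisible to the query code.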
--- Otis Gospodnetic [EMAIL PROTECTED] wrote:
Hello,
--- James Dunn [EMAIL PROTECTED] wrote:
Hello all,
I have an index that's about 13GB
they had), but are you sure running out of memory is due to Lucene, or could it be a leak in the app from which you are running queries?
Otis
--- James Dunn [EMAIL PROTECTED] wrote:
Doug,
We only search on analyzed text fields. There are a couple of additional fields
Hello,
I was wondering if anyone has had problems with memory
usage and MultiSearcher.
My index is composed of two sub-indexes that I search with a MultiSearcher. The total size of the index is about 3.7GB, with the larger sub-index being 3.6GB and the smaller being 117MB.
I am using Lucene 1.3
from memory.
-Will
-----Original Message-----
From: James Dunn [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, May 26, 2004 3:02 PM
To: [EMAIL PROTECTED]
Subject: Memory usage
Hello,
I was wondering if anyone has had problems with memory usage and MultiSearcher.
My index
Gilberto,
Look at the IndexWriter class. It has a property, maxFieldLength, which you can set to limit the maximum number of terms indexed for a single field.
http://jakarta.apache.org/lucene/docs/api/org/apache/lucene/index/IndexWriter.html
Jim
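As a rough illustration of Jim's pointer above, here is a minimal sketch against the 1.3-era API, where maxFieldLength is a public field on IndexWriter (later versions use a setter). The index path, limit value, and document text are hypothetical:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class MaxFieldLengthDemo {
    public static void main(String[] args) throws Exception {
        // Third argument `true` creates a new index at this (hypothetical) path.
        IndexWriter writer = new IndexWriter("/tmp/demo-index",
                new StandardAnalyzer(), true);
        // Public field in the 1.x API (default 10000): terms beyond this
        // count in a single field are silently not indexed.
        writer.maxFieldLength = 5000;
        Document doc = new Document();
        doc.add(Field.Text("contents", "some very long body text ..."));
        writer.addDocument(doc);
        writer.close();
    }
}
```

Note the cutoff is per field per document, so very large documents simply get truncated in the index rather than rejected.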
--- Gilberto Rodriguez
[EMAIL PROTECTED]
? It stores, internally, up to 200 documents.
Erik
On May 26, 2004, at 4:08 PM, James Dunn wrote:
Will,
Thanks for your response. It may be an object leak.
I will look into that.
I just ran some more tests, and this time I created a 20GB index by repeatedly merging
--- Doug Cutting [EMAIL PROTECTED] wrote:
James Dunn wrote:
Also I search across about 50 fields, but I don't use wildcard or range queries.
Lucene uses one byte of RAM per document per searched field, to hold the normalization values. So if you search a 10M document collection with 50 fields, that's 500MB of RAM for norms alone.
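Doug's one-byte-per-document-per-field figure is easy to sanity-check with back-of-the-envelope arithmetic; the helper name here is mine, not from the thread:

```java
// Norm memory: Lucene 1.x keeps one normalization byte per document
// per searched field, resident while the IndexReader is open.
public class NormMemory {
    static long normBytes(long numDocs, int numFields) {
        return numDocs * numFields; // 1 byte each
    }

    public static void main(String[] args) {
        long bytes = normBytes(10000000L, 50); // 10M docs, 50 fields
        System.out.println(bytes + " bytes = " + bytes / 1000000 + " MB");
        // 500000000 bytes = 500 MB
    }
}
```

This also shows why cutting the number of searched fields is the most direct lever on that memory cost.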
--- [EMAIL PROTECTED] wrote:
It is cached by the IndexReader and lives until the index reader is garbage collected. 50-70 searchable fields is a *lot*. How many are analyzed text, and how many are simply keywords?
Doug
James Dunn wrote:
Doug,
Thanks!
I just asked a question regarding
Kevin,
I have a similar issue. The only solution I have been able to come up with is, after the merge, to open an IndexReader against the merged index, iterate over all the docs, and delete duplicate docs based on my primary key field.
Jim
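Jim's post-merge deduplication approach can be sketched as follows against the 1.x IndexReader API. This is an illustrative sketch only: the index path and the primary-key field name "pk" are hypothetical, and `reader.delete(int)` is the 1.x method (later versions renamed it `deleteDocument`).

```java
import java.util.HashSet;
import java.util.Set;

import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexReader;

public class DedupeIndex {
    public static void main(String[] args) throws Exception {
        IndexReader reader = IndexReader.open("/path/to/merged-index");
        Set seen = new HashSet();
        // Walk every doc ID; the primary-key field must be stored so
        // it can be read back here.
        for (int i = 0; i < reader.maxDoc(); i++) {
            if (reader.isDeleted(i)) continue;
            Document doc = reader.document(i);
            String key = doc.get("pk"); // hypothetical primary-key field
            if (!seen.add(key)) {
                reader.delete(i); // mark later duplicates deleted
            }
        }
        reader.close();
    }
}
```

Note this keeps the first occurrence of each key and deletes the later ones; if the merge can reorder documents, which copy survives may matter.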
--- Kevin A. Burton [EMAIL PROTECTED] wrote:
Let's say
Alex,
Could you send along whatever error messages you are receiving?
Thanks,
Jim
--- Alex Wybraniec [EMAIL PROTECTED] wrote:
I'm sorry if this is not the correct place to post this, but I'm very confused, and getting towards the end of my tether. I need to install/compile and run Lucene
it would be to write one?
Thanks,
Jim
--- Phil brunet [EMAIL PROTECTED] wrote:
Hi.
I had this problem when I transferred a Lucene index by FTP in ASCII mode. Using binary mode, I never had such a problem.
Philippe
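Philippe's fix amounts to forcing a byte-for-byte transfer, since ASCII-mode FTP rewrites line endings inside Lucene's binary index files. A hedged sketch (the host, user, and paths are placeholders):

```shell
# Inside a classic ftp session, switch to binary mode before any `put`/`get`:
#   ftp> binary
# Or sidestep FTP entirely with a tool that always copies bytes verbatim:
rsync -av /local/index/ user@host:/remote/index/
```

Either way, every file in the index directory must travel unmodified; a single corrupted segment file can make the whole index unreadable.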
From: James Dunn [EMAIL PROTECTED]
Reply-To: Lucene Users List [EMAIL PROTECTED]
Which version of Lucene are you using? In 1.2, I believe the lock file was located in the index directory itself. In 1.3, it's in your system's tmp folder. Perhaps it's a permission problem on either of those folders. Maybe your process doesn't have write access to the correct folder and
Hello all,
I have a web site whose search is driven by Lucene 1.3. I've been doing some load testing using JMeter, and occasionally I will see the exception below when the search page is under heavy load.
Has anyone seen similar errors during load testing?
I've seen some posts with similar