From: "Otis Gospodnetic" <[EMAIL PROTECTED]>
To: "Lucene Users List" <[EMAIL PROTECTED]>
Sent: Thursday, August 19, 2004 8:00 AM
Subject: Re: Index Size
Just go for 1.4.1 and look at the CHANGES.txt file to see if there
were any index format changes. If there were, you'll need to
re-index. Also make sure you're not adding the same document twice
into or via your tmp index.
Otis
On Thu, 19 Aug 2004, Rob Jose wrote:
> Paul
> Thank you for your response. I have appended to the bottom of this
> message the field structure that I am using. I hope that this helps.
Dan
Thanks for your response. Yes, I have used Luke to look at the index and
everything looks good.
Rob
----- Original Message -----
From: "Armbrust, Daniel C." <[EMAIL PROTECTED]>
To: "Lucene Users List" <[EMAIL PROTECTED]>
Sent: Thursday, August 19, 2004 9:14 AM
Subject: RE: Index Size
Just go for 1.4.1 and look at the CHANGES.txt file to see if there were
any index format changes. If there were, you'll need to re-index.
Otis
--- Rob Jose <[EMAIL PROTECTED]> wrote:
> Otis
> I am using Lucene 1.3 final. Would it help if I move to Lucene 1.4
> final?
Try a nightly build and see if using that takes care of
your problem.
Otis
--- Rob Jose <[EMAIL PROTECTED]> wrote:
> Hey George
> Thanks for responding. I am using Windows and I don't see any hidden
> files.
> I have a ton of CFS files (1366/1405). I have 22 F# (F1, F2, etc.)
I have noticed that once in a while one of the production indexes will
have a 0-length FNM file.
Rob
----- Original Message -----
From: "Rob Jose" <[EMAIL PROTECTED]>
To: "Lucene Users List" <[EMAIL PROTECTED]>
Sent: Thursday, August 19, 2004 6:42 AM
Subject: Re: Index Size
Can you check your code for any open IndexReaders when indexing, or
paste the relevant part to the list so we can have a look at it?
Hope this helps,
Bernhard
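A minimal sketch of the pattern Bernhard is asking about, assuming the Lucene 1.x API used elsewhere in this thread (the index path is a hypothetical stand-in): close any IndexReader open on the index before an IndexWriter starts adding documents, since an open reader keeps obsolete segment files alive on disk, which can make the index directory grow far beyond the data size.

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;

public class CloseReadersFirst {
    public static void main(String[] args) throws Exception {
        String path = "/indexes/prod"; // hypothetical path

        IndexReader reader = IndexReader.open(path);
        // ... searching / deleting work ...
        reader.close(); // close BEFORE writing: an open reader pins old segment files

        IndexWriter writer = new IndexWriter(path, new StandardAnalyzer(), false);
        Document doc = new Document();
        doc.add(Field.Text("contents", "example text")); // Lucene 1.x Field factory
        writer.addDocument(doc);
        writer.optimize(); // merges segments so obsolete files can actually be removed
        writer.close();
    }
}
```

On Windows in particular, files held open by a reader cannot be deleted at all, so forgotten readers are a common cause of the disk-space symptoms described here.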
Rob Jose wrote:
>Hello
>I have indexed several thousand (52 to be exact) text files and I keep
>running out of disk space to store the indexes.
It was the temporary files which Lucene creates that were taking half
of the total size.
My problem is that after deleting the temporary files, the index size
is the same as the data size. That again seems to be a problem. I am
yet to find out the reason.
Thanks,
george
--- Rob Jose <[EMAIL PROTECTED]> wrote:
> Hello
> I have indexed several thousand (52 to be exact)
> text files and I keep running out of disk space to
> store the indexes. The size of the documents I have
> indexed is around 2.5 GB. The size of the Lucene indexes is around
> 287 GB. Does this seem correct?
har +
sCntyCode);
}
prodWriter.setUseCompoundFile(true);
prodWriter.addIndexes(new IndexReader[] { tempReader });
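The fragment above merges a temporary index into the production index. A fuller sketch of that step, assuming the Lucene 1.4-era API used in the fragment (directory paths and the analyzer are hypothetical stand-ins), might look like:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;

public class MergeTempIndex {
    public static void main(String[] args) throws Exception {
        IndexReader tempReader = IndexReader.open("/indexes/tmp");  // hypothetical path
        IndexWriter prodWriter = new IndexWriter("/indexes/prod",
                new StandardAnalyzer(), false);                     // false = append

        prodWriter.setUseCompoundFile(true); // pack segments into .cfs files
        prodWriter.addIndexes(new IndexReader[] { tempReader });    // merge + optimize

        prodWriter.close();
        tempReader.close(); // an unclosed reader leaves old files pinned on disk
    }
}
```

Note that addIndexes() optimizes the target index as part of the merge, so it needs transient free disk space (roughly the size of the indexes being merged) while it runs.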
----- Original Message -----
From: "Paul Elschot" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, August 19, 2004 12:16 AM
Subject: Re: Index Size
Hello
I have indexed several thousand (52 to be exact) text files and I keep running out of
disk space to store the indexes. The size of the documents I have indexed is around
2.5 GB. The size of the Lucene indexes is around 287 GB. Does this seem correct? I
am not storing the contents of the files.
I added the jar file to my class path and my analyzer appeared in the
analyzers list in the search tab as well as in the analyzers list in
the plugins tab.
I am using Luke v 0.5 (2004-05-25)
Kannan
-----Original Message-----
From: Rob Jose [mailto:[EMAIL PROTECTED]
Sent: Wednesday, July 21, 2004 11:37 AM
To: Lucene Users List
Subject:
Sorry for the slightly off topic post, but I have a need to use luke with my
Analyzer. Has anyone done this? I have added a jar file to my classpath,
but that didn't help.
Thanks in advance
Rob
Is it possible to do a join on two fields when searching a Lucene Index.
For example, I have an index of documents that have a "StudentName" and a
"StudentId" field and another document that has "ClassId", "ClassName" and
"StudentId". I want to do a search on "ClassId" or "ClassName" and get a
list of the students in that class.
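Lucene (certainly the 1.x line discussed in this thread) has no join operator, so the usual workaround is two passes: query one index, collect the key field values from the hits, then build a second query from those values. The sketch below shows only the join logic; the plain Java maps are hypothetical stand-ins for the hits returned from the two indexes, and the field names are taken from the question.

```java
import java.util.*;

public class TwoPassJoin {

    // Pass 1: stand-in for a TermQuery on the class index -- collect the
    // StudentId value from every hit whose ClassName matches.
    static Set<String> studentIdsForClass(List<Map<String, String>> classHits,
                                          String className) {
        Set<String> ids = new LinkedHashSet<>();
        for (Map<String, String> hit : classHits) {
            if (className.equals(hit.get("ClassName"))) {
                ids.add(hit.get("StudentId"));
            }
        }
        return ids;
    }

    // Pass 2: stand-in for a BooleanQuery of StudentId TermQueries against
    // the student index -- keep the students whose id was collected above.
    static List<String> studentNames(List<Map<String, String>> studentHits,
                                     Set<String> ids) {
        List<String> names = new ArrayList<>();
        for (Map<String, String> hit : studentHits) {
            if (ids.contains(hit.get("StudentId"))) {
                names.add(hit.get("StudentName"));
            }
        }
        return names;
    }
}
```

With a real index the second pass would be a BooleanQuery OR-ing one TermQuery per collected StudentId; keep BooleanQuery's maximum clause count in mind if the first pass can return many ids.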