The down and dirty answer is that it's like defragmenting your hard drive:
you're basically compacting the index and sorting out index references. What you
need to know is that it makes searching much faster after you've
updated the index.
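In practice, with the 1.x API that boils down to a single call on the writer
once you're done adding documents. A minimal sketch (the index path here is
made up):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;

    public class OptimizeIndex {
        public static void main(String[] args) throws Exception {
            // Open the existing index (false = don't create a new one).
            IndexWriter writer = new IndexWriter("/path/to/index",
                    new StandardAnalyzer(), false);

            // Merge all segments into one: this is the "defragmentation" step.
            writer.optimize();
            writer.close();
        }
    }

Searchers opened after the optimize see one compact segment instead of many
small ones, which is where the speedup comes from.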
Nader Henein
Miguel Angel wrote:
What does it mean to optimize an index in Lucene?
Erik Hatcher wrote:
The Hits object already does some most-recently-used caching.
Are there any docs on this, or should I look in the source?
I plan on searching terabytes, and have used DAO caching (iBATIS) to make
my DB fast.
I have no idea how fast Lucene will be until I am done and loaded and
have qu
What are you trying to accomplish with your proposed cache that you aren't
already getting from Lucene? Are you finding access to documents slow?
The Hits object already does some most-recently-used caching.
Erik
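For reference, the usual search pattern with that cache looks something like
the sketch below (the index path and the "contents"/"title" field names are
made up). Hits loads documents lazily and keeps the most recently accessed
ones cached, so re-reading a recent hit does not go back to the index:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.queryParser.QueryParser;
    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;

    public class SearchWithHits {
        public static void main(String[] args) throws Exception {
            IndexSearcher searcher = new IndexSearcher("/path/to/index");
            Query query = QueryParser.parse("lucene", "contents",
                    new StandardAnalyzer());

            Hits hits = searcher.search(query);
            for (int i = 0; i < hits.length(); i++) {
                // doc(i) fetches the stored fields on demand and keeps
                // recently used documents cached.
                Document doc = hits.doc(i);
                System.out.println(hits.score(i) + "\t" + doc.get("title"));
            }
            searcher.close();
        }
    }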
On Nov 19, 2004, at 7:58 PM, Vic wrote:
Did someone write a cache of hits yet?
What does it mean to optimize an index in Lucene?
--
Miguel Angel Angeles R.
Asesoria en Conectividad y Servidores
Telf. 97451277
---
Hi, I have 1000 docs (Word, PDF and HTML), and they indexed
in 5 minutes. Is this correct, or do I have a problem with my Analyzer? I
used StandardAnalyzer.
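For reference, a minimal sketch of that kind of indexing code with the
1.4-era API (the path and field names are made up, and it assumes the text
has already been extracted from the Word/PDF/HTML files):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;

    public class IndexOneDoc {
        public static void main(String[] args) throws Exception {
            // true = create a fresh index at this (made-up) path.
            IndexWriter writer = new IndexWriter("/path/to/index",
                    new StandardAnalyzer(), true);

            // One document per file; addDocument() runs the analyzer
            // over the tokenized "contents" field.
            Document doc = new Document();
            doc.add(Field.Text("contents", "text extracted from one file"));
            doc.add(Field.Keyword("path", "/docs/example.pdf"));
            writer.addDocument(doc);

            writer.optimize();  // optional: compact segments when done
            writer.close();
        }
    }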
--
Miguel Angel Angeles R.
Asesoria en Conectividad y Servidores
Telf. 97451277
---
> why reindex?
Well, since I had different experiences with the different analyzers I've tried, I
thought that this problem must originate from either the indexing or a Lucene bug.
> As stated at the end of my mail, I'd expect that to skip the
> first term in the enum.
Yes, this must be a problem for m
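For context, the idiom under discussion looks roughly like the sketch below
(the index path and field name are made up). IndexReader.terms(Term) positions
the enumeration on the first matching term, so reading term() before calling
next() is what avoids skipping it:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.index.TermEnum;

    public class WalkTerms {
        public static void main(String[] args) throws Exception {
            IndexReader reader = IndexReader.open("/path/to/index");
            // The enum starts ON the first term >= ("contents", ""),
            // so use do/while rather than while (terms.next()) { ... }.
            TermEnum terms = reader.terms(new Term("contents", ""));
            try {
                do {
                    Term t = terms.term();
                    if (t == null || !t.field().equals("contents"))
                        break;  // ran past the end of this field
                    System.out.println(t.text() + " docFreq=" + terms.docFreq());
                } while (terms.next());
            } finally {
                terms.close();
                reader.close();
            }
        }
    }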