Can anyone guide me on how to compress data files and at the same time search
in them while they are compressed? Any ideas? Is there any open source
tool for that?
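Lucene itself is one open-source answer: it builds a compact index over the raw data, so queries hit the index rather than the compressed files. If the goal is literally to scan gzip files without ever materializing them uncompressed on disk, a streaming decompressor is enough. A minimal Java sketch using only `java.util.zip` (the class and method names here are mine, not from any existing tool):

```java
import java.io.*;
import java.util.zip.*;

// Sketch: write records to a gzip stream, then search them without ever
// writing the uncompressed data to disk -- GZIPInputStream decompresses
// on the fly while we scan line by line.
public class GzipSearch {

    // Compress an array of text records into gzip bytes.
    public static byte[] compress(String[] records) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(new GZIPOutputStream(bos), "UTF-8")) {
            for (String r : records) {
                w.write(r);
                w.write('\n');
            }
        }
        return bos.toByteArray();
    }

    // Stream-decompress and return true if any record contains the term.
    public static boolean contains(byte[] gz, String term) throws IOException {
        try (BufferedReader in = new BufferedReader(new InputStreamReader(
                new GZIPInputStream(new ByteArrayInputStream(gz)), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                if (line.contains(term)) return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws IOException {
        byte[] gz = compress(new String[] {"alpha record", "beta record"});
        System.out.println(contains(gz, "beta"));   // true
        System.out.println(contains(gz, "gamma"));  // false
    }
}
```

Note this is a linear scan, so it is only sensible for modest data sizes; for large corpora the index-based approach wins.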
Sanyi writes:
If there's a bug, it should be tracked down, not worked around...
Sure, but I'm working with 20 million records and it takes about 25 hours to
re-index, so I'm looking for ways that don't require reindexing.
why reindex?
My code was:
WildcardTermEnum wcenum =
Hey All;
Is it possible for there to be a situation where the locking file is in place
after the reader has been closed?
I have extra logging in place and have followed the code execution. The reader
finishes deleting old content and closes (I know this for sure). This is the
only reader
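One pattern worth double-checking in a case like this: if the close happens outside a finally block, any exception thrown while deleting content can skip it and leave the lock file behind. A generic sketch of the safe shape (the `LockTrackingReader` class is a stand-in I made up for the demo, not a Lucene class):

```java
import java.io.Closeable;
import java.io.IOException;

// Pattern: whatever "reader" is (an IndexReader in the Lucene case),
// close it in a finally block so the lock is released even when the
// delete loop throws.
public class SafeClose {

    static class LockTrackingReader implements Closeable {
        boolean locked = true;                 // pretend the lock file exists
        void deleteOldContent() throws IOException {
            throw new IOException("simulated failure mid-delete");
        }
        public void close() {
            locked = false;                    // lock file removed
        }
    }

    public static boolean deleteSafely(LockTrackingReader reader) {
        try {
            reader.deleteOldContent();
            return true;
        } catch (IOException e) {
            return false;                      // the work failed...
        } finally {
            reader.close();                    // ...but the lock still goes away
        }
    }

    public static void main(String[] args) {
        LockTrackingReader r = new LockTrackingReader();
        boolean ok = deleteSafely(r);
        System.out.println(ok);        // false: the delete failed
        System.out.println(r.locked);  // false: lock released anyway
    }
}
```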
Is it possible that while my searcher process is reading the directory
the index writer process performs a merge? If so, then I think the merge
could remove segment files before they are read by the reader. When the
reader tries to read one of the now-missing segment files
It is possible, but it's not likely, as other users are not reporting
this.
Otis
--- Luke Shannon [EMAIL PROTECTED] wrote:
Hey All;
Is it possible for there to be a situation where the locking file is
in place after the reader has been closed?
I have extra logging in place and have
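For what it's worth, the usual workaround people apply to this kind of reader/merger race is to catch the FileNotFoundException and retry the open. A generic retry helper (the helper name and the Callable standing in for "reopen the searcher and run the query" are my own sketch, not list-sanctioned code):

```java
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.concurrent.Callable;

// Sketch: if an operation fails because a segment file vanished under us
// (a merge swapped the files out), retry it a bounded number of times.
public class RetryOpen {

    public static <T> T withRetries(Callable<T> op, int attempts) throws Exception {
        IOException last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                return op.call();
            } catch (FileNotFoundException e) {
                last = e;           // a merge probably deleted a segment; retry
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Fails twice (as if segments were swapped out mid-read), then succeeds.
        String result = withRetries(() -> {
            if (++calls[0] < 3) throw new FileNotFoundException("_1.cfs");
            return "hits";
        }, 5);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```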
If you have more than one Lucene application running on the same machine,
do they all share the same temp file? At least I had this problem when I ran
my application in two different instances of WebLogic on the same machine.
Praveen
- Original Message -
From: Otis Gospodnetic [EMAIL PROTECTED]
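If I recall correctly, Lucene of this era keeps its lock files under java.io.tmpdir unless the org.apache.lucene.lockDir system property points elsewhere, which is why two instances on one machine can collide; please verify the property name against your Lucene version's FSDirectory before relying on it. A sketch of giving each instance its own lock directory (the directory-naming scheme is mine):

```java
import java.io.File;

// Sketch: point each JVM at its own lock directory before any index is
// opened, so two app-server instances on one machine stop sharing locks.
// The property name org.apache.lucene.lockDir should be checked against
// your Lucene version.
public class PerInstanceLockDir {

    public static File configureLockDir(String instanceName) {
        File dir = new File(System.getProperty("java.io.tmpdir"),
                            "lucene-locks-" + instanceName);
        dir.mkdirs();                                // one lock dir per instance
        System.setProperty("org.apache.lucene.lockDir", dir.getAbsolutePath());
        return dir;
    }

    public static void main(String[] args) {
        File d = configureLockDir("weblogic-node-1");
        System.out.println(System.getProperty("org.apache.lucene.lockDir")
                .equals(d.getAbsolutePath()));       // true
    }
}
```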
Did someone write a cache of hits yet? Like they have for DAO.
For example, I say Dao.search(XYZ);
It first checks the memory cache to see if this was just asked; on a cache
miss it runs a search and puts it in the cache.
If not, I will; it would take me a few weeks.
.V
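The check-cache-then-search behavior described above can be sketched with a plain LinkedHashMap in access order, which gives LRU eviction for free; runSearch() here is a placeholder for the real Lucene/DAO query, and all names are mine:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of a hits cache: check an in-memory map first, run the real
// search only on a miss. LinkedHashMap with accessOrder=true plus
// removeEldestEntry() is a simple LRU.
public class HitsCache {

    private final int maxEntries;
    private final Map<String, List<String>> cache;
    int misses = 0;                     // exposed for the demo below

    public HitsCache(int maxEntries) {
        this.maxEntries = maxEntries;
        this.cache = new LinkedHashMap<String, List<String>>(16, 0.75f, true) {
            protected boolean removeEldestEntry(Map.Entry<String, List<String>> e) {
                return size() > HitsCache.this.maxEntries;   // LRU eviction
            }
        };
    }

    public synchronized List<String> search(String query) {
        List<String> hits = cache.get(query);
        if (hits == null) {             // cache miss: do the expensive search
            misses++;
            hits = runSearch(query);
            cache.put(query, hits);
        }
        return hits;
    }

    // Placeholder for the real search call.
    private List<String> runSearch(String query) {
        return Collections.singletonList("hit-for-" + query);
    }

    public static void main(String[] args) {
        HitsCache c = new HitsCache(2);
        c.search("XYZ");
        c.search("XYZ");                // second call served from cache
        System.out.println(c.misses);   // 1
    }
}
```

One caveat worth designing for from the start: any cached hits must be invalidated when the index changes, or searches will return stale results.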