Hi
The IndexMergeTool (see URL below) creates a new index, the "mergedIndex".
Do the other indexes, "index1", "index2", etc., need to be closed
before performing the merge?
This is the same as asking if the indexes passed to
IndexWriter.addIndexes need to be closed before they are added to the
new index.
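For context, the call in question looks something like this (a sketch only, assuming the Lucene 2.1-era API; the index paths are made up):

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class MergeSketch {
    public static void main(String[] args) throws Exception {
        Directory merged = FSDirectory.getDirectory("mergedIndex");
        Directory[] sources = {
            FSDirectory.getDirectory("index1"),
            FSDirectory.getDirectory("index2"),
        };
        // true = create a new, empty "mergedIndex"
        IndexWriter writer = new IndexWriter(merged, new StandardAnalyzer(), true);
        // The question above: must any writers/readers still open on the
        // source directories be closed before this call?
        writer.addIndexes(sources);
        writer.close();
    }
}
```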
>
> "pkimber" <[EMAIL PROTECTED]> wrote:
>
> > We are still getting various issues on our Lucene indexes running on
> > an NFS share. It has taken me some time to find some useful
> > information to report to the mailing list.
>
> Bummer!
>
Can you zip up your test application that shows the issue?
Hi Kai
No, I have no problem returning hits.
When I do have problems like this, I usually find I have something
more to learn about Lucene indexing. Try looking at the data and
query in Luke; I find this is the best way to understand what
is going on.
Here is the link to Luke:
http://w
Hi Kai
We keep a synchronized map of LuceneIndexAccessor instances, one instance per
Directory. The map is keyed on the directory path. We then re-use
the accessor rather than creating a new one each time.
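A stripped-down sketch of that pattern (names are illustrative; the real code holds LuceneIndexAccessor instances, for which a plain Object stands in here):

```java
import java.util.HashMap;
import java.util.Map;

// One accessor per index directory, created lazily and then re-used.
public class AccessorCache {
    private final Map accessors = new HashMap(); // directory path -> accessor

    public synchronized Object getAccessor(String directoryPath) {
        Object accessor = accessors.get(directoryPath);
        if (accessor == null) {
            accessor = new Object(); // e.g. new LuceneIndexAccessor(directory)
            accessors.put(directoryPath, accessor);
        }
        return accessor;
    }
}
```

The important property is that two callers passing the same path always get the same accessor instance, so they coordinate through it.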
Patrick
On 06/08/07, Kai Hu <[EMAIL PROTECTED]> wrote:
> Thanks , Patrick,
>
> It is use
Hi Kai
We use the Lucene Index Accessor contribution:
http://www.nabble.com/Fwd%3A-Contribution%3A-LuceneIndexAccessor-t17416.html#a47049
Patrick
On 06/08/07, Kai Hu <[EMAIL PROTECTED]> wrote:
> Hi,
>
> How do you solve the problems when adding, updating, and deleting
> documents in multiple threads, us
Hi Andy
I think:
Field.Text("name", "value");
has been replaced with:
new Field("name", "value", Field.Store.YES, Field.Index.TOKENIZED);
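In context, the replacement looks something like this (a sketch assuming the Lucene 2.x API; the field name and value are made up):

```java
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;

public class FieldMigration {
    public static void main(String[] args) {
        Document doc = new Document();
        // Lucene 1.4.x (removed in 2.0): doc.add(Field.Text("name", "value"));
        // Lucene 2.x equivalent: stored and tokenized.
        doc.add(new Field("name", "value", Field.Store.YES, Field.Index.TOKENIZED));
        System.out.println(doc.get("name"));
    }
}
```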
Patrick
On 25/07/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
Please see the FAQ entry "How do I get code written for Lucene 1.4.x to
work with Lucene 2.x?"
http
"Patrick Kimber" <[EMAIL PROTECTED]> wrote:
> Yes, there are many lines in the logs saying:
> hit FileNotFoundException when loading commit "segment_X"; skipping
> this commit point
> ...so it looks like the new code is working
writer is created?
I will add a check to my test to see if all documents are added. This
should tell us if any documents are being silently lost.
Thanks
Patrick
On 03/07/07, Michael McCandless <[EMAIL PROTECTED]> wrote:
"Patrick Kimber" <[EMAIL PROTECTED]> wrote:
> I
> machine will not see the current segments_N file written by the first
> machine and will incorrectly remove the newly created files.
>
> I think that "take2" JAR should at least resolve this
> FileNotFoundException but I think likely you are about to hit this new
> issue.
machine will not see the current segments_N file written by the first
machine and will incorrectly remove the newly created files.
I think that "take2" JAR should at least resolve this
FileNotFoundException but I think likely you are about to hit this new
issue.
Mike
"Patrick Kimber" <[EMAIL PROTECTED]> wrote:
> Hi Michael
>
> I am really pleased we
Hi Michael
I am really pleased we have a potential fix. I will look out for the patch.
Thanks for your help.
Patrick
On 03/07/07, Michael McCandless <[EMAIL PROTECTED]> wrote:
"Patrick Kimber" <[EMAIL PROTECTED]> wrote:
> I am using the NativeFSLockFactory. I w
Yes, it should close all writers and readers to release all
the locks.
An alternative solution is to create a separate index
for each server; this helps because only one thread will be updating
each index, so there won't be any locking problems.
Cheers,
Neeraj
appreciated.
Patrick
On 30/06/07, Michael McCandless <[EMAIL PROTECTED]> wrote:
Patrick Kimber wrote:
> I have been checking the application log. Just before the time when
> the lock file errors occur I found this log entry:
> [11:28:59] [ERROR] Inde
You might try just using one of the nodes as the writer. In Michael's
comments, he always seems to mention the pattern of one writer and many
readers on NFS. In this case you could use no LockFactory and perhaps
gain a little speed there.
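If you do go the single-writer route, disabling locking might look something like this (a sketch assuming the Lucene 2.1-era LockFactory API; the path is made up):

```java
import java.io.File;

import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.store.NoLockFactory;

public class SingleWriterSketch {
    public static void main(String[] args) throws Exception {
        // Safe only when exactly one process ever opens an IndexWriter
        // on this directory; readers do not need the write lock.
        FSDirectory dir = FSDirectory.getDirectory(
                new File("/mnt/nfs/index"),
                NoLockFactory.getNoLockFactory());
        System.out.println(dir.getLockFactory().getClass().getName());
        dir.close();
    }
}
```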
- Mark
Patrick Kimber wrote:
> Hi Mark
>
> Yes, thank you.
t around the issue
somehow, but just to throw it out there...
- Mark
On 6/29/07, Patrick Kimber < [EMAIL PROTECTED]> wrote:
>
>
>
> I am using the Lucene Index Accessor contribution to co-ordinate the
> readers and writers:
>
>
http://www.nabble.com/Fwd
> Can write attempts come from different nodes in the cluster?
> Can you make sure that when "the" writer gets the lock time-out there is
> indeed no other active writer?
>
> Doron
>
> "Patrick Kimber" <[EMAIL PROTECTED]> wrote on 29/06/2007
> 02:01:08:
>
> > Hi,
>
w2.open timed out... but w3 closed the index, so the
lock file was supposed to be removed; why wasn't it?
Can write attempts come from different nodes in the cluster?
Can you make sure that when "the" writer gets the lock time-out there is
indeed no other active writer?
Doron
Hi,
We are sharing a Lucene index in a Linux cluster over an NFS share. We have
multiple servers reading and writing to the index.
I am getting regular lock exceptions e.g.
Lock obtain timed out:
NativeFSLock@/mnt/nfstest/repository/lucene/lock/lucene-2d3d31fa7f19eabb73d692df44087d81-n-write.lo
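For reference, the setup that produces lock files like the one named above looks roughly like this (a sketch assuming the Lucene 2.1-era API; the index path is made up, only the lock directory is taken from the message):

```java
import java.io.File;

import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.store.NativeFSLockFactory;

public class NfsLockSketch {
    public static void main(String[] args) throws Exception {
        // Dedicated lock directory on the NFS share; NativeFSLockFactory
        // creates its hashed "lucene-...-n-write.lock" files here.
        File lockDir = new File("/mnt/nfstest/repository/lucene/lock");
        FSDirectory dir = FSDirectory.getDirectory(
                new File("/mnt/nfstest/repository/lucene/index"),
                new NativeFSLockFactory(lockDir));
        dir.close();
    }
}
```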
Hi
You could try SOLR
http://lucene.apache.org/solr/
This is obviously Java but you can access it using .NET...
Hope this helps
Patrick
On 09/02/07, Kainth, Sachin <[EMAIL PROTECTED]> wrote:
Hello all,
Does anyone know if there is a .NET version of Lucene Web Service?
Cheers
Hi Teresa
You need to convert the PDF file into text before adding the
text to the Lucene index.
You may like to look at http://www.pdfbox.org/ for a library to
convert PDF files to text.
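A minimal sketch of that conversion (assuming the PDFBox 0.7-era API; the file name is made up):

```java
import java.io.File;

import org.pdfbox.pdmodel.PDDocument;
import org.pdfbox.util.PDFTextStripper;

public class PdfToText {
    public static void main(String[] args) throws Exception {
        PDDocument doc = PDDocument.load(new File("manual.pdf"));
        try {
            // Extract plain text; the result can then be added to a
            // Lucene Document as a tokenized Field.
            String text = new PDFTextStripper().getText(doc);
            System.out.println(text);
        } finally {
            doc.close();
        }
    }
}
```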
Patrick
On 27/06/06, mcarcelen <[EMAIL PROTECTED]> wrote:
Hi,
I'm new to Lucene and I'm t
Hi Adam
Thanks for your help.
Patrick
On 23/05/06, Adam Constabaris <[EMAIL PROTECTED]> wrote:
Patrick Kimber wrote:
> Hi Adam
>
> We are getting the same error. Did you manage to work out what was
> causing the problem?
>
> Thanks
> Patrick
I can't say anyth
Hi Adam
We are getting the same error. Did you manage to work out what was
causing the problem?
Thanks
Patrick
On 21/04/06, Adam Constabaris <[EMAIL PROTECTED]> wrote:
This is a puzzler, I'm not sure if I'm doing something wrong or whether
I have a poisoned document, a corrupted index (failin
Hi Nick
Have you tried the Lucene Index Accessor contribution?
We have a similar update/search pattern and it works very well.
http://www.nabble.com/Fwd%3A-Contribution%3A-LuceneIndexAccessor-t17416.html#a47049
Patrick
On 28/03/06, Nick Atkins <[EMAIL PROTECTED]> wrote:
> I'm using Lucene runn
Hi Thomas
I have been getting similar errors and am trying to investigate the cause.
My current thinking is that it is caused by my virus checker opening
the files. The error only occurs on Windows. When I run the same
test on Linux I do not get the error.
Not much help I know... but at least yo
Hi Nikhil
We are using the index accessor contribution. For more information see:
http://www.nabble.com/Fwd%3A-Contribution%3A-LuceneIndexAccessor-t17416.html#a47049
This should help you to co-ordinate the IndexSearcher and IndexWriter.
Patrick
On 13/03/06, Nikhil Goel <[EMAIL PROTECTED]> wrote:
Hi Haritha
Hope the following helps:
Build Lucene Core from SVN
Download the lucene Subversion repository from:
http://svn.apache.org/repos/asf/lucene/java/trunk
Note: The CVS repository is still accessible but is out of date.
I downloaded to:
C:\src\lucene-svn\
To build (using ANT):
cd C:\sr
Hi
You should download the snowball contribution, which is in the
Subversion repository:
http://svn.apache.org/repos/asf/lucene/java/trunk/contrib/snowball
This can be built using ANT.
Patrick
On 06/03/06, Haritha_Parvatham <[EMAIL PROTECTED]> wrote:
> Hi,
> Can anyone guide me to integrate s
I am getting intermittent errors with Lucene. Here are two examples:
java.io.IOException: Cannot rename E:\lucene\segments.new to E:\lucene\segments
java.io.IOException: Cannot rename E:\lucene\_8ya.tmp to E:\lucene\_8ya.del
This issue has an open BugZilla entry:
http://issues.apache.org/bugzilla
m) for extracting text
> from word documents, please let me know
>
> gui
>
> On Thu, 2005-11-24 at 16:58 +, Patrick Kimber wrote:
> > Thanks for the very quick response.
> >
> > On 24/11/05, Guilherme Barile <[EMAIL PROTECTED]> wrote:
> > > I hav
Hi Colin
Did you get some help?
Are you using Windows? If so, you can install TortoiseSVN which is a
shell extension:
http://tortoisesvn.tigris.org/
If you are using Windows or Linux you can use SmartSVN
http://www.smartcvs.com/smartsvn/
The url for Lucene on SVN is:
http://svn.apache.org/repos
Thanks for the very quick response.
On 24/11/05, Guilherme Barile <[EMAIL PROTECTED]> wrote:
> I have it here, uploaded it to rapidshare
> http://rapidshare.de/files/8097202/textmining.zip.html
>
> c ya
>
>
> On Thu, 2005-11-24 at 16:46 +, Patrick Kimber wrote
Hi
I am trying to download the source code for
tm-extractors-0.4.jar
from
http://www.textmining.org/
Looks like the site has been hacked.
Does anyone know the location of the CVS or SVN repository?
Thanks for your help...
Pat
> On Tuesday 15 November 2005 11:24, Patrick Kimber wrote:
>
> > I have checked out the latest version of Lucene from CVS and have
> > found a change in the results compared to version 1.4.3.
>
> Lucene isn't in CVS anymore, it's
Hi
I have checked out the latest version of Lucene from CVS and have
found a change in the results compared to version 1.4.3.
The issue is with the deprecated API in the BooleanQuery class. The
deprecated function:
"public void add(Query query, boolean required, boolean prohibited)"
is returning d
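For anyone comparing the two APIs: the three-boolean add maps onto the BooleanClause.Occur values that replaced it (a sketch assuming Lucene 1.9+; the field and terms are made up):

```java
import org.apache.lucene.index.Term;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.TermQuery;

public class BooleanMigration {
    public static void main(String[] args) {
        BooleanQuery bq = new BooleanQuery();
        // old: add(q, true,  false) -> required clause
        bq.add(new TermQuery(new Term("body", "lucene")), BooleanClause.Occur.MUST);
        // old: add(q, false, false) -> optional clause
        bq.add(new TermQuery(new Term("body", "search")), BooleanClause.Occur.SHOULD);
        // old: add(q, false, true)  -> prohibited clause
        bq.add(new TermQuery(new Term("body", "draft")), BooleanClause.Occur.MUST_NOT);
        System.out.println(bq.toString("body"));
    }
}
```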
We have found the book to be excellent.
http://lucenebook.com/
On 07/10/05, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> Could you tell me where i can find detailed documentation on lucene?
>
> thanks
>
>
Hi Giovanni
We are using the Neko HTML parser. Some simple example code can be
found in the "Lucene in Action" book.
For more information:
http://www.manning.com/books/hatcher2
http://www.apache.org/~andyc/neko/doc/html/
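The basic NekoHTML usage is along these lines (a sketch; the file name is made up, and the title lookup assumes the page has a TITLE element):

```java
import java.io.FileReader;

import org.cyberneko.html.parsers.DOMParser;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class HtmlParseSketch {
    public static void main(String[] args) throws Exception {
        DOMParser parser = new DOMParser();
        parser.parse(new InputSource(new FileReader("page.html")));
        Document doc = parser.getDocument();
        // NekoHTML upper-cases element names by default.
        System.out.println(
            doc.getElementsByTagName("TITLE").item(0).getFirstChild().getNodeValue());
    }
}
```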
Patrick
On 29/07/05, Giovanni Novelli <[EMAIL PROTECTED]> wrote:
> Hello,