I think the Apache mail server is eating the attachment. Try making it a .diff file, or attach the patch to a Jira issue. Thanks!
Andi..

On Jul 6, 2012, at 18:54, Roman Chyla <roman.ch...@gmail.com> wrote:
> Attaching the patch (there is no chance I could do it in one go, but
> if parts are committed in the trunk, then we can do more). I have also
> introduced a base class for unittests, so that may be st to wave.
>
> So far, I found one serious problem that crashes the VM -- see e.g.
> test/test_BinaryDocument.py - when getting the document using:
>
>   reader.document(0)
>
> What works fine now:
>
> test/
>   test_Analyzers
>   test_Binary
>   test_RegexQuery
>
> samples/LuceneInAction/
>   index.py
>   BasicSearchingTest.py
>
> On Thu, Jul 5, 2012 at 8:22 PM, Roman Chyla <roman.ch...@gmail.com> wrote:
>> The patch probably didn't make it to the list; I'll file a ticket
>> later.
>>
>> It is definitely a lot of work with the Python code. I have gone through
>> 1.5 test cases now, and it is just 'unpleasant', so many API changes
>> out there - but I'll try to convert more.
>>
>> roman
>>
>> On Thu, Jul 5, 2012 at 7:48 PM, Andi Vajda <va...@apache.org> wrote:
>>>
>>> On Jul 6, 2012, at 0:27, Roman Chyla <roman.ch...@gmail.com> wrote:
>>>
>>>> Lucene 4.0 is in alpha release and we would like to start working with
>>>> PyLucene 4.0 already. I checked out the PyLucene trunk and made the
>>>> necessary changes so that it compiles. Would it be possible to
>>>> incorporate (some of) these changes?
>>>
>>> Absolutely, please send a patch to the list or file a bug and attach it
>>> there.
>>>
>>> The issue with a PyLucene 4.0 release is not so much getting it to compile
>>> and run, but rewriting all the tests and samples (originally ported from
>>> Java), since the Lucene API changed in many ways. That's a large amount of
>>> work, and some of the new analyzer/tokenizer framework stuff needs new
>>> jcc support for generating classes on the fly. I've got that written to
>>> some extent already, but porting the samples and tests again is daunting.
>>>
>>> Andi..
>>>
>>>> Thanks,
>>>>
>>>> Roman