To whom it may engage...
This is an automated request, but not an unsolicited one. For
more information please visit http://gump.apache.org/nagged.html,
and/or contact the folk at [EMAIL PROTECTED]
Project lucene-java has an issue affecting its community integration.
This issue affects
[ https://issues.apache.org/jira/browse/LUCENE-957?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Doron Cohen updated LUCENE-957:
---
Attachment: lucene-957.patch
Previous patch apparently did not fix the bug - a casting problem in
RA
[ https://issues.apache.org/jira/browse/LUCENE-868?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-868:
---
Attachment: LUCENE-868-v2.patch
New patch that passes all tests (and compiles against the mem
[ https://issues.apache.org/jira/browse/LUCENE-868?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-868:
---
Attachment: (was: LUCENE-868-v1.patch)
> Making Term Vectors more accessible
> --
[ https://issues.apache.org/jira/browse/LUCENE-868?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12512842 ]
Grant Ingersoll commented on LUCENE-868:
I also switched TermVectorMapper to be an abstract class per Yonik's
See http://lucene.zones.apache.org:8080/hudson/job/Lucene-Nightly/152/changes
--
[...truncated 822 lines...]
A contrib/gdata-server/webroot/WEB-INF/classes/gdata-account.xsd
A contrib/gdata-server/CHANGES.txt
A contrib/gdata-server/li
Do we have a best practice for going from, say a SpanQuery doc/
position information and retrieving the actual range of positions of
content from the Document? Is it just to reanalyze the Document
using the appropriate Analyzer and start recording once you hit the
positions you are interest
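The approach floated in the question above (re-analyze the stored field and start recording once the span's positions are reached) can be sketched roughly as follows. This is a hedged illustration only, not the list's agreed best practice: a plain whitespace split stands in for running the field's actual Analyzer/TokenStream, and the class and method names (`SpanTextExtractor`, `termsInSpan`) are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: re-tokenize the stored field text and collect the terms whose
// token positions fall inside a span's [start, end) range. In Lucene
// proper you would run the same Analyzer used at index time and track
// position increments; a whitespace split is used here as a stand-in.
public class SpanTextExtractor {
    public static List<String> termsInSpan(String fieldText, int start, int end) {
        List<String> hits = new ArrayList<String>();
        int position = 0; // token position, assuming position increments of 1
        for (String token : fieldText.split("\\s+")) {
            if (token.isEmpty()) {
                continue; // skip artifact of leading whitespace
            }
            if (position >= start && position < end) {
                hits.add(token);
            }
            position++;
        }
        return hits;
    }

    public static void main(String[] args) {
        // Positions 1 and 2 of the field text
        System.out.println(termsInSpan("the quick brown fox jumps", 1, 3));
    }
}
```

With a real Analyzer the split loop would instead iterate a TokenStream and advance `position` by each token's position increment, so that stopword gaps and multi-position tokens are handled the same way they were at index time.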
SpanQueryFilter addition
Key: LUCENE-960
URL: https://issues.apache.org/jira/browse/LUCENE-960
Project: Lucene - Java
Issue Type: Improvement
Components: Search
Reporter: Grant Ingersoll
[ https://issues.apache.org/jira/browse/LUCENE-960?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-960:
---
Attachment: SpanQueryFilter.java
Patch and tests for SpanQueryFilter
> SpanQueryFilter addit
[ https://issues.apache.org/jira/browse/LUCENE-960?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-960:
---
Attachment: (was: SpanQueryFilter.java)
> SpanQueryFilter addition
>
[ https://issues.apache.org/jira/browse/LUCENE-960?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-960:
---
Attachment: SpanQueryFilter.patch
Try again w/ an actual patch
> SpanQueryFilter addition
>
[ https://issues.apache.org/jira/browse/LUCENE-960?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Grant Ingersoll updated LUCENE-960:
---
Priority: Minor (was: Trivial)
Lucene Fields: [New, Patch Available] (was: [Patch A
: Do we have a best practice for going from, say a SpanQuery doc/
: position information and retrieving the actual range of positions of
: content from the Document? Is it just to reanalyze the Document
: using the appropriate Analyzer and start recording once you hit the
: positions you are inte
Slowly catching up...
Grant Ingersoll wrote:
> I think legally we are fine, since we aren't actually shipping it. I
> just mean that people may not want to wait however long it takes to
> download it. Of course, I don't know a work around other than to
> have some smaller set.
Measured the tim
[
https://issues.apache.org/jira/browse/LUCENE-530?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12512866
]
Mohammad Norouzi commented on LUCENE-530:
-
Hi
I am using this nice class but because of my requirements I had