Re: These two tests are failing a LOT over the last few days....

2019-06-06 Thread Jan Høydahl
Yeah, I think so. I’ll @AwaitsFix them for now...

Jan Høydahl

> 7. jun. 2019 kl. 01:54 skrev Kevin Risden :
> 
> Is it due to the HTTP2 client changes for basic auth? 
> 
> https://issues.apache.org/jira/browse/SOLR-13510
> 
> Kevin Risden
> 
> 
>> On Thu, Jun 6, 2019 at 7:50 PM Erick Erickson  
>> wrote:
>> [repro]   5/5 failed: org.apache.solr.security.BasicAuthIntegrationTest
>> [repro]   5/5 failed: org.apache.solr.security.JWTAuthPluginIntegrationTest
>> 
>> Anyone recognize what recent commits might have triggered these?
>> 
>> -
>> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
>> For additional commands, e-mail: dev-h...@lucene.apache.org
>> 


[jira] [Updated] (LUCENE-8791) Add CollectorRescorer

2019-06-06 Thread Elbek Kamoliddinov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8791?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Elbek Kamoliddinov updated LUCENE-8791:
---
Attachment: LUCENE-8791.patch

> Add CollectorRescorer
> -
>
> Key: LUCENE-8791
> URL: https://issues.apache.org/jira/browse/LUCENE-8791
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Elbek Kamoliddinov
>Priority: Major
> Attachments: LUCENE-8791.patch, LUCENE-8791.patch, LUCENE-8791.patch, 
> LUCENE-8791.patch
>
>
> This is another implementation of the query rescorer API (LUCENE-5489). It adds 
> rescoring functionality based on a provided CollectorManager. 
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8791) Add CollectorRescorer

2019-06-06 Thread Elbek Kamoliddinov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8791?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858314#comment-16858314
 ] 

Elbek Kamoliddinov commented on LUCENE-8791:


After being MIA, I am back at this. I was thinking along the lines of what Mike 
suggested, but even more brutal: the change is to have one sole ctor which takes a 
single {{CollectorManager}} param, plus a setter for the {{ExecutorService}} 
object. This way, running single-threaded gets first-class treatment. 
I have attached an updated patch. 

[~atris]
{quote}
Interesting. How do you early terminate across multiple slices running 
concurrently?
{quote}
We distribute the total number of results we are looking for evenly across 
segments, plus some static number for overhead (I think), and terminate once each 
segment has collected enough docs.   
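
For reference, here is a rough usage sketch of the API shape described above (a 
single constructor taking a {{CollectorManager}}, with an optional 
{{ExecutorService}} set via a setter). {{CollectorRescorer}} comes from the 
attached patch rather than a released Lucene API, and {{createManager()}} / 
{{executor}} below are just placeholders:

{code:java}
// Sketch only: names follow the comment above, not a released API.
CollectorManager<?, ?> manager = createManager();             // any CollectorManager (placeholder)
CollectorRescorer rescorer = new CollectorRescorer(manager);  // sole ctor
rescorer.setExecutorService(executor);                        // optional; omit to run single-threaded

TopDocs firstPass = searcher.search(query, 100);              // first-pass results
TopDocs rescored = rescorer.rescore(searcher, firstPass, 10); // rescore top 100 down to 10
{code}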

> Add CollectorRescorer
> -
>
> Key: LUCENE-8791
> URL: https://issues.apache.org/jira/browse/LUCENE-8791
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Elbek Kamoliddinov
>Priority: Major
> Attachments: LUCENE-8791.patch, LUCENE-8791.patch, LUCENE-8791.patch
>
>
> This is another implementation of the query rescorer API (LUCENE-5489). It adds 
> rescoring functionality based on a provided CollectorManager. 
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-8.1-Windows (64bit/jdk-12) - Build # 133 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Windows/133/
Java: 64bit/jdk-12 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

12 tests failed.
FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth

Error Message:
Expected metric minimums for prefix SECURITY./authentication.: 
{failMissingCredentials=2, authenticated=19, passThrough=9, 
failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=17, passThrough=11, 
totalTime=17165100, failWrongCredentials=1, requestTimes=1271, requests=31, 
errors=0}

Stack Trace:
java.lang.AssertionError: Expected metric minimums for prefix 
SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=17, passThrough=11, 
totalTime=17165100, failWrongCredentials=1, requestTimes=1271, requests=31, 
errors=0}
at 
__randomizedtesting.SeedInfo.seed([61D2F8C846CE77E2:DDBC8EDAE29DF498]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
at 
org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 

[GitHub] [lucene-solr] iverase opened a new pull request #703: LUCENE-8838: Remove support for Steiner points

2019-06-06 Thread GitBox
iverase opened a new pull request #703: LUCENE-8838: Remove support for Steiner 
points
URL: https://github.com/apache/lucene-solr/pull/703
 
 
   Remove unused Steiner points support and fail if a hole is reduced to a 
point when all points are coplanar.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8838) Tessellator: Remove support for Steiner points

2019-06-06 Thread Ignacio Vera (JIRA)
Ignacio Vera created LUCENE-8838:


 Summary: Tessellator: Remove support for Steiner points
 Key: LUCENE-8838
 URL: https://issues.apache.org/jira/browse/LUCENE-8838
 Project: Lucene - Core
  Issue Type: Bug
Reporter: Ignacio Vera


The Tessellator has support for Steiner points, which comes from the original 
porting of MapBox's earcut algorithm to Java. We are not using such points, and 
therefore it would be better to remove the support.

In addition, it actually introduces a bug when a polygon hole is a line with all 
coplanar points. In some cases the hole can be reduced to a point and then treated 
as a Steiner point. This looks to be wrong, and in those cases we should throw 
an error.
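
As a rough illustration of the intended behavior (a sketch with assumed method and 
parameter names, not the actual Tessellator code), a hole whose ring collapses to a 
single distinct point would be rejected up front instead of being folded into the 
tessellation as a Steiner point:

{code:java}
// Sketch only, with assumed names.
static void checkHoleNotCollapsed(double[] holeX, double[] holeY) {
  for (int i = 1; i < holeX.length; i++) {
    if (holeX[i] != holeX[0] || holeY[i] != holeY[0]) {
      return; // at least two distinct points: the hole is not reduced to a point
    }
  }
  throw new IllegalArgumentException("polygon hole is reduced to a single point");
}
{code}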



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-8.1-Linux (64bit/jdk-11.0.2) - Build # 394 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Linux/394/
Java: 64bit/jdk-11.0.2 -XX:+UseCompressedOops -XX:+UseParallelGC

15 tests failed.
FAILED:  org.apache.solr.cloud.NestedShardedAtomicUpdateTest.test

Error Message:
Error from server at http://127.0.0.1:41089/xfw/n/collection1: non ok status: 
500, message:Server Error

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server at http://127.0.0.1:41089/xfw/n/collection1: non ok status: 500, 
message:Server Error
at 
__randomizedtesting.SeedInfo.seed([A3ADC128F10D592E:2BF9FEF25FF134D6]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:579)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:207)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:224)
at 
org.apache.solr.BaseDistributedSearchTestCase.add(BaseDistributedSearchTestCase.java:576)
at 
org.apache.solr.cloud.NestedShardedAtomicUpdateTest.indexDocAndRandomlyCommit(NestedShardedAtomicUpdateTest.java:221)
at 
org.apache.solr.cloud.NestedShardedAtomicUpdateTest.sendWrongRouteParam(NestedShardedAtomicUpdateTest.java:191)
at 
org.apache.solr.cloud.NestedShardedAtomicUpdateTest.test(NestedShardedAtomicUpdateTest.java:55)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 

[jira] [Updated] (LUCENE-8811) Add maximum clause count check to IndexSearcher rather than BooleanQuery

2019-06-06 Thread Atri Sharma (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Atri Sharma updated LUCENE-8811:

Attachment: LUCENE-8811.patch

> Add maximum clause count check to IndexSearcher rather than BooleanQuery
> 
>
> Key: LUCENE-8811
> URL: https://issues.apache.org/jira/browse/LUCENE-8811
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8811.patch, LUCENE-8811.patch, LUCENE-8811.patch
>
>
> Currently we only check whether boolean queries have too many clauses. 
> However, there are other ways that queries may have too many clauses, for 
> instance if you have boolean queries that themselves have inner boolean 
> queries.
> Could we use the new Query visitor API to move this check from BooleanQuery 
> to IndexSearcher in order to make this check more consistent across queries? 
> See for instance LUCENE-8810 where a rewrite rule caused the maximum clause 
> count to be hit even though the total number of leaf queries remained the 
> same.
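
As a rough illustration of the visitor-based approach described above (a sketch, 
not the attached patch), a counter over leaf queries could enforce the limit 
regardless of how deeply boolean queries are nested; the limit of 1024 is just an 
example:

{code:java}
// Sketch only: count leaf queries via the Query visitor API and fail past a limit.
QueryVisitor counter = new QueryVisitor() {
  int leafCount = 0;

  @Override
  public void visitLeaf(Query leafQuery) {
    if (++leafCount > 1024) {
      throw new IllegalStateException("too many clauses: " + leafCount);
    }
  }

  @Override
  public QueryVisitor getSubVisitor(BooleanClause.Occur occur, Query parent) {
    return this; // descend into nested boolean queries with the same counter
  }
};
query.visit(counter);
{code}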



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8811) Add maximum clause count check to IndexSearcher rather than BooleanQuery

2019-06-06 Thread Atri Sharma (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8811?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858272#comment-16858272
 ] 

Atri Sharma commented on LUCENE-8811:
-

The last patch had an unintentional typo; attached is a fixed patch:

 

[^LUCENE-8811.patch]

> Add maximum clause count check to IndexSearcher rather than BooleanQuery
> 
>
> Key: LUCENE-8811
> URL: https://issues.apache.org/jira/browse/LUCENE-8811
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8811.patch, LUCENE-8811.patch, LUCENE-8811.patch
>
>
> Currently we only check whether boolean queries have too many clauses. 
> However, there are other ways that queries may have too many clauses, for 
> instance if you have boolean queries that themselves have inner boolean 
> queries.
> Could we use the new Query visitor API to move this check from BooleanQuery 
> to IndexSearcher in order to make this check more consistent across queries? 
> See for instance LUCENE-8810 where a rewrite rule caused the maximum clause 
> count to be hit even though the total number of leaf queries remained the 
> same.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13329) Placing exact number of replicas on a set of solr nodes, instead of each solr node.

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858259#comment-16858259
 ] 

ASF subversion and git services commented on SOLR-13329:


Commit b92ae784c7f0aa0208c7463a44a77622d719122e in lucene-solr's branch 
refs/heads/branch_8x from Noble Paul
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=b92ae78 ]

SOLR-13329: ref guide


> Placing exact number of replicas on a set of solr nodes, instead of each solr 
> node.
> ---
>
> Key: SOLR-13329
> URL: https://issues.apache.org/jira/browse/SOLR-13329
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: AutoScaling
>Affects Versions: master (9.0)
>Reporter: Amrit Sarkar
>Priority: Major
>
> Let's say we have a requirement where we would like to place:
> {code}
> exactly X replicas on a set of solr nodes comprising solr-node-1, solr-node-2, 
> ... solr-node-N.
> {code}
> e.g. exactly 1 replica on any of the 3 solr nodes solr-node-1, solr-node-2, 
> solr-node-3, with the rest of the replicas placed on the remaining solr nodes.
> Right now we don't have a straightforward way of doing this. The autoscaling 
> cluster policy also doesn't support such behavior; instead it takes an array of 
> solr node names and treats them as separate rules, as per 
> https://lucene.apache.org/solr/guide/7_7/solrcloud-autoscaling-policy-preferences.html#sysprop-attribute.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13329) Placing exact number of replicas on a set of solr nodes, instead of each solr node.

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858258#comment-16858258
 ] 

ASF subversion and git services commented on SOLR-13329:


Commit 10242afb1b34561b47ffcafbdfddfdae51018291 in lucene-solr's branch 
refs/heads/branch_8x from Noble Paul
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=10242af ]

SOLR-13329: ref guide


> Placing exact number of replicas on a set of solr nodes, instead of each solr 
> node.
> ---
>
> Key: SOLR-13329
> URL: https://issues.apache.org/jira/browse/SOLR-13329
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: AutoScaling
>Affects Versions: master (9.0)
>Reporter: Amrit Sarkar
>Priority: Major
>
> Let's say we have a requirement where we would like to place:
> {code}
> exactly X replicas on a set of solr nodes comprising solr-node-1, solr-node-2, 
> ... solr-node-N.
> {code}
> e.g. exactly 1 replica on any of the 3 solr nodes solr-node-1, solr-node-2, 
> solr-node-3, with the rest of the replicas placed on the remaining solr nodes.
> Right now we don't have a straightforward way of doing this. The autoscaling 
> cluster policy also doesn't support such behavior; instead it takes an array of 
> solr node names and treats them as separate rules, as per 
> https://lucene.apache.org/solr/guide/7_7/solrcloud-autoscaling-policy-preferences.html#sysprop-attribute.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858257#comment-16858257
 ] 

ASF subversion and git services commented on SOLR-13105:


Commit 827c346260a96d49d8233b67e06d7227dfc0cfe0 in lucene-solr's branch 
refs/heads/SOLR-13105-visual from Joel Bernstein
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=827c346 ]

SOLR-13105: More gallery images


> A visual guide to Solr Math Expressions and Streaming Expressions
> -
>
> Key: SOLR-13105
> URL: https://issues.apache.org/jira/browse/SOLR-13105
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot 
> 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, 
> Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 
> AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png
>
>
> Visualization is now a fundamental element of Solr Streaming Expressions and 
> Math Expressions. This ticket will create a visual guide to Solr Math 
> Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* 
> visualization examples.
> It will also cover using the JDBC expression to *analyze* and *visualize* 
> results from any JDBC compliant data source.
> Intro from the guide:
> {code:java}
> Streaming Expressions exposes the capabilities of Solr Cloud as composable 
> functions. These functions provide a system for searching, transforming, 
> analyzing and visualizing data stored in Solr Cloud collections.
> At a high level there are four main capabilities that will be explored in the 
> documentation:
> * Searching, sampling and aggregating results from Solr.
> * Transforming result sets after they are retrieved from Solr.
> * Analyzing and modeling result sets using probability and statistics and 
> machine learning libraries.
> * Visualizing result sets, aggregations and statistical models of the data.
> {code}
>  
> A few sample visualizations are attached to the ticket.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858247#comment-16858247
 ] 

ASF subversion and git services commented on SOLR-13105:


Commit e3eef4ae342e1e65d455eef53a821c502426 in lucene-solr's branch 
refs/heads/SOLR-13105-visual from Joel Bernstein
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=e3eef4a ]

SOLR-13105: More gallery images


> A visual guide to Solr Math Expressions and Streaming Expressions
> -
>
> Key: SOLR-13105
> URL: https://issues.apache.org/jira/browse/SOLR-13105
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot 
> 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, 
> Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 
> AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png
>
>
> Visualization is now a fundamental element of Solr Streaming Expressions and 
> Math Expressions. This ticket will create a visual guide to Solr Math 
> Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* 
> visualization examples.
> It will also cover using the JDBC expression to *analyze* and *visualize* 
> results from any JDBC compliant data source.
> Intro from the guide:
> {code:java}
> Streaming Expressions exposes the capabilities of Solr Cloud as composable 
> functions. These functions provide a system for searching, transforming, 
> analyzing and visualizing data stored in Solr Cloud collections.
> At a high level there are four main capabilities that will be explored in the 
> documentation:
> * Searching, sampling and aggregating results from Solr.
> * Transforming result sets after they are retrieved from Solr.
> * Analyzing and modeling result sets using probability and statistics and 
> machine learning libraries.
> * Visualizing result sets, aggregations and statistical models of the data.
> {code}
>  
> A few sample visualizations are attached to the ticket.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858241#comment-16858241
 ] 

ASF subversion and git services commented on SOLR-13105:


Commit 8b96a83ff89dbb54e1dbd906c4f7661d5aeac370 in lucene-solr's branch 
refs/heads/SOLR-13105-visual from Joel Bernstein
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=8b96a83 ]

SOLR-13105: More gallery images


> A visual guide to Solr Math Expressions and Streaming Expressions
> -
>
> Key: SOLR-13105
> URL: https://issues.apache.org/jira/browse/SOLR-13105
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot 
> 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, 
> Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 
> AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png
>
>
> Visualization is now a fundamental element of Solr Streaming Expressions and 
> Math Expressions. This ticket will create a visual guide to Solr Math 
> Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* 
> visualization examples.
> It will also cover using the JDBC expression to *analyze* and *visualize* 
> results from any JDBC compliant data source.
> Intro from the guide:
> {code:java}
> Streaming Expressions exposes the capabilities of Solr Cloud as composable 
> functions. These functions provide a system for searching, transforming, 
> analyzing and visualizing data stored in Solr Cloud collections.
> At a high level there are four main capabilities that will be explored in the 
> documentation:
> * Searching, sampling and aggregating results from Solr.
> * Transforming result sets after they are retrieved from Solr.
> * Analyzing and modeling result sets using probability and statistics and 
> machine learning libraries.
> * Visualizing result sets, aggregations and statistical models of the data.
> {code}
>  
> A few sample visualizations are attached to the ticket.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: These two tests are failing a LOT over the last few days....

2019-06-06 Thread Kevin Risden
Is it due to the HTTP2 client changes for basic auth?

https://issues.apache.org/jira/browse/SOLR-13510

Kevin Risden


On Thu, Jun 6, 2019 at 7:50 PM Erick Erickson 
wrote:

> [repro]   5/5 failed: org.apache.solr.security.BasicAuthIntegrationTest
> [repro]   5/5 failed: org.apache.solr.security.JWTAuthPluginIntegrationTest
>
> Anyone recognize what recent commits might have triggered these?
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


These two tests are failing a LOT over the last few days....

2019-06-06 Thread Erick Erickson
[repro]   5/5 failed: org.apache.solr.security.BasicAuthIntegrationTest
[repro]   5/5 failed: org.apache.solr.security.JWTAuthPluginIntegrationTest

Anyone recognize what recent commits might have triggered these?

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 378 - Still Failing

2019-06-06 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/378/

All tests passed

Build Log:
[...truncated 64157 lines...]
-ecj-javadoc-lint-tests:
[mkdir] Created dir: /tmp/ecj1005466702
 [ecj-lint] Compiling 48 source files to /tmp/ecj1005466702
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
/x1/jenkins/.ivy2/cache/org.restlet.jee/org.restlet/jars/org.restlet-2.3.0.jar
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
/x1/jenkins/.ivy2/cache/org.restlet.jee/org.restlet.ext.servlet/jars/org.restlet.ext.servlet-2.3.0.jar
 [ecj-lint] --
 [ecj-lint] 1. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 23)
 [ecj-lint] import javax.naming.NamingException;
 [ecj-lint]
 [ecj-lint] The type javax.naming.NamingException is not accessible
 [ecj-lint] --
 [ecj-lint] 2. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 28)
 [ecj-lint] public class MockInitialContextFactory implements 
InitialContextFactory {
 [ecj-lint]  ^
 [ecj-lint] The type MockInitialContextFactory must implement the inherited 
abstract method InitialContextFactory.getInitialContext(Hashtable)
 [ecj-lint] --
 [ecj-lint] 3. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 30)
 [ecj-lint] private final javax.naming.Context context;
 [ecj-lint]   
 [ecj-lint] The type javax.naming.Context is not accessible
 [ecj-lint] --
 [ecj-lint] 4. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 33)
 [ecj-lint] context = mock(javax.naming.Context.class);
 [ecj-lint] ^^^
 [ecj-lint] context cannot be resolved to a variable
 [ecj-lint] --
 [ecj-lint] 5. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 33)
 [ecj-lint] context = mock(javax.naming.Context.class);
 [ecj-lint]
 [ecj-lint] The type javax.naming.Context is not accessible
 [ecj-lint] --
 [ecj-lint] 6. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 36)
 [ecj-lint] when(context.lookup(anyString())).thenAnswer(invocation -> 
objects.get(invocation.getArgument(0)));
 [ecj-lint]  ^^^
 [ecj-lint] context cannot be resolved
 [ecj-lint] --
 [ecj-lint] 7. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 38)
 [ecj-lint] } catch (NamingException e) {
 [ecj-lint]  ^^^
 [ecj-lint] NamingException cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 8. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 45)
 [ecj-lint] public javax.naming.Context getInitialContext(Hashtable env) {
 [ecj-lint]
 [ecj-lint] The type javax.naming.Context is not accessible
 [ecj-lint] --
 [ecj-lint] 9. ERROR in 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java
 (at line 46)
 [ecj-lint] return context;
 [ecj-lint]^^^
 [ecj-lint] context cannot be resolved to a variable
 [ecj-lint] --
 [ecj-lint] 9 problems (9 errors)

BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/build.xml:643:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/build.xml:101:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/build.xml:651:
 The following error occurred while executing this line:
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-master/solr/common-build.xml:479:
 The following error occurred while executing this line:

[JENKINS] Lucene-Solr-8.1-Linux (64bit/jdk1.8.0_201) - Build # 393 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Linux/393/
Java: 64bit/jdk1.8.0_201 -XX:+UseCompressedOops -XX:+UseG1GC

10 tests failed.
FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth

Error Message:
Expected metric minimums for prefix SECURITY./authentication.: 
{failMissingCredentials=2, authenticated=19, passThrough=9, 
failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, 
totalTime=16855063, failWrongCredentials=1, requestTimes=1270, requests=29, 
errors=0}

Stack Trace:
java.lang.AssertionError: Expected metric minimums for prefix 
SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, 
totalTime=16855063, failWrongCredentials=1, requestTimes=1270, requests=29, 
errors=0}
at 
__randomizedtesting.SeedInfo.seed([30DBD8BA19A866B7:8CB5AEA8BDFBE5CD]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
at 
org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 

[jira] [Commented] (LUCENE-8837) smokeTestRelease.py option --download-only

2019-06-06 Thread JIRA


[ 
https://issues.apache.org/jira/browse/LUCENE-8837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858144#comment-16858144
 ] 

Jan Høydahl commented on LUCENE-8837:
-

The Pull Request also does some extracting of reusable functions into 
scriptutil.py. Some of these edits are a preparation for the new 
releaseWizard.py tool that I am writing; I am splitting the edits up into smaller chunks.

> smokeTestRelease.py option --download-only
> --
>
> Key: LUCENE-8837
> URL: https://issues.apache.org/jira/browse/LUCENE-8837
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> Add an option {{--download-only}} to the smoke tester.
> It will do a smoke test "light", only verifying that the artifacts are 
> downloadable and that the SHA sum and PGP signature verification passes. The 
> intended use is for an RM to run with this option after uploading artifacts to 
> the staging repo, since a full smoke-test is already performed locally.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy commented on a change in pull request #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy commented on a change in pull request #702: LUCENE-8837 
smokeTestRelease.py option --download-only
URL: https://github.com/apache/lucene-solr/pull/702#discussion_r291399688
 
 

 ##
 File path: dev-tools/scripts/smokeTestRelease.py
 ##
 @@ -14,35 +16,31 @@
 # limitations under the License.
 
 import argparse
-import os
 
 Review comment:
   Mostly auto-rearranged imports. I could leave them as they were, but this is 
more organized :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy commented on a change in pull request #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy commented on a change in pull request #702: LUCENE-8837 
smokeTestRelease.py option --download-only
URL: https://github.com/apache/lucene-solr/pull/702#discussion_r291398726
 
 

 ##
 File path: dev-tools/scripts/scriptutil.py
 ##
 @@ -66,14 +70,19 @@ def on_or_after(self, other):
(self.bugfix > other.bugfix or self.bugfix == other.bugfix and
self.prerelease >= other.prerelease)))
 
+  def gt(self, other):
+return (self.major > other.major or
+   (self.major == other.major and self.minor > other.minor) or
+   (self.major == other.major and self.minor == other.minor and 
self.bugfix > other.bugfix))
+
   def is_back_compat_with(self, other):
 if not self.on_or_after(other):
   raise Exception('Back compat check disallowed for newer version: %s < 
%s' % (self, other))
 return other.major + 1 >= self.major
 
-def run(cmd):
+def run(cmd, cwd=None):
 
 Review comment:
   Support for cwd in the run command is useful so that the cmd can be relative 
to some folder


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy commented on a change in pull request #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy commented on a change in pull request #702: LUCENE-8837 
smokeTestRelease.py option --download-only
URL: https://github.com/apache/lucene-solr/pull/702#discussion_r291399448
 
 

 ##
 File path: dev-tools/scripts/scriptutil.py
 ##
 @@ -99,6 +108,18 @@ def update_file(filename, line_re, edit):
 f.write(''.join(buffer))
   return True
 
+
+def check_ant():
 
 Review comment:
   Moved from buildAndPushRelease. Now it also returns the actual Ant version. 
It will be used in the releaseWizard script to warn about certain Ant versions.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy commented on a change in pull request #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy commented on a change in pull request #702: LUCENE-8837 
smokeTestRelease.py option --download-only
URL: https://github.com/apache/lucene-solr/pull/702#discussion_r291398578
 
 

 ##
 File path: dev-tools/scripts/scriptutil.py
 ##
 @@ -66,14 +70,19 @@ def on_or_after(self, other):
(self.bugfix > other.bugfix or self.bugfix == other.bugfix and
self.prerelease >= other.prerelease)))
 
+  def gt(self, other):
 
 Review comment:
   This is used to check if a given version is greater than the current one. It will 
be used in the releaseWizard script, to be committed later.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy commented on issue #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy commented on issue #702: LUCENE-8837 smokeTestRelease.py option 
--download-only
URL: https://github.com/apache/lucene-solr/pull/702#issuecomment-499694001
 
 
   I have tested these changes during the 7.7.2 release


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy opened a new pull request #702: LUCENE-8837 smokeTestRelease.py option --download-only

2019-06-06 Thread GitBox
janhoy opened a new pull request #702: LUCENE-8837 smokeTestRelease.py option 
--download-only
URL: https://github.com/apache/lucene-solr/pull/702
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8837) smokeTestRelease.py option --download-only

2019-06-06 Thread JIRA
Jan Høydahl created LUCENE-8837:
---

 Summary: smokeTestRelease.py option --download-only
 Key: LUCENE-8837
 URL: https://issues.apache.org/jira/browse/LUCENE-8837
 Project: Lucene - Core
  Issue Type: Improvement
  Components: general/tools
Reporter: Jan Høydahl
Assignee: Jan Høydahl


Add an option {{--download-only}} to the smoke tester.

It will do a smoke test "light", only verifying that the artifacts are 
downloadable and that the SHA sum and PGP signature verification passes. The 
intended use is for an RM to run with this option after uploading artifacts to 
the staging repo, since a full smoke-test is already performed locally.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8833) Allow subclasses of MMapDirecory to preload individual IndexInputs

2019-06-06 Thread Robert Muir (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858126#comment-16858126
 ] 

Robert Muir commented on LUCENE-8833:
-

Another idea is to expose the option completely differently to make it easier 
for the search use-case, maybe such as {{IndexInput.warm()}}. MMapDirectory 
could call {{load()}} on relevant bytebuffers, NIOFSDirectory could do 
whatever, ByteBuffersDirectory could do nothing. Someone could use this in 
their IndexReaderWarmer to efficiently warm their index and reduce user latency.
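
A minimal sketch of that idea, assuming a hypothetical {{warm()}} hook that does 
not exist in Lucene's API today: a memory-mapped implementation could call 
{{MappedByteBuffer.load()}} on its buffers, a file-channel based one could read 
ahead, and heap-backed inputs could leave it as a no-op.

{code:java}
import java.io.IOException;
import org.apache.lucene.store.IndexInput;

// Sketch only: the method name and semantics follow the suggestion above.
public abstract class WarmableIndexInput extends IndexInput {

  protected WarmableIndexInput(String resourceDescription) {
    super(resourceDescription);
  }

  /**
   * Best-effort hint that this input is about to be read; may block while
   * paging data in. Implementations that cannot do anything useful simply
   * keep the default no-op.
   */
  public void warm() throws IOException {
    // default: do nothing
  }
}
{code}

An IndexReaderWarmer could then call such a hook on a newly merged segment's 
inputs before the reader starts serving queries, which matches the use-case above.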

> Allow subclasses of MMapDirecory to preload individual IndexInputs
> --
>
> Key: LUCENE-8833
> URL: https://issues.apache.org/jira/browse/LUCENE-8833
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Simon Willnauer
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> I think it's useful for subclasses to select the preload flag on a per-index-input 
> basis rather than all or nothing. Here is a patch that has an 
> overloaded protected openInput method. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8833) Allow subclasses of MMapDirecory to preload individual IndexInputs

2019-06-06 Thread Robert Muir (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858120#comment-16858120
 ] 

Robert Muir commented on LUCENE-8833:
-

I'm just curious about more details. For the merge use-case, it makes sense to 
hint the operating system to do some read-ahead, since the bits will be accessed 
sequentially. But this flag does a lot more than that: it will touch every page, 
too. It's too bad java has no other way to hit up madvise. :)

And isn't it the case that the IndexInput will already be open by IndexWriter? So 
I've always been confused about the IOContext ctor for that reason. It almost 
seems like {{clone(IOContext)}} would be more useful; we could practically do 
something there.

Anyway, just some concerns about exposing too much of this flag. Even if you can 
choose it based on arbitrarily complex logic, it would still be an 
all-or-nothing "hammer" because of how java limits us as far as telling the OS 
our intentions. The current IOContext is not really utilized much, because I 
think the problem is hard.

> Allow subclasses of MMapDirecory to preload individual IndexInputs
> --
>
> Key: LUCENE-8833
> URL: https://issues.apache.org/jira/browse/LUCENE-8833
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Simon Willnauer
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> I think it's useful for subclasses to select the preload flag on a per-index-input 
> basis rather than all or nothing. Here is a patch that has an 
> overloaded protected openInput method. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: VOTE: Apache Solr Reference Guide for Solr 8.0

2019-06-06 Thread David Smiley
Yeah, I totally appreciate your effort on the ref guide too Cassandra!

~ David Smiley
Apache Lucene/Solr Search Developer
http://www.linkedin.com/in/davidwsmiley


On Thu, Jun 6, 2019 at 4:00 PM Jan Høydahl  wrote:

> Cool, Cassandra. We really value your effort on the Ref Guide!
>
> --
> Jan Høydahl, search solution architect
> Cominvent AS - www.cominvent.com
>
> 6. jun. 2019 kl. 21:45 skrev Cassandra Targett :
>
> It took me a few days to get back to this - the week got away from me and
> I wanted to give Jan’s suggestions a decent review.
>
> Thanks, Jan, for those changes. It looks like it’s really one sentence
> that’s misleadingly wrong. I hate releasing with known errors like that,
> but if you don’t think it’s that big of a problem, I’ll go ahead with the
> release and fold your changes into 8.1. Maybe this is a good use for that
> otherwise unused Errata page in the PDF.
>
> Thanks everyone who voted, the VOTE has passed and I’ll get started
> releasing it.
>
> Cassandra
> On Jun 3, 2019, 6:16 PM -0500, Jan Høydahl , wrote:
>
> Feel free as RM to release without these changes and fold into 8.1.
> But here is a patch with re-phrasing of the two paragraphs I mentioned, as
> well as backport SOLR-12809, in case of a potential re-spin.
> https://gist.github.com/ed99d0945de112e05e1d1ff2ce6a
>
> --
> Jan Høydahl, search solution architect
> Cominvent AS - www.cominvent.com
>
> 3. jun. 2019 kl. 14:54 skrev Cassandra Targett :
>
> Jan, would you please suggest exact wording to correct the error you
> pointed out (quoted below). Since I clearly don’t understand the change,
> and this is months delayed already, that is likely to be the fastest option to
> get a respin started this week. Otherwise, it will need to wait for me to
> wade back into the topic to see if I can understand what the correct
> wording should be.
>
> Thank you,
> Cassandra
> On May 31, 2019, 3:37 PM -0500, Jan Høydahl ,
> wrote:
>
> Viewed on a smartphone :)
>
> Comments:
>
> Major changes section:
>
> The Basic Authentication plugin now has an option forwardCredentials to
> prevent Basic Auth headers from being sent for inter-node requests.
>
>
> This is wrong. Basic Auth never sent basicauth headers inter-node, and
> adding the “forwardCredentials” option lets user-originated requests be
> forwarded with original basicauth credentials instead of PKI.
>
>
>
>
>
>
>


[jira] [Commented] (LUCENE-8781) Explore FST direct array arc encoding

2019-06-06 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858101#comment-16858101
 ] 

David Smiley commented on LUCENE-8781:
--

RE CHANGES.txt: Essentially you're supposed to add an entry to CHANGES.txt in the 
appropriate section under the lowest version that will ship.  So that's 8.2 for 
this (ignoring 7.x since it won't see a release).  This same change goes into 
both branches' CHANGES.txt in the same spot, like the rest of your code.  Be 
aware there can sometimes be annoying merge issues in CHANGES.txt due to weird 
discrepancies, so be careful you ultimately commit the changes you intend.
{quote}With this change you read "old" indexes and write "new" indexes. It is 
true that once you upgrade and write a "new" index, you can no longer read it 
with "old" code. So e.g. an index written with 8.2.0 could not be read by 
8.1.0, but vice-versa is fine. I think that is backwards-compatible but not 
forwards-compatible.
{quote}
Okay.  [~mikemccand] do you think this ought to be shared in CHANGES.txt under 
upgrading somehow?  Just looking for another opinion and shared eyes on this.
{quote}I did not enable it everywhere since I did not test it everywhere.
{quote}
Nonetheless, you were not skimpy in your testing at all.  Since your tests 
revealed that it seems to help on average, I think that's enough to do it by default 
_(even in the minimal builder constructor)_.  Otherwise it becomes some magic 
voodoo to get better performance that only we here know about and may 
ultimately forget :).  Since you have an intuition that certain cases (Kuromoji) 
are unlikely to benefit, you could expressly disable it for them (with a 
comment).

> Explore FST direct array arc encoding 
> --
>
> Key: LUCENE-8781
> URL: https://issues.apache.org/jira/browse/LUCENE-8781
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Mike Sokolov
>Assignee: Dawid Weiss
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: FST-2-4.png, FST-6-9.png, FST-size.png
>
>  Time Spent: 3h 20m
>  Remaining Estimate: 0h
>
> This issue is for exploring an alternate FST encoding of Arcs as full-sized 
> arrays so Arcs are addressed directly by label, avoiding binary search that 
> we use today for arrays of Arcs. PR: 
> https://github.com/apache/lucene-solr/pull/657
> h3. Testing
> ant test passes. I added some unit tests that were helpful in uncovering bugs 
> while
> implementing which are more difficult to chase down when uncovered by the 
> randomized testing we already do. They don't really test anything new; 
> they're just more focused.
> I'm not sure why, but ant precommit failed for me with:
> {noformat}
>  ...lucene-solr/solr/common-build.xml:536: Check for forbidden API calls 
> failed while scanning class 
> 'org.apache.solr.metrics.reporters.SolrGangliaReporterTest' 
> (SolrGangliaReporterTest.java): java.lang.ClassNotFoundException: 
> info.ganglia.gmetric4j.gmetric.GMetric (while looking up details about 
> referenced class 'info.ganglia.gmetric4j.gmetric.GMetric')
> {noformat}
> I also got Test2BFST running (it was originally timing out due to excessive 
> calls to ramBytesUsage(), which seems to have gotten slow), and it passed; 
> that change isn't included here.
> h4. Micro-benchmark
> I timed lookups in FST via FSTEnum.seekExact in a unit test under various 
> conditions. 
> h5. English words
> A test of looking up existing words in a dictionary of ~17 English words 
> shows improvements; the numbers listed are % change in FST size, time to look 
> up (FSTEnum.seekExact) words that are in the dict, and time to look up random 
> strings that are not in the dict. The comparison is against the current 
> codebase with the optimization disabled. A separate comparison showed no 
> significant change of the baseline (no opto applied) vs the current master 
> FST impl with no code changes applied.
> ||  load=2||   load=4 ||  load=16 ||
> | +4, -6, -7  | +18, -11, -8 | +22, -11.5, -7 |
> The "load factor" used for those measurements controls when direct array arc 
> encoding is used;
> namely when the number of outgoing arcs was > load * (max label - min label).
> h5. sequential and random terms
> The same test, with terms being a sequence of integers as strings shows a 
> larger improvement, around 20% (load=4). This is presumably the best case for 
> this delta, where every Arc is encoded as a direct lookup.
> When random lowercase ASCII strings are used, a smaller improvement of around 
> 4% is seen.
> h4. luceneutil
> Testing w/luceneutil (wikimediumall) we see improvements mostly in the 
> PKLookup case. Other results seem noisy, with perhaps a small improvement in 
> some of the queries.
> {noformat}
> TaskQPS base  

[jira] [Commented] (SOLR-13509) NullPointerException in JSON Facet if omitHeaders=true

2019-06-06 Thread Mikhail Khludnev (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858102#comment-16858102
 ] 

Mikhail Khludnev commented on SOLR-13509:
-

I think your points are all correct. I suppose we don't care about anything 
beyond {{SearchHandler}}. 

> NullPointerException in JSON Facet if omitHeaders=true
> --
>
> Key: SOLR-13509
> URL: https://issues.apache.org/jira/browse/SOLR-13509
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Facet Module
>Affects Versions: 8.1.1
> Environment: Solr 8.1.1 downloaded tar gz, started in cloud mode:
> {code:java}
> bin/solr start -e cloud -noprompt
> bin/solr create -c techproducts -s 2 -rf 2 -d 
> server/solr/configsets/sample_techproducts_configs/conf -n 
> sample_techproducts_configs
> bin/post -c techproducts example/exampledocs/*.xml{code}
>Reporter: Markus Kalkbrenner
>Assignee: Mikhail Khludnev
>Priority: Major
> Attachments: SOLR-13509.patch, SOLR-13509.patch
>
>
> The error exists in Solr 8.1.1 and didn't occur in Solr 8.0 and 7.x.
> Running this simple JSON Facet against the techproducts example (in cloud 
> mode) succeeds as expected:
> {code:java}
> curl http://localhost:8983/solr/techproducts/select -d '
> q=*:*&
> omitHeader=false&
> json.facet={
>   "max_price" : "max(price)"
> }{code}
> But as soon you omit Headers it results in a NullPointerException (which 
> didn't happen in earlier Solr versions):
> {code:java}
> curl http://localhost:8983/solr/techproducts/select -d '
> q=*:*&
> omitHeader=true&
> json.facet={
>   "max_price" : "max(price)"
> }'
> {code}
> Exception:
> {noformat}
> 2019-06-03 12:40:11.446 ERROR (qtp67730604-361) [c:techproducts s:shard2 
> r:core_node7 x:techproducts_shard2_replica_n4] o.a.s.h.RequestHandlerBase 
> java.lang.NullPointerException
>     at 
> org.apache.solr.search.facet.FacetModule.handleResponses(FacetModule.java:284)
>     at 
> org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:423)
>     at 
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
>     at org.apache.solr.core.SolrCore.execute(SolrCore.java:2566)
>     at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:756)
>     at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:542)
>     at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:397)
>     at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:343)
>     at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
>     at 
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
>     at 
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
>     at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
>     at 
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1588)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
>     at 
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
>     at 
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
>     at 
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1557)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
>     at 
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
>     at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
>     at 
> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
>     at 
> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
>     at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>     at 
> org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
>     at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>     at org.eclipse.jetty.server.Server.handle(Server.java:502)
>     at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
>     at 
> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
>     at 
> org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
>     at 

[JENKINS-MAVEN] Lucene-Solr-Maven-8.x #119: POMs out of sync

2019-06-06 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Maven-8.x/119/

No tests ran.

Build Log:
[...truncated 33024 lines...]
  [mvn] [INFO] -
  [mvn] [INFO] -
  [mvn] [ERROR] COMPILATION ERROR : 
  [mvn] [INFO] -

[...truncated 876 lines...]
BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-8.x/build.xml:680: The 
following error occurred while executing this line:
: Java returned: 1

Total time: 17 minutes 44 seconds
Build step 'Invoke Ant' marked build as failure
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Updated] (SOLR-13523) Atomic Update results in NullPointerException

2019-06-06 Thread Kieran Devlin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13523?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kieran Devlin updated SOLR-13523:
-
Component/s: (was: Admin UI)

> Atomic Update results in NullPointerException
> -
>
> Key: SOLR-13523
> URL: https://issues.apache.org/jira/browse/SOLR-13523
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: JSON Request API, update
>Affects Versions: 8.0
> Environment: * Operating system: Win10 v1803 build 17143.766
>  * Java version:
> java 11.0.1 2018-10-16 LTS
> Java(TM) SE Runtime Environment 18.9 (build 11.0.1+13-LTS)
> Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11.0.1+13-LTS, mixed mode)
>  * solr-spec: 8.1.1
>  * solr-impl: 8.1.1 fcbe46c28cef11bc058779afba09521de1b19bef - ab - 
> 2019-05-22 15:20:01
>  * lucene-spec: 8.1.1
>  * lucene-impl: 8.1.1 fcbe46c28cef11bc058779afba09521de1b19bef - ab - 
> 2019-05-22 15:15:24
>Reporter: Kieran Devlin
>Priority: Major
> Attachments: XUBrk.png, Xn1RW.png
>
>
> When I partially update a document via an atomic update, the web server 
> responds with a 500 status and the stack trace:
> {code:java}
> { "responseHeader":{ "status":500, "QTime":1}, "error":{ 
> "trace":"java.lang.NullPointerException\r\n\tat 
> org.apache.solr.update.processor.AtomicUpdateDocumentMerger.getFieldFromHierarchy(AtomicUpdateDocumentMerger.java:301)\r\n\tat
>  
> org.apache.solr.update.processor.AtomicUpdateDocumentMerger.mergeChildDoc(AtomicUpdateDocumentMerger.java:398)\r\n\tat
>  
> org.apache.solr.update.processor.DistributedUpdateProcessor.getUpdatedDocument(DistributedUpdateProcessor.java:697)\r\n\tat
>  
> org.apache.solr.update.processor.DistributedUpdateProcessor.doVersionAdd(DistributedUpdateProcessor.java:372)\r\n\tat
>  
> org.apache.solr.update.processor.DistributedUpdateProcessor.lambda$versionAdd$0(DistributedUpdateProcessor.java:337)\r\n\tat
>  
> org.apache.solr.update.VersionBucket.runWithLock(VersionBucket.java:50)\r\n\tat
>  
> org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:337)\r\n\tat
>  
> org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:223)\r\n\tat
>  
> org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:475)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:75)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
>  
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
>  
> org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:92)\r\n\tat
>  
> org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.handleAdds(JsonLoader.java:507)\r\n\tat
>  
> org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.processUpdate(JsonLoader.java:145)\r\n\tat
>  
> org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.load(JsonLoader.java:121)\r\n\tat
>  

[jira] [Commented] (LUCENE-8781) Explore FST direct array arc encoding

2019-06-06 Thread Mike Sokolov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858079#comment-16858079
 ] 

Mike Sokolov commented on LUCENE-8781:
--

I think I -- did not understand how to edit CHANGES.txt correctly. I can 
address that, sure.

With this change you read "old" indexes and write "new" indexes. It is true 
that once you upgrade and write a "new" index, you can no longer read it with 
"old" code. So e.g. an index written with 8.2.0 could not be read by 8.1.0, but 
vice-versa is fine. I think that is backwards-compatible but not 
forwards-compatible.

I did not enable it everywhere since I did not test it everywhere. There were 
some cases where I saw substantial size increases, but no performance 
improvement; e.g. see AnalyzingSuggester above. But as you say, at the 4x 
setting those did not grow much, so perhaps it would be safe to enable 
unconditionally. I'd like to test e.g. Kuromoji and Nori to see if it helps 
there. I'm a little concerned about those, since the packing of those chars is 
likely to be much sparser than ASCII or even UTF-8 Latin chars. I don't know; 
maybe those character sets are in dense-enough blocks that it will help.
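
To build intuition about that density question, here is a small self-contained 
sketch. It is purely illustrative and assumes arc labels correspond to the code 
units of the indexed terms; it measures how densely a sample of labels fills 
its [min, max] range, the quantity that decides whether a direct array pays off.

{code:java}
// Illustrative sketch only: estimate how densely a sample of arc labels fills
// its label range. A density near 1.0 favors direct array addressing; a sparse
// sample means the array would mostly hold empty slots.
import java.util.TreeSet;

public final class LabelDensitySketch {

  static double density(String sampleLabels) {
    TreeSet<Integer> labels = new TreeSet<>();
    sampleLabels.chars().forEach(labels::add);      // distinct labels, sorted
    int range = labels.last() - labels.first() + 1;
    return (double) labels.size() / range;
  }

  public static void main(String[] args) {
    System.out.println(density("abcdefghij")); // 1.0: contiguous ASCII, ideal case
    System.out.println(density("az09"));       // sparse: a direct array would waste space
  }
}
{code}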

> Explore FST direct array arc encoding 
> --
>
> Key: LUCENE-8781
> URL: https://issues.apache.org/jira/browse/LUCENE-8781
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Mike Sokolov
>Assignee: Dawid Weiss
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: FST-2-4.png, FST-6-9.png, FST-size.png
>
>  Time Spent: 3h 20m
>  Remaining Estimate: 0h
>
> This issue is for exploring an alternate FST encoding of Arcs as full-sized 
> arrays so Arcs are addressed directly by label, avoiding binary search that 
> we use today for arrays of Arcs. PR: 
> https://github.com/apache/lucene-solr/pull/657
> h3. Testing
> ant test passes. I added some unit tests that were helpful in uncovering bugs 
> while
> implementing which are more difficult to chase down when uncovered by the 
> randomized testing we already do. They don't really test anything new; 
> they're just more focused.
> I'm not sure why, but ant precommit failed for me with:
> {noformat}
>  ...lucene-solr/solr/common-build.xml:536: Check for forbidden API calls 
> failed while scanning class 
> 'org.apache.solr.metrics.reporters.SolrGangliaReporterTest' 
> (SolrGangliaReporterTest.java): java.lang.ClassNotFoundException: 
> info.ganglia.gmetric4j.gmetric.GMetric (while looking up details about 
> referenced class 'info.ganglia.gmetric4j.gmetric.GMetric')
> {noformat}
> I also got Test2BFST running (it was originally timing out due to excessive 
> calls to ramBytesUsage(), which seems to have gotten slow), and it passed; 
> that change isn't included here.
> h4. Micro-benchmark
> I timed lookups in FST via FSTEnum.seekExact in a unit test under various 
> conditions. 
> h5. English words
> A test of looking up existing words in a dictionary of ~17 English words 
> shows improvements; the numbers listed are % change in FST size, time to look 
> up (FSTEnum.seekExact) words that are in the dict, and time to look up random 
> strings that are not in the dict. The comparison is against the current 
> codebase with the optimization disabled. A separate comparison showed no 
> significant change of the baseline (no opto applied) vs the current master 
> FST impl with no code changes applied.
> ||  load=2||   load=4 ||  load=16 ||
> | +4, -6, -7  | +18, -11, -8 | +22, -11.5, -7 |
> The "load factor" used for those measurements controls when direct array arc 
> encoding is used;
> namely when the number of outgoing arcs was > load * (max label - min label).
> h5. sequential and random terms
> The same test, with terms being a sequence of integers as strings shows a 
> larger improvement, around 20% (load=4). This is presumably the best case for 
> this delta, where every Arc is encoded as a direct lookup.
> When random lowercase ASCII strings are used, a smaller improvement of around 
> 4% is seen.
> h4. luceneutil
> Testing w/luceneutil (wikimediumall) we see improvements mostly in the 
> PKLookup case. Other results seem noisy, with perhaps a small improvement in 
> some of the queries.
> {noformat}
> TaskQPS base  StdDevQPS opto  StdDev  
>   Pct diff
>   OrHighHigh6.93  (3.0%)6.89  (3.1%)   
> -0.5% (  -6% -5%)
>OrHighMed   45.15  (3.9%)   44.92  (3.5%)   
> -0.5% (  -7% -7%)
> Wildcard8.72  (4.7%)8.69  (4.6%)   
> -0.4% (  -9% -9%)
>   AndHighLow  274.11  (2.6%)  273.58  (3.1%)   
> -0.2% (  -5% -5%)
>OrHighLow  241.41 

[jira] [Commented] (SOLR-13496) NullPointerException in JSONWriter.writeSolrDocument

2019-06-06 Thread Lucene/Solr QA (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858074#comment-16858074
 ] 

Lucene/Solr QA commented on SOLR-13496:
---

| (/) *{color:green}+1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m 
 0s{color} | {color:green} The patch appears to include 1 new or modified test 
files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  2m  
2s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
56s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
56s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | 
{color:green}  1m 56s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | 
{color:green}  1m 56s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | 
{color:green}  1m 56s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 52m  
0s{color} | {color:green} core in the patch passed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 58m 54s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Issue | SOLR-13496 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12971045/SOLR-13496.patch |
| Optional Tests |  compile  javac  unit  ratsources  checkforbiddenapis  
validatesourcepatterns  |
| uname | Linux lucene1-us-west 4.4.0-137-generic #163~14.04.1-Ubuntu SMP Mon 
Sep 24 17:14:57 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | 
/home/jenkins/jenkins-slave/workspace/PreCommit-SOLR-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh
 |
| git revision | master / df1775f |
| ant | version: Apache Ant(TM) version 1.9.3 compiled on July 24 2018 |
| Default Java | LTS |
|  Test Results | 
https://builds.apache.org/job/PreCommit-SOLR-Build/409/testReport/ |
| modules | C: solr/core U: solr/core |
| Console output | 
https://builds.apache.org/job/PreCommit-SOLR-Build/409/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> NullPointerException in JSONWriter.writeSolrDocument
> 
>
> Key: SOLR-13496
> URL: https://issues.apache.org/jira/browse/SOLR-13496
> Project: Solr
>  Issue Type: Bug
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-13496.patch, SOLR-13496.patch, SOLR-13496.patch
>
>
> For non-grouped searches 
> [QueryComponent.regularFinishStage|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java#L647-L655]
>  already considers the possibility of null {{SolrDocument}} values due to an 
> index change.
> For grouped searches 
> [GroupedEndResultTransformer.transform|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/search/grouping/endresulttransformer/GroupedEndResultTransformer.java#L94-L114]
>  potentially adds a null element to a {{SolrDocumentList}}.
> The 
> [TextResponseWriter.writeSolrDocumentList|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/TextResponseWriter.java#L170]
>  method passes any null {{SolrDocument}} through to the 
> [JSONWriter.writeSolrDocument|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/JSONWriter.java#L87]
>  method leading to a {{NullPointerException}} at line 87.
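
To make the failure mode concrete, here is a minimal, self-contained sketch of 
the kind of null guard being discussed. It is not the actual Solr patch; the 
types are simplified stand-ins for SolrDocumentList and the response writer.

{code:java}
// Illustrative sketch only: when serializing a list of documents, an entry can be
// null if the index changed between collecting ids and fetching stored fields, so
// the writer must skip (or placeholder) null entries instead of dereferencing them.
import java.util.Arrays;
import java.util.List;
import java.util.Map;

public final class NullSafeDocListWriterSketch {

  static String writeDocList(List<Map<String, Object>> docs) {
    StringBuilder out = new StringBuilder("[");
    boolean first = true;
    for (Map<String, Object> doc : docs) {
      if (doc == null) {
        continue; // document disappeared mid-request; do not NPE on it
      }
      if (!first) {
        out.append(",");
      }
      out.append(doc);
      first = false;
    }
    return out.append("]").toString();
  }

  public static void main(String[] args) {
    List<Map<String, Object>> docs = Arrays.asList(Map.of("id", "1"), null, Map.of("id", "2"));
    System.out.println(writeDocList(docs)); // [{id=1},{id=2}]
  }
}
{code}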



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13481) Re-try of solr request will not happen with different live servers, if one request throws Exception

2019-06-06 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13481?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858071#comment-16858071
 ] 

David Smiley commented on SOLR-13481:
-

I don't recommend pasting lots of code into the description of a Jira issue. 
 Instead I recommend using a durable link to a particular line of the code, 
like at GitHub.  Can you fix it accordingly, please?

Also, comment on how this problem surfaced for you, or perhaps how to reproduce 
it... or is this more theoretical?

(p.s. I'm not volunteering to fix the issue but wanted to help you report the 
issue better)

> Re-try of solr request will not happen with different live servers, if one 
> request throws Exception
> ---
>
> Key: SOLR-13481
> URL: https://issues.apache.org/jira/browse/SOLR-13481
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: clients - java
>Affects Versions: 7.6
>Reporter: Rajeswari Natarajan
>Priority: Major
>
> LBHttpSolrClient.java needs to be fixed: if the doRequest method (called by 
> the request method below) throws an exception, the for loop will get 
> terminated and the request will fail.
>  
>  public Rsp request(Req req) throws SolrServerException, IOException {
>    Rsp rsp = new Rsp();
>    Exception ex = null;
>    boolean isNonRetryable = req.request instanceof IsUpdateRequest ||
>        ADMIN_PATHS.contains(req.request.getPath());
>    List skipped = null;
>
>    final Integer numServersToTry = req.getNumServersToTry();
>    int numServersTried = 0;
>
>    boolean timeAllowedExceeded = false;
>    long timeAllowedNano = getTimeAllowedInNanos(req.getRequest());
>    long timeOutTime = System.nanoTime() + timeAllowedNano;
>    for (String serverStr : req.getServers()) {
>      if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
>        break;
>      }
>
>      serverStr = normalize(serverStr);
>      // if the server is currently a zombie, just skip to the next one
>      ServerWrapper wrapper = zombieServers.get(serverStr);
>      if (wrapper != null) {
>        // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr);
>        final int numDeadServersToTry = req.getNumDeadServersToTry();
>        if (numDeadServersToTry > 0) {
>          if (skipped == null) {
>            skipped = new ArrayList<>(numDeadServersToTry);
>            skipped.add(wrapper);
>          } else if (skipped.size() < numDeadServersToTry) {
>            skipped.add(wrapper);
>          }
>        }
>        continue;
>      }
>      try {
>        MDC.put("LBHttpSolrClient.url", serverStr);
>
>        if (numServersToTry != null
>            && numServersTried > numServersToTry.intValue()) {
>          break;
>        }
>
>        HttpSolrClient client = makeSolrClient(serverStr);
>
>        ++numServersTried;
>        ex = doRequest(client, req, rsp, isNonRetryable, false, null);
>        if (ex == null) {
>          return rsp; // SUCCESS
>        }
>      } finally {
>        MDC.remove("LBHttpSolrClient.url");
>      }
>    }
>
>    // try the servers we previously skipped
>    if (skipped != null) {
>      for (ServerWrapper wrapper : skipped) {
>        if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
>          break;
>        }
>
>        if (numServersToTry != null
>            && numServersTried > numServersToTry.intValue()) {
>          break;
>        }
>
>        try {
>          MDC.put("LBHttpSolrClient.url", wrapper.client.getBaseURL());
>          ++numServersTried;
>          ex = doRequest(wrapper.client, req, rsp, isNonRetryable, true,
>              wrapper.getKey());
>          if (ex == null) {
>            return rsp; // SUCCESS
>          }
>        } finally {
>          MDC.remove("LBHttpSolrClient.url");
>        }
>      }
>    }
>
>    final String solrServerExceptionMessage;
>    if (timeAllowedExceeded) {
>      solrServerExceptionMessage = "Time allowed to handle this request exceeded";
>    } else {
>      if (numServersToTry != null
>          && numServersTried > numServersToTry.intValue()) {
>        solrServerExceptionMessage = "No live SolrServers available to handle this request:"
>            + " numServersTried=" + numServersTried
>            + " numServersToTry=" + numServersToTry.intValue();
>      } else {
>        solrServerExceptionMessage = "No live SolrServers available to handle this request";
>      }
>    }
>    if (ex == null) {
>      throw new SolrServerException(solrServerExceptionMessage);
>    } else {
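
For illustration only, here is a self-contained sketch of the retry shape the 
reporter appears to be asking for: each per-server attempt catches its own 
failure and the loop moves on to the next live server instead of letting the 
exception escape. The names are made up and this does not touch the real 
LBHttpSolrClient internals.

{code:java}
// Illustrative sketch only: retry a request across a list of servers, remembering
// the last failure and only giving up after every server has been tried.
import java.util.List;
import java.util.function.Function;

public final class RetryOverServersSketch {

  static <T> T requestWithRetry(List<String> servers, Function<String, T> attempt) {
    Exception last = null;
    for (String server : servers) {
      try {
        return attempt.apply(server);   // success: stop trying further servers
      } catch (RuntimeException e) {
        last = e;                       // remember the failure, fall through to next server
      }
    }
    throw new IllegalStateException("No live servers available to handle this request", last);
  }

  public static void main(String[] args) {
    String rsp = requestWithRetry(List.of("http://s1", "http://s2"), server -> {
      if (server.endsWith("s1")) {
        throw new RuntimeException(server + " is down");
      }
      return "ok from " + server;
    });
    System.out.println(rsp); // ok from http://s2
  }
}
{code}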

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-11.0.2) - Build # 5184 - Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5184/
Java: 64bit/jdk-11.0.2 -XX:+UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  
org.apache.solr.cloud.TestSolrCloudWithDelegationTokens.testDelegationTokenRenew

Error Message:
expected:<200> but was:<403>

Stack Trace:
java.lang.AssertionError: expected:<200> but was:<403>
at 
__randomizedtesting.SeedInfo.seed([206944C93AB3B619:17F2B0D7027F6BBD]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at 
org.apache.solr.cloud.TestSolrCloudWithDelegationTokens.renewDelegationToken(TestSolrCloudWithDelegationTokens.java:131)
at 
org.apache.solr.cloud.TestSolrCloudWithDelegationTokens.verifyDelegationTokenRenew(TestSolrCloudWithDelegationTokens.java:316)
at 
org.apache.solr.cloud.TestSolrCloudWithDelegationTokens.testDelegationTokenRenew(TestSolrCloudWithDelegationTokens.java:334)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 

[jira] [Commented] (LUCENE-8781) Explore FST direct array arc encoding

2019-06-06 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858045#comment-16858045
 ] 

David Smiley commented on LUCENE-8781:
--

[~sokolov] I'm confused about some things:
* You've committed this to master & branch_8x.  But only the master branch's 
CHANGES.txt says it has this improvement, and it says so as a 9.0 only thing.  
Shouldn't this be added to the 8.2.0 section on both branches?  
* is this a backwards incompatible change to a written index (any place an FST 
might be serialized)?
* why would an FST user _not_ want this setting?  I noticed you disabled this 
for a few uses of FSTs and I don't know the rhyme/reason.  Even with the 
default 4x load factor, the FST is barely any larger.

> Explore FST direct array arc encoding 
> --
>
> Key: LUCENE-8781
> URL: https://issues.apache.org/jira/browse/LUCENE-8781
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Mike Sokolov
>Assignee: Dawid Weiss
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: FST-2-4.png, FST-6-9.png, FST-size.png
>
>  Time Spent: 3h 20m
>  Remaining Estimate: 0h
>
> This issue is for exploring an alternate FST encoding of Arcs as full-sized 
> arrays so Arcs are addressed directly by label, avoiding binary search that 
> we use today for arrays of Arcs. PR: 
> https://github.com/apache/lucene-solr/pull/657
> h3. Testing
> ant test passes. I added some unit tests that were helpful in uncovering bugs 
> while
> implementing which are more difficult to chase down when uncovered by the 
> randomized testing we already do. They don't really test anything new; 
> they're just more focused.
> I'm not sure why, but ant precommit failed for me with:
> {noformat}
>  ...lucene-solr/solr/common-build.xml:536: Check for forbidden API calls 
> failed while scanning class 
> 'org.apache.solr.metrics.reporters.SolrGangliaReporterTest' 
> (SolrGangliaReporterTest.java): java.lang.ClassNotFoundException: 
> info.ganglia.gmetric4j.gmetric.GMetric (while looking up details about 
> referenced class 'info.ganglia.gmetric4j.gmetric.GMetric')
> {noformat}
> I also got Test2BFST running (it was originally timing out due to excessive 
> calls to ramBytesUsage(), which seems to have gotten slow), and it passed; 
> that change isn't included here.
> h4. Micro-benchmark
> I timed lookups in FST via FSTEnum.seekExact in a unit test under various 
> conditions. 
> h5. English words
> A test of looking up existing words in a dictionary of ~17 English words 
> shows improvements; the numbers listed are % change in FST size, time to look 
> up (FSTEnum.seekExact) words that are in the dict, and time to look up random 
> strings that are not in the dict. The comparison is against the current 
> codebase with the optimization disabled. A separate comparison showed no 
> significant change of the baseline (no opto applied) vs the current master 
> FST impl with no code changes applied.
> ||  load=2||   load=4 ||  load=16 ||
> | +4, -6, -7  | +18, -11, -8 | +22, -11.5, -7 |
> The "load factor" used for those measurements controls when direct array arc 
> encoding is used;
> namely when the number of outgoing arcs was > load * (max label - min label).
> h5. sequential and random terms
> The same test, with terms being a sequence of integers as strings shows a 
> larger improvement, around 20% (load=4). This is presumably the best case for 
> this delta, where every Arc is encoded as a direct lookup.
> When random lowercase ASCII strings are used, a smaller improvement of around 
> 4% is seen.
> h4. luceneutil
> Testing w/luceneutil (wikimediumall) we see improvements mostly in the 
> PKLookup case. Other results seem noisy, with perhaps a small improvement in 
> some of the queries.
> {noformat}
> TaskQPS base  StdDevQPS opto  StdDev  
>   Pct diff
>   OrHighHigh6.93  (3.0%)6.89  (3.1%)   
> -0.5% (  -6% -5%)
>OrHighMed   45.15  (3.9%)   44.92  (3.5%)   
> -0.5% (  -7% -7%)
> Wildcard8.72  (4.7%)8.69  (4.6%)   
> -0.4% (  -9% -9%)
>   AndHighLow  274.11  (2.6%)  273.58  (3.1%)   
> -0.2% (  -5% -5%)
>OrHighLow  241.41  (1.9%)  241.11  (3.5%)   
> -0.1% (  -5% -5%)
>   AndHighMed   52.23  (4.1%)   52.41  (5.3%)
> 0.3% (  -8% -   10%)
>  MedTerm 1026.24  (3.1%) 1030.52  (4.3%)
> 0.4% (  -6% -8%)
> HighTerm .10  (3.4%) 1116.70  (4.0%)
> 0.5% (  -6% -8%)
>HighTermDayOfYearSort   14.59  (8.2%)   14.73  (9.3%)

Re: VOTE: Apache Solr Reference Guide for Solr 8.0

2019-06-06 Thread Jan Høydahl
Cool, Cassandra. We really value your effort on the Ref Guide!

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

> 6. jun. 2019 kl. 21:45 skrev Cassandra Targett :
> 
> It took me a few days to get back to this - the week got away from me and I 
> wanted to give Jan’s suggestions a decent review. 
> 
> Thanks, Jan, for those changes. It looks like it’s really one sentence that’s 
> misleadingly wrong. I hate releasing with known errors like that, but if you 
> don’t think it’s that big of a problem, I’ll go ahead with the release and 
> fold your changes into 8.1. Maybe this is a good use for that otherwise 
> unused Errata page in the PDF.
> 
> Thanks everyone who voted, the VOTE has passed and I’ll get started releasing 
> it.
> 
> Cassandra
> On Jun 3, 2019, 6:16 PM -0500, Jan Høydahl , wrote:
>> Feel free as RM to release without these changes and fold into 8.1.
>> But here is a patch with re-phrasing of the two paragraphs I mentioned, as 
>> well as backport SOLR-12809, in case of a potential re-spin.
>> https://gist.github.com/ed99d0945de112e05e1d1ff2ce6a 
>>  
>> 
>> --
>> Jan Høydahl, search solution architect
>> Cominvent AS - www.cominvent.com 
>> 
>>> 3. jun. 2019 kl. 14:54 skrev Cassandra Targett >> >:
>>> 
>>> Jan, would you please suggest exact wording to correct the error you 
>>> pointed out (quoted below). Since I clearly don’t understand the change, 
>>> and this is months delayed already, that is likely be the fastest option to 
>>> get a respin started this week. Otherwise, it will need to wait for me to 
>>> wade back into the topic to see if I can understand what the correct 
>>> wording should be.
>>> 
>>> Thank you,
>>> Cassandra 
>>> On May 31, 2019, 3:37 PM -0500, Jan Høydahl >> >, wrote:
 Viewed on a smartphone :)
 
 Comments:
 
 Major changes section:
 
> The Basic Authentication plugin now has an option forwardCredentials to 
> prevent Basic Auth headers from being sent for inter-node requests.
 
 
 This is wrong. Basic Auth never sent basicauth headers inter-node, and 
 adding the “forwardcredentials” option lets user-originated requests be 
 forwarded with original basicauth credentials instead of PKI.
>  
>>> 
>> 



[GitHub] [lucene-solr] janhoy commented on issue #635: SOLR-13371 improve security chapters in refguide

2019-06-06 Thread GitBox
janhoy commented on issue #635: SOLR-13371 improve security chapters in refguide
URL: https://github.com/apache/lucene-solr/pull/635#issuecomment-499643600
 
 
   I'm prepared to merge this in time for the 8.1 guide. Any last comments 
before I merge?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1862 - Still Unstable

2019-06-06 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1862/

1 tests failed.
FAILED:  org.apache.solr.cloud.OverseerTest.testShardLeaderChange

Error Message:
Captured an uncaught exception in thread: Thread[id=38366, 
name=OverseerCollectionConfigSetProcessor-74821756039987202-127.0.0.1:8_solr-n_00,
 state=RUNNABLE, group=Overseer collection creation process.]

Stack Trace:
com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught 
exception in thread: Thread[id=38366, 
name=OverseerCollectionConfigSetProcessor-74821756039987202-127.0.0.1:8_solr-n_00,
 state=RUNNABLE, group=Overseer collection creation process.]
at 
__randomizedtesting.SeedInfo.seed([915A90A2F3580CA:D7462EFD35AD753B]:0)
Caused by: org.apache.solr.common.AlreadyClosedException
at __randomizedtesting.SeedInfo.seed([915A90A2F3580CA]:0)
at 
org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:69)
at 
org.apache.solr.common.cloud.SolrZkClient.getData(SolrZkClient.java:337)
at 
org.apache.solr.cloud.OverseerTaskProcessor.amILeader(OverseerTaskProcessor.java:425)
at 
org.apache.solr.cloud.OverseerTaskProcessor.run(OverseerTaskProcessor.java:156)
at java.base/java.lang.Thread.run(Thread.java:834)




Build Log:
[...truncated 14475 lines...]
   [junit4] Suite: org.apache.solr.cloud.OverseerTest
   [junit4]   2> Creating dataDir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/checkout/solr/build/solr-core/test/J2/temp/solr.cloud.OverseerTest_915A90A2F3580CA-001/init-core-data-001
   [junit4]   2> 4332495 WARN  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
startTrackingSearchers: numOpens=148 numCloses=148
   [junit4]   2> 4332495 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 4332532 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
Randomized ssl (false) and clientAuth (false) via: 
@org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, 
clientAuth=0.0/0.0)
   [junit4]   2> 4332580 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
SecureRandom sanity checks: test.solr.allowed.securerandom=null & 
java.security.egd=file:/dev/./urandom
   [junit4]   2> 4332581 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.c.ZkTestServer 
STARTING ZK TEST SERVER
   [junit4]   2> 4332637 INFO  (ZkTestServer Run Thread) [ ] 
o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 4332637 INFO  (ZkTestServer Run Thread) [ ] 
o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 4332752 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.c.ZkTestServer 
start zk server on port:8
   [junit4]   2> 4332752 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.c.ZkTestServer 
parse host and port list: 127.0.0.1:8
   [junit4]   2> 4332752 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.c.ZkTestServer 
connecting to 127.0.0.1 8
   [junit4]   2> 4332780 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 4332800 INFO  (zkConnectionManagerCallback-8841-thread-1) [
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 4332800 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 4332802 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 4332803 INFO  (zkConnectionManagerCallback-8843-thread-1) [
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 4332803 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 4332811 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
initCore
   [junit4]   2> 4332811 INFO  
(SUITE-OverseerTest-seed#[915A90A2F3580CA]-worker) [ ] o.a.s.SolrTestCaseJ4 
initCore end
   [junit4]   2> 4332847 INFO  
(TEST-OverseerTest.testShardLeaderChange-seed#[915A90A2F3580CA]) [ ] 
o.a.s.SolrTestCaseJ4 ###Starting testShardLeaderChange
   [junit4]   2> 4333261 INFO  (Thread-16766) [ ] 
o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 4333280 INFO  (zkConnectionManagerCallback-8847-thread-1) [
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 4333280 INFO  (Thread-16766) [ ] 
o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 4333283 WARN  

[jira] [Resolved] (LUCENE-8821) Please delete old releases from mirroring system

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl resolved LUCENE-8821.
-
Resolution: Fixed

> Please delete old releases from mirroring system
> 
>
> Key: LUCENE-8821
> URL: https://issues.apache.org/jira/browse/LUCENE-8821
> Project: Lucene - Core
>  Issue Type: Task
>  Components: general/website
>Reporter: Sebb
>Assignee: Jan Høydahl
>Priority: Major
>
> To reduce the load on the ASF mirrors, projects are required to delete old 
> releases [1]
> Please can you remove all non-current releases?
> i.e. all but 8.1.1
> It's unfair to expect the 3rd party mirrors to carry old releases.
> However you can still link to the archives for historic releases.
> Please also update your release procedures (if relevant)
> Thanks!
> [1] [http://www.apache.org/dev/release.html#when-to-archive]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (LUCENE-8802) buildAndPushRelease --logfile arg

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl resolved LUCENE-8802.
-
Resolution: Fixed

> buildAndPushRelease --logfile arg
> -
>
> Key: LUCENE-8802
> URL: https://issues.apache.org/jira/browse/LUCENE-8802
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: master (9.0), 8.2, 8.1.2
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Add possibility for custom log file destination
> Also fixes a missing import causing wrong error msg at timeout exception
> {code:java}
> from subprocess import TimeoutExpired
> {code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8802) buildAndPushRelease --logfile arg

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8802?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858037#comment-16858037
 ] 

ASF subversion and git services commented on LUCENE-8802:
-

Commit 02d9534c4a7de270746f2998bc3880696f09e8a0 in lucene-solr's branch 
refs/heads/branch_8_1 from Jan Høydahl
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=02d9534 ]

LUCENE-8802: buildAndPushRelease --logfile arg (#679)

(cherry picked from commit df1775ffd3517c23ace582384c0554f4f521f6e0)


> buildAndPushRelease --logfile arg
> -
>
> Key: LUCENE-8802
> URL: https://issues.apache.org/jira/browse/LUCENE-8802
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Add possibility for custom log file destination
> Also fixes a missing import causing wrong error msg at timeout exception
> {code:java}
> from subprocess import TimeoutExpired
> {code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8802) buildAndPushRelease --logfile arg

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl updated LUCENE-8802:

Fix Version/s: 8.1.2
   8.2
   master (9.0)

> buildAndPushRelease --logfile arg
> -
>
> Key: LUCENE-8802
> URL: https://issues.apache.org/jira/browse/LUCENE-8802
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: master (9.0), 8.2, 8.1.2
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Add possibility for custom log file destination
> Also fixes a missing import causing wrong error msg at timeout exception
> {code:java}
> from subprocess import TimeoutExpired
> {code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8802) buildAndPushRelease --logfile arg

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8802?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858036#comment-16858036
 ] 

ASF subversion and git services commented on LUCENE-8802:
-

Commit 5ee558cc5d052b585a5b67cf9a526a4aec854669 in lucene-solr's branch 
refs/heads/branch_8x from Jan Høydahl
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=5ee558c ]

LUCENE-8802: buildAndPushRelease --logfile arg (#679)

(cherry picked from commit df1775ffd3517c23ace582384c0554f4f521f6e0)


> buildAndPushRelease --logfile arg
> -
>
> Key: LUCENE-8802
> URL: https://issues.apache.org/jira/browse/LUCENE-8802
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Add possibility for custom log file destination
> Also fixes a missing import causing wrong error msg at timeout exception
> {code:java}
> from subprocess import TimeoutExpired
> {code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] janhoy merged pull request #679: LUCENE-8802: buildAndPushRelease --logfile arg

2019-06-06 Thread GitBox
janhoy merged pull request #679: LUCENE-8802: buildAndPushRelease --logfile arg
URL: https://github.com/apache/lucene-solr/pull/679
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8802) buildAndPushRelease --logfile arg

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8802?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858034#comment-16858034
 ] 

ASF subversion and git services commented on LUCENE-8802:
-

Commit df1775ffd3517c23ace582384c0554f4f521f6e0 in lucene-solr's branch 
refs/heads/master from Jan Høydahl
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=df1775f ]

LUCENE-8802: buildAndPushRelease --logfile arg (#679)



> buildAndPushRelease --logfile arg
> -
>
> Key: LUCENE-8802
> URL: https://issues.apache.org/jira/browse/LUCENE-8802
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Add possibility for custom log file destination
> Also fixes a missing import causing wrong error msg at timeout exception
> {code:java}
> from subprocess import TimeoutExpired
> {code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (LUCENE-8818) smokeTestRelease.py encoding bug

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl resolved LUCENE-8818.
-
Resolution: Fixed

> smokeTestRelease.py encoding bug
> 
>
> Key: LUCENE-8818
> URL: https://issues.apache.org/jira/browse/LUCENE-8818
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Minor
> Fix For: master (9.0), 8.2
>
> Attachments: LUCENE-8818.patch
>
>
> Smoke tester may crash while parsing log file created from gpg stdout, but 
> only in certain conditions. Error trace is
> {code:java}
> Test Lucene...
>  test basics...
>  check changes HTML...
>  download lucene-7.7.2-src.tgz...
>    43.5 MB in 123.23 sec (0.4 MB/sec)
>    verify sha512 digest
>    verify sig
> Traceback (most recent call last):
>  File "dev-tools/scripts/smokeTestRelease.py", line 1518, in 
>    main()
>  File "dev-tools/scripts/smokeTestRelease.py", line 1448, in main
>    smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, c.is_signed, 
> c.local_keys, ' '.join(c.test_args))
>  File "dev-tools/scripts/smokeTestRelease.py", line 1497, in smokeTest
>    checkSigs('lucene', lucenePath, version, tmpDir, isSigned, keysFile)
>  File "dev-tools/scripts/smokeTestRelease.py", line 379, in checkSigs
>    for line in f.readlines():
>  File "/Users/ab/anaconda3/lib/python3.7/codecs.py", line 322, in decode
>    (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 180: 
> invalid start byte{code}
> The failing line is 
> https://github.com/apache/lucene-solr/blob/faaee86efb01fa6e431fcb129cfb956c7d62d514/dev-tools/scripts/smokeTestRelease.py#L378
>  
> Found by [~ab]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8818) smokeTestRelease.py encoding bug

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8818?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858032#comment-16858032
 ] 

ASF subversion and git services commented on LUCENE-8818:
-

Commit 8a488478f093e99a7168f30b56392ddef080b276 in lucene-solr's branch 
refs/heads/branch_8x from Jan Høydahl
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=8a48847 ]

LUCENE-8818: Fix smokeTestRelease.py encoding bug

(cherry picked from commit 8d6fd7298fe480a627db333eafeadc7b6a8fdcad)


> smokeTestRelease.py encoding bug
> 
>
> Key: LUCENE-8818
> URL: https://issues.apache.org/jira/browse/LUCENE-8818
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Minor
> Fix For: master (9.0), 8.2
>
> Attachments: LUCENE-8818.patch
>
>
> Smoke tester may crash while parsing log file created from gpg stdout, but 
> only in certain conditions. Error trace is
> {code:java}
> Test Lucene...
>  test basics...
>  check changes HTML...
>  download lucene-7.7.2-src.tgz...
>    43.5 MB in 123.23 sec (0.4 MB/sec)
>    verify sha512 digest
>    verify sig
> Traceback (most recent call last):
>  File "dev-tools/scripts/smokeTestRelease.py", line 1518, in 
>    main()
>  File "dev-tools/scripts/smokeTestRelease.py", line 1448, in main
>    smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, c.is_signed, 
> c.local_keys, ' '.join(c.test_args))
>  File "dev-tools/scripts/smokeTestRelease.py", line 1497, in smokeTest
>    checkSigs('lucene', lucenePath, version, tmpDir, isSigned, keysFile)
>  File "dev-tools/scripts/smokeTestRelease.py", line 379, in checkSigs
>    for line in f.readlines():
>  File "/Users/ab/anaconda3/lib/python3.7/codecs.py", line 322, in decode
>    (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 180: 
> invalid start byte{code}
> The failing line is 
> https://github.com/apache/lucene-solr/blob/faaee86efb01fa6e431fcb129cfb956c7d62d514/dev-tools/scripts/smokeTestRelease.py#L378
>  
> Found by [~ab]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8818) smokeTestRelease.py encoding bug

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl updated LUCENE-8818:

Fix Version/s: 8.2
   master (9.0)

> smokeTestRelease.py encoding bug
> 
>
> Key: LUCENE-8818
> URL: https://issues.apache.org/jira/browse/LUCENE-8818
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Minor
> Fix For: master (9.0), 8.2
>
> Attachments: LUCENE-8818.patch
>
>
> Smoke tester may crash while parsing log file created from gpg stdout, but 
> only in certain conditions. Error trace is
> {code:java}
> Test Lucene...
>  test basics...
>  check changes HTML...
>  download lucene-7.7.2-src.tgz...
>    43.5 MB in 123.23 sec (0.4 MB/sec)
>    verify sha512 digest
>    verify sig
> Traceback (most recent call last):
>  File "dev-tools/scripts/smokeTestRelease.py", line 1518, in 
>    main()
>  File "dev-tools/scripts/smokeTestRelease.py", line 1448, in main
>    smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, c.is_signed, 
> c.local_keys, ' '.join(c.test_args))
>  File "dev-tools/scripts/smokeTestRelease.py", line 1497, in smokeTest
>    checkSigs('lucene', lucenePath, version, tmpDir, isSigned, keysFile)
>  File "dev-tools/scripts/smokeTestRelease.py", line 379, in checkSigs
>    for line in f.readlines():
>  File "/Users/ab/anaconda3/lib/python3.7/codecs.py", line 322, in decode
>    (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 180: 
> invalid start byte{code}
> The failing line is 
> https://github.com/apache/lucene-solr/blob/faaee86efb01fa6e431fcb129cfb956c7d62d514/dev-tools/scripts/smokeTestRelease.py#L378
>  
> Found by [~ab]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: VOTE: Apache Solr Reference Guide for Solr 8.0

2019-06-06 Thread Cassandra Targett
It took me a few days to get back to this - the week got away from me and I 
wanted to give Jan’s suggestions a decent review.

Thanks, Jan, for those changes. It looks like it’s really one sentence that’s 
misleadingly wrong. I hate releasing with known errors like that, but if you 
don’t think it’s that big of a problem, I’ll go ahead with the release and fold 
your changes into 8.1. Maybe this is a good use for that otherwise unused 
Errata page in the PDF.

Thanks everyone who voted, the VOTE has passed and I’ll get started releasing 
it.

Cassandra
On Jun 3, 2019, 6:16 PM -0500, Jan Høydahl , wrote:
> Feel free as RM to release without these changes and fold into 8.1.
> But here is a patch with re-phrasing of the two paragraphs I mentioned, as 
> well as backport SOLR-12809, in case of a potential re-spin.
> https://gist.github.com/ed99d0945de112e05e1d1ff2ce6a
>
> --
> Jan Høydahl, search solution architect
> Cominvent AS - www.cominvent.com
>
> > 3. jun. 2019 kl. 14:54 skrev Cassandra Targett :
> >
> > Jan, would you please suggest exact wording to correct the error you 
> > pointed out (quoted below)? Since I clearly don't understand the change, 
> > and this is months delayed already, that is likely to be the fastest option to 
> > get a respin started this week. Otherwise, it will need to wait for me to 
> > wade back into the topic to see if I can understand what the correct 
> > wording should be.
> >
> > Thank you,
> > Cassandra
> > On May 31, 2019, 3:37 PM -0500, Jan Høydahl , wrote:
> > > Viewed on a smartphone :)
> > >
> > > Comments:
> > >
> > > Major changes section:
> > >
> > > > The Basic Authentication plugin now has an option forwardCredentials to 
> > > > prevent Basic Auth headers from being sent for inter-node requests.
> > >
> > > This is wrong. Basic Auth never sent basicauth headers inter-node, and 
> > > adding the “forwardcredentials” option lets user-originated requests be 
> > > forwarded with original basicauth credentials instead of PKI.
> > > >
> >
>


[jira] [Commented] (LUCENE-8818) smokeTestRelease.py encoding bug

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8818?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16858029#comment-16858029
 ] 

ASF subversion and git services commented on LUCENE-8818:
-

Commit 8d6fd7298fe480a627db333eafeadc7b6a8fdcad in lucene-solr's branch 
refs/heads/master from Jan Høydahl
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=8d6fd72 ]

LUCENE-8818: Fix smokeTestRelease.py encoding bug


> smokeTestRelease.py encoding bug
> 
>
> Key: LUCENE-8818
> URL: https://issues.apache.org/jira/browse/LUCENE-8818
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Minor
> Attachments: LUCENE-8818.patch
>
>
> Smoke tester may crash while parsing log file created from gpg stdout, but 
> only in certain conditions. Error trace is
> {code:java}
> Test Lucene...
>  test basics...
>  check changes HTML...
>  download lucene-7.7.2-src.tgz...
>    43.5 MB in 123.23 sec (0.4 MB/sec)
>    verify sha512 digest
>    verify sig
> Traceback (most recent call last):
>  File "dev-tools/scripts/smokeTestRelease.py", line 1518, in 
>    main()
>  File "dev-tools/scripts/smokeTestRelease.py", line 1448, in main
>    smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, c.is_signed, 
> c.local_keys, ' '.join(c.test_args))
>  File "dev-tools/scripts/smokeTestRelease.py", line 1497, in smokeTest
>    checkSigs('lucene', lucenePath, version, tmpDir, isSigned, keysFile)
>  File "dev-tools/scripts/smokeTestRelease.py", line 379, in checkSigs
>    for line in f.readlines():
>  File "/Users/ab/anaconda3/lib/python3.7/codecs.py", line 322, in decode
>    (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 180: 
> invalid start byte{code}
> The failing line is 
> https://github.com/apache/lucene-solr/blob/faaee86efb01fa6e431fcb129cfb956c7d62d514/dev-tools/scripts/smokeTestRelease.py#L378
>  
> Found by [~ab]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8818) smokeTestRelease.py encoding bug

2019-06-06 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/LUCENE-8818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl updated LUCENE-8818:

Component/s: general/tools

> smokeTestRelease.py encoding bug
> 
>
> Key: LUCENE-8818
> URL: https://issues.apache.org/jira/browse/LUCENE-8818
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/tools
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Minor
> Attachments: LUCENE-8818.patch
>
>
> Smoke tester may crash while parsing log file created from gpg stdout, but 
> only in certain conditions. Error trace is
> {code:java}
> Test Lucene...
>  test basics...
>  check changes HTML...
>  download lucene-7.7.2-src.tgz...
>    43.5 MB in 123.23 sec (0.4 MB/sec)
>    verify sha512 digest
>    verify sig
> Traceback (most recent call last):
>  File "dev-tools/scripts/smokeTestRelease.py", line 1518, in 
>    main()
>  File "dev-tools/scripts/smokeTestRelease.py", line 1448, in main
>    smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, c.is_signed, 
> c.local_keys, ' '.join(c.test_args))
>  File "dev-tools/scripts/smokeTestRelease.py", line 1497, in smokeTest
>    checkSigs('lucene', lucenePath, version, tmpDir, isSigned, keysFile)
>  File "dev-tools/scripts/smokeTestRelease.py", line 379, in checkSigs
>    for line in f.readlines():
>  File "/Users/ab/anaconda3/lib/python3.7/codecs.py", line 322, in decode
>    (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 180: 
> invalid start byte{code}
> The failing line is 
> https://github.com/apache/lucene-solr/blob/faaee86efb01fa6e431fcb129cfb956c7d62d514/dev-tools/scripts/smokeTestRelease.py#L378
>  
> Found by [~ab]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] rmuir commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread GitBox
rmuir commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to 
continue scanning from the last position when possible
URL: https://github.com/apache/lucene-solr/pull/701#issuecomment-499635591
 
 
   maybe the better approach is top-down: for example starting at high level 
sorting/faceting algorithms, do they really go through any trouble to resolve 
ordinals in sequential order or do they just resolve the top-N in some 
arbitrary order by document count. 
   
   IMO if this kind of code goes through the trouble to e.g. sort the top-N by 
ordinal, retrieve all the term metadata, then sort it back, then optimizations 
here make sense. But based on that code we can also determine if e.g. its only 
necessary to optimize say `seekCeil` versus also `seekExact`. It may simplify 
the low level code.
   
   If the high level code isn't actually doing work to resolve ordinals in 
ordinal-order, then it won't help to optimize seek. Merging code etc is already 
using `next()` which is clearly optimized for that case.
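
To make the "resolve ordinals in sequential order" question concrete, the access 
pattern under discussion looks roughly like the following minimal sketch; it is 
illustrative only and not taken from Lucene's faceting or sorting code (the class 
and method are invented, SortedSetDocValues.lookupOrd is the real API):

{code:java}
import java.io.IOException;
import java.util.Arrays;
import org.apache.lucene.index.SortedSetDocValues;
import org.apache.lucene.util.BytesRef;

public final class OrdResolver {
  /** Resolves the labels of topOrds by visiting the terms dictionary in ascending ordinal order. */
  public static BytesRef[] resolveInOrdOrder(SortedSetDocValues dv, long[] topOrds) throws IOException {
    long[] sorted = topOrds.clone();
    Arrays.sort(sorted);                                       // forward scan over the terms dictionary
    BytesRef[] byOrd = new BytesRef[sorted.length];
    for (int i = 0; i < sorted.length; i++) {
      byOrd[i] = BytesRef.deepCopyOf(dv.lookupOrd(sorted[i])); // lookupOrd may reuse its BytesRef
    }
    // Map the labels back to the caller's original (e.g. by-count) order.
    BytesRef[] labels = new BytesRef[topOrds.length];
    for (int i = 0; i < topOrds.length; i++) {
      labels[i] = byOrd[Arrays.binarySearch(sorted, topOrds[i])];
    }
    return labels;
  }
}
{code}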


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8811) Add maximum clause count check to IndexSearcher rather than BooleanQuery

2019-06-06 Thread Atri Sharma (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Atri Sharma updated LUCENE-8811:

Attachment: LUCENE-8811.patch

> Add maximum clause count check to IndexSearcher rather than BooleanQuery
> 
>
> Key: LUCENE-8811
> URL: https://issues.apache.org/jira/browse/LUCENE-8811
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8811.patch, LUCENE-8811.patch
>
>
> Currently we only check whether boolean queries have too many clauses. 
> However there are other ways that queries may have too many clauses, for 
> instance if you have boolean queries that have themselves inner boolean 
> queries.
> Could we use the new Query visitor API to move this check from BooleanQuery 
> to IndexSearcher in order to make this check more consistent across queries? 
> See for instance LUCENE-8810 where a rewrite rule caused the maximum clause 
> count to be hit even though the total number of leaf queries remained the 
> same.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8811) Add maximum clause count check to IndexSearcher rather than BooleanQuery

2019-06-06 Thread Atri Sharma (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8811?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857957#comment-16857957
 ] 

Atri Sharma commented on LUCENE-8811:
-

[~jpountz] [~romseygeek] The attached patch implements a QueryVisitor that counts 
leaf visits and consumed terms, and adds a per-IndexSearcher maximum 
clause count limit.

 

Please let me know your thoughts and comments.

 

[^LUCENE-8811.patch]
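
For readers who have not opened the patch, a minimal sketch of the counting idea 
(illustrative only, not the attached patch; the class and the countClauses helper 
are invented, the QueryVisitor callbacks are the real API):

{code:java}
import org.apache.lucene.index.Term;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.QueryVisitor;

public final class ClauseCountingVisitor extends QueryVisitor {
  private int count;

  @Override
  public void consumeTerms(Query query, Term... terms) {
    count += terms.length;   // term-based leaves (TermQuery, SynonymQuery, ...) report their terms here
  }

  @Override
  public void visitLeaf(Query query) {
    count++;                 // leaves that do not expose individual terms
  }

  @Override
  public QueryVisitor getSubVisitor(BooleanClause.Occur occur, Query parent) {
    return this;             // keep counting inside nested BooleanQuery clauses, including MUST_NOT
  }

  public static int countClauses(Query query) {
    ClauseCountingVisitor visitor = new ClauseCountingVisitor();
    query.visit(visitor);
    return visitor.count;
  }
}
{code}

A searcher-level limit would then compare this count against the configured maximum 
(today BooleanQuery.getMaxClauseCount()) before running the query.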

> Add maximum clause count check to IndexSearcher rather than BooleanQuery
> 
>
> Key: LUCENE-8811
> URL: https://issues.apache.org/jira/browse/LUCENE-8811
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8811.patch, LUCENE-8811.patch
>
>
> Currently we only check whether boolean queries have too many clauses. 
> However there are other ways that queries may have too many clauses, for 
> instance if you have boolean queries that have themselves inner boolean 
> queries.
> Could we use the new Query visitor API to move this check from BooleanQuery 
> to IndexSearcher in order to make this check more consistent across queries? 
> See for instance LUCENE-8810 where a rewrite rule caused the maximum clause 
> count to be hit even though the total number of leaf queries remained the 
> same.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-MAVEN] Lucene-Solr-Maven-master #2578: POMs out of sync

2019-06-06 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Maven-master/2578/

No tests ran.

Build Log:
[...truncated 34571 lines...]
BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-master/build.xml:680: The 
following error occurred while executing this line:
: Java returned: 1

Total time: 20 minutes 49 seconds
Build step 'Invoke Ant' marked build as failure
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (LUCENE-8812) add KoreanNumberFilter to Nori(Korean) Analyzer

2019-06-06 Thread Jim Ferenczi (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8812?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857911#comment-16857911
 ] 

Jim Ferenczi commented on LUCENE-8812:
--

Sorry, I didn't see your reply. I agree with you that it is ambiguous to put it 
in analysis-common, so +1 to add it to the nori module for now and revisit 
if/when we create a separate module for the mecab tokenizer. 

> add KoreanNumberFilter to Nori(Korean) Analyzer
> ---
>
> Key: LUCENE-8812
> URL: https://issues.apache.org/jira/browse/LUCENE-8812
> Project: Lucene - Core
>  Issue Type: New Feature
>Reporter: Namgyu Kim
>Priority: Major
> Attachments: LUCENE-8812.patch
>
>
> This is a follow-up issue to LUCENE-8784.
> The KoreanNumberFilter is a TokenFilter that normalizes Korean numbers to 
> regular Arabic decimal numbers in half-width characters.
> Logic is similar to JapaneseNumberFilter.
> It should be able to cover the following test cases.
> 1) Korean Word to Number
> 십만이천오백 => 102500
> 2) 1 character conversion
> 일영영영 => 1000
> 3) Decimal Point Calculation
> 3.2천 => 3200
> 4) Comma between three digits
> 4,647.0010 => 4647.001
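
As a rough sketch of how the proposed filter would sit in an analysis chain, assuming 
its constructor mirrors JapaneseNumberFilter's single TokenStream argument 
(KoreanNumberFilter is the class proposed in this issue and does not exist yet; the 
rest is the existing nori API):

{code:java}
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.ko.KoreanNumberFilter; // proposed in this issue
import org.apache.lucene.analysis.ko.KoreanTokenizer;

public class KoreanNumberAnalyzer extends Analyzer {
  @Override
  protected TokenStreamComponents createComponents(String fieldName) {
    Tokenizer tokenizer = new KoreanTokenizer();             // default nori tokenizer
    // Would normalize Korean number words, e.g. "십만이천오백" -> "102500" (test case 1 above).
    TokenStream stream = new KoreanNumberFilter(tokenizer);
    return new TokenStreamComponents(tokenizer, stream);
  }
}
{code}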



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8832) Support for field removal and renaming

2019-06-06 Thread Andrzej Bialecki (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857873#comment-16857873
 ] 

Andrzej Bialecki  commented on LUCENE-8832:
---

I think it's basically the same effort, because in both cases it's the same 
re-mapping structure in Codec.

> Support for field removal and renaming
> --
>
> Key: LUCENE-8832
> URL: https://issues.apache.org/jira/browse/LUCENE-8832
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
>
> Currently it's not possible to rename existing Lucene fields or delete them 
> without creating a new index from scratch (FieldInfos are basically 
> append-only).
> This issue proposes to investigate an approach that applies these changes at 
> a Codec level so that the unwanted data is skipped over (in case of field 
> delete) or accessed under a different name (in case of field rename). Since 
> the same Codec API is used for segment merging the deletion / removal 
> filtering could be applied only to the currently existing segments because 
> the resulting merged segments would not contain this data anymore.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-8.1-Linux (64bit/jdk-11.0.2) - Build # 392 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Linux/392/
Java: 64bit/jdk-11.0.2 -XX:-UseCompressedOops -XX:+UseG1GC

10 tests failed.
FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth

Error Message:
Expected metric minimums for prefix SECURITY./authentication.: 
{failMissingCredentials=2, authenticated=19, passThrough=9, 
failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, totalTime=9373167, 
failWrongCredentials=1, requestTimes=1827, requests=29, errors=0}

Stack Trace:
java.lang.AssertionError: Expected metric minimums for prefix 
SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, totalTime=9373167, 
failWrongCredentials=1, requestTimes=1827, requests=29, errors=0}
at 
__randomizedtesting.SeedInfo.seed([2F850C7B5F20FB56:93EB7A69FB73782C]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
at 
org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 

[jira] [Commented] (SOLR-13512) Raw index data analysis tool

2019-06-06 Thread Erick Erickson (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857864#comment-16857864
 ] 

Erick Erickson commented on SOLR-13512:
---

I took a scan through the patch and didn't notice any ref guide entries. I'm 
thinking of just a very brief, couple-of-sentences statement of how to 
access this functionality and the kinds of things it returns. Maybe the output 
for a single field?

> Raw index data analysis tool
> 
>
> Key: SOLR-13512
> URL: https://issues.apache.org/jira/browse/SOLR-13512
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: SOLR-13512.patch, SOLR-13512.patch, SOLR-13512.patch, 
> rawSizeDetails.json, rawSizeSummary.json
>
>
> A common question from Solr users is how to determine how a given schema 
> field and all its related index data contributes to the total index size.
> It's possible to estimate this information by doing a single full pass 
> through all index data, aggregating estimated sizes of terms, postings, doc 
> values and stored fields. The totals represent of course the worst case 
> scenario when there's no index compression at all, but still they should be 
> useful for answering the questions above.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-13512) Raw index data analysis tool

2019-06-06 Thread Andrzej Bialecki (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrzej Bialecki  updated SOLR-13512:
-
Fix Version/s: 8.2
   master (9.0)

> Raw index data analysis tool
> 
>
> Key: SOLR-13512
> URL: https://issues.apache.org/jira/browse/SOLR-13512
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: SOLR-13512.patch, SOLR-13512.patch, SOLR-13512.patch, 
> rawSizeDetails.json, rawSizeSummary.json
>
>
> A common question from Solr users is how to determine how a given schema 
> field and all its related index data contributes to the total index size.
> It's possible to estimate this information by doing a single full pass 
> through all index data, aggregating estimated sizes of terms, postings, doc 
> values and stored fields. The totals represent of course the worst case 
> scenario when there's no index compression at all, but still they should be 
> useful for answering the questions above.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13512) Raw index data analysis tool

2019-06-06 Thread Andrzej Bialecki (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857854#comment-16857854
 ] 

Andrzej Bialecki  commented on SOLR-13512:
--

Another update:
 * added "typesBySize" to provide guidance on what type of data consumes what 
part of the total size.
 * added sampling of data in case of large indexes. This makes a HUGE 
difference in the speed of calculation, and still the results are good enough 
to provide useful guidance.
 * bug fixes, additional unit testing, some internal API refactoring / renaming.

I'd appreciate a review but if there are no objections I'd like to commit this 
shortly.

> Raw index data analysis tool
> 
>
> Key: SOLR-13512
> URL: https://issues.apache.org/jira/browse/SOLR-13512
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
> Attachments: SOLR-13512.patch, SOLR-13512.patch, SOLR-13512.patch, 
> rawSizeDetails.json, rawSizeSummary.json
>
>
> A common question from Solr users is how to determine how a given schema 
> field and all its related index data contributes to the total index size.
> It's possible to estimate this information by doing a single full pass 
> through all index data, aggregating estimated sizes of terms, postings, doc 
> values and stored fields. The totals represent of course the worst case 
> scenario when there's no index compression at all, but still they should be 
> useful for answering the questions above.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-13512) Raw index data analysis tool

2019-06-06 Thread Andrzej Bialecki (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrzej Bialecki  updated SOLR-13512:
-
Attachment: SOLR-13512.patch

> Raw index data analysis tool
> 
>
> Key: SOLR-13512
> URL: https://issues.apache.org/jira/browse/SOLR-13512
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
> Attachments: SOLR-13512.patch, SOLR-13512.patch, SOLR-13512.patch, 
> rawSizeDetails.json, rawSizeSummary.json
>
>
> A common question from Solr users is how to determine how a given schema 
> field and all its related index data contributes to the total index size.
> It's possible to estimate this information by doing a single full pass 
> through all index data, aggregating estimated sizes of terms, postings, doc 
> values and stored fields. The totals represent of course the worst case 
> scenario when there's no index compression at all, but still they should be 
> useful for answering the questions above.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: Welcome Namgyu Kim as Lucene/Solr committer

2019-06-06 Thread Nhat Nguyen
Congrats and welcome Namgyu Kim!

On Thu, Jun 6, 2019 at 12:03 PM Tomoko Uchida 
wrote:

> Hi,
>
> I just contacted Namgyu Kim to ask whether he had noticed this thread. He
> knows about the Lucene dev list, but seems to have failed to subscribe to it
> so far for some reason.
> I would like to restart this thread to invite him here. 
>
> Again, welcome Namgyu!
>
> Thanks,
> Tomoko
>
> 2019年6月6日(木) 15:21 Noble Paul :
> >
> > Welcome Namgyu
> >
> > On Thu, Jun 6, 2019 at 12:32 PM Christian Moen  wrote:
> > >
> > > Congrats, Namgyu!
> > >
> > > On Thu, Jun 6, 2019 at 4:36 AM Christine Poerschke (BLOOMBERG/ LONDON)
>  wrote:
> > >>
> > >> Welcome!
> > >>
> > >> Christine
> > >>
> > >> From: dev@lucene.apache.org At: 06/03/19 18:52:38
> > >> To: dev@lucene.apache.org
> > >> Subject: Welcome Namgyu Kim as Lucene/Solr committer
> > >>
> > >> Hi all,
> > >>
> > >> Please join me in welcoming Namgyu Kim as Lucene/ Solr committer!
> > >>
> > >> Kim has been helping address technical debt and fixing bugs in the
> > >> last year, including a cleanup to our DutchAnalyzer[0] and
> > >> improvements to the StoredFieldsVisitor API[1]. More recently he also
> > >> started improving our korean analyzer[2].
> > >>
> > >> [0] https://issues.apache.org/jira/browse/LUCENE-8582
> > >> [1] https://issues.apache.org/jira/browse/LUCENE-8805
> > >> [2] https://issues.apache.org/jira/browse/LUCENE-8784
> > >>
> > >> Congratulations and welcome! It is a tradition to introduce yourself
> > >> with a brief bio.
> > >>
> > >> --
> > >> Adrien
> > >>
> > >> -
> > >> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> > >> For additional commands, e-mail: dev-h...@lucene.apache.org
> > >>
> > >>
> >
> >
> > --
> > -
> > Noble Paul
> >
> > -
> > To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> > For additional commands, e-mail: dev-h...@lucene.apache.org
> >
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


Re: Welcome Namgyu Kim as Lucene/Solr committer

2019-06-06 Thread Tomoko Uchida
Hi,

I just contacted Namgyu Kim to ask whether he had noticed this thread. He
knows about the Lucene dev list, but seems to have failed to subscribe to it
so far for some reason.
I would like to restart this thread to invite him here. 

Again, welcome Namgyu!

Thanks,
Tomoko

2019年6月6日(木) 15:21 Noble Paul :
>
> Welcome Namgyu
>
> On Thu, Jun 6, 2019 at 12:32 PM Christian Moen  wrote:
> >
> > Congrats, Namgyu!
> >
> > On Thu, Jun 6, 2019 at 4:36 AM Christine Poerschke (BLOOMBERG/ LONDON) 
> >  wrote:
> >>
> >> Welcome!
> >>
> >> Christine
> >>
> >> From: dev@lucene.apache.org At: 06/03/19 18:52:38
> >> To: dev@lucene.apache.org
> >> Subject: Welcome Namgyu Kim as Lucene/Solr committer
> >>
> >> Hi all,
> >>
> >> Please join me in welcoming Namgyu Kim as Lucene/ Solr committer!
> >>
> >> Kim has been helping address technical debt and fixing bugs in the
> >> last year, including a cleanup to our DutchAnalyzer[0] and
> >> improvements to the StoredFieldsVisitor API[1]. More recently he also
> >> started improving our korean analyzer[2].
> >>
> >> [0] https://issues.apache.org/jira/browse/LUCENE-8582
> >> [1] https://issues.apache.org/jira/browse/LUCENE-8805
> >> [2] https://issues.apache.org/jira/browse/LUCENE-8784
> >>
> >> Congratulations and welcome! It is a tradition to introduce yourself
> >> with a brief bio.
> >>
> >> --
> >> Adrien
> >>
> >> -
> >> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> >> For additional commands, e-mail: dev-h...@lucene.apache.org
> >>
> >>
>
>
> --
> -
> Noble Paul
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-8.1-Windows (32bit/jdk1.8.0_201) - Build # 132 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Windows/132/
Java: 32bit/jdk1.8.0_201 -client -XX:+UseSerialGC

11 tests failed.
FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth

Error Message:
Expected metric minimums for prefix SECURITY./authentication.: 
{failMissingCredentials=2, authenticated=19, passThrough=9, 
failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, 
totalTime=187021700, failWrongCredentials=1, requestTimes=559, requests=29, 
errors=0}

Stack Trace:
java.lang.AssertionError: Expected metric minimums for prefix 
SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=10, 
totalTime=187021700, failWrongCredentials=1, requestTimes=559, requests=29, 
errors=0}
at 
__randomizedtesting.SeedInfo.seed([C320FEF85EE5353:B05C79FD21BDD029]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
at 
org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 

[GitHub] [lucene-solr] bruno-roustant commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread GitBox
bruno-roustant commented on issue #701: LUCENE-8836 Optimize DocValues 
TermsDict to continue scanning from the last position when possible
URL: https://github.com/apache/lucene-solr/pull/701#issuecomment-499546598
 
 
   "can the optimization be implemented in a less-invasive manner?"
   Well, the change optimizes both seekExact(ord) and seekCeil(term), and it 
handles calls to next() to keep track of the last accessed term. This change 
also reduces the binary search range in TermsDict.seekTermsIndex(), by 
comparing to the last accessed term.
I already tried to be minimal (although I added comments), so I don't think 
I can do it with less.
   
   
   "How best might the performance of DocValues be evaluated?"
   The approach I took was to run some Lucene tests while counting the total 
number of seeks and terms read in the IndexInput, with and without the 
optimization.
   TestLucene70DocValuesFormat - the optimization saves 24% seeks and 15% term 
reads.
   TestDocValuesQueries - the optimization adds 0.7% seeks and 0.003% term 
reads.
   TestDocValuesRewriteMethod.testRegexps - the optimization saves 71% seeks 
and 82% term reads.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-repro - Build # 3335 - Unstable

2019-06-06 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/3335/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-8.x/113/consoleText

[repro] Revision: 43a7ec87a2c4dbaaa521ec219d2a267ee823d9b8

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=TestRandomChains 
-Dtests.method=testRandomChainsWithLargeStrings -Dtests.seed=E00132298AD0431C 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ar-QA -Dtests.timezone=America/Cayman -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=ShardSplitTest 
-Dtests.method=testSplitWithChaosMonkey -Dtests.seed=51487FE98157030E 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=da-DK -Dtests.timezone=America/Grand_Turk -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=ShardSplitTest 
-Dtests.seed=51487FE98157030E -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=da-DK -Dtests.timezone=America/Grand_Turk -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
3364753661ffb91bf04058c6184368656e0d5ab7
[repro] git fetch
[repro] git checkout 43a7ec87a2c4dbaaa521ec219d2a267ee823d9b8

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   ShardSplitTest
[repro]lucene/analysis/common
[repro]   TestRandomChains
[repro] ant compile-test

[...truncated 3578 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.ShardSplitTest" -Dtests.showOutput=onerror 
-Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=51487FE98157030E -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=da-DK -Dtests.timezone=America/Grand_Turk -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 133 lines...]
[repro] ant compile-test

[...truncated 102 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.TestRandomChains" -Dtests.showOutput=onerror 
-Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=E00132298AD0431C -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ar-QA -Dtests.timezone=America/Cayman -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[...truncated 167 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.api.collections.ShardSplitTest
[repro]   3/5 failed: org.apache.lucene.analysis.core.TestRandomChains
[repro] git checkout 3364753661ffb91bf04058c6184368656e0d5ab7

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (LUCENE-8832) Support for field removal and renaming

2019-06-06 Thread Erick Erickson (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857801#comment-16857801
 ] 

Erick Erickson commented on LUCENE-8832:


WDYT about splitting this into two tasks? Would it be simpler/faster to deal 
with the deletions, then the renames? Or is it the case that doing them both at 
the same time is very little extra effort?

Up to you of course, just wondering.

> Support for field removal and renaming
> --
>
> Key: LUCENE-8832
> URL: https://issues.apache.org/jira/browse/LUCENE-8832
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Andrzej Bialecki 
>Assignee: Andrzej Bialecki 
>Priority: Major
>
> Currently it's not possible to rename existing Lucene fields or delete them 
> without creating a new index from scratch (FieldInfos are basically 
> append-only).
> This issue proposes to investigate an approach that applies these changes at 
> a Codec level so that the unwanted data is skipped over (in case of field 
> delete) or accessed under a different name (in case of field rename). Since 
> the same Codec API is used for segment merging the deletion / removal 
> filtering could be applied only to the currently existing segments because 
> the resulting merged segments would not contain this data anymore.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11) - Build # 24192 - Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24192/
Java: 64bit/jdk-11 -XX:+UseCompressedOops -XX:+UseSerialGC

2 tests failed.
FAILED:  org.apache.solr.cloud.ReindexCollectionTest.testBasicReindexing

Error Message:
Solr11035BandAid failed, counts differ after updates: expected:<103> but 
was:<200>

Stack Trace:
java.lang.AssertionError: Solr11035BandAid failed, counts differ after updates: 
expected:<103> but was:<200>
at 
__randomizedtesting.SeedInfo.seed([3234464DCF5CCE5C:A19689A2C7B51A64]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at 
org.apache.solr.SolrTestCaseJ4.Solr11035BandAid(SolrTestCaseJ4.java:3087)
at 
org.apache.solr.cloud.ReindexCollectionTest.indexDocs(ReindexCollectionTest.java:387)
at 
org.apache.solr.cloud.ReindexCollectionTest.testBasicReindexing(ReindexCollectionTest.java:124)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 

[GitHub] [lucene-solr] dsmiley commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread GitBox
dsmiley commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to 
continue scanning from the last position when possible
URL: https://github.com/apache/lucene-solr/pull/701#issuecomment-499537743
 
 
   How best might the performance of DocValues be evaluated?  luceneutil has a 
great query perf benchmark but the only thing related to DocValues I noticed 
was sorting


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (SOLR-13525) There is no way to define a numeric value in the Tuple function

2019-06-06 Thread Oleksandr Chornyi (JIRA)
Oleksandr Chornyi created SOLR-13525:


 Summary: There is no way to define a numeric value in the Tuple 
function
 Key: SOLR-13525
 URL: https://issues.apache.org/jira/browse/SOLR-13525
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: streaming expressions
Affects Versions: 7.7.1
Reporter: Oleksandr Chornyi


h3. Background

An easy way to experiment with Streaming Expressions is to define a tuple or 
list of tuples as a stream source and apply decorators/evaluators to this 
source. However, at the moment there is no easy way to define numeric values in 
a tuple because everything is being treated as a string.
h3. Steps to Reproduce

Evaluate the following streaming expression:
{code:java}
tuple(int=13, float=42.42, string=foo, quoted_string="bar")
{code}
*Actual Result*:
{code}
"docs": [
  {
"int": "13",
"float": "42.42",
"string": "foo",
"quoted_string": "bar"
  },
  ...
]
{code}
*Expected Result*:
{code}
"docs": [
  {
"int": 13,
"float": 42.42,
"string": "foo",
"quoted_string": "bar"
  },
  ...
]
{code}
h3. Possible Workarounds

It's possible to get the desired result by applying the {{val()}} function to each 
numeric value, but it's not convenient:
{code:java}
tuple(int=val(13), float=val(42.42))
{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: java.lang.IllegalArgumentException: Could not load codec 'Lucene62'. Did you forget to add lucene-backward-codecs.jar?

2019-06-06 Thread David Allouche
Thank you Andi, you're awesome!

> On 5 Jun 2019, at 00:33, Andi Vajda  wrote:
> 
> 
> In rev 1860637 I refreshed the list of supported lucene modules to be built
> with PyLucene. The lucene-backward-codecs module was indeed missing.
> 
> Please try it out with pylucene 7.7.1 (refreshing its Makefile from trunk)
> and let me know if it fixes your problem.
> 
> Thanks !
> 
> Andi..
> 
> On Tue, 4 Jun 2019, David Allouche wrote:
> 
>> Hello,
>> 
>> I use pylucene, and I am upgrading from 6.5.0 to 7.7.1.
>> 
>> Opening my old index using the new pylucene, I get
>> 
>>> Traceback (most recent call last):
>>>  ...
>>>  File "/home/user/jobaffinity/lib/luceneindex.py", line 58, in 
>>> create_lucene_index_maybe
>>>writer = IndexWriter(directory, config)
>>> lucene.JavaError: , >
>>>Java stacktrace:
>>> java.lang.IllegalArgumentException: Could not load codec 'Lucene62'.  Did 
>>> you forget to add lucene-backward-codecs.jar?
>>>at org.apache.lucene.index.SegmentInfos.readCodec(SegmentInfos.java:428)
>>>at org.apache.lucene.index.SegmentInfos.readCommit(SegmentInfos.java:360)
>>>at org.apache.lucene.index.SegmentInfos.readCommit(SegmentInfos.java:291)
>>>at org.apache.lucene.index.IndexWriter.(IndexWriter.java:845)
>>> Caused by: java.lang.IllegalArgumentException: An SPI class of type 
>>> org.apache.lucene.codecs.Codec with name 'Lucene62' does not exist.  You 
>>> need to add the corresponding JAR file supporting this SPI to your 
>>> classpath.  The current classpath supports the following names: [Lucene70]
>>>at org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:116)
>>>at org.apache.lucene.codecs.Codec.forName(Codec.java:116)
>>>at org.apache.lucene.index.SegmentInfos.readCodec(SegmentInfos.java:424)
>>>... 3 more
>> 
>> I am really not familiar with the Java side of things. With some help from 
>> the web, I found out where to download the jars.
>> 
>> I presume I need 
>> http://central.maven.org/maven2/org/apache/lucene/lucene-backward-codecs/7.7.1/lucene-backward-codecs-7.7.1.jar
>> 
>> But then, I am quite clueless about what to do. I tried dumping the jar 
>> right next to lucene-core-7.7.1.jar, in my 
>> $(VENV)/lib/python2.7/site-packages/lucene, but that does not seem to help.
>> 
>> I have a large-ish index with about 22M entries, used in a public-facing 
>> service, so I would very much like to avoid rebuilding the index every time I 
>> upgrade pylucene.
>> 
>> I could reverse-engineer this script:
>> https://github.com/cominvent/solr-tools/blob/master/upgradeindex/upgradeindex.sh
>> 
>> But that would require me to put the service down while running the 
>> migration.
>> 
>> How can I package lucene-backward-codecs in pylucene?
>> 
>> 
>> 



[jira] [Created] (SOLR-13524) Or Stream Evaluator produces incorrect results with more than 2 arguments

2019-06-06 Thread Oleksandr Chornyi (JIRA)
Oleksandr Chornyi created SOLR-13524:


 Summary: Or Stream Evaluator produces incorrect results with more 
than 2 arguments
 Key: SOLR-13524
 URL: https://issues.apache.org/jira/browse/SOLR-13524
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: streaming expressions
Affects Versions: 7.7.1
Reporter: Oleksandr Chornyi


h3. Background

[Documentation for the 
OR|https://lucene.apache.org/solr/guide/7_7/stream-evaluator-reference.html#or] 
evaluator says that it "will return the logical OR of at least 2 boolean 
parameters." The evaluator indeed accepts more than two parameters:
{code:java}
or(true, true, true){code}
returns {{true}}. However, it stops evaluating other parameters when the first 
two parameters evaluate to false.
h3. Steps to Reproduce

Evaluate the following expression:
{code:java}
or(false, false, true)
{code}
*Expected Result*: {{"return-value": true}}
 *Actual Result*: {{"return-value": false}}
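
For clarity, a minimal sketch of the expected n-ary semantics in plain Java 
(illustrative only, not the actual streaming OrEvaluator code): the result should be 
true as soon as any operand is true, regardless of its position in the argument list.

{code:java}
public final class NaryOr {

  /** Expected semantics of or(a, b, c, ...): logical OR over every operand, not just the first two. */
  public static boolean or(boolean... operands) {
    for (boolean operand : operands) {
      if (operand) {
        return true;   // short-circuit on the first true operand
      }
    }
    return false;
  }

  public static void main(String[] args) {
    System.out.println(or(false, false, true)); // prints true, matching the expected result above
  }
}
{code}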
  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: [JENKINS] Lucene-Solr-8.1-Linux (64bit/jdk-11.0.2) - Build # 391 - Still Unstable!

2019-06-06 Thread Jan Høydahl
I think this test needs to be BadApple’d now?

Jan Høydahl

> 6. jun. 2019 kl. 13:38 skrev Policeman Jenkins Server :
> 
> Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Linux/391/
> Java: 64bit/jdk-11.0.2 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC
> 
> 9 tests failed.
> FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth
> 
> Error Message:
> Expected metric minimums for prefix SECURITY./authentication.: 
> {failMissingCredentials=2, authenticated=19, passThrough=9, 
> failWrongCredentials=1, requests=31, errors=0}, but got: 
> {failMissingCredentials=2, authenticated=16, passThrough=11, 
> totalTime=9122740, failWrongCredentials=1, requestTimes=1258, requests=30, 
> errors=0}
> 
> Stack Trace:
> java.lang.AssertionError: Expected metric minimums for prefix 
> SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
> passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
> {failMissingCredentials=2, authenticated=16, passThrough=11, 
> totalTime=9122740, failWrongCredentials=1, requestTimes=1258, requests=30, 
> errors=0}
>at __randomizedtesting.SeedInfo.seed([B3A7C371A669098D:FC9B563023A8AF7]:0)
>at org.junit.Assert.fail(Assert.java:88)
>at org.junit.Assert.assertTrue(Assert.java:41)
>at 
> org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
>at 
> org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
>at 
> org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
>at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
> Method)
>at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>at 
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
>at 
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
>at 
> org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
>at 
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
>at 
> org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
>at 
> org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
>at 
> org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
>at 
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>at 
> com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
>at 
> com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
>at 
> com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
>at 
> com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
>at 
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>at 
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
>at 
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
>at 
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>at 
> org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
>at 
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
>at 
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
>at 
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>at 
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>at 
> 

[GitHub] [lucene-solr] rmuir commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread GitBox
rmuir commented on issue #701: LUCENE-8836 Optimize DocValues TermsDict to 
continue scanning from the last position when possible
URL: https://github.com/apache/lucene-solr/pull/701#issuecomment-499501677
 
 
   can the optimization be implemented in a less-invasive manner?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (SOLR-13523) Atomic Update results in NullPointerException

2019-06-06 Thread Kieran Devlin (JIRA)
Kieran Devlin created SOLR-13523:


 Summary: Atomic Update results in NullPointerException
 Key: SOLR-13523
 URL: https://issues.apache.org/jira/browse/SOLR-13523
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: Admin UI, JSON Request API, update
Affects Versions: 8.0
 Environment: * Operating system: Win10 v1803 build 17143.766
 * Java version:
java 11.0.1 2018-10-16 LTS
Java(TM) SE Runtime Environment 18.9 (build 11.0.1+13-LTS)
Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11.0.1+13-LTS, mixed mode)
 * solr-spec: 8.1.1
 * solr-impl: 8.1.1 fcbe46c28cef11bc058779afba09521de1b19bef - ab - 2019-05-22 
15:20:01
 * lucene-spec: 8.1.1
 * lucene-impl: 8.1.1 fcbe46c28cef11bc058779afba09521de1b19bef - ab - 
2019-05-22 15:15:24
Reporter: Kieran Devlin
 Attachments: XUBrk.png, Xn1RW.png

When I partially update a document via an atomic update, the web server 
responds with a 500 status and the following stack trace:
{code:java}
{ "responseHeader":{ "status":500, "QTime":1}, "error":{ 
"trace":"java.lang.NullPointerException\r\n\tat 
org.apache.solr.update.processor.AtomicUpdateDocumentMerger.getFieldFromHierarchy(AtomicUpdateDocumentMerger.java:301)\r\n\tat
 
org.apache.solr.update.processor.AtomicUpdateDocumentMerger.mergeChildDoc(AtomicUpdateDocumentMerger.java:398)\r\n\tat
 
org.apache.solr.update.processor.DistributedUpdateProcessor.getUpdatedDocument(DistributedUpdateProcessor.java:697)\r\n\tat
 
org.apache.solr.update.processor.DistributedUpdateProcessor.doVersionAdd(DistributedUpdateProcessor.java:372)\r\n\tat
 
org.apache.solr.update.processor.DistributedUpdateProcessor.lambda$versionAdd$0(DistributedUpdateProcessor.java:337)\r\n\tat
 
org.apache.solr.update.VersionBucket.runWithLock(VersionBucket.java:50)\r\n\tat 
org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:337)\r\n\tat
 
org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:223)\r\n\tat
 
org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:475)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:75)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)\r\n\tat
 
org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)\r\n\tat
 
org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:92)\r\n\tat
 
org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.handleAdds(JsonLoader.java:507)\r\n\tat
 
org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.processUpdate(JsonLoader.java:145)\r\n\tat
 
org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.load(JsonLoader.java:121)\r\n\tat
 org.apache.solr.handler.loader.JsonLoader.load(JsonLoader.java:84)\r\n\tat 
org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)\r\n\tat
 
org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)\r\n\tat
 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)\r\n\tat
 

[jira] [Created] (LUCENE-8836) Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread Bruno Roustant (JIRA)
Bruno Roustant created LUCENE-8836:
--

 Summary: Optimize DocValues TermsDict to continue scanning from 
the last position when possible
 Key: LUCENE-8836
 URL: https://issues.apache.org/jira/browse/LUCENE-8836
 Project: Lucene - Core
  Issue Type: Improvement
Reporter: Bruno Roustant


Lucene80DocValuesProducer.TermsDict is used to lookup for either a term or a 
term ordinal.

Currently it does not have the optimization the FSTEnum has: being able to 
continue a sequential scan from where the last lookup stopped in the IndexInput. 
For sparse lookups (when searching only a few terms or ordinals) this is not an 
issue. But for multiple lookups in a row, this optimization could save 
re-scanning all the terms from the block start (since they are delta encoded).

This patch proposes the optimization.

To estimate the gain, we ran 3 Lucene tests while counting the seeks and the 
term reads in the IndexInput, with and without the optimization:

TestLucene70DocValuesFormat - the optimization saves 24% seeks and 15% term 
reads.
TestDocValuesQueries - the optimization adds 0.7% seeks and 0.003% term reads.
TestDocValuesRewriteMethod.testRegexps - the optimization saves 71% seeks and 
82% term reads.

In some cases, when scanning many terms in lexicographical order, the 
optimization saves a lot. In other cases, when only looking up a few sparse 
terms, the optimization does not bring an improvement, but it does not penalize 
either. It seems worth having it always.
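
To make the idea concrete, here is a sketch over a plain in-memory sorted array 
(the class below is illustrative only and does not reflect the actual TermsDict 
block encoding or IndexInput handling):
{code:java}
// Sketch of "continue scanning from the last position": remember how far the
// previous lookup got and resume from there when the next target is not behind it.
class ScanningTermLookupSketch {
  private final String[] sortedTerms; // stands in for the delta-encoded term blocks
  private int lastOrd = -1;           // position reached by the previous scan

  ScanningTermLookupSketch(String[] sortedTerms) {
    this.sortedTerms = sortedTerms;
  }

  /** Returns the ordinal of {@code target}, or -1 if it is absent. */
  int seekExact(String target) {
    // Resume from the previous position instead of re-scanning from the block
    // start, but only if the target is not behind that position.
    int ord = (lastOrd >= 0 && sortedTerms[lastOrd].compareTo(target) <= 0) ? lastOrd : 0;
    for (; ord < sortedTerms.length; ord++) {
      int cmp = sortedTerms[ord].compareTo(target);
      if (cmp == 0) {
        lastOrd = ord;
        return ord;
      }
      if (cmp > 0) {
        break; // scanned past the target: it is not present
      }
      lastOrd = ord; // remember how far the sequential scan has progressed
    }
    return -1;
  }
}
{code}
Looking up terms in increasing order then degenerates to a single forward scan, 
which is where the saved seeks and term reads come from.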



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] bruno-roustant opened a new pull request #701: LUCENE-8836 Optimize DocValues TermsDict to continue scanning from the last position when possible

2019-06-06 Thread GitBox
bruno-roustant opened a new pull request #701: LUCENE-8836 Optimize DocValues 
TermsDict to continue scanning from the last position when possible
URL: https://github.com/apache/lucene-solr/pull/701
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857618#comment-16857618
 ] 

ASF subversion and git services commented on SOLR-13105:


Commit d946bc6bbbf79058da5919fa1391777b9d53e002 in lucene-solr's branch 
refs/heads/SOLR-13105-visual from Joel Bernstein
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=d946bc6 ]

SOLR-13105: Start gallery page


> A visual guide to Solr Math Expressions and Streaming Expressions
> -
>
> Key: SOLR-13105
> URL: https://issues.apache.org/jira/browse/SOLR-13105
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot 
> 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, 
> Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 
> AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png
>
>
> Visualization is now a fundamental element of Solr Streaming Expressions and 
> Math Expressions. This ticket will create a visual guide to Solr Math 
> Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* 
> visualization examples.
> It will also cover using the JDBC expression to *analyze* and *visualize* 
> results from any JDBC compliant data source.
> Intro from the guide:
> {code:java}
> Streaming Expressions exposes the capabilities of Solr Cloud as composable 
> functions. These functions provide a system for searching, transforming, 
> analyzing and visualizing data stored in Solr Cloud collections.
> At a high level there are four main capabilities that will be explored in the 
> documentation:
> * Searching, sampling and aggregating results from Solr.
> * Transforming result sets after they are retrieved from Solr.
> * Analyzing and modeling result sets using probability and statistics and 
> machine learning libraries.
> * Visualizing result sets, aggregations and statistical models of the data.
> {code}
>  
> A few sample visualizations are attached to the ticket.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Windows (64bit/jdk-11.0.2) - Build # 7979 - Failure!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/7979/
Java: 64bit/jdk-11.0.2 -XX:-UseCompressedOops -XX:+UseG1GC

All tests passed

Build Log:
[...truncated 63651 lines...]
-ecj-javadoc-lint-src:
[mkdir] Created dir: C:\Users\jenkins\AppData\Local\Temp\ecj415358667
 [ecj-lint] Compiling 1278 source files to 
C:\Users\jenkins\AppData\Local\Temp\ecj415358667
 [ecj-lint] Processing annotations
 [ecj-lint] Annotations processed
 [ecj-lint] Processing annotations
 [ecj-lint] No elements to process
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\lib\org.restlet-2.3.0.jar
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\lib\org.restlet.ext.servlet-2.3.0.jar
 [ecj-lint] --
 [ecj-lint] 1. WARNING in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\client\solrj\embedded\EmbeddedSolrServer.java
 (at line 219)
 [ecj-lint] return (NamedList) new 
JavaBinCodec(resolver).unmarshal(in);
 [ecj-lint]^^
 [ecj-lint] Resource leak: '' is never closed
 [ecj-lint] --
 [ecj-lint] --
 [ecj-lint] 2. WARNING in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\cloud\autoscaling\sim\SimCloudManager.java
 (at line 788)
 [ecj-lint] throw new UnsupportedOperationException("must add at least 1 
node first");
 [ecj-lint] 
^^
 [ecj-lint] Resource leak: 'queryRequest' is not closed at this location
 [ecj-lint] --
 [ecj-lint] 3. WARNING in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\cloud\autoscaling\sim\SimCloudManager.java
 (at line 794)
 [ecj-lint] throw new UnsupportedOperationException("must add at least 1 
node first");
 [ecj-lint] 
^^
 [ecj-lint] Resource leak: 'queryRequest' is not closed at this location
 [ecj-lint] --
 [ecj-lint] --
 [ecj-lint] 4. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 19)
 [ecj-lint] import javax.naming.Context;
 [ecj-lint]
 [ecj-lint] The type javax.naming.Context is not accessible
 [ecj-lint] --
 [ecj-lint] 5. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 20)
 [ecj-lint] import javax.naming.InitialContext;
 [ecj-lint]^^^
 [ecj-lint] The type javax.naming.InitialContext is not accessible
 [ecj-lint] --
 [ecj-lint] 6. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 21)
 [ecj-lint] import javax.naming.NamingException;
 [ecj-lint]
 [ecj-lint] The type javax.naming.NamingException is not accessible
 [ecj-lint] --
 [ecj-lint] 7. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 22)
 [ecj-lint] import javax.naming.NoInitialContextException;
 [ecj-lint]^^
 [ecj-lint] The type javax.naming.NoInitialContextException is not accessible
 [ecj-lint] --
 [ecj-lint] 8. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 776)
 [ecj-lint] Context c = new InitialContext();
 [ecj-lint] ^^^
 [ecj-lint] Context cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 9. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 776)
 [ecj-lint] Context c = new InitialContext();
 [ecj-lint] ^^
 [ecj-lint] InitialContext cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 10. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 779)
 [ecj-lint] } catch (NoInitialContextException e) {
 [ecj-lint]  ^
 [ecj-lint] NoInitialContextException cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 11. ERROR in 
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\java\org\apache\solr\core\SolrResourceLoader.java
 (at line 781)
 [ecj-lint] } catch (NamingException e) {
 [ecj-lint]  ^^^
 [ecj-lint] NamingException cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] --
 [ecj-lint] 12. WARNING in 

[jira] [Comment Edited] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-06-06 Thread Mark Miller (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857576#comment-16857576
 ] 

Mark Miller edited comment on SOLR-13452 at 6/6/19 11:52 AM:
-

{quote}One think I noticed is that you use File constructor a lot
{quote}
Yeah, I'm still fumbling with groovy and gradle as a Java guy. I'll do a pass 
and try to convert file stuff.

FYI, the new unusedDeps task will give a printout like the following (example 
on solr-core):
{noformat}
> gw unusedDeps

Our classpath dependency count 101
Our directly used dependency count 76

List of possibly unused jars - they may be used at runtime however 
(Class.forName or something), this is not definitive.
We take our classpath dependencies, subtract our direct dependencies and then 
subtract dependencies used by our direct dependencies

asm-analysis-6.2.jar
asm-tree-6.2.jar
commons-beanutils-1.9.3.jar
kerb-crypto-1.0.1.jar
kerby-config-1.0.1.jar
kerby-pkix-1.0.1.jar
kerby-util-1.0.1.jar
lucene-spatial-8.1.0.jar
org.restlet.ext.servlet-2.3.0.jar

{noformat}


was (Author: markrmil...@gmail.com):
bq. One think I noticed is that you use File constructor a lot

Yeah, I'm still fumbling with groovy and gradle as a Java guy. I'll do a pass 
and try to convert file stuff.

FYI, the new unusedDeps task will give a printout like the following:

{noformat}
> gw unusedDeps

Our classpath dependency count 101
Our directly used dependency count 76

List of possibly unused jars - they may be used at runtime however 
(Class.forName or something), this is not definitive.
We take our classpath dependencies, subtract our direct dependencies and then 
subtract dependencies used by our direct dependencies

asm-analysis-6.2.jar
asm-tree-6.2.jar
commons-beanutils-1.9.3.jar
kerb-crypto-1.0.1.jar
kerby-config-1.0.1.jar
kerby-pkix-1.0.1.jar
kerby-util-1.0.1.jar
lucene-spatial-8.1.0.jar
org.restlet.ext.servlet-2.3.0.jar

{noformat}



> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: master (9.0)
>
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
>  
>  https://github.com/apache/lucene-solr/tree/jira/SOLR-13452_gradle_3



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-06-06 Thread Mark Miller (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857576#comment-16857576
 ] 

Mark Miller commented on SOLR-13452:


bq. One think I noticed is that you use File constructor a lot

Yeah, I'm still fumbling with groovy and gradle as a Java guy. I'll do a pass 
and try to convert file stuff.

FYI, the new unusedDeps task will give a printout like the following:

{noformat}
> gw unusedDeps

Our classpath dependency count 101
Our directly used dependency count 76

List of possibly unused jars - they may be used at runtime however 
(Class.forName or something), this is not definitive.
We take our classpath dependencies, subtract our direct dependencies and then 
subtract dependencies used by our direct dependencies

asm-analysis-6.2.jar
asm-tree-6.2.jar
commons-beanutils-1.9.3.jar
kerb-crypto-1.0.1.jar
kerby-config-1.0.1.jar
kerby-pkix-1.0.1.jar
kerby-util-1.0.1.jar
lucene-spatial-8.1.0.jar
org.restlet.ext.servlet-2.3.0.jar

{noformat}



> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: master (9.0)
>
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
>  
>  https://github.com/apache/lucene-solr/tree/jira/SOLR-13452_gradle_3



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-8.1-Linux (64bit/jdk-11.0.2) - Build # 391 - Still Unstable!

2019-06-06 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.1-Linux/391/
Java: 64bit/jdk-11.0.2 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

9 tests failed.
FAILED:  org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth

Error Message:
Expected metric minimums for prefix SECURITY./authentication.: 
{failMissingCredentials=2, authenticated=19, passThrough=9, 
failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=11, totalTime=9122740, 
failWrongCredentials=1, requestTimes=1258, requests=30, errors=0}

Stack Trace:
java.lang.AssertionError: Expected metric minimums for prefix 
SECURITY./authentication.: {failMissingCredentials=2, authenticated=19, 
passThrough=9, failWrongCredentials=1, requests=31, errors=0}, but got: 
{failMissingCredentials=2, authenticated=16, passThrough=11, totalTime=9122740, 
failWrongCredentials=1, requestTimes=1258, requests=30, errors=0}
at 
__randomizedtesting.SeedInfo.seed([B3A7C371A669098D:FC9B563023A8AF7]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129)
at 
org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83)
at 
org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:306)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 

[jira] [Resolved] (SOLR-13434) OpenTracing support for Solr

2019-06-06 Thread Cao Manh Dat (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cao Manh Dat resolved SOLR-13434.
-
Resolution: Fixed

> OpenTracing support for Solr
> 
>
> Key: SOLR-13434
> URL: https://issues.apache.org/jira/browse/SOLR-13434
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Shalin Shekhar Mangar
>Assignee: Cao Manh Dat
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: SOLR-13434.patch
>
>  Time Spent: 7h 40m
>  Remaining Estimate: 0h
>
> [OpenTracing|https://opentracing.io/] is a vendor neutral API and 
> infrastructure for distributed tracing. Many OSS tracers just as Jaeger, 
> OpenZipkin, Apache SkyWalking as well as commercial tools support OpenTracing 
> APIs. Ideally, we can implement it once and have integrations for popular 
> tracers like we have with metrics and prometheus.
> I'm aware of SOLR-9641 but HTrace has since retired from incubator for lack 
> of activity so this is a fresh attempt at solving this problem.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13434) OpenTracing support for Solr

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857538#comment-16857538
 ] 

ASF subversion and git services commented on SOLR-13434:


Commit 3364753661ffb91bf04058c6184368656e0d5ab7 in lucene-solr's branch 
refs/heads/master from Cao Manh Dat
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=3364753 ]

SOLR-13434: Using back Java 9 type reference


> OpenTracing support for Solr
> 
>
> Key: SOLR-13434
> URL: https://issues.apache.org/jira/browse/SOLR-13434
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Shalin Shekhar Mangar
>Assignee: Cao Manh Dat
>Priority: Major
> Fix For: master (9.0), 8.2
>
> Attachments: SOLR-13434.patch
>
>  Time Spent: 7h 40m
>  Remaining Estimate: 0h
>
> [OpenTracing|https://opentracing.io/] is a vendor neutral API and 
> infrastructure for distributed tracing. Many OSS tracers just as Jaeger, 
> OpenZipkin, Apache SkyWalking as well as commercial tools support OpenTracing 
> APIs. Ideally, we can implement it once and have integrations for popular 
> tracers like we have with metrics and prometheus.
> I'm aware of SOLR-9641 but HTrace has since retired from incubator for lack 
> of activity so this is a fresh attempt at solving this problem.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8833) Allow subclasses of MMapDirecory to preload individual IndexInputs

2019-06-06 Thread Simon Willnauer (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857525#comment-16857525
 ] 

Simon Willnauer commented on LUCENE-8833:
-

> what would the iocontext provide to base the preload decision on? just 
> curious.

Sure, the one I had in mind as an example is merge. I am not sure it makes a 
big difference; I was just wondering whether there are other signals than the file 
extension. 
I opened LUCENE-8835 to fix the file listing issue FileSwitchDirectory has.
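
As a rough sketch of the kind of policy this enables (the class name and extension 
choice below are assumptions, and the patch itself instead adds an overloaded 
protected openInput on MMapDirectory), one can already approximate per-file 
preloading by switching between two MMapDirectory instances:
{code:java}
import java.io.IOException;
import java.nio.file.Path;
import org.apache.lucene.store.FilterDirectory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexInput;
import org.apache.lucene.store.MMapDirectory;
import org.apache.lucene.util.IOUtils;

// Sketch only: preload doc values data files, map everything else lazily.
public class SelectivePreloadDirectory extends FilterDirectory {
  private final MMapDirectory preloading;

  public SelectivePreloadDirectory(Path path) throws IOException {
    super(new MMapDirectory(path));
    this.preloading = new MMapDirectory(path);
    this.preloading.setPreload(true);
  }

  @Override
  public IndexInput openInput(String name, IOContext context) throws IOException {
    // Example policy based on the file extension; an IOContext-based signal
    // (e.g. merging) could be plugged in here instead.
    return name.endsWith(".dvd") ? preloading.openInput(name, context)
                                 : in.openInput(name, context);
  }

  @Override
  public void close() throws IOException {
    IOUtils.close(preloading, in);
  }
}
{code}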

> Allow subclasses of MMapDirecory to preload individual IndexInputs
> --
>
> Key: LUCENE-8833
> URL: https://issues.apache.org/jira/browse/LUCENE-8833
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Simon Willnauer
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> I think it's useful for subclasses to select the preload flag on a per index 
> input basis rather than all or nothing. Here is a patch that has an 
> overloaded protected openInput method. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] thomaswoeckinger opened a new pull request #681: Fix SOLR-13347

2019-06-06 Thread GitBox
thomaswoeckinger opened a new pull request #681: Fix SOLR-13347
URL: https://github.com/apache/lucene-solr/pull/681
 
 
   Test are included in https://github.com/apache/lucene-solr/pull/665


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] thomaswoeckinger closed pull request #681: Fix SOLR-13347

2019-06-06 Thread GitBox
thomaswoeckinger closed pull request #681: Fix SOLR-13347
URL: https://github.com/apache/lucene-solr/pull/681
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] thomaswoeckinger commented on issue #681: Fix SOLR-13347

2019-06-06 Thread GitBox
thomaswoeckinger commented on issue #681: Fix SOLR-13347
URL: https://github.com/apache/lucene-solr/pull/681#issuecomment-499445719
 
 
   Updated commit message to be more specific


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] [lucene-solr] s1monw opened a new pull request #700: LUCENE-8835: Respect file extension when listing files form FileSwitchDirectory

2019-06-06 Thread GitBox
s1monw opened a new pull request #700: LUCENE-8835: Respect file extension when 
listing files form FileSwitchDirectory
URL: https://github.com/apache/lucene-solr/pull/700
 
 
   FileSwitchDirectory splits file actions between 2 directories based
   on file extensions. The extensions are respected on write operations
   like delete or create but ignored when we list the content of the
   directories. Until now we only deduplicated the contents on
   Directory#listAll, which can cause inconsistencies and hard-to-debug
   errors due to double deletions in IndexWriter if a file is pending
   delete in one of the directories but still shows up in the directory
   listing from the other directory. This case can happen if both
   directories point to the same underlying FS directory, which is a
   common use case to split between mmap and niofs.
   
   This change filters out files from directories depending on their
   file extension to make sure files that are deleted in one directory
   are not returned from another if they point to the same FS directory.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8835) Respect file extension when listing files form FileSwitchDirectory

2019-06-06 Thread Simon Willnauer (JIRA)
Simon Willnauer created LUCENE-8835:
---

 Summary: Respect file extension when listing files form 
FileSwitchDirectory
 Key: LUCENE-8835
 URL: https://issues.apache.org/jira/browse/LUCENE-8835
 Project: Lucene - Core
  Issue Type: Bug
Reporter: Simon Willnauer


FileSwitchDirectory splits file actions between 2 directories based on file 
extensions. The extensions are respected on write operations like delete or 
create but ignored when we list the content of the directories. Until now we 
only deduplicated the contents on Directory#listAll, which can cause 
inconsistencies and hard-to-debug errors due to double deletions in IndexWriter 
if a file is pending delete in one of the directories but still shows up in the 
directory listing from the other directory. This case can happen if both 
directories point to the same underlying FS directory, which is a common use case 
to split between mmap and niofs. 
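
The filtering idea, sketched outside of FileSwitchDirectory itself (the helper below 
is an illustration with assumed names, not the actual change): a name returned by a 
directory listing should only be kept if its extension says the file belongs to the 
directory being listed.
{code:java}
import java.util.Arrays;
import java.util.Set;
import org.apache.lucene.store.FileSwitchDirectory;

// Sketch only: keep primary-directory listings limited to the extensions the
// primary directory actually owns, so a file pending delete in one directory
// no longer leaks back in through the other directory's listing.
final class ExtensionOwnershipFilter {
  static String[] ownedByPrimary(String[] listed, Set<String> primaryExtensions) {
    return Arrays.stream(listed)
        .filter(name -> primaryExtensions.contains(FileSwitchDirectory.getExtension(name)))
        .toArray(String[]::new);
  }
}
{code}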



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857516#comment-16857516
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit 5d258df02bbc3ae6071f659d66bb435c98510f6a in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle_3 from Mark Robert Miller
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=5d258df ]

SOLR-13452: Tweak main build.gradle a bit.


> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: master (9.0)
>
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
>  
>  https://github.com/apache/lucene-solr/tree/jira/SOLR-13452_gradle_3



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-06-06 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857517#comment-16857517
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit c4ccba87ba33d50a00e039ad96afeb576b30d20c in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle_3 from Mark Robert Miller
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=c4ccba8 ]

SOLR-13452: Improve unused dep checker to not count on creating the dist tgz 
and zip first and check if jars are used by other dep jars even if not by the 
module itself.


> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: master (9.0)
>
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
>  
>  https://github.com/apache/lucene-solr/tree/jira/SOLR-13452_gradle_3



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-13496) NullPointerException in JSONWriter.writeSolrDocument

2019-06-06 Thread Christine Poerschke (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Christine Poerschke updated SOLR-13496:
---
Attachment: SOLR-13496.patch

> NullPointerException in JSONWriter.writeSolrDocument
> 
>
> Key: SOLR-13496
> URL: https://issues.apache.org/jira/browse/SOLR-13496
> Project: Solr
>  Issue Type: Bug
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-13496.patch, SOLR-13496.patch, SOLR-13496.patch
>
>
> For non-grouped searches 
> [QueryComponent.regularFinishStage|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java#L647-L655]
>  already considers the possibility of null {{SolrDocument}} values due to an 
> index change.
> For grouped searches 
> [GroupedEndResultTransformer.transform|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/search/grouping/endresulttransformer/GroupedEndResultTransformer.java#L94-L114]
>  potentially adds a null element to a {{SolrDocumentList}}.
> The 
> [TextResponseWriter.writeSolrDocumentList|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/TextResponseWriter.java#L170]
>  method passes any null {{SolrDocument}} through to the 
> [JSONWriter.writeSolrDocument|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/JSONWriter.java#L87]
>  method leading to a {{NullPointerException}} at line 87.
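
For illustration, a defensive guard along the following lines (an assumed helper, 
not the attached patch) would keep null entries away from the writer:
{code:java}
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrDocumentList;

// Sketch only: drop null SolrDocument entries that a grouped merge may have
// left behind, instead of letting them reach JSONWriter.writeSolrDocument.
final class NullSafeDocListSketch {
  static SolrDocumentList withoutNulls(SolrDocumentList docs) {
    SolrDocumentList filtered = new SolrDocumentList();
    filtered.setNumFound(docs.getNumFound());
    filtered.setStart(docs.getStart());
    filtered.setMaxScore(docs.getMaxScore());
    for (SolrDocument doc : docs) {
      if (doc != null) { // may be null if the index changed during the request
        filtered.add(doc);
      }
    }
    return filtered;
  }
}
{code}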



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-13496) NullPointerException in JSONWriter.writeSolrDocument

2019-06-06 Thread Christine Poerschke (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857506#comment-16857506
 ] 

Christine Poerschke commented on SOLR-13496:


bq. ... I think that's unrelated ... SOLR-13518 just opened to improve the 
debuggability of that test.

Actually the bot was right, and the SOLR-13518 change makes it easier to 
see why: it was basically saying that 
{{ChaoticDistributedGroupingTest.TargettedChaosComponent.getDescription}} 
returning {{null}} is not okay.

Will attach revised patch for a QA re-run (but as mentioned above am still not 
intending to commit the {{ChaoticDistributedGroupingTest}} test itself).

> NullPointerException in JSONWriter.writeSolrDocument
> 
>
> Key: SOLR-13496
> URL: https://issues.apache.org/jira/browse/SOLR-13496
> Project: Solr
>  Issue Type: Bug
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-13496.patch, SOLR-13496.patch
>
>
> For non-grouped searches 
> [QueryComponent.regularFinishStage|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java#L647-L655]
>  already considers the possibility of null {{SolrDocument}} values due to an 
> index change.
> For grouped searches 
> [GroupedEndResultTransformer.transform|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/search/grouping/endresulttransformer/GroupedEndResultTransformer.java#L94-L114]
>  potentially adds a null element to a {{SolrDocumentList}}.
> The 
> [TextResponseWriter.writeSolrDocumentList|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/TextResponseWriter.java#L170]
>  method passes any null {{SolrDocument}} through to the 
> [JSONWriter.writeSolrDocument|https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.1.1/solr/core/src/java/org/apache/solr/response/JSONWriter.java#L87]
>  method leading to a {{NullPointerException}} at line 87.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8829) TopDocs#Merge is Tightly Coupled To Number Of Collectors Involved

2019-06-06 Thread Atri Sharma (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8829?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16857501#comment-16857501
 ] 

Atri Sharma commented on LUCENE-8829:
-

Attached is an updated patch with additional tests. I was able to reproduce the 
issue in a standalone test where multiple IndexSearchers with different slices 
run the same query on the same set of documents.

 

[^LUCENE-8829.patch]
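
A sketch of the proposed tie-break (an illustrative comparator only, not the 
attached patch): order hits by descending score and break ties by ascending docID, 
so the final order no longer depends on how many collectors or slices produced the 
hits.
{code:java}
import java.util.Comparator;
import org.apache.lucene.search.ScoreDoc;

// Sketch only: score descending, then docID ascending instead of shardIndex.
final class ScoreThenDocOrder {
  static final Comparator<ScoreDoc> INSTANCE =
      Comparator.<ScoreDoc>comparingDouble(hit -> hit.score).reversed()
          .thenComparingInt(hit -> hit.doc);
}
{code}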

> TopDocs#Merge is Tightly Coupled To Number Of Collectors Involved
> -
>
> Key: LUCENE-8829
> URL: https://issues.apache.org/jira/browse/LUCENE-8829
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Atri Sharma
>Priority: Major
> Attachments: LUCENE-8829.patch, LUCENE-8829.patch
>
>
> While investigating LUCENE-8819, I understood that TopDocs#merge's order of 
> results are indirectly dependent on the number of collectors involved in the 
> merge. This is troubling because 1) The number of collectors involved in a 
> merge are cost based and directly dependent on the number of slices created 
> for the parallel searcher case. 2) TopN hits code path will invoke merge with 
> a single Collector, so essentially, doing the same TopN query with single 
> threaded and parallel threaded searcher will invoke different order of 
> results, which is a bad invariant that breaks.
>  
> The reason why this happens is because of the subtle way TopDocs#merge sets 
> shardIndex in the ScoreDoc population during populating the priority queue 
> used for merging. ShardIndex is essentially set to the ordinal of the 
> collector which generates the hit. This means that the shardIndex is 
> dependent on the number of collectors, even for the same set of hits.
>  
> In case of no sort order specified, shardIndex is used for tie breaking when 
> scores are equal. This translates to different orders for same hits with 
> different shardIndices.
>  
> I propose that we remove shardIndex from the default tie breaking mechanism 
> and replace it with docID. DocID order is the de facto that is expected 
> during collection, so it might make sense to use the same factor during tie 
> breaking when scores are the same.
>  
> CC: [~ivera]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8829) TopDocs#Merge is Tightly Coupled To Number Of Collectors Involved

2019-06-06 Thread Atri Sharma (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Atri Sharma updated LUCENE-8829:

Attachment: LUCENE-8829.patch

> TopDocs#Merge is Tightly Coupled To Number Of Collectors Involved
> -
>
> Key: LUCENE-8829
> URL: https://issues.apache.org/jira/browse/LUCENE-8829
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Atri Sharma
>Priority: Major
> Attachments: LUCENE-8829.patch, LUCENE-8829.patch
>
>
> While investigating LUCENE-8819, I understood that TopDocs#merge's order of 
> results are indirectly dependent on the number of collectors involved in the 
> merge. This is troubling because 1) The number of collectors involved in a 
> merge are cost based and directly dependent on the number of slices created 
> for the parallel searcher case. 2) TopN hits code path will invoke merge with 
> a single Collector, so essentially, doing the same TopN query with single 
> threaded and parallel threaded searcher will invoke different order of 
> results, which is a bad invariant that breaks.
>  
> The reason why this happens is because of the subtle way TopDocs#merge sets 
> shardIndex in the ScoreDoc population during populating the priority queue 
> used for merging. ShardIndex is essentially set to the ordinal of the 
> collector which generates the hit. This means that the shardIndex is 
> dependent on the number of collectors, even for the same set of hits.
>  
> In case of no sort order specified, shardIndex is used for tie breaking when 
> scores are equal. This translates to different orders for same hits with 
> different shardIndices.
>  
> I propose that we remove shardIndex from the default tie breaking mechanism 
> and replace it with docID. DocID order is the de facto that is expected 
> during collection, so it might make sense to use the same factor during tie 
> breaking when scores are the same.
>  
> CC: [~ivera]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org


