[JENKINS] Lucene-Solr-7.x-Windows (64bit/jdk-11) - Build # 894 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Windows/894/
Java: 64bit/jdk-11 -XX:+UseCompressedOops -XX:+UseG1GC

23 tests failed.
FAILED:  
org.apache.solr.ltr.store.rest.TestModelManagerPersistence.testFilePersistence

Error Message:
Software caused connection abort: recv failed

Stack Trace:
javax.net.ssl.SSLProtocolException: Software caused connection abort: recv 
failed
at 
__randomizedtesting.SeedInfo.seed([3AECF8F3BDC9B452:184D9C8E18A5D217]:0)
at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:126)
at 
java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:321)
at 
java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:264)
at 
java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:259)
at 
java.base/sun.security.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1314)
at 
java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:839)
at 
org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at 
org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at 
org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:282)
at 
org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at 
org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at 
org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at 
org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at 
org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:165)
at 
org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at 
org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at 
org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at 
org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at 
org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
at 
org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at 
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at 
org.apache.solr.util.RestTestHarness.getResponse(RestTestHarness.java:215)
at org.apache.solr.util.RestTestHarness.query(RestTestHarness.java:107)
at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:226)
at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:192)
at 
org.apache.solr.ltr.store.rest.TestModelManagerPersistence.testFilePersistence(TestModelManagerPersistence.java:168)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSing

[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread moshebla
Github user moshebla commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234882891
  
--- Diff: solr/core/src/test/org/apache/solr/search/TestReload.java ---
@@ -36,13 +36,13 @@ public void testGetRealtimeReload() throws Exception {
 
 assertU(commit("softCommit","true"));   // should cause a RTG searcher 
to be opened
 
-assertJQ(req("qt","/get","id","1")
+assertJQ(req("qt","/get","id","1", "fl", "id,_version_")
 ,"=={'doc':{'id':'1','_version_':" + version + "}}"
 );
 
 h.reload();
 
-assertJQ(req("qt","/get","id","1")
+assertJQ(req("qt","/get","id","1", "fl", "id,_version_")
--- End diff --

_root_ is now also added to the doc, so I added an fl param to filter it out,
since, IMO, testing _root_ seems out of scope for this test.
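
For illustration, a minimal sketch of the effect (the extra field value shown is an
assumption, not taken from test output): without the fl restriction the realtime-get
response would now also carry _root_, roughly

    {'doc':{'id':'1','_root_':'1','_version_':...}}

so fl narrows the response to exactly the fields the assertion checks:

    assertJQ(req("qt","/get","id","1", "fl", "id,_version_")
        ,"=={'doc':{'id':'1','_version_':" + version + "}}"
    );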


---




[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread moshebla
Github user moshebla commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234882621
  
--- Diff: 
solr/core/src/test/org/apache/solr/search/TestPseudoReturnFields.java ---
@@ -126,7 +126,7 @@ public void testAllRealFields() throws Exception {
   ,"//result/doc/str[@name='ssto']"
   ,"//result/doc/str[@name='subject']"
   
-  ,"//result/doc[count(*)=4]"
+  ,"//result/doc[count(*)=5]"
--- End diff --

The numbers were changed because _root_ is now also added to the doc,
increasing the field count by 1.


---




[JENKINS] Lucene-Solr-Tests-7.x - Build # 1033 - Unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.x/1033/

3 tests failed.
FAILED:  
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica

Error Message:
Expected new active leader null Live Nodes: [127.0.0.1:33746_solr, 
127.0.0.1:38612_solr, 127.0.0.1:38865_solr] Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/13)={
   "pullReplicas":"0",   "replicationFactor":"2",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node2":{   "core":"raceDeleteReplica_false_shard1_replica_n1",
   "base_url":"http://127.0.0.1:45558/solr";,   
"node_name":"127.0.0.1:45558_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   "leader":"true"},  
   "core_node6":{   
"core":"raceDeleteReplica_false_shard1_replica_n5",   
"base_url":"http://127.0.0.1:45558/solr";,   
"node_name":"127.0.0.1:45558_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   
"router":{"name":"compositeId"},   "maxShardsPerNode":"1",   
"autoAddReplicas":"false",   "nrtReplicas":"2",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Expected new active leader
null
Live Nodes: [127.0.0.1:33746_solr, 127.0.0.1:38612_solr, 127.0.0.1:38865_solr]
Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/13)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node2":{
  "core":"raceDeleteReplica_false_shard1_replica_n1",
  "base_url":"http://127.0.0.1:45558/solr";,
  "node_name":"127.0.0.1:45558_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "leader":"true"},
"core_node6":{
  "core":"raceDeleteReplica_false_shard1_replica_n5",
  "base_url":"http://127.0.0.1:45558/solr";,
  "node_name":"127.0.0.1:45558_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([BE834F3AB20EE7F4:D4952EEADAFCAD3E]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:280)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:334)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:230)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.ja

[JENKINS] Lucene-Solr-BadApples-7.x-Linux (64bit/jdk-11) - Build # 122 - Failure!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-BadApples-7.x-Linux/122/
Java: 64bit/jdk-11 -XX:+UseCompressedOops -XX:+UseG1GC

11 tests failed.
FAILED:  org.apache.solr.cloud.DeleteNodeTest.test

Error Message:
Could not find collection : deletenodetest_coll

Stack Trace:
org.apache.solr.common.SolrException: Could not find collection : 
deletenodetest_coll
at 
__randomizedtesting.SeedInfo.seed([C4FDC88A39C11D34:4CA9F750973D70CC]:0)
at 
org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:118)
at org.apache.solr.cloud.DeleteNodeTest.test(DeleteNodeTest.java:78)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)


FAILED:  
org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.testSslAndNoClientAuth

Error Message:
Could not find collection:second_collection

Stack Trace:
java.lang.AssertionError: Could not find collection:second_collection
at 
__randomizedtesting.SeedInfo.seed([C4FDC88A39C11D34:38471CBEC1E1ACFE]:0)
at org.juni

[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 382 - Unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/382/

6 tests failed.
FAILED:  
org.apache.lucene.codecs.lucene70.TestLucene70DocValuesFormat.testBooleanNumericsVsStoredFields

Error Message:
Test abandoned because suite timeout was reached.

Stack Trace:
java.lang.Exception: Test abandoned because suite timeout was reached.
at __randomizedtesting.SeedInfo.seed([91E778728438B4E4]:0)


FAILED:  
junit.framework.TestSuite.org.apache.lucene.codecs.lucene70.TestLucene70DocValuesFormat

Error Message:
Suite timeout exceeded (>= 720 msec).

Stack Trace:
java.lang.Exception: Suite timeout exceeded (>= 720 msec).
at __randomizedtesting.SeedInfo.seed([91E778728438B4E4]:0)


FAILED:  org.apache.solr.cloud.RecoveryAfterSoftCommitTest.test

Error Message:
KeeperErrorCode = Session expired for /clusterstate.json

Stack Trace:
org.apache.zookeeper.KeeperException$SessionExpiredException: KeeperErrorCode = 
Session expired for /clusterstate.json
at 
__randomizedtesting.SeedInfo.seed([5F79B21E6387A373:D72D8DC4CD7BCE8B]:0)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:130)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:54)
at org.apache.zookeeper.ZooKeeper.getData(ZooKeeper.java:1215)
at 
org.apache.solr.common.cloud.SolrZkClient.lambda$getData$5(SolrZkClient.java:341)
at 
org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:60)
at 
org.apache.solr.common.cloud.SolrZkClient.getData(SolrZkClient.java:341)
at 
org.apache.solr.common.cloud.ZkStateReader.refreshLegacyClusterState(ZkStateReader.java:572)
at 
org.apache.solr.common.cloud.ZkStateReader.forceUpdateCollection(ZkStateReader.java:376)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.updateMappingsFromZk(AbstractFullDistribZkTestBase.java:681)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.updateMappingsFromZk(AbstractFullDistribZkTestBase.java:676)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createJettys(AbstractFullDistribZkTestBase.java:471)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createServers(AbstractFullDistribZkTestBase.java:341)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1008)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.jav

[JENKINS] Lucene-Solr-7.x-Solaris (64bit/jdk1.8.0) - Build # 917 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Solaris/917/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

5 tests failed.
FAILED:  org.apache.solr.cloud.LIRRollingUpdatesTest.testNewLeaderOldReplica

Error Message:
IOException occured when talking to server at: 
http://127.0.0.1:33753/solr/testNewLeaderOldReplica

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: IOException occured when 
talking to server at: http://127.0.0.1:33753/solr/testNewLeaderOldReplica
at 
__randomizedtesting.SeedInfo.seed([25611D33796C6D53:6B7ABA34D8667A9C]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:657)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1260)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.realTimeGetDocId(LIRRollingUpdatesTest.java:409)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.assertDocExists(LIRRollingUpdatesTest.java:401)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.assertDocsExistInAllReplicas(LIRRollingUpdatesTest.java:387)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.testNewLeaderOldReplica(LIRRollingUpdatesTest.java:218)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 

[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread moshebla
Github user moshebla commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234872041
  
--- Diff: solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java 
---
@@ -194,20 +197,33 @@ public String getHashableId() {
   return null; // caller should call getLuceneDocument() instead
 }
 
-String rootId = getHashableId();
-
-boolean isVersion = version != 0;
+final String rootId = getHashableId();
+final boolean hasVersion = version != 0;
+final SolrInputField versionSif = hasVersion? 
solrDoc.get(CommonParams.VERSION_FIELD): null;
--- End diff --

OK, I changed this and ran SolrExampleTests, which seems to pass.
I'm currently running the solr-core test suite.


---




[jira] [Commented] (SOLR-12632) Completely remove Trie fields

2018-11-19 Thread Varun Thacker (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12632?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692678#comment-16692678
 ] 

Varun Thacker commented on SOLR-12632:
--

I did some basic benchmarking on my laptop against branch_7_6. The index has 25M 
documents in a two-shard collection.

Both top_level_pi and one_level_pi have a cardinality of 1M.
{code:java}
SolrInputDocument document = new SolrInputDocument();
document.addField("id", x*batchSize+j);
document.addField("top_level_pi", TestUtil.nextInt(r,0, 1000*1000));
document.addField("one_level_pi", TestUtil.nextInt(r,0, 1000*1000));

document.addField("top_level_ti", TestUtil.nextInt(r,0, 1000*1000));
document.addField("one_level_ti", TestUtil.nextInt(r,0, 1000*1000));

{code}
 
There were two types of queries that I ran: one with a count only, and one 
with a count plus a sum aggregation.

The totalRows is 11

*Simple Count*
{code:java}
1.
time : 38-41s
filterCache inserts : 54303
facet(gettingstarted,q="id:123*",buckets="top_level_pi,one_level_pi", 
bucketSorts="count(*) desc", bucketSizeLimit=-1, count(*))

2.
time : 6-9s
filterCache inserts : 54280
facet(gettingstarted,q="id:123*",buckets="top_level_ti,one_level_ti", 
bucketSorts="count(*) desc", bucketSizeLimit=-1, count(*))


Underlying Facet Query that the stream expression forms
{
    "top_level_ti": {
        "type": "terms",
        "field": "top_level_ti",
        "limit": 2147483647,
        "sort": {
            "count": "desc"
        },
        "facet": {
            "one_level_ti": {
                "type": "terms",
                "field": "one_level_ti",
                "limit": 2147483647,
                "sort": {
                    "count": "desc"
                },
                "facet": {}
            }
        }
    }
{code}
 

*Count + Sum Aggregation*
{code:java}
3.
time : 110s
filterCache inserts : 110044
facet(gettingstarted,q="id:123*",buckets="top_level_pi,one_level_pi", 
bucketSorts="count(*) desc", bucketSizeLimit=-1, count(*), sum(top_level_pi))

4.
time : 11-13s
filterCache inserts ; 110021
facet(gettingstarted,q="id:123*",buckets="top_level_ti,one_level_ti", 
bucketSorts="count(*) desc", bucketSizeLimit=-1, count(*), sum(top_level_ti))


Underlying Facet Query that the stream expression forms
{
    "top_level_pi": {
        "type": "terms",
        "field": "top_level_pi",
        "limit": 2147483647,
        "sort": {
            "count": "desc"
        },
        "facet": {
            "one_level_pi": {
                "type": "terms",
                "field": "one_level_pi",
                "limit": 2147483647,
                "sort": {
                    "count": "desc"
                },
                "facet": {
                    "facet_0": "sum(top_level_pi)"
                }
            }
        }
    }
}{code}
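 
For anyone wanting to reproduce these timings without the streaming-expression 
layer, here is a minimal SolrJ sketch that issues the underlying JSON facet request 
directly; the client URL is an assumption, and the collection/field names simply 
mirror the ones above.
{code:java}
// Minimal sketch (client URL assumed; collection/field names mirror the ones above)
// of running the underlying JSON facet request directly via SolrJ.
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class FacetBenchSketch {
  public static void main(String[] args) throws Exception {
    try (HttpSolrClient client =
        new HttpSolrClient.Builder("http://localhost:8983/solr/gettingstarted").build()) {
      SolrQuery q = new SolrQuery("id:123*");
      q.setRows(0); // only interested in the facets
      q.add("json.facet",
          "{ top_level_pi : { type:terms, field:top_level_pi, limit:2147483647, "
        + "    sort:{count:desc}, "
        + "    facet : { one_level_pi : { type:terms, field:one_level_pi, "
        + "      limit:2147483647, sort:{count:desc}, "
        + "      facet : { facet_0 : \"sum(top_level_pi)\" } } } } }");
      long startNs = System.nanoTime();
      QueryResponse rsp = client.query(q);
      System.out.println("QTime=" + rsp.getQTime() + "ms, wall="
          + (System.nanoTime() - startNs) / 1_000_000 + "ms");
    }
  }
}
{code}
Swapping the *_pi fields for the *_ti fields in the facet body gives the trie-field 
variant of the same query.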
 

A few observations:
 # For the same query, trie fields are a lot faster than point fields
 # When you add a sum aggregation to the same facet query, the time doubles
 # The filter cache inserts are extremely high. For nested facets with high 
cardinality this will simply nuke the filter cache without much reuse
 # For the same queries run without the filter cache, only query 3 was 
faster (by 30-50%); the rest took roughly the same time.

For reference, I ran the points query with Java Flight Recorder, and here's the 
main stack trace from method profiling:
{code:java}
org.apache.lucene.util.bkd.BKDReader$PackedIndexTree.pushLeft(BKDReader.java:410)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:786)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:797)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:787)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:787)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:797)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:787)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:787)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:787)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:797)
org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:533)
org.apache.lucene.search.PointRangeQuery$1$4.get(PointRangeQuery.java:299)
org.apache.lucene.search.PointRangeQuery$1.scorer(PointRangeQuery.java:323)
org.apache.lucene.search.Weight.bulkScorer(Weight.java:177)
org.apache.lucene.search.IndexOrDocValuesQuery$1.bulkScorer(IndexOrDocValuesQuery.java:138)
org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:667)
org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:471)
org.apache.solr.search.DocSetUtil.createDocSetGeneric(DocSetUtil.java:151)
org.apache.solr.search.DocSetUtil.createDocSet(DocSetUtil.java:140)
org.apache.solr.search.SolrIndexSearcher.getDocSetNC(SolrIndexSearcher.java:1178)
org.apache.solr.search.SolrIndexSearcher.getDocSet(SolrIndexSearcher.java:1213)
org.apache.solr.

[GitHub] lucene-solr issue #501: SOLR-5211

2018-11-19 Thread dsmiley
Github user dsmiley commented on the issue:

https://github.com/apache/lucene-solr/pull/501
  
Nice.
Do we even need the test config / schema at all?  If they don't add 
anything, we shouldn't have them.  Also, I see there is still some "Block" 
terminology added here that we can avoid in favor of "Nested".



---




[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread dsmiley
Github user dsmiley commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234869970
  
--- Diff: solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java 
---
@@ -194,20 +197,33 @@ public String getHashableId() {
   return null; // caller should call getLuceneDocument() instead
 }
 
-String rootId = getHashableId();
-
-boolean isVersion = version != 0;
+final String rootId = getHashableId();
+final boolean hasVersion = version != 0;
+final SolrInputField versionSif = hasVersion? 
solrDoc.get(CommonParams.VERSION_FIELD): null;
--- End diff --

What I meant was that *if* the solrDoc has a \_version\_ then we can copy 
it to all nested children.  We need not examine this.version at all.  At least 
this is my understanding from reading the code; it'd be good to ensure the tests pass.
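
A tiny sketch of that idea, as a fragment in the spirit of the diff above (the 
names are mine, not the actual patch):

    // if the parent doc carries _version_, propagate it to the (direct) child docs;
    // deeper nesting would need recursion, and this.version is never consulted
    SolrInputField versionSif = solrDoc.get(CommonParams.VERSION_FIELD);
    if (versionSif != null && solrDoc.getChildDocuments() != null) {
      for (SolrInputDocument child : solrDoc.getChildDocuments()) {
        child.put(CommonParams.VERSION_FIELD, versionSif);
      }
    }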


---




[JENKINS] Lucene-Solr-BadApples-master-Linux (64bit/jdk-11) - Build # 123 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-BadApples-master-Linux/123/
Java: 64bit/jdk-11 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

73 tests failed.
FAILED:  org.apache.solr.cloud.SSLMigrationTest.test

Error Message:
Replica didn't have the proper urlScheme in the ClusterState

Stack Trace:
java.lang.AssertionError: Replica didn't have the proper urlScheme in the 
ClusterState
at 
__randomizedtesting.SeedInfo.seed([84F73154BCAA9F7B:CA30E8E1256F283]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.cloud.SSLMigrationTest.assertReplicaInformation(SSLMigrationTest.java:104)
at 
org.apache.solr.cloud.SSLMigrationTest.testMigrateSSL(SSLMigrationTest.java:97)
at org.apache.solr.cloud.SSLMigrationTest.test(SSLMigrationTest.java:61)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1010)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate

[jira] [Assigned] (SOLR-12632) Completely remove Trie fields

2018-11-19 Thread Varun Thacker (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12632?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Varun Thacker reassigned SOLR-12632:


Assignee: (was: Varun Thacker)

> Completely remove Trie fields
> -
>
> Key: SOLR-12632
> URL: https://issues.apache.org/jira/browse/SOLR-12632
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Steve Rowe
>Priority: Blocker
>  Labels: numeric-tries-to-points
> Fix For: master (8.0)
>
>
> Trie fields were deprecated in Solr 7.0.  We should remove them completely 
> before we release Solr 8.0.
> Unresolved points-related issues: 
> [https://jira.apache.org/jira/issues/?jql=project=SOLR+AND+labels=numeric-tries-to-points+AND+resolution=unresolved]






[jira] [Assigned] (SOLR-12632) Completely remove Trie fields

2018-11-19 Thread Varun Thacker (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12632?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Varun Thacker reassigned SOLR-12632:


Assignee: Varun Thacker

> Completely remove Trie fields
> -
>
> Key: SOLR-12632
> URL: https://issues.apache.org/jira/browse/SOLR-12632
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Steve Rowe
>Assignee: Varun Thacker
>Priority: Blocker
>  Labels: numeric-tries-to-points
> Fix For: master (8.0)
>
>
> Trie fields were deprecated in Solr 7.0.  We should remove them completely 
> before we release Solr 8.0.
> Unresolved points-related issues: 
> [https://jira.apache.org/jira/issues/?jql=project=SOLR+AND+labels=numeric-tries-to-points+AND+resolution=unresolved]






[jira] [Commented] (SOLR-12992) Avoid creating Strings from BytesRef in ExportWriter

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692611#comment-16692611
 ] 

ASF subversion and git services commented on SOLR-12992:


Commit 25bca6f16513fda0bdd2ab670633bac26dbf5d6e in lucene-solr's branch 
refs/heads/master from [~noble.paul]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=25bca6f ]

SOLR-12992: When using binary format, ExportWriter to directly copy BytesRef 
instead of creating new String


> Avoid creating Strings from BytesRef in ExportWriter 
> -
>
> Key: SOLR-12992
> URL: https://issues.apache.org/jira/browse/SOLR-12992
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Noble Paul
>Assignee: Noble Paul
>Priority: Major
> Attachments: SOLR-12992.patch
>
>







[jira] [Updated] (SOLR-12992) Avoid creating Strings from BytesRef in ExportWriter

2018-11-19 Thread Noble Paul (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Noble Paul updated SOLR-12992:
--
Parent Issue: SOLR-12885  (was: SOLR-12924)

> Avoid creating Strings from BytesRef in ExportWriter 
> -
>
> Key: SOLR-12992
> URL: https://issues.apache.org/jira/browse/SOLR-12992
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Noble Paul
>Assignee: Noble Paul
>Priority: Major
> Fix For: 7.7
>
> Attachments: SOLR-12992.patch
>
>







[jira] [Resolved] (SOLR-12992) Avoid creating Strings from BytesRef in ExportWriter

2018-11-19 Thread Noble Paul (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Noble Paul resolved SOLR-12992.
---
   Resolution: Fixed
Fix Version/s: 7.7

> Avoid creating Strings from BytesRef in ExportWriter 
> -
>
> Key: SOLR-12992
> URL: https://issues.apache.org/jira/browse/SOLR-12992
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Noble Paul
>Assignee: Noble Paul
>Priority: Major
> Fix For: 7.7
>
> Attachments: SOLR-12992.patch
>
>







[jira] [Commented] (SOLR-12992) Avoid creating Strings from BytesRef in ExportWriter

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692613#comment-16692613
 ] 

ASF subversion and git services commented on SOLR-12992:


Commit a4e95f39be9d7f107b9027eb950775dc59406932 in lucene-solr's branch 
refs/heads/branch_7x from [~noble.paul]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=a4e95f3 ]

SOLR-12992: When using binary format, ExportWriter to directly copy BytesRef 
instead of creating new String


> Avoid creating Strings from BytesRef in ExportWriter 
> -
>
> Key: SOLR-12992
> URL: https://issues.apache.org/jira/browse/SOLR-12992
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Noble Paul
>Assignee: Noble Paul
>Priority: Major
> Attachments: SOLR-12992.patch
>
>







[jira] [Commented] (SOLR-12999) Index replication could delete segments first

2018-11-19 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692606#comment-16692606
 ] 

David Smiley commented on SOLR-12999:
-

As Michael hints at indirectly, the motivating use-case is especially SolrCloud, 
where full replication is needed to get a replica back in sync when it starts up 
after having been down.  This is what I have in mind, anyway.  In this 
circumstance, I am hoping and assuming a SolrIndexSearcher would not even be 
open, since the replica isn't searchable in this case (right?).
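
To make the idea concrete, a rough sketch of the delete-first step under the 
assumptions above (the names are mine, not IndexFetcher's actual code, and it 
presumes no SolrIndexSearcher or IndexWriter is holding the old files open):
{code:java}
// Rough sketch (assumed names): before fetching, drop local index files that the
// leader's file list no longer references, freeing disk space up front.
import java.io.IOException;
import java.util.Set;
import org.apache.lucene.store.Directory;

public class DeleteFirstSketch {
  /** remoteFileNames would come from the leader's filelist response. */
  static void deleteUnreferencedLocalFiles(Directory dir, Set<String> remoteFileNames)
      throws IOException {
    for (String localFile : dir.listAll()) {
      if (!remoteFileNames.contains(localFile)) {
        dir.deleteFile(localFile); // not part of the index we are about to download
      }
    }
  }
}
{code}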

> Index replication could delete segments first
> -
>
> Key: SOLR-12999
> URL: https://issues.apache.org/jira/browse/SOLR-12999
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: replication (java)
>Reporter: David Smiley
>Priority: Major
>
> Index replication could optionally delete files that it knows will not be 
> needed _first_.  This would reduce the disk capacity requirements of Solr, and it 
> would reduce some disk fragmentation when space gets tight.
> Solr (IndexFetcher) already grabs the remote file list, and it could see 
> which files it has locally, then delete the others.  Today it asks Lucene to 
> {{deleteUnusedFiles}} at the end.  This new mode would only be useful if 
> there is no SolrIndexSearcher open, since it would prevent the removal of 
> files.






[JENKINS] Lucene-Solr-master-Solaris (64bit/jdk1.8.0) - Build # 2166 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/2166/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseParallelGC

3 tests failed.
FAILED:  
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica

Error Message:
Expected new active leader null Live Nodes: [127.0.0.1:35070_solr, 
127.0.0.1:52461_solr, 127.0.0.1:56479_solr] Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/13)={
   "pullReplicas":"0",   "replicationFactor":"2",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node4":{   "core":"raceDeleteReplica_false_shard1_replica_n2",
   "base_url":"http://127.0.0.1:62213/solr";,   
"node_name":"127.0.0.1:62213_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   "leader":"true"},  
   "core_node6":{   
"core":"raceDeleteReplica_false_shard1_replica_n5",   
"base_url":"http://127.0.0.1:62213/solr";,   
"node_name":"127.0.0.1:62213_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   
"router":{"name":"compositeId"},   "maxShardsPerNode":"1",   
"autoAddReplicas":"false",   "nrtReplicas":"2",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Expected new active leader
null
Live Nodes: [127.0.0.1:35070_solr, 127.0.0.1:52461_solr, 127.0.0.1:56479_solr]
Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/13)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node4":{
  "core":"raceDeleteReplica_false_shard1_replica_n2",
  "base_url":"http://127.0.0.1:62213/solr";,
  "node_name":"127.0.0.1:62213_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "leader":"true"},
"core_node6":{
  "core":"raceDeleteReplica_false_shard1_replica_n5",
  "base_url":"http://127.0.0.1:62213/solr";,
  "node_name":"127.0.0.1:62213_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([68A7209D997AB4D9:2B1414DF188FE13]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:280)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:328)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:224)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsea

[jira] [Commented] (SOLR-11501) Depending on the parser, QParser should not parse local-params

2018-11-19 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692602#comment-16692602
 ] 

David Smiley commented on SOLR-11501:
-

I should have stated that it can help with keeping Solr secure.  This was on my mind 
when there was the hack involving activating the XMLQParserPlugin.  When the 
developers building the search experience choose a particular query parser to 
parse _user queries_ (be it edismax or whatever), I think it is an unpleasant 
surprise to learn that devious users can switch out the query parser for 
something else.  What I've done in apps up to this point is to deliberately 
prepend a space in front of the user's query to avoid such unwanted hacking.
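
For illustration, a small SolrJ sketch of the safer pattern (the param and field 
names are examples, not from the patch): keep the untrusted text in its own 
parameter and dereference it, so it is never scanned for local-params; the 
leading-space trick mentioned above is shown as a comment.
{code:java}
// Illustrative sketch (param/field names are examples, not from the patch).
import org.apache.solr.client.solrj.SolrQuery;

public class SafeUserQuerySketch {
  public static SolrQuery build(String userInput) {
    SolrQuery q = new SolrQuery();
    q.set("q", "{!edismax qf=title v=$uq}"); // parser fixed by the developer
    q.set("uq", userInput);                  // user text, never parsed as local-params
    // older defensive trick: prepend a space so the text can't start with "{!"
    // q.set("q", " " + userInput);
    return q;
  }
}
{code}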

It's true that this can be a breaking change but I thought it was acceptable 
given the recent security incident and that I feel this is a bug, not a feature.

Is there a particular scenario that caused you to stumble over this (be it 
tinkering or more real-world) and could you elaborate?

> Depending on the parser, QParser should not parse local-params
> --
>
> Key: SOLR-11501
> URL: https://issues.apache.org/jira/browse/SOLR-11501
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Reporter: David Smiley
>Assignee: David Smiley
>Priority: Major
> Fix For: 7.2
>
> Attachments: SOLR_11501_limit_local_params_parsing.patch, 
> SOLR_11501_limit_local_params_parsing.patch
>
>
> Solr should not parse local-params (and thus be able to switch the query 
> parser) in certain circumstances.  _Perhaps_ it is when the QParser.getParser 
> is passed "lucene" for the {{defaultParser}}?  This particular approach is 
> just a straw-man; I suspect certain valid embedded queries could no longer 
> work if this is done incorrectly.  Whatever the solution, I don't think we 
> should assume 'q' is special, as it's valid and useful to build up queries 
> containing user input in other ways, e.g. {{q= +field:value +\{!dismax 
> v=$qq\}&qq=user input}}  and we want to protect the user input there 
> similarly from unwelcome query parsing switching.






[jira] [Commented] (SOLR-11501) Depending on the parser, QParser should not parse local-params

2018-11-19 Thread Will Currie (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692601#comment-16692601
 ] 

Will Currie commented on SOLR-11501:


IIRC the origin was this CVE 
[http://mail-archives.apache.org/mod_mbox/lucene-dev/201710.mbox/%3CCAJEmKoC%2BeQdP-E6BKBVDaR_43fRs1A-hOLO3JYuemmUcr1R%2BTA%40mail.gmail.com%3E]

A "little bobby tables" of solr's query language [https://xkcd.com/327/]

The thinking being there's probably more bugs now or in the future where that 
one came from

> unwelcome query parsing switching

> Depending on the parser, QParser should not parse local-params
> --
>
> Key: SOLR-11501
> URL: https://issues.apache.org/jira/browse/SOLR-11501
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Reporter: David Smiley
>Assignee: David Smiley
>Priority: Major
> Fix For: 7.2
>
> Attachments: SOLR_11501_limit_local_params_parsing.patch, 
> SOLR_11501_limit_local_params_parsing.patch
>
>
> Solr should not parse local-params (and thus be able to switch the query 
> parser) in certain circumstances.  _Perhaps_ it is when the QParser.getParser 
> is passed "lucene" for the {{defaultParser}}?  This particular approach is 
> just a straw-man; I suspect certain valid embedded queries could no longer 
> work if this is done incorrectly.  Whatever the solution, I don't think we 
> should assume 'q' is special, as it's valid and useful to build up queries 
> containing user input in other ways, e.g. {{q= +field:value +\{!dismax 
> v=$qq\}&qq=user input}}  and we want to protect the user input there 
> similarly from unwelcome query parsing switching.






[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk-10.0.1) - Build # 3121 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/3121/
Java: 64bit/jdk-10.0.1 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

6 tests failed.
FAILED:  org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting

Error Message:
Error from server at 
https://127.0.0.1:44469/solr/collection1_shard2_replica_n3: Expected mime type 
application/octet-stream but got text/html. Error 404 
Can not find: /solr/collection1_shard2_replica_n3/update  
HTTP ERROR 404 Problem accessing 
/solr/collection1_shard2_replica_n3/update. Reason: Can not find: 
/solr/collection1_shard2_replica_n3/update
Powered by Jetty:// 9.4.11.v20180605 (http://eclipse.org/jetty)
  

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:44469/solr/collection1_shard2_replica_n3: Expected 
mime type application/octet-stream but got text/html. 


Error 404 Can not find: 
/solr/collection1_shard2_replica_n3/update

HTTP ERROR 404
Problem accessing /solr/collection1_shard2_replica_n3/update. Reason:
Can not find: /solr/collection1_shard2_replica_n3/update
Powered by Jetty:// 9.4.11.v20180605 (http://eclipse.org/jetty)




at 
__randomizedtesting.SeedInfo.seed([FA80E1B3B62772BC:3837DDDBB56782C4]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:269)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsea

[JENKINS] Lucene-Solr-7.x-MacOSX (64bit/jdk1.8.0) - Build # 942 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-MacOSX/942/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

4 tests failed.
FAILED:  
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica

Error Message:
Expected new active leader null Live Nodes: [127.0.0.1:49688_solr, 
127.0.0.1:49689_solr, 127.0.0.1:49690_solr] Last available state: 
DocCollection(raceDeleteReplica_true//collections/raceDeleteReplica_true/state.json/14)={
   "pullReplicas":"0",   "replicationFactor":"2",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node4":{   "core":"raceDeleteReplica_true_shard1_replica_n2", 
  "base_url":"http://127.0.0.1:49691/solr";,   
"node_name":"127.0.0.1:49691_solr",   "state":"down",   
"type":"NRT",   "leader":"true"}, "core_node6":{   
"core":"raceDeleteReplica_true_shard1_replica_n5",   
"base_url":"http://127.0.0.1:49691/solr";,   
"node_name":"127.0.0.1:49691_solr",   "state":"down",   
"type":"NRT"}, "core_node3":{   
"core":"raceDeleteReplica_true_shard1_replica_n1",   
"base_url":"http://127.0.0.1:49689/solr";,   
"node_name":"127.0.0.1:49689_solr",   "state":"down",   
"type":"NRT",   "router":{"name":"compositeId"},   "maxShardsPerNode":"1",  
 "autoAddReplicas":"false",   "nrtReplicas":"2",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Expected new active leader
null
Live Nodes: [127.0.0.1:49688_solr, 127.0.0.1:49689_solr, 127.0.0.1:49690_solr]
Last available state: 
DocCollection(raceDeleteReplica_true//collections/raceDeleteReplica_true/state.json/14)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node4":{
  "core":"raceDeleteReplica_true_shard1_replica_n2",
  "base_url":"http://127.0.0.1:49691/solr";,
  "node_name":"127.0.0.1:49691_solr",
  "state":"down",
  "type":"NRT",
  "leader":"true"},
"core_node6":{
  "core":"raceDeleteReplica_true_shard1_replica_n5",
  "base_url":"http://127.0.0.1:49691/solr";,
  "node_name":"127.0.0.1:49691_solr",
  "state":"down",
  "type":"NRT"},
"core_node3":{
  "core":"raceDeleteReplica_true_shard1_replica_n1",
  "base_url":"http://127.0.0.1:49689/solr";,
  "node_name":"127.0.0.1:49689_solr",
  "state":"down",
  "type":"NRT",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([FD8F42273B98DC71:979923F7536A96BB]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:280)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:334)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:229)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carro

[JENKINS] Lucene-Solr-NightlyTests-7.6 - Build # 7 - Still unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.6/7/

1 tests failed.
FAILED:  
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testParallelUpdateQTime

Error Message:
Error from server at http://127.0.0.1:46331/solr/collection1_shard2_replica_n2: 
Expected mime type application/octet-stream but got text/html. Error 404: Can not 
find: /solr/collection1_shard2_replica_n2/update. HTTP ERROR 404. Problem accessing 
/solr/collection1_shard2_replica_n2/update. Reason: Can not find: 
/solr/collection1_shard2_replica_n2/update. Powered by Jetty:// 9.4.11.v20180605

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at http://127.0.0.1:46331/solr/collection1_shard2_replica_n2: Expected 
mime type application/octet-stream but got text/html. 


Error 404 Can not find: 
/solr/collection1_shard2_replica_n2/update

HTTP ERROR 404
Problem accessing /solr/collection1_shard2_replica_n2/update. Reason:
Can not find: /solr/collection1_shard2_replica_n2/update
Powered by Jetty:// 9.4.11.v20180605




at 
__randomizedtesting.SeedInfo.seed([1907D5A04E56AB99:F7DFAE3100297E0A]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testParallelUpdateQTime(CloudSolrClientTest.java:146)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:5

[jira] [Comment Edited] (SOLR-11501) Depending on the parser, QParser should not parse local-params

2018-11-19 Thread Joel Bernstein (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692513#comment-16692513
 ] 

Joel Bernstein edited comment on SOLR-11501 at 11/20/18 1:17 AM:
-

I'd like to revisit the reasoning behind this ticket and consider reverting. In 
the description of the ticket it says:

"Solr should not parse local-params (and thus be able to switch the query 
parser) in certain circumstances. "

1) What is the reasoning behind this statement?

2) Why are we breaking backcompat like this in a minor release?


was (Author: joel.bernstein):
I'd like to revisit the reasoning behind this ticket and consider reverting. In 
the description of the ticket it says:

"Solr should not parse local-params (and thus be able to switch the query 
parser) in certain circumstances. "

What is the reasoning behind this statement?

> Depending on the parser, QParser should not parse local-params
> --
>
> Key: SOLR-11501
> URL: https://issues.apache.org/jira/browse/SOLR-11501
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Reporter: David Smiley
>Assignee: David Smiley
>Priority: Major
> Fix For: 7.2
>
> Attachments: SOLR_11501_limit_local_params_parsing.patch, 
> SOLR_11501_limit_local_params_parsing.patch
>
>
> Solr should not parse local-params (and thus be able to switch the query 
> parser) in certain circumstances.  _Perhaps_ it is when the QParser.getParser 
> is passed "lucene" for the {{defaultParser}}?  This particular approach is 
> just a straw-man; I suspect certain valid embedded queries could no longer 
> work if this is done incorrectly.  Whatever the solution, I don't think we 
> should assume 'q' is special, as it's valid and useful to build up queries 
> containing user input in other ways, e.g. {{q= +field:value +\{!dismax 
> v=$qq\}&qq=user input}}  and we want to protect the user input there 
> similarly from unwelcome query parsing switching.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11501) Depending on the parser, QParser should not parse local-params

2018-11-19 Thread Joel Bernstein (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692513#comment-16692513
 ] 

Joel Bernstein commented on SOLR-11501:
---

I'd like to revisit the reasoning behind this ticket and consider reverting. In 
the description of the ticket it says:

"Solr should not parse local-params (and thus be able to switch the query 
parser) in certain circumstances. "

What is the reasoning behind this statement?

> Depending on the parser, QParser should not parse local-params
> --
>
> Key: SOLR-11501
> URL: https://issues.apache.org/jira/browse/SOLR-11501
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Reporter: David Smiley
>Assignee: David Smiley
>Priority: Major
> Fix For: 7.2
>
> Attachments: SOLR_11501_limit_local_params_parsing.patch, 
> SOLR_11501_limit_local_params_parsing.patch
>
>
> Solr should not parse local-params (and thus be able to switch the query 
> parser) in certain circumstances.  _Perhaps_ it is when the QParser.getParser 
> is passed "lucene" for the {{defaultParser}}?  This particular approach is 
> just a straw-man; I suspect certain valid embedded queries could no longer 
> work if this is done incorrectly.  Whatever the solution, I don't think we 
> should assume 'q' is special, as it's valid and useful to build up queries 
> containing user input in other ways, e.g. {{q= +field:value +\{!dismax 
> v=$qq\}&qq=user input}}  and we want to protect the user input there 
> similarly from unwelcome query parsing switching.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-11501) Depending on the parser, QParser should not parse local-params

2018-11-19 Thread Joel Bernstein (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692513#comment-16692513
 ] 

Joel Bernstein edited comment on SOLR-11501 at 11/20/18 1:17 AM:
-

I'd like to revisit the reasoning behind this ticket and consider reverting. In 
the description of the ticket it says:

"Solr should not parse local-params (and thus be able to switch the query 
parser) in certain circumstances. "

1) What is the reasoning behind this statement?

2) Why are we breaking backcompat like this in a minor release?


was (Author: joel.bernstein):
I'd like to revisit the reasoning behind this ticket and consider reverting. In 
the description of the ticket it says:

"Solr should not parse local-params (and thus be able to switch the query 
parser) in certain circumstances. "

1) What is the reasoning behind this statement?

2) Why are we breaking backcompat like this in minor release?

> Depending on the parser, QParser should not parse local-params
> --
>
> Key: SOLR-11501
> URL: https://issues.apache.org/jira/browse/SOLR-11501
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Reporter: David Smiley
>Assignee: David Smiley
>Priority: Major
> Fix For: 7.2
>
> Attachments: SOLR_11501_limit_local_params_parsing.patch, 
> SOLR_11501_limit_local_params_parsing.patch
>
>
> Solr should not parse local-params (and thus be able to switch the query 
> parser) in certain circumstances.  _Perhaps_ it is when the QParser.getParser 
> is passed "lucene" for the {{defaultParser}}?  This particular approach is 
> just a straw-man; I suspect certain valid embedded queries could no longer 
> work if this is done incorrectly.  Whatever the solution, I don't think we 
> should assume 'q' is special, as it's valid and useful to build up queries 
> containing user input in other ways, e.g. {{q= +field:value +\{!dismax 
> v=$qq\}&qq=user input}}  and we want to protect the user input there 
> similarly from unwelcome query parsing switching.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8566) Deprecate methods in CustomAnalyzer.Builder which take factory classes

2018-11-19 Thread Tomoko Uchida (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8566?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692515#comment-16692515
 ] 

Tomoko Uchida commented on LUCENE-8566:
---

Is there any feedback? :)

I will try to explain some background. (I'm not a fluent English speaker, so 
excuse me if I make mistakes.)
I have recently learned about the concept of the Service Provider Interface; 
Java 9's module system uses it to decouple interfaces from implementations 
(though it seems Lucene does not use the JDK's ServiceLoader but its own special 
loader class). I did not know much about service providers until quite recently, 
but I like the essential concept, and I feel it is a good idea to limit the way 
the factories, which are often provided by third-party jars, can be looked up to 
names only.

I think Lucene users, and also Solr, should use service names to instantiate the 
factories in the future, but first Lucene itself should use only the names and 
encourage its users to do the same. So I consider this a small step toward 
unifying the use of the factories.

Just for information, my studies and thoughts started from a comment in 
this issue: https://issues.apache.org/jira/browse/LUCENE-8453

bq. We should also document the SPI name of each factory.


But I do not want to mess up the issue board (maybe because of my lack of 
understanding of the details or priorities), so I will close this in a few weeks 
if there are no comments here.
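
As a rough illustration of the two styles being discussed (a sketch only, not a 
patch), the builder can already be driven either by factory classes or by SPI 
names:

{code:java}
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.core.LowerCaseFilterFactory;
import org.apache.lucene.analysis.custom.CustomAnalyzer;
import org.apache.lucene.analysis.standard.StandardTokenizerFactory;

public class CustomAnalyzerStyles {
  public static void main(String[] args) throws Exception {
    // Class-based style: the overloads this issue proposes to deprecate.
    Analyzer byClass = CustomAnalyzer.builder()
        .withTokenizer(StandardTokenizerFactory.class)
        .addTokenFilter(LowerCaseFilterFactory.class)
        .build();

    // Name-based style: factories are looked up by their SPI names, so the
    // caller does not depend on concrete implementation classes.
    Analyzer byName = CustomAnalyzer.builder()
        .withTokenizer("standard")
        .addTokenFilter("lowercase")
        .build();

    byClass.close();
    byName.close();
  }
}
{code}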

 

> Deprecate methods in CustomAnalyzer.Builder which take factory classes
> --
>
> Key: LUCENE-8566
> URL: https://issues.apache.org/jira/browse/LUCENE-8566
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/analysis
>Reporter: Tomoko Uchida
>Priority: Minor
>
> CustomAnalyzer.Builder has methods which take implementation classes as 
> follows.
>  - withTokenizer(Class<? extends TokenizerFactory> factory, String... params)
>  - withTokenizer(Class<? extends TokenizerFactory> factory, Map<String,String> params)
>  - addTokenFilter(Class<? extends TokenFilterFactory> factory, String... params)
>  - addTokenFilter(Class<? extends TokenFilterFactory> factory, Map<String,String> params)
>  - addCharFilter(Class<? extends CharFilterFactory> factory, String... params)
>  - addCharFilter(Class<? extends CharFilterFactory> factory, Map<String,String> params)
> Since the builder also has methods which take service names, the above methods 
> seem unnecessary and a little bit misleading. Giving symbolic names is 
> preferable to naming implementation factory classes, but for now, users can 
> write code that depends on the implementation classes.
> What do you think about deprecating those methods (adding {{@Deprecated}} 
> annotations) and deleting them in a future release? They are called only by 
> test cases, so deleting them should have no impact on the current lucene/solr 
> codebase.
> If this proposal gains your consent, I will create a patch. (Let me know if I 
> missed some point; if so, I'll close it.)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-7.6-Linux (64bit/jdk-9.0.4) - Build # 4 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/4/
Java: 64bit/jdk-9.0.4 -XX:-UseCompressedOops -XX:+UseG1GC

13 tests failed.
FAILED:  
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica

Error Message:
Expected new active leader null Live Nodes: [127.0.0.1:34859_solr, 
127.0.0.1:36007_solr, 127.0.0.1:38693_solr] Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/11)={
   "pullReplicas":"0",   "replicationFactor":"2",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node2":{   "core":"raceDeleteReplica_false_shard1_replica_n1",
   "base_url":"http://127.0.0.1:42811/solr";,   
"node_name":"127.0.0.1:42811_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   "leader":"true"},  
   "core_node6":{   
"core":"raceDeleteReplica_false_shard1_replica_n5",   
"base_url":"http://127.0.0.1:42811/solr";,   
"node_name":"127.0.0.1:42811_solr",   "state":"down",   
"type":"NRT",   "force_set_state":"false",   
"router":{"name":"compositeId"},   "maxShardsPerNode":"1",   
"autoAddReplicas":"false",   "nrtReplicas":"2",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Expected new active leader
null
Live Nodes: [127.0.0.1:34859_solr, 127.0.0.1:36007_solr, 127.0.0.1:38693_solr]
Last available state: 
DocCollection(raceDeleteReplica_false//collections/raceDeleteReplica_false/state.json/11)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node2":{
  "core":"raceDeleteReplica_false_shard1_replica_n1",
  "base_url":"http://127.0.0.1:42811/solr";,
  "node_name":"127.0.0.1:42811_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "leader":"true"},
"core_node6":{
  "core":"raceDeleteReplica_false_shard1_replica_n5",
  "base_url":"http://127.0.0.1:42811/solr";,
  "node_name":"127.0.0.1:42811_solr",
  "state":"down",
  "type":"NRT",
  "force_set_state":"false",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([8D179D416F2264E8:E701FC9107D02E22]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:280)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:334)
at 
org.apache.solr.cloud.DeleteReplicaTest.raceConditionOnDeleteAndRegisterReplica(DeleteReplicaTest.java:230)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.eval

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-9) - Build # 4936 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/4936/
Java: 64bit/jdk-9 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

1 tests failed.
FAILED:  
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned

Error Message:
Error from server at 
https://127.0.0.1:58974/solr/collection1_shard2_replica_n2: Expected mime type 
application/octet-stream but got text/html. Error 404: Can not find: 
/solr/collection1_shard2_replica_n2/update. HTTP ERROR 404. Problem accessing 
/solr/collection1_shard2_replica_n2/update. Reason: Can not find: 
/solr/collection1_shard2_replica_n2/update. Powered by Jetty:// 9.4.11.v20180605

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:58974/solr/collection1_shard2_replica_n2: Expected 
mime type application/octet-stream but got text/html. 


Error 404 Can not find: 
/solr/collection1_shard2_replica_n2/update

HTTP ERROR 404
Problem accessing /solr/collection1_shard2_replica_n2/update. Reason:
Can not find: /solr/collection1_shard2_replica_n2/update
Powered by Jetty:// 9.4.11.v20180605




at 
__randomizedtesting.SeedInfo.seed([2AFB140BAE4DB508:D23DED3E3D542DC0]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned(CloudSolrClientTest.java:725)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
 

[jira] [Assigned] (SOLR-6595) Improve error response in case distributed collection cmd fails

2018-11-19 Thread Jason Gerlowski (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-6595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jason Gerlowski reassigned SOLR-6595:
-

Assignee: Jason Gerlowski

> Improve error response in case distributed collection cmd fails
> ---
>
> Key: SOLR-6595
> URL: https://issues.apache.org/jira/browse/SOLR-6595
> Project: Solr
>  Issue Type: Bug
>  Components: SolrCloud
>Affects Versions: 4.10
> Environment: SolrCloud with Client SSL
>Reporter: Sindre Fiskaa
>Assignee: Jason Gerlowski
>Priority: Minor
>
> Followed the description at 
> https://cwiki.apache.org/confluence/display/solr/Enabling+SSL and generated a 
> self-signed key pair. Configured a few Solr nodes and used the Collections API 
> to create a new collection. -I get an error message when I specify the nodes 
> with the createNodeSet param. When I don't use the createNodeSet param the 
> collection gets created without error on random nodes. Could this be a bug 
> related to the createNodeSet param?- *Update: It failed due to what turned out 
> to be an invalid client certificate on the overseer, and returned the following 
> response:*
> {code:xml}
> <response>
>   <lst name="responseHeader"><int name="status">0</int><int name="QTime">185</int></lst>
>   <lst name="failure">
>     <str>org.apache.solr.client.solrj.SolrServerException:IOException occured 
> when talking to server at: https://vt-searchln04:443/solr</str>
>   </lst>
> </response>
> {code}
> *Update: Three problems:*
> # Status=0 when the cmd did not succeed (only ZK was updated, but the cores were 
> not created due to failing to connect to the shard nodes to talk to the core 
> admin API).
> # The error printed does not tell which action failed. It would be helpful to 
> either get the msg from the original exception or at least some message 
> saying "Failed to create core, see log on Overseer".
> # State of the collection is not clean, since it exists as far as ZK is concerned 
> but the cores were not created. Thus retrying the CREATECOLLECTION cmd would fail. 
> Should the Overseer detect errors in distributed cmds and roll back changes 
> already made in ZK?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-6595) Improve error response in case distributed collection cmd fails

2018-11-19 Thread Jason Gerlowski (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-6595?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692329#comment-16692329
 ] 

Jason Gerlowski commented on SOLR-6595:
---

Wanted to check in on this and see which of the original concerns are still 
issues:

bq. Status=0 when the cmd did not succeed
Still a problem, though it will soon be fixed for CREATE, the reporter's 
original example here.

bq. The error printed does not tell which action failed
Still a problem, but a hard one: it's tough to guess which bits in the 
exception chain are the helpful bits.  The top and root of the chain are the 
most likely entries to be interesting, but not always.  Any truncation of the 
exception chain is going to reduce the chance we're conveying the important 
part.

bq. State of collection is not clean since it exists as far as ZK is concerned 
but cores not created
This _should_ have already been fixed in SOLR-8983.

So I'd argue that fixing the {{status}} property should be our main goal.  To 
that end, I've attached a patch fixing this problem for CREATE on SOLR-5970.  I 
don't like the narrowness of that fix, though, so I will spend some time seeing if 
there's a way it can be generalized at a different level of our collection API 
processing.  Going to assign this to myself.
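
For reference, a minimal SolrJ sketch of the caller's side (the ZooKeeper address 
and collection/config names are placeholders, and I'm assuming 
CollectionAdminResponse.isSuccess()/getErrorMessages() behave as documented): 
because responseHeader/status can be 0 even when a node-level step failed, a 
client has to look at the failure section as well.

{code:java}
import java.util.Collections;
import java.util.Optional;

import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;
import org.apache.solr.client.solrj.response.CollectionAdminResponse;

public class CreateCollectionCheck {
  public static void main(String[] args) throws Exception {
    // Hypothetical ZooKeeper address; adjust for a real cluster.
    try (CloudSolrClient client = new CloudSolrClient.Builder(
        Collections.singletonList("localhost:9983"), Optional.empty()).build()) {

      CollectionAdminResponse rsp = CollectionAdminRequest
          .createCollection("mycollection", "_default", 1, 2)
          .setCreateNodeSet("127.0.0.1:8983_solr")   // the param from the original report
          .process(client);

      // responseHeader/status may still be 0 when a core could not be created,
      // which is exactly the problem described in this issue.
      System.out.println("status   = " + rsp.getStatus());
      System.out.println("success  = " + rsp.isSuccess());
      System.out.println("failures = " + rsp.getErrorMessages());
    }
  }
}
{code}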

> Improve error response in case distributed collection cmd fails
> ---
>
> Key: SOLR-6595
> URL: https://issues.apache.org/jira/browse/SOLR-6595
> Project: Solr
>  Issue Type: Bug
>  Components: SolrCloud
>Affects Versions: 4.10
> Environment: SolrCloud with Client SSL
>Reporter: Sindre Fiskaa
>Priority: Minor
>
> Followed the description at 
> https://cwiki.apache.org/confluence/display/solr/Enabling+SSL and generated a 
> self-signed key pair. Configured a few Solr nodes and used the Collections API 
> to create a new collection. -I get an error message when I specify the nodes 
> with the createNodeSet param. When I don't use the createNodeSet param the 
> collection gets created without error on random nodes. Could this be a bug 
> related to the createNodeSet param?- *Update: It failed due to what turned out 
> to be an invalid client certificate on the overseer, and returned the following 
> response:*
> {code:xml}
> <response>
>   <lst name="responseHeader"><int name="status">0</int><int name="QTime">185</int></lst>
>   <lst name="failure">
>     <str>org.apache.solr.client.solrj.SolrServerException:IOException occured 
> when talking to server at: https://vt-searchln04:443/solr</str>
>   </lst>
> </response>
> {code}
> *Update: Three problems:*
> # Status=0 when the cmd did not succeed (only ZK was updated, but the cores were 
> not created due to failing to connect to the shard nodes to talk to the core 
> admin API).
> # The error printed does not tell which action failed. It would be helpful to 
> either get the msg from the original exception or at least some message 
> saying "Failed to create core, see log on Overseer".
> # State of the collection is not clean, since it exists as far as ZK is concerned 
> but the cores were not created. Thus retrying the CREATECOLLECTION cmd would fail. 
> Should the Overseer detect errors in distributed cmds and roll back changes 
> already made in ZK?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-7.6-Windows (64bit/jdk-10.0.1) - Build # 2 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Windows/2/
Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

3 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.TriggerIntegrationTest.testTriggerThrottling

Error Message:
Both triggers should have fired by now

Stack Trace:
java.lang.AssertionError: Both triggers should have fired by now
at 
__randomizedtesting.SeedInfo.seed([6C3DF63E732F6A64:971F5E1BA18589F6]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.autoscaling.TriggerIntegrationTest.testTriggerThrottling(TriggerIntegrationTest.java:270)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:844)


FAILED:  
org.apache.solr.cloud.autoscaling.TriggerIntegrationTest.testTriggerThrottling

Error Message:
Both triggers should have fired by now

Stack Trace:
java.lang.AssertionError: Both triggers should have fired by now
at 
__randomizedtesting.SeedInfo.seed([6C3DF63E732F6A64:971F5E1BA18589F6]:0)
   

[jira] [Commented] (SOLR-6714) Collection RELOAD returns 200 even when some shards fail to reload -- other APIs with similar problems?

2018-11-19 Thread Jason Gerlowski (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-6714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692283#comment-16692283
 ] 

Jason Gerlowski commented on SOLR-6714:
---

Fixing RELOAD's status seems like a specific part of SOLR-6595, which aims to 
make "status" more meaningful across our Collection Admin APIs as a whole.

> Collection RELOAD returns 200 even when some shards fail to reload -- other 
> APIs with similar problems?
> ---
>
> Key: SOLR-6714
> URL: https://issues.apache.org/jira/browse/SOLR-6714
> Project: Solr
>  Issue Type: Bug
>  Components: SolrCloud
>Reporter: Hoss Man
>Priority: Major
>
> Using 4.10.2, if you start up a simple 2-node cloud with...
> {noformat}
> ./bin/solr start -e cloud -noprompt
> {noformat}
> And then try to force a situation where a replica is hosed like this...
> {noformat}
> rm -rf node1/solr/gettingstarted_shard1_replica1/*
> chmod a-rw node1/solr/gettingstarted_shard1_replica1
> {noformat}
> The result of a Collection RELOAD command is still a success...
> {noformat}
> curl -sS -D - 
> 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=gettingstarted'
> HTTP/1.1 200 OK
> Content-Type: application/xml; charset=UTF-8
> Transfer-Encoding: chunked
> 
> 
> 0 name="QTime">1866 name="127.0.1.1:8983_solr">org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException:Error
>  handling 'reload' action name="127.0.1.1:8983_solr"> name="status">01631 name="127.0.1.1:7574_solr"> name="status">01710 name="127.0.1.1:7574_solr"> name="status">01795
> 
> {noformat}
> The HTTP status code of collection-level APIs should not be 200 if any of the 
> underlying requests that it depends on results in 4xx or 5xx errors.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12746) Ref Guide HTML output should adhere to more standard HTML5

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692261#comment-16692261
 ] 

ASF subversion and git services commented on SOLR-12746:


Commit bbe9511894340df14b938565cf3575540ca2f21b in lucene-solr's branch 
refs/heads/branch_7_6 from [~ctargett]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=bbe9511 ]

SOLR-12746: Fix formatting for callout list numbers and preamble sections


> Ref Guide HTML output should adhere to more standard HTML5
> --
>
> Key: SOLR-12746
> URL: https://issues.apache.org/jira/browse/SOLR-12746
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Cassandra Targett
>Assignee: Cassandra Targett
>Priority: Major
> Fix For: 7.6, master (8.0)
>
>
> The default HTML produced by Jekyll/Asciidoctor adds a lot of extra {{<p>}} 
> tags to the content which break up our content into very small chunks. This 
> is acceptable to a casual website reader as far as it goes, but any Reader 
> view in a browser or another type of content extraction system that uses a 
> similar "readability" scoring algorithm is going to either miss a lot of 
> content or fail to display the page entirely.
> To see what I mean, take a page like 
> https://lucene.apache.org/solr/guide/7_4/language-analysis.html and enable 
> Reader View in your browser (I used Firefox; Steve Rowe told me offline 
> Safari would not even offer the option on the page for him). You will notice 
> a lot of missing content. It's almost like someone selected sentences at 
> random.
> Asciidoctor has a long-standing issue to provide a better more 
> semantic-oriented HTML5 output, but it has not been resolved yet: 
> https://github.com/asciidoctor/asciidoctor/issues/242
> Asciidoctor does provide a way to override the default output templates by 
> providing your own in Slim, HAML, ERB or any other template language 
> supported by Tilt (none of which I know yet). There are some samples 
> available via the Asciidoctor project which we can borrow, but it's otherwise 
> unknown as of yet what parts of the output are causing the worst of the 
> problems. This issue is to explore how to fix it to improve this part of the 
> HTML reading experience.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12746) Ref Guide HTML output should adhere to more standard HTML5

2018-11-19 Thread Cassandra Targett (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692271#comment-16692271
 ] 

Cassandra Targett commented on SOLR-12746:
--

The change to remove dependence on the {{asciidoctor-html5s}} gem broke the 
ability for numbers in callout lists to be formatted properly. The most recent 
commit before this comment fixes those. 

I also noticed I had missed the formatting of "preamble" or "lead" sections on 
each page - it's supposed to be slightly larger. I added the new class/element 
structure to the CSS definitions and that fixed them.

> Ref Guide HTML output should adhere to more standard HTML5
> --
>
> Key: SOLR-12746
> URL: https://issues.apache.org/jira/browse/SOLR-12746
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Cassandra Targett
>Assignee: Cassandra Targett
>Priority: Major
> Fix For: 7.6, master (8.0)
>
>
> The default HTML produced by Jekyll/Asciidoctor adds a lot of extra {{<p>}} 
> tags to the content which break up our content into very small chunks. This 
> is acceptable to a casual website reader as far as it goes, but any Reader 
> view in a browser or another type of content extraction system that uses a 
> similar "readability" scoring algorithm is going to either miss a lot of 
> content or fail to display the page entirely.
> To see what I mean, take a page like 
> https://lucene.apache.org/solr/guide/7_4/language-analysis.html and enable 
> Reader View in your browser (I used Firefox; Steve Rowe told me offline 
> Safari would not even offer the option on the page for him). You will notice 
> a lot of missing content. It's almost like someone selected sentences at 
> random.
> Asciidoctor has a long-standing issue to provide a better more 
> semantic-oriented HTML5 output, but it has not been resolved yet: 
> https://github.com/asciidoctor/asciidoctor/issues/242
> Asciidoctor does provide a way to override the default output templates by 
> providing your own in Slim, HAML, ERB or any other template language 
> supported by Tilt (none of which I know yet). There are some samples 
> available via the Asciidoctor project which we can borrow, but it's otherwise 
> unknown as of yet what parts of the output are causing the worst of the 
> problems. This issue is to explore how to fix it to improve this part of the 
> HTML reading experience.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12746) Ref Guide HTML output should adhere to more standard HTML5

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692259#comment-16692259
 ] 

ASF subversion and git services commented on SOLR-12746:


Commit d37554e88ecfab2c274bbfd185856b8f737ea350 in lucene-solr's branch 
refs/heads/branch_7x from [~ctargett]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=d37554e ]

SOLR-12746: Fix formatting for callout list numbers and preamble sections


> Ref Guide HTML output should adhere to more standard HTML5
> --
>
> Key: SOLR-12746
> URL: https://issues.apache.org/jira/browse/SOLR-12746
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Cassandra Targett
>Assignee: Cassandra Targett
>Priority: Major
> Fix For: 7.6, master (8.0)
>
>
> The default HTML produced by Jekyll/Asciidoctor adds a lot of extra {{<p>}} 
> tags to the content which break up our content into very small chunks. This 
> is acceptable to a casual website reader as far as it goes, but any Reader 
> view in a browser or another type of content extraction system that uses a 
> similar "readability" scoring algorithm is going to either miss a lot of 
> content or fail to display the page entirely.
> To see what I mean, take a page like 
> https://lucene.apache.org/solr/guide/7_4/language-analysis.html and enable 
> Reader View in your browser (I used Firefox; Steve Rowe told me offline 
> Safari would not even offer the option on the page for him). You will notice 
> a lot of missing content. It's almost like someone selected sentences at 
> random.
> Asciidoctor has a long-standing issue to provide a better more 
> semantic-oriented HTML5 output, but it has not been resolved yet: 
> https://github.com/asciidoctor/asciidoctor/issues/242
> Asciidoctor does provide a way to override the default output templates by 
> providing your own in Slim, HAML, ERB or any other template language 
> supported by Tilt (none of which I know yet). There are some samples 
> available via the Asciidoctor project which we can borrow, but it's otherwise 
> unknown as of yet what parts of the output are causing the worst of the 
> problems. This issue is to explore how to fix it to improve this part of the 
> HTML reading experience.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12746) Ref Guide HTML output should adhere to more standard HTML5

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692256#comment-16692256
 ] 

ASF subversion and git services commented on SOLR-12746:


Commit 4efaecac34159ee1b718c548530d20d80c1fb4bf in lucene-solr's branch 
refs/heads/master from [~ctargett]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=4efaeca ]

SOLR-12746: Fix formatting for callout list numbers and preamble sections


> Ref Guide HTML output should adhere to more standard HTML5
> --
>
> Key: SOLR-12746
> URL: https://issues.apache.org/jira/browse/SOLR-12746
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Cassandra Targett
>Assignee: Cassandra Targett
>Priority: Major
> Fix For: 7.6, master (8.0)
>
>
> The default HTML produced by Jekyll/Asciidoctor adds a lot of extra {{<p>}} 
> tags to the content which break up our content into very small chunks. This 
> is acceptable to a casual website reader as far as it goes, but any Reader 
> view in a browser or another type of content extraction system that uses a 
> similar "readability" scoring algorithm is going to either miss a lot of 
> content or fail to display the page entirely.
> To see what I mean, take a page like 
> https://lucene.apache.org/solr/guide/7_4/language-analysis.html and enable 
> Reader View in your browser (I used Firefox; Steve Rowe told me offline 
> Safari would not even offer the option on the page for him). You will notice 
> a lot of missing content. It's almost like someone selected sentences at 
> random.
> Asciidoctor has a long-standing issue to provide a better more 
> semantic-oriented HTML5 output, but it has not been resolved yet: 
> https://github.com/asciidoctor/asciidoctor/issues/242
> Asciidoctor does provide a way to override the default output templates by 
> providing your own in Slim, HAML, ERB or any other template language 
> supported by Tilt (none of which I know yet). There are some samples 
> available via the Asciidoctor project which we can borrow, but it's otherwise 
> unknown as of yet what parts of the output are causing the worst of the 
> problems. This issue is to explore how to fix it to improve this part of the 
> HTML reading experience.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1701 - Still unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1701/

1 tests failed.
FAILED:  org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting

Error Message:
Error from server at 
https://127.0.0.1:42197/solr/collection1_shard2_replica_n3: Expected mime type 
application/octet-stream but got text/html. Error 404: Can not find: 
/solr/collection1_shard2_replica_n3/update. HTTP ERROR 404. Problem accessing 
/solr/collection1_shard2_replica_n3/update. Reason: Can not find: 
/solr/collection1_shard2_replica_n3/update. Powered by Jetty:// 9.4.11.v20180605

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:42197/solr/collection1_shard2_replica_n3: Expected 
mime type application/octet-stream but got text/html. 


Error 404 Can not find: 
/solr/collection1_shard2_replica_n3/update

HTTP ERROR 404
Problem accessing /solr/collection1_shard2_replica_n3/update. Reason:
Can not find: /solr/collection1_shard2_replica_n3/update
Powered by Jetty:// 9.4.11.v20180605




at 
__randomizedtesting.SeedInfo.seed([81050F4FF2CF67A5:43B23327F18F97DD]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:238)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java

[jira] [Commented] (SOLR-12994) search engine for my site just as page title lookup in solr reference.

2018-11-19 Thread Cassandra Targett (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12994?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692224#comment-16692224
 ] 

Cassandra Targett commented on SOLR-12994:
--

This doesn't need to be a private issue; private issues are reserved for 
security vulnerabilities while the Lucene/Solr PMC figures out how to address 
them in Lucene and Solr.

However, this does not report a bug in Solr itself, so the question is better 
suited for the solr-user mailing list. You can find information about 
subscribing here: 
https://lucene.apache.org/solr/community.html#mailing-lists-irc.

As a short answer, that is not a search box in the sense you are thinking of. 
It is a page title lookup that finds matching keywords in a JSON file 
generated when the Ref Guide's static HTML pages are built. Duplicating it 
requires Jekyll and Liquid templates, but the source code is available in 
Solr's repository: 
https://git1-us-west.apache.org/repos/asf?p=lucene-solr.git;a=tree;f=solr/solr-ref-guide;hb=refs/heads/master.

That said, I think you will be better served by asking other Solr users on the 
solr-user mailing list how they have integrated search into their sites.
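
For illustration only, the lookup boils down to something like the sketch 
below: a keyword match over a prebuilt list of page titles. The data shape and 
field names (title, url) here are assumptions for the example, not the Ref 
Guide's actual JSON format.

{code:java}
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;
import java.util.Map;

public class TitleLookupSketch {
  // Stand-in for the JSON title index that the Ref Guide build generates.
  static final List<Map<String, String>> PAGES = List.of(
      Map.of("title", "Language Analysis", "url", "language-analysis.html"),
      Map.of("title", "About This Guide", "url", "about-this-guide.html"));

  // Return the pages whose title contains every keyword, case-insensitively.
  static List<Map<String, String>> lookup(String query) {
    String[] keywords = query.toLowerCase(Locale.ROOT).split("\\s+");
    List<Map<String, String>> hits = new ArrayList<>();
    for (Map<String, String> page : PAGES) {
      String title = page.get("title").toLowerCase(Locale.ROOT);
      boolean matchesAll = true;
      for (String keyword : keywords) {
        if (!title.contains(keyword)) {
          matchesAll = false;
          break;
        }
      }
      if (matchesAll) {
        hits.add(page);
      }
    }
    return hits;
  }

  public static void main(String[] args) {
    // Prints the "Language Analysis" entry.
    System.out.println(lookup("language"));
  }
}
{code}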

> search engine for my site just as page title lookup in solr reference.
> --
>
> Key: SOLR-12994
> URL: https://issues.apache.org/jira/browse/SOLR-12994
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: JSON Request API, search, SolrCloud
>Affects Versions: 7.5
> Environment: I am working on mac.
>Reporter: Rohan
>Priority: Major
>  Labels: features
> Attachments: Screenshot 2018-11-16 at 5.56.12 PM.png
>
>
> I need to create a search engine for my site, which has many different 
> insights with descriptions in it. It should be something like the search box 
> provided by Solr in the reference website. I need some help to guide me 
> through the different steps. I don't understand how it is working, because I 
> have uploaded some JSON docs to my core and performed a query, but I am not 
> getting any results. 
>  
> This is the link I was talking about: 
> https://lucene.apache.org/solr/guide/7_2/about-this-guide.html



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (SOLR-12994) search engine for my site just as page title lookup in solr reference.

2018-11-19 Thread Cassandra Targett (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cassandra Targett resolved SOLR-12994.
--
Resolution: Invalid

> search engine for my site just as page title lookup in solr reference.
> --
>
> Key: SOLR-12994
> URL: https://issues.apache.org/jira/browse/SOLR-12994
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: JSON Request API, search, SolrCloud
>Affects Versions: 7.5
> Environment: I am working on mac.
>Reporter: Rohan
>Priority: Major
>  Labels: features
> Attachments: Screenshot 2018-11-16 at 5.56.12 PM.png
>
>
> I need to create a search engine for my site, which has many different 
> insights with descriptions in it. It should be something like the search box 
> provided by Solr in the reference website. I need some help to guide me 
> through the different steps. I don't understand how it is working, because I 
> have uploaded some JSON docs to my core and performed a query, but I am not 
> getting any results. 
>  
> This is the link I was talking about: 
> https://lucene.apache.org/solr/guide/7_2/about-this-guide.html



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12994) search engine for my site just as page title lookup in solr reference.

2018-11-19 Thread Cassandra Targett (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cassandra Targett updated SOLR-12994:
-
 Security: Public  (was: Private (Security Issue))
Fix Version/s: (was: 7.5)

> search engine for my site just as page title lookup in solr reference.
> --
>
> Key: SOLR-12994
> URL: https://issues.apache.org/jira/browse/SOLR-12994
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: JSON Request API, search, SolrCloud
>Affects Versions: 7.5
> Environment: I am working on mac.
>Reporter: Rohan
>Priority: Major
>  Labels: features
> Attachments: Screenshot 2018-11-16 at 5.56.12 PM.png
>
>
> I need to create a search engine for my site, which has many different 
> insights with descriptions in it. It should be something like the search box 
> provided by Solr in the reference website. I need some help to guide me 
> through the different steps. I don't understand how it is working, because I 
> have uploaded some JSON docs to my core and performed a query, but I am not 
> getting any results. 
>  
> This is the link I was talking about: 
> https://lucene.apache.org/solr/guide/7_2/about-this-guide.html



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Windows (64bit/jdk-10.0.1) - Build # 7628 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/7628/
Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

4 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.TriggerSetPropertiesIntegrationTest.testSetProperties

Error Message:


Stack Trace:
java.lang.AssertionError
at 
__randomizedtesting.SeedInfo.seed([CA537A75A1F8FBEB:A137AD3C12D56E6F]:0)
at org.junit.Assert.fail(Assert.java:92)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertTrue(Assert.java:54)
at 
org.apache.solr.cloud.autoscaling.TriggerSetPropertiesIntegrationTest.testSetProperties(TriggerSetPropertiesIntegrationTest.java:111)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:844)


FAILED:  
org.apache.solr.cloud.autoscaling.TriggerSetPropertiesIntegrationTest.testSetProperties

Error Message:


Stack Trace:
java.lang.AssertionError
at 
__randomizedtesting.SeedInfo.seed([CA537A75A1F8FBEB:A137AD3C12D56E6F]:0)
   

[JENKINS] Lucene-Solr-SmokeRelease-7.x - Build # 378 - Still Failing

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/378/

No tests ran.

Build Log:
[...truncated 23447 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2440 links (1991 relative) to 3217 anchors in 247 files
 [echo] Validated Links & Anchors via: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
 [copy] Copying 4 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/package/changes

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/package/solr-7.7.0.tgz
 into 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:c

[jira] [Resolved] (SOLR-12981) Better support JSON faceting responses in SolrJ

2018-11-19 Thread Jason Gerlowski (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12981?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jason Gerlowski resolved SOLR-12981.

   Resolution: Fixed
Fix Version/s: 7.7
   master (8.0)

> Better support JSON faceting responses in SolrJ
> ---
>
> Key: SOLR-12981
> URL: https://issues.apache.org/jira/browse/SOLR-12981
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: clients - java, SolrJ
>Affects Versions: 7.5, master (8.0)
>Reporter: Jason Gerlowski
>Assignee: Jason Gerlowski
>Priority: Major
> Fix For: master (8.0), 7.7
>
> Attachments: SOLR-12981.patch, SOLR-12981.patch
>
>
> SOLR-12947 created JsonQueryRequest to make using the JSON request API easier 
> in SolrJ.  SOLR-12965 is adding faceting support to this request object.  
> This subtask of SOLR-12965 involves providing a way to parse the JSON 
> faceting responses into easy-to-use SolrJ objects.  
> Currently the only option for users is to manipulate the underlying NamedList 
> directly.  We should create a "JsonFacetingResponse" in the model of 
> ClusteringResponse, SuggesterResponse, TermsResponse, etc. and add an 
> accessor to {{QueryResponse}} for getting at the faceting results.
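
As a rough illustration of the status quo described above (manipulating the 
underlying NamedList directly), pulling facet buckets out of a response today 
looks roughly like the sketch below. The facet name "categories" and the 
bucket layout are assumptions about one particular request, not a fixed SolrJ 
API.

{code:java}
import java.util.List;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.util.NamedList;

public class RawJsonFacetAccess {
  // Print bucket counts for a hypothetical "categories" terms facet.
  @SuppressWarnings("unchecked")
  static void printBuckets(QueryResponse rsp) {
    NamedList<Object> facets = (NamedList<Object>) rsp.getResponse().get("facets");
    NamedList<Object> categories = (NamedList<Object>) facets.get("categories");
    List<NamedList<Object>> buckets = (List<NamedList<Object>>) categories.get("buckets");
    for (NamedList<Object> bucket : buckets) {
      System.out.println(bucket.get("val") + " -> " + bucket.get("count"));
    }
  }
}
{code}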



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (SOLR-12965) Add JSON faceting support to SolrJ

2018-11-19 Thread Jason Gerlowski (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12965?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jason Gerlowski resolved SOLR-12965.

   Resolution: Fixed
Fix Version/s: 7.7
   master (8.0)

> Add JSON faceting support to SolrJ
> --
>
> Key: SOLR-12965
> URL: https://issues.apache.org/jira/browse/SOLR-12965
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: clients - java, SolrJ
>Affects Versions: 7.5, master (8.0)
>Reporter: Jason Gerlowski
>Assignee: Jason Gerlowski
>Priority: Major
> Fix For: master (8.0), 7.7
>
> Attachments: SOLR-12965.patch, SOLR-12965.patch, SOLR-12965.patch, 
> SOLR-12965.patch
>
>
> SOLR-12947 created {{JsonQueryRequest}}, a SolrJ class that makes it easier 
> for users to make JSON-api requests in their Java/SolrJ code.  Currently this 
> class is missing any sort of faceting capabilities (I'd held off on adding 
> this as a part of SOLR-12947 just to keep the issues smaller).
> This JIRA covers adding that missing faceting capability.
> There are a few ways we could handle it, but my first attempt at adding 
> faceting support will probably have users specify a Map for 
> each facet that they wish to add, similar to how complex queries were 
> supported in SOLR-12947.  This approach has some pros and cons:
> The benefit is how general the approach is: our interface stays resilient to 
> any future changes to the JSON API syntax, and users can build facets 
> that I'd never thought to explicitly test.  The downside is that it doesn't 
> offer much abstraction for users who are unfamiliar with our JSON syntax; 
> they still have to know the JSON "schema" to build a map representing their 
> facet.  But in practice we can probably mitigate this by providing 
> "facet builders" or other helper classes that supply this abstraction in 
> the common case.
> Hope to have a skeleton patch up soon.
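
A minimal sketch of the Map-based approach described above. The withFacet 
method name is an assumption based on the proposal (the committed API may 
differ), and the base URL and collection name are placeholders; the facet Map 
itself simply mirrors the JSON Facet API syntax.

{code:java}
import java.util.HashMap;
import java.util.Map;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.json.JsonQueryRequest;
import org.apache.solr.client.solrj.response.QueryResponse;

public class MapFacetSketch {
  public static void main(String[] args) throws Exception {
    // Facet expressed as a raw Map, mirroring the JSON Facet API syntax.
    Map<String, Object> categoriesFacet = new HashMap<>();
    categoriesFacet.put("type", "terms");
    categoriesFacet.put("field", "cat");
    categoriesFacet.put("limit", 5);

    JsonQueryRequest request = new JsonQueryRequest()
        .setQuery("*:*")
        .withFacet("categories", categoriesFacet); // method name assumed, per the proposal above

    try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
      QueryResponse response = request.process(client, "techproducts");
      System.out.println(response.getResponse().get("facets"));
    }
  }
}
{code}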



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-7.6-Linux (64bit/jdk-12-ea+12) - Build # 3 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/3/
Java: 64bit/jdk-12-ea+12 -XX:-UseCompressedOops -XX:+UseSerialGC

52 tests failed.
FAILED:  org.apache.solr.client.solrj.io.sql.JdbcTest.testMapReduceAggregation

Error Message:
java.sql.SQLException: java.io.IOException: --> 
https://127.0.0.1:33041/solr/collection1_collection_shard2_replica_n2/:Failed 
to execute sqlQuery 'select a_s, sum(a_f) from collection1 group by a_s order 
by sum(a_f) desc' against JDBC connection 'jdbc:calcitesolr:'. Error while 
executing SQL "select a_s, sum(a_f) from collection1 group by a_s order by 
sum(a_f) desc": java.io.IOException: java.util.concurrent.ExecutionException: 
java.io.IOException: --> 
https://127.0.0.1:33041/solr/collection1_collection_shard2_replica_n2/:java.util.concurrent.ExecutionException:
 java.io.IOException: params 
fl=a_s,a_f&q=*:*&wt=javabin&qt=/export&partitionKeys=a_s&sort=a_s+desc&distrib=false

Stack Trace:
java.sql.SQLException: java.sql.SQLException: java.io.IOException: --> 
https://127.0.0.1:33041/solr/collection1_collection_shard2_replica_n2/:Failed 
to execute sqlQuery 'select a_s, sum(a_f) from collection1 group by a_s order 
by sum(a_f) desc' against JDBC connection 'jdbc:calcitesolr:'.
Error while executing SQL "select a_s, sum(a_f) from collection1 group by a_s 
order by sum(a_f) desc": java.io.IOException: 
java.util.concurrent.ExecutionException: java.io.IOException: --> 
https://127.0.0.1:33041/solr/collection1_collection_shard2_replica_n2/:java.util.concurrent.ExecutionException:
 java.io.IOException: params 
fl=a_s,a_f&q=*:*&wt=javabin&qt=/export&partitionKeys=a_s&sort=a_s+desc&distrib=false
at 
__randomizedtesting.SeedInfo.seed([D89BEBAFD54A081C:4B37B94DC73400C]:0)
at 
org.apache.solr.client.solrj.io.sql.StatementImpl.executeQueryImpl(StatementImpl.java:74)
at 
org.apache.solr.client.solrj.io.sql.StatementImpl.executeQuery(StatementImpl.java:111)
at 
org.apache.solr.client.solrj.io.sql.JdbcTest.testMapReduceAggregation(JdbcTest.java:246)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadow

[jira] [Commented] (SOLR-12999) Index replication could delete segments first

2018-11-19 Thread Michael Braun (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692096#comment-16692096
 ] 

Michael Braun commented on SOLR-12999:
--

On my old project we forked IndexFetcher to do something similar: when full 
replication was called for (so we would be deleting every single file once it 
was done), we pointed the index at a newly created dummy directory and made 
sure the Searcher was off the existing one before triggering a cleanup on the 
original index directory, which caused the now-unneeded underlying files to be 
deleted.

[~jason.j.b...@gmail.com] can probably add more to this.

> Index replication could delete segments first
> -
>
> Key: SOLR-12999
> URL: https://issues.apache.org/jira/browse/SOLR-12999
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: replication (java)
>Reporter: David Smiley
>Priority: Major
>
> Index replication could optionally delete files that it knows will not be 
> needed _first_.  This would reduce Solr's disk capacity requirements, and it 
> would reduce some disk fragmentation when space gets tight.
> Solr (IndexFetcher) already grabs the remote file list, and it could see 
> which files it has locally, then delete the others.  Today it asks Lucene to 
> {{deleteUnusedFiles}} at the end.  This new mode would only be useful if 
> there is no SolrIndexSearcher open, since an open searcher would prevent the 
> removal of those files.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8562) Speed up merging segments of points with data dimensions

2018-11-19 Thread Adrien Grand (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692087#comment-16692087
 ] 

Adrien Grand commented on LUCENE-8562:
--

+1 to not sorting data dimensions. Do we really need the call to {{PathSlice 
dataSlice = switchToHeap(slices[0], toCloseHeroically)}}? It should already 
have been called earlier. For simplicity, maybe we should only compute the 
common prefix length on data dimensions and not even try to use a data 
dimension as the "sortedDim"?



> Speed up merging segments of points with data dimensions
> 
>
> Key: LUCENE-8562
> URL: https://issues.apache.org/jira/browse/LUCENE-8562
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/index
>Affects Versions: master (8.0), 7.7
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8562.patch
>
>
> Currently, when merging segments of points with data dimensions, all 
> dimensions are sorted and carried down the tree even though only the 
> indexing dimensions are needed to build the BKD tree. This is done so that 
> leaf node data can be compressed by common prefix.
> But when using _MutablePointValues_, this ordering is done at the leaf level, 
> so we can use a similar approach for data dimensions and delay their sorting 
> to the leaf level. This seems to speed up indexing as well as reduce the 
> storage needed for building the index.
>  
>  
>  
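
For context, "compressed by common prefix" means computing, per dimension, how 
many leading bytes all of the values in a leaf share, so that only the 
suffixes need to be written. A tiny illustrative sketch of that computation 
(not the actual BKDWriter code):

{code:java}
public class CommonPrefixSketch {
  // Length of the byte prefix shared by every value in a leaf for one dimension.
  static int commonPrefixLength(byte[][] leafValues, int bytesPerDim) {
    int prefix = bytesPerDim;
    for (int i = 1; i < leafValues.length && prefix > 0; i++) {
      int j = 0;
      while (j < prefix && leafValues[i][j] == leafValues[0][j]) {
        j++;
      }
      prefix = j;
    }
    return prefix;
  }

  public static void main(String[] args) {
    byte[][] leaf = { {1, 2, 3, 4}, {1, 2, 3, 9}, {1, 2, 7, 0} };
    // Prints 2: every value shares its first two bytes.
    System.out.println(commonPrefixLength(leaf, 4));
  }
}
{code}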



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12998) Race condition between request recovery and the core load completing

2018-11-19 Thread Varun Thacker (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12998?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Varun Thacker updated SOLR-12998:
-
Description: 
I started Solr ( 7_6 branch ) using {{bin/solr start -e cloud -noprompt}} and 
indexed 25M documents, which amounts to roughly 1GB per replica in raw disk 
space.

Whenever I restart the cluster, node2 ( 7574 ) gets these errors in the logs, 
which seem to indicate that request recovery is trying to find a core that 
isn't completely loaded yet.

 
{code:java}
2018-11-18 21:06:09.845 INFO  
(coreLoadExecutor-9-thread-2-processing-n:192.168.0.7:7574_solr) 
[c:gettingstarted s:shard2 r:core_node8 x:gettingstarted_shard2_replica_n6] 
o.a.s.c.SolrCore [[gettingstarted_shard2_replica_n6] ] Opening new SolrCore at 
[/Users/varunthacker/apache-work/lucene-solr/solr/example/cloud/node2/solr/gettingstarted_shard2_replica_n6],
 
dataDir=[/Users/varunthacker/apache-work/lucene-solr/solr/example/cloud/node2/solr/gettingstarted_shard2_replica_n6/data/]
...
2018-11-18 21:06:10.501 INFO  (qtp690521419-148) [   
x:gettingstarted_shard2_replica_n6] o.a.s.h.a.CoreAdminOperation It has been 
requested that we recover: core=gettingstarted_shard2_replica_n6
...
2018-11-18 21:06:10.503 ERROR (qtp690521419-141) [   
x:gettingstarted_shard1_replica_n1] o.a.s.h.RequestHandlerBase 
org.apache.solr.common.SolrException: Unable to locate core 
gettingstarted_shard1_replica_n1
    at 
org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$5(CoreAdminOperation.java:167)
    at 
org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:360)
    at 
org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:395)
    at 
org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:180)
...
2018-11-18 21:06:10.503 INFO  (qtp690521419-148) [   
x:gettingstarted_shard2_replica_n6] o.a.s.s.HttpSolrCall [admin] webapp=null 
path=/admin/cores 
params={core=gettingstarted_shard2_replica_n6&action=REQUESTRECOVERY&wt=javabin&version=2}
 status=400 QTime=3
... still the core is loading
2018-11-18 21:06:10.876 INFO  
(coreLoadExecutor-9-thread-2-processing-n:192.168.0.7:7574_solr) 
[c:gettingstarted s:shard2 r:core_node8 x:gettingstarted_shard2_replica_n6] 
o.a.s.s.SolrIndexSearcher Opening 
[Searcher@2d17168a[gettingstarted_shard2_replica_n6] main]
... now done
2018-11-18 21:06:10.953 INFO  
(searcherExecutor-10-thread-1-processing-n:192.168.0.7:7574_solr 
x:gettingstarted_shard2_replica_n6 c:gettingstarted s:shard2 r:core_node8) 
[c:gettingstarted s:shard2 r:core_node8 x:gettingstarted_shard2_replica_n6] 
o.a.s.c.SolrCore [gettingstarted_shard2_replica_n6] Registered new searcher 
Searcher@2d17168a[gettingstarted_shard2_replica_n6] 
main{ExitableDirectoryReader(UninvertingDirectoryReader(_49(7.6.0):C2416916/4913:delGen=2
 _a4(7.6.0):C2474322 _71(7.6.0):C2723220 _4k(7.6.0):c111317 _d7(7.6.0):C2501101 
_bt(7.6.0):c91922 _di(7.6.0):c339286 _eb(7.6.0):c163244 _ds(7.6.0):c92780 
_e1(7.6.0):c337329 _em(7.6.0):c360003 _ew(7.6.0):c263533 _fg(7.6.0):c226118 
_f6(7.6.0):c266132 _es(7.6.0):C2361 _f8(7.6.0):C2251 _ff(7.6.0):C1821 
_fh(7.6.0):C20079 _fi(7.6.0):C35733 _fj(7.6.0):C37203 _fk(7.6.0):C34780 
_fl(7.6.0):C262 _fm(7.6.0):C2148 _fn(7.6.0):C2503))}{code}

  was:
I started Solr ( 7_6 branch ) using {{bin/solr start -e cloud -noprompt}} and 
indexed 25M documents which amounts to roughly 1GB per replica in raw disk 
space.

 

Whenever I restart the cluster node2 ( 7574 ) get's these errors in the logs 
which seem to indicate that request recovery is trying to find a core which 
isn't completely loaded yet.

 
{code:java}
2018-11-18 21:06:09.845 INFO  
(coreLoadExecutor-9-thread-2-processing-n:192.168.0.7:7574_solr) 
[c:gettingstarted s:shard2 r:core_node8 x:gettingstarted_shard2_replica_n6] 
o.a.s.c.SolrCore [[gettingstarted_shard2_replica_n6] ] Opening new SolrCore at 
[/Users/varunthacker/apache-work/lucene-solr/solr/example/cloud/node2/solr/gettingstarted_shard2_replica_n6],
 
dataDir=[/Users/varunthacker/apache-work/lucene-solr/solr/example/cloud/node2/solr/gettingstarted_shard2_replica_n6/data/]
...
2018-11-18 21:06:10.501 INFO  (qtp690521419-148) [   
x:gettingstarted_shard2_replica_n6] o.a.s.h.a.CoreAdminOperation It has been 
requested that we recover: core=gettingstarted_shard2_replica_n6
...
2018-11-18 21:06:10.503 ERROR (qtp690521419-141) [   
x:gettingstarted_shard1_replica_n1] o.a.s.h.RequestHandlerBase 
org.apache.solr.common.SolrException: Unable to locate core 
gettingstarted_shard1_replica_n1
    at 
org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$5(CoreAdminOperation.java:167)
    at 
org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:360)
    at 
org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:395)
    at 
org.apache.solr.handler.a

[JENKINS-EA] Lucene-Solr-master-Linux (64bit/jdk-12-ea+12) - Build # 23237 - Still unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23237/
Java: 64bit/jdk-12-ea+12 -XX:-UseCompressedOops -XX:+UseParallelGC

48 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.TestCloudRecovery

Error Message:
Could not find collection:collection1

Stack Trace:
java.lang.AssertionError: Could not find collection:collection1
at __randomizedtesting.SeedInfo.seed([8A4D8272A6C4C0BC]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertNotNull(Assert.java:526)
at 
org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:155)
at 
org.apache.solr.cloud.TestCloudRecovery.setupCluster(TestCloudRecovery.java:70)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:875)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:835)


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.TestCloudRecovery

Error Message:
Could not find collection:collection1

Stack Trace:
java.lang.AssertionError: Could not find collection:collection1
at __randomizedtesting.SeedInfo.seed([8A4D8272A6C4C0BC]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertNotNull(Assert.java:526)
at 
org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:155)
at 
org.apache.solr.cloud.TestCloudRecovery.setupCluster(TestCloudRecovery.java:70)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:875)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evalua

[jira] [Commented] (SOLR-12999) Index replication could delete segments first

2018-11-19 Thread Shawn Heisey (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16692005#comment-16692005
 ] 

Shawn Heisey commented on SOLR-12999:
-

As noted by Erick, these statements should be true:

On *nix, Lucene/Solr can successfully delete any file, but the space won't be 
released until the current searcher closes it.  On Windows, trying to delete 
files that a searcher has open will fail.

I like the idea, but for these reasons, I don't think it will actually work.
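
A quick way to observe that platform difference; this is just an illustrative 
sketch against a scratch temp file, not Solr code:

{code:java}
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class DeleteWhileOpen {
  public static void main(String[] args) throws IOException {
    Path file = Files.createTempFile("segment-like", ".bin");
    Files.write(file, new byte[1024]);
    try (FileInputStream reader = new FileInputStream(file.toFile())) {
      reader.read(); // hold the file open with an active stream
      // On *nix this delete succeeds, but the blocks are only freed once 'reader' closes.
      // On Windows it typically throws, because the file is still held open.
      Files.delete(file);
      System.out.println("delete succeeded while the file was open");
    } catch (IOException e) {
      System.out.println("delete failed while the file was open: " + e);
    }
  }
}
{code}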


> Index replication could delete segments first
> -
>
> Key: SOLR-12999
> URL: https://issues.apache.org/jira/browse/SOLR-12999
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: replication (java)
>Reporter: David Smiley
>Priority: Major
>
> Index replication could optionally delete files that it knows will not be 
> needed _first_.  This would reduce Solr's disk capacity requirements, and it 
> would reduce some disk fragmentation when space gets tight.
> Solr (IndexFetcher) already grabs the remote file list, and it could see 
> which files it has locally, then delete the others.  Today it asks Lucene to 
> {{deleteUnusedFiles}} at the end.  This new mode would only be useful if 
> there is no SolrIndexSearcher open, since an open searcher would prevent the 
> removal of those files.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] lucene-solr pull request #460: LUCENE-8497: refactor multi-term analysis han...

2018-11-19 Thread mayya-sharipova
Github user mayya-sharipova closed the pull request at:

https://github.com/apache/lucene-solr/pull/460


---

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Solaris (64bit/jdk1.8.0) - Build # 2165 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/2165/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseG1GC

4 tests failed.
FAILED:  
org.apache.lucene.index.TestIndexWriterWithThreads.testIOExceptionDuringWriteSegmentWithThreadsOnlyOnce

Error Message:
MockDirectoryWrapper: cannot close: there are still 14 open files: {_7.cfs=1, 
_c.cfs=1, _2.cfs=1, _1.cfs=1, _6_Lucene70_0.dvm=1, _0.cfs=1, _5.cfs=1, 
_9.cfs=1, _6_Lucene70_0.dvd=1, _a.cfs=1, _b.cfs=1, _3.cfs=1, _4.cfs=1, _8.cfs=1}

Stack Trace:
java.lang.RuntimeException: MockDirectoryWrapper: cannot close: there are still 
14 open files: {_7.cfs=1, _c.cfs=1, _2.cfs=1, _1.cfs=1, _6_Lucene70_0.dvm=1, 
_0.cfs=1, _5.cfs=1, _9.cfs=1, _6_Lucene70_0.dvd=1, _a.cfs=1, _b.cfs=1, 
_3.cfs=1, _4.cfs=1, _8.cfs=1}
at 
__randomizedtesting.SeedInfo.seed([1614CC7A047AE2D6:5A7F4DBCC2DAC133]:0)
at 
org.apache.lucene.store.MockDirectoryWrapper.close(MockDirectoryWrapper.java:838)
at 
org.apache.lucene.index.TestIndexWriterWithThreads._testMultipleThreadsFailure(TestIndexWriterWithThreads.java:341)
at 
org.apache.lucene.index.TestIndexWriterWithThreads.testIOExceptionDuringWriteSegmentWithThreadsOnlyOnce(TestIndexWriterWithThreads.java:507)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: unclo

[jira] [Commented] (SOLR-12999) Index replication could delete segments first

2018-11-19 Thread Erick Erickson (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691897#comment-16691897
 ] 

Erick Erickson commented on SOLR-12999:
---

I'm a little lost here. What's the definition of "knows will not be needed"? If 
it's the list of local segments that are not present on the leader and 
therefore won't be necessary, this seems fraught:

> If a new searcher is opened for any reason, wouldn't it fail? Restarts, rogue 
> commits, whatever.

> Wouldn't the files still be there on *nix systems if a searcher had them 
> open?

> On Windows systems, if a searcher has the files open, would the delete fail 
> in the first place?

I know you know Lucene details far better than I do, so I suspect I'm totally 
missing the point.

> Index replication could delete segments first
> -
>
> Key: SOLR-12999
> URL: https://issues.apache.org/jira/browse/SOLR-12999
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: replication (java)
>Reporter: David Smiley
>Priority: Major
>
> Index replication could optionally delete files that it knows will not be 
> needed _first_.  This would reduce Solr's disk capacity requirements, and it 
> would reduce some disk fragmentation when space gets tight.
> Solr (IndexFetcher) already grabs the remote file list, and it could see 
> which files it has locally, then delete the others.  Today it asks Lucene to 
> {{deleteUnusedFiles}} at the end.  This new mode would only be useful if 
> there is no SolrIndexSearcher open, since an open searcher would prevent the 
> removal of those files.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8509) NGramTokenizer, TrimFilter and WordDelimiterGraphFilter in combination can produce backwards offsets

2018-11-19 Thread Alan Woodward (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691884#comment-16691884
 ] 

Alan Woodward commented on LUCENE-8509:
---

Here's an updated patch that allows you to conditionally disable WDGF's 
offset-mangling, defaulting to "no mangling".  This should also allow us to 
plug it back into TestRandomChains, while disallowing the constructor that 
permits turning the mangling back on again.

> NGramTokenizer, TrimFilter and WordDelimiterGraphFilter in combination can 
> produce backwards offsets
> 
>
> Key: LUCENE-8509
> URL: https://issues.apache.org/jira/browse/LUCENE-8509
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8509.patch, LUCENE-8509.patch
>
>
> Discovered by an elasticsearch user and described here: 
> https://github.com/elastic/elasticsearch/issues/33710
> The ngram tokenizer produces tokens "a b" and " bb" (note the space at the 
> beginning of the second token).  The WDGF takes the first token and splits it 
> into two, adjusting the offsets of the second token, so we get "a"[0,1] and 
> "b"[2,3].  The trim filter removes the leading space from the second token, 
> leaving offsets unchanged, so WDGF sees "bb"[1,4]; because the leading space 
> has already been stripped, WDGF sees no need to adjust offsets, and emits the 
> token as-is, resulting in the start offsets of the tokenstream being [0, 2, 
> 1], and the IndexWriter rejecting it.
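
For anyone who wants to reproduce the walkthrough above, an analysis chain 
along these lines should show the backwards offsets on a release without the 
attached patch. The ngram size and WDGF flags are assumptions based on the 
description, and the sketch only prints the offsets rather than pushing them 
through IndexWriter:

{code:java}
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.miscellaneous.TrimFilter;
import org.apache.lucene.analysis.miscellaneous.WordDelimiterGraphFilter;
import org.apache.lucene.analysis.ngram.NGramTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

public class BackwardsOffsetsSketch {
  public static void main(String[] args) throws Exception {
    Analyzer analyzer = new Analyzer() {
      @Override
      protected TokenStreamComponents createComponents(String fieldName) {
        Tokenizer source = new NGramTokenizer(3, 3);   // emits "a b", " bb", ...
        TokenStream sink = new TrimFilter(source);     // strips the leading space, offsets unchanged
        sink = new WordDelimiterGraphFilter(sink,
            WordDelimiterGraphFilter.GENERATE_WORD_PARTS, null);
        return new TokenStreamComponents(source, sink);
      }
    };
    try (TokenStream ts = analyzer.tokenStream("f", "a bb")) {
      CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
      OffsetAttribute offsets = ts.addAttribute(OffsetAttribute.class);
      ts.reset();
      while (ts.incrementToken()) {
        // Start offsets come out as 0, 2, 1 - i.e. they go backwards.
        System.out.println(term + " [" + offsets.startOffset() + "," + offsets.endOffset() + "]");
      }
      ts.end();
    }
  }
}
{code}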



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8509) NGramTokenizer, TrimFilter and WordDelimiterGraphFilter in combination can produce backwards offsets

2018-11-19 Thread Alan Woodward (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Woodward updated LUCENE-8509:
--
Attachment: LUCENE-8509.patch

> NGramTokenizer, TrimFilter and WordDelimiterGraphFilter in combination can 
> produce backwards offsets
> 
>
> Key: LUCENE-8509
> URL: https://issues.apache.org/jira/browse/LUCENE-8509
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8509.patch, LUCENE-8509.patch
>
>
> Discovered by an elasticsearch user and described here: 
> https://github.com/elastic/elasticsearch/issues/33710
> The ngram tokenizer produces tokens "a b" and " bb" (note the space at the 
> beginning of the second token).  The WDGF takes the first token and splits it 
> into two, adjusting the offsets of the second token, so we get "a"[0,1] and 
> "b"[2,3].  The trim filter removes the leading space from the second token, 
> leaving offsets unchanged, so WDGF sees "bb"[1,4]; because the leading space 
> has already been stripped, WDGF sees no need to adjust offsets, and emits the 
> token as-is, resulting in the start offsets of the tokenstream being [0, 2, 
> 1], and the IndexWriter rejecting it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-repro - Build # 1970 - Unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1970/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/377/consoleText

[repro] Revision: b27bcc859580edff5ce07613e18308edadb88f4c

[repro] Ant options: -DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9
[repro] Repro line:  ant test  -Dtestcase=TestSimPolicyCloud 
-Dtests.method=testCreateCollectionSplitShard -Dtests.seed=B15EAD003BA4C31B 
-Dtests.multiplier=2 -Dtests.locale=ar-OM -Dtests.timezone=Asia/Kashgar 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=IndexSizeTriggerTest 
-Dtests.method=testSplitIntegration -Dtests.seed=B15EAD003BA4C31B 
-Dtests.multiplier=2 -Dtests.locale=he-IL -Dtests.timezone=EST5EDT 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestDelegationWithHadoopAuth 
-Dtests.method=testDelegationTokenRenew -Dtests.seed=B15EAD003BA4C31B 
-Dtests.multiplier=2 -Dtests.locale=es-GT -Dtests.timezone=Asia/Famagusta 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
eb8010405de190f37014a2f05790d108dd9d0268
[repro] git fetch
[repro] git checkout b27bcc859580edff5ce07613e18308edadb88f4c

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestDelegationWithHadoopAuth
[repro]   TestSimPolicyCloud
[repro]   IndexSizeTriggerTest
[repro] ant compile-test

[...truncated 3582 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=15 
-Dtests.class="*.TestDelegationWithHadoopAuth|*.TestSimPolicyCloud|*.IndexSizeTriggerTest"
 -Dtests.showOutput=onerror 
-DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9 
-Dtests.seed=B15EAD003BA4C31B -Dtests.multiplier=2 -Dtests.locale=es-GT 
-Dtests.timezone=Asia/Famagusta -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 1324 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest
[repro]   0/5 failed: 
org.apache.solr.security.hadoop.TestDelegationWithHadoopAuth
[repro]   1/5 failed: org.apache.solr.cloud.autoscaling.sim.TestSimPolicyCloud
[repro] git checkout eb8010405de190f37014a2f05790d108dd9d0268

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 214 - Still Unstable

2018-11-19 Thread Apache Jenkins Server
Error processing tokens: Error while parsing action 
'Text/ZeroOrMore/FirstOf/Token/DelimitedToken/DelimitedToken_Action3' at input 
position (line 76, pos 4):
)"}
   ^

java.lang.OutOfMemoryError: Java heap space

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Resolved] (SOLR-12972) deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 follow-on)

2018-11-19 Thread Christine Poerschke (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12972?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Christine Poerschke resolved SOLR-12972.

   Resolution: Fixed
Fix Version/s: 7.7
   master (8.0)

> deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 
> follow-on)
> -
>
> Key: SOLR-12972
> URL: https://issues.apache.org/jira/browse/SOLR-12972
> Project: Solr
>  Issue Type: Task
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Fix For: master (8.0), 7.7
>
> Attachments: SOLR-12972-step1.patch, SOLR-12972-step2.patch
>
>
> * step 1 (branch_7x and master): deprecate the unused (but public) member
> * step 2 (master only): remove the member






[jira] [Commented] (SOLR-12972) deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 follow-on)

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691825#comment-16691825
 ] 

ASF subversion and git services commented on SOLR-12972:


Commit eb8010405de190f37014a2f05790d108dd9d0268 in lucene-solr's branch 
refs/heads/master from [~cpoerschke]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=eb80104 ]

SOLR-12972: remove deprecated, unused SolrIndexConfig.luceneVersion


> deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 
> follow-on)
> -
>
> Key: SOLR-12972
> URL: https://issues.apache.org/jira/browse/SOLR-12972
> Project: Solr
>  Issue Type: Task
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-12972-step1.patch, SOLR-12972-step2.patch
>
>
> * step 1 (branch_7x and master): deprecate the unused (but public) member
> * step 2 (master only): remove the member






[JENKINS] Lucene-Solr-SmokeRelease-master - Build # 1187 - Failure

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1187/

No tests ran.

Build Log:
[...truncated 23435 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2440 links (1991 relative) to 3219 anchors in 248 files
 [echo] Validated Links & Anchors via: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
 [copy] Copying 4 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-8.0.0.tgz
 into 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

i

[jira] [Commented] (SOLR-12972) deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 follow-on)

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691793#comment-16691793
 ] 

ASF subversion and git services commented on SOLR-12972:


Commit 58ec5ef72716e8af632c742b5ccc715bfae23e02 in lucene-solr's branch 
refs/heads/branch_7x from [~cpoerschke]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=58ec5ef ]

SOLR-12972: deprecate unused SolrIndexConfig.luceneVersion


> deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 
> follow-on)
> -
>
> Key: SOLR-12972
> URL: https://issues.apache.org/jira/browse/SOLR-12972
> Project: Solr
>  Issue Type: Task
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-12972-step1.patch, SOLR-12972-step2.patch
>
>
> * step 1 (branch_7x and master): deprecate the unused (but public) member
> * step 2 (master only): remove the member






[GitHub] lucene-solr issue #501: SOLR-5211

2018-11-19 Thread dsmiley
Github user dsmiley commented on the issue:

https://github.com/apache/lucene-solr/pull/501
  
Also please rename "block" terminology to "nested" where possible.





Re: [GitHub] lucene-solr issue #500: LUCENE-8517: do not wrap FixedShingleFilter with con...

2018-11-19 Thread Michael Sokolov
Oh! got it - We run our tests and other release machinery etc against
a single JDK, and it is currently Java 8. I will precommit with Java 8
then. Presumably at some future date JDK11 becomes the system of
record? Historically how long have we waited after a new Java release
before shifting over?
On Mon, Nov 19, 2018 at 9:10 AM uschindler  wrote:
>
> Github user uschindler commented on the issue:
>
> https://github.com/apache/lucene-solr/pull/500
>
> Hi,
>
> Currently a Lucene release should be done with Java 8. Therefore all 
> checks like precommit have to be done with Java 8, otherwise we can’t be sure 
> that it really works with Java 8.
>
> Lucene can for sure be built with Java 9 or later, but some 
> developer-central tasks cannot be executed for various reasons (that are not 
> necessary for building from source). The problem here is with the ECJ tool 
> that is used for precommit checks. It’s the Eclipse compiler and this one 
> needs a version that supports the module system. Unfortunately there were some 
> problems with updating it as it broke some parts of the checks (I tried it a 
> while back, maybe it's better now). Other tools that won't work are the 
> Javadocs linter, because the format of Javadocs differs in Java 9+ and 
> maintaining 2 different HTML parsers is too hard (because the Javadocs in the 
> release JAR/ZIP/TGZ are Java 8 anyways).
>
> As a contributor/committer of Lucene you have to use Java 8 at least for 
> the quality checks so we can be sure that all is fine with your code! As an 
> end user, you can build lucene with any version later – this is also tested 
> with Jenkins.
>
> Uwe
>
>
> ---
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>




[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread dsmiley
Github user dsmiley commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234646277
  
--- Diff: 
solr/core/src/java/org/apache/solr/update/DirectUpdateHandler2.java ---
@@ -975,6 +976,14 @@ private void updateDocOrDocValues(AddUpdateCommand 
cmd, IndexWriter writer) thro
 }
   }
 
+  boolean forciblyAddBlockId() {
--- End diff --

Come to think of it, who cares if root is populated or not for "old" 
schemas?  Let's just do it always; no conditional test.  If someone doesn't want 
root to be populated, that same person probably doesn't have child docs and 
should just as well omit the root field from their schema.





[jira] [Commented] (LUCENE-8374) Reduce reads for sparse DocValues

2018-11-19 Thread Adrien Grand (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8374?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691790#comment-16691790
 ] 

Adrien Grand commented on LUCENE-8374:
--

bq. I do not understand your resistance against providing an out-of-the-box 
performance improvement to Lucene-based search engines. 

I have nothing against making performance better, quite the opposite! But as 
often there are trade-offs and this patch spends significant CPU and I/O when 
opening the reader in order to build a cache while the same information could 
be computed at index time. This might be fine for static readers, but it 
doesn't sound like a net win otherwise.

bq. I do not see index-upgrades as trivial exercises and I have previously 
encountered quite a lot of resistance against trying patched versions of 
Lucene/Solr.

Sure, I agree patched versions have lots of drawbacks. I was suggesting this 
workaround in the case upgrading was a no-go given that this is more acceptable 
to me than adding such caching to the codec.

bq. If splitting helps acceptance, I'll try and commit that way.

Thanks. I'm mostly interested in splitting from the perspective of tracking 
regressions. For instance it will be easier to track which change a 
slowdown/speedup relates to in Mike's nightly benchmarks.

bq. I'm also game to make a split patch, if someone commits to do a review.

I'll take care of reviews.

bq. Note: I am aware that Lucene/Solr 7.6 is in pre-release and that larger 
patches should not be committed to master until 7.6 has been released.

I don't think this is an issue. Nick already cut branch_7_6, so the 7.6 release 
shouldn't impact work that we are doing on the master and 7.x branches?

> Reduce reads for sparse DocValues
> -
>
> Key: LUCENE-8374
> URL: https://issues.apache.org/jira/browse/LUCENE-8374
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/codecs
>Affects Versions: 7.5, master (8.0)
>Reporter: Toke Eskildsen
>Priority: Major
>  Labels: performance
> Fix For: 7.6
>
> Attachments: LUCENE-8374.patch, LUCENE-8374.patch, LUCENE-8374.patch, 
> LUCENE-8374.patch, LUCENE-8374.patch, LUCENE-8374.patch, LUCENE-8374.patch, 
> LUCENE-8374_branch_7_3.patch, LUCENE-8374_branch_7_3.patch.20181005, 
> LUCENE-8374_branch_7_4.patch, LUCENE-8374_branch_7_5.patch, 
> entire_index_logs.txt, image-2018-10-24-07-30-06-663.png, 
> image-2018-10-24-07-30-56-962.png, single_vehicle_logs.txt, 
> start-2018-10-24-1_snapshot___Users_tim_Snapshots__-_YourKit_Java_Profiler_2017_02-b75_-_64-bit.png,
>  
> start-2018-10-24_snapshot___Users_tim_Snapshots__-_YourKit_Java_Profiler_2017_02-b75_-_64-bit.png
>
>
> The {{Lucene70DocValuesProducer}} has the internal classes 
> {{SparseNumericDocValues}} and {{BaseSortedSetDocValues}} (sparse code path), 
> which again uses {{IndexedDISI}} to handle the docID -> value-ordinal lookup. 
> The value-ordinal is the index of the docID assuming an abstract tightly 
> packed monotonically increasing list of docIDs: If the docIDs with 
> corresponding values are {{[0, 4, 1432]}}, their value-ordinals will be {{[0, 
> 1, 2]}}.
> h2. Outer blocks
> The lookup structure of {{IndexedDISI}} consists of blocks of 2^16 values 
> (65536), where each block can be either {{ALL}}, {{DENSE}} (2^12 to 2^16 
> values) or {{SPARSE}} (< 2^12 values ~= 6%). Consequently blocks vary quite a 
> lot in size and ordinal resolving strategy.
> When a sparse Numeric DocValue is needed, the code first locates the block 
> containing the wanted docID flag. It does so by iterating blocks one-by-one 
> until it reaches the needed one, where each iteration requires a lookup in 
> the underlying {{IndexSlice}}. For a common memory mapped index, this 
> translates to either a cached request or a read operation. If a segment has 
> 6M documents, worst-case is 91 lookups. In our web archive, our segments have 
> ~300M values: A worst-case of 4577 lookups!
> One obvious solution is to use a lookup-table for blocks: A long[]-array with 
> an entry for each block. For 6M documents, that is < 1KB and would allow for 
> direct jumping (a single lookup) in all instances. Unfortunately this 
> lookup-table cannot be generated upfront when the writing of values is purely 
> streaming. It can be appended to the end of the stream before it is closed, 
> but without knowing the position of the lookup-table the reader cannot seek 
> to it.
> One strategy for creating such a lookup-table would be to generate it during 
> reads and cache it for next lookup. This does not fit directly into how 
> {{IndexedDISI}} currently works (it is created anew for each invocation), but 
> could probably be added with a little work. An advantage to this is that this 
> does not change the underlying format and t
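
The jump-table idea described above lends itself to a small standalone sketch (hypothetical names and values, not the actual {{IndexedDISI}} code): a {{long[]}} with one slice offset per 2^16-docID block allows a single array lookup instead of iterating the blocks one-by-one.

{code:java}
// Standalone illustration only; in practice the offsets would be read from (or appended to) the slice.
final class BlockJumpTableSketch {
  private final long[] blockOffsets; // blockOffsets[i] = slice offset where block i starts

  BlockJumpTableSketch(long[] blockOffsets) {
    this.blockOffsets = blockOffsets;
  }

  long offsetOfBlockContaining(int docId) {
    int block = docId >>> 16;    // 65536 docIDs per block
    return blockOffsets[block];  // one lookup instead of thousands of sequential block reads
  }
}
{code}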

[jira] [Updated] (SOLR-13000) log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib

2018-11-19 Thread Christine Poerschke (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Christine Poerschke updated SOLR-13000:
---
Attachment: SOLR-13000.patch

> log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib
> --
>
> Key: SOLR-13000
> URL: https://issues.apache.org/jira/browse/SOLR-13000
> Project: Solr
>  Issue Type: Bug
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-13000.patch
>
>
> My colleague [~ajhalani] noticed that declaring a {{solr-solrj}} dependency 
> alone when configuring {{-Dlog4j.configurationFile}} for a project is not 
> sufficient.
> I looked into it further and it seems that the {{log4j-slf4j-impl}} jar is 
> (inadvertently?) missing in {{solrj/lib}} in the release?






[jira] [Commented] (SOLR-13000) log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib

2018-11-19 Thread Christine Poerschke (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691771#comment-16691771
 ] 

Christine Poerschke commented on SOLR-13000:


Attached proposed patch to change a {{conf="test"}} to a {{conf="compile"}} in 
the {{solr/solrj/ivy.xml}} file.

> log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib
> --
>
> Key: SOLR-13000
> URL: https://issues.apache.org/jira/browse/SOLR-13000
> Project: Solr
>  Issue Type: Bug
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-13000.patch
>
>
> My colleague [~ajhalani] noticed that declaring a {{solr-solrj}} dependency 
> alone when configuring {{-Dlog4j.configurationFile}} for a project is not 
> sufficient.
> I looked into it further and it seems that the {{log4j-slf4j-impl}} jar is 
> (inadvertently?) missing in {{solrj/lib}} in the release?






[jira] [Created] (SOLR-13000) log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib

2018-11-19 Thread Christine Poerschke (JIRA)
Christine Poerschke created SOLR-13000:
--

 Summary: log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib
 Key: SOLR-13000
 URL: https://issues.apache.org/jira/browse/SOLR-13000
 Project: Solr
  Issue Type: Bug
Reporter: Christine Poerschke
Assignee: Christine Poerschke


My colleague [~ajhalani] noticed that declaring a {{solr-solrj}} dependency 
alone when configuring {{-Dlog4j.configurationFile}} for a project is not 
sufficient.

I looked into it further and it seems that the {{log4j-slf4j-impl}} jar is 
(inadvertently?) missing in {{solrj/lib}} in the release?







[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread moshebla
Github user moshebla commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234639274
  
--- Diff: 
solr/core/src/java/org/apache/solr/update/DirectUpdateHandler2.java ---
@@ -975,6 +976,14 @@ private void updateDocOrDocValues(AddUpdateCommand 
cmd, IndexWriter writer) thro
 }
   }
 
+  boolean forciblyAddBlockId() {
--- End diff --

The change described above causes BlockUpdateTest#testLegacyBlockProcessing 
to fail.
Should I remove it (I noticed you mentioned this being a burden going 
forward), or should I create a new solrconfig and test case to verify this works 
in older Solr versions?
Wouldn't this be a problem if _root_ is not defined in the collection's 
schema?





[jira] [Created] (SOLR-12999) Index replication could delete segments first

2018-11-19 Thread David Smiley (JIRA)
David Smiley created SOLR-12999:
---

 Summary: Index replication could delete segments first
 Key: SOLR-12999
 URL: https://issues.apache.org/jira/browse/SOLR-12999
 Project: Solr
  Issue Type: Improvement
  Security Level: Public (Default Security Level. Issues are Public)
  Components: replication (java)
Reporter: David Smiley


Index replication could optionally delete files that it knows will not be 
needed _first_.  This would reduce disk capacity requirements of Solr, and it 
would reduce some disk fragmentation when space get tight.

Solr (IndexFetcher) already grabs the remote file list, and it could see which 
files it has locally, then delete the others.  Today it asks Lucene to 
{{deleteUnusedFiles}} at the end.  This new mode would only be useful if there 
is no SolrIndexSearcher open, since it would prevent the removal of files.
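
A minimal sketch of that delete-first pass, using hypothetical helper names rather than the actual {{IndexFetcher}} API: compare the remote file list with the local directory and delete the local files that will not be needed before downloading anything.

{code:java}
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Set;

final class DeleteFirstSketch {
  // Delete local index files that are absent from the remote file list, lowering peak disk usage.
  static void deleteFilesMissingFromRemote(Path localIndexDir, Set<String> remoteFileNames) throws IOException {
    try (DirectoryStream<Path> files = Files.newDirectoryStream(localIndexDir)) {
      for (Path file : files) {
        if (!remoteFileNames.contains(file.getFileName().toString())) {
          Files.delete(file); // only safe when no SolrIndexSearcher still holds the file open
        }
      }
    }
  }
}
{code}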






[jira] [Commented] (SOLR-12972) deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 follow-on)

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691762#comment-16691762
 ] 

ASF subversion and git services commented on SOLR-12972:


Commit 7abb25eff55aabe2c969148367d4a1b1c53b64d7 in lucene-solr's branch 
refs/heads/master from [~cpoerschke]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=7abb25e ]

SOLR-12972: deprecate unused SolrIndexConfig.luceneVersion


> deprecate-and-remove unused SolrIndexConfig.luceneVersion (LUCENE-5871 
> follow-on)
> -
>
> Key: SOLR-12972
> URL: https://issues.apache.org/jira/browse/SOLR-12972
> Project: Solr
>  Issue Type: Task
>Reporter: Christine Poerschke
>Assignee: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-12972-step1.patch, SOLR-12972-step2.patch
>
>
> * step 1 (branch_7x and master): deprecate the unused (but public) member
> * step 2 (master only): remove the member






[GitHub] lucene-solr pull request #501: SOLR-5211

2018-11-19 Thread moshebla
Github user moshebla commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/501#discussion_r234636103
  
--- Diff: 
solr/core/src/java/org/apache/solr/update/DirectUpdateHandler2.java ---
@@ -975,6 +976,14 @@ private void updateDocOrDocValues(AddUpdateCommand 
cmd, IndexWriter writer) thro
 }
   }
 
+  boolean forciblyAddBlockId() {
--- End diff --

I made this happen only if luceneMatchVersion >= 0.
Hopefully I understood you correctly.





[GitHub] lucene-solr issue #500: LUCENE-8517: do not wrap FixedShingleFilter with con...

2018-11-19 Thread uschindler
Github user uschindler commented on the issue:

https://github.com/apache/lucene-solr/pull/500
  
Hi,

Currently a Lucene release should be done with Java 8. Therefore all checks 
like precommit have to be done with Java 8, otherwise we can’t be sure that 
it really works with Java 8.

Lucene can for sure be built with Java 9 or later, but some 
developer-central tasks cannot be executed for various reasons (that are not 
necessary for building from source). The problem here is with the ECJ tool that 
is used for precommit checks. It’s the Eclipse compiler and this one needs a 
version that supports the module system. Unfortunately there were some problems 
with updating it as it broke some parts of the checks (I tried it a while back, 
maybe it's better now). Other tools that won't work are the Javadocs 
linter, because the format of Javadocs differs in Java 9+ and maintaining 2 
different HTML parsers is too hard (because the Javadocs in the release 
JAR/ZIP/TGZ are Java 8 anyways).

As a contributor/committer of Lucene you have to use Java 8 at least for 
the quality checks so we can be sure that all is fine with your code! As an end 
user, you can build lucene with any version later – this is also tested with 
Jenkins.

Uwe





[jira] [Created] (LUCENE-8567) TextIndexManager checksum failed on s390x

2018-11-19 Thread Stefan Franz (JIRA)
Stefan Franz created LUCENE-8567:


 Summary: TextIndexManager checksum failed on s390x
 Key: LUCENE-8567
 URL: https://issues.apache.org/jira/browse/LUCENE-8567
 Project: Lucene - Core
  Issue Type: Bug
  Components: core/codecs
Affects Versions: 7.5
 Environment: SLES 12.3 on s390x with IBM J9 VM
Reporter: Stefan Franz


Hello,

we are using Apache Lucene within JetBrains' YouTrack and have runtime problems 
with IBM's J9 VM version 8, but the same bug occurs when using OpenJDK on the 
same platform. Maybe there is a problem with the big-endian byte order of the 
z-platform? 

  
{code:java}
[2018-11-14 18:08:11,470] 18:08:11,462 ERROR [@extIndexManager] 
[TextIndexManager ] [] Failed to index 111 documents
[2018-11-14 18:08:11,471] org.apache.lucene.index.CorruptIndexException: 
checksum failed (hardware problem?) : expected=fa0a2c04 actual=a7c0aac0 
(resource=BufferedChecksumIndexInput(ExodusIndexInput[_0_Lucene50_0.doc]))
[2018-11-14 18:08:11,471] at 
org.apache.lucene.codecs.CodecUtil.checkFooter(CodecUtil.java:419) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,472] at 
org.apache.lucene.codecs.lucene50.Lucene50CompoundFormat.write(Lucene50CompoundFormat.java:99)
 ~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
jimczi - 2018-09-18 13:01:13]
[2018-11-14 18:08:11,472] at 
org.apache.lucene.index.IndexWriter.createCompoundFile(IndexWriter.java:4997) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,473] at 
org.apache.lucene.index.DocumentsWriterPerThread.sealFlushedSegment(DocumentsWriterPerThread.java:576)
 ~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
jimczi - 2018-09-18 13:01:13]
[2018-11-14 18:08:11,473] at 
org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:515)
 ~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
jimczi - 2018-09-18 13:01:13]
[2018-11-14 18:08:11,473] at 
org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:554) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,474] at 
org.apache.lucene.index.DocumentsWriter.flushAllThreads(DocumentsWriter.java:719)
 ~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
jimczi - 2018-09-18 13:01:13]
[2018-11-14 18:08:11,474] at 
org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3602) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,474] at 
org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:3577) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,475] at 
org.apache.lucene.index.IndexWriter.shutdown(IndexWriter.java:1035) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,475] at 
org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1078) 
~[lucene-core-7.5.0.jar:7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi 
- 2018-09-18 13:01:13]
[2018-11-14 18:08:11,475] at 
jetbrains.youtrack.textindex.TextIndexManagerImpl$LuceneIndexingContext.close(TextIndexManagerImpl.java:843)
 ~[youtrack-text-search-2018.3.jar:?]
[2018-11-14 18:08:11,476] at 
jetbrains.youtrack.textindex.TextIndexManagerBase$BatchIndexingJob.execute(TextIndexManagerBase.java:995)
 [youtrack-text-search-2018.3.jar:?]
[2018-11-14 18:08:11,476] at 
jetbrains.exodus.core.execution.DecoratorJob.executeDecorated(DecoratorJob.java:72)
 [xodus-utils-1.2.3501.jar:1.2.3501]
[2018-11-14 18:08:11,476] at 
jetbrains.youtrack.textindex.settings.TransactionalTextIndexJob.access$000(TransactionalTextIndexJob.java:10)
 [youtrack-text-search-2018.3.jar:?]
[2018-11-14 18:08:11,477] at 
jetbrains.youtrack.textindex.settings.TransactionalTextIndexJob$1.invoke(TransactionalTextIndexJob.java:25)
 [youtrack-text-search-2018.3.jar:?]
[2018-11-14 18:08:11,477] at 
jetbrains.teamsys.dnq.runtime.txn._Txn.runReadonly(_Txn.java:351) 
[jetbrains.teamsys.dnq.runtime-3438.jar:?]
[2018-11-14 18:08:11,477] at 
jetbrains.youtrack.textindex.settings.TransactionalTextIndexJob.execute(TransactionalTextIndexJob.java:23)
 [youtrack-text-search-2018.3.jar:?]
[2018-11-14 18:08:11,477] at 
jetbrains.exodus.core.execution.Job.run(Job.java:99) 
[xodus-utils-1.2.3501.jar:1.2.3501]
[2018-11-14 18:08:11,478] at 
jetbrains.exodus.core.execution.ThreadJobProcessor.executeJob(ThreadJobProcessor.java:133)
 [xodus-utils-1.2.3501.jar:1.2.3501]
[2018-11-14 18:08:11,478] at 
jetbrains.exodus.core.execution.JobProcessorQueueAdapter.doExecuteJob(JobProcessorQueueAdapter.java:243)
 [xodus

[jira] [Commented] (LUCENE-8216) Better cross-field scoring

2018-11-19 Thread Adrien Grand (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691717#comment-16691717
 ] 

Adrien Grand commented on LUCENE-8216:
--

Woohoo. +1 to start with a dedicated query and then look into folding this into 
similarities instead, adding built-in support for this in query parsers, etc. 
This is targeting sandbox, so I have no problem merging it as-is and iterating 
from there. Some thoughts that I had while reviewing the patch:
 - applyWeight casts to an int, should it keep a float instead? The similarity 
API already allows to pass term frequencies as a float. That would avoid the 
issue that you could otherwise end up with a term frequency that is equal to 0, 
which is illegal.
 - Creating a FilterLeafReader feels a bit heavy given that everything that 
LeafSimScorer does with it is pulling norms. Forking LeafSimScorer might make 
things a bit easier (and later maybe refactoring LeafSimScorer)?
 - It looks like removing `field` from CollectionStatistics would help support 
such a change without having to use fake field names.
 - Maybe advanceExact on merged norms should return `value != 0` rather than 
true all the time? I know it shouldn't be an issue in practice since we only 
get the norm on fields that have a value when scoring, but I would like it 
better if it behaved correctly. Also nextDoc() looks wrong too as I don't think 
it would skip over documents that don't have a value or return NO_MORE_DOCS 
when maxDoc is reached?

bq. Norms are also summed per document but since they represent the number of 
unique words we could also take the max.

Hmm I don't think this is correct. If a field value consists of twice the same 
term then the length of the field will be 2. Maybe you got confused because the 
length discards synonyms so that if two terms occur at the same position then 
this only adds one to the length by default. Summing up lengths sounds like a 
sensible approach to me.

> Better cross-field scoring
> --
>
> Key: LUCENE-8216
> URL: https://issues.apache.org/jira/browse/LUCENE-8216
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Major
> Fix For: master (8.0)
>
> Attachments: LUCENE-8216.patch
>
>
> I'd like Lucene to have better support for scoring across multiple fields. 
> Today we have BlendedTermQuery which tries to help there but it probably 
> tries to do too much on some aspects (handling cross-field term queries AND 
> synonyms) and too little on other ones (it tries to merge index-level 
> statistics, but not per-document statistics like tf and norm).
> Maybe we could implement something like BM25F so that queries across multiple 
> fields would retain the benefits of BM25 like the fact that the impact of the 
> term frequency saturates quickly, which is not the case with BlendedTermQuery 
> if you have occurrences across many fields.






[GitHub] lucene-solr issue #500: LUCENE-8517: do not wrap FixedShingleFilter with con...

2018-11-19 Thread msokolov
Github user msokolov commented on the issue:

https://github.com/apache/lucene-solr/pull/500
  
Ah, I see. I wonder if it would be a huge effort to support building with
JDK11 at least? IMO one could skip 9 and 10 since even openjdk is not
"supporting" them with bug fixes, right? Anyway I'd be happy to help out if
that would be welcome

On Mon, Nov 19, 2018 at 8:27 AM Uwe Schindler 
wrote:

> It's only supported with Java 8, because that's our intended release
> version.
>
> —
> You are receiving this because you authored the thread.
> Reply to this email directly, view it on GitHub
> ,
> or mute the thread
> 

> .
>






[GitHub] lucene-solr issue #500: LUCENE-8517: do not wrap FixedShingleFilter with con...

2018-11-19 Thread uschindler
Github user uschindler commented on the issue:

https://github.com/apache/lucene-solr/pull/500
  
It's only supported with Java 8, because that's our intended release 
version.





[GitHub] lucene-solr issue #500: LUCENE-8517: do not wrap FixedShingleFilter with con...

2018-11-19 Thread msokolov
Github user msokolov commented on the issue:

https://github.com/apache/lucene-solr/pull/500
  
By the way, `ant precommit` failed for me with this message:

```
-ecj-javadoc-lint-unsupported:

BUILD FAILED

/home/ANT.AMAZON.COM/sokolovm/workspace/lucene-solr/lucene/common-build.xml:2075:
 Linting documentation with ECJ is not supported on this Java version (unknown).
```

I'm building with JDK10 -- is that supposed to be supported / is this a 
known issue?





[JENKINS] Lucene-Solr-7.6-Linux (64bit/jdk-9.0.4) - Build # 2 - Still Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/2/
Java: 64bit/jdk-9.0.4 -XX:-UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testParallelUpdateQTime

Error Message:
Error from server at 
https://127.0.0.1:35851/solr/collection1_shard2_replica_n2: Expected mime type 
application/octet-stream but got text/html. Error 404 
Can not find: /solr/collection1_shard2_replica_n2/update  
HTTP ERROR 404 Problem accessing 
/solr/collection1_shard2_replica_n2/update. Reason: Can not find: 
/solr/collection1_shard2_replica_n2/update (Powered by Jetty 9.4.11.v20180605)

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:35851/solr/collection1_shard2_replica_n2: Expected 
mime type application/octet-stream but got text/html. 


Error 404 Can not find: 
/solr/collection1_shard2_replica_n2/update

HTTP ERROR 404
Problem accessing /solr/collection1_shard2_replica_n2/update. Reason:
Can not find: 
/solr/collection1_shard2_replica_n2/update (Powered by Jetty 9.4.11.v20180605)




at 
__randomizedtesting.SeedInfo.seed([7351038014736146:9D8978115A0CB4D5]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testParallelUpdateQTime(CloudSolrClientTest.java:146)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)

[jira] [Commented] (LUCENE-8517) TestRandomChains.testRandomChainsWithLargeStrings failure

2018-11-19 Thread Mike Sokolov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691682#comment-16691682
 ] 

Mike Sokolov commented on LUCENE-8517:
--

Hmm sadly it does still repro for me even with LUCENE-8564 applied. I haven't 
had a chance to look closely this morning. In the meantime I'll post an update 
to the PR addressing your comments, and for now I'll leave in the exclusion for 
{{FIxedShingleFilter}}, although I hope we'll get to the bottom of it.

> TestRandomChains.testRandomChainsWithLargeStrings failure
> -
>
> Key: LUCENE-8517
> URL: https://issues.apache.org/jira/browse/LUCENE-8517
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: modules/analysis
>Reporter: Steve Rowe
>Priority: Major
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> From 
> [https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2828/consoleText], 
> reproduces for me on Java8:
> {noformat}
> Checking out Revision 216f10026b86627750e133fe24ce6a750c470695 
> (refs/remotes/origin/branch_7x)
> [...]
> [java-info] java version "10.0.1"
> [java-info] OpenJDK Runtime Environment (10.0.1+10, Oracle Corporation)
> [java-info] OpenJDK 64-Bit Server VM (10.0.1+10, Oracle Corporation)
> [java-info] Test args: [-XX:-UseCompressedOops -XX:+UseConcMarkSweepGC]
> [...]
>[junit4] Suite: org.apache.lucene.analysis.core.TestRandomChains
>[junit4]   2> Exception from random analyzer: 
>[junit4]   2> charfilters=
>[junit4]   2>   
> org.apache.lucene.analysis.charfilter.MappingCharFilter(org.apache.lucene.analysis.charfilter.NormalizeCharMap@3ef95503,
>  java.io.StringReader@70dde633)
>[junit4]   2>   
> org.apache.lucene.analysis.fa.PersianCharFilter(org.apache.lucene.analysis.charfilter.MappingCharFilter@12423b20)
>[junit4]   2> tokenizer=
>[junit4]   2>   org.apache.lucene.analysis.th.ThaiTokenizer()
>[junit4]   2> filters=
>[junit4]   2>   
> org.apache.lucene.analysis.compound.HyphenationCompoundWordTokenFilter(ValidatingTokenFilter@7914bba7
>  
> term=,bytes=[],startOffset=0,endOffset=0,positionIncrement=1,positionLength=1,type=word,termFrequency=1,
>  org.apache.lucene.analysis.compound.hyphenation.HyphenationTree@abd7bca)
>[junit4]   2>   
> Conditional:org.apache.lucene.analysis.MockGraphTokenFilter(java.util.Random@56348091,
>  OneTimeWrapper@aa1c073 
> term=,bytes=[],startOffset=0,endOffset=0,positionIncrement=1,positionLength=1,type=word,termFrequency=1)
>[junit4]   2>   
> Conditional:org.apache.lucene.analysis.shingle.FixedShingleFilter(OneTimeWrapper@4cf58fce
>  
> term=,bytes=[],startOffset=0,endOffset=0,positionIncrement=1,positionLength=1,type=word,termFrequency=1,
>  4, , )
>[junit4]   2>   
> org.apache.lucene.analysis.pt.PortugueseLightStemFilter(ValidatingTokenFilter@3a915324
>  
> term=,bytes=[],startOffset=0,endOffset=0,positionIncrement=1,positionLength=1,type=word,termFrequency=1,keyword=false)
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRandomChains 
> -Dtests.method=testRandomChainsWithLargeStrings -Dtests.seed=92344C536D4E00F4 
> -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=en-ZW 
> -Dtests.timezone=Atlantic/Faroe -Dtests.asserts=true 
> -Dtests.file.encoding=US-ASCII
>[junit4] ERROR   0.46s J2 | 
> TestRandomChains.testRandomChainsWithLargeStrings <<<
>[junit4]> Throwable #1: java.lang.IllegalStateException: stage 3: 
> inconsistent startOffset at pos=0: 0 vs 5; token=effort
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([92344C536D4E00F4:F86FF34234002007]:0)
>[junit4]>  at 
> org.apache.lucene.analysis.ValidatingTokenFilter.incrementToken(ValidatingTokenFilter.java:109)
>[junit4]>  at 
> org.apache.lucene.analysis.pt.PortugueseLightStemFilter.incrementToken(PortugueseLightStemFilter.java:48)
>[junit4]>  at 
> org.apache.lucene.analysis.ValidatingTokenFilter.incrementToken(ValidatingTokenFilter.java:68)
>[junit4]>  at 
> org.apache.lucene.analysis.BaseTokenStreamTestCase.checkResetException(BaseTokenStreamTestCase.java:441)
>[junit4]>  at 
> org.apache.lucene.analysis.BaseTokenStreamTestCase.checkRandomData(BaseTokenStreamTestCase.java:546)
>[junit4]>  at 
> org.apache.lucene.analysis.core.TestRandomChains.testRandomChainsWithLargeStrings(TestRandomChains.java:897)
>[junit4]>  at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>[junit4]>  at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>[junit4]>  at 
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>[junit4]>  at 
> java.base/java.lang.ref

[GitHub] lucene-solr pull request #500: LUCENE-8517: do not wrap FixedShingleFilter w...

2018-11-19 Thread msokolov
Github user msokolov commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/500#discussion_r234606790
  
--- Diff: 
lucene/test-framework/src/java/org/apache/lucene/analysis/ValidatingTokenFilter.java
 ---
@@ -50,6 +53,9 @@
   private final OffsetAttribute offsetAtt = 
getAttribute(OffsetAttribute.class);
   private final CharTermAttribute termAtt = 
getAttribute(CharTermAttribute.class);
 
+  // record all the Tokens seen so they can be dumped on failure
+  private final List tokens = new ArrayList<>();
--- End diff --

Oh I hadn't realized RandomChains could create such large strings that 
would be a concern!  Yes, I don't think it's helpful to have even 200 tokens 
printed out. In general your interest here will be in the final few tokens, so 
I think we can limit to 20 or so.
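
One possible shape for that bound, as a hedged sketch rather than the actual ValidatingTokenFilter change: keep only the most recent ~20 tokens so the failure dump stays small.

```java
import java.util.ArrayDeque;
import java.util.Deque;

final class RecentTokenBuffer {
  private static final int MAX_BUFFERED_TOKENS = 20; // "20 or so"
  private final Deque<String> recentTokens = new ArrayDeque<>();

  void record(String token) {
    if (recentTokens.size() == MAX_BUFFERED_TOKENS) {
      recentTokens.removeFirst(); // drop the oldest token
    }
    recentTokens.addLast(token);
  }

  @Override
  public String toString() {
    return recentTokens.toString(); // printed when a failure is being reported
  }
}
```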





[GitHub] lucene-solr pull request #500: LUCENE-8517: do not wrap FixedShingleFilter w...

2018-11-19 Thread msokolov
Github user msokolov commented on a diff in the pull request:

https://github.com/apache/lucene-solr/pull/500#discussion_r234606810
  
--- Diff: 
lucene/test-framework/src/java/org/apache/lucene/analysis/TokenStreamUtil.java 
---
@@ -0,0 +1,41 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.lucene.analysis;
+
+import java.io.PrintStream;
+
+public final class TokenStreamUtil {
+
+  private TokenStreamUtil() {
+  }
+
+  /**
+   * Prints details about consumed tokens stored in any 
ValidatingTokenFilters in the input chain
+   * @param in
+   * @param out
+   */
+  public static void dumpValidatingTokenFilters(TokenStream in, 
PrintStream out) {
--- End diff --

Yes - I started from a broader project including the ability to inject 
arbitrary filters into any token stream. I was thinking of `ToDotTokenFilter`, 
but simplified it along the way since I didn't need that for debugging here. 
I'll move into VTF and create this later as needed.





Re: [JENKINS] Lucene-Solr-master-Linux (64bit/jdk1.8.0_172) - Build # 23236 - Failure!

2018-11-19 Thread Alan Woodward
I pushed a fix for this, sorry for the noise.

It looks as though Yetus doesn’t build the reference guide, which might be 
something to look into?

> On 19 Nov 2018, at 11:56, Policeman Jenkins Server  
> wrote:
> 
> Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23236/
> Java: 64bit/jdk1.8.0_172 -XX:+UseCompressedOops -XX:+UseParallelGC
> 
> All tests passed
> 
> Build Log:
> [...truncated 64420 lines...]
> [asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
> invalid part, must have at least one section (e.g., chapter, appendix, etc.)
> [asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
> part, must have at least one section (e.g., chapter, appendix, etc.)
> [java] Relative link points at dest file that doesn't exist: 
> ../../../../lucene/build/docs//analyzers-common/org/apache/lucene/analysis/util/MultiTermAwareComponent.html
> [java]  ... source: 
> file:/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-ref-guide/bare-bones-html/analyzers.html
> [java] Processed 2441 links (1992 relative) to 3219 anchors in 248 files
> [java] Total of 1 problems found
> 
> BUILD FAILED
> /home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:633: The following 
> error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:101: The following 
> error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:660: The 
> following error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:192: The 
> following error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/solr-ref-guide/build.xml:326:
>  Java returned: 255
> 
> Total time: 85 minutes 7 seconds
> Build step 'Invoke Ant' marked build as failure
> Archiving artifacts
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> [WARNINGS] Skipping publisher since build result is FAILURE
> Recording test results
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Email was triggered for: Failure - Any
> Sending email for trigger: Failure - Any
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> 
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org





[jira] [Commented] (LUCENE-8497) Rethink multi-term analysis handling

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8497?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691624#comment-16691624
 ] 

ASF subversion and git services commented on LUCENE-8497:
-

Commit c2bd3aed22b439168fb2bfadcdcee4fed09e4ff7 in lucene-solr's branch 
refs/heads/master from [~romseygeek]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=c2bd3ae ]

LUCENE-8497: Fix reference to MultiTermAwareComponenent in Solr reference guide


> Rethink multi-term analysis handling
> 
>
> Key: LUCENE-8497
> URL: https://issues.apache.org/jira/browse/LUCENE-8497
> Project: Lucene - Core
>  Issue Type: New Feature
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Fix For: master (8.0)
>
> Attachments: LUCENE-8497.patch, LUCENE-8497.patch, LUCENE-8497.patch, 
> LUCENE-8497.patch
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> The current framework for handling term normalisation works via instanceof 
> checks for MultiTermAwareComponent and casts.  MultiTermAwareComponent itself 
> deals in AbstractAnalysisComponents, and so callers need to cast to the 
> correct component type before use, which is ripe for misuse.
> We should re-organise all this to be type-safe and usable without casts.  One 
> possibility is to add `normalize` methods to CharFilterFactory and 
> TokenFilterFactory that mirror their existing `create` methods.  The default 
> implementation would return the input unchanged, while filters that should 
> apply at normalization time can delegate to `create`.
> Related to this, we should deprecate and remove LowerCaseTokenizer, which 
> combines tokenization and normalization in a way that will break this API.
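
A hedged sketch of the shape proposed above (illustration only, not the final LUCENE-8497 API): factories gain a {{normalize}} method whose default implementation returns the input unchanged, and filters that should also apply at normalization time delegate to {{create}}.

{code:java}
import org.apache.lucene.analysis.TokenStream;

abstract class TokenFilterFactorySketch {
  /** builds the analysis-time filter chain element */
  public abstract TokenStream create(TokenStream input);

  /** applied at normalization time; the default leaves the stream unchanged */
  public TokenStream normalize(TokenStream input) {
    return input;
  }
}

abstract class LowerCaseLikeFactorySketch extends TokenFilterFactorySketch {
  @Override
  public TokenStream normalize(TokenStream input) {
    return create(input); // lowercasing-style filters should also apply to query-time normalization
  }
}
{code}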






[jira] [Updated] (LUCENE-8564) Make it easier to iterate over graphs in tokenstreams

2018-11-19 Thread Alan Woodward (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8564?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Woodward updated LUCENE-8564:
--
Attachment: LUCENE-8564.patch

> Make it easier to iterate over graphs in tokenstreams
> -
>
> Key: LUCENE-8564
> URL: https://issues.apache.org/jira/browse/LUCENE-8564
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8564.patch, LUCENE-8564.patch
>
>
> We have a number of TokenFilters that read ahead in the token stream (eg 
> synonyms, shingles) and ideally these would understand token graphs as well 
> as linear streams.  FixedShingleFilter already has some mechanisms to deal 
> with graphs; this issue is to extract this logic into a GraphTokenStream 
> class that can then be reused by other token filters






Re: [JENKINS] Lucene-Solr-master-Linux (64bit/jdk1.8.0_172) - Build # 23236 - Failure!

2018-11-19 Thread Alan Woodward
I’m looking now...

> On 19 Nov 2018, at 11:56, Policeman Jenkins Server  
> wrote:
> 
> Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23236/
> Java: 64bit/jdk1.8.0_172 -XX:+UseCompressedOops -XX:+UseParallelGC
> 
> All tests passed
> 
> Build Log:
> [...truncated 64420 lines...]
> [asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
> invalid part, must have at least one section (e.g., chapter, appendix, etc.)
> [asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
> part, must have at least one section (e.g., chapter, appendix, etc.)
>[java] Relative link points at dest file that doesn't exist: 
> ../../../../lucene/build/docs//analyzers-common/org/apache/lucene/analysis/util/MultiTermAwareComponent.html
>[java]  ... source: 
> file:/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-ref-guide/bare-bones-html/analyzers.html
>[java] Processed 2441 links (1992 relative) to 3219 anchors in 248 files
>[java] Total of 1 problems found
> 
> BUILD FAILED
> /home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:633: The following 
> error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:101: The following 
> error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:660: The 
> following error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:192: The 
> following error occurred while executing this line:
> /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/solr-ref-guide/build.xml:326:
>  Java returned: 255
> 
> Total time: 85 minutes 7 seconds
> Build step 'Invoke Ant' marked build as failure
> Archiving artifacts
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> [WARNINGS] Skipping publisher since build result is FAILURE
> Recording test results
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Email was triggered for: Failure - Any
> Sending email for trigger: Failure - Any
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> Setting 
> ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
> 
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Linux (64bit/jdk1.8.0_172) - Build # 23236 - Failure!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23236/
Java: 64bit/jdk1.8.0_172 -XX:+UseCompressedOops -XX:+UseParallelGC

All tests passed

Build Log:
[...truncated 64420 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Relative link points at dest file that doesn't exist: 
../../../../lucene/build/docs//analyzers-common/org/apache/lucene/analysis/util/MultiTermAwareComponent.html
 [java]  ... source: 
file:/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-ref-guide/bare-bones-html/analyzers.html
 [java] Processed 2441 links (1992 relative) to 3219 anchors in 248 files
 [java] Total of 1 problems found

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:633: The following 
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:101: The following 
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:660: The 
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:192: The 
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/solr-ref-guide/build.xml:326:
 Java returned: 255

Total time: 85 minutes 7 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (LUCENE-8564) Make it easier to iterate over graphs in tokenstreams

2018-11-19 Thread Alan Woodward (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8564?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691605#comment-16691605
 ] 

Alan Woodward commented on LUCENE-8564:
---

I reworked this slightly; we now have GraphTokenFilter as an abstract extension 
of TokenFilter, rather than using a separate standalone object.  This makes 
reset()/end() easier to handle.
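
A minimal sketch (class and method names here are illustrative, not the API in 
the attached patch) of why an abstract TokenFilter base class simplifies the 
lifecycle: the cached look-ahead state can be cleared in reset()/end() in one 
place, so subclasses no longer have to forward those calls to a separate 
helper object.

import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.util.AttributeSource;

public abstract class LookaheadGraphFilter extends TokenFilter {

  // Captured states for tokens that have been read ahead but not yet emitted.
  private final Deque<AttributeSource.State> lookahead = new ArrayDeque<>();

  protected LookaheadGraphFilter(TokenStream input) {
    super(input);
  }

  /** Reads one more token from the wrapped stream into the cache. */
  protected final boolean peekToken() throws IOException {
    if (!input.incrementToken()) {
      return false;
    }
    lookahead.add(captureState());
    return true;
  }

  /** Emits the oldest cached token, if any. */
  protected final boolean nextCachedToken() {
    AttributeSource.State state = lookahead.poll();
    if (state == null) {
      return false;
    }
    restoreState(state);
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();      // resets the wrapped stream
    lookahead.clear();  // and the cached look-ahead, in one place for all subclasses
  }

  @Override
  public void end() throws IOException {
    super.end();
    lookahead.clear();
  }
}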

> Make it easier to iterate over graphs in tokenstreams
> -
>
> Key: LUCENE-8564
> URL: https://issues.apache.org/jira/browse/LUCENE-8564
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8564.patch, LUCENE-8564.patch
>
>
> We have a number of TokenFilters that read ahead in the token stream (e.g. 
> synonyms, shingles), and ideally these would understand token graphs as well 
> as linear streams.  FixedShingleFilter already has some mechanisms to deal 
> with graphs; this issue is to extract this logic into a GraphTokenStream 
> class that can then be reused by other token filters.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8539) Fix typos and style in TestStopFilter

2018-11-19 Thread Alessandro Benedetti (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8539?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691565#comment-16691565
 ] 

Alessandro Benedetti commented on LUCENE-8539:
--

I have reviewed the pull request and it is a big +1 from my side.
Can anyone from the community pick it up and give us their opinion?

> Fix typos and style in TestStopFilter
> -
>
> Key: LUCENE-8539
> URL: https://issues.apache.org/jira/browse/LUCENE-8539
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/other
>Reporter: Diego Ceccarelli
>Priority: Minor
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> This patch fixes some typos in TestStopFilter; it also contains some 
> refactoring of the tests to make them clearer.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-BadApples-Tests-7.x - Build # 219 - Still Unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/219/

1 tests failed.
FAILED:  org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting

Error Message:
Error from server at 
https://127.0.0.1:40187/solr/collection1_shard2_replica_n2: Expected mime type 
application/octet-stream but got text/html.
HTTP ERROR 404: Can not find: /solr/collection1_shard2_replica_n2/update
(Powered by Jetty 9.4.11.v20180605)

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:40187/solr/collection1_shard2_replica_n2: Expected 
mime type application/octet-stream but got text/html.
HTTP ERROR 404: Can not find: /solr/collection1_shard2_replica_n2/update
(Powered by Jetty 9.4.11.v20180605)
at 
__randomizedtesting.SeedInfo.seed([F6DE2AD86FE5141F:346916B06CA5E467]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:269)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rule

[jira] [Updated] (SOLR-12775) Add a deprecated implementation of LowerCaseTokenizer

2018-11-19 Thread Alan Woodward (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12775?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Woodward updated SOLR-12775:
-
Attachment: SOLR-12775.patch

> Add a deprecated implementation of LowerCaseTokenizer
> -
>
> Key: SOLR-12775
> URL: https://issues.apache.org/jira/browse/SOLR-12775
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Alan Woodward
>Priority: Blocker
> Fix For: master (8.0)
>
> Attachments: SOLR-12775.patch, SOLR-12775.patch, SOLR-12775.patch
>
>
> LUCENE-8498 will remove LowerCaseTokenizer and LowerCaseTokenizerFactory from 
> lucene 8.  To make upgrading from Solr 7.x to Solr 8 easier for users who 
> have schemas that use LowerCaseTokenizerFactory, we should add a deprecated 
> copy of the code to the {{org.apache.solr.analysis}} package.
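
Roughly, such a deprecated pass-through might look like the sketch below. 
This is not the attached patch: the class and package placement follow the 
issue description, and the create() body assumes the copied tokenizer; 
against 7.x it would compile with the existing 
org.apache.lucene.analysis.core.LowerCaseTokenizer import shown here.

package org.apache.solr.analysis;

import java.util.Map;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.core.LowerCaseTokenizer;
import org.apache.lucene.analysis.util.TokenizerFactory;
import org.apache.lucene.util.AttributeFactory;

@Deprecated
public class LowerCaseTokenizerFactory extends TokenizerFactory {

  public LowerCaseTokenizerFactory(Map<String, String> args) {
    super(args);
    if (!args.isEmpty()) {
      throw new IllegalArgumentException("Unknown parameters: " + args);
    }
  }

  @Override
  public Tokenizer create(AttributeFactory factory) {
    // On 8.0 this would delegate to the copied LowerCaseTokenizer proposed
    // by this issue, since the Lucene class goes away; users should migrate
    // to LetterTokenizer plus LowerCaseFilter.
    return new LowerCaseTokenizer(factory);
  }
}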



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12775) Add a deprecated implementation of LowerCaseTokenizer

2018-11-19 Thread Alan Woodward (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691533#comment-16691533
 ] 

Alan Woodward commented on SOLR-12775:
--

Final patch for Yetus to grind on

> Add a deprecated implementation of LowerCaseTokenizer
> -
>
> Key: SOLR-12775
> URL: https://issues.apache.org/jira/browse/SOLR-12775
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Alan Woodward
>Priority: Blocker
> Fix For: master (8.0)
>
> Attachments: SOLR-12775.patch, SOLR-12775.patch, SOLR-12775.patch
>
>
> LUCENE-8498 will remove LowerCaseTokenizer and LowerCaseTokenizerFactory from 
> lucene 8.  To make upgrading from Solr 7.x to Solr 8 easier for users who 
> have schemas that use LowerCaseTokenizerFactory, we should add a deprecated 
> copy of the code to the {{org.apache.solr.analysis}} package.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Assigned] (LUCENE-8497) Rethink multi-term analysis handling

2018-11-19 Thread Alan Woodward (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alan Woodward reassigned LUCENE-8497:
-

Assignee: Alan Woodward

> Rethink multi-term analysis handling
> 
>
> Key: LUCENE-8497
> URL: https://issues.apache.org/jira/browse/LUCENE-8497
> Project: Lucene - Core
>  Issue Type: New Feature
>Reporter: Alan Woodward
>Assignee: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8497.patch, LUCENE-8497.patch, LUCENE-8497.patch, 
> LUCENE-8497.patch
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> The current framework for handling term normalisation works via instanceof 
> checks for MultiTermAwareComponent and casts.  MultiTermAwareComponent itself 
> deals in AbstractAnalysisComponents, and so callers need to cast to the 
> correct component type before use, which is ripe for misuse.
> We should re-organise all this to be type-safe and usable without casts.  One 
> possibility is to add `normalize` methods to CharFilterFactory and 
> TokenFilterFactory that mirror their existing `create` methods.  The default 
> implementation would return the input unchanged, while filters that should 
> apply at normalization time can delegate to `create`.
> Related to this, we should deprecate and remove LowerCaseTokenizer, which 
> combines tokenization and normalization in a way that will break this API.
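
A simplified sketch of the API shape the description proposes (the names are 
prefixed with Sketch to make clear this is not the committed code): 
normalize() mirrors create(), defaults to returning the input unchanged, and 
a filter that should also run at normalization time simply delegates to 
create().

import org.apache.lucene.analysis.LowerCaseFilter;
import org.apache.lucene.analysis.TokenStream;

public abstract class SketchTokenFilterFactory {

  /** Wraps the stream for regular index/query-time analysis. */
  public abstract TokenStream create(TokenStream input);

  /** Wraps the stream for multi-term normalization; unchanged by default. */
  public TokenStream normalize(TokenStream input) {
    return input;
  }
}

class SketchLowerCaseFilterFactory extends SketchTokenFilterFactory {

  @Override
  public TokenStream create(TokenStream input) {
    return new LowerCaseFilter(input);
  }

  @Override
  public TokenStream normalize(TokenStream input) {
    // Lower-casing should also apply to wildcard/fuzzy terms, so it
    // delegates to create() instead of keeping the no-op default.
    return create(input);
  }
}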



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8497) Rethink multi-term analysis handling

2018-11-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8497?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691506#comment-16691506
 ] 

ASF subversion and git services commented on LUCENE-8497:
-

Commit 65486442c4a893a17cd70c9a865fa1af7c160aa3 in lucene-solr's branch 
refs/heads/master from [~romseygeek]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=6548644 ]

LUCENE-8497: Replace MultiTermAwareComponent with normalize() method


> Rethink multi-term analysis handling
> 
>
> Key: LUCENE-8497
> URL: https://issues.apache.org/jira/browse/LUCENE-8497
> Project: Lucene - Core
>  Issue Type: New Feature
>Reporter: Alan Woodward
>Priority: Major
> Attachments: LUCENE-8497.patch, LUCENE-8497.patch, LUCENE-8497.patch, 
> LUCENE-8497.patch
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> The current framework for handling term normalisation works via instanceof 
> checks for MultiTermAwareComponent and casts.  MultiTermAwareComponent itself 
> deals in AbstractAnalysisComponents, and so callers need to cast to the 
> correct component type before use, which is ripe for misuse.
> We should re-organise all this to be type-safe and usable without casts.  One 
> possibility is to add `normalize` methods to CharFilterFactory and 
> TokenFilterFactory that mirror their existing `create` methods.  The default 
> implementation would return the input unchanged, while filters that should 
> apply at normalization time can delegate to `create`.
> Related to this, we should deprecate and remove LowerCaseTokenizer, which 
> combines tokenization and normalization in a way that will break this API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-7.6-Linux (32bit/jdk1.8.0_172) - Build # 1 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/1/
Java: 32bit/jdk1.8.0_172 -server -XX:+UseConcMarkSweepGC

6 tests failed.
FAILED:  org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting

Error Message:
Error from server at 
https://127.0.0.1:37957/solr/collection1_shard2_replica_n3: Expected mime type 
application/octet-stream but got text/html.
HTTP ERROR 404: Can not find: /solr/collection1_shard2_replica_n3/update
(Powered by Jetty 9.4.11.v20180605)

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at https://127.0.0.1:37957/solr/collection1_shard2_replica_n3: Expected 
mime type application/octet-stream but got text/html.
HTTP ERROR 404: Can not find: /solr/collection1_shard2_replica_n3/update
(Powered by Jetty 9.4.11.v20180605)
at 
__randomizedtesting.SeedInfo.seed([522728E7CA833A8:C7954EE67FE8C3D0]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:238)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfte

[JENKINS-EA] Lucene-Solr-7.6-Windows (64bit/jdk-12-ea+12) - Build # 1 - Unstable!

2018-11-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Windows/1/
Java: 64bit/jdk-12-ea+12 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

15 tests failed.
FAILED:  
org.apache.solr.cloud.LIRRollingUpdatesTest.testNewLeaderAndMixedReplicas

Error Message:
IOException occured when talking to server at: https://127.0.0.1:57912/solr

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: IOException occured when 
talking to server at: https://127.0.0.1:57912/solr
at 
__randomizedtesting.SeedInfo.seed([F8A80A4358E435B0:4DDFE4D8638066D0]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:657)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.doRequest(LBHttpSolrClient.java:483)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:413)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1107)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.testLeaderAndMixedReplicas(LIRRollingUpdatesTest.java:333)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.testNewLeaderAndMixedReplicas(LIRRollingUpdatesTest.java:340)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  

[JENKINS] Lucene-Solr-Tests-master - Build # 2954 - Still Unstable

2018-11-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2954/

5 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.ComputePlanActionTest.testSelectedCollections

Error Message:
Trigger was not fired even after 5 seconds

Stack Trace:
java.lang.AssertionError: Trigger was not fired even after 5 seconds
at 
__randomizedtesting.SeedInfo.seed([D5F837BA9E56E18D:EF56D263A03238E3]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.cloud.autoscaling.ComputePlanActionTest.testSelectedCollections(ComputePlanActionTest.java:493)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  
org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest.testMergeIntegration

Error Message:
Error from server at https://127.0.0.1:42611/solr: create the collection time 
out:180s

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server
