Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/3781/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseParallelGC

1 test failed.
FAILED:  org.apache.solr.cloud.PeerSyncReplicationTest.test

Error Message:
PeerSynced node did not become leader expected:<CloudJettyRunner [url=http://127.0.0.1:64262/lq_fx/collection1]> but was:<CloudJettyRunner [url=http://127.0.0.1:64258/lq_fx/collection1]>

Stack Trace:
java.lang.AssertionError: PeerSynced node did not become leader expected:<CloudJettyRunner [url=http://127.0.0.1:64262/lq_fx/collection1]> but was:<CloudJettyRunner [url=http://127.0.0.1:64258/lq_fx/collection1]>
        at __randomizedtesting.SeedInfo.seed([66390243AC03ACF0:EE6D3D9902FFC108]:0)
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.failNotEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:128)
        at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:162)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
        at java.lang.Thread.run(Thread.java:745)
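
For reference, the frame at PeerSyncReplicationTest.java:162 is a JUnit equality assertion on which CloudJettyRunner currently holds the shard1 leadership: per the expected/actual values above, the node on port 64258 (shown winning the initial shard1 election in the log below) still held the leadership when the check ran, rather than the peer-synced node on port 64262. A minimal sketch of that kind of check follows; the field peerSyncedReplica and the helper currentShardLeader() are hypothetical stand-ins, not the test's actual members.

    import static org.junit.Assert.assertEquals;

    public class LeaderAssertionSketch {
        // Hypothetical stand-ins for the test's CloudJettyRunner handles (not the real test code).
        static Object peerSyncedReplica;                      // node expected to win the shard1 election
        static Object currentShardLeader() { return null; }   // would look up the current shard1 leader from cluster state

        static void assertPeerSyncedNodeIsLeader() {
            // Mirrors the failure text above: message, expected leader, actual leader.
            assertEquals("PeerSynced node did not become leader",
                    peerSyncedReplica, currentShardLeader());
        }
    }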




Build Log:
[...truncated 10693 lines...]
   [junit4] Suite: org.apache.solr.cloud.PeerSyncReplicationTest
   [junit4]   2> Creating dataDir: 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/init-core-data-001
   [junit4]   2> 112366 INFO  
(SUITE-PeerSyncReplicationTest-seed#[66390243AC03ACF0]-worker) [    ] 
o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: 
@org.apache.solr.util.RandomizeSSL(reason=, value=NaN, ssl=NaN, clientAuth=NaN) 
w/ MAC_OS_X suppressed clientAuth
   [junit4]   2> 112366 INFO  
(SUITE-PeerSyncReplicationTest-seed#[66390243AC03ACF0]-worker) [    ] 
o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /lq_fx/
   [junit4]   2> 112369 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 112369 INFO  (Thread-80) [    ] o.a.s.c.ZkTestServer client 
port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 112369 INFO  (Thread-80) [    ] o.a.s.c.ZkTestServer Starting 
server
   [junit4]   2> 112474 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkTestServer start zk server on port:64250
   [junit4]   2> 112508 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml
 to /configs/conf1/solrconfig.xml
   [junit4]   2> 112513 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/schema.xml
 to /configs/conf1/schema.xml
   [junit4]   2> 112517 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml
 to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 112520 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/stopwords.txt
 to /configs/conf1/stopwords.txt
   [junit4]   2> 112523 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/protwords.txt
 to /configs/conf1/protwords.txt
   [junit4]   2> 112526 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/currency.xml
 to /configs/conf1/currency.xml
   [junit4]   2> 112529 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml
 to /configs/conf1/enumsConfig.xml
   [junit4]   2> 112532 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json
 to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 112535 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt
 to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 112539 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt
 to /configs/conf1/old_synonyms.txt
   [junit4]   2> 112542 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractZkTestCase put 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/synonyms.txt
 to /configs/conf1/synonyms.txt
   [junit4]   2> 112837 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.SolrTestCaseJ4 Writing core.properties file to 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/cores/collection1
   [junit4]   2> 112840 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 112843 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@253c16dd{/lq_fx,null,AVAILABLE}
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Started 
ServerConnector@3a9ddca{HTTP/1.1,[http/1.1]}{127.0.0.1:64253}
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server Started @118696ms
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: 
{solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/tempDir-001/control/data,
 hostContext=/lq_fx, hostPort=64253, 
coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/cores}
   [junit4]   2> 112845 ERROR 
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be 
missing or incomplete.
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 
7.0.0
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 112845 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 112846 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 
2017-01-15T21:50:49.902Z
   [junit4]   2> 112852 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 112852 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.SolrXmlConfig Loading container configuration from 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/solr.xml
   [junit4]   2> 112866 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:64250/solr
   [junit4]   2> 112918 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.OverseerElectionContext I am going to be 
the leader 127.0.0.1:64253_lq_fx
   [junit4]   2> 112920 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.Overseer Overseer 
(id=97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) starting
   [junit4]   2> 112940 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:64253_lq_fx
   [junit4]   2> 112942 INFO  
(zkCallback-74-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (0) -> (1)
   [junit4]   2> 113026 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.CorePropertiesLocator Found 1 core 
definitions underneath 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/cores
   [junit4]   2> 113026 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.CorePropertiesLocator Cores are: 
[collection1]
   [junit4]   2> 113032 INFO  
(OverseerStateUpdate-97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.o.ReplicaMutator Assigning new node to 
shard shard=shard1
   [junit4]   2> 114051 WARN  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] o.a.s.c.Config 
Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> 
instead.
   [junit4]   2> 114051 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 114064 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 114138 WARN  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.s.IndexSchema [collection1] default search field in schema is text. 
WARNING: Deprecated, please use 'df' on request instead.
   [junit4]   2> 114140 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 114162 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from 
collection control_collection
   [junit4]   2> 114184 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] o.a.s.c.SolrCore 
[[collection1] ] Opening new SolrCore at 
[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/cores/collection1],
 
dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/control-001/cores/collection1/data/]
   [junit4]   2> 114184 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX 
Server: com.sun.jmx.mbeanserver.JmxMBeanServer@754dfb5e
   [junit4]   2> 114188 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=50, maxMergeAtOnceExplicit=18, maxMergedSegmentMB=67.0595703125, 
floorSegmentMB=0.7548828125, forceMergeDeletesPctAllowed=21.08628022637968, 
segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 114201 WARN  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = 
requestHandler,name = /dump,class = DumpRequestHandler,attributes = 
{initParams=a, name=/dump, class=DumpRequestHandler},args = 
{defaults={a=A,b=B}}}
   [junit4]   2> 114217 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.UpdateHandler Using UpdateLog implementation: 
org.apache.solr.update.UpdateLog
   [junit4]   2> 114217 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH 
numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 114218 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 114218 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 114218 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: 
minMergeSize=1677721, mergeFactor=24, maxMergeSize=2147483648, 
maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, 
maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, 
noCFSRatio=0.4692800368962414]
   [junit4]   2> 114220 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.s.SolrIndexSearcher Opening [Searcher@59f78cb7[collection1] main]
   [junit4]   2> 114222 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: 
/configs/conf1
   [junit4]   2> 114223 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 114223 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 114224 INFO  
(searcherExecutor-234-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection   
x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher 
Searcher@59f78cb7[collection1] 
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 114225 INFO  
(coreLoadExecutor-233-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx c:control_collection   x:collection1] 
o.a.s.u.UpdateLog Could not find max version in index or recent updates, using 
new clock 1556628951564025856
   [junit4]   2> 114238 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas 
found to continue.
   [junit4]   2> 114238 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new 
leader - try and sync
   [junit4]   2> 114238 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to 
http://127.0.0.1:64253/lq_fx/collection1/
   [junit4]   2> 114238 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync 
replicas to me
   [junit4]   2> 114238 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.SyncStrategy 
http://127.0.0.1:64253/lq_fx/collection1/ has no replicas
   [junit4]   2> 114246 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new 
leader: http://127.0.0.1:64253/lq_fx/collection1/ shard1
   [junit4]   2> 114402 INFO  
(coreZkRegister-226-thread-1-processing-n:127.0.0.1:64253_lq_fx x:collection1 
c:control_collection) [n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 
r:core_node1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery 
necessary
   [junit4]   2> 114576 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 114578 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:64250/solr ready
   [junit4]   2> 114578 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection 
loss:false
   [junit4]   2> 114878 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.SolrTestCaseJ4 Writing core.properties file to 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/cores/collection1
   [junit4]   2> 114879 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001
   [junit4]   2> 114880 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 114882 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@337241d7{/lq_fx,null,AVAILABLE}
   [junit4]   2> 114882 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Started 
ServerConnector@1c8d94c6{HTTP/1.1,[http/1.1]}{127.0.0.1:64258}
   [junit4]   2> 114883 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server Started @120735ms
   [junit4]   2> 114883 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: 
{solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/tempDir-001/jetty1,
 solrconfig=solrconfig.xml, hostContext=/lq_fx, hostPort=64258, 
coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/cores}
   [junit4]   2> 114883 ERROR 
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be 
missing or incomplete.
   [junit4]   2> 114884 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 
7.0.0
   [junit4]   2> 114884 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 114884 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 114884 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 
2017-01-15T21:50:51.940Z
   [junit4]   2> 114889 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 114889 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.SolrXmlConfig Loading container configuration from 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/solr.xml
   [junit4]   2> 114912 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:64250/solr
   [junit4]   2> 114934 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (0) -> (1)
   [junit4]   2> 114945 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:64258_lq_fx
   [junit4]   2> 114949 INFO  
(zkCallback-74-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (1) -> (2)
   [junit4]   2> 114950 INFO  
(zkCallback-83-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (1) -> (2)
   [junit4]   2> 114950 INFO  (zkCallback-78-thread-1) [    ] 
o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 114998 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.CorePropertiesLocator Found 1 core 
definitions underneath 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/cores
   [junit4]   2> 114998 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.CorePropertiesLocator Cores are: 
[collection1]
   [junit4]   2> 115002 INFO  
(OverseerStateUpdate-97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.o.ReplicaMutator Assigning new node to 
shard shard=shard1
   [junit4]   2> 116028 WARN  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.Config 
Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> 
instead.
   [junit4]   2> 116028 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.SolrConfig 
Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 116041 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] Schema name=test
   [junit4]   2> 116117 WARN  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] default search field in schema is text. WARNING: Deprecated, 
please use 'df' on request instead.
   [junit4]   2> 116121 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 116144 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.CoreContainer 
Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 116144 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.SolrCore 
[[collection1] ] Opening new SolrCore at 
[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/cores/collection1],
 
dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-1-001/cores/collection1/data/]
   [junit4]   2> 116144 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap 
JMX monitoring is enabled. Adding Solr mbeans to JMX Server: 
com.sun.jmx.mbeanserver.JmxMBeanServer@754dfb5e
   [junit4]   2> 116148 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=50, maxMergeAtOnceExplicit=18, maxMergedSegmentMB=67.0595703125, 
floorSegmentMB=0.7548828125, forceMergeDeletesPctAllowed=21.08628022637968, 
segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 116174 WARN  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.c.RequestHandlers 
INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class 
= DumpRequestHandler,attributes = {initParams=a, name=/dump, 
class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 116193 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateHandler 
Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 116193 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog 
Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 
maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 116194 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Hard AutoCommit: disabled
   [junit4]   2> 116194 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Soft AutoCommit: disabled
   [junit4]   2> 116195 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: 
minMergeSize=1677721, mergeFactor=24, maxMergeSize=2147483648, 
maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, 
maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, 
noCFSRatio=0.4692800368962414]
   [junit4]   2> 116198 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.s.SolrIndexSearcher Opening [Searcher@1fc6449f[collection1] main]
   [junit4]   2> 116200 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: 
/configs/conf1
   [junit4]   2> 116201 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 116201 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 116202 INFO  
(searcherExecutor-245-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] 
o.a.s.c.SolrCore [collection1] Registered new searcher 
Searcher@1fc6449f[collection1] 
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 116203 INFO  
(coreLoadExecutor-244-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog Could 
not find max version in index or recent updates, using new clock 
1556628953638109184
   [junit4]   2> 116215 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to 
continue.
   [junit4]   2> 116215 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try 
and sync
   [junit4]   2> 116215 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy Sync replicas to 
http://127.0.0.1:64258/lq_fx/collection1/
   [junit4]   2> 116215 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 116215 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.SyncStrategy http://127.0.0.1:64258/lq_fx/collection1/ 
has no replicas
   [junit4]   2> 116222 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: 
http://127.0.0.1:64258/lq_fx/collection1/ shard1
   [junit4]   2> 116379 INFO  
(coreZkRegister-239-thread-1-processing-n:127.0.0.1:64258_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64258_lq_fx c:collection1 s:shard1 r:core_node1 
x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 116849 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.SolrTestCaseJ4 Writing core.properties file to 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/cores/collection1
   [junit4]   2> 116851 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001
   [junit4]   2> 116852 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 116854 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@5689c452{/lq_fx,null,AVAILABLE}
   [junit4]   2> 116854 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Started 
ServerConnector@109335b1{HTTP/1.1,[http/1.1]}{127.0.0.1:64262}
   [junit4]   2> 116854 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server Started @122706ms
   [junit4]   2> 116854 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: 
{solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/tempDir-001/jetty2,
 solrconfig=solrconfig.xml, hostContext=/lq_fx, hostPort=64262, 
coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/cores}
   [junit4]   2> 116855 ERROR 
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be 
missing or incomplete.
   [junit4]   2> 116856 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 
7.0.0
   [junit4]   2> 116856 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 116856 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 116856 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 
2017-01-15T21:50:53.912Z
   [junit4]   2> 116862 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 116862 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.SolrXmlConfig Loading container configuration from 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/solr.xml
   [junit4]   2> 116875 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:64250/solr
   [junit4]   2> 116897 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (0) -> (2)
   [junit4]   2> 116907 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:64262_lq_fx
   [junit4]   2> 116910 INFO  (zkCallback-78-thread-1) [    ] 
o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 116911 INFO  
(zkCallback-89-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (2) -> (3)
   [junit4]   2> 116910 INFO  
(zkCallback-74-thread-3-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (2) -> (3)
   [junit4]   2> 116910 INFO  
(zkCallback-83-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (2) -> (3)
   [junit4]   2> 116953 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.CorePropertiesLocator Found 1 core 
definitions underneath 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/cores
   [junit4]   2> 116953 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.CorePropertiesLocator Cores are: 
[collection1]
   [junit4]   2> 116958 INFO  
(OverseerStateUpdate-97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.o.ReplicaMutator Assigning new node to 
shard shard=shard1
   [junit4]   2> 117982 WARN  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.Config 
Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> 
instead.
   [junit4]   2> 117983 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.SolrConfig 
Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 117995 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] Schema name=test
   [junit4]   2> 118060 WARN  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] default search field in schema is text. WARNING: Deprecated, 
please use 'df' on request instead.
   [junit4]   2> 118069 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 118091 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.CoreContainer 
Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 118092 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.SolrCore 
[[collection1] ] Opening new SolrCore at 
[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/cores/collection1],
 
dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-2-001/cores/collection1/data/]
   [junit4]   2> 118092 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap 
JMX monitoring is enabled. Adding Solr mbeans to JMX Server: 
com.sun.jmx.mbeanserver.JmxMBeanServer@754dfb5e
   [junit4]   2> 118095 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=50, maxMergeAtOnceExplicit=18, maxMergedSegmentMB=67.0595703125, 
floorSegmentMB=0.7548828125, forceMergeDeletesPctAllowed=21.08628022637968, 
segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 118108 WARN  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.c.RequestHandlers 
INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class 
= DumpRequestHandler,attributes = {initParams=a, name=/dump, 
class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 118581 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateHandler 
Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 118581 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog 
Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 
maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 118582 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Hard AutoCommit: disabled
   [junit4]   2> 118582 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Soft AutoCommit: disabled
   [junit4]   2> 118583 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: 
minMergeSize=1677721, mergeFactor=24, maxMergeSize=2147483648, 
maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, 
maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, 
noCFSRatio=0.4692800368962414]
   [junit4]   2> 118585 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.s.SolrIndexSearcher Opening [Searcher@5019506c[collection1] main]
   [junit4]   2> 118587 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: 
/configs/conf1
   [junit4]   2> 118588 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 118588 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 118589 INFO  
(searcherExecutor-256-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] 
o.a.s.c.SolrCore [collection1] Registered new searcher 
Searcher@5019506c[collection1] 
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 118590 INFO  
(coreLoadExecutor-255-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog Could 
not find max version in index or recent updates, using new clock 
1556628956141060096
   [junit4]   2> 118597 INFO  
(coreZkRegister-250-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64262_lq_fx c:collection1 s:shard1 r:core_node2 
x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
   [junit4]   2> 118597 INFO  
(updateExecutor-86-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.DefaultSolrCoreState Running 
recovery
   [junit4]   2> 118598 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery 
process. recoveringAfterStartup=true
   [junit4]   2> 118598 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy ###### 
startupVersions=[[]]
   [junit4]   2> 118598 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering 
updates. core=[collection1]
   [junit4]   2> 118598 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Starting to buffer 
updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 118598 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Publishing state 
of core [collection1] as recovering, leader is 
[http://127.0.0.1:64258/lq_fx/collection1/] and I am 
[http://127.0.0.1:64262/lq_fx/collection1/]
   [junit4]   2> 118601 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Sending prep 
recovery command to [http://127.0.0.1:64258/lq_fx]; [WaitForState: 
action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:64262_lq_fx&coreNodeName=core_node2&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 118606 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node2, state: 
recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
   [junit4]   2> 118607 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 
(shard1 of collection1) have state: recovering
   [junit4]   2> 118607 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, 
shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, 
nodeName=127.0.0.1:64262_lq_fx, coreNodeName=core_node2, 
onlyIfActiveCheckResult=false, nodeProps: 
core_node2:{"core":"collection1","base_url":"http://127.0.0.1:64262/lq_fx","node_name":"127.0.0.1:64262_lq_fx","state":"recovering"}
   [junit4]   2> 118607 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node2, state: recovering, 
checkLive: true, onlyIfLeader: true for: 0 seconds.
   [junit4]   2> 118607 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores 
params={nodeName=127.0.0.1:64262_lq_fx&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2}
 status=0 QTime=1
   [junit4]   2> 119290 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.SolrTestCaseJ4 Writing core.properties file to 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/cores/collection1
   [junit4]   2> 119300 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001
   [junit4]   2> 119301 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 119305 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Started 
o.e.j.s.ServletContextHandler@4a8f0564{/lq_fx,null,AVAILABLE}
   [junit4]   2> 119307 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Started 
ServerConnector@6a602a5c{HTTP/1.1,[http/1.1]}{127.0.0.1:64267}
   [junit4]   2> 119307 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.Server Started @125158ms
   [junit4]   2> 119307 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.s.e.JettySolrRunner Jetty properties: 
{solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/tempDir-001/jetty3,
 solrconfig=solrconfig.xml, hostContext=/lq_fx, hostPort=64267, 
coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/cores}
   [junit4]   2> 119308 ERROR 
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be 
missing or incomplete.
   [junit4]   2> 119308 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 
7.0.0
   [junit4]   2> 119309 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 119309 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 119309 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 
2017-01-15T21:50:56.365Z
   [junit4]   2> 119317 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in 
ZooKeeper)
   [junit4]   2> 119317 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.SolrXmlConfig Loading container configuration from 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/solr.xml
   [junit4]   2> 119329 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:64250/solr
   [junit4]   2> 119353 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64267_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (0) -> (3)
   [junit4]   2> 119366 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64267_lq_fx    ] o.a.s.c.ZkController Register node as live in 
ZooKeeper:/live_nodes/127.0.0.1:64267_lq_fx
   [junit4]   2> 119370 INFO  
(zkCallback-83-thread-1-processing-n:127.0.0.1:64258_lq_fx) 
[n:127.0.0.1:64258_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (3) -> (4)
   [junit4]   2> 119371 INFO  
(zkCallback-89-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (3) -> (4)
   [junit4]   2> 119371 INFO  (zkCallback-78-thread-1) [    ] 
o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 119371 INFO  
(zkCallback-74-thread-2-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (3) -> (4)
   [junit4]   2> 119372 INFO  
(zkCallback-96-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (3) -> (4)
   [junit4]   2> 119436 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64267_lq_fx    ] o.a.s.c.CorePropertiesLocator Found 1 core 
definitions underneath 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/cores
   [junit4]   2> 119436 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) 
[n:127.0.0.1:64267_lq_fx    ] o.a.s.c.CorePropertiesLocator Cores are: 
[collection1]
   [junit4]   2> 119442 INFO  
(OverseerStateUpdate-97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.o.ReplicaMutator Assigning new node to 
shard shard=shard1
   [junit4]   2> 120459 WARN  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.Config 
Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> 
instead.
   [junit4]   2> 120459 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.SolrConfig 
Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 120472 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] Schema name=test
   [junit4]   2> 120558 WARN  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
[collection1] default search field in schema is text. WARNING: Deprecated, 
please use 'df' on request instead.
   [junit4]   2> 120560 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.s.IndexSchema 
Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 120583 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.CoreContainer 
Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 120583 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.SolrCore 
[[collection1] ] Opening new SolrCore at 
[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/cores/collection1],
 
dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001/shard-3-001/cores/collection1/data/]
   [junit4]   2> 120583 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap 
JMX monitoring is enabled. Adding Solr mbeans to JMX Server: 
com.sun.jmx.mbeanserver.JmxMBeanServer@754dfb5e
   [junit4]   2> 120586 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=50, maxMergeAtOnceExplicit=18, maxMergedSegmentMB=67.0595703125, 
floorSegmentMB=0.7548828125, forceMergeDeletesPctAllowed=21.08628022637968, 
segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 120599 WARN  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.c.RequestHandlers 
INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class 
= DumpRequestHandler,attributes = {initParams=a, name=/dump, 
class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 120621 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateHandler 
Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 120621 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog 
Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 
maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 120624 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Hard AutoCommit: disabled
   [junit4]   2> 120624 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.u.CommitTracker 
Soft AutoCommit: disabled
   [junit4]   2> 120625 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: 
minMergeSize=1677721, mergeFactor=24, maxMergeSize=2147483648, 
maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, 
maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, 
noCFSRatio=0.4692800368962414]
   [junit4]   2> 120636 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.s.SolrIndexSearcher Opening [Searcher@40f47f73[collection1] main]
   [junit4]   2> 120638 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: 
/configs/conf1
   [junit4]   2> 120639 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 120639 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 120640 INFO  
(searcherExecutor-267-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] 
o.a.s.c.SolrCore [collection1] Registered new searcher 
Searcher@40f47f73[collection1] 
main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 120641 INFO  
(coreLoadExecutor-266-thread-1-processing-n:127.0.0.1:64267_lq_fx) 
[n:127.0.0.1:64267_lq_fx c:collection1   x:collection1] o.a.s.u.UpdateLog Could 
not find max version in index or recent updates, using new clock 
1556628958291689472
   [junit4]   2> 120647 INFO  
(coreZkRegister-261-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
c:collection1) [n:127.0.0.1:64267_lq_fx c:collection1 s:shard1 r:core_node3 
x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
   [junit4]   2> 120648 INFO  
(updateExecutor-93-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.DefaultSolrCoreState Running 
recovery
   [junit4]   2> 120648 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery 
process. recoveringAfterStartup=true
   [junit4]   2> 120648 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy ###### 
startupVersions=[[]]
   [junit4]   2> 120649 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering 
updates. core=[collection1]
   [junit4]   2> 120649 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Starting to buffer 
updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 120649 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Publishing state 
of core [collection1] as recovering, leader is 
[http://127.0.0.1:64258/lq_fx/collection1/] and I am 
[http://127.0.0.1:64267/lq_fx/collection1/]
   [junit4]   2> 120651 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Sending prep 
recovery command to [http://127.0.0.1:64258/lq_fx]; [WaitForState: 
action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:64267_lq_fx&coreNodeName=core_node3&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 120653 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node3, state: 
recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
   [junit4]   2> 120654 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 
(shard1 of collection1) have state: recovering
   [junit4]   2> 120654 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, 
shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=down, localState=active, 
nodeName=127.0.0.1:64267_lq_fx, coreNodeName=core_node3, 
onlyIfActiveCheckResult=false, nodeProps: 
core_node3:{"core":"collection1","base_url":"http://127.0.0.1:64267/lq_fx","node_name":"127.0.0.1:64267_lq_fx","state":"down"}
   [junit4]   2> 120980 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.SolrTestCaseJ4 ###Starting test
   [junit4]   2> 120980 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 30 
for each attempt
   [junit4]   2> 120980 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection: 
collection1 failOnTimeout:true timeout (sec):30
   [junit4]   2> 121659 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, 
shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, 
nodeName=127.0.0.1:64267_lq_fx, coreNodeName=core_node3, 
onlyIfActiveCheckResult=false, nodeProps: 
core_node3:{"core":"collection1","base_url":"http://127.0.0.1:64267/lq_fx","node_name":"127.0.0.1:64267_lq_fx","state":"recovering"}
   [junit4]   2> 121659 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node3, state: recovering, 
checkLive: true, onlyIfLeader: true for: 1 seconds.
   [junit4]   2> 121659 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx    ] 
o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores 
params={nodeName=127.0.0.1:64267_lq_fx&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node3&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2}
 status=0 QTime=1005
   [junit4]   2> 125614 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Attempting to 
PeerSync from [http://127.0.0.1:64258/lq_fx/collection1/] - 
recoveringAfterStartup=[true]
   [junit4]   2> 125614 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync PeerSync: 
core=collection1 url=http://127.0.0.1:64262/lq_fx START 
replicas=[http://127.0.0.1:64258/lq_fx/collection1/] nUpdates=1000
   [junit4]   2> 125618 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint 
IndexFingerprint millis:1.0 result:{maxVersionSpecified=9223372036854775807, 
maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, 
maxDoc=0}
   [junit4]   2> 125618 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request 
[collection1]  webapp=/lq_fx path=/get 
params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2}
 status=0 QTime=1
   [junit4]   2> 125619 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint 
millis:0.0 result:{maxVersionSpecified=9223372036854775807, 
maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, 
maxDoc=0}
   [junit4]   2> 125619 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync We are already in sync. 
No need to do a PeerSync 
   [junit4]   2> 125619 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 125619 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No 
uncommitted changes. Skipping IW.commit.
   [junit4]   2> 125620 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 
end_commit_flush
   [junit4]   2> 125620 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of 
recovery was successful.
   [junit4]   2> 125620 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates 
buffered during PeerSync.
   [junit4]   2> 125620 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
   [junit4]   2> 125620 INFO  
(recoveryExecutor-87-thread-1-processing-n:127.0.0.1:64262_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:64262_lq_fx c:collection1 
s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Registering as 
Active after recovery.
   [junit4]   2> 128670 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Attempting to 
PeerSync from [http://127.0.0.1:64258/lq_fx/collection1/] - 
recoveringAfterStartup=[true]
   [junit4]   2> 128671 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync PeerSync: 
core=collection1 url=http://127.0.0.1:64267/lq_fx START 
replicas=[http://127.0.0.1:64258/lq_fx/collection1/] nUpdates=1000
   [junit4]   2> 128673 INFO  (qtp1889428961-488) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint 
IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, 
maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, 
maxDoc=0}
   [junit4]   2> 128673 INFO  (qtp1889428961-488) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request 
[collection1]  webapp=/lq_fx path=/get 
params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2}
 status=0 QTime=0
   [junit4]   2> 128675 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint 
millis:1.0 result:{maxVersionSpecified=9223372036854775807, 
maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, 
maxDoc=0}
   [junit4]   2> 128675 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync We are already in sync. 
No need to do a PeerSync 
   [junit4]   2> 128675 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 128675 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No 
uncommitted changes. Skipping IW.commit.
   [junit4]   2> 128676 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 
end_commit_flush
   [junit4]   2> 128676 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of 
recovery was successful.
   [junit4]   2> 128676 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates 
buffered during PeerSync.
   [junit4]   2> 128676 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
   [junit4]   2> 128676 INFO  
(recoveryExecutor-94-thread-1-processing-n:127.0.0.1:64267_lq_fx x:collection1 
s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:64267_lq_fx c:collection1 
s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Registering as 
Active after recovery.
   [junit4]   2> 129012 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection: collection1
   [junit4]   2> 129016 INFO  (qtp461909716-449) [n:127.0.0.1:64253_lq_fx 
c:control_collection s:shard1 r:core_node1 x:collection1] 
o.a.s.u.DirectUpdateHandler2 start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 129016 INFO  (qtp461909716-449) [n:127.0.0.1:64253_lq_fx 
c:control_collection s:shard1 r:core_node1 x:collection1] 
o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 129018 INFO  (qtp461909716-449) [n:127.0.0.1:64253_lq_fx 
c:control_collection s:shard1 r:core_node1 x:collection1] 
o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 129018 INFO  (qtp461909716-449) [n:127.0.0.1:64253_lq_fx 
c:control_collection s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=}
 0 2
   [junit4]   2> 129024 INFO  (qtp1889428961-484) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 
start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 129024 INFO  (qtp1889428961-484) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 
No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 129025 INFO  (qtp1889428961-484) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 
end_commit_flush
   [junit4]   2> 129026 INFO  (qtp1889428961-484) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=}
 0 2
   [junit4]   2> 129027 INFO  (qtp1241384-549) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 
start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 129028 INFO  (qtp911078381-514) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 
start 
commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 129028 INFO  (qtp1241384-549) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 
No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 129029 INFO  (qtp1241384-549) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 
end_commit_flush
   [junit4]   2> 129029 INFO  (qtp911078381-514) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 
No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 129029 INFO  (qtp1241384-549) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=}
 0 2
   [junit4]   2> 129030 INFO  (qtp911078381-514) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 
end_commit_flush
   [junit4]   2> 129030 INFO  (qtp911078381-514) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=}
 0 2
   [junit4]   2> 129032 INFO  (qtp1889428961-490) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=}
 0 12
   [junit4]   2> 129035 INFO  (qtp1889428961-491) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request 
[collection1]  webapp=/lq_fx path=/select 
params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2}
 hits=0 status=0 QTime=0
   [junit4]   2> 129037 INFO  (qtp911078381-515) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.S.Request 
[collection1]  webapp=/lq_fx path=/select 
params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2}
 hits=0 status=0 QTime=0
   [junit4]   2> 129038 INFO  (qtp1241384-550) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.S.Request 
[collection1]  webapp=/lq_fx path=/select 
params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2}
 hits=0 status=0 QTime=0
   [junit4]   2> 131052 INFO  (qtp461909716-450) [n:127.0.0.1:64253_lq_fx 
c:control_collection s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{deleteByQuery=*:* (-1556628969202122752)} 0 5
   [junit4]   2> 131058 INFO  (qtp1241384-551) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&_version_=-1556628969209462784&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{deleteByQuery=*:*
 (-1556628969209462784)} 0 2
   [junit4]   2> 131060 INFO  (qtp911078381-515) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&_version_=-1556628969209462784&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{deleteByQuery=*:*
 (-1556628969209462784)} 0 3
   [junit4]   2> 131060 INFO  (qtp1889428961-486) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{deleteByQuery=*:* (-1556628969209462784)} 0 7
   [junit4]   2> 131085 INFO  (qtp911078381-517) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[0
 (1556628969222045696)]} 0 12
   [junit4]   2> 131085 INFO  (qtp1241384-552) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[0
 (1556628969222045696)]} 0 12
   [junit4]   2> 131086 INFO  (qtp1889428961-487) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[0 (1556628969222045696)]} 0 21
   [junit4]   2> 131098 INFO  (qtp1241384-553) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[1
 (1556628969246162944)]} 0 4
   [junit4]   2> 131101 INFO  (qtp911078381-518) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[1
 (1556628969246162944)]} 0 3
   [junit4]   2> 131102 INFO  (qtp1889428961-488) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[1 (1556628969246162944)]} 0 14
   [junit4]   2> 131105 INFO  (qtp911078381-519) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[2
 (1556628969261891584)]} 0 0
   [junit4]   2> 131105 INFO  (qtp1241384-554) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[2
 (1556628969261891584)]} 0 0
   [junit4]   2> 131106 INFO  (qtp1889428961-489) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[2 (1556628969261891584)]} 0 3
   [junit4]   2> 131110 INFO  (qtp911078381-519) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[3
 (1556628969268183040)]} 0 0
   [junit4]   2> 131111 INFO  (qtp1241384-554) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[3
 (1556628969268183040)]} 0 0
   [junit4]   2> 131112 INFO  (qtp1889428961-484) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[3 (1556628969268183040)]} 0 2
   [junit4]   2> 131114 INFO  (qtp911078381-519) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[4
 (1556628969272377344)]} 0 0
   [junit4]   2> 131115 INFO  (qtp1241384-554) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[4
 (1556628969272377344)]} 0 0
   [junit4]   2> 131115 INFO  (qtp1889428961-490) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[4 (1556628969272377344)]} 0 2
   [junit4]   2> 131118 INFO  (qtp911078381-519) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[5
 (1556628969275523072)]} 0 0
   [junit4]   2> 131118 INFO  (qtp1241384-554) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[5
 (1556628969275523072)]} 0 0
   [junit4]   2> 131119 INFO  (qtp1889428961-491) [n:127.0.0.1:64258_lq_fx 
c:collection1 s:shard1 r:core_node1 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={wt=javabin&version=2}{add=[5 (1556628969275523072)]} 0 2
   [junit4]   2> 131122 INFO  (qtp911078381-519) [n:127.0.0.1:64262_lq_fx 
c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/lq_fx path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:64258/lq_fx/collection1/&wt=javabin&version=2}{add=[6
 (1556628969279717376)]} 0 0
   [junit4]   2> 131122 INFO  (qtp1241384-554) [n:127.0.0.1:64267_lq_fx 
c:collection1 s:shard1 r:core_node3 x:collection1] 
o.a.s.u.p.LogUpdateProcessorFactory [coll

[...truncated too long message...]

ner instance=1879300464
   [junit4]   2> 149504 INFO  (coreCloseExecutor-289-thread-1) 
[n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.c.SolrCore [collection1]  CLOSING SolrCore 
org.apache.solr.core.SolrCore@1dc5829e
   [junit4]   2> 149582 INFO  (coreCloseExecutor-289-thread-1) 
[n:127.0.0.1:64253_lq_fx c:control_collection s:shard1 r:core_node1 
x:collection1] o.a.s.m.SolrMetricManager Closing metric reporters for: 
solr.core.collection1
   [junit4]   2> 149583 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.Overseer Overseer 
(id=97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) closing
   [junit4]   2> 149583 INFO  
(OverseerStateUpdate-97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.Overseer Overseer Loop exiting : 
127.0.0.1:64253_lq_fx
   [junit4]   2> 149586 WARN  
(zkCallback-74-thread-1-processing-n:127.0.0.1:64253_lq_fx) 
[n:127.0.0.1:64253_lq_fx    ] o.a.s.c.c.ZkStateReader ZooKeeper watch 
triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for 
/live_nodes]
   [junit4]   2> 149587 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.m.SolrMetricManager Closing metric reporters for: solr.node
   [junit4]   2> 149587 INFO  
(zkCallback-103-thread-1-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.c.ZkStateReader Updated live nodes from 
ZooKeeper... (2) -> (1)
   [junit4]   2> 149588 INFO  
(zkCallback-103-thread-3-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.OverseerElectionContext I am going to be 
the leader 127.0.0.1:64262_lq_fx
   [junit4]   2> 149588 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Stopped 
ServerConnector@3a9ddca{HTTP/1.1,[http/1.1]}{127.0.0.1:0}
   [junit4]   2> 149589 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Stopped 
o.e.j.s.ServletContextHandler@253c16dd{/lq_fx,null,UNAVAILABLE}
   [junit4]   2> 149590 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ChaosMonkey monkey: stop shard! 64258
   [junit4]   2> 149590 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ChaosMonkey monkey: stop shard! 64262
   [junit4]   2> 149590 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.CoreContainer Shutting down CoreContainer instance=790397287
   [junit4]   2> 149591 INFO  
(zkCallback-103-thread-3-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.Overseer Overseer 
(id=97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) starting
   [junit4]   2> 149604 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.Overseer Overseer 
(id=97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) closing
   [junit4]   2> 149605 INFO  
(OverseerStateUpdate-97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.Overseer According to ZK I 
(id=97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) am no longer a leader.
   [junit4]   2> 149605 INFO  
(OverseerStateUpdate-97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.Overseer Overseer Loop exiting : 
127.0.0.1:64262_lq_fx
   [junit4]   2> 149606 INFO  
(OverseerCollectionConfigSetProcessor-97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004)
 [n:127.0.0.1:64262_lq_fx    ] o.a.s.c.OverseerTaskProcessor According to ZK I 
(id=97289309351247889-127.0.0.1:64262_lq_fx-n_0000000004) am no longer a leader.
   [junit4]   2> 149608 WARN  
(zkCallback-103-thread-3-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx    ] o.a.s.c.c.ZkStateReader ZooKeeper watch 
triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for 
/live_nodes]
   [junit4]   2> 151935 WARN  
(zkCallback-103-thread-2-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.c.SyncStrategy Closed, skipping sync up.
   [junit4]   2> 151936 INFO  
(zkCallback-103-thread-2-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.c.SolrCore [collection1]  CLOSING SolrCore 
org.apache.solr.core.SolrCore@763b4dda
   [junit4]   2> 151995 INFO  
(zkCallback-103-thread-2-processing-n:127.0.0.1:64262_lq_fx) 
[n:127.0.0.1:64262_lq_fx c:collection1 s:shard1 r:core_node2 x:collection1] 
o.a.s.m.SolrMetricManager Closing metric reporters for: solr.core.collection1
   [junit4]   2> 151995 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.m.SolrMetricManager Closing metric reporters for: solr.node
   [junit4]   2> 151997 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.AbstractConnector Stopped 
ServerConnector@439ca663{HTTP/1.1,[http/1.1]}{127.0.0.1:64262}
   [junit4]   2> 151997 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.e.j.s.h.ContextHandler Stopped 
o.e.j.s.ServletContextHandler@1de1f76d{/lq_fx,null,UNAVAILABLE}
   [junit4]   2> 151998 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ChaosMonkey monkey: stop shard! 64267
   [junit4]   2> 152000 INFO  
(TEST-PeerSyncReplicationTest.test-seed#[66390243AC03ACF0]) [    ] 
o.a.s.c.ZkTestServer connecting to 127.0.0.1:64250 64250
   [junit4]   2> 153123 INFO  (Thread-80) [    ] o.a.s.c.ZkTestServer 
connecting to 127.0.0.1:64250 64250
   [junit4]   2> 157291 WARN  (Thread-80) [    ] o.a.s.c.ZkTestServer Watch 
limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2>        6       /solr/aliases.json
   [junit4]   2>        6       /solr/clusterprops.json
   [junit4]   2>        5       /solr/security.json
   [junit4]   2>        5       /solr/configs/conf1
   [junit4]   2>        4       /solr/collections/collection1/state.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2>        6       /solr/clusterstate.json
   [junit4]   2>        2       
/solr/collections/collection1/leader_elect/shard1/election/97289309351247880-core_node1-n_0000000000
   [junit4]   2>        2       
/solr/overseer_elect/election/97289309351247876-127.0.0.1:64253_lq_fx-n_0000000000
   [junit4]   2>        2       
/solr/overseer_elect/election/97289309351247880-127.0.0.1:64258_lq_fx-n_0000000001
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2>        37      /solr/overseer/collection-queue-work
   [junit4]   2>        36      /solr/overseer/queue
   [junit4]   2>        6       /solr/collections
   [junit4]   2>        6       /solr/overseer/queue-work
   [junit4]   2>        5       /solr/live_nodes
   [junit4]   2> 
   [junit4]   2> NOTE: reproduce with: ant test  
-Dtestcase=PeerSyncReplicationTest -Dtests.method=test 
-Dtests.seed=66390243AC03ACF0 -Dtests.slow=true -Dtests.locale=id 
-Dtests.timezone=America/Juneau -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] FAILURE 44.9s J1 | PeerSyncReplicationTest.test <<<
   [junit4]    > Throwable #1: java.lang.AssertionError: PeerSynced node did 
not become leader expected:<CloudJettyRunner 
[url=http://127.0.0.1:64262/lq_fx/collection1]> but was:<CloudJettyRunner 
[url=http://127.0.0.1:64258/lq_fx/collection1]>
   [junit4]    >        at 
__randomizedtesting.SeedInfo.seed([66390243AC03ACF0:EE6D3D9902FFC108]:0)
   [junit4]    >        at 
org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:162)
   [junit4]    >        at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
   [junit4]    >        at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
   [junit4]    >        at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> 157301 INFO  
(SUITE-PeerSyncReplicationTest-seed#[66390243AC03ACF0]-worker) [    ] 
o.a.s.SolrTestCaseJ4 ###deleteCore
   [junit4]   2> NOTE: leaving temporary files on disk at: 
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_66390243AC03ACF0-001
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
{other_tl1=PostingsFormat(name=Asserting), 
range_facet_l_dv=PostingsFormat(name=Direct), 
rnd_s=PostingsFormat(name=Asserting), 
multiDefault=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))),
 intDefault=PostingsFormat(name=Asserting), 
a_i1=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))),
 
range_facet_l=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))),
 _version_=PostingsFormat(name=Asserting), 
a_t=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))),
 id=PostingsFormat(name=Direct), 
range_facet_i_dv=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))),
 text=Lucene50(blocksize=128), 
timestamp=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128)))},
 docValues:{range_facet_l_dv=DocValuesFormat(name=Lucene70), 
range_facet_i_dv=DocValuesFormat(name=Memory), 
timestamp=DocValuesFormat(name=Memory)}, maxPointsInLeafNode=1706, 
maxMBSortInHeap=7.164267977330658, sim=RandomSimilarity(queryNorm=false): {}, 
locale=id, timezone=America/Juneau
   [junit4]   2> NOTE: Mac OS X 10.11.6 x86_64/Oracle Corporation 1.8.0_102 
(64-bit)/cpus=3,threads=1,free=126304736,total=287309824
   [junit4]   2> NOTE: All tests run in this JVM: 
[TestSuggestSpellingConverter, TestFieldSortValues, TestReload, 
SpellingQueryConverterTest, BasicFunctionalityTest, FullHLLTest, 
BigEndianAscendingWordDeserializerTest, AutoCommitTest, 
BasicDistributedZk2Test, PeerSyncReplicationTest]
   [junit4] Completed [18/678 (1!)] on J1 in 44.96s, 1 test, 1 failure <<< 
FAILURES!

[...truncated 64609 lines...]
