Hi Mike,

while working on LUCENE-6989, I ran all core tests with FSDirectory (in my case 
MMapDirectory) on Windows, and I got more failures. So there seems to be something 
fishy going on. I can post stack traces; the failures seem to be fairly reproducible.
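
As background (this is just standard java.nio behavior, not the Lucene code itself): Files.delete throws NoSuchFileException when the target is already gone, which is exactly the exception in the Jenkins log below, while Files.deleteIfExists tolerates a missing file. A minimal sketch (class and variable names are mine):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class DeleteBehavior {
    public static void main(String[] args) throws IOException {
        // a path that does not exist, standing in for the missing segments_2 file
        Path missing = Files.createTempDirectory("demo").resolve("segments_2");

        // Files.delete throws NoSuchFileException if the path is already gone
        try {
            Files.delete(missing);
        } catch (NoSuchFileException e) {
            System.out.println("delete threw NoSuchFileException");
        }

        // Files.deleteIfExists returns false instead of throwing
        System.out.println("deleteIfExists returned " + Files.deleteIfExists(missing));
    }
}
```

So any code path that tries to delete the same file twice will hit this on any OS; the Windows runs just seem to expose it more reliably.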

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

> -----Original Message-----
> From: Michael McCandless [mailto:luc...@mikemccandless.com]
> Sent: Monday, February 08, 2016 2:50 PM
> To: Lucene/Solr dev <dev@lucene.apache.org>
> Subject: Re: [JENKINS] Lucene-Solr-trunk-Windows (32bit/jdk1.8.0_72) -
> Build # 5606 - Still Failing!
> 
> I'll look...
> 
> Mike McCandless
> 
> http://blog.mikemccandless.com
> 
> 
> On Mon, Feb 8, 2016 at 6:13 AM, Policeman Jenkins Server
> <jenk...@thetaphi.de> wrote:
> > Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Windows/5606/
> > Java: 32bit/jdk1.8.0_72 -client -XX:+UseConcMarkSweepGC
> >
> > 1 tests failed.
> > FAILED:  org.apache.lucene.store.TestSimpleFSLockFactory.testStressLocks
> >
> > Error Message:
> > IndexWriter hit unexpected exceptions
> >
> > Stack Trace:
> > java.lang.AssertionError: IndexWriter hit unexpected exceptions
> >         at __randomizedtesting.SeedInfo.seed([E11DDD1BFE48D599:BF2C93E6E2E41DFF]:0)
> >         at org.junit.Assert.fail(Assert.java:93)
> >         at org.junit.Assert.assertTrue(Assert.java:43)
> >         at org.apache.lucene.store.BaseLockFactoryTestCase.testStressLocks(BaseLockFactoryTestCase.java:180)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1764)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:871)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:907)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:921)
> >         at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
> >         at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
> >         at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
> >         at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
> >         at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
> >         at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:809)
> >         at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:460)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:880)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:781)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:816)
> >         at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:827)
> >         at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
> >         at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> >         at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
> >         at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
> >         at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
> >         at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
> >         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >         at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
> >         at java.lang.Thread.run(Thread.java:745)
> >
> >
> >
> >
> > Build Log:
> > [...truncated 356 lines...]
> >    [junit4] Suite: org.apache.lucene.store.TestSimpleFSLockFactory
> >    [junit4]   1> Stress Test Index Writer: close hit unexpected exception: java.nio.file.NoSuchFileException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\core\test\J1\temp\lucene.store.TestSimpleFSLockFactory_E11DDD1BFE48D599-001\tempDir-002\segments_2
> >    [junit4]   1> java.nio.file.NoSuchFileException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\core\test\J1\temp\lucene.store.TestSimpleFSLockFactory_E11DDD1BFE48D599-001\tempDir-002\segments_2
> >    [junit4]   1>        at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:79)
> >    [junit4]   1>        at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
> >    [junit4]   1>        at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
> >    [junit4]   1>        at sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:269)
> >    [junit4]   1>        at sun.nio.fs.AbstractFileSystemProvider.delete(AbstractFileSystemProvider.java:103)
> >    [junit4]   1>        at org.apache.lucene.mockfile.FilterFileSystemProvider.delete(FilterFileSystemProvider.java:137)
> >    [junit4]   1>        at org.apache.lucene.mockfile.FilterFileSystemProvider.delete(FilterFileSystemProvider.java:137)
> >    [junit4]   1>        at org.apache.lucene.mockfile.FilterFileSystemProvider.delete(FilterFileSystemProvider.java:137)
> >    [junit4]   1>        at org.apache.lucene.mockfile.FilterFileSystemProvider.delete(FilterFileSystemProvider.java:137)
> >    [junit4]   1>        at org.apache.lucene.mockfile.FilterFileSystemProvider.delete(FilterFileSystemProvider.java:137)
> >    [junit4]   1>        at java.nio.file.Files.delete(Files.java:1126)
> >    [junit4]   1>        at org.apache.lucene.store.FSDirectory.privateDeleteFile(FSDirectory.java:368)
> >    [junit4]   1>        at org.apache.lucene.store.FSDirectory.deleteFile(FSDirectory.java:330)
> >    [junit4]   1>        at org.apache.lucene.store.MockDirectoryWrapper.deleteFile(MockDirectoryWrapper.java:463)
> >    [junit4]   1>        at org.apache.lucene.store.LockValidatingDirectoryWrapper.deleteFile(LockValidatingDirectoryWrapper.java:38)
> >    [junit4]   1>        at org.apache.lucene.index.IndexFileDeleter.deleteFiles(IndexFileDeleter.java:708)
> >    [junit4]   1>        at org.apache.lucene.index.IndexFileDeleter.refresh(IndexFileDeleter.java:450)
> >    [junit4]   1>        at org.apache.lucene.index.IndexWriter.rollbackInternalNoCommit(IndexWriter.java:2090)
> >    [junit4]   1>        at org.apache.lucene.index.IndexWriter.rollbackInternal(IndexWriter.java:2032)
> >    [junit4]   1>        at org.apache.lucene.index.IndexWriter.shutdown(IndexWriter.java:1074)
> >    [junit4]   1>        at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1116)
> >    [junit4]   1>        at org.apache.lucene.store.BaseLockFactoryTestCase$WriterThread.run(BaseLockFactoryTestCase.java:268)
> >    [junit4]   1>
> >    [junit4]   1> TEST: WriterThread iter=0
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.776Z; Thread-132]: init: current segments file is "segments_1"; deletionPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy@be6c2f
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.794Z; Thread-132]: init: load commit "segments_1"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.796Z; Thread-132]: delete []
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.796Z; Thread-132]: now checkpoint "_0(6.0.0):c1" [1 segments ; isCommit = false]
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.796Z; Thread-132]: delete []
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.796Z; Thread-132]: 0 msec to checkpoint
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.798Z; Thread-132]: init: create=false
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.798Z; Thread-132]:
> >    [junit4]   1> dir=MockDirectoryWrapper(NIOFSDirectory@C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\core\test\J1\temp\lucene.store.TestSimpleFSLockFactory_E11DDD1BFE48D599-001\tempDir-002 lockFactory=org.apache.lucene.store.SimpleFSLockFactory@114494a)
> >    [junit4]   1> index=_0(6.0.0):c1
> >    [junit4]   1> version=6.0.0
> >    [junit4]   1> analyzer=org.apache.lucene.analysis.MockAnalyzer
> >    [junit4]   1> ramBufferSizeMB=16.0
> >    [junit4]   1> maxBufferedDocs=-1
> >    [junit4]   1> maxBufferedDeleteTerms=-1
> >    [junit4]   1> mergedSegmentWarmer=null
> >    [junit4]   1> delPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy
> >    [junit4]   1> commit=null
> >    [junit4]   1> openMode=APPEND
> >    [junit4]   1> similarity=org.apache.lucene.search.similarities.BM25Similarity
> >    [junit4]   1> mergeScheduler=ConcurrentMergeScheduler: maxThreadCount=-1, maxMergeCount=-1, ioThrottle=true
> >    [junit4]   1> codec=CheapBastard
> >    [junit4]   1> infoStream=org.apache.lucene.util.PrintStreamInfoStream
> >    [junit4]   1> mergePolicy=[TieredMergePolicy: maxMergeAtOnce=10, maxMergeAtOnceExplicit=30, maxMergedSegmentMB=5120.0, floorSegmentMB=2.0, forceMergeDeletesPctAllowed=10.0, segmentsPerTier=10.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1
> >    [junit4]   1> indexerThreadPool=org.apache.lucene.index.DocumentsWriterPerThreadPool@16d3afc
> >    [junit4]   1> readerPooling=false
> >    [junit4]   1> perThreadHardLimitMB=1945
> >    [junit4]   1> useCompoundFile=true
> >    [junit4]   1> commitOnClose=true
> >    [junit4]   1> writer=org.apache.lucene.index.IndexWriter@136044b
> >    [junit4]   1>
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.798Z; Thread-132]: MMapDirectory.UNMAP_SUPPORTED=true
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.799Z; Thread-132]: now flush at close
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.799Z; Thread-132]:   start flush: applyAllDeletes=true
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.799Z; Thread-132]:   index before flush _0(6.0.0):c1
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.799Z; Thread-132]: startFullFlush
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.799Z; Thread-132]: anyChanges? numDocsInRam=1 deletes=false hasTickets:false pendingChangesInFullFlush: false
> >    [junit4]   1> DWFC 0 [2016-02-08T11:08:36.799Z; Thread-132]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_1, aborted=false, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.799Z; Thread-132]: flush postings as segment _1 numDocs=1
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.800Z; Thread-132]: 1 msec to write norms
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.800Z; Thread-132]: 0 msec to write docValues
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.801Z; Thread-132]: 0 msec to write points
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.802Z; Thread-132]: 1 msec to finish stored fields
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.808Z; Thread-132]: 8 msec to write postings and finish vectors
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.809Z; Thread-132]: 1 msec to write fieldInfos
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.809Z; Thread-132]: new segment has 0 deleted docs
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.809Z; Thread-132]: new segment has no vectors; norms; no docValues; prox; freqs
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.809Z; Thread-132]: flushedFiles=[_1.nvd, _1.doc, _1.tim, _1.tip, _1.fdx, _1.nvm, _1.fnm, _1.pos, _1.fdt]
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.809Z; Thread-132]: flushed codec=CheapBastard
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.813Z; Thread-132]: flushed: segment=_1 ramUsed=0.071 MB newFlushedSize=0.001 MB docs/MB=1,312.36
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.813Z; Thread-132]: create compound file
> >    [junit4]   1> DWPT 0 [2016-02-08T11:08:36.821Z; Thread-132]: flush time 27.6357 msec
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.821Z; Thread-132]: publishFlushedSegment seg-private updates=null
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.821Z; Thread-132]: publishFlushedSegment
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.821Z; Thread-132]: publish sets newSegment delGen=1 seg=_1(6.0.0):c1
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.821Z; Thread-132]: now checkpoint "_0(6.0.0):c1 _1(6.0.0):c1" [2 segments ; isCommit = false]
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.821Z; Thread-132]: delete []
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.821Z; Thread-132]: 0 msec to checkpoint
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.821Z; Thread-132]: Thread-132 finishFullFlush success=true
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.nvd"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.doc"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.tim"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.tip"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.fdx"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.nvm"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.fnm"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.pos"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: will delete new file "_1.fdt"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.822Z; Thread-132]: delete [_1.nvd, _1.doc, _1.tim, _1.tip, _1.fdx, _1.nvm, _1.fnm, _1.pos, _1.fdt]
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.824Z; Thread-132]: apply all deletes during flush
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.824Z; Thread-132]: now apply all deletes for all segments maxDoc=2
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.824Z; Thread-132]: applyDeletes: open segment readers took 0 msec
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.824Z; Thread-132]: applyDeletes: no segments; skipping
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.824Z; Thread-132]: prune sis=segments_1: _0(6.0.0):c1 _1(6.0.0):c1 minGen=0 packetCount=0
> >    [junit4]   1> TMP 0 [2016-02-08T11:08:36.824Z; Thread-132]: findMerges: 2 segments
> >    [junit4]   1> TMP 0 [2016-02-08T11:08:36.825Z; Thread-132]: seg=_0(6.0.0):c1 size=0.001 MB [floored]
> >    [junit4]   1> TMP 0 [2016-02-08T11:08:36.825Z; Thread-132]: seg=_1(6.0.0):c1 size=0.001 MB [floored]
> >    [junit4]   1> TMP 0 [2016-02-08T11:08:36.825Z; Thread-132]: allowedSegmentCount=1 vs count=2 (eligible count=2) tooBigCount=0
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]: initDynamicDefaults spins=true maxThreadCount=1 maxMergeCount=6
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]: now merge
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]:   index: _0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]:   no more merges pending; now return
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]: updateMergeThreads ioThrottle=true targetMBPerSec=10240.0 MB/sec
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]: now merge
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]:   index: _0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> MS 0 [2016-02-08T11:08:36.825Z; Thread-132]:   no more merges pending; now return
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: waitForMerges
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: waitForMerges done
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: commit: start
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: commit: enter lock
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: commit: now prepare
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: prepareCommit: flush
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]:   index before flush _0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.825Z; Thread-132]: startFullFlush
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: apply all deletes during flush
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.825Z; Thread-132]: now apply all deletes for all segments maxDoc=2
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.825Z; Thread-132]: applyDeletes: open segment readers took 0 msec
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.825Z; Thread-132]: applyDeletes: no segments; skipping
> >    [junit4]   1> BD 0 [2016-02-08T11:08:36.826Z; Thread-132]: prune sis=segments_1: _0(6.0.0):c1 _1(6.0.0):c1 minGen=0 packetCount=0
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.826Z; Thread-132]: Thread-132 finishFullFlush success=true
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.826Z; Thread-132]: startCommit(): start
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.826Z; Thread-132]: startCommit index=_0(6.0.0):c1 _1(6.0.0):c1 changeCount=3
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.828Z; Thread-132]: startCommit: wrote pending segments file "pending_segments_2"
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.828Z; Thread-132]: done all syncs: [_1.cfs, _0.cfe, _0.si, _1.cfe, _1.si, _0.cfs]
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.828Z; Thread-132]: commit: pendingCommit != null
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.829Z; Thread-132]: commit: done writing segments file "segments_2"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.829Z; Thread-132]: now checkpoint "_0(6.0.0):c1 _1(6.0.0):c1" [2 segments ; isCommit = true]
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.829Z; Thread-132]: deleteCommits: now decRef commit "segments_1"
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.829Z; Thread-132]: delete [segments_1]
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: 1 msec to checkpoint
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: delete []
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: commit: took 6.0 msec
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: commit: done
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: rollback
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: all running merges have aborted
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: rollback: done finish merges
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.830Z; Thread-132]: abort
> >    [junit4]   1> DW 0 [2016-02-08T11:08:36.830Z; Thread-132]: done abort success=true
> >    [junit4]   1> IW 0 [2016-02-08T11:08:36.830Z; Thread-132]: rollback: infos=_0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: now checkpoint "_0(6.0.0):c1 _1(6.0.0):c1" [2 segments ; isCommit = false]
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: delete []
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: 0 msec to checkpoint
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: delete []
> >    [junit4]   1> IFD 0 [2016-02-08T11:08:36.830Z; Thread-132]: delete []
> >    [junit4]   1>
> >    [junit4]   1> TEST: WriterThread iter=1
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.835Z; Thread-132]: init: current
> segments file is "segments_2";
> deletionPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy
> @176f0b9
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.835Z; Thread-132]: init: load
> commit "segments_2"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.836Z; Thread-132]: delete []
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.836Z; Thread-132]: now 
> > checkpoint
> "_0(6.0.0):c1 _1(6.0.0):c1" [2 segments ; isCommit = false]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.836Z; Thread-132]: delete []
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.836Z; Thread-132]: 0 msec to
> checkpoint
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.837Z; Thread-132]: init:
> create=false
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.837Z; Thread-132]:
> >    [junit4]   1>
> dir=MockDirectoryWrapper(NIOFSDirectory@C:\Users\JenkinsSlave\worksp
> ace\Lucene-Solr-trunk-
> Windows\lucene\build\core\test\J1\temp\lucene.store.TestSimpleFSLockFa
> ctory_E11DDD1BFE48D599-001\tempDir-002
> lockFactory=org.apache.lucene.store.SimpleFSLockFactory@114494a)
> >    [junit4]   1> index=_0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> version=6.0.0
> >    [junit4]   1> analyzer=org.apache.lucene.analysis.MockAnalyzer
> >    [junit4]   1> ramBufferSizeMB=16.0
> >    [junit4]   1> maxBufferedDocs=-1
> >    [junit4]   1> maxBufferedDeleteTerms=-1
> >    [junit4]   1> mergedSegmentWarmer=null
> >    [junit4]   1>
> delPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy
> >    [junit4]   1> commit=null
> >    [junit4]   1> openMode=APPEND
> >    [junit4]   1>
> similarity=org.apache.lucene.search.similarities.BM25Similarity
> >    [junit4]   1> mergeScheduler=ConcurrentMergeScheduler:
> maxThreadCount=-1, maxMergeCount=-1, ioThrottle=true
> >    [junit4]   1> codec=CheapBastard
> >    [junit4]   1> infoStream=org.apache.lucene.util.PrintStreamInfoStream
> >    [junit4]   1> mergePolicy=[TieredMergePolicy: maxMergeAtOnce=10,
> maxMergeAtOnceExplicit=30, maxMergedSegmentMB=5120.0,
> floorSegmentMB=2.0, forceMergeDeletesPctAllowed=10.0,
> segmentsPerTier=10.0, maxCFSSegmentSizeMB=8.796093022207999E12,
> noCFSRatio=0.1
> >    [junit4]   1>
> indexerThreadPool=org.apache.lucene.index.DocumentsWriterPerThreadPo
> ol@3067b3
> >    [junit4]   1> readerPooling=false
> >    [junit4]   1> perThreadHardLimitMB=1945
> >    [junit4]   1> useCompoundFile=true
> >    [junit4]   1> commitOnClose=true
> >    [junit4]   1> writer=org.apache.lucene.index.IndexWriter@136fc06
> >    [junit4]   1>
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.837Z; Thread-132]:
> MMapDirectory.UNMAP_SUPPORTED=true
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.840Z; Thread-132]: now flush at
> close
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.840Z; Thread-132]:   start flush:
> applyAllDeletes=true
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.840Z; Thread-132]:   index before
> flush _0(6.0.0):c1 _1(6.0.0):c1
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.840Z; Thread-132]: startFullFlush
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.840Z; Thread-132]: anyChanges?
> numDocsInRam=1 deletes=false hasTickets:false pendingChangesInFullFlush:
> false
> >    [junit4]   1> DWFC 1 [2016-02-08T11:08:36.840Z; Thread-132]:
> addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0,
> segment=_2, aborted=false, numDocsInRAM=1, deleteQueue=DWDQ: [
> generation: 0 ]]
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.840Z; Thread-132]: flush
> postings as segment _2 numDocs=1
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.843Z; Thread-132]: 2 msec to 
> > write
> norms
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.843Z; Thread-132]: 0 msec to 
> > write
> docValues
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.843Z; Thread-132]: 0 msec to 
> > write
> points
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.843Z; Thread-132]: 1 msec to 
> > finish
> stored fields
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.848Z; Thread-132]: 5 msec to 
> > write
> postings and finish vectors
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.849Z; Thread-132]: 1 msec to 
> > write
> fieldInfos
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.849Z; Thread-132]: new
> segment has 0 deleted docs
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.849Z; Thread-132]: new
> segment has no vectors; norms; no docValues; prox; freqs
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.849Z; Thread-132]:
> flushedFiles=[_2.nvd, _2.tip, _2.fdt, _2.nvm, _2.fnm, _2.pos, _2.fdx, _2.doc,
> _2.tim]
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.849Z; Thread-132]: flushed
> codec=CheapBastard
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.850Z; Thread-132]: flushed:
> segment=_2 ramUsed=0.071 MB newFlushedSize=0.001 MB
> docs/MB=1,312.36
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.850Z; Thread-132]: create
> compound file
> >    [junit4]   1> DWPT 1 [2016-02-08T11:08:36.854Z; Thread-132]: flush time
> 17.1492 msec
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.854Z; Thread-132]:
> publishFlushedSegment seg-private updates=null
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.854Z; Thread-132]:
> publishFlushedSegment
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.854Z; Thread-132]: publish sets
> newSegment delGen=1 seg=_2(6.0.0):c1
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: now 
> > checkpoint
> "_0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1" [3 segments ; isCommit = false]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: delete []
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: 0 msec to
> checkpoint
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.854Z; Thread-132]: Thread-132
> finishFullFlush success=true
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.nvd"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.tip"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.fdt"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.nvm"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.fnm"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.pos"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.fdx"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.doc"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: will delete 
> > new
> file "_2.tim"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.854Z; Thread-132]: delete 
> > [_2.nvd,
> _2.tip, _2.fdt, _2.nvm, _2.fnm, _2.pos, _2.fdx, _2.doc, _2.tim]
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.859Z; Thread-132]: apply all 
> > deletes
> during flush
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.859Z; Thread-132]: now apply all
> deletes for all segments maxDoc=3
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.859Z; Thread-132]: applyDeletes:
> open segment readers took 0 msec
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.859Z; Thread-132]: applyDeletes:
> no segments; skipping
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.859Z; Thread-132]: prune
> sis=segments_2: _0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1 minGen=0
> packetCount=0
> >    [junit4]   1> TMP 1 [2016-02-08T11:08:36.859Z; Thread-132]: findMerges: 3
> segments
> >    [junit4]   1> TMP 1 [2016-02-08T11:08:36.860Z; Thread-132]:
> seg=_0(6.0.0):c1 size=0.001 MB [floored]
> >    [junit4]   1> TMP 1 [2016-02-08T11:08:36.860Z; Thread-132]:
> seg=_1(6.0.0):c1 size=0.001 MB [floored]
> >    [junit4]   1> TMP 1 [2016-02-08T11:08:36.860Z; Thread-132]:
> seg=_2(6.0.0):c1 size=0.001 MB [floored]
> >    [junit4]   1> TMP 1 [2016-02-08T11:08:36.860Z; Thread-132]:
> allowedSegmentCount=1 vs count=3 (eligible count=3) tooBigCount=0
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:
> initDynamicDefaults spins=true maxThreadCount=1 maxMergeCount=6
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]: now merge
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:   index:
> _0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:   no more
> merges pending; now return
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:
> updateMergeThreads ioThrottle=true targetMBPerSec=10240.0 MB/sec
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]: now merge
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:   index:
> _0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1
> >    [junit4]   1> MS 1 [2016-02-08T11:08:36.861Z; Thread-132]:   no more merges pending; now return
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: waitForMerges
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: waitForMerges done
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: commit: start
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: commit: enter lock
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: commit: now prepare
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: prepareCommit: flush
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]:   index before flush _0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.861Z; Thread-132]: startFullFlush
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: apply all deletes during flush
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: now apply all deletes for all segments maxDoc=3
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.861Z; Thread-132]: applyDeletes: open segment readers took 0 msec
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.861Z; Thread-132]: applyDeletes: no segments; skipping
> >    [junit4]   1> BD 1 [2016-02-08T11:08:36.861Z; Thread-132]: prune sis=segments_2: _0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1 minGen=0 packetCount=0
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.861Z; Thread-132]: Thread-132 finishFullFlush success=true
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: startCommit(): start
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.861Z; Thread-132]: startCommit index=_0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1 changeCount=3
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.863Z; Thread-132]: startCommit: wrote pending segments file "pending_segments_3"
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.864Z; Thread-132]: done all syncs: [_1.cfs, _0.cfe, _0.si, _1.cfe, _1.si, _2.si, _0.cfs, _2.cfe, _2.cfs]
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.864Z; Thread-132]: commit: pendingCommit != null
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: commit: done writing segments file "segments_3"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.865Z; Thread-132]: now checkpoint "_0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1" [3 segments ; isCommit = true]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.865Z; Thread-132]: deleteCommits: now decRef commit "segments_2"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.865Z; Thread-132]: delete [segments_2]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.865Z; Thread-132]: 0 msec to checkpoint
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.865Z; Thread-132]: delete []
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: commit: took 6.1 msec
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: commit: done
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: rollback
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: all running merges have aborted
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.865Z; Thread-132]: rollback: done finish merges
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.865Z; Thread-132]: abort
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.866Z; Thread-132]: done abort success=true
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.866Z; Thread-132]: rollback: infos=_0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: now checkpoint "_0(6.0.0):c1 _1(6.0.0):c1 _2(6.0.0):c1" [3 segments ; isCommit = false]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: delete []
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: 0 msec to checkpoint
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: refresh: removing newly created unreferenced file "segments_2"
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: delete [segments_2]
> >    [junit4]   1> IFD 1 [2016-02-08T11:08:36.866Z; Thread-132]: delete []
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.866Z; Thread-132]: rollback
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.866Z; Thread-132]: all running merges have aborted
> >    [junit4]   1> IW 1 [2016-02-08T11:08:36.866Z; Thread-132]: rollback: done finish merges
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.867Z; Thread-132]: abort
> >    [junit4]   1> DW 1 [2016-02-08T11:08:36.867Z; Thread-132]: done abort success=true
> >    [junit4]   1>
> >    [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestSimpleFSLockFactory -Dtests.method=testStressLocks -Dtests.seed=E11DDD1BFE48D599 -Dtests.slow=true -Dtests.locale=th -Dtests.timezone=EST -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> >    [junit4] FAILURE 0.83s J1 | TestSimpleFSLockFactory.testStressLocks <<<
> >    [junit4]    > Throwable #1: java.lang.AssertionError: IndexWriter hit unexpected exceptions
> >    [junit4]    >        at __randomizedtesting.SeedInfo.seed([E11DDD1BFE48D599:BF2C93E6E2E41DFF]:0)
> >    [junit4]    >        at org.apache.lucene.store.BaseLockFactoryTestCase.testStressLocks(BaseLockFactoryTestCase.java:180)
> >    [junit4]    >        at java.lang.Thread.run(Thread.java:745)
> >    [junit4]   2> NOTE: leaving temporary files on disk at: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\core\test\J1\temp\lucene.store.TestSimpleFSLockFactory_E11DDD1BFE48D599-001
> >    [junit4]   2> NOTE: test params are: codec=CheapBastard, sim=ClassicSimilarity, locale=th, timezone=EST
> >    [junit4]   2> NOTE: Windows 7 6.1 x86/Oracle Corporation 1.8.0_72 (32-bit)/cpus=3,threads=1,free=3647624,total=20934656
> >    [junit4]   2> NOTE: All tests run in this JVM: [TestRollback, TestLevenshteinAutomata, TestBlockPostingsFormat, TestBM25Similarity, TestComplexExplanationsOfNonMatches, TestFixedBitSet, TestBlockPostingsFormat3, TestDocumentWriter, TestMathUtil, TestBinaryTerms, TestIndexWriterOnJRECrash, TestNotDocIdSet, TestLSBRadixSorter, TestMatchNoDocsQuery, TestAttributeSource, TestCachingTokenFilter, TestByteArrayDataInput, TestLogMergePolicy, TestIndexWriterDeleteByQuery, TestIndexFileDeleter, TestSearchForDuplicates, TestFastDecompressionMode, TestSimpleFSLockFactory]
> >    [junit4] Completed [39/412 (1!)] on J1 in 6.74s, 7 tests, 1 failure <<< FAILURES!
> >
> > [...truncated 1248 lines...]
> > BUILD FAILED
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\build.xml:740: The following error occurred while executing this line:
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\build.xml:684: The following error occurred while executing this line:
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\build.xml:59: The following error occurred while executing this line:
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build.xml:50: The following error occurred while executing this line:
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\common-build.xml:1457: The following error occurred while executing this line:
> > C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\common-build.xml:1014: There were test failures: 412 suites (1 ignored), 3385 tests, 1 failure, 66 ignored (62 assumptions) [seed: E11DDD1BFE48D599]
> >
> > Total time: 5 minutes 38 seconds
> > Build step 'Invoke Ant' marked build as failure
> > Archiving artifacts
> > [WARNINGS] Skipping publisher since build result is FAILURE
> > Recording test results
> > Email was triggered for: Failure - Any
> > Sending email for trigger: Failure - Any
> >



---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
