Another data point - the 5-node cluster does have another collection on
it that is large (maybe 500 GB in HDFS) and that did have field guessing
enabled, but it is a static collection (I'm not adding data to it).
I've just removed that collection and am running the test again -
it's gotten a lot further along so far.
-Joe
On 1/19/2017 12:59 PM, Joe Obernberger wrote:
Thank you, Erick! For this scenario I was defining the schema
manually (editing managed_schema and pushing it to ZooKeeper), but didn't
realize that I had left the field-guessing block in the solrconfig.xml
file enabled. I've now disabled field guessing, but I'm still getting
errors when indexing many small records. This error happened on one
of the 5 servers in the cluster:
Exception: java.lang.ArrayIndexOutOfBoundsException
        at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:668)
        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:648)
Caused by: java.lang.ArrayIndexOutOfBoundsException
        at org.apache.lucene.store.ByteArrayDataInput.readBytes(ByteArrayDataInput.java:165)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnumFrame.nextLeaf(SegmentTermsEnumFrame.java:284)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnumFrame.next(SegmentTermsEnumFrame.java:269)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnum.next(SegmentTermsEnum.java:955)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnum.seekCeil(SegmentTermsEnum.java:762)
        at org.apache.lucene.index.BufferedUpdatesStream.applyTermDeletes(BufferedUpdatesStream.java:538)
        at org.apache.lucene.index.BufferedUpdatesStream.applyDeletesAndUpdates(BufferedUpdatesStream.java:287)
        at org.apache.lucene.index.IndexWriter._mergeInit(IndexWriter.java:4068)
        at org.apache.lucene.index.IndexWriter.mergeInit(IndexWriter.java:4026)
        at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3880)
        at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:588)
        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:626)
Then 2 seconds later:
2017-01-19 17:45:02.798 ERROR (commitScheduler-32-thread-1) [c:Wordline2 s:shard1 r:core_node3 x:Wordline2_shard1_replica1] o.a.s.u.CommitTracker auto commit error...: org.apache.lucene.index.CorruptIndexException: codec header mismatch: actual header=164048902 vs expected header=1071082519 (resource=_i_Lucene50_0.tip)
        at org.apache.lucene.codecs.CodecUtil.checkHeader(CodecUtil.java:196)
        at org.apache.lucene.util.fst.FST.<init>(FST.java:327)
        at org.apache.lucene.util.fst.FST.<init>(FST.java:313)
        at org.apache.lucene.codecs.blocktree.FieldReader.<init>(FieldReader.java:91)
        at org.apache.lucene.codecs.blocktree.BlockTreeTermsReader.<init>(BlockTreeTermsReader.java:234)
        at org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat.fieldsProducer(Lucene50PostingsFormat.java:445)
        at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsReader.<init>(PerFieldPostingsFormat.java:292)
        at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat.fieldsProducer(PerFieldPostingsFormat.java:372)
        at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:106)
        at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:74)
        at org.apache.lucene.index.ReadersAndUpdates.getReader(ReadersAndUpdates.java:145)
        at org.apache.lucene.index.BufferedUpdatesStream$SegmentState.<init>(BufferedUpdatesStream.java:384)
        at org.apache.lucene.index.BufferedUpdatesStream.openSegmentStates(BufferedUpdatesStream.java:416)
        at org.apache.lucene.index.BufferedUpdatesStream.applyDeletesAndUpdates(BufferedUpdatesStream.java:261)
        at org.apache.lucene.index.IndexWriter.applyAllDeletesAndUpdates(IndexWriter.java:3413)
        at org.apache.lucene.index.IndexWriter.maybeApplyDeletes(IndexWriter.java:3399)
        at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2987)
        at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3206)
        at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3171)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:607)
        at org.apache.solr.update.CommitTracker.run(CommitTracker.java:217)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Then a few milliseconds later:
2017-01-19 17:45:33.176 ERROR (qtp606548741-122) [c:Wordline2 s:shard1 r:core_node3 x:Wordline2_shard1_replica1] o.a.s.h.RequestHandlerBase org.apache.solr.common.SolrException: Exception writing document id /mnt/ice/Whitespace/WorldLineData_v2/WorldLine_Data_v2/2016-07-14/SensorData/Worldline_2016-07-14_6.wl_Telco_1154353070 to the index; possible analysis error.
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:178)
        at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:957)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1112)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:738)
        at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
        at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:97)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:179)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readIterator(JavaBinUpdateRequestCodec.java:135)
        at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:275)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readNamedList(JavaBinUpdateRequestCodec.java:121)
        at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:240)
        at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:158)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:186)
        at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:107)
        at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:54)
        at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
        at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:153)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:2213)
        at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)
        at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:460)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:303)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:254)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
        at org.eclipse.jetty.server.Server.handle(Server.java:518)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
        at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:740)
        at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:754)
        at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1558)
        at org.apache.solr.update.DirectUpdateHandler2.doNormalUpdate(DirectUpdateHandler2.java:279)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:211)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:166)
        ... 48 more
Caused by: java.lang.ArrayIndexOutOfBoundsException
        at org.apache.lucene.store.ByteArrayDataInput.readBytes(ByteArrayDataInput.java:165)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnumFrame.nextLeaf(SegmentTermsEnumFrame.java:284)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnumFrame.next(SegmentTermsEnumFrame.java:269)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnum.next(SegmentTermsEnum.java:955)
        at org.apache.lucene.codecs.blocktree.SegmentTermsEnum.seekCeil(SegmentTermsEnum.java:762)
        at org.apache.lucene.index.BufferedUpdatesStream.applyTermDeletes(BufferedUpdatesStream.java:538)
        at org.apache.lucene.index.BufferedUpdatesStream.applyDeletesAndUpdates(BufferedUpdatesStream.java:287)
        at org.apache.lucene.index.IndexWriter._mergeInit(IndexWriter.java:4068)
        at org.apache.lucene.index.IndexWriter.mergeInit(IndexWriter.java:4026)
        at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3880)
        at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:588)
        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:626)
I then stopped the test. Interestingly, the other 4 servers have no
errors; 3.1 million records were successfully indexed. I'm happy
(delighted, even!) to try things here. Thank you!
-Joe
On 1/19/2017 10:28 AM, Erick Erickson wrote:
It looks to me like you're using "field guessing". For production
systems I generally don't recommend this. The problem is that it makes
the best estimate it can based on the first document that contains any
given field. So it sees a field with the value 1 and tries to make the
field an int. Then, 100 docs later, a doc comes through with that field
as 1.0 and you get an indexing exception.
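This failure mode can be illustrated with a toy sketch (this is NOT Solr's actual guessing code; the names and logic below are invented purely for illustration):

```python
# Toy illustration of the field-guessing failure mode (NOT Solr's
# actual code): the first document to use a field fixes its type,
# and a later document with an incompatible value is rejected.

GUESSERS = [int, float]  # try int first, then float, else fall back to str

def guess_type(value):
    """Guess a type from a single string value, the way a naive guesser might."""
    for t in GUESSERS:
        try:
            t(value)
            return t
        except ValueError:
            pass
    return str

class ToySchema:
    def __init__(self):
        self.fields = {}  # field name -> type guessed from the first value seen

    def index(self, doc):
        for name, value in doc.items():
            # The first sighting of a field locks in the guessed type.
            expected = self.fields.setdefault(name, guess_type(value))
            try:
                expected(value)
            except ValueError:
                raise ValueError(
                    f"field '{name}' was guessed as {expected.__name__}; "
                    f"cannot index value {value!r}") from None

schema = ToySchema()
schema.index({"count": "1"})        # "count" is guessed as int
try:
    schema.index({"count": "1.0"})  # int("1.0") fails -> indexing error
except ValueError as e:
    print(e)
```

Had the 1.0 document arrived first, the field would have been guessed as a float and the integer documents would have indexed fine, which is exactly why this bug tends to appear intermittently under load.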
Next, if you're sending many docs rapidly through SolrCloud, there are
all kinds of things going on to try to update the configset, reload the
cores to get the latest configurations down to all of the replicas, and
the like.
So the very first thing I'd try is to define the schema manually and
see if that cures things.
BTW, the big, scary "DO NOT EDIT THIS FILE" in the managed_schema file
is a bit of overkill. You _can_ edit that file manually; the danger is
that if you have field guessing turned on, already-running Solr nodes
may overwrite your changes. So it's safe to manually edit that file and
push it to Zookeeper in two situations:
1> you have disabled "field guessing"
or
2> you edit and push when all your Solr nodes are shut down.
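For reference, in the Solr 6.x data-driven example configs field guessing is wired in through an update processor chain in solrconfig.xml, and disabling it usually means removing that chain from the update handler's defaults. A hedged sketch (the chain and processor names follow the stock data_driven_schema_configs; your solrconfig.xml may differ):

```xml
<!-- Hedged sketch based on the stock data_driven_schema_configs;
     adapt to your actual solrconfig.xml. Field guessing comes from
     the "add-unknown-fields-to-the-schema" chain, in particular its
     AddSchemaFieldsUpdateProcessorFactory processor (visible in the
     stack trace earlier in this thread). -->
<initParams path="/update/**">
  <lst name="defaults">
    <!-- Remove or comment out this default so updates no longer run
         through the guessing chain:
    <str name="update.chain">add-unknown-fields-to-the-schema</str>
    -->
  </lst>
</initParams>
```

After changing the config, push it to ZooKeeper and reload the collection so all replicas pick up the new chain.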
Best,
Erick
On Wed, Jan 18, 2017 at 9:11 PM, Joe Obernberger
<joseph.obernber...@gmail.com> wrote:
Hi All - I've been trying to debug this, but it keeps occurring. Even
if I do 100 at a time, or 50 at a time, eventually I get the below
stack trace. I've also adjusted the autoSoftCommit and autoCommit times
to a variety of values. It still fails after a time; typically around
27-50 million records I get this error. This is on a newly created
collection (one that I've been dropping and recreating after each
test).
Is there anything I can try that may help debug? Perhaps my method of
indexing is incorrect? Thanks for any ideas!
-Joe
On 1/17/2017 10:13 AM, Joe Obernberger wrote:
While indexing a large number of records in Solr Cloud 6.3.0 with a
5-node configuration, I received an error. I'm using Java code / SolrJ
to perform the indexing by creating a list of SolrInputDocuments, 1000
at a time, and then calling CloudSolrClient.add(list). The records are
small - about 6 fields of short strings and numbers.
If I do 100 at a time, I can't replicate the error, but 1000 at a time
consistently causes the below exception to occur. The index is stored
in a shared HDFS.
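The batching pattern just described looks roughly like this (a hedged sketch only - the actual indexer in this thread is Java/SolrJ and isn't shown, so the names below are invented; `client` stands in for anything with an add(list) method, such as CloudSolrClient):

```python
# Hedged sketch of the batching pattern described above: accumulate
# documents into fixed-size batches and hand each full batch to the
# client's add() in a single call, flushing any remainder at the end.

def index_in_batches(docs, client, batch_size=1000):
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == batch_size:
            client.add(batch)   # one round-trip per full batch
            batch = []
    if batch:                   # flush the final partial batch
        client.add(batch)

# A stand-in client that just records what it was asked to add,
# so the batching behavior can be observed without a Solr server.
class RecordingClient:
    def __init__(self):
        self.batches = []
    def add(self, docs):
        self.batches.append(list(docs))

client = RecordingClient()
index_in_batches(({"id": i} for i in range(2500)), client, batch_size=1000)
print([len(b) for b in client.batches])  # [1000, 1000, 500]
```

Dropping batch_size to 100 changes only how the same documents are grouped, which is why it is surprising that the batch size alone changes whether the server-side error appears.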
2017-01-17 04:21:00.022 ERROR (qtp606548741-21) [c:Worldline s:shard5 r:core_node1 x:Worldline_shard5_replica1] o.a.s.h.RequestHandlerBase org.apache.solr.common.SolrException: Exception writing document id 6228601a-8756-4b16-bdc3-ad026754b225 to the index; possible analysis error.
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:178)
        at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:335)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:957)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1112)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:738)
        at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
        at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:97)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:179)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readIterator(JavaBinUpdateRequestCodec.java:135)
        at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:275)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readNamedList(JavaBinUpdateRequestCodec.java:121)
        at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:240)
        at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:158)
        at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:186)
        at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:107)
        at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:54)
        at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
        at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:153)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:2213)
        at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)
        at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:460)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:303)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:254)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
        at org.eclipse.jetty.server.Server.handle(Server.java:518)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
        at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:740)
        at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:754)
        at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1558)
        at org.apache.solr.update.DirectUpdateHandler2.doNormalUpdate(DirectUpdateHandler2.java:279)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:211)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:166)
        ... 62 more
Caused by: org.apache.lucene.index.CorruptIndexException: invalid state: base=49, docID=258046 (resource=_8u.cfs [slice=_8u.fdt])
        at org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.merge(CompressingStoredFieldsWriter.java:559)
        at org.apache.lucene.index.SegmentMerger.mergeFields(SegmentMerger.java:200)
        at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:89)
        at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4312)
        at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3889)
        at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:588)
        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:626)
-Joe