[ 
http://issues.apache.org/jira/browse/LUCENE-488?page=comments#action_12363628 ] 

george washington commented on LUCENE-488:
------------------------------------------

Daniel, a combination of:

      iwriter.setMaxBufferedDocs(2);
      iwriter.setMergeFactor(2);
      iwriter.setUseCompoundFile(false);

seems to help. I still get OOM errors, but only with several larger docs
(>10MB) in succession, which is a significant improvement over the previous
5MB limit. Perhaps this issue should be kept open until a more satisfactory
solution is found.
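
For reference, here is a minimal, self-contained sketch of applying those
settings (assuming the Lucene 1.9 IndexWriter API; the index path, analyzer
choice, and field name are my own illustrative assumptions):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;

    public class LargeBinaryIndexer {
        public static void main(String[] args) throws Exception {
            // Index path is illustrative; 'true' creates a fresh index.
            IndexWriter iwriter =
                new IndexWriter("/tmp/bigbinary-index", new StandardAnalyzer(), true);

            // Flush and merge as eagerly as possible so the writer holds
            // at most a couple of large documents in RAM at a time.
            iwriter.setMaxBufferedDocs(2);
            iwriter.setMergeFactor(2);
            iwriter.setUseCompoundFile(false);

            // A stored binary field of ~5MB, as in the failing test case.
            byte[] payload = new byte[5 * 1024 * 1024];
            Document doc = new Document();
            doc.add(new Field("data", payload, Field.Store.YES));
            iwriter.addDocument(doc);

            iwriter.close();
        }
    }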
Thank you for your help.


> adding docs with large (binary) fields of 5mb causes OOM regardless of heap size
> ---------------------------------------------------------------------------------
>
>          Key: LUCENE-488
>          URL: http://issues.apache.org/jira/browse/LUCENE-488
>      Project: Lucene - Java
>         Type: Bug
>     Versions: 1.9
>  Environment: Linux asimov 2.6.6.hoss1 #1 SMP Tue Jul 6 16:31:01 PDT 2004 
> i686 GNU/Linux
>     Reporter: Hoss Man
>  Attachments: TestBigBinary.java
>
> as reported by George Washington in a message to [email protected] 
> with subject "Storing large text or binary source documents in the index and 
> memory usage" around 2006-01-21, there seems to be a problem with adding docs 
> containing really large fields.
> I'll attach a test case in a moment. Note that (for me), regardless of how big 
> I make my heap size, and regardless of what value I set MIN_MB to, once it 
> starts trying to make documents containing 5MB of data, it can only add 9 
> before it rolls over and dies.
> Here's the output from the code I will attach in a moment...
>     [junit] Testsuite: org.apache.lucene.document.TestBigBinary
>     [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 78.656 sec
>     [junit] ------------- Standard Output ---------------
>     [junit] NOTE: directory will not be cleaned up automatically...
>     [junit] Dir: /tmp/org.apache.lucene.document.TestBigBinary.97856146.100iters.4mb
>     [junit] iters completed: 100
>     [junit] totalBytes Allocated: 419430400
>     [junit] NOTE: directory will not be cleaned up automatically...
>     [junit] Dir: /tmp/org.apache.lucene.document.TestBigBinary.97856146.100iters.5mb
>     [junit] iters completed: 9
>     [junit] totalBytes Allocated: 52428800
>     [junit] ------------- ---------------- ---------------
>     [junit] Testcase: testBigBinaryFields(org.apache.lucene.document.TestBigBinary): Caused an ERROR
>     [junit] Java heap space
>     [junit] java.lang.OutOfMemoryError: Java heap space
>     [junit] Test org.apache.lucene.document.TestBigBinary FAILED
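
For context, the failure pattern described above amounts to a loop of the
following shape. This is an illustrative sketch only, not the attached
TestBigBinary.java; the index path and field name are assumptions:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;

    // Adds 100 documents, each carrying a 5MB stored binary field, to a
    // fresh index with default writer settings; per the report above, only
    // nine additions succeed before an OutOfMemoryError is thrown.
    public class BigBinaryLoop {
        public static void main(String[] args) throws Exception {
            IndexWriter iwriter =
                new IndexWriter("/tmp/bigbinary-loop", new StandardAnalyzer(), true);
            for (int i = 0; i < 100; i++) {
                byte[] payload = new byte[5 * 1024 * 1024];
                Document doc = new Document();
                doc.add(new Field("data", payload, Field.Store.YES));
                iwriter.addDocument(doc);
                System.out.println("iters completed: " + (i + 1));
            }
            iwriter.close();
        }
    }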
