Without digging too deep into why exactly this is happening, here are
the general options:

0. Are you actually committing? Check the messages in the logs and see
if the records show up when you expect them to.
1. Are you actually trying to feed a 20 MB file to Solr? Maybe it's the
HTTP buffer that's blowing up? Try using stream.file instead (note the
security warning, though): http://wiki.apache.org/solr/ContentStream
2. Split the file into smaller ones and commit each separately
3. Set a hard auto-commit in solrconfig.xml based on the number of
documents, to flush in-memory structures to disk
4. Switch to using DataImportHandler to pull from XML instead of pushing
5. Increase the amount of memory available to Solr (-Xmx/-Xms command-line flags)
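
For option 2, here is a minimal sketch of how the splitting could look,
assuming the input is one big Solr <add>...</add> payload; the function
name, file contents, and batch size are illustrative, not from this thread:

```python
# Sketch: split one large Solr <add> XML payload into smaller batches,
# so each batch can be POSTed (and committed) separately instead of
# sending the whole 20 MB file in a single request.
import xml.etree.ElementTree as ET


def split_solr_add(xml_text, batch_size):
    """Return a list of <add> payloads, each with at most batch_size <doc>s."""
    root = ET.fromstring(xml_text)
    docs = root.findall("doc")
    batches = []
    for i in range(0, len(docs), batch_size):
        add = ET.Element("add")
        add.extend(docs[i:i + batch_size])          # move a slice of docs
        batches.append(ET.tostring(add, encoding="unicode"))
    return batches


if __name__ == "__main__":
    # Tiny stand-in for the big file from the thread.
    big = "<add>" + "".join(
        "<doc><field name='id'>%d</field></doc>" % n for n in range(10)
    ) + "</add>"
    for n, batch in enumerate(split_solr_add(big, 4)):
        # In real use, each batch would be POSTed to /update, then committed.
        print(n, batch.count("<doc>"))
```

Each resulting batch is small enough that the HTTP buffer and Solr's
in-memory structures stay manageable between commits.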

Regards,
   Alex.

Personal website: http://www.outerthoughts.com/
Current project: http://www.solr-start.com/ - Accelerating your Solr proficiency

On Mon, Mar 31, 2014 at 12:00 PM, Floyd Wu <floyd...@gmail.com> wrote:
> I have many plain-text XML files that I transform into the Solr XML format.
> But every time I send them to Solr, I hit an OOM exception.
> How do I configure Solr to "eat" these big XML files?
> Please guide me. Thanks
>
> floyd