On 11/6/2015 10:32 AM, Yangrui Guo wrote:
> <entity name="movie_actress" transformer="RegexTransformer"

There's a good chance that JDBC is trying to read the entire result set
(all three million rows) into memory before sending any of that info to
Solr.

Set batchSize to -1 for MySQL so that it streams results to Solr as
soon as they are available, instead of waiting for all of them.  Here's
more info on the situation, which frequently causes OutOfMemoryError
problems for users:

http://wiki.apache.org/solr/DataImportHandlerFaq?highlight=%28mysql%29|%28batchsize%29#I.27m_using_DataImportHandler_with_a_MySQL_database._My_table_is_huge_and_DataImportHandler_is_going_out_of_memory._Why_does_DataImportHandler_bring_everything_to_memory.3F
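For reference, the setting goes on the dataSource element in your DIH
config.  A minimal sketch (the driver class, URL, host, and database
name here are placeholders for your own settings):

  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              user="dbuser"
              password="dbpass"
              batchSize="-1"/>

Internally, DIH translates batchSize="-1" into a JDBC fetch size of
Integer.MIN_VALUE, which is the magic value the MySQL Connector/J
driver requires to enable row-by-row streaming.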


Thanks,
Shawn
