[
https://issues.apache.org/jira/browse/SOLR-812?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12675227#action_12675227
]
Shalin Shekhar Mangar commented on SOLR-812:
--------------------------------------------
bq. I'm running postgres as well and I finally got rid of my memory issues with
this data source configuration:
Thanks for the information Martin. Can you please add this to the
DataImportHandlerFaq wiki page?
http://wiki.apache.org/solr/DataImportHandlerFaq
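For the record, a JdbcDataSource configuration along these lines might look like the following. The values are illustrative, not Martin's actual settings, and the readOnly/autoCommit attributes assume the behavior this issue adds:

```xml
<dataSource type="JdbcDataSource"
            driver="org.postgresql.Driver"
            url="jdbc:postgresql://localhost:5432/mydb"
            user="solr"
            readOnly="true"
            autoCommit="false"
            batchSize="500"/>
```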
> JDBC optimizations: setReadOnly, setMaxRows
> -------------------------------------------
>
> Key: SOLR-812
> URL: https://issues.apache.org/jira/browse/SOLR-812
> Project: Solr
> Issue Type: Improvement
> Components: contrib - DataImportHandler
> Affects Versions: 1.3
> Reporter: David Smiley
> Assignee: Shalin Shekhar Mangar
> Fix For: 1.4
>
> Attachments: SOLR-812.patch, SOLR-812.patch
>
>
> I'm looking at the DataImport code as of Solr v1.3, using it with Postgres
> and very large data sets, and there are some improvements I'd like to suggest.
> 1. call setReadOnly(true) on the connection. DIH doesn't change the data so
> this is obvious.
> 2. call setAutoCommit(false) on the connection. (this is needed by Postgres
> to ensure that the fetchSize hint actually works)
> 3. call setMaxRows(X) on the statement when the dataimport.jsp debugger is
> only grabbing X rows. fetchSize is just a hint, and alone it isn't
> sufficient to cap the result set.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.