At present, we do not limit the fetch size at the JDBC driver level. Doing
so (e.g. preparedStatement.setFetchSize(getMaxMessagesPerPoll())) would fix
this issue.
The drawback is that we don't know how many rows in the database will match
our query. We can only assume that if the number of rows read equals
maxMessagesPerPoll, there are more rows to read.
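A minimal sketch of that polling logic, independent of JDBC (the names
BatchRead and readBatch are mine, just for illustration): read at most
maxMessagesPerPoll rows per poll and treat a full batch as a hint, not a
guarantee, that more rows remain. With a real PreparedStatement you would
additionally call setFetchSize(maxMessagesPerPoll) so the driver does not
buffer the entire result set in memory.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class BatchRead {

    // Read at most maxMessagesPerPoll rows from the cursor.
    static <T> List<T> readBatch(Iterator<T> rows, int maxMessagesPerPoll) {
        List<T> batch = new ArrayList<>();
        while (batch.size() < maxMessagesPerPoll && rows.hasNext()) {
            batch.add(rows.next());
        }
        return batch;
    }

    public static void main(String[] args) {
        Iterator<Integer> rows = Arrays.asList(1, 2, 3, 4, 5).iterator();

        List<Integer> first = readBatch(rows, 3);
        // A full batch only *suggests* more rows; it is not a guarantee.
        System.out.println(first + " maybeMore=" + (first.size() == 3));

        List<Integer> second = readBatch(rows, 3);
        // A short batch means the result set is exhausted.
        System.out.println(second + " maybeMore=" + (second.size() == 3));
    }
}
```

Note the corner case: if the row count is an exact multiple of
maxMessagesPerPoll, the last full batch still reports "maybe more", and the
following poll comes back empty. That is exactly the ambiguity described
above.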

From my point of view, we should go for this change. If nobody has
objections, I will make the needed changes later this week...

Best,

Christian
-----------------

Software Integration Specialist

Apache Member
V.P. Apache Camel | Apache Camel PMC Member | Apache Camel committer
Apache Incubator PMC Member

https://www.linkedin.com/pub/christian-mueller/11/551/642


On Fri, Aug 8, 2014 at 10:34 PM, Matt Payne <pa...@mattpayne.org> wrote:

> I am getting a java.lang.OutOfMemoryError: Java heap space when using the
> camel sql component with a query that returns a large number of rows.
>
> Using a small value for maxMessagesPerPoll=10 does not help[1].
>
> When reading the source[2], I see:
> 255   protected List<Map<String, Object>> queryForList(ResultSet rs) throws
> SQLException {
> 256         ColumnMapRowMapper rowMapper = new ColumnMapRowMapper();
> 257         RowMapperResultSetExtractor<Map<String, Object>> mapper = new
> RowMapperResultSetExtractor<Map<String, Object>>(rowMapper);
> 258         List<Map<String, Object>> data = mapper.extractData(rs);
> 259         return data;
> 260     }
>
> It seems that all of the result set, rs, is going to be read regardless of
> how large it is.   When stepping through via eclipse's debugger this is
> what I see happening.
>
> It's unclear how to ask the camel sql component to take only X rows at a
> time. Is there a way to do this, please?
>
> Thanks! --Matt Payne
>
>
> [1]
> http://camel.465427.n5.nabble.com/Fetching-data-in-batches-td5718366.html
> [2]
>
> https://git-wip-us.apache.org/repos/asf?p=camel.git;a=blob;f=components/camel-sql/src/main/java/org/apache/camel/component/sql/SqlEndpoint.java#l255
>
