On Wed, Nov 5, 2014 at 6:46 PM, Tim Dudgeon <tdudgeon...@gmail.com> wrote:
> I'm encountering a strange problem using the JDBC component with
> PostgreSQL.
> The PostgreSQL driver by default fetches large result sets entirely into
> memory. To avoid this you need to call
> statement.setFetchSize()
> so that it uses a cursor instead.
> This works fine in a simple Java example, but when I try the same thing in
> Camel I get a strange exception thrown from deep inside the PostgreSQL driver.
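
(For reference, a minimal plain-JDBC sketch of the cursor-based fetching
described above, assuming a placeholder DataSource and the table name from the
mail; the PostgreSQL driver only switches to a cursor when autoCommit is off,
the fetch size is non-zero and the ResultSet is forward-only, which is the
default:)

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.sql.DataSource;

public class CursorFetchExample {

    public static void stream(DataSource dataSource) throws Exception {
        try (Connection con = dataSource.getConnection()) {
            // Cursor-based fetching requires autoCommit to be off
            con.setAutoCommit(false);
            try (Statement stmt = con.createStatement()) {
                // Fetch rows from the cursor in batches of 100
                stmt.setFetchSize(100);
                try (ResultSet rs = stmt.executeQuery("select * from a_very_large_table")) {
                    while (rs.next()) {
                        System.out.println("Processing row " + rs.getObject(1));
                    }
                }
            }
            con.commit();
        }
    }
}
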
> My route looks a bit like this:
>
> from('direct:databasequery')
>     .to('jdbc:myDataSource?outputType=StreamList&statement.fetchSize=100')
>     .split(body()).streaming()
>     .log('Processing row')
>
> I then send it a body containing something like:
> select * from a_very_large_table
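
(For completeness, that query would typically be sent to the route as the
message body, for example via a ProducerTemplate; a sketch, assuming an
already started CamelContext:)

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;

public class SendQuery {
    public static void send(CamelContext camelContext) {
        // The SQL statement itself is the message body for the camel-jdbc endpoint
        ProducerTemplate template = camelContext.createProducerTemplate();
        template.sendBody("direct:databasequery", "select * from a_very_large_table");
    }
}
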
>
> When I combine it with maxRowSize=10 it's fine.
> When I combine it with maxRowSize=1000 it blows up, so it looks like it
> fails when it needs to go back for a second chunk of data.
> The error from PostgreSQL is this:
>

maxRowSize ??? Do you mean fetchSize?

Also, which version of Camel are you using?


> Caused by: org.postgresql.util.PSQLException: ERROR: portal "C_2" does not exist
> at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198)
> at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927)
> at org.postgresql.core.v3.QueryExecutorImpl.fetch(QueryExecutorImpl.java:2130)
> at org.postgresql.jdbc2.AbstractJdbc2ResultSet.next(AbstractJdbc2ResultSet.java:1917)
> at org.apache.camel.component.jdbc.ResultSetIterator.loadNext(ResultSetIterator.java:117)
> at org.apache.camel.component.jdbc.ResultSetIterator.next(ResultSetIterator.java:83)
>
>
> Anyone got any idea what's going on here?
>
> Tim
>



-- 
Claus Ibsen
-----------------
Red Hat, Inc.
Email: cib...@redhat.com
Twitter: davsclaus
Blog: http://davsclaus.com
Author of Camel in Action: http://www.manning.com/ibsen
hawtio: http://hawt.io/
fabric8: http://fabric8.io/
