Hi Pascal,

I suspect you need to use an iterated query:

http://cayenne.apache.org/docs/3.1/cayenne-guide/performance-tuning.html#iterated-queries

As you iterate over the full record set, you can convert each DataRow into a
Cayenne object (see the section just above the iterated-queries section in the
documentation linked above) in a *different* DataContext.  Gather up 50 or 100
or 1000 objects (whatever number feels good to you) in that second DataContext,
commit them, then throw away that DataContext and create a new one.  Repeat.
This should keep your memory usage fairly constant and allow you to process
arbitrarily large result sets.
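A rough sketch of that loop, assuming Cayenne 3.1 (the entity name "MyEntity", the project file name, and the batch size of 1000 are placeholders, and the exact method signatures should be double-checked against the docs linked above):

```java
import org.apache.cayenne.DataRow;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.configuration.server.ServerRuntime;
import org.apache.cayenne.query.SelectQuery;

public class BatchMigration {
    public static void main(String[] args) throws Exception {
        ServerRuntime runtime = new ServerRuntime("cayenne-project.xml");
        DataContext readContext = (DataContext) runtime.getContext();

        // Fetch raw DataRows instead of full objects to keep the
        // iterated read lightweight.
        SelectQuery query = new SelectQuery("MyEntity");
        query.setFetchingDataRows(true);

        ResultIterator it = readContext.performIteratedQuery(query);
        try {
            DataContext writeContext = (DataContext) runtime.getContext();
            int batch = 0;
            while (it.hasNextRow()) {
                DataRow row = (DataRow) it.nextRow();
                // Convert the raw row into a Cayenne object in the
                // *second* context, not the one doing the iteration.
                writeContext.objectFromDataRow("MyEntity", row);
                if (++batch >= 1000) {
                    writeContext.commitChanges();
                    // Discard the context so committed objects can be
                    // garbage-collected, then start a fresh one.
                    writeContext = (DataContext) runtime.getContext();
                    batch = 0;
                }
            }
            writeContext.commitChanges(); // commit the final partial batch
        } finally {
            it.close(); // iterated queries hold an open connection
        }
    }
}
```

The key point is that no context ever holds more than one batch's worth of registered objects at a time, which is what keeps the heap flat.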

mrg


On Fri, May 19, 2017 at 9:27 AM, Pascal Robert <[email protected]> wrote:

> Hi,
>
> I’m still in my FileMaker -> MySQL migration project. This time, I want to
> migrate a FileMaker table that has 445 244 records in it. If I fetch
> everything into an object entity for each row, I get a Java heap
> space error, which is somewhat expected given the size of the result set.
>
> If I call setFetchLimit() with a 10 000 limit, it works fine. FileMaker
> doesn’t support fetch limits, so I can’t do anything on that side.
>
> Any tips?
