Hi all,

I am using OJB 1.4 against Oracle 10g with classes12.jar. We are exporting data from the database using a SQL query that returns 600,000 records. When we call iterator = query.getIteratorByQuery() and iterate over the results, we find that after about 300,000 records the VM grows rapidly and the program crashes with OutOfMemoryError. OJB uses more than 1.5 GB of memory before crashing, whereas a simple JDBC program extracts all 600K records using only about 160 MB.

Does anyone know how to solve this memory issue when iterating over a large result set in Oracle? One suggestion in the OJB archives (for PostgreSQL) was to set fetchSize=&lt;somevalue&gt;; will that solve this problem?

Thanks and regards,
Somendra Paul.
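For comparison, here is a minimal sketch of the plain-JDBC streaming approach described above; the class name, fetch-size value, and helper methods are illustrative assumptions, not code from the original program. java.sql.Statement.setFetchSize() hints to the Oracle driver how many rows to buffer per network round trip (the driver's default is 10 rows), so the full result set is never materialized in the JVM, which is presumably why the plain-JDBC version stays near 160 MB:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Iterator;

public final class StreamingExport {

    // Streams rows from a plain JDBC statement one at a time.
    // setFetchSize() controls rows buffered per round trip; the whole
    // result set is never held in memory. The value 500 is a
    // hypothetical starting point to tune against your row size.
    public static long export(Connection conn, String sql, int fetchSize)
            throws SQLException {
        long count = 0;
        try (Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(fetchSize);
            try (ResultSet rs = stmt.executeQuery(sql)) {
                while (rs.next()) {
                    // write the current row to the export target here;
                    // nothing is accumulated across iterations
                    count++;
                }
            }
        }
        return count;
    }

    // The same one-row-at-a-time discipline over any iterator: no row is
    // retained after processing, so memory stays flat regardless of count.
    public static long consume(Iterator<?> rows) {
        long count = 0;
        while (rows.hasNext()) {
            rows.next();
            count++;
        }
        return count;
    }
}
```

Note that an OJB iterator additionally materializes a persistent object per row (and may place it in OJB's object cache), so a larger JDBC fetch size alone may not be enough to avoid the OOM there.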