[ http://issues.apache.org/jira/browse/DERBY-1713?page=comments#action_12429186 ]

Ibrahim commented on DERBY-1713:
--------------------------------
Thank you again. Yes, I have indexes on Code, Seq and sheekhid.

When I run:

    SELECT * FROM Contents ORDER BY Code, Seq

memory usage climbs to the maximum. When I run:

    SELECT * FROM Contents ORDER BY Code

or

    SELECT * FROM Contents ORDER BY Seq

memory usage is negligible while the query is being processed (less than 2 MB).

For completeness I attached the whole database (bzip2 format, so please use WinZip 10) together with a sample program that simulates the case. I understand that I need to increase the heap, but first I want to track down all the leaks, because the database will keep growing; that is how I ran into these OutOfMemory problems where the memory is never freed. By the way, the memory is released in the attached sample program after shutting down the database, but not in my project (Cataloger), even though I get the confirmation exception (SQLException: Derby system shutdown). I am trying to reproduce that exactly; I have checked it many times, but I could not get it to happen in the sample program.

> Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: DERBY-1713
>                 URL: http://issues.apache.org/jira/browse/DERBY-1713
>             Project: Derby
>          Issue Type: Bug
>          Components: Performance
>    Affects Versions: 10.2.1.0
>         Environment: Windows XP SP2
>                      JRE 1.6 beta2
>            Reporter: Ibrahim
>            Priority: Critical
>         Attachments: test.zip
>
>
> I face a problem when querying large tables. I run the SQL below, it gets stuck
> in this query and throws a java heap OutOfMemory exception:
>
>     SELECT count(*) FROM <table> WHERE .....
>
> N.B. I'm using a database of more than 90,000 records (40 MB). I set the
> maxHeap to 32 MB (all other settings, pageCache etc., have their default values).
> Then I shut down the database, but the memory is not returned to the system
> (it remains at the 32 MB maximum). I tried increasing the maxHeap to 128 MB,
> which works and releases the memory, so I think the problem is that once it
> reaches the maxHeap it no longer responds to anything, such as closing the
> connection or shutting down the database. How can I get rid of this? (I cannot
> keep increasing the maxHeap as the database grows; I want it to throw an
> exception and release the memory.)
> I'm using this to shut down the DB:
>
>     try { DriverManager.getConnection("jdbc:derby:;shutdown=true"); }
>     catch (SQLException ex) { System.err.println("SQLException: " + ex.getMessage()); }
>
> I'm using a memory profiler to monitor memory usage.
> Thanks in advance.
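For anyone trying to reproduce the memory growth described in the comment, here is a minimal sketch of the kind of driver program involved. The table and column names (Contents, Code, Seq) come from the comment; the database name "test" and everything else are assumptions, not the attached sample program.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class OrderByRepro {
        public static void main(String[] args) throws Exception {
            // Explicit driver loading; needed on older JREs such as the
            // JRE 1.6 beta mentioned in the environment.
            Class.forName("org.apache.derby.jdbc.EmbeddedDriver");

            // "test" is an assumed database name, not confirmed by the report.
            Connection conn = DriverManager.getConnection("jdbc:derby:test");
            Statement stmt = conn.createStatement();
            ResultSet rs = null;
            try {
                // Ordering on two columns presumably cannot be satisfied by the
                // single-column indexes on Code and Seq, so Derby has to sort the
                // whole result, which is where the reporter sees the heap fill up.
                rs = stmt.executeQuery("SELECT * FROM Contents ORDER BY Code, Seq");
                long rows = 0;
                while (rs.next()) {
                    rows++;               // just drain the result set
                }
                System.out.println("rows: " + rows);
            } finally {
                // Close everything so no result sets or sort buffers stay referenced.
                if (rs != null) rs.close();
                stmt.close();
                conn.close();
            }
        }
    }

Running this with a small heap (for example -Xmx32m) should show the two-column ORDER BY consuming memory, whereas an ORDER BY on a single indexed column does not, matching the behaviour the comment describes.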

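Similarly, a sketch (not the reporter's actual code) of how the shutdown confirmation can be checked by SQLState instead of by message text; the database name "test" is again an assumption:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class ShutdownSketch {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.derby.jdbc.EmbeddedDriver");

            // "test" is an assumed database name.
            Connection conn = DriverManager.getConnection("jdbc:derby:test");

            // ... run the application's queries here ...

            // Close connections first, so cached statements and result sets
            // stop referencing Derby's buffers.
            conn.close();

            try {
                // Shuts down the entire Derby system, as in the report.
                DriverManager.getConnection("jdbc:derby:;shutdown=true");
            } catch (SQLException ex) {
                // A clean system shutdown is reported as an exception with
                // SQLState XJ015 ("Derby system shutdown"); anything else is
                // a real error.
                if ("XJ015".equals(ex.getSQLState())) {
                    System.out.println("Derby shut down cleanly");
                } else {
                    throw ex;
                }
            }
        }
    }

With this pattern a clean system shutdown (SQLState XJ015) can be told apart from a genuine failure, and closing the connections before shutting down gives the memory profiler a chance to see Derby's caches become unreachable.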