Hi guys,

[DESCRIPTION] The code below inserts 1,000 records into the database.

for (int i = 1; i < 1000; i++) {
    EntityManager em = PersistenceManager.getEntityManager();
    EntityTransaction tx = em.getTransaction();
    try {
        tx.begin();

        // Generate auth token and store it!
        String authInfo = AUTH_TOKEN_PREFIX + UUID.randomUUID();
        org.apache.juddi.model.AuthToken modelAuthToken =
                new org.apache.juddi.model.AuthToken();
        if (authInfo != null) {
            modelAuthToken.setAuthToken(authInfo);
            modelAuthToken.setCreated(new Date());
            modelAuthToken.setLastUsed(new Date());
            modelAuthToken.setAuthorizedName(publisherId);
            modelAuthToken.setNumberOfUses(0);
            modelAuthToken.setTokenState(AUTHTOKEN_ACTIVE);
            em.persist(modelAuthToken);
        }

        apiAuthToken = new org.uddi.api_v3.AuthToken();
        MappingModelToApi.mapAuthToken(modelAuthToken, apiAuthToken);
        tx.commit();
    } finally {
        if (tx.isActive()) {
            tx.rollback();
        }
        em.clear();
        em.close();
    }
}


[ISSUE]
After leaving this code I end up with 1,000 org.apache.juddi.model.AuthToken objects in memory. According to the profiler, these objects cannot be garbage collected.

This seems like pretty much the most common use case for an OR-mapping tool, so I find it hard to believe OpenJPA has a memory leak here. Does anyone see what I'm doing wrong? Or can someone point me to an example that does not exhibit this behavior? BTW, the same code using Hibernate does not accumulate these objects.

We're using OpenJPA 1.2.1.
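
For reference, here is the kind of batched variant I could switch to if that helps. This is only an untested sketch, reusing the same PersistenceManager, AUTH_TOKEN_PREFIX, AUTHTOKEN_ACTIVE and publisherId as above, and it keeps a single EntityManager for the whole run instead of one per insert:

EntityManager em = PersistenceManager.getEntityManager();
EntityTransaction tx = em.getTransaction();
try {
    tx.begin();
    for (int i = 1; i < 1000; i++) {
        String authInfo = AUTH_TOKEN_PREFIX + UUID.randomUUID();
        org.apache.juddi.model.AuthToken modelAuthToken =
                new org.apache.juddi.model.AuthToken();
        modelAuthToken.setAuthToken(authInfo);
        modelAuthToken.setCreated(new Date());
        modelAuthToken.setLastUsed(new Date());
        modelAuthToken.setAuthorizedName(publisherId);
        modelAuthToken.setNumberOfUses(0);
        modelAuthToken.setTokenState(AUTHTOKEN_ACTIVE);
        em.persist(modelAuthToken);

        if (i % 100 == 0) {
            em.flush();  // push pending inserts to the database
            em.clear();  // detach managed entities so they can be collected
        }
    }
    tx.commit();
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
    em.close();
}

That said, the original loop already clears and closes each EntityManager, so I would still expect those 1,000 instances to become collectable once the loop exits.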


Thx,


Kurt

Apache jUDDI.
