Re: empty collections
Hi,

Solved. The problem was that only one side of the bidirectional relationship was being assigned before persisting: user.getGroups().add(group); The following fixed the problem:

user.getGroups().add(group);
group.getUsers().add(user);

Interesting that by restarting the app, the relationships could be read, just not after persisting.

Thanks, chris

> Hi, I have a bidirectional many-to-many relationship between two entities. Each entity references a collection of the other entity. I import data, then retrieve entities with a query. However, when I refer to an entity's collection, its size is always 0 (e.g., group.getUsers().size() is 0). I look into the db and the join table is filled correctly. When I restart the app, the same entity correctly returns the collection. That is, group.getUsers().size() is now > 0 and I can access the collection's elements. I am wondering if this is a known pattern and there is an a-ha out there. Otherwise OpenJPA is working great! Thanks, chris
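The fix described above can be sketched in plain Java. This is not the poster's actual entity code: the JPA annotations are omitted so the sketch runs standalone, and the addUser helper is an illustrative convention for keeping both sides of the relationship in sync so the inverse side can never be forgotten.

```java
import java.util.HashSet;
import java.util.Set;

public class BidirectionalDemo {
    static class User {
        final Set<Group> groups = new HashSet<>();
        Set<Group> getGroups() { return groups; }
    }

    static class Group {
        final Set<User> users = new HashSet<>();
        Set<User> getUsers() { return users; }

        // Convenience method that wires BOTH sides of the relationship,
        // which is what fixed the empty-collection symptom above.
        void addUser(User user) {
            users.add(user);
            user.getGroups().add(this);
        }
    }

    public static void main(String[] args) {
        User user = new User();
        Group group = new Group();
        group.addUser(user);
        System.out.println(group.getUsers().size()); // prints 1
        System.out.println(user.getGroups().size()); // prints 1
    }
}
```

With a helper like this, the JPA provider sees a consistent in-memory graph at persist time, instead of only finding the relationship when it re-reads the join table after a restart.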
Re: Problems updating db entry
Hi,

Will adding a version field and locking properties to the entities help with this matter?

cheers, Håkon

2010/1/13 Håkon Sagehaug hakon.sageh...@uni.no:
> Hi Simone, The update does not raise any exceptions. I have no version field on the entity, and I have not configured locking. I read a little about locking; does it also apply outside a transaction? The update operation is in a transaction but the read is not. cheers, Håkon

2010/1/12 Simone Gianni simo...@apache.org:
> Hi Håkon, is the update of the entity on the database raising any exception? Or does it silently refuse to update the database? Are you using a version field for optimistic locking? Simone

Håkon Sagehaug wrote:
> Hi all, I've got a problem I'm not sure how to solve. The problem is as follows. We have a web service that can grab files from an FTP or HTTP site, and we use OpenJPA to store metadata about the resource (name, when it was inserted, etc.). We write this information to the db and a new thread is started to download the file; after the download finishes, the db entry is retrieved and updated with a status message saying that the download is complete. We then have a client that, after sending a request, polls the service using an operation called getResource(id) to see if the state is set to finished. The problem we see is that when the client issues a getResource, we can't update the entry in the database. The file is downloaded, but the information in the db isn't updated. If the client issues the getResource request after a delay (2 sec), everything works fine. Is some synchronization needed? The web service uses the same EntityManagerFactory, but each method has its own EntityManager object; the download also runs in its own thread. Has anyone experienced this? cheers, Håkon

-- Simone Gianni, CEO Semeru s.r.l., Apache Committer, http://www.simonegianni.it/
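A version field is indeed the standard JPA answer to this kind of lost-update question. The following is only a conceptual sketch of what the provider does with a @Version column; in real code you would simply annotate a numeric field with @Version and let OpenJPA perform the check at commit. All names here are illustrative, not from the original entities.

```java
public class VersionCheckDemo {
    static class ResourceRow {
        long version;     // would be the @Version column in a real entity
        String status;
    }

    // Simulates the optimistic UPDATE ... WHERE version = :expected that a
    // JPA provider issues. Returns false when another writer got there first
    // (a real provider would throw an OptimisticLockException instead).
    static boolean update(ResourceRow row, long expectedVersion, String newStatus) {
        if (row.version != expectedVersion) {
            return false;
        }
        row.status = newStatus;
        row.version++;    // version is bumped on every successful commit
        return true;
    }

    public static void main(String[] args) {
        ResourceRow row = new ResourceRow();
        row.status = "DOWNLOADING";
        boolean first = update(row, 0, "FINISHED"); // succeeds
        boolean stale = update(row, 0, "FAILED");   // stale version, refused
        System.out.println(first + " " + stale + " " + row.status);
        // prints: true false FINISHED
    }
}
```

The point for the thread above: with a version field, a writer holding a stale copy fails loudly instead of silently overwriting the "download complete" status.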
RE: empty collections
I had some similar issues when loading a large number of data records that included some joins. I found that calling the clear function on the EntityManager every x records resolved my issue, and it solved my OOM issues as well.

Hth, Chris

-----Original Message-----
From: Christopher Giblin [mailto:c...@zurich.ibm.com]
Sent: Saturday, 16 January 2010 12:38 AM
To: users@openjpa.apache.org
Subject: Re: empty collections

> Hi, Solved. The problem was that only one side of the bidirectional relationship was being assigned before persisting: user.getGroups().add(group); The following fixed the problem: user.getGroups().add(group); group.getUsers().add(user); Interesting that by restarting the app, the relationships could be read, just not after persisting. Thanks, chris

>> Hi, I have a bidirectional many-to-many relationship between two entities. Each entity references a collection of the other entity. I import data, then retrieve entities with a query. However, when I refer to an entity's collection, its size is always 0 (e.g., group.getUsers().size() is 0). I look into the db and the join table is filled correctly. When I restart the app, the same entity correctly returns the collection. That is, group.getUsers().size() is now > 0 and I can access the collection's elements. I am wondering if this is a known pattern and there is an a-ha out there. Otherwise OpenJPA is working great! Thanks, chris
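The clear-every-x-records pattern mentioned above can be sketched as follows. A toy in-memory list stands in for the EntityManager's persistence context so the sketch runs standalone; with OpenJPA the loop body would call em.persist(record), and the periodic step would be em.flush() followed by em.clear(). The batch size of 100 is illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchClearDemo {
    static final int BATCH_SIZE = 100;

    static int importRecords(int total) {
        List<Object> context = new ArrayList<>(); // stands in for the managed entities
        int maxHeld = 0;
        for (int i = 1; i <= total; i++) {
            context.add(new Object());            // em.persist(record)
            maxHeld = Math.max(maxHeld, context.size());
            if (i % BATCH_SIZE == 0) {
                // em.flush(); em.clear();
                context.clear();                  // detach everything accumulated so far
            }
        }
        return maxHeld; // never exceeds BATCH_SIZE, which is why OOMs go away
    }

    public static void main(String[] args) {
        System.out.println(importRecords(10_000)); // prints 100
    }
}
```

Without the periodic clear, a long import keeps every loaded and persisted entity managed until commit, which is the memory growth the poster observed.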
Re: memory leak? - simple question
Thanks so much Kevin, that did the trick. Some of the issues you mentioned were introduced by me not copying correctly from the jUDDI project, but the main issue is that the openjpa plugin simply does not run for some reason. Anyway, it is no longer leaking now! I learned a lot from this exercise. And yes, my class jumped from 2 to 10 k. BTW, I'm pretty sure this used to work... Thanks again! Going to fix the jUDDI build now :)!

--Kurt

Kevin Sutter wrote:
> Hi Kurt, I have not run a profiler to verify the memory leak, but I think I found the problem... Although you thought you were running the PCEnhancer during build time, it was not being performed. When I looked at the Tomcat log with the original jar file, the following message was logged, which indicates you were falling back to the "not ready for production" subclassing support:
>
> 1844 juddiDatabase INFO [http-8080-1] openjpa.Enhance - Creating subclass for [class org.apache.juddi.AuthToken]. This means that your application will be less efficient and will consume more memory than it would if you ran the OpenJPA enhancer. Additionally, lazy loading will not be available for one-to-one and many-to-one persistent attributes in types using field access; they will be loaded eagerly instead.
>
> I made a few changes to your pom.xml and the core module to get the enhancement to work, and I believe the memory leak has gone away. Like I said, I did not run a profiler, but I added a finalize method to your AuthToken class and I now see that these entities are getting GC'd. One thing I would do is add the following property to your persistence.xml:
>
> <property name="openjpa.RuntimeUnenhancedClasses" value="warn"/>
>
> This property will produce the following warning if you accidentally fall into this subclassing support. We turn this off by default when running within WebSphere, since we really don't want customers to accidentally use this subclassing. And we have turned this off for 1.3.x and trunk due to the various problems associated with it. Unfortunately, the 1.2.x branch is kind of stuck with it...
>
> 1688 juddiDatabase WARN [http-8080-1] openjpa.Enhance - This configuration disallows runtime optimization, but the following listed types were not enhanced at build time or at class load time with a javaagent: [class org.apache.juddi.AuthToken].
>
> And, of course, your webservice no longer works either. You will get an HTTP Status 500 error with an error message and call stack that further explains the error.
>
> To get the maven plugin to work properly... I made several changes, so hopefully I remember everything:
>
> o The number one thing is that the openjpa:enhance goal doesn't seem to get automatically invoked when you compile. I have no idea why, but that doesn't seem to work.
> o You have to make your persistence.xml file available to the core module. For my testing, I just created a new directory in the core module called \openjpa-leak\core\src\main\resources\META-INF\persistence.xml. This will automatically get copied over to target, and then the openjpa:enhance goal will find it.
> o How you want to make your persistence.xml file available to both your core module and the leak-war module is up to you... :-)
> o I added a dependency in your core\pom.xml file for the 1.2.1 openjpa. The documentation [1] says it will default to 1.2.0 if none is specified:
>
> <dependencies>
>   <dependency>
>     <groupId>org.apache.openjpa</groupId>
>     <artifactId>openjpa</artifactId>
>     <version>1.2.1</version>
>   </dependency>
> </dependencies>
>
> o Your plugin properties don't seem to be specified correctly, at least according to the documentation, but I didn't touch them and things still seemed to work (once I got around the other problems).
> o I also changed where to find the entities to enhance, since this pom.xml still specified the model directory. There also seems to be confusion as to whether to specify "classes" or "includes". I went with "includes" since that's what the example showed.
> o It's very easy to see if the enhancement process worked or not, since the size of your entity classes increases by a few thousand bytes. And, with the extra property above, this will prevent you from accidentally using the unenhanced version.
>
> Hope this helps! Let me know if this gets you around the memory leak.
>
> Kevin
>
> [1] http://mojo.codehaus.org/openjpa-maven-plugin/usage.html

On Thu, Jan 14, 2010 at 2:12 PM, Kurt T Stam kurt.s...@gmail.com wrote:
> 1. Cool that you got it running :)
> 2. I just ran with openjpa 1.2.2.SNAPSHOT and still see an accumulation of AuthToken objects. --K

Kevin Sutter wrote:
> Hey, this wasn't so difficult... :-) Maybe I can still learn new things... I have Tomcat up and running and I can get your app running.

On Thu, Jan 14, 2010 at 1:46 PM, Kevin Sutter kwsut...@gmail.com wrote:
> Hi Kurt, These instructions, of course, assume that I know
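Regarding the first bullet (the openjpa:enhance goal not being invoked automatically): a common way to address this with Maven plugins is to bind the goal to a lifecycle phase explicitly. The following pom.xml fragment is only a sketch, assuming the Codehaus openjpa-maven-plugin described in the usage page [1]; the includes pattern is illustrative and would need to match the actual entity package.

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>openjpa-maven-plugin</artifactId>
  <configuration>
    <!-- illustrative: point at the compiled entity classes -->
    <includes>org/apache/juddi/**/*.class</includes>
  </configuration>
  <executions>
    <execution>
      <id>enhancer</id>
      <!-- run enhancement right after compilation, every build -->
      <phase>process-classes</phase>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With the goal bound to process-classes, enhancement runs on every mvn compile/package rather than only when openjpa:enhance is invoked by hand, which avoids silently falling back to the subclassing support.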
Re: memory leak? - simple question
Great. Glad to hear that you're back in business. I've created a JIRA [1] for this issue, just in case we resurrect the subclassing support to a first-class citizen. :-)

Thanks, Kevin

[1] https://issues.apache.org/jira/browse/OPENJPA-1462

On Fri, Jan 15, 2010 at 10:15 AM, Kurt T Stam kurt.s...@gmail.com wrote:
> Thanks so much Kevin, that did the trick. Some of the issues you mentioned were introduced by me not copying correctly from the jUDDI project, but the main issue is that the openjpa plugin simply does not run for some reason. Anyway, it is no longer leaking now! I learned a lot from this exercise. And yes, my class jumped from 2 to 10 k. BTW, I'm pretty sure this used to work... Thanks again! Going to fix the jUDDI build now :)! --Kurt
>
> Kevin Sutter wrote:
>> Hi Kurt, I have not run a profiler to verify the memory leak, but I think I found the problem... Although you thought you were running the PCEnhancer during build time, it was not being performed. When I looked at the Tomcat log with the original jar file, the following message was logged, which indicates you were falling back to the "not ready for production" subclassing support:
>>
>> 1844 juddiDatabase INFO [http-8080-1] openjpa.Enhance - Creating subclass for [class org.apache.juddi.AuthToken]. This means that your application will be less efficient and will consume more memory than it would if you ran the OpenJPA enhancer. Additionally, lazy loading will not be available for one-to-one and many-to-one persistent attributes in types using field access; they will be loaded eagerly instead.
>>
>> I made a few changes to your pom.xml and the core module to get the enhancement to work, and I believe the memory leak has gone away. Like I said, I did not run a profiler, but I added a finalize method to your AuthToken class and I now see that these entities are getting GC'd. One thing I would do is add the following property to your persistence.xml:
>>
>> <property name="openjpa.RuntimeUnenhancedClasses" value="warn"/>
>>
>> This property will produce the following warning if you accidentally fall into this subclassing support. We turn this off by default when running within WebSphere, since we really don't want customers to accidentally use this subclassing. And we have turned this off for 1.3.x and trunk due to the various problems associated with it. Unfortunately, the 1.2.x branch is kind of stuck with it...
>>
>> 1688 juddiDatabase WARN [http-8080-1] openjpa.Enhance - This configuration disallows runtime optimization, but the following listed types were not enhanced at build time or at class load time with a javaagent: [class org.apache.juddi.AuthToken].
>>
>> And, of course, your webservice no longer works either. You will get an HTTP Status 500 error with an error message and call stack that further explains the error.
>>
>> To get the maven plugin to work properly... I made several changes, so hopefully I remember everything:
>>
>> o The number one thing is that the openjpa:enhance goal doesn't seem to get automatically invoked when you compile. I have no idea why, but that doesn't seem to work.
>> o You have to make your persistence.xml file available to the core module. For my testing, I just created a new directory in the core module called \openjpa-leak\core\src\main\resources\META-INF\persistence.xml. This will automatically get copied over to target, and then the openjpa:enhance goal will find it.
>> o How you want to make your persistence.xml file available to both your core module and the leak-war module is up to you... :-)
>> o I added a dependency in your core\pom.xml file for the 1.2.1 openjpa. The documentation [1] says it will default to 1.2.0 if none is specified:
>>
>> <dependencies>
>>   <dependency>
>>     <groupId>org.apache.openjpa</groupId>
>>     <artifactId>openjpa</artifactId>
>>     <version>1.2.1</version>
>>   </dependency>
>> </dependencies>
>>
>> o Your plugin properties don't seem to be specified correctly, at least according to the documentation, but I didn't touch them and things still seemed to work (once I got around the other problems).
>> o I also changed where to find the entities to enhance, since this pom.xml still specified the model directory. There also seems to be confusion as to whether to specify "classes" or "includes". I went with "includes" since that's what the example showed.
>> o It's very easy to see if the enhancement process worked or not, since the size of your entity classes increases by a few thousand bytes. And, with the extra property above, this will prevent you from accidentally using the unenhanced version.
>>
>> Hope this helps! Let me know if this gets you around the memory leak.
>>
>> Kevin
>>
>> [1] http://mojo.codehaus.org/openjpa-maven-plugin/usage.html
>>
>> On Thu, Jan 14, 2010 at 2:12 PM, Kurt T Stam kurt.s...@gmail.com wrote:
>>> 1. Cool that you got it running :)
>>> 2. I just ran with
Non-unique error
Hello, this question is probably best answered by Michael Dick: does this code expose/represent bug 1365, or is it something new?

public void testNonUnique() throws Exception {
    EntityA a1 = setup.insertA(1);
    EntityB b1 = setup.insertB(1);
    EntityB b2 = setup.insertB(2);
    a1.getChildren().add(b1);
    a1.getChildren().add(b2);
    em.getTransaction().begin();
    em.persist(a1);
    em.getTransaction().commit();
    em.close();
    em = emf.createEntityManager();
    EntityA fresh1 = reload(a1); // OK
    try {
        EntityA fresh2 = reload(a1); // fails
        fail("Expected exception");
    } catch (Exception exc) {
    }
}

private EntityA reload(EntityA original) {
    Query query = em.createQuery("select distinct o from EntityA as o "
        + "left join fetch o.children "
        + "where o.id = :id");
    query.setParameter("id", original.getId());
    return (EntityA) query.getSingleResult();
}

-- Daryl Stultz _ 6 Degrees Software and Consulting, Inc. http://www.6degrees.com mailto:da...@6degrees.com
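Whether or not this is the same bug, the symptom matches a known pattern: a fetch join returns one row per joined child, and if "distinct" is not applied in memory, getSingleResult() sees multiple results. A common application-side workaround is to call getResultList() and collapse duplicates while preserving order. The sketch below shows only that de-duplication step on plain strings so it runs standalone; in the real code the input list would be the JPQL query result.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class DedupeDemo {
    // LinkedHashSet removes duplicates while keeping first-seen order,
    // so the de-duplicated result is stable.
    static <T> List<T> dedupe(List<T> rows) {
        return new ArrayList<>(new LinkedHashSet<>(rows));
    }

    public static void main(String[] args) {
        // one row per fetched child: "a1" appears twice
        List<String> rows = List.of("a1", "a1", "a2");
        System.out.println(dedupe(rows)); // prints [a1, a2]
    }
}
```

Note that this relies on the result objects having sensible equals()/hashCode(); for managed entities loaded in one persistence context, identity-based equality is usually sufficient because duplicates are the same instance.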
Re: collection-valued-path ArgumentException when querying based on multi-valued elements in an XML column
Hi Stella, I did some looking and created a JIRA issue [1] for possible further work. I haven't found any workaround, but there are chances the limitation will be relaxed in future versions of OpenJPA. If you are still interested, what database are you using? Cheers, Milosz

[1] http://issues.apache.org/jira/browse/OPENJPA-1465

> Hello! I will look into this in the following weeks. If I am able to relax the limitation or provide a workaround, I will report back here. Greetings, Milosz

>> Hi Milosz, May I know if there is any workaround for this and if there are plans to address it? Thank you for the reference!

On Tue, Dec 15, 2009 at 5:44 AM, Miłosz wrote:
> Hi, I guess this is an OpenJPA limitation - only queries over single-valued elements are supported. There is a section on XML mapping in the user manual [1]. Regards, Milosz
>
> [1] http://openjpa.apache.org/builds/latest/docs/manual/manual.html#ref_guide_xmlmapping

>> Hi, I am having trouble executing a JPA named query that uses multi-valued elements, within an XML column, as criteria. I have a table with the following DB2 schema:
>>
>> CREATE TABLE SWD.Dummy (
>>     dummyId VARCHAR(32) NOT NULL,
>>     properties XML NOT NULL
>> ) DATA CAPTURE NONE;
>> ALTER TABLE SWD.Dummy ADD CONSTRAINT Dummy_PK PRIMARY KEY (dummyId);
>>
>> and I used JAXB to generate the beans for marshalling/unmarshalling from the XSD schema defined for the properties XML column. My XSD schema is as follows:
>>
>> <?xml version="1.0" encoding="utf-8"?>
>> <xsd:schema version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
>>   <xsd:element name="properties" type="propertiesType" />
>>   <xsd:complexType name="propertiesType">
>>     <xsd:sequence>
>>       <xsd:element maxOccurs="unbounded" name="property" type="propertyType" />
>>     </xsd:sequence>
>>   </xsd:complexType>
>>   <xsd:complexType name="propertyType">
>>     <xsd:sequence>
>>       <xsd:element name="keyy" type="xsd:string" />
>>       <xsd:element name="valuee" type="xsd:string" />
>>     </xsd:sequence>
>>   </xsd:complexType>
>> </xsd:schema>
>>
>> In summary, the element properties is the root element and it contains any number of property elements, which in turn may contain keyy and valuee string elements. An example of an XML document compliant with the schema:
>>
>> <?xml version="1.0" encoding="UTF-16"?><properties><property><keyy>abc</keyy><valuee>123</valuee></property><property><keyy>def</keyy><valuee>xyz</valuee></property></properties>
>>
>> I am able to use JPA to make simple queries on the Dummy entity. However, I get an error if I try to query on the property element. For example, I get an exception if I execute the following JPQL query:
>>
>> SELECT d FROM Dummy d, IN (d.properties.property) p WHERE p.keyy = :keyy AND p.valuee = :valuee
>>
>> The stack trace that I get is:
>>
>> com.example.DAOException: openjpa-2.0.0-M3-r422266:822833 nonfatal user error org.apache.openjpa.persistence.ArgumentException: collection-valued-path
>>     at com.example.DummyDAO.findDummyByProperty(DummyDAO.java:142)
>>     at com.example.GetDummyService.invoke(GetDummyService.java:63)
>>     ... 32 more
>> Caused by: openjpa-2.0.0-M3-r422266:822833 nonfatal user error org.apache.openjpa.persistence.ArgumentException: collection-valued-path
>>     at org.apache.openjpa.kernel.exps.AbstractExpressionBuilder.traverseXPath(AbstractExpressionBuilder.java:269)
>>     at org.apache.openjpa.kernel.jpql.JPQLExpressionBuilder.getPath(JPQLExpressionBuilder.java:1921)
>>     at org.apache.openjpa.kernel.jpql.JPQLExpressionBuilder.addJoin(JPQLExpressionBuilder.java:741)
>>     at org.apache.openjpa.kernel.jpql.JPQLExpressionBuilder.evalFromClause(JPQLExpressionBuilder.java:680)
>>     at org.apache.openjpa.kernel.jpql.JPQLExpressionBuilder.evalFromClause(JPQLExpressionBuilder.java:666)
>>     at org.apache.openjpa.kernel.jpql.JPQLExpressionBuilder.getQueryExpressions(JPQLExpressionBuilder.java:292)
>>     at org.apache.openjpa.kernel.jpql.JPQLParser.eval(JPQLParser.java:67)
>>     at org.apache.openjpa.kernel.ExpressionStoreQuery$DataStoreExecutor.init(ExpressionStoreQuery.java:728)
>>     at org.apache.openjpa.kernel.ExpressionStoreQuery.newDataStoreExecutor(ExpressionStoreQuery.java:170)
>>     at org.apache.openjpa.kernel.QueryImpl.createExecutor(QueryImpl.java:742)
>>     at org.apache.openjpa.kernel.QueryImpl.compileForDataStore(QueryImpl.java:700)
>>     at org.apache.openjpa.kernel.QueryImpl.compileForExecutor(QueryImpl.java:682)
>>     at org.apache.openjpa.kernel.QueryImpl.compile(QueryImpl.java:582)
>>     at com.ibm.ws.persistence.EntityManagerImpl.createNamedQuery(EntityManagerImpl.java:104)
>>     at
RE: ClassCastException in pcReplaceField
It looks like you are trying to add items to a Set. You will get these casting exceptions if your classes do not implement the Comparable interface (for sorted sets). If they do implement the interface, then it is another issue.

-----Original Message-----
From: Rick Curtis [mailto:curti...@gmail.com]
Sent: Thursday, January 14, 2010 5:02 PM
To: users@openjpa.apache.org
Subject: Re: ClassCastException in pcReplaceField

> Udi - You're really going to need to post more information than this for anyone to figure out what is going on. I understand you don't want to post your application call stack; what about posting a filtered call stack? That might be a good starting point. You say you are enhancing your entities at runtime... is that via the -javaagent or are you using the subclassing support? -- Thanks, Rick

On Thu, Jan 14, 2010 at 6:23 AM, Udi saba...@gmail.com wrote:
>> Hey, first of all, I'm sorry for not pasting the stack - I just can't. I'm using OpenJPA 1.2.0. I have 2 classes:
>>
>> class A {
>>     @OneToMany(mappedBy="a")
>>     Set<B> bs;
>> }
>>
>> class B {
>>     @ManyToOne
>>     A a;
>> }
>>
>> Now, at some point (I can't create a consistent testcase...) I get ClassCastException: B cannot be cast to A. The stack (typed, so sorry once again...):
>>
>> B.pcReplaceField()
>> StateManagerImpl.replaceField()
>> StateManagerImpl.storeObjectField()
>> StateManagerImpl.storeObject()
>> RelationFieldStrategy.load()
>> FieldMapping.load()
>> JDBCStoreManager.load()
>>
>> I'm enhancing entities at runtime, so I used a decompiler to see the generated class, and in the problematic line it appears like a cast for a field. Anyone know how this can happen (and even more interesting - how to solve the problem? :) ) Thanks, Udi
>>
>> -- View this message in context: http://n2.nabble.com/ClassCastException-in-pcReplaceField-tp4392185p4392185.html Sent from the OpenJPA Users mailing list archive at Nabble.com.
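One clarification on the Set point above: for a plain HashSet it is equals() and hashCode() that determine membership, while Comparable only matters for sorted sets like TreeSet. A minimal sketch of id-based equality for an entity held in a HashSet follows; the class and field names are illustrative, not from the original code.

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

public class EntityEqualityDemo {
    static class B {
        final Long id;
        B(Long id) { this.id = id; }

        @Override public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof B)) return false; // also rejects other entity types
            return id != null && id.equals(((B) o).id);
        }

        @Override public int hashCode() {
            return Objects.hashCode(id); // null-safe, stable for the same id
        }
    }

    public static void main(String[] args) {
        Set<B> bs = new HashSet<>();
        bs.add(new B(1L));
        bs.add(new B(1L)); // same id, treated as the same element
        bs.add(new B(2L));
        System.out.println(bs.size()); // prints 2
    }
}
```

Note this addresses Set membership semantics only; the ClassCastException inside the enhanced pcReplaceField is a separate question about the runtime enhancement path, as Rick's reply points out.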