Good morning Mark,

First of all, thanks for the swift and detailed answer! Since this list has been quiet for a while, I wasn't sure anyone was even reading it ^^. I also read the article you suggested; while most of it wasn't new to me, it was a really nice way to brush up on the foundational concepts again, thanks!

So my concrete scenario, as I understand it, is the following (if I'm making any incorrect assumptions, please do correct me): We have an @ApplicationScoped/@Singleton bean which has @Dependent bean members that are accessed via Instances/Providers. Every time the @Singleton accesses its @Dependent member, the Instance creates a new object (as specified by @Dependent) and remembers that object in its CreationalContext. Since the owner of this Instance is very long-lived (i.e. the lifetime of the application), it will never be destroyed by OWB, and therefore neither will its member Instances, including our @Dependent one with all the objects it still references. Thus we have a memory leak (unless we clean up the objects manually, which we do not). As I understand the spec, this is exactly how OWB (or any compliant implementation) should behave; we are simply using it wrong. But since we have been doing so for decades, putting it right is no easy feat.
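To make sure we're talking about the same mechanics, here is a minimal, container-free sketch of what I think happens. All class names are made-up stand-ins, not OWB internals:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the CreationalContext of the long-lived @Singleton owner.
// Every @Dependent instance created through the owner's Instance<T>
// is remembered here so @PreDestroy could run when the owner is destroyed.
class OwnerCreationalContext {
    final List<Object> dependentInstances = new ArrayList<>();
}

// Stand-in for the Instance<T> injected into the @Singleton.
class DependentProducer {
    private final OwnerCreationalContext ownerCtx;

    DependentProducer(OwnerCreationalContext ownerCtx) {
        this.ownerCtx = ownerCtx;
    }

    Object get() {
        Object payload = new Object();            // fresh instance per get(), as @Dependent specifies
        ownerCtx.dependentInstances.add(payload); // remembered for the owner's lifetime
        return payload;
    }
}
```

Since the owner lives as long as the application, its context is never released, so the list only ever grows: the leak.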

You suggested that most often the @Dependent beans should actually be @ApplicationScoped. I believe that scope was avoided in my application because it would have required making the implementations thread-safe. Whether that was a good decision is a different question.

Our current solution intercepts every @Singleton/@ApplicationScoped Provider/Instance injection and replaces (via reflection) the CreationalContext inside the Instance implementation with a dummy map that does not remember any items put into it. This ensures that no references to our objects can linger there, but it also breaks @PreDestroy hooks (and possibly other features or internal assumptions of OWB). We don't really need @PreDestroy there, so this solution works for us, but the fact that we meddle in the internals of a third-party library via reflection is just ... not how I like to do things.
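For illustration, the "dummy map" part of the workaround boils down to something like the class below. (Which internal OWB field it actually gets swapped into is version-specific and deliberately omitted here; this is only the shape of the replacement.)

```java
import java.util.AbstractMap;
import java.util.Collections;
import java.util.Set;

// A Map that silently discards everything put into it, so no references
// to created @Dependent instances can accumulate. The obvious trade-off:
// anything relying on the stored entries (e.g. @PreDestroy) stops working.
class ForgetfulMap<K, V> extends AbstractMap<K, V> {

    @Override
    public V put(K key, V value) {
        return null; // pretend to store, actually discard
    }

    @Override
    public Set<Entry<K, V>> entrySet() {
        return Collections.emptySet(); // nothing is ever stored
    }
}
```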

I'm not really sure what my ideal solution would look like, though. Initially I thought I would like the injected object to be an "instance singleton" (keeping with your terminology in the article), i.e. unique to its injection point. I realize now that this would create problems if the beans are not thread-safe (which is very likely the case) and we're injecting into a @Singleton. Another idea was to re-implement the current behavior using plain Jakarta APIs, i.e. a custom scope that does not remember created instances. This should mostly work; the only obstacle I see is that we still have some modules with `bean-discovery-mode="all"` in their beans.xml, whose beans would then still use the standard @Dependent scope (but this is probably fixable with an acceptable amount of work). A variant of that option would be to create a new object per thread instead, ensuring thread safety while limiting the number of new objects created. But especially with the advent of virtual threads, I don't think it's a good idea to build anything new on top of ThreadLocal.


I hope my explanation of the situation was clear, and I'm looking forward to any questions or suggestions you may have.


Best,

Christian


On 10/23/25 07:53, Mark Struberg via user wrote:

Good morning Christian!

Dependent scoped beans get destroyed whenever either the cleanup gets invoked manually or (that's the most important use case) when the CreationalContext they are contained in gets destroyed. This is usually the CreationalContext of the NormalScoped bean they got injected into.

Often people make Dependent scoped beans without thinking about what lifecycle the instances should have. Many times it is plain wrong and they should rather have used @ApplicationScoped instead. I don't know how much contact you have already had with the CDI spec, so please excuse me if I explain things you already know. But my experience is that people often use advanced technology (CDI, JPA, etc.) without ever learning the underlying mechanics. Maybe it helps to go back and read that very old (but mostly still valid) article I wrote together with my fellow CDI spec author Pete Muir:
https://entwickler.de/java/tutorial-introduction-to-cdi-contexts-and-dependency-injection-for-java-ee-jsr-299-104536

In short: if you do NOT use a @Dependent scoped bean in a NormalScoped bean, EJB, Servlet Filter or any other EE instance which is defined to support CDI, then we also do not store its CreationalContext. Thus there should also be no mem leak.

I'm interested to learn what your scenario looks like and where the references pile up. Maybe there was some 'workaround' implemented which is really not needed at all if CDI is used properly?

txs and LieGrue,
strub




On 22.10.2025 at 09:13, Christian Ortlepp <[email protected]> wrote:

Hey,

I have a question about the implementation of @Dependent bean destruction in OpenWebBeans. As per section 2.5.4.2 of the CDI spec (https://jakarta.ee/specifications/cdi/4.0/jakarta-cdi-spec-4.0.pdf, page 77): "Finally, the container is permitted to destroy any @Dependent scoped contextual instance at any time if the instance is no longer referenced by the application (excluding weak, soft and phantom references)." Is this implemented in OpenWebBeans, and if so, is it enabled by default or do I have to configure it?


My motivation for having something like this is the following: In the application I am working on, @Dependent beans are pretty widely used (I'm guessing because by using them one didn't have to think about lifetimes/thread safety so much). The developers who did this, however, usually did not remember to call `instance.destroy(bean)` after they were done using that object. This led to memory leaks (because the OpenWebBeans Instance kept references to those objects, as it should if I understood the spec correctly), and my application has some pretty horrible workarounds in place to make those memory leaks go away. I would like to get rid of those workarounds, or at least use a less horrible one. The workarounds we have in place are also very old and were originally written for (I believe) OWB 1.x. It may well be that the current version (which we are using) no longer needs (some of) the workarounds; my goal is to get a better understanding of what should be happening.
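For context, the cleanup pattern the developers skipped is pairing every get() with a destroy(). The sketch below uses a stand-in for jakarta.enterprise.inject.Instance (the real type offers get() and destroy(T) with the same roles) so it runs without a container:

```java
import java.util.function.Consumer;
import java.util.function.Supplier;

// Stand-in for jakarta.enterprise.inject.Instance<T>.
class FakeInstance<T> {
    private final Supplier<T> creator;
    private final Consumer<T> destroyer;

    FakeInstance(Supplier<T> creator, Consumer<T> destroyer) {
        this.creator = creator;
        this.destroyer = destroyer;
    }

    T get() { return creator.get(); }

    void destroy(T bean) { destroyer.accept(bean); }
}

// The missing pattern: always destroy what you get, even on exceptions,
// so the container can release its internally held reference.
class Caller {
    static <T> void useOnce(FakeInstance<T> instance, Consumer<T> work) {
        T bean = instance.get();
        try {
            work.accept(bean);
        } finally {
            instance.destroy(bean); // releases the container-held reference
        }
    }
}
```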


Practically, what I'm looking for is a way to declare a bean such that if I get() it via an Instance, that Instance will immediately forget about the bean after handing it to me, so the bean can be garbage collected once I'm done using it. If anybody has an idea how to achieve this, either with built-in means or with extensions, I would appreciate any ideas or further resources.


Best,

Christian
