> From: Leo Sutic [mailto:[EMAIL PROTECTED]]
>
> > From: Berin Loritsch [mailto:[EMAIL PROTECTED]]
> >
> > > From: Leo Sutic [mailto:[EMAIL PROTECTED]]
> > >
> > > > From: Berin Loritsch [mailto:[EMAIL PROTECTED]]
> > > >
> > > > > From: Leo Sutic [mailto:[EMAIL PROTECTED]]
> > > > >
> > Hmm.  What we have here is a bad design.  Should the
> > container make up for your poor design?  Can you really get
> > away with only three instances in a pool in a web
> > environment?  Honestly, there is nothing in Cocoon that does
> > what you are talking about.
>
> Yes, it is bad design.  It is more than that - it is horrible design.
>
> But admit one thing: it was not difficult to see that it was
> utterly atrocious!
>
> I meant to use it to show a situation that may occur in a
> system.  It will not be as clearly awful as the abomination I
> showed you, but *hidden* in the code.  I can show you more
> poor architecture:
>
> public void tailRecursive (int i) {
>     if (i < 100) {
>         mycomp = lookup (...);
>         tailRecursive (i + 1);
>     }
> }
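The fix for that pattern is to hoist the lookup out of the recursion so the component is looked up once and released once.  Here is a minimal, self-contained sketch of the difference; the Manager/Component types are simplified stand-ins for Avalon's ComponentManager and pooled components, not the real API:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class LookupHoisting {

    public interface Component { void work(int i); }

    // Stand-in for a pooled ComponentManager: it counts lookups so we can
    // see how many pooled instances the recursion actually demands.
    public static class Manager {
        public final AtomicInteger lookups = new AtomicInteger();
        public Component lookup() { lookups.incrementAndGet(); return i -> { }; }
        public void release(Component c) { }
    }

    // Bad: one lookup per recursion step -- by i == 99, one hundred pooled
    // instances are checked out at the same time.
    public static void tailRecursiveBad(Manager m, int i) {
        if (i < 100) {
            Component c = m.lookup();
            c.work(i);
            tailRecursiveBad(m, i + 1);
            m.release(c);
        }
    }

    // Better: look the component up once and thread it through the recursion.
    public static void tailRecursiveGood(Manager m, int i) {
        Component c = m.lookup();
        try {
            recurse(c, i);
        } finally {
            m.release(c);
        }
    }

    private static void recurse(Component c, int i) {
        if (i < 100) {
            c.work(i);
            recurse(c, i + 1);
        }
    }

    public static void main(String[] args) {
        Manager bad = new Manager();
        tailRecursiveBad(bad, 0);
        Manager good = new Manager();
        tailRecursiveGood(good, 0);
        System.out.println(bad.lookups.get() + " lookups vs " + good.lookups.get());
        // prints: 100 lookups vs 1
    }
}
```

The "good" version needs one pooled instance no matter how deep the recursion goes, which is exactly why the hidden version of this bug is so expensive.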
In order for a component based system to gain all the benefits of being
component based, we have to consider the "story" of the system.  After
all, that is where we develop our roles: from the story that we are
trying to tell.  The point is that you should know ahead of time what
the correct story is.  You should know how, in a certain context, the
components relate to each other.  That is what design is all about.

As much as possible, you should strive to make your components
threadsafe.  That way you just don't have to worry about certain
things.  Systemic problems such as your recursive and iterative
lookups should not be allowed by design.  But (in the immortal words
of Smokey the Bear) "Only you can prevent forest fires."

Let's take, for example, the Sitemap.  We have the ability to mount
child sitemaps.  Each child sitemap has a set of components it can
access.  If all the components specified in the parent sitemap are
sufficient, then the child sitemap uses the parent's versions.

Let's say we have our request answered by the seventh-deep sitemap.
So far, all we have done is use Matchers and Selectors, and possibly
some Actions, which have traditionally been forced to be ThreadSafe.
Theoretically, if all the Matchers and Selectors are threadsafe, then
they only need to be requested once.  I.e. they are resolved at
sitemap initialization and held indefinitely.  I personally think that
a Matcher/Selector is too low a granularity to use for a component.

Nevertheless, we finally arrive at our entry in the seventh-deep
sitemap.  This happens to be an aggregation of two sources, each
having some transformation done to it, after which we transform the
combined result and serialize it.  So our pipeline looks like this:

G->T->T
       \
        *->T->S
       /
G->T->T

In essence we have three pipelines, with one being the merging of the
other two.  We can either view our Aggregator as a special generator,
or we can view it as a special transformer.
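The "resolved at sitemap initialization and held indefinitely" idea can be sketched as a small registry that caches ThreadSafe-marked components once and hands out fresh instances for everything else.  This is illustrative only: ThreadSafe here is a local marker interface standing in for Avalon's, and WildcardMatcher/FileGenerator are hypothetical components, not Cocoon classes:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class SitemapRegistry {

    public interface ThreadSafe { }  // marker, in the spirit of Avalon's ThreadSafe

    public static class WildcardMatcher implements ThreadSafe { }  // hypothetical matcher
    public static class FileGenerator { }                          // hypothetical pooled component

    private final Map<String, Supplier<Object>> factories = new HashMap<>();
    private final Map<String, Object> singletons = new HashMap<>();

    public void register(String role, Supplier<Object> factory) {
        factories.put(role, factory);
    }

    // Called once at sitemap initialization: instantiate a probe of each
    // component and hold the ThreadSafe ones indefinitely.
    public void initialize() {
        for (Map.Entry<String, Supplier<Object>> e : factories.entrySet()) {
            Object probe = e.getValue().get();
            if (probe instanceof ThreadSafe) {
                singletons.put(e.getKey(), probe);
            }
        }
    }

    // Per-request lookup: ThreadSafe components come from the cache;
    // everything else is created (in real Avalon, pooled) on demand.
    public Object lookup(String role) {
        Object cached = singletons.get(role);
        return cached != null ? cached : factories.get(role).get();
    }

    public static void main(String[] args) {
        SitemapRegistry reg = new SitemapRegistry();
        reg.register("matcher", WildcardMatcher::new);
        reg.register("generator", FileGenerator::new);
        reg.initialize();
        // ThreadSafe matcher: the same instance every time.
        System.out.println(reg.lookup("matcher") == reg.lookup("matcher"));
        // Non-ThreadSafe generator: a fresh instance per lookup.
        System.out.println(reg.lookup("generator") == reg.lookup("generator"));
    }
}
```

With this shape, the per-request critical path never pays a lookup cost for the threadsafe Matchers and Selectors at all.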
It is closer to a transformer that accepts multiple inputs than
anything else.  Currently there is no notion of an Aggregator
COMPONENT; it is merely a function of the sitemap (why didn't we do
that with Selectors/Matchers?).

So far we have a total of two generators, five transformers, and one
serializer.  Since a request is handled linearly, we can potentially
optimize it so that we are only using one generator and three
transformers at any one time.  A smart aggregator would handle events
from only one input pipeline, kill it, and then process events from
the next pipeline.

In this case, if we had a recursion in the pipeline, we would likely
see it as an infinite loop.  Such a thing is *really bad*, and the
developer will detect it almost immediately.  If a developer wants a
self-recurring pipeline that handles different pieces with
matcher/selector logic, we have effectively subverted our core design.
If you need that type of dynamic logic control, then you need a
different tool.  Nor is there any reason to repeatedly apply the same
transformer to a pipeline (unless you are a glutton for punishment).
So I am having a hard time seeing where, in real life, your systemic
problems would occur.

Using the pipeline model, no generator, transformer, or serializer has
any real business directly interacting with another.  Those are
functions of the container--the sitemap.  Therefore, the sitemap
should exert the proper control over its components.  The pipeline
should work with XMLSource and ContentHandlers directly--relieving the
critical path of repeated lookups.

Also keep in mind that with GC based systems, we have to accept that
we may be using more memory than we would if we managed memory
manually.
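To make the event-driven wiring concrete, here is a self-contained sketch of a pipeline assembled as a chain of handlers, with an aggregator-as-transformer that drains one upstream pipeline completely before starting the next.  The Handler interface is a cut-down stand-in for org.xml.sax.ContentHandler, and the components are toys, not Cocoon's:

```java
import java.util.function.Consumer;

public class PipelineSketch {

    // Cut-down stand-in for org.xml.sax.ContentHandler.
    public interface Handler {
        void startElement(String name);
        void characters(String text);
        void endElement(String name);
    }

    // G: a trivial generator pushing events for one named source.
    public static void generateFrom(String name, Handler out) {
        out.startElement(name);
        out.characters(name);
        out.endElement(name);
    }

    // T: forwards events downstream, upper-casing character content.
    public static Handler upperCaseTransformer(Handler next) {
        return new Handler() {
            public void startElement(String n) { next.startElement(n); }
            public void characters(String t) { next.characters(t.toUpperCase()); }
            public void endElement(String n) { next.endElement(n); }
        };
    }

    // S: serializes events to markup in a StringBuilder.
    public static Handler serializer(StringBuilder sink) {
        return new Handler() {
            public void startElement(String n) { sink.append('<').append(n).append('>'); }
            public void characters(String t) { sink.append(t); }
            public void endElement(String n) { sink.append("</").append(n).append('>'); }
        };
    }

    // Aggregator-as-transformer: drains pipeline one completely, then
    // pipeline two, so only one upstream pipeline is live at a time.
    public static void aggregate(Handler next, Consumer<Handler> first, Consumer<Handler> second) {
        next.startElement("aggregate");
        first.accept(next);
        second.accept(next);
        next.endElement("aggregate");
    }

    public static void main(String[] args) {
        StringBuilder out = new StringBuilder();
        Handler tail = upperCaseTransformer(serializer(out));  // the shared ->T->S tail
        aggregate(tail,
                  h -> generateFrom("a", h),
                  h -> generateFrom("b", h));
        System.out.println(out);  // <aggregate><a>A</a><b>B</b></aggregate>
    }
}
```

Note that the components only ever see the next handler in the chain; the "sitemap" (here, main) does all the wiring, which is exactly the control relationship argued for above.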