Sylvain Wallez wrote:

Geoff Howard wrote:

Upayavira wrote:


<snip/>

So you won't be able to include and share arbitrary components between blocks? i.e. components that have arbitrary interfaces? They'll all have to follow interfaces that already exist within Cocoon? Or am I missing something?



No, you can include and expose arbitrary components. The interfaces he's referring to are the abstracted API interfaces of a component, independent of its implementation, and they are entirely up to the author of the component.


So, if you are the author of block A you create

public interface MyInterface {
  public void doSomethingInteresting();
}

public class MyComponent implements MyInterface {
...
}

and only MyInterface.class is needed to _compile_ block B, not the whole cob file or even MyComponent.class

So, until the interface changes, there is no need to update dependencies in CVS.
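To illustrate the compile-time split (block and class names here are hypothetical, not Cocoon APIs), block B can be written and compiled against nothing but the interface published by block A:

```java
// From block A: the published contract. MyInterface.class is the only
// artifact block B needs on its compile classpath.
interface MyInterface {
    void doSomethingInteresting();
}

// Block B: depends only on MyInterface; MyComponent.class (the
// implementation in block A) never appears at compile time.
public class BlockBClient {
    private final MyInterface service;

    public BlockBClient(MyInterface service) {
        this.service = service;
    }

    public void run() {
        // Dispatches through the interface; the concrete implementation
        // is supplied by the container at runtime.
        service.doSomethingInteresting();
    }
}
```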

Now, the big question outstanding in my mind is what happens if you have:

public interface InterestingInterface {
  public BlockSpecificDataStructure doSomethingElse();
}

We've said that direct classloading from the block is prohibited, but here you need the definition of BlockSpecificDataStructure. So how do you permit one and deny the other? And does this get you into odd ClassCastExceptions, where the same class loaded by different classloaders is not recognized as identical?
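That ClassCastException worry is well founded: the JVM keys a class on (binary name, defining classloader), so identical bytecode defined by two loaders yields two incompatible types. A minimal self-contained sketch (class names are made up for the demo):

```java
import java.io.InputStream;

public class ClassCastDemo {
    public static class Payload {}

    // A loader that redefines Payload from its own .class bytes instead
    // of delegating to the parent, simulating a second block classloader.
    static class IsolatingLoader extends ClassLoader {
        IsolatingLoader() {
            super(ClassCastDemo.class.getClassLoader());
        }

        @Override
        protected Class<?> loadClass(String name, boolean resolve)
                throws ClassNotFoundException {
            if (name.equals("ClassCastDemo$Payload")) {
                try (InputStream in = getParent()
                        .getResourceAsStream("ClassCastDemo$Payload.class")) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (Exception e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            // Everything else (Object, ClassCastDemo, ...) is delegated.
            return super.loadClass(name, resolve);
        }
    }

    // True iff an instance created through the second loader is assignable
    // to Payload as seen by the first loader.
    public static boolean sameTypeAcrossLoaders() throws Exception {
        Object o = new IsolatingLoader()
                .loadClass("ClassCastDemo$Payload")
                .getDeclaredConstructor()
                .newInstance();
        return o instanceof Payload;
    }

    public static void main(String[] args) throws Exception {
        // Same bytecode, different defining classloader: distinct types.
        System.out.println(sameTypeAcrossLoaders()); // false
    }
}
```

This is exactly why shared types such as BlockSpecificDataStructure must come from a single exporting classloader rather than being loaded independently by each block.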



We have here the exact same problem as Eclipse plugins. If you look at a plugin.xml file, you will see runtime/library/export statements listing the classes (in the generic sense, i.e. including interfaces) that are visible from outside the plugin. The classloader for a plugin (or a Cocoon block) will then "see" the classes exported by the plugins/blocks the current one depends on. Thus, a plugin/block can export not only behavioural interfaces, but also classes that are part of the definition of those behavioural interfaces.
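For reference, an Eclipse-style plugin.xml export declaration looks roughly like this (the plugin id, jar name, and package are invented for the sketch):

```xml
<plugin id="com.example.myblock" name="My Block" version="1.0.0">
  <runtime>
    <!-- Only the names matched by <export> are visible to dependent plugins -->
    <library name="myblock.jar">
      <export name="com.example.myblock.api.*"/>
    </library>
  </runtime>
  <requires>
    <import plugin="org.eclipse.core.runtime"/>
  </requires>
</plugin>
```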


Just a note concerning the handling of this problem - in the Merlin container a block declares the services that it exports. Based on this information the container (or a plugin) can verify that the context the block's services are being included under contains the classes required by the imported block. At the end of the day you have to make sure the APIs exposed by the imported block are included in the classloader of the importing block.

There is an on-line tutorial dealing with this sort of thing at the following URL:
http://avalon.apache.org/merlin/starting/advanced/composite.html


Now going back to the build system discussion, setting up a compiler classpath with this information is difficult or even impossible, because a plugin/block may export only parts of a given jar file, and the other parts should be kept hidden. I guess this is why the Eclipse compiler loads source file dependencies using a regular classloader and not a filesystem-based classpath.


This can be handled by splitting the API and implementation artifacts into separate jar files.


Several solutions come to mind to solve the problem:
- write our own <javac> task based on the Eclipse compiler (we already use it to compile XSPs)
- impose a naming scheme for block jars to easily isolate public classes from private ones
- have the librarian provide a service that extracts the public classes of a block, so that we can reconstitute a local lib directory used to compile a block.


Why not just separate your API and implementation jar files? Then you have the ability to correctly reference structural dependencies.

Steve.


I prefer this last solution, as it's the one that has the least impact on the tools that can be used to build a block, and it uses the download features provided by the block librarian.


What do you think ?

Sylvain


--


Stephen J. McConnell
mailto:[EMAIL PROTECTED]




