Re: Contribution classloading pluggability: was: Re: Classloading code in core contribution processing

2008-02-25 Thread Jean-Sebastien Delfino

Raymond Feng wrote:

Hi,

I don't want to interrupt the discussion, but I'm wondering if we should 
define the pluggability of the classloading scheme for SCA contributions.


Typically we have the following information for a ready-to-deploy unit:

* The URL of the deployment composite (deployable composite)
* A collection of URLs for the required contributions to support the SCA 
composite


There are some class relationships defined using import.java and 
export.java. In different environments, we may need different 
classloaders to deal with Java classes in the collection of 
contributions. Should we define an SPI as follows to provide the 
pluggability?


public interface ClassLoaderProvider {
   // Start the classloader provider for a collection of contributions
   // (the deployment unit)
   void start(List<Contribution> contributions);

   // Get the classloader for a given contribution in the deployment unit
   ClassLoader getClassLoader(Contribution contribution);

   // Remove the contributions from the provider
   void stop(List<Contribution> contributions);
}
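For illustration, here is a minimal sketch of how such a provider could be implemented. The `Contribution` class below is a hypothetical stand-in for the real contribution model (just a location), and the one-URLClassLoader-per-contribution strategy is only one possible scheme; a real provider would wire the loaders together according to the import.java/export.java metadata:

```java
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the contribution model: just a location.
class Contribution {
    private final URI location;
    Contribution(URI location) { this.location = location; }
    URI getLocation() { return location; }
}

interface ClassLoaderProvider {
    void start(List<Contribution> contributions);
    ClassLoader getClassLoader(Contribution contribution);
    void stop(List<Contribution> contributions);
}

// Simplest possible scheme: one URLClassLoader per contribution, all
// delegating to a common parent. Import/export metadata is ignored here.
class DefaultClassLoaderProvider implements ClassLoaderProvider {
    private final Map<Contribution, ClassLoader> loaders = new HashMap<>();

    public void start(List<Contribution> contributions) {
        for (Contribution c : contributions) {
            try {
                loaders.put(c, new URLClassLoader(
                        new URL[] { c.getLocation().toURL() },
                        getClass().getClassLoader()));
            } catch (MalformedURLException e) {
                throw new IllegalArgumentException(e);
            }
        }
    }

    public ClassLoader getClassLoader(Contribution contribution) {
        return loaders.get(contribution);
    }

    public void stop(List<Contribution> contributions) {
        for (Contribution c : contributions) {
            loaders.remove(c);
        }
    }
}
```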

Thanks,
Raymond



This is an interesting proposal but I think it's orthogonal to the 
discussion we've been having on contribution import cycles and support 
for partial packages.


Import cycles and partial namespaces are not specific to Java and can 
also occur with WSDL/XSD. I think we should handle them in a Java (and 
ClassLoader) independent way.

--
Jean-Sebastien

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Contribution classloading pluggability: was: Re: Classloading code in core contribution processing

2008-02-25 Thread Raymond Feng


- Original Message - 
From: Jean-Sebastien Delfino [EMAIL PROTECTED]

To: tuscany-dev@ws.apache.org
Sent: Monday, February 25, 2008 8:23 AM
Subject: Re: Contribution classloading pluggability: was: Re: Classloading 
code in core contribution processing




Raymond Feng wrote:

Hi,

I don't want to interrupt the discussion, but I'm wondering if we should 
define the pluggability of the classloading scheme for SCA contributions.


Typically we have the following information for a ready-to-deploy unit:

* The URL of the deployment composite (deployable composite)
* A collection of URLs for the required contributions to support the SCA 
composite


There are some class relationships defined using import.java and 
export.java. In different environments, we may need different 
classloaders to deal with Java classes in the collection of 
contributions. Should we define an SPI as follows to provide the 
pluggability?


public interface ClassLoaderProvider {
   // Start the classloader provider for a collection of contributions
   // (the deployment unit)
   void start(List<Contribution> contributions);

   // Get the classloader for a given contribution in the deployment unit
   ClassLoader getClassLoader(Contribution contribution);

   // Remove the contributions from the provider
   void stop(List<Contribution> contributions);
}

Thanks,
Raymond



This is an interesting proposal but I think it's orthogonal to the 
discussion we've been having on contribution import cycles and support for 
partial packages.


My proposal is for the Java classloading strategy over related 
contributions. That's why I started it in a different thread. The general 
discussion on import/export should stay independent of Java.




Import cycles and partial namespaces are not specific to Java and can 
also occur with WSDL/XSD. I think we should handle them in a Java (and 
ClassLoader) independent way.


+1. My understanding is that the contribution service will figure out the 
import/export for various artifacts across contributions in a general way. 
With that metadata in place, the Java classloader provider can be plugged 
in to implement a classloading scheme which honors the import/export 
statements.
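As a sketch of what such an import-honoring scheme might look like (names and structure hypothetical, not the actual Tuscany implementation): each contribution gets a classloader that searches its own archive first, then falls back to the classloaders of the contributions it imports from.

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.util.List;

// Hypothetical contribution classloader: resolves classes from the
// contribution's own archive first, then from the loaders of the
// contributions it imports from (i.e. honoring import.java/export.java).
class ContributionClassLoader extends URLClassLoader {
    private final List<ClassLoader> imports;

    ContributionClassLoader(URL[] urls, List<ClassLoader> imports,
                            ClassLoader parent) {
        super(urls, parent);
        this.imports = imports;
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        try {
            // Classes packaged in this contribution.
            return super.findClass(name);
        } catch (ClassNotFoundException e) {
            // Fall back to the contributions this one imports from.
            for (ClassLoader imported : imports) {
                try {
                    return imported.loadClass(name);
                } catch (ClassNotFoundException ignored) {
                    // Not exported by that contribution; try the next one.
                }
            }
            throw e;
        }
    }
}
```

Note that naive mutual imports between two such loaders could recurse, which is exactly the import-cycle problem discussed in this thread; a real scheme would need cycle detection.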



--
Jean-Sebastien




Contribution classloading pluggability: was: Re: Classloading code in core contribution processing

2008-02-22 Thread Raymond Feng

Hi,

I don't want to interrupt the discussion, but I'm wondering if we should 
define the pluggability of the classloading scheme for SCA contributions.


Typically we have the following information for a ready-to-deploy unit:

* The URL of the deployment composite (deployable composite)
* A collection of URLs for the required contributions to support the SCA 
composite


There are some class relationships defined using import.java and 
export.java. In different environments, we may need different 
classloaders to deal with Java classes in the collection of contributions. 
Should we define an SPI as follows to provide the pluggability?


public interface ClassLoaderProvider {
   // Start the classloader provider for a collection of contributions
   // (the deployment unit)
   void start(List<Contribution> contributions);

   // Get the classloader for a given contribution in the deployment unit
   ClassLoader getClassLoader(Contribution contribution);

   // Remove the contributions from the provider
   void stop(List<Contribution> contributions);
}

Thanks,
Raymond

- Original Message - 
From: Rajini Sivaram [EMAIL PROTECTED]

To: tuscany-dev@ws.apache.org
Sent: Friday, February 22, 2008 12:38 PM
Subject: Re: Classloading code in core contribution processing



Sebastien,

On 2/22/08, Jean-Sebastien Delfino [EMAIL PROTECTED] wrote:


Cut some sections and reordered for readability.
...
 Jean-Sebastien Delfino wrote:
 - we can use ModelResolvers (1) or bypass them (2)
 - ModelResolvers don't handle import cycles and partial packages

 I think that (1) is better. Do you have a use case for cycles and
 partial packages right now or can it be fixed later?
...
Rajini Sivaram wrote:
ContributionTestCase in itest/contribution-classloader contains a test 
which runs into a stack overflow if classes are resolved using 
ModelResolver. I have added another test in there for testing for 
ClassNotFoundException in the same scenario. To trigger the failure, you 
need to modify ContributionClassLoader.findClass to use the model resolver 
of exporting contributions to resolve classes instead of their classloader.

Great to see a *test* case for cycles, but my question was: Do you have
a *use* case for cycles and partial packages right now or can it be
fixed later?



No, I don't have a use case, at least not an SCA one. But there are plenty
of them in OSGi - e.g. Tuscany modules cannot run in OSGi without support
for split-packages. Of course you can fix it later. But IMHO, breaking
classloading to improve modularity is hardly worthwhile (all the
classloading related implementation code is now contained in
contribution-java, so the improvement will be very marginal). Classloading
errors tend to be hard to fix because classloading is often triggered by
the VM and not explicitly by the application. If potential stack overflows
are introduced into classloading, it won't be long before someone else
complains.

All this complexity related to classloading makes my head spin. And
chances are we will be back to a single CLASSPATH based classloader. That
is just my opinion.




--
Jean-Sebastien


Thank you...


Regards,

Rajini



