I am working on enhancing the TDML Runner to enable cross-testing against both 
Daffodil and IBM DFDL, or really any DFDL implementation.


So I need to build a dynamically loaded module that provides a 
TDML-Runner-friendly API, but calls the IBM DFDL implementation underneath.
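
To make that concrete, here is a rough sketch (not actual Daffodil code) of how 
the runner could pick up an implementation at runtime from a class name, say 
via a system property, so it never needs a compile-time dependency on the 
IBM-backed module. The helper object, property name, and default class name are 
all made up for illustration; the factory trait is the one I describe in the 
next paragraph.

    import org.apache.daffodil.tdml.processor.TDMLDFDLProcessorFactory

    // Hypothetical helper, not part of Daffodil: resolve the factory class by
    // name at runtime, so the IBM-backed jar only needs to be on the classpath
    // of people actually doing cross-testing.
    object CrossTestLoader {
      def loadFactory(): TDMLDFDLProcessorFactory = {
        val className = System.getProperty(
          "tdml.dfdl.processor.factory", // property name is a guess
          "org.apache.daffodil.tdml.processor.DaffodilTDMLDFDLProcessorFactory") // hypothetical default
        Class.forName(className)
          .getDeclaredConstructor()
          .newInstance()
          .asInstanceOf[TDMLDFDLProcessorFactory]
      }
    }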


The module will depend on Daffodil and implement a new trait that defines the 
API by which the TDML Runner drives a DFDL implementation. That trait is 
tentatively called 
org.apache.daffodil.tdml.processor.TDMLDFDLProcessorFactory
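
To give a feel for what I mean, something along these lines. The method names, 
signatures, and the companion processor trait below are illustrative guesses on 
my part, not a proposal for the final API.

    package org.apache.daffodil.tdml.processor

    import scala.xml.Node

    // Illustrative sketch only: the API by which the TDML Runner drives
    // any DFDL implementation (Daffodil, IBM DFDL, ...).
    trait TDMLDFDLProcessorFactory {
      // Compile a DFDL schema, returning diagnostics on failure or a
      // processor the runner can drive for parse/unparse test cases.
      def getProcessor(schema: Node): Either[Seq[String], TDMLDFDLProcessor]
    }

    // Illustrative companion trait for the compiled processor itself.
    // A real API would also need to surface diagnostics, validation
    // results, etc.
    trait TDMLDFDLProcessor {
      def parse(input: java.io.InputStream, lengthLimitInBits: Long): Node
      def unparse(infoset: Node, output: java.io.OutputStream): Unit
    }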


The dynamically loaded module will also depend on the IBM DFDL libraries, and 
will be compiled against them.
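
As a sketch of what the build for that separate module might look like: all 
artifact names, versions, and jar locations below are placeholders, and since 
IBM DFDL isn't in any public Maven repository, its jars would come in as 
unmanaged, locally-installed dependencies.

    // build.sbt for the separate cross-test module -- a sketch only
    name := "daffodil-tdml-ibm-dfdl"   // hypothetical module name
    organization := "org.openDFDL"     // hypothetical
    licenses := Seq("Apache-2.0" -> url("https://www.apache.org/licenses/LICENSE-2.0"))

    libraryDependencies ++= Seq(
      "org.apache.daffodil" %% "daffodil-tdml-lib" % "x.y.z" // placeholder artifact/version
    )

    // IBM DFDL only exists as locally-installed jars, so pull it in as
    // unmanaged dependencies rather than from a repository.
    Compile / unmanagedJars ++= {
      val ibmLibDir = file(sys.env.getOrElse("IBM_DFDL_HOME", "/opt/ibm/dfdl")) / "lib" // guessed layout
      (ibmLibDir * "*.jar").classpath
    }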


Not everyone will have those libraries. In fact, only people performing 
cross-validation testing should need to install this module.


Hence, the module *cannot* be part of Daffodil. It has to be an entirely 
separate jar, packaged and delivered separately. Perhaps it's not even 
"delivered" in the sense of a Maven Central artifact. Perhaps we just build it 
ourselves and distribute it to the relevant parties doing this testing.


So... I'm inclined to just create a repo on GitHub for this, perhaps under the 
openDFDL project.

Still under the Apache License v2.


But I'm not sure. Is that the right thing to do, or is there some other idea?


There are lots of projects (including lots of Apache projects) that are 
extensible frameworks with adapter components, connectors, and the like, and 
these pose a similar set of problems: a connector depends on the framework, and 
also on the native libraries of the thing being connected to.


So is there an established idiom I should be falling into here?


...mikeb

