> -----Original Message-----
> From: John Casey [mailto:[EMAIL PROTECTED]
> Subject: RE: Multiple source directories in project.xml
> 
> It seems to me that the POM is the wrong place to put anything related
> to artifacts created during maven execution.  

I tend to agree with this, but in my case, the generated artifacts are
abstract base classes and factory classes, one of each for every entity,
generated from a UML model (somewhat critical to the success of the goals...
;)  The correct implementation is likely for the XMI to have a special
representation in the POM, but is this just a specific case of a bigger
problem?  If so, there will never be an end to this issue as new metamodels
are introduced.

XDoclet is a tricky case of this because it mixes two sources of
information (Java classes and class metadata) into a single source set.  The
problem with the POM might be its inability to differentiate between vanilla
source and source that is seasoned with XDoclet tags (or aspects, etc...).
It is a similar argument to the one against projects having multiple
artifacts.
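For concreteness, Maven 1's project.xml admits only a single source
directory in its build section, so plain and XDoclet-tagged sources are
forced into one set.  A minimal sketch (paths are illustrative):

```xml
<!-- project.xml (Maven 1): the build section declares exactly one
     sourceDirectory, so vanilla classes and metadata-bearing
     (XDoclet-tagged) classes must share a single source set. -->
<project>
  <build>
    <sourceDirectory>src/java</sourceDirectory>
    <!-- There is no way to declare a second source directory here
         that distinguishes generated or tagged sources. -->
  </build>
</project>
```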

If I understand the POM correctly, it almost seems that Maven could use
pluggable providers for various metamodels, allowing it to direct generation
during the build and to answer queries about the project from a component/PM
perspective.  Querying a project model while omitting the class metadata
gives an incomplete picture of the project, reducing the effectiveness of
eventual tools that will be able to query POMs on the net the way Google
queries the web (Poogle? :).  Imagine being very happy with a particular
component and doing a "backwards links" search for components that use it.
If metamodels are not available for querying (XDoclet, UML via XMI, etc.),
much of that information could be unavailable.

> I'd even go so far as to
> say that the list of reports to be generated doesn't belong 
> in here.  To
> me, it makes sense to have the POM describe the project 
> itself, in pure
> terms, without making assumptions about what artifacts will be
> generated. 

It seems to me that the POM should serve both as a director of artifact
generation and as an indicator that those artifacts exist.  In a potential
alternate universe, the transitive closure of POMs through dependencies
includes every component ever built under various Maven installations.  This
mirrors the web itself, but uses POMs instead of HTML for hyperlinking.
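In Maven 1 terms, the "hyperlink" is the dependency element, which points
from this POM to another project's artifact (and, transitively, its POM).
The coordinates below are purely illustrative:

```xml
<!-- project.xml: each dependency acts like a hyperlink from this
     project to another project's artifact.  Following these links
     transitively yields the "web of POMs" described above. -->
<dependencies>
  <dependency>
    <groupId>commons-lang</groupId>
    <artifactId>commons-lang</artifactId>
    <version>2.0</version>
  </dependency>
</dependencies>
```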

Imagining that such a world existed, a developer could import a class by
name, and the IDE would automagically look it up in a hypothetical POM search
engine, then update the POM with the correct dependency.  Or maybe Maven
starts to form the idea of a "build container", and a Java compiler that is
executing uses an import resolution service of the container for issues such
as missing imports.  

But without an indication that an artifact (such as a report) existed, the
engine would have a harder time finding it.  Could POM search engine
functionality be the metric by which inclusion in the POM is measured?  If
so, how do generated artifacts fit in now?

> To that end, configuring things like the generated source directory
> should take place in project.properties instead of the POM. Obviously,
> it's not acceptable to try to configure the list of reports via
> project.properties, but somehow this information should also 
> be excluded
> from the POM. 
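As a sketch of this suggestion, the generated-source location would live in
project.properties rather than the POM.  The property name below is
hypothetical (an actual plugin would define its own); maven.build.dir is
the standard Maven 1 build-output property:

```properties
# project.properties (Maven 1): operational settings kept out of the POM.
# The property name is hypothetical; a real plugin defines its own key.
my.plugin.gen.src.dir=${maven.build.dir}/generated/java
```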
> 
> In general, operational information used in maven execution but not
> having any use outside of maven SHOULD NOT be in the POM. This will
> leave open the opportunity for the POM to outlive maven's current
> incarnation. Accomplishing this will make the POM much more 
> stable, and
> will mean that users won't be penalized by having to rewrite POMs for
> each update to a maven plugin. It should also reduce the requirements
> involved in providing backwards compatibility.
> 
> -john

-b

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
