Re: javax.jms not in repo

2007-03-05 Thread Craig McClanahan

On 3/5/07, Wayne Fay [EMAIL PROTECTED] wrote:

Good follow-up Dan. I generally tell people to use all the Geronimo
spec jars instead of dealing with Sun hassles.

Future releases of these spec jars will most likely be released under
CDDL by Glassfish but for now, the Geronimo route is the way to go.
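
For example (a sketch only; the Geronimo spec-jar coordinates below are from memory and should be verified against the repository), depending on the JMS API via Geronimo looks roughly like:

```xml
<!-- JMS 1.1 API from the Apache Geronimo spec jars (coordinates assumed; verify) -->
<dependency>
  <groupId>org.apache.geronimo.specs</groupId>
  <artifactId>geronimo-jms_1.1_spec</artifactId>
  <version>1.1</version>
  <!-- the container or message broker supplies the implementation at runtime -->
  <scope>provided</scope>
</dependency>
```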



For Maven access to all of the standard Java EE 5 APIs, see Ludo's
blog for details:

   http://weblogs.java.net/blog/ludo/archive/2007/01/java_ee_5_apis.html


Wayne


Craig McClanahan

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: (important) Copying files from fixed external folder into WAR's /WEB-INF/.

2007-02-26 Thread Craig McClanahan

On 2/26/07, Udaybhaskar Sarma Seetamraju [EMAIL PROTECTED] wrote:

Hi there,

(mailing-list archive does not have a search facility.  Google failed to
show much for [site: mail-archives.apache.org struts])

With many projects already painfully migrated to Maven, it's a royal pain
to ---copy--- the Struts TLDs and DTDs to each developer's folder, for
each of the projects.
Especially since we have so successfully been upgrading to newer
versions of Struts using Ant's fileSet in such a simple, --STABLE--
and effective manner.


While I agree that a general solution to this problem might be useful,
I've got a question about your specific use case ... why are you
copying the Struts TLDs and DTDs separately?  The runtime environment
does not need them (unless your servlet container is totally broken
about loading resources from inside a jar file), so the only reason I
can think that you might want the files separately is for an IDE.
Even there, most IDEs have a way to register things external to the
project that are worth looking at.

As to the general resource copying problem, one thing you can actually
do is run Ant scripts in a Maven build, using the antrun plugin.  It
would be pretty straightforward to use this in the resource-copying
phase to execute your own script to copy whatever files you need, to
wherever you need, without waiting for Maven to provide a plugin that
behaves the way you require.
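
A sketch of that approach, assuming the maven-antrun-plugin and a hypothetical fixed external folder (the path and includes are illustrative only):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <!-- run just after resources are copied, before packaging -->
      <phase>process-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <!-- hypothetical fixed external folder; adjust to your layout -->
          <copy todir="${project.build.directory}/${project.build.finalName}/WEB-INF">
            <fileset dir="/opt/shared/struts" includes="*.tld,*.dtd"/>
          </copy>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
```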

Craig





The JavaWorld article "The Maven 2 POM demystified" mentioned the use of:

    <resource>
      <targetPath>/META-INF/plexus/</targetPath>
    </resource>

But that didn't work. I had to use a complex workaround that *!*!* may
not work for too long *!*!*.

    <resource>
      <!-- DOES NOT WORK : <targetPath>./WEB-INF/.</targetPath> -->
      <!-- DOES NOT WORK : <targetPath>WEB-INF</targetPath> -->

      <!-- Make sure the value INSIDE targetPath below is IDENTICAL to
           the ${build.finalName} specified above -->

      <targetPath>../${TargetName}-${artifactId}-${version}/WEB-INF</targetPath>
    </resource>

Why would ./Fldr and ../Fldr, as targetPath values, behave in different ways?

I am hoping that someone will point me towards a simple and
maven-upgrade-safe way to include struts DTDs and TLDs.
This stable approach is important for us, because that approach is ALSO
DESPERATELY NEEDED for the other DTDs and XSDs that are required under
/WEB-INF by the 3rd-party JARs we use.


Also, there was mention in a web site about a POM capability like:

    <dependency>
      ...
      <scope>tld</scope>
    </dependency>
Does this exist, and if so, how to use it, and how to create/get the
appropriate JAR to support this?

Summary of Issue:

Copy files from a fixed external path, into specific paths rooted at
/WEB-INF inside the WAR file.
The path within the WAR starting at /WEB-INF/classes is no good; it is not in our
control.
Re: Struts, we think it's easy to keep the dependency version of Struts
in pom.xml matched to the FIXED read-only external folder containing the
DTDs and TLDs.

Maven version: 2.0.4
j2sdk1.4.2_05
debian Testing Etch

In case there are people like me: I like a long-lasting solution,
especially as I am getting older and cannot afford the luxury of
revisiting the same problems for each new upgrade.

[end]









Re: dependencies are bloated in M2

2007-02-06 Thread Craig McClanahan

On 2/6/07, Carlos Sanchez [EMAIL PROTECTED] wrote:


exactly, that's why he needs to use exclusions, you exclude things
that you don't need.



Exclusions can get you around immediate build problems, but a feedback loop
is necessary to improve the state of the world in general, or the problem
will just repeat itself the next time.

If your project has a dependency x, and that artifact has an optional
dependency y, the POM for x should explicitly *say* it is optional.  That
will cause the web project to do the right thing ... *not* copy in the
optional dependencies unless you explicitly declare a dependency on them
yourself.

Thus, the people who published dependency x in this scenario need to be lobbied
to get their POMs fixed in the next version, to stop causing everyone who
uses Maven and their x library grief down the road.
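
In POM terms (the names here are hypothetical), the publisher of x would mark y like this:

```xml
<!-- inside the POM for dependency x; coordinates are illustrative -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>y</artifactId>
  <version>1.0</version>
  <!-- consumers of x do not get y transitively unless they declare it themselves -->
  <optional>true</optional>
</dependency>
```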

A classic case that I know of personally :-) is Jakarta Commons Logging,
where the 1.1 version of the POM declares a dependency on the servlet API
but mistakenly did not declare it to be optional.  Yes, you as a user of
Commons Logging can use an exclusion to get rid of the unwanted file, but
why should you (or anyone else) *have* to?  Wouldn't the right thing be to
also go file a bug against C-L to fix their blasted POM?

(You don't actually have to for this particular scenario ... the POM has
been fixed in the trunk and a 1.1.1 release is likely very soon primarily to
address this issue ... but my point is in general it is the people who
publish broken POMs that should be complained at here, not Maven itself.)

Craig McClanahan


On 2/6/07, Bashar Abdul Jawad [EMAIL PROTECTED] wrote:

 It is the right solution. Using exclusions will exclude a dependency
from
 being downloaded at all, which means it won't be available at any path.
 Using provided will still make the dependency available for compile
time,
 but not in runtime, and will not bundle it in the package.

 Read maven FAQ:

 http://maven.apache.org/general.html#scope-provided



 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Carlos
 Sanchez
 Sent: Tuesday, February 06, 2007 4:29 PM
 To: Maven Users List
 Subject: Re: dependencies are bloated in M2

 that's not the right solution, you have to use exclusions

 On 2/6/07, Bashar Abdul Jawad [EMAIL PROTECTED] wrote:
  It will. If you don't want to include a particular dependency in your
  generated package just give it the provided scope, it will be excluded
 even
  if it was a transitive dependency of something else.
 
  Bashar
 
  -Original Message-
  From: Christian Goetze [mailto:[EMAIL PROTECTED]
  Sent: Tuesday, February 06, 2007 2:58 PM
  To: Maven Users List
  Subject: Re: dependencies are bloated in M2
 
  Tandon, Pankaj wrote:
 
  
  
  So the questions are:
  1. How can we control what get's into WEB-INF/lib. We tried all the
  scopes mentioned, but that did not help.
  
  I believe that the scope that should work is provided. The problem
is
  that I don't know if maven is smart enough to remove a provided
  dependency from the transitive closure. I would call that a bug if it
  didn't.
 
  --
  cg
 
 
 


 --
 I could give you my word as a Spaniard.
 No good. I've known too many Spaniards.
  -- The Princess Bride





--
I could give you my word as a Spaniard.
No good. I've known too many Spaniards.
 -- The Princess Bride





Re: dependencies are bloated in M2

2007-02-06 Thread Craig McClanahan

On 2/6/07, Bashar Abdul Jawad [EMAIL PROTECTED] wrote:


Yes, but sometimes you will need to use a dependency for compile time
only,
and NOT for runtime. You don't need the container to provide it for you
either because it is not required for runtime. Example: aspectjtools.jar.
You can't exclude it because your project will not compile. The only way
is
to give it the provided scope.



That is not correct.  Declaring a dependency to be optional puts it on the
compile classpath, but avoids any attempt to include it at runtime.


Even if your container doesn't provide it

that's not a problem, maven doesn't care. I know it is not very clean to
give a dependency a provided scope when it's not going to be provided
anywhere, but sometimes you need to do this if you want to compile against
it.



The semantics of "provided" are different from those of "optional", even though Maven
does not enforce the distinction.

The code you write against a "provided" API assumes that the API will indeed
be provided by the container.  As an example, you might declare as
"provided" a library that you've installed in Tomcat's common/lib
directory.  The library must be there in order for the application to
function -- but Maven can assume that it will indeed be supplied by the
container, so it won't include it in the WAR.

Optional, on the other hand, means what it says.  Declaring such a
dependency means that you will need it available at compile time FOR THE
DEPENDENCY, but not necessarily for your own code (unless you explicitly
need it for other reasons).  The library is explicitly NOT required at
runtime, because your dependency has said, in effect, I can use this
library if it exists, but if it does not, no harm no foul.

Note also that optional is NOT a scope -- it is a completely separate
element.  That is because the concept of being optional is orthogonal to
scope ... it's perfectly reasonable, for example, to have an optional module
with compile scope if your build process knows how to intelligently deal
with that combination.

PLEASE do not misuse provided scope to mean the optional element or vice
versa.  PLEASE set up your POMs to say what you mean.  These are two
DIFFERENT concepts!
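
A side-by-side sketch of the two declarations (the servlet API coordinates are the commonly used ones of that era; the optional dependency is hypothetical):

```xml
<!-- "provided": the container supplies this jar at runtime; it stays out of the WAR -->
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.4</version>
  <scope>provided</scope>
</dependency>

<!-- "optional": needed to build this artifact; downstream users must opt in explicitly -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>extra-feature</artifactId>
  <version>1.0</version>
  <optional>true</optional>
</dependency>
```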

Craig

Bashar


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Carlos
Sanchez
Sent: Tuesday, February 06, 2007 5:18 PM
To: Maven Users List
Subject: Re: dependencies are bloated in M2

still not right, you have to use exclusions
provided means the environment (read appserver) provides that
dependency, which is only true for few dependencies in the whole
world, like servlet-api

On 2/6/07, Bashar Abdul Jawad [EMAIL PROTECTED] wrote:

 This is the question I was answering:

 Tandon, Pankaj wrote:
 
 1. How can we control what get's into WEB-INF/lib. We tried all the
 scopes mentioned, but that did not help.

 And it's follow up:

  Christian Goetze wrote:
  
   I believe that the scope that should work is provided. The problem
is
   that I don't know if maven is smart enough to remove a provided
   dependency from the transitive closure. I would call that a bug if
it
   didn't.

 And the answer to these 2 questions is to use the provided scope. It
will
 also stop a dependency from being passed on transitively. Using
exclusions
 is NOT right if you still want to compile against these dependencies.

 Bashar



Re: Test source dependencies

2007-01-31 Thread Craig McClanahan

On 1/30/07, Andrew Williams [EMAIL PROTECTED] wrote:


Make a new project (T) and put all of the test code there in compile
scope (src/main/java)

then project A and B can both depend on T in scope test.



The Shale framework[1] has a real-life example of this technique.  We wanted
to use a common set of unit test base classes across tests for a bunch of
other modules, and the best solution was to create a separate Maven module
for the test framework, then depend on it (with test scope) everywhere it
was needed in our tests.
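
Each consuming module then declares the shared test module with test scope, along these lines (coordinates hypothetical; Shale's actual module name may differ):

```xml
<!-- shared test base classes live in their own module, built as a normal jar -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-test-framework</artifactId>
  <version>1.0</version>
  <!-- on the test classpath only; never packaged with the consuming module -->
  <scope>test</scope>
</dependency>
```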

Andy


Craig

[1] http://shale.apache.org/

Leo Freitas wrote:


 How to fix dependencies among testing code?
 That is, if I have a project B test code that
 depends on project A test code.
 That is, there is abstract/base testing source
 code setup for A that B also uses.

 The usual dependency tag (even for test scope)
 would link to B/main/java with A/main/java but
 would not link B/test/java with A/test/java.

 That implies in a compilation error if I try to
 run the test cases for B. I thought to use assemblies
 to fix this, but it does not sound right.

 Any suggestion?








Re: Project-wide default profile?

2007-01-22 Thread Craig McClanahan

On 1/22/07, pjungwir [EMAIL PROTECTED] wrote:




Does maven give me any way to do what I want? This seems like such a basic
and obvious requirement.



The POM for the shale-apps module[1] in Shale[2] has an example of exactly
the scenario you describe, showing the way we decided to approach this.

Shale is an add-on library on top of JavaServer Faces, and it can run with
any compatible implementation.  We wanted to provide sample apps that you
could easily build with any of the options, but (as you described) make one
of them the default.  With the settings shown below, we can select any of:

* mvn clean install -- Selects MyFaces 1.1 (the default)

* mvn -Djsf=ri clean install -- Selects the reference implementation 1.1

* mvn -Djsf=ri12 clean install -- Selects the reference implementation 1.2,
 and includes it in the webapp for a non-JavaEE5 container.

* mvn -Djsf=ee5 clean install -- Selects the reference implementation 1.2,
 and omits it from the webapp because JSF 1.2 is provided on an EE5
container.

The relevant bits of the pom.xml are:

<profiles>

  <!-- Apache MyFaces 1.1 (default) -->
  <profile>
    <id>myfaces</id>
    <activation>
      <property>
        <name>!jsf</name>
      </property>
    </activation>
    ...
  </profile>

  <!-- JSF RI 1.1 -->
  <profile>
    <id>jsfri</id>
    <activation>
      <property>
        <name>jsf</name>
        <value>ri</value>
      </property>
    </activation>
    ...
  </profile>

  <!-- JSF RI 1.2 (non-EE5 container) -->
  <profile>
    <id>jsfri12</id>
    <activation>
      <property>
        <name>jsf</name>
        <value>ri12</value>
      </property>
    </activation>
    ...
  </profile>

  <!-- JSF RI 1.2 (EE5 container) -->
  <profile>
    <id>jsfee5</id>
    <activation>
      <property>
        <name>jsf</name>
        <value>ee5</value>
      </property>
    </activation>
    ...
  </profile>

</profiles>

So, you indirectly choose a profile by passing a system property with a
specific value.

Craig

[1]
http://svn.apache.org/viewvc/shale/framework/trunk/shale-apps/pom.xml?view=markup
[2] http://shale.apache.org/


Re: What's the best way to specify versions for Maven Plugins?

2007-01-22 Thread Craig McClanahan

On 1/22/07, mraible [EMAIL PROTECTED] wrote:



What's the best way to specify versions for Maven Plugins. In the AppFuse
project, we're distributing archetypes that have plugins pre-defined in
the
pom.xml files.

Should we:

1. Have no version
2. Use the latest version in the Maven repo
3. Use <version>LATEST</version>
4. Use <version>RELEASE</version>

We've been using #1 and the downside seems to be that snapshot
repositories
are checked for updates. Does this problem go away when we don't depend on
any snapshots?

#2 seems good, but it requires our users to manually update the version
number when a new release comes out.

I'm looking for the method that doesn't cause a slowdown (i.e. checking
repos for updates) in the build process, but auto-upgrades when new
releases
come out.  We have found issues with some plugins (i.e. Jetty 6.0.1 doesn't
work with JSF), so for those we're willing to hard-code the plugin
version.



This is not just an issue for plugin versions ... exactly the same issues
apply to dependency versions.  I've been thinking about a strategy of being
explicit on all versions, but using the range syntax ... with a range
covering all the combinations I have actually tested with :-).

IMHO, there is a significant potential problem when you use any of the other
options ... you are implicitly trusting the supplier of your plugin (or
dependency) to not screw you by breaking backwards compatibility in some
*future* version.  I'd rather be conservative in what I claim in a POM, but
more liberal in what I describe (in text) in release notes or things like
that.

A negative side effect of this strategy is the potential need to update the
range later on, when a new version of something you depend on has been
released, and it *does* actually work.  But that is the sort of thing I'd
rather document (hey, even though the POM claims that this works with
Spring 1.2.8, you can actually use Spring 2.0 successfully), and let people
override explicitly in their own POMs until my next release comes out.
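
A sketch of that range syntax, using the Spring example from the text (the exact bounds should reflect the versions you actually tested):

```xml
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring</artifactId>
  <!-- inclusive lower bound, exclusive upper bound: the combinations tested so far -->
  <version>[1.2.8,1.3)</version>
</dependency>
```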

Thanks,


Matt



Craig


Re: Dependency groups?

2006-12-22 Thread Craig McClanahan

On 12/22/06, MikeKey [EMAIL PROTECTED] wrote:



Forgive a likely newbie question but I've not found anything outside a
hacked
parent pom to get something like this to work.

Is there any way to setup a pre-defined set of dependencies to include in
a
given pom?  For example, Hibernate requires several jars to be included as
dependencies to a project using it...is there a sane way in maven to
define
a hibernate-dependencies.pom or something like that and include it in my
pom.xml?  To make a reusable set of dependencies?



We do this kind of thing in Shale, to provide a common set of dependencies
for all of the sample apps.  Each individual app's pom mentions 
org.apache.shale:shale-apps-parent[1] as its parent, and therefore inherits
all the common dependencies.

Another useful approach to this sort of problem is the Maven2 archetype
facilities[2], where you can define the initial structure of a new project
(and include the default set of dependencies in the newly created pom).  The
dependencies aren't shared later, but this is a great way to get a kick
start on a new project that needs a starter set of stuff, but then will
evolve on its own.


Thanks for any help.

--
View this message in context:
http://www.nabble.com/Dependency-groups--tf2872833s177.html#a8029585
Sent from the Maven - Users mailing list archive at Nabble.com.



Craig

[1]
http://svn.apache.org/viewvc/shale/framework/trunk/shale-apps/pom.xml?view=log
[2]
http://maven.apache.org/guides/introduction/introduction-to-archetypes.html






Re: upload POMs for non-free artifacts

2006-12-14 Thread Craig McClanahan

On 12/14/06, nicolas de loof [EMAIL PROTECTED] wrote:


Hello,

I've deployed some restricted libs to my corporate repository. Those libs
are for example Oracle JDBC driver or IBM MQSeries java client.
Can I make an upload request for such POMs in maven public repo ?
My goal is to avoid the Sun jars hell I got on maven1, where projects
made
reference to the same artifact with various groupId.



What good is it to publish such poms in the public Maven repositories, if
you cannot post the actual jar files they refer to?  And, it seems unlikely
to me that you have the legal right to publish all of the jars referenced in
your follow-up list.

The right solution is to encourage the vendors of the stuff you need to
publish their own software, with the group and artifact ids that *they*
choose.  Trying to publish someone else's work, under an identifier that
*you* choose, strikes me as an attempt at plagiarism.

Craig

Nico.





Re: Question about transitive dependencies

2006-12-01 Thread Craig McClanahan

On 12/1/06, Alexander Sack [EMAIL PROTECTED] wrote:


This may be an artifact of the fact that many plugins use the classpath for
runtime constraints as well.



It's also possible to avoid this behavior, by declaring a scope of runtime
on the dependency.  That way, the module is *not* included on the compile
time classpath (unless it's declared with compile scope by someone else you
depend on), but *is* included for runtime (for example, in a webapp it'd be
included in WEB-INF/lib).

-aps


Craig


Re: Question about transitive dependencies

2006-12-01 Thread Craig McClanahan

On 12/1/06, Alexander Sack [EMAIL PROTECTED] wrote:


And in fact, one can actually remove the transitive dependencies that are
unwanted during the build classpath by declaring them in the POM as
provided.  I believe that fits his scenario where he is building against
a
library that has a transitive dependency that he does not want on his
build
classpath for whatever reason (packaging, etc.).



My understanding is that "provided" is essentially the opposite of "runtime"
... it leaves the dependency on the compile classpath but does not include
it in the output.   Whereas "runtime" leaves it off the compile classpath
but does include it in the output.

In a web app, I'll want to declare the servlet API as a provided API so
that I can use it in my own code.  The servlet container will make it
available to me at runtime, so I don't want to include it in the war.  But,
if I'm also using something like the MyFaces JSF implementation, which has
both an API jar and an implementation JAR, I'll want the API jar declared as
compile scope (so it's on the classpath and included in the war), but the
impl JAR declared runtime scope (so it is *not* possible for me to
accidentally compile against classes in the MyFaces implementation, where I
should be programming solely to JSF APIs), while it is still included so that the
runtime app will work.
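
As a sketch, that pair of declarations would look roughly like this (the MyFaces coordinates and version are from memory; verify before use):

```xml
<!-- JSF API: compile against it, and ship it in WEB-INF/lib -->
<dependency>
  <groupId>org.apache.myfaces.core</groupId>
  <artifactId>myfaces-api</artifactId>
  <version>1.1.5</version>
  <scope>compile</scope>
</dependency>

<!-- JSF implementation: shipped in WEB-INF/lib, but kept off the compile classpath -->
<dependency>
  <groupId>org.apache.myfaces.core</groupId>
  <artifactId>myfaces-impl</artifactId>
  <version>1.1.5</version>
  <scope>runtime</scope>
</dependency>
```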

Craig


-aps


Craig



--
What lies behind us and what lies in front of us is of little concern to
what lies within us. -Ralph Waldo Emerson




Re: Looking for a simple maven assembly example

2006-11-07 Thread Craig McClanahan

On 11/7/06, Christian Goetze [EMAIL PROTECTED] wrote:


I've read the better builds with maven book, I've looked at
http://maven.apache.org/plugins/maven-assembly-plugin, but I'm still not
sure I understand how this is supposed to work.

I just want end up with a zip file containing all of the jars needed to
run the particular project I'm building.

Alternatively, I'd already be happy with a way to just get the
transitive runtime dependency list out of the build.

Any help would be appreciated, thanks in advance.
--
cg



It doesn't really qualify as simple, but here[1] is the assembly
configuration we use for Shale to build release artifacts.  It includes the
created libraries, all the dependent libraries, javadocs, and source code
for a bunch of different submodules.  (This descriptor is from a shale-dist
module that is one level below the top level project directory, hence all of
the ".." relative paths.)

The part that picks up all of the dependent libraries (i.e. those with scope
runtime), and puts them in the lib directory of the output:

    <dependencySets>
      <dependencySet>
        <outputDirectory>lib</outputDirectory>
        <scope>runtime</scope>
      </dependencySet>
    </dependencySets>
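
To wire a descriptor like that into a build, the assembly plugin is typically configured along these lines (the descriptor path matches the shale-dist layout linked below; run it with "mvn assembly:assembly"):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <!-- the custom assembly descriptor containing the dependencySets above -->
      <descriptor>src/assemble/dist.xml</descriptor>
    </descriptors>
  </configuration>
</plugin>
```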


Craig

[1]
http://svn.apache.org/repos/asf/shale/framework/trunk/shale-dist/src/assemble/dist.xml


Re: mvn site running test cases twice

2006-11-07 Thread Craig McClanahan

On 11/7/06, jp4 [EMAIL PROTECTED] wrote:



This seems like a big issue since our nightly builds usually run all of
our
unit and container test cases.  If we have to run the tests twice, it will
almost double the build time which is already several hours.



The Shale build also includes Cobertura (thanks to Wendy :-), and it only
runs the tests twice if you execute "site" ... if you execute only
"install" it just does the normal single run.  Is it required that your
nightly builds generate the site?

If it is, you might see if Maven has a command line option to suppress the
normal test phase ... since you know the tests will be run anyway by the
plugin.  Before relying on this, though, I'd want to verify that a test
failure during the Cobertura part actually does abort the build so you hear
about them.

Is there any way to instrument without invoking the test cases?  It seems
like you would want to clean, compile, instrument, test, install, create
site docs.  Has anyone found a workaround?



The way that Cobertura works, it has to actually execute the tests (using
the instrumented classes) in order to determine which code paths you've
covered and which you haven't.  Just instrumenting wouldn't accomplish much
that is useful.

Thanks,


jp4



Craig


Wendy Smoak-3 wrote:


 On 11/7/06, jp4 [EMAIL PROTECTED] wrote:

 I removed the coberatura plugin and unit test cases run only once...
 Here is
 what I have in my pom
 ...
 Any ideas?  Looks like the test cases get run during instrumentation?

 I think it's normal based on Maven's current design.  The tests are
 run once during the 'test' phase, then in order to produce the
 coverage report, the tests have to be re-run on the instrumented code.

 Take a look at this post from Vincent which talks about a similar
 issue with the Clover plugin:

http://www.nabble.com/-M2--My-tests-are-launched-3-times-%21-t2190279s177.html#a6075779

 --
 Wendy











Re: no classes, but lib?

2006-11-05 Thread Craig McClanahan

On 11/5/06, jiangshachina [EMAIL PROTECTED] wrote:



Generally, in Web application project, the Java class files would be put
into
WEB-INF/classes folder.



Just out of curiosity, why do you want to do this?  Doing this seems likely
to make loading your classes a little bit slower.

Now, I want the classes to be archived and put into the WEB-INF/lib folder,
and the classes should be classified into different
sub-directories under WEB-INF/lib respectively.
For example, WEB-INF/lib/data/ (some jars related to database operations),
WEB-INF/lib/user/ (some jars related to user management), etc.



You might want to make sure that your servlet container will actually load
the JAR files if you do this.  The spec only requires that JAR files
directly in WEB-INF/lib be loaded, not from subdirectories.

a cup of Java, cheers!

Sha Jiang



Craig


Re: no classes, but lib?

2006-11-05 Thread Craig McClanahan

On 11/5/06, jiangshachina [EMAIL PROTECTED] wrote:



 You might want to make sure that your servlet container will actually
load
 the JAR files if you do this.  The spec only requires that JAR files
 directly in WEB-INF/lib be loaded, not from subdirectories.
You are right.
But I would put the jars under the WEB-INF directory, and set web.xml to
fit the matter.



That won't work either ... the servlet spec mandates that any JAR files you
want included on your web application's classpath *must* be directly under
WEB-INF/lib.


Just out of curiosity, why do you want to do this?  Doing this seems
 likely
 make loading your classes a little bit slower.
I just want to make the jar organization clearer.  Others can easily understand which
layer/part of the application some jar files belong to.
Maybe I think too much?



Perhaps so ... to me, there are many things about application development
that are *much* more important than the precise layout of the deployed
application :-).  I agree with you that understanding the structure of the
application is important.  But it is more important to understand how the
*source* classes reference each other (for example, this is where UML class
diagrams can be helpful) than how they are arranged inside an executable
program.

Imagine that you were using C or C++ and building a .exe native executable
file for Windows, instead of a Java based web application.  Would you really
care much about how the linker combined all of the object files and
libraries together?  Probably not :-).  Would you care about what libraries
you used, and what their APIs are, and what modules depend on what other
modules?  Probably so :-).

a cup of Java, cheers!

Sha Jiang



Craig











Re: Problems running a JSF application on Jetty:Run

2006-11-01 Thread Craig McClanahan

On 11/1/06, Ronen Naor [EMAIL PROTECTED] wrote:


The problem is that the J2EE spec says that the JSF jars need to be part
of the container classpath.



Actually, that is only required (in the specs) for Java EE 5.  For J2EE 1.4,
you can either treat the JSF RI or MyFaces as a normal dependency for
webapps and have it loaded from WEB-INF/lib, or do what you did and load it
into the container classpath.
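
For the WEB-INF/lib route, the dependency declarations would look something
like this (a sketch; the exact MyFaces coordinates and version are
illustrative):

```xml
<!-- treating MyFaces as a normal webapp dependency, loaded from WEB-INF/lib -->
<dependency>
  <groupId>org.apache.myfaces.core</groupId>
  <artifactId>myfaces-api</artifactId>
  <version>1.1.3</version>
</dependency>
<dependency>
  <groupId>org.apache.myfaces.core</groupId>
  <artifactId>myfaces-impl</artifactId>
  <version>1.1.3</version>
</dependency>
```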

Craig


Re: a help option like ant -projecthelp

2006-10-30 Thread Craig McClanahan

On 10/29/06, 向秦贤 [EMAIL PROTECTED] wrote:


Hi,

Thanks for the hint.  Although I'm a newbie to Maven, I'd like to insist on
this help option, just like Maven's scanning.
Suppose someone gets a project's code; his first task is just to build it.
But build what?  What can be built?  He wonders, but Maven cannot answer.
Isn't that so?



Qinxian, I fear you might have missed the most important part of Wendy's
response to your question:


For the most part, the 'goals' are the lifecycle phases,
and they are the same for every project.


In other words, once you learn Maven, you will *not* need to try to
understand the individual targets of the build script of some particular
project ... it will be just like every other Maven based project.  Learn one
of them, and you've learned the fundamental structure of all of them.

Maven's philosophy about building software is very different from Ant's.  To
become comfortable with it, you have to learn to accept the fact that Maven
is very much about enforcing common standards.  You actually can do a lot of
nonstandard things with Maven, but it is so much extra work that you (as a
developer of a project using Maven for builds) are *much* better off going
with the standard conventions that Maven encourages.

If you are simply trying to build a project that has a build infrastructure
based on Maven, then you are a *huge* beneficiary of this.  Simply do the
following:

* Install Maven itself per the instructions on the website.

* Check out or download the sources for the project you want to build.

* Change your directory to the top level directory of that project

* Execute the command mvn clean install

* And everything should just work.  No fussing with downloading
 your own dependencies.  No need to edit build.properties files.

Personally, I was a long time holdout for using Ant based build
environments.  But, if you look at the current build infrastructure for the
projects I care about (Shale, MyFaces, Struts, (eventually) Jakarta
Commons), you'll see a definite migration towards using Maven build
environments.  Why?  Because it is better -- for both the project developers
*and* the users who want to build the sources of those projects themselves.

So, what should *you* do?  The first thing is to change your question.
Instead of asking "how does the build process for project XYZ work?", simply
ask the question "how does the build process for any Maven-based project
work?"  Answer that question once, and you will instantly understand the
build environment for essentially all Maven based projects.


Regards,


Qinxian



Craig McClanahan


Checkstyle Behind A Firewall

2006-09-30 Thread Craig McClanahan

In the Maven2 build of Shale, I'm having a problem generating the site when
running behind a firewall.  A bit of research indicates that the problem
relates to the following configuration setting for the Checkstyle plugin.

    <plugin>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <configuration>
        <configLocation>http://svn.apache.org/repos/asf/shale/maven/trunk/build/shale_checks.xml</configLocation>
      </configuration>
    </plugin>

This setting works fine when not behind a firewall ... but when I am, it
fails (even though I have the correct proxy set up in my settings.xml file,
and I can download from repositories normally).  Is there any way to
convince the Checkstyle plugin to use the HTTP proxy for looking up this
configuration resource?
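
One way around fetching the file over HTTP at all is to bundle the
Checkstyle configuration in a small JAR and let the plugin resolve it from
its own classpath instead (a sketch; the shale-build-resources artifact
here is hypothetical):

```xml
<plugin>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <!-- resolved as a classpath resource from the plugin's dependencies,
         so no HTTP connection (and no proxy) is involved -->
    <configLocation>shale_checks.xml</configLocation>
  </configuration>
  <dependencies>
    <!-- hypothetical JAR containing shale_checks.xml at its root -->
    <dependency>
      <groupId>org.apache.shale</groupId>
      <artifactId>shale-build-resources</artifactId>
      <version>1.0.0</version>
    </dependency>
  </dependencies>
</plugin>
```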

Craig


Re: Checkstyle Behind A Firewall

2006-09-30 Thread Craig McClanahan

On 9/30/06, Andreas Guther [EMAIL PROTECTED] wrote:




Looking at the content of the URL, a Checkstyle config XML file comes up.
Did you try to override the configuration and point to a Checkstyle
configuration file of your own?



I agree that the file is really there.  The problem is that Checkstyle is
not respecting the proxy settings to retrieve it, which causes a failure
when trying to generate the website (from behind the firewall -- it works
fine on a direct connection to the Internet).

I bet you can have that file somewhere behind your firewall and either
have a relative, absolute, or internal URL reference to it.



That would work for me, but it would break the build for everyone else.

Andreas


Craig



-Original Message-

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Craig
McClanahan
Sent: Saturday, September 30, 2006 2:44 PM
To: Maven Users List
Subject: Checkstyle Behind A Firewall

In the Maven2 build of Shale, I'm having a problem generating the site
when running behind a firewall.  A bit of research indicates that the
problem relates to the following configuration setting for the Checkstyle
plugin.

<plugin>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <configLocation>http://svn.apache.org/repos/asf/shale/maven/trunk/build/shale_checks.xml</configLocation>
  </configuration>
</plugin>

This setting works fine when not behind a firewall ... but when I am, it
fails (even though I have the correct proxy set up in my settings.xml file,
and I can download from repositories normally).  Is there any way to
convince the Checkstyle plugin to use the HTTP proxy for looking up this
configuration resource?

Craig





Re: Any way to release with snapshot dependency?

2006-09-29 Thread Craig McClanahan

On 9/28/06, shinsato [EMAIL PROTECTED] wrote:



We're using the retrotranslator plugin, which unfortunately isn't released
yet.  It's still in the sandbox.  That breaks the release plugin (which we
discovered the day before we had to get our first-ever Maven-executed
release out).

Is there any way around this at all?  I'm still a bit new to Maven2, and
when I tried to do an internal release of retrotranslator, it was an
exercise in frustration.

Hopefully someone knows some possible way around this.  If not, does
anyone have an idea how to get such a release accomplished?  There were
snapshot dependencies as well.  Is this going to require releasing every
transitive dependency too?  Please say there's a shortcut around this...



The most important question here is not about your dependencies ... it is
about your own desires.  Why would you want to release something of your
own, based on a snapshot dependency, that might be broken if the dependency
developers later release an updated snapshot that is not compatible with
your own product?

The only reasonable alternatives are:

* Convince the developers of the packages you are dependent on to release a
  non-snapshot version of whatever you are depending on, so you can declare
  a dependency on that version.

* Remove the dependency on the snapshot code by providing equivalent
functionality
 in some other manner.

If you wish to choose a different approach, you are guaranteeing that people
like me will *never* *ever* trust any software you produce as being
something that can be depended on, because there is no way *you* can
guarantee that the snapshot based dependencies will not change in
incompatible ways after your release.

Note that this is not an issue specific to Maven -- it's all about how
serious you are about providing a stable (over time) ability to rebuild your
software from its sources.


 Thanks in advance,

  Harold Shinsato



Craig McClanahan


Re: How to answer to an archive post ?

2006-08-19 Thread Craig McClanahan

On 8/19/06, Eric Reboisson [EMAIL PROTECTED] wrote:


Hello,

I would like to reply to an old message that I found in the archives.

I know the ID of the message, but I don't know how to reply to it.



Best bet would be to create a new message, and then either quote the
question in line, or add a hyperlink to the archive version of the message.
Then, you can answer it.  It won't get linked as a response on that thread
(unless you still have it in your gmail archive and just reply to it
normally), but a text search on the appropriate keywords should find it.

Craig

Thanks for your help.


Best regards
Eric





Re: Profile Inheritance

2006-08-14 Thread Craig McClanahan

On 8/14/06, Douglas Ferguson [EMAIL PROTECTED] wrote:


Are profiles inherited from the parent pom?



They are supposed to be, although there have been a few issues here and
there.  The one that bit us (Shale project) was MNG-2221[1], which looks
like it's been fixed for 2.0.5 when that is released.

Craig

[1] http://jira.codehaus.org/browse/MNG-2221


__



Douglas W. Ferguson

EPSIIA - Another Fiserv Connection

Development

Office Phone: 512-329-0081 ext. 3309

Dial Toll Free: 800-415-5946

Mobile Phone: 512-293-7279

Fax: 512-329-0086

[EMAIL PROTECTED]

www.epsiia.com http://www.epsiia.com/

__







Re: Conflicting Dependency Version Dilemma

2006-08-09 Thread Craig McClanahan

On 8/8/06, Jörg Schaible [EMAIL PROTECTED] wrote:


Hi Craig,



Hello Jörg

Craig McClanahan wrote on Wednesday, August 09, 2006 6:58 AM:


  A project[1] that I'm a participant in is a recent convert to
  Maven2 as a build environment.  So far, there's a lot to like.  But ...
  I think I've run into a limitation of the current design related to
  resolving conflicts in dependency versions.  I'm looking for advice on
  what I can do other than wait for FINR (fixed in next rev) as the
  Better Builds book implies will be necessary :-).

  The starting point for my scenario is I wanted to explore whether
  Commons Logging 1.1 could successfully replace version 1.0.4 in a
  Shale-created webapp.  To see all the gory details, you're probably
  best off downloading the sources and trying the build yourself.  But
  the bottom line is that the only way I can convince a particular
  webapp build to use the new version of Commons Logging is to put the
  explicit dependence on 1.1 directly in the POM for that webapp.  This
  is *not* what I want -- I'd much prefer to inherit the Commons Logging
  version dependency from the top-level shale POM
  (org.apache.shale:shale-parent), or even from the intermediate layer
  I have as a basis for all the example webapps
  (org.apache.shale:shale-apps-parent).

If you have a dependencyManagement section in your top-level POM and add
the dep to your webapp without the version, it will work.



Agreed ... but the requirement to explicitly do this in every single webapp
is not what I want.  The whole point of my apps inheriting a parent POM is
to get rid of this sort of administrivia.
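
For reference, the dependencyManagement pattern under discussion looks
like this (artifact coordinates are shown purely for illustration):

```xml
<!-- parent POM (e.g. shale-parent): pin the version once -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- child webapp POM: no <version> element; the version is
     inherited from the parent's dependencyManagement section -->
<dependencies>
  <dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
  </dependency>
</dependencies>
```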


 Alas, this doesn't work.  Any dependency such as MyFaces that
  declares a dependency on Commons Logging 1.0.4 seems to win, because
  it is closer on the nearness algorithm described in Section 3.6 of
  the Better Builds book.

The point is that you *have to* declare the dep in the webapp, since the
algorithm for the nearest version would take another version from one of
your dependencies.



Why should I have to declare it in the app, when I've declared it at the
parent level?  What's the point of transitive dependencies if they do not
work the way you want, and there's no way to force them to do so?


 It would seem to me that the simplest way to deal with this is that
  inherited dependencies (from a parent POM) should be considered as
  being at the same level of the dependency graph, just as if they had
  been explicitly declared.  That would always allow a project to
  establish priority for shared dependencies itself, without having
  their builds destabilized because inheritance and dependence are both
  being treated as one step down the graph.  Am I missing something?
  Is there some way to accomplish what I want (with M2 2.0.4) in the
  meantime, without explicitly declaring this dependency in the leaf
  node artifact POMs?

Vote for it:
http://jira.codehaus.org/browse/MNG-1577



Done.

- Jörg


Craig


Re: Conflicting Dependency Version Dilemma

2006-08-09 Thread Craig McClanahan

On 8/8/06, Craig McClanahan [EMAIL PROTECTED] wrote:


Why should I have to declare it in the app, when I've declared it at the
parent level?  What's the point of transitive dependencies if they do not
work the way you want, and there's no way to force them to do so?



I should have been more clear.  It is not that transitive dependencies are
bad ... it's that I believe inherited dependencies (including version
dependencies described by dependencyManagement sections) should *always*
override transitive dependencies on the same artifact.

Alternatively, it would be reasonable to allow an override of whatever the
default behavior is for advanced cases ... but requiring me to define
version overrides in leaf nodes, simply because my inheritance hierarchy is
deeper than my dependence hierarchy, encourages bad build architecture
design and behavior -- and isn't it part of the point of Maven to eliminate
that kind of idiocy?  :-)

Craig


Re: Conflicting Dependency Version Dilemma

2006-08-09 Thread Craig McClanahan

On 8/9/06, Mike Perham [EMAIL PROTECTED] wrote:



Not exactly.  We have a single parent POM per ear and don't share war/ejbs
between ears, yes.  But you can still do what you want with a
dependencyManagement section in the parent to centralize the version
numbers and just have each ear POM pull in its required ejbs.  Or am I
missing something?



Defining dependencyManagement in the parent does not seem to cover all the
cases for me.  A simplified scenario goes like this:
* Parent POM declares a version for artifact A
* Child POM declares (or inherits) a dependency on artifact B
* Dependency B declares a dependency on artifact A with an explicit, but
different, version number.

It seems that the two dependencies on artifact A are at the same level (in
the terms of the book's section on this) and therefore the behavior is
undefined.  Indeed, it seems to work for most of the dependencies with
version conflicts, but not all of them ... and it is completely unclear to
me at this point why the cases are different.

mike


Craig


Jörg Schaible [EMAIL PROTECTED] wrote on 08/09/2006

11:46:10 AM:


 Unfortunately this pattern only helps if you build a single EAR.
 For multiple EARs (as we have) and a lot of EJBs (not exclusively
 referenced), you cannot define the dependencies in a common parent,
 since then you would end up with all dependencies of all EJBs in
 every EAR. So you use a dependencyManagement section (we do) and you're left
 with the already described problem - you cannot force transitive
 dependencies to the version defined in the management section.




Re: Conflicting Dependency Version Dilemma

2006-08-09 Thread Craig McClanahan

On 8/9/06, Mike Perham [EMAIL PROTECTED] wrote:






Again, this is because dependency B is not using version ranges but rather
forcing a specific version on its downstream dependents.



Ah, but dependency B is a third party library ... and the entire transitive
dependency tree for Shale (which is pretty small compared to lots of
multi-project environments) has at least 100 such dependencies.  Seems to me
like evangelizing all those folks to set up proper version ranges in their
POMs belongs to the Maven team, not to me, who's just an (indirect) Joe
User for all those packages :-).

If A requires a different version, it has to declare that dependency in
order to override the transitive dependency version from B.  There's
nothing Maven can do when the POM data is incorrect.  You need to exclude
it or override it.
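
The exclusion Mike mentions would look something like this (the MyFaces
coordinates and version here are assumptions for the sake of illustration):

```xml
<dependency>
  <groupId>org.apache.myfaces.core</groupId>
  <artifactId>myfaces-impl</artifactId>
  <version>1.1.3</version>
  <exclusions>
    <!-- keep this dependency's transitive commons-logging out of the
         graph entirely, so the locally declared (or inherited) version wins -->
    <exclusion>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```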



Yes, there is something Maven could do -- it could reflect the reality that
most POMs don't actually do this, and provide developers a workaround that
protects them from instabilities caused by the current rules.

It could define the version conflict resolution rules such that
dependencyManagement version declarations in *my* POMs (defined as the one
for this project or the explicitly declared parent tree) always win over
version declarations that come from explicit or transitive dependencies.
That way, I can declare in a single parent POM all my version dependency
information for *all* my leaf node projects (webapps in my case, but the
same principle applies to any environment where you are creating lots of
individual artifacts that share a common set of dependencies) without having
to tediously edit all of the leaf POMs individually.

Craig


Re: Conflicting Dependency Version Dilemna

2006-08-09 Thread Craig McClanahan

On 8/9/06, Mike Perham [EMAIL PROTECTED] wrote:



Again, this is because dependency B is not using version ranges but rather
forcing a specific version on its downstream dependents.




Thinking about this a little further, there is a QE viewpoint that leads me
to believe that declaring a specific version on a dependency is actually the
right thing to do.  Let's consider a concrete case:  the Shale core library
(org.apache.shale:shale-core) artifact depends on, among other things,
Commons BeanUtils.  In the POM for this module, I want to inherit from the
parent POM (org.apache.shale:shale-parent) a dependence on a particular
version of this library, so that I can update the dependencies of a bunch of
other artifacts all at once when I've satisfied myself that it works.

Currently, in the parent POM, the version (declared in a
dependencyManagement section) of BeanUtils that is requested is,
specifically, 1.7.0.  This is essentially an assertion that I have tested
my library against this specific version, and am satisfied that it works.

If I understand your recommendation correctly, you would like me to declare
my dependency on version [1.7.0,) instead (meaning version 1.7.0 or any
later version).  From a QE perspective, that is an untestable assertion --
there is no way to know that some future version of BeanUtils might
introduce some incompatible change that makes *my* library no longer work.
That is not acceptable to me as the supplier of a library, because it is
*me* who is going to suffer the you *said* it would work bug reports.

If an end user of my library wants to override my setting, they can
(although making them do it in every leaf node is definitely a violation of
the "don't repeat yourself" mantra that M2 seems to really like :-).  But I
want *my* POMs to advertise what *I* have tested, and not rely on all of my
dependencies not to break me with future versions.  I wouldn't even want to
trust my own modules enough to use ranges like that :-).
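
For readers unfamiliar with the notation, the two declaration styles
compare like this (a sketch using real Commons BeanUtils coordinates; in
Maven 2 a bare version is a "soft" requirement, while a range is a hard
constraint resolved at build time):

```xml
<!-- pinned: exactly the version that was tested -->
<dependency>
  <groupId>commons-beanutils</groupId>
  <artifactId>commons-beanutils</artifactId>
  <version>1.7.0</version>
</dependency>

<!-- range: version 1.7.0 or any later version -->
<dependency>
  <groupId>commons-beanutils</groupId>
  <artifactId>commons-beanutils</artifactId>
  <version>[1.7.0,)</version>
</dependency>
```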

Craig


Conflicting Dependency Version Dilemma

2006-08-08 Thread Craig McClanahan

A project[1] that I'm a participant in is a recent convert to Maven2 as a
build environment.  So far, there's a lot to like.  But ... I think I've run
into a limitation of the current design related to resolving conflicts in
dependency versions.  I'm looking for advice on what I can do other than
wait for FINR (fixed in next rev) as the Better Builds book implies will
be necessary :-).

The starting point for my scenario is I wanted to explore whether Commons
Logging 1.1 could successfully replace version 1.0.4 in a Shale-created
webapp.  To see all the gory details, you're probably best off downloading
the sources and trying the build yourself.  But the bottom line is that the
only way I can convince a particular webapp build to use the new version of
Commons Logging is to put the explicit dependence on 1.1 directly in the POM
for that webapp.  This is *not* what I want -- I'd much prefer to inherit
the Commons Logging version dependency from the top-level shale POM (
org.apache.shale:shale-parent), or even from the intermediate layer I have
as a basis for all the example webapps (org.apache.shale:shale-apps-parent).

Alas, this doesn't work.  Any dependency such as MyFaces that declares a
dependency on Commons Logging 1.0.4 seems to win, because it is closer on
the nearness algorithm described in Section 3.6 of the Better Builds
book.

It would seem to me that the simplest way to deal with this is that
inherited dependencies (from a parent POM) should be considered as being at
the same level of the dependency graph, just as if they had been explicitly
declared.  That would always allow a project to establish priority for
shared dependencies itself, without having their builds destabilized because
inheritance and dependence are both being treated as one step down the
graph.  Am I missing something?  Is there some way to accomplish what I want
(with M2 2.0.4) in the mean time, without explicitly declaring this
dependency in the leaf node artifact POMs?

Craig McClanahan

[1] http://shale.apache.org/