configuring one plugin from another
Hey, I'm trying to run the maven-clover-plugin to generate a code coverage report. It instruments my sources and copies them to target/clover/src and target/clover/generated-sources. A problem arises when the plugin invokes compiler:compile. Output looks like:

[INFO] [compiler:compile]
[DEBUG] Using compiler 'javac'.
[DEBUG] Source directories: [D:\myartifact\target\clover\src
 D:\myartifact\target\clover\generated-sources\groovy-stubs\main
 D:\myartifact\src\main\java]

I end up with lots of duplicate class errors because it tries to compile the instrumented sources along with the non-instrumented sources. Is there a way to control the compiler plugin to exclude /src/main/java? I am executing this plugin from the command line like 'mvn clover:instrument'. Thanks!
Re: configuring one plugin from another
I tried using a profile to alter the compile plugin's config just for certain executions, like:

<profiles>
  <profile>
    <id>clover</id>
    <build>
      <pluginManagement>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
              <excludes>
                <exclude>${basedir}/src/main/java</exclude>
              </excludes>
            </configuration>
          </plugin>
        </plugins>
      </pluginManagement>
    </build>
  </profile>
</profiles>

I can see the exclude is being sent through properly, but the compile plugin still includes it :S

[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-compiler-plugin:2.0.2:compile' --
[DEBUG] (f) basedir = D:\myartifact
[DEBUG] (f) buildDirectory = D:\myartifact\target\clover
[DEBUG] (f) classpathElements = ...
[DEBUG] (f) compileSourceRoots = [D:\myartifact\target\clover\src, D:\myartifact\target\clover\generated-sources\groovy-stubs\main, D:\myartifact\src\main\java]
[DEBUG] (f) compilerId = javac
[DEBUG] (f) debug = true
[DEBUG] (f) excludes = [D:\myartifact\src/main/java]
[DEBUG] (f) failOnError = true
[DEBUG] (f) fork = false
[DEBUG] (f) maxmem = 512
[DEBUG] (f) optimize = false
[DEBUG] (f) outputDirectory = D:\myartifact\target\clover\classes
[DEBUG] (f) outputFileName = ...
[DEBUG] (f) projectArtifact = ...
[DEBUG] (f) showDeprecation = true
[DEBUG] (f) showWarnings = true
[DEBUG] (f) source = 1.5
[DEBUG] (f) staleMillis = 0
[DEBUG] (f) target = 1.5
[DEBUG] (f) verbose = false
[DEBUG] -- end configuration --
[INFO] [compiler:compile]
[DEBUG] Using compiler 'javac'.
[DEBUG] Source directories: [D:\myartifact\target\clover\src
 D:\myartifact\target\clover\generated-sources\groovy-stubs\main
 D:\myartifact\src\main\java]
gmaven plugin vs. script classloading
I wrote a groovy plugin a while back, and am now trying to convert it to a simple script, as it is only used by one artifact and it makes more sense to keep it with the artifact in question rather than as a separate artifact. The artifact that uses the script, artifact A, depends on some classes in artifact B. I have B set up as a dependency of A, and indeed the script can find classes from B and use them. However, when something from B invokes a third-party library that tries to load a class from B, I get a java.lang.ClassNotFoundException:

at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at org.codehaus.classworlds.RealmClassLoader.loadClassDirect(RealmClassLoader.java:195)
at org.codehaus.classworlds.DefaultClassRealm.loadClass(DefaultClassRealm.java:255)
at org.codehaus.classworlds.DefaultClassRealm.loadClass(DefaultClassRealm.java:274)
at org.codehaus.classworlds.RealmClassLoader.loadClass(RealmClassLoader.java:214)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at org.apache.commons.digester.ObjectCreateRule.begin(ObjectCreateRule.java:204)

This is really strange because it doesn't happen when the script is instead a groovy plugin. There is no change other than that where I used to invoke a custom plugin, I invoke a groovy script. 50 points for whoever can figure this one out!
Re: gmaven plugin vs. script classloading
-decoration-model-1.0-alpha-10.jar,
file:/C:/mavenrepo_artifactory/org/apache/maven/doxia/doxia-module-apt/1.0-alpha-10/doxia-module-apt-1.0-alpha-10.jar,
file:/C:/mavenrepo_artifactory/org/apache/maven/doxia/doxia-module-fml/1.0-alpha-10/doxia-module-fml-1.0-alpha-10.jar,
file:/C:/mavenrepo_artifactory/org/apache/maven/doxia/doxia-module-xdoc/1.0-alpha-10/doxia-module-xdoc-1.0-alpha-10.jar,
file:/C:/mavenrepo_artifactory/org/apache/maven/doxia/doxia-module-xhtml/1.0-alpha-10/doxia-module-xhtml-1.0-alpha-10.jar]

Quite peculiar!
release:prepare performs source generation
I've noticed that when I execute release:prepare on one of my projects that generates sources, it performs the generation. I don't see why it would do this, as it does it again during release:perform. The plugin config in that artifact's POM looks like the following. Is there a way to prevent release:prepare from executing it?

<plugin>
  <groupId>com.mycomp</groupId>
  <artifactId>generator_plugin</artifactId>
  <version>${project.version}</version>
  <executions>
    <execution>
      <goals>
        <goal>gen</goal>
      </goals>
      <phase>generate-sources</phase>
    </execution>
  </executions>
</plugin>

Thanks, Kal.
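If the goal is just to keep release:prepare from running the generation, one knob that might help is the release plugin's preparationGoals parameter, which defaults to "clean verify" (and verify is what walks through generate-sources). An untested sketch:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <!-- default is "clean verify"; trimming it skips the full
         lifecycle (and hence generate-sources) during prepare -->
    <preparationGoals>clean</preparationGoals>
  </configuration>
</plugin>
```

The trade-off is that prepare then tags the release without a validating build, so a broken project would only surface during release:perform.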
Re: prevent javadoc during release
OK, thanks. I imagine to make this automatic I could put a release configuration in the relevant POM.

On Tue, Jul 15, 2008 at 2:41 AM, Magne Nordtveit [EMAIL PROTECTED] wrote:
On Mon, 2008-07-14 at 16:38 -0400, Kallin Nagelberg wrote:
> Is there a way to prevent javadoc from occurring during release? I would
> also like to prevent production of the source JAR. Any ideas?

When releasing, try setting the useReleaseProfile property to false:

mvn release:perform -DuseReleaseProfile=false

This will disable the attachment of javadoc and sources.

Regards,
Magne

--
Magne Nordtveit [EMAIL PROTECTED]
Systems Engineer
Offshore Simulator Centre AS
http://www.offsimcentre.no

To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
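The POM-side equivalent of the command line above might look like this (a sketch, assuming useReleaseProfile is also honoured as a plugin configuration parameter and not only as a -D property):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <!-- suppress the built-in release profile that attaches
         javadoc and source JARs during release:perform -->
    <useReleaseProfile>false</useReleaseProfile>
  </configuration>
</plugin>
```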
java.lang.InternalError: Stubbed method
One of my artifacts is using the gmaven plugin for both main and test sources. It was all working fine until recently, when the test phase began throwing the exception: java.lang.InternalError: Stubbed method. I have no idea what changed. It seems to be using the compiled stubs instead of the real groovy classes. Any ideas? I'm stumped!
Re: java.lang.InternalError: Stubbed method
Thanks Gert, but I just figured out the problem. I had the directory config as something like '${project.basedir}/src/test/java'. Just using 'src/test/java' fixed it!

On Tue, Jul 15, 2008 at 2:53 PM, wohlgemuth [EMAIL PROTECTED] wrote:
> did you execute a mvn clean to remove existing stubs?

--
gert wohlgemuth
blog: http://berlinguyinca.blogspot.com/
work: http://fiehnlab.ucdavis.edu/staff/wohlgemuth
release plugin duplication
One of my artifacts takes a long time to generate sources (around 5 minutes). I've noticed that when I use the release plugin to release this artifact, it performs that generation at least 3 times. I am using the maven-source-plugin across all my projects. Maybe I'll disable that. Any other ideas?
prevent javadoc during release
Is there a way to prevent javadoc from occurring during release? I would also like to prevent production of the source JAR. Any ideas?
Re: release plugin duplication
Yes, I appear to be plagued with duplication.

On Mon, Jul 14, 2008 at 5:35 PM, Michael McCallum [EMAIL PROTECTED] wrote:
> On Tue, 15 Jul 2008 07:05:52 Kallin Nagelberg wrote:
> that's hilarious... did anyone else get three copies of that email ;-)

--
Michael McCallum
Enterprise Engineer
mailto:[EMAIL PROTECTED]
Accessing artifact Final name in assembly plugin.
Hi, I'm using a dependency set in an assembly of mine with the following config:

<dependencySets>
  <dependencySet>
    <useProjectArtifact>false</useProjectArtifact>
    <outputDirectory>/</outputDirectory>
    <includes>
      <include>group:artifact1</include>
      <include>group:artifact1</include>
      <include>group:artifact1</include>
    </includes>
  </dependencySet>
</dependencySets>

I would like the artifacts to have the output name of whatever their 'finalName' was when they were built. I understand the default output file name mapping is governed by the following:

<outputFileNameMapping>${artifact.artifactId}-${artifact.version}${dashClassifier?}.${artifact.extension}</outputFileNameMapping>

Looking at the Artifact API, I don't see a way to access the POM of that artifact, or the finalName directly. I guess what I can do is create a new dependency set for each one and declare the output name thrice, but that doesn't seem so clean. Any ideas?
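The "declare it thrice" fallback mentioned above might look like this (a sketch with hypothetical names; each dependencySet narrows to a single artifact and hard-codes the file name that artifact's finalName produced):

```xml
<dependencySets>
  <dependencySet>
    <useProjectArtifact>false</useProjectArtifact>
    <outputDirectory>/</outputDirectory>
    <includes>
      <include>group:artifact1</include>
    </includes>
    <!-- hard-coded to match that artifact's build finalName -->
    <outputFileNameMapping>my-final-name.jar</outputFileNameMapping>
  </dependencySet>
  <!-- ...one more dependencySet per artifact, each with its own mapping -->
</dependencySets>
```

Clunky, as noted, since the name has to be kept in sync with the other project's POM by hand.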
Re: assembly plugin includes duplicates
I noticed that archive file resolution is mentioned here: http://maven.apache.org/plugins/maven-assembly-plugin/advanced-descriptor-topics.html It says that when files/filesets have conflicting sources, only one will be copied. It seems this isn't true for dependencies, though. Is there a way to make the same rules apply to dependencies?
gmaven classloading issues
I'm encountering some really strange classloading issues with the gmaven plugin. I am executing it to generate some sources for a project of mine, and the script I wrote calls some classes in a dependency. I get all sorts of ClassNotFoundException errors. I debugged it, and found that the parent classloader had my dependencies, but not the current one. Does anyone know a way to fix this?
release plugin without SCM
Is there a way to use the release plugin without invoking the SCM activities? (I'm using SVN.) The release plugin doesn't seem to be compatible with my project's branching strategy, for a number of reasons:

1) It seems to always want to release from the trunk. Sometimes we need to release from branches.
2) Releasing a non-module nested component doesn't work. Say I have a project in myproject/reports, and I just want to release reports. I'm getting the error 'myproject/tags/reports-1.0' doesn't exist.

I would be happy just using the deploy goal directly, but I can't because I need to change the version of the project throughout multiple POMs. Any ideas?
Re: release plugin without SCM
Maybe I should clarify. My project structure is like this:

myproject
+--module1
+--module2
+--nonmodule1

so the SVN path is something like http://svnserver/myproject/trunk. I happen to be working on a branch now, so I checked out from http://svnserver/myproject/branches/branch1. How would I go about releasing nonmodule1 off my branch?
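One setup that is sometimes suggested for releasing off a branch (a sketch under assumptions: that nonmodule1's scm section points at the branch rather than trunk, and that the release plugin's tagBase parameter controls where the SVN tag is created):

```xml
<!-- in nonmodule1's POM -->
<scm>
  <connection>scm:svn:http://svnserver/myproject/branches/branch1/nonmodule1</connection>
  <developerConnection>scm:svn:http://svnserver/myproject/branches/branch1/nonmodule1</developerConnection>
</scm>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-release-plugin</artifactId>
      <configuration>
        <!-- tags would otherwise be derived relative to the scm url -->
        <tagBase>http://svnserver/myproject/tags</tagBase>
      </configuration>
    </plugin>
  </plugins>
</build>
```

Whether this resolves the 'myproject/tags/reports-1.0' doesn't exist error would need testing against the actual repository layout.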
a simple zip assembly
Hi, I'm trying to build a simple distribution with the following layout:

+
|---bin
|---libs

There are some scripts in the bin folder that build a classpath using the lib folder. It should be distributed as a ZIP. One caveat: initially it is built as a template. That is, there is a core artifact with all the basic libs/scripts required. Sub-projects may wish to add additional libraries/scripts. I've got some of this working using the assembly plugin, but I'm not sure that's the right approach. Does anyone have suggestions as to how this could be done smoothly? I noticed that for WARs it is easy to do this sort of templating due to the support of 'overlays'.
multiple WAR overlays
I have an artifact which uses 2 wars as an overlay. I wonder if there's a way to choose which web.xml to use, or in general, which one should take priority in the case of conflicts.
Re: multiple WAR overlays
It seems like this should do the trick: http://maven.apache.org/plugins/maven-war-plugin/overlays.html
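Based on that page, the configuration might look something like this (a sketch with hypothetical coordinates; if I read the docs correctly, overlays are applied in list order and the first one to supply a file wins, so the WAR whose web.xml should take priority goes first):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <overlays>
      <!-- the current build itself; its files beat everything below -->
      <overlay/>
      <!-- this WAR's files (including web.xml) beat the one after it -->
      <overlay>
        <groupId>com.example</groupId>
        <artifactId>war-with-preferred-web-xml</artifactId>
      </overlay>
      <overlay>
        <groupId>com.example</groupId>
        <artifactId>other-war</artifactId>
      </overlay>
    </overlays>
  </configuration>
</plugin>
```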
Re: a simple zip assembly
One problem I'm having is obtaining the initial template ZIP. I managed to deploy the template artifact, which is of type JAR. I don't care about the JAR, but it also deploys my ZIP. Apparently 'zip' isn't a valid package type. Now I want to unpack that to use in another project. I want to use it as an overlay the same way WARs work, but that doesn't seem like an option here, so I'm using the dependency plugin. It only grabs the JAR from the template though, not the assembled ZIP. Any ideas?
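One way to grab the attached ZIP rather than the main JAR might be dependency:unpack with an explicit artifactItem, where type selects the ZIP. A sketch with hypothetical coordinates:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-template</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>core-bl-template</artifactId>
            <version>1.0</version>
            <!-- pick the assembled ZIP, not the main JAR -->
            <type>zip</type>
            <outputDirectory>${project.build.directory}/template-unpack</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```

If the ZIP was attached with a classifier by the assembly plugin, a matching classifier element would also be needed on the artifactItem.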
assembly plugin includes duplicates
I have an assembly descriptor that looks like the following. There are duplicate libraries in 'target/dependencies' and 'target/core-bl-template/unpack/lib'. This results in duplicate JARs being zipped up at xxx.zip/lib. Is there a way to prevent this?

<assembly>
  <id>bin</id>
  <formats>
    <format>zip</format>
  </formats>
  <fileSets>
    <fileSet>
      <directory>target/dependencies</directory>
      <outputDirectory>lib</outputDirectory>
      <includes>
        <include>*.jar</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>target/core-bl-template-unpack</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>
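One workaround that might sidestep the duplicates (a sketch; it assumes the template's libraries all sit under lib/ inside the unpack directory, so they can be excluded in favour of the copies coming from target/dependencies):

```xml
<fileSet>
  <directory>target/core-bl-template-unpack</directory>
  <outputDirectory>/</outputDirectory>
  <excludes>
    <!-- skip the template's copies; target/dependencies supplies them -->
    <exclude>lib/*.jar</exclude>
  </excludes>
</fileSet>
```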
ignore missing web.xml war plugin
I'm trying to build a template WAR (overlay, I guess) to be used in other projects. Every other project will supply its own web.xml, so I don't need one in the template. Is there a way to configure the war plugin so that it doesn't error out on a missing web.xml?
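The usual knob for this is the war plugin's failOnMissingWebXml parameter (a sketch, assuming a plugin version that supports it):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <!-- allow packaging a WAR that has no web.xml of its own -->
    <failOnMissingWebXml>false</failOnMissingWebXml>
  </configuration>
</plugin>
```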
Re: maintaining versions across multi-module project
So I guess the bottom line is 'use the release plugin'. If that's the only effective way to accomplish it, I wonder why there is anything associated with the deploy phase by default.

On Thu, Jul 3, 2008 at 4:09 AM, Stephen Connolly [EMAIL PROTECTED] wrote:
FYI, you are not allowed to use ${blah} in the project/parent/version or project/version tags. There are bugs in Maven that mean it won't blow up in obvious ways if you do use ${blah} in those two tags... but it will blow up when you least expect it. You can use ${blah} in:

project/dependencies/dependency/version
project/dependencyManagement/dependencies/dependency/version
project/build/plugins/plugin/version
project/build/pluginManagement/plugins/plugin/version
project/build/plugins/plugin/dependencies/dependency/version
project/build/pluginManagement/plugins/plugin/dependencies/dependency/version
Re: maintaining versions across multi-module project
So what do you use for the version label in the root POM and the 'parent' tags of all the modules? For inter-module dependencies ${project.version} seems to work.

On Wed, Jul 2, 2008 at 2:52 AM, Martin Höller [EMAIL PROTECTED] wrote:
On Monday 30 June 2008 Kallin Nagelberg wrote:
> Can anyone tell me what is the best/simplest way to maintain a version
> number across all the poms in a multi-module project? They are all to be
> deployed with the same version every time.

Use the maven-release-plugin to release your project. The plugin will take care to replace all version occurrences in any pom.xml file.

hth,
- martin
configuring a plugin while preventing its execution
I'm trying to create a multi-module project that uses the same plugin for source generation. To make it as simple as possible, I have a parent POM that contains the configuration of the plugin inside a build/plugins/plugin element. My problem is that the plugin is being executed when the parent is built, even though it is of type 'pom', and I only want to use it to aggregate the modules and define some standard config. Is there a way to configure a plugin and prevent its execution?
obtaining modules as dependencies
I have a multi-module project with a superpom of type 'pom'. I was wondering how I can obtain all the modules as dependencies in another project. I thought I could just declare a dependency on the superpom, but that doesn't work.
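For reference, a pom-type dependency is the mechanism usually reached for here (hypothetical coordinates below), but it only pulls in the dependencies that the POM itself declares, not its modules. So it works only if the superpom, or a separate aggregator POM, lists each module explicitly as a dependency:

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>superpom</artifactId>
  <version>1.0</version>
  <!-- resolves the POM's declared dependencies transitively;
       its <modules> are a build-time concept and are NOT pulled in -->
  <type>pom</type>
</dependency>
```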
Re: configuring a plugin while preventing its execution
great, thanks for the help

On Wed, Jul 2, 2008 at 2:21 PM, Dennis Lundberg [EMAIL PROTECTED] wrote:
You can put your plugin configuration in a pluginManagement [1] element inside the build element:

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        ...

This only configures the plugins, but doesn't execute them.

[1] http://maven.apache.org/ref/current/maven-model/maven.html#class_pluginManagement

--
Dennis Lundberg
Re: configuring a plugin while preventing its execution
I've noticed that when using pluginManagement, the plugin is not associated with lifecycle events by default in the children. Is there a way to make this happen so that I don't have to declare the plugin in every child?
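A sketch of the usual pattern, reusing the generator_plugin coordinates from the earlier thread: the parent centralizes the version, configuration, and execution bindings under pluginManagement, and each child only names the plugin, which activates the inherited executions. (As far as I know there is no way in Maven 2 to skip even that minimal child declaration without triggering the plugin in the parent itself.)

```xml
<!-- parent POM (packaging 'pom') -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>com.mycomp</groupId>
        <artifactId>generator_plugin</artifactId>
        <version>1.0</version>
        <executions>
          <execution>
            <phase>generate-sources</phase>
            <goals>
              <goal>gen</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

<!-- each child POM: naming the plugin pulls in the managed config -->
<build>
  <plugins>
    <plugin>
      <groupId>com.mycomp</groupId>
      <artifactId>generator_plugin</artifactId>
    </plugin>
  </plugins>
</build>
```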
maintaining versions across multi-module project
Can anyone tell me what is the best/simplest way to maintain a version number across all the poms in a multi-module project? They are all to be deployed with the same version every time. Ideally, the version could be declared in one place (the superpom), and all children would know to use it. I've tried doing something like this in the root POM:

<properties>
  <parent.groupId>com.something</parent.groupId>
  <parent.artifactId>something_parent</parent.artifactId>
  <parent.version>2.0</parent.version>
</properties>

and then, in the root POM as well:

<groupId>${parent.groupId}</groupId>
<artifactId>${parent.artifactId}</artifactId>
<version>${parent.version}</version>

Then I have my modules below set their parent to be:

<groupId>${parent.groupId}</groupId>
<artifactId>${parent.artifactId}</artifactId>
<version>${parent.version}</version>

This sort of works, but not great. What's the best way to do something like this?
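For comparison, the conventional layout keeps a literal version in the parent and in each module's parent block (the release plugin rewrites both in one go), using ${project.version} only for inter-module dependencies. A sketch with hypothetical coordinates:

```xml
<!-- root pom.xml -->
<groupId>com.something</groupId>
<artifactId>something_parent</artifactId>
<version>2.0</version>
<packaging>pom</packaging>

<!-- each module's pom.xml -->
<parent>
  <groupId>com.something</groupId>
  <artifactId>something_parent</artifactId>
  <!-- literal: ${...} is not reliable in parent/version -->
  <version>2.0</version>
</parent>

<!-- inter-module dependency inside a module -->
<dependency>
  <groupId>com.something</groupId>
  <artifactId>sibling-module</artifactId>
  <version>${project.version}</version>
</dependency>
```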
deploy uploading with SCP freezes
I'm finding it impossible to upload files to a remote repository using the deploy plugin. I've set up a tunnel to the server and have all authentication set up properly. I can ssh/scp through the tunnel with no problems, and the maven deploy plugin will go so far as to create the directories required for deployment of the new artifact. However, when it gets to the point where it needs to upload a file, it just freezes forever, with no error message. Has anyone encountered something like the following before?

$ mvn deploy
[INFO] Scanning for projects...
[INFO] Building Something
[INFO]    task-segment: [deploy]
[INFO] [resources:resources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:compile]
[INFO] No sources to compile
[INFO] [resources:testResources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:testCompile]
[INFO] No sources to compile
[INFO] [surefire:test]
[INFO] No tests to run.
[INFO] [jar:jar]
[INFO] [install:install]
[INFO] Installing blahblahblah.jar
[INFO] [deploy:deploy]
altDeploymentRepository = null
[INFO] Retrieving previous build number from dev
[INFO] repository metadata for: 'snapshot group:artifact:1.0-SNAPSHOT' could not be found on repository: somerepo, so will be created
Uploading: scpexe://localhost:9006/blahblahblah.jar
Re: deploy uploading with SCP freezes
Originally I was trying this on Cygwin, with scp and ssh. I switched to a Windows command prompt, with plink and pscp, and it works! Magic!
Re: deploy uploading with SCP freezes
This is quite bizarre. I can deploy fine from the command shell now, but in Cygwin it continues to freeze while uploading. Has anyone experienced these problems with Cygwin? It's a little troublesome, as many of my build scripts are bash scripts, and so I sort of need Cygwin to use them.
Re: Re: deployment through tunnel
It doesn't seem to have any problem executing scp. There seems to be a problem with the deploy plugin, as it's generating a permission problem even though I have no problem connecting with scp and ssh manually.

2008/3/16 Cody Zhang [EMAIL PROTECTED]:
Copy scpexe to windows path??

-----Original Message-----
From: Kallin Nagelberg [mailto:[EMAIL PROTECTED]]
Sent: March 17, 2008 3:47
To: Maven Users List
Subject: Re: deployment through tunnel

I should also note that I can successfully SSH with: ssh -p 9000 [EMAIL PROTECTED] So I can ssh and scp manually just fine.
- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
Re: deployment through tunnel
Cool, thanks, I'll try that. I don't know how to use ssh-add yet, but I'm sure I can find something online.

On Mon, Mar 17, 2008 at 7:33 AM, Jan Torben Heuer [EMAIL PROTECTED] wrote:

Uploading: scpexe://localhost:9006/home/cm/maven/dev/..etcetc
[ERROR] BUILD ERROR
[INFO] Error deploying artifact: Error executing command for transfer
Exit code 255 - Permission denied (publickey,keyboard-interactive).

You need a public key without a password (or you have to unlock it beforehand with ssh-add). A password prompt is unsupported. Jan
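For anyone hitting the same wall: the external ssh wagon can't answer an interactive passphrase prompt, so the key has to be usable non-interactively. A sketch of the two options Jan mentions, assuming an OpenSSH key at ~/.ssh/id_rsa (the path is an assumption):

```shell
# Option 1: strip the passphrase from the key entirely
# (prompts for the old passphrase, then accept an empty new one)
ssh-keygen -p -f ~/.ssh/id_rsa

# Option 2: keep the passphrase but unlock the key in an agent
# before running the deploy, so no prompt is needed
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa

mvn deploy
```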
deployment through tunnel
I've been struggling forever trying to use the deploy plugin through a tunnel into my office. There is a repository, call it maven.int.office.com. I have a tunnel from localhost:9000 to maven.int.office.com:22. Using the following command I can SCP files to it at will:

scp -P9000 somefile.txt [EMAIL PROTECTED]:.

The distribution management section of my pom looks like this:

<distributionManagement>
  <repository>
    <id>dev</id>
    <name>Repository</name>
    <url>scpexe://localhost:9000/home/cm/maven/dev</url>
  </repository>
</distributionManagement>

And my settings file has the following:

<servers>
  <server>
    <id>dev</id>
    <username>user</username>
    <passphrase>password</passphrase>
    <privateKey>myPuttyKey</privateKey>
    <!-- I SHOULDN'T NEED THIS AS IT'S USED IN THE TUNNEL -->
    <configuration>
      <sshExecutable>ssh</sshExecutable>
      <scpExecutable>scp</scpExecutable>
    </configuration>
  </server>
</servers>

All I get when I run mvn deploy is:

Uploading: scpexe://localhost:9006/home/cm/maven/dev/..etcetc
[ERROR] BUILD ERROR
[INFO] Error deploying artifact: Error executing command for transfer
Exit code 255 - Permission denied (publickey,keyboard-interactive).

I should note that when I run SCP directly, I am prompted for a password. When I run mvn deploy, I receive no prompt, just the error message. Is there a way to prevent mvn from trying to use a key file? I'm so stuck, I've tried everything. Please help!
Re: deployment through tunnel
I should also note that I can successfully SSH with ssh -p 9000 [EMAIL PROTECTED], so I can ssh and scp manually just fine.

On Sun, Mar 16, 2008 at 3:37 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote:

I've been struggling forever trying to use the deploy plugin through a tunnel into my office. There is a repository, call it maven.int.office.com. I have a tunnel from localhost:9000 to maven.int.office.com:22. Using the following command I can SCP files to it at will:

scp -P9000 somefile.txt [EMAIL PROTECTED]:.

The distribution management section of my pom looks like this:

<distributionManagement>
  <repository>
    <id>dev</id>
    <name>Repository</name>
    <url>scpexe://localhost:9000/home/cm/maven/dev</url>
  </repository>
</distributionManagement>

And my settings file has the following:

<servers>
  <server>
    <id>dev</id>
    <username>user</username>
    <passphrase>password</passphrase>
    <privateKey>myPuttyKey</privateKey>
    <!-- I SHOULDN'T NEED THIS AS IT'S USED IN THE TUNNEL -->
    <configuration>
      <sshExecutable>ssh</sshExecutable>
      <scpExecutable>scp</scpExecutable>
    </configuration>
  </server>
</servers>

All I get when I run mvn deploy is:

Uploading: scpexe://localhost:9006/home/cm/maven/dev/..etcetc
[ERROR] BUILD ERROR
[INFO] Error deploying artifact: Error executing command for transfer
Exit code 255 - Permission denied (publickey,keyboard-interactive).

I should note that when I run SCP directly, I am prompted for a password. When I run mvn deploy, I receive no prompt, just the error message. Is there a way to prevent mvn from trying to use a key file? I'm so stuck, I've tried everything. Please help!
Re: deployment through tunnel
Sorry, just a typo there; it's 9000 everywhere.

On Sun, Mar 16, 2008 at 4:24 PM, Wayne Fay [EMAIL PROTECTED] wrote:

Just looking at it, and not knowing anything about it at all, I notice you have port 9000 most places, but then Maven errors out with port 9006. So that might be your problem (?). Wayne

On 3/16/08, Kallin Nagelberg [EMAIL PROTECTED] wrote: I should also note that I can successfully SSH with ssh -p 9000 [EMAIL PROTECTED], so I can ssh and scp manually just fine.

On Sun, Mar 16, 2008 at 3:37 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote:

I've been struggling forever trying to use the deploy plugin through a tunnel into my office. There is a repository, call it maven.int.office.com. I have a tunnel from localhost:9000 to maven.int.office.com:22. Using the following command I can SCP files to it at will:

scp -P9000 somefile.txt [EMAIL PROTECTED]:.

The distribution management section of my pom looks like this:

<distributionManagement>
  <repository>
    <id>dev</id>
    <name>Repository</name>
    <url>scpexe://localhost:9000/home/cm/maven/dev</url>
  </repository>
</distributionManagement>

And my settings file has the following:

<servers>
  <server>
    <id>dev</id>
    <username>user</username>
    <passphrase>password</passphrase>
    <privateKey>myPuttyKey</privateKey>
    <!-- I SHOULDN'T NEED THIS AS IT'S USED IN THE TUNNEL -->
    <configuration>
      <sshExecutable>ssh</sshExecutable>
      <scpExecutable>scp</scpExecutable>
    </configuration>
  </server>
</servers>

All I get when I run mvn deploy is:

Uploading: scpexe://localhost:9006/home/cm/maven/dev/..etcetc
[ERROR] BUILD ERROR
[INFO] Error deploying artifact: Error executing command for transfer
Exit code 255 - Permission denied (publickey,keyboard-interactive).

I should note that when I run SCP directly, I am prompted for a password. When I run mvn deploy, I receive no prompt, just the error message. Is there a way to prevent mvn from trying to use a key file? I'm so stuck, I've tried everything. Please help!
details missing in stack trace
I'm using the Maven Ant tasks in Groovy to try to create a dependency fileset, something like:

mvn.dependencies(filesetId: 'myfileset') {
    myArtifacts.each {
        dependency(groupId: it.groupId, artifactId: it.artifactId, version: it.version)
    }
}

It works when all the artifacts are valid, but dies with a pretty unhelpful stack trace when one isn't. I see something like:

[INFO] [ERROR] FATAL ERROR
[INFO] java.lang.ClassCastException: org.apache.maven.usability.MojoFailureExceptionDiagnoser
[INFO] Trace:
java.lang.ClassCastException: org.apache.maven.usability.MojoFailureExceptionDiagnoser
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:115)
    at org.apache.tools.ant.Task.perform(Task.java:348)
    at groovy.util.AntBuilder.nodeCompleted(AntBuilder.java:178)
    at groovy.util.BuilderSupport.doInvokeMethod(BuilderSupport.java:153)
    at groovy.util.BuilderSupport.invokeMethod(BuilderSupport.java:64)
    at org.codehaus.groovy.runtime.InvokerHelper.invokePogoMethod(InvokerHelper.java:777)
    at org.codehaus.groovy.runtime.InvokerHelper.invokeMethod(InvokerHelper.java:753)
    at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodN(ScriptBytecodeAdapter.java:167)
    at com.novator.mojo.ArtifactResourcesProcessor.unpackArtifacts(ArtifactResourcesProcessor.groovy:84)
    at com.novator.mojo.ArtifactResourcesProcessor.this$2$unpackArtifacts(ArtifactResourcesProcessor.groovy)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:95)
    at org.codehaus.groovy.runtime.MetaClassHelper.doMethodInvoke(MetaClassHelper.java:599)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:904)
    at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnCurrentN(ScriptBytecodeAdapter.java:77)

but there is no indication of what exactly is wrong. Does anyone know of a way to get more useful information from these tasks when an error occurs?
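Not a full answer, but Maven's own error flags are worth trying before digging further; both are standard Maven 2 command-line options:

```shell
# -e prints full exception stack traces instead of the one-line summary
mvn -e <goal>

# -X enables full debug output, which often includes the underlying
# cause that the error diagnoser swallowed
mvn -X <goal>
```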
Re: including content into apt files
Ok, so I added a dependency on site plugin 2.0-beta-6 and now I can use the file parameter like you suggested. However, the generated HTML is not including the file like I expect. Instead it is producing the following:

<a name="snippet|id=myid|file=src/main/resources/help/basichelp.txt">snippet|id=myid|file=src/main/resources/help/basichelp.txt</a>

given the apt:

%{snippet|id=myid|file=src/main/resources/help/basichelp.txt}

On Feb 4, 2008 8:55 AM, Lukas Theussl [EMAIL PROTECTED] wrote:

Which doxia version are you using? The file parameter was added in doxia-1.0-alpha-9 (i.e. site-plugin 2.0-beta-6). -Lukas

Kallin Nagelberg wrote: Thanks Lukas, you know I tried that, but I keep getting the error that 'url' is a required parameter :S

On Feb 4, 2008 5:32 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Use the file parameter instead of url, e.g.:

%{snippet|id=myid|file=src/main/resources/help/basichelp.txt}

HTH, -Lukas

Kallin Nagelberg wrote: Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work: %{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt} Any ideas would be appreciated!
Re: including content into apt files
I found out that configuring the dependency wasn't enough; I had to explicitly select the latest site plugin in a plugin tag. That said, there are still some issues:

1. Nothing is actually included; there is just an empty div when I use:

%{snippet|id=myid|file=src/main/resources/help/basichelp.txt}

There definitely is a file at src/main/resources/help/basichelp.txt with some content.

2. If I don't put that snippet on the first line of the file then it is treated as text and not interpreted.

On Feb 4, 2008 9:49 AM, Lukas Theussl [EMAIL PROTECTED] wrote:

Please attach a small test project to JIRA (http://jira.codehaus.org/browse/DOXIA); it works for me. Just to be sure: the macro line is not indented in your apt source, right? (That would explain why the line is interpreted as an anchor.) -Lukas

Kallin Nagelberg wrote: Ok, so I added a dependency on site plugin 2.0-beta-6 and now I can use the file parameter like you suggested. However, the generated HTML is not including the file like I expect. Instead it is producing <a name="snippet|id=myid|file=src/main/resources/help/basichelp.txt">snippet|id=myid|file=src/main/resources/help/basichelp.txt</a> given the apt %{snippet|id=myid|file=src/main/resources/help/basichelp.txt}.

On Feb 4, 2008 8:55 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Which doxia version are you using? The file parameter was added in doxia-1.0-alpha-9 (i.e. site-plugin 2.0-beta-6). -Lukas

Kallin Nagelberg wrote: Thanks Lukas, you know I tried that, but I keep getting the error that 'url' is a required parameter :S

On Feb 4, 2008 5:32 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Use the file parameter instead of url, e.g. %{snippet|id=myid|file=src/main/resources/help/basichelp.txt} HTH, -Lukas

Kallin Nagelberg wrote: Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work: %{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt} Any ideas would be appreciated!
Re: including content into apt files
Thanks Lukas, you know I tried that, but I keep getting the error that 'url' is a required parameter :S

On Feb 4, 2008 5:32 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Use the file parameter instead of url, e.g.:

%{snippet|id=myid|file=src/main/resources/help/basichelp.txt}

HTH, -Lukas

Kallin Nagelberg wrote: Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work: %{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt} Any ideas would be appreciated!
Re: including content into apt files
Ok, thanks for all the help. So, currently, is it impossible to have multiple includes throughout an apt file?

On Feb 4, 2008 10:27 AM, Lukas Theussl [EMAIL PROTECTED] wrote:

Kallin Nagelberg wrote: I found out that configuring the dependency wasn't enough; I had to explicitly select the latest site plugin in a plugin tag. That said, there are still some issues: 1. Nothing is actually included; there is just an empty div when I use %{snippet|id=myid|file=src/main/resources/help/basichelp.txt}. There definitely is a file at src/main/resources/help/basichelp.txt with some content.

And this file contains START SNIPPET: myid and END SNIPPET: myid as demarcators of the snippet you want to include? See http://maven.apache.org/doxia/macros/index.html

2. If I don't put that snippet on the first line of the file then it is treated as text and not interpreted.

IMO that's the expected behavior (even though I don't see where it's documented right now), as otherwise it would not be possible to distinguish macros from anchors in apt files. HTH, -Lukas

On Feb 4, 2008 9:49 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Please attach a small test project to JIRA (http://jira.codehaus.org/browse/DOXIA); it works for me. Just to be sure: the macro line is not indented in your apt source, right? (That would explain why the line is interpreted as an anchor.) -Lukas

Kallin Nagelberg wrote: Ok, so I added a dependency on site plugin 2.0-beta-6 and now I can use the file parameter like you suggested. However, the generated HTML is not including the file like I expect. Instead it is producing <a name="snippet|id=myid|file=src/main/resources/help/basichelp.txt">snippet|id=myid|file=src/main/resources/help/basichelp.txt</a> given the apt %{snippet|id=myid|file=src/main/resources/help/basichelp.txt}.

On Feb 4, 2008 8:55 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Which doxia version are you using? The file parameter was added in doxia-1.0-alpha-9 (i.e. site-plugin 2.0-beta-6). -Lukas

Kallin Nagelberg wrote: Thanks Lukas, you know I tried that, but I keep getting the error that 'url' is a required parameter :S

On Feb 4, 2008 5:32 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Use the file parameter instead of url, e.g. %{snippet|id=myid|file=src/main/resources/help/basichelp.txt} HTH, -Lukas

Kallin Nagelberg wrote: Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work: %{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt} Any ideas would be appreciated!
Re: including content into apt files
Oh, I thought when you said:

2. If I don't put that snippet on the first line of the file then it is treated as text and not interpreted. IMO that's the expected behavior (even though I don't see where it's documented right now), as otherwise it would not be possible to distinguish macros from anchors in apt files.

you meant that the snippet would only work on the first line.

On Feb 4, 2008 10:52 AM, Lukas Theussl [EMAIL PROTECTED] wrote:

You can have as many as you want if you distinguish them with a unique id; see an example here: http://svn.apache.org/viewvc/maven/doxia/site/src/site/apt/book/index.apt?revision=574085&view=markup HTH, -Lukas

Kallin Nagelberg wrote: Ok, thanks for all the help. So, currently, is it impossible to have multiple includes throughout an apt file?

On Feb 4, 2008 10:27 AM, Lukas Theussl [EMAIL PROTECTED] wrote:

Kallin Nagelberg wrote: I found out that configuring the dependency wasn't enough; I had to explicitly select the latest site plugin in a plugin tag. That said, there are still some issues: 1. Nothing is actually included; there is just an empty div when I use %{snippet|id=myid|file=src/main/resources/help/basichelp.txt}. There definitely is a file at src/main/resources/help/basichelp.txt with some content.

And this file contains START SNIPPET: myid and END SNIPPET: myid as demarcators of the snippet you want to include? See http://maven.apache.org/doxia/macros/index.html

2. If I don't put that snippet on the first line of the file then it is treated as text and not interpreted.

IMO that's the expected behavior (even though I don't see where it's documented right now), as otherwise it would not be possible to distinguish macros from anchors in apt files. HTH, -Lukas

On Feb 4, 2008 9:49 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Please attach a small test project to JIRA (http://jira.codehaus.org/browse/DOXIA); it works for me. Just to be sure: the macro line is not indented in your apt source, right? (That would explain why the line is interpreted as an anchor.) -Lukas

Kallin Nagelberg wrote: Ok, so I added a dependency on site plugin 2.0-beta-6 and now I can use the file parameter like you suggested. However, the generated HTML is not including the file like I expect. Instead it is producing <a name="snippet|id=myid|file=src/main/resources/help/basichelp.txt">snippet|id=myid|file=src/main/resources/help/basichelp.txt</a> given the apt %{snippet|id=myid|file=src/main/resources/help/basichelp.txt}.

On Feb 4, 2008 8:55 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Which doxia version are you using? The file parameter was added in doxia-1.0-alpha-9 (i.e. site-plugin 2.0-beta-6). -Lukas

Kallin Nagelberg wrote: Thanks Lukas, you know I tried that, but I keep getting the error that 'url' is a required parameter :S

On Feb 4, 2008 5:32 AM, Lukas Theussl [EMAIL PROTECTED] wrote: Use the file parameter instead of url, e.g. %{snippet|id=myid|file=src/main/resources/help/basichelp.txt} HTH, -Lukas

Kallin Nagelberg wrote: Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work: %{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt} Any ideas would be appreciated!
including content into apt files
Hi Everyone, I've been tasked with creating some .apt documentation for a new Maven-built project. Ideally, I'd like to have a lot of the site documentation come from files in my src/main/resources directory. I've read about using the snippet macro, but I can't get it to access the artifact's resources. I've tried something like this but it doesn't work:

%{snippet|id=myid|url=file:///./src/main/resources/help/basichelp.txt}

Any ideas would be appreciated!
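For what it's worth, the snippet macro pulls a region delimited by START SNIPPET / END SNIPPET markers out of the referenced file (see http://maven.apache.org/doxia/macros/index.html). A minimal sketch, reusing the id and path from this thread; the marker comment style is illustrative, since the markers are matched as plain text:

```text
In src/main/resources/help/basichelp.txt:

// START SNIPPET: myid
Some help content to be pulled into the site.
// END SNIPPET: myid

In the .apt file (the macro line must start at column one, unindented):

%{snippet|id=myid|file=src/main/resources/help/basichelp.txt}
```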
Re: unable to add custom ant tasks to scripts
Anyone?

On Jan 5, 2008 6:44 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote:

I'm trying to use the groovy-maven-plugin to execute some Groovy scripts that make use of the AntBuilder. I need to be able to use the maven-ant-tasks within this script, which I've tried to do as follows:

def mvn = NamespaceBuilder.newInstance(ant, 'antlib:org.apache.maven.artifact.ant')
mvn.dependencies(pathId: 'myMvnPath') {
    pom(file: 'somepom.xml')
}

I've been using this approach within some Groovy plugins that I've written and it's had no problems picking up the ant tasks from the classpath. However, when utilizing the groovy-maven-plugin within a pom with the exact same dependencies as my plugins, I'm getting the following error:

Problem: failed to create task or type antlib:org.apache.maven.artifact.ant:dependencies
Cause: The name is undefined.

Running dependency:resolve against both poms results in the exact same dependencies. I am stumped about how to get this to work. Anyone have any ideas? Thanks!
Re: Add source folder
I was using this plugin for a bit to add an additional source directory, but IntelliJ IDEA does not recognize the additional source. I ended up having to create a new artifact for the second set of sources.

On Jan 7, 2008 7:50 AM, Tom Huybrechts [EMAIL PROTECTED] wrote: http://mojo.codehaus.org/build-helper-maven-plugin/usage.html

On Jan 7, 2008 1:48 PM, Jan Torben Heuer [EMAIL PROTECTED] wrote: How can I add another source folder (/src/extended/java/)? The setting should be recognized by the maven-eclipse-plugin and the build process. Jan
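The usage page linked above covers the add-source goal; a sketch of the configuration, using the extra folder from the question (the execution id and phase are typical choices, not requirements):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <executions>
        <execution>
          <id>add-source</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <!-- extra source root, in addition to src/main/java -->
              <source>src/extended/java</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```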
Re: Add source folder
I would say, theoretically, it should. However, to accomplish it properly, it would have to execute the given pom and analyze the results to see what source folders are being looked at... After all, any plugin could be jumping in and adding source folders, even without having them declared anywhere in the pom. There is also the problem where you have non-Java sources. I am using Groovy in my project, and IDEA/Eclipse does not recognize that as a source folder via the pom. I would suggest you do what I did and split your multi-source artifact into two separate ones. If they are heavily dependent on each other, you may wish to factor out the dependencies into a third module which could be depended on by your two new artifacts. Alternatively, you could just tell everyone on your team to add the appropriate source folders manually whenever they open a project.

On Jan 7, 2008 9:16 AM, Jan Torben Heuer [EMAIL PROTECTED] wrote:

Kallin Nagelberg wrote: I was using this plugin for a bit to add an additional source directory, but IntelliJ IDEA does not recognize the additional source. I ended up having to create a new artifact for the second set of sources.

Eclipse (or the maven-eclipse-plugin) does not add the source folder either. Or should it? Jan
unable to add custom ant tasks to scripts
I'm trying to use the groovy-maven-plugin to execute some Groovy scripts that make use of the AntBuilder. I need to be able to use the maven-ant-tasks within this script, which I've tried to do as follows:

def mvn = NamespaceBuilder.newInstance(ant, 'antlib:org.apache.maven.artifact.ant')
mvn.dependencies(pathId: 'myMvnPath') {
    pom(file: 'somepom.xml')
}

I've been using this approach within some Groovy plugins that I've written and it's had no problems picking up the ant tasks from the classpath. However, when utilizing the groovy-maven-plugin within a pom with the exact same dependencies as my plugins, I'm getting the following error:

Problem: failed to create task or type antlib:org.apache.maven.artifact.ant:dependencies
Cause: The name is undefined.

Running dependency:resolve against both poms results in the exact same dependencies. I am stumped about how to get this to work. Anyone have any ideas? Thanks!
Re: Plugins within plugins: best practices
What about creating a mojo instance manually from within another plugin? Has anyone tried this?

On Dec 29, 2007 11:37 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote: I need to run the unpack and copy goals repeatedly, with a given group:artifact:version and different target directories for each one.

On Dec 29, 2007 4:23 PM, Brian E. Fox [EMAIL PROTECTED] wrote: Which part is it that you need to use in the dependency plugin? The actual copying and unpacking is mostly delegated to the maven-archiver component.

-----Original Message-----
From: Kallin Nagelberg [mailto:[EMAIL PROTECTED]]
Sent: Saturday, December 29, 2007 4:03 PM
To: Maven Users List
Subject: Plugins within plugins: best practices

Hello everyone, I've created a standalone plugin in which I'm doing some unpacking/copying of artifacts in addition to a few other things. I would like to defer the unpacking/copying operations to the maven dependency plugin for obvious reasons. Has anyone tried creating and configuring a plugin instance within another plugin and calling the execute method directly? I'm tempted to try this, but am worried about potential side effects. I figure I can try to build the plugin instance with what components I've already got within my plugin and pass them along. Is there a better way to do this, perhaps some sort of core maven plugin builder? Any help would be greatly appreciated. Thanks!
Plugins within plugins: best practices
Hello everyone, I've created a standalone plugin in which I'm doing some unpacking/copying of artifacts in addition to a few other things. I would like to defer the unpacking/copying operations to the maven dependency plugin for obvious reasons. Has anyone tried creating and configuring a plugin instance within another plugin and calling the execute method directly? I'm tempted to try this, but am worried about potential side effects. I figure I can try to build the plugin instance with what components I've already got within my plugin and pass them along. Is there a better way to use this, perhaps some sort of core maven plugin builder? Any help would be greatly appreciated. Thanks!
Re: Plugins within plugins: best practices
I need to run the unpack and copy goals repeatedly, with a given group:artifact:version and different target directories for each one.

On Dec 29, 2007 4:23 PM, Brian E. Fox [EMAIL PROTECTED] wrote: Which part is it that you need to use in the dependency plugin? The actual copying and unpacking is mostly delegated to the maven-archiver component.

-----Original Message-----
From: Kallin Nagelberg [mailto:[EMAIL PROTECTED]]
Sent: Saturday, December 29, 2007 4:03 PM
To: Maven Users List
Subject: Plugins within plugins: best practices

Hello everyone, I've created a standalone plugin in which I'm doing some unpacking/copying of artifacts in addition to a few other things. I would like to defer the unpacking/copying operations to the maven dependency plugin for obvious reasons. Has anyone tried creating and configuring a plugin instance within another plugin and calling the execute method directly? I'm tempted to try this, but am worried about potential side effects. I figure I can try to build the plugin instance with what components I've already got within my plugin and pass them along. Is there a better way to do this, perhaps some sort of core maven plugin builder? Any help would be greatly appreciated. Thanks!
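If driving this from a POM is acceptable, the dependency plugin's copy and unpack goals take an explicit artifactItems list, each item with its own outputDirectory, rather than walking the project's dependencies. A sketch; the coordinates, execution id, and paths are placeholders:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-artifacts</id>
      <phase>package</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>some.group</groupId>
            <artifactId>some-artifact</artifactId>
            <version>1.0</version>
            <!-- each item can go to a different directory -->
            <outputDirectory>${project.build.directory}/deps/a</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```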
Maven Documentation
I've been struggling to find good documentation for Maven ever since I started using it, especially in regards to plugin development. The basics are presented on the Maven site, but details like what components are available and what properties are available cannot be found anywhere. I just stumbled across a great source of documentation for plugin development, as well as numerous other Maven-related issues.

Plugin Development: http://docs.codehaus.org/display/MAVENUSER/Mojo+Developer+Cookbook
Maven Properties: http://docs.codehaus.org/display/MAVENUSER/MavenPropertiesGuide
A whole lot of other stuff: http://docs.codehaus.org/display/MAVENUSER/Home (a little chaotic, but lots of good stuff if you dig; try the 'mini-guides').

There are also two free books: The Definitive Guide: http://www.sonatype.com/book/ and Better Builds with Maven: http://www.devzuz.com/web/guest/products/resources#BBWM

If anyone knows of any other highly useful sources of Maven documentation it would be great to have it on this thread. Who knows, it may even end up reducing the number of total threads on this list :) Kallin Nagelberg, Consulting Architect
plugin annotations
Hello all, Does anyone know if it is possible to write java/groovy mojos using Java5 annotations instead of the javadoc annotations? I was just trying to figure out why the abstract plugin baseclass I was writing wasn't working, then I hit myself on the head and realized the javadocs do not survive compilation, meaning that my subclasses do not generate the necessary stubs. If it's not available now, does anyone know if there are plans to implement it? Thanks, Kallin Nagelberg Consulting Architect
Re: Project Version
I believe what is happening here has to do with my plugin having the '@requiresProject false' attribute set.

On Dec 27, 2007 4:01 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote: I'm trying to write a plugin in which I need to access the version of the plugin while it's running. I've tried using ${project.version} within my plugin, but it always resolves to '2.0', as if it is obtaining the Maven version. Does anyone know a way to obtain the version of the plugin?
Project Version
I'm trying to write a plugin in which I need to access the version of the plugin while it's running. I've tried using ${project.version} within my plugin, but it always resolves to '2.0', as if it is obtaining the maven version. Does anyone know a way to obtain the version of the plugin?
Re: Project Version
That worked. Thanks!

On Dec 27, 2007 4:24 PM, Tom Huybrechts [EMAIL PROTECTED] wrote: Try ${plugin.version}. Not sure if it works, but it doesn't hurt to try.

On Dec 27, 2007 10:06 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote: I believe what is happening here has to do with my plugin having the '@requiresProject false' attribute set.

On Dec 27, 2007 4:01 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote: I'm trying to write a plugin in which I need to access the version of the plugin while it's running. I've tried using ${project.version} within my plugin, but it always resolves to '2.0', as if it is obtaining the Maven version. Does anyone know a way to obtain the version of the plugin?
transitive dependencies
I'm trying to write a plugin that builds a WAR using a variety of different artifacts. I have a 'pom' artifact that manages all my third party dependencies, and I would like to be able to use the 'maven-dependency-plugin' to unpack them all. Unfortunately, it seems that plugin only transitively unpacks the dependencies of whatever pom you have configured it in. I need to be able to specify which artifacts I want to unpack. I know I can use the 'copy' goal for individual artifacts, but does anyone know a way to unpack all transitive dependencies at once?
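One possibly relevant option, if naming the artifacts explicitly is acceptable: the dependency plugin's unpack goal takes an artifactItems list rather than resolving the current POM's dependency tree. A sketch with placeholder coordinates and output path:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <configuration>
    <artifactItems>
      <artifactItem>
        <groupId>some.group</groupId>
        <artifactId>some-artifact</artifactId>
        <version>1.0</version>
        <outputDirectory>${project.build.directory}/unpacked</outputDirectory>
      </artifactItem>
    </artifactItems>
  </configuration>
</plugin>
```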
current directory from module
I've written a plugin that is being used by an artifact of mine to create some files in my project structure (outside maven, like ../../somedir...). It all works great when I run it from the directory housing the artifact I'm interested in, but when I attempt to make this artifact a module of another artifact, and then run maven against that aggregating artifact, it uses the directory of the aggregating artifact as the current directory when it goes to package the sub-module. I hope that makes sense. Is this a feature/bug, and how do you think I should go about resolving it so that, no matter what directory I'm in when I execute Maven, the plugin uses the same relative path? Is there a way to obtain the artifact's root directory through the plugin api?
Re: current directory from module
I figured out how to do this by passing the ${basedir} property from the pom into my plugin. Problem solved! On Dec 21, 2007 12:11 PM, Kallin Nagelberg [EMAIL PROTECTED] wrote: I've written a plugin that is being used by an artifact of mine to create some files in my project structure (outside maven, like ../../somedir...). It all works great when I run it from the directory housing the artifact I'm interested in, but when I attempt to make this artifact a module of another artifact, and then run maven against that aggregating artifact, it uses the directory of the aggregating artifact as the current directory when it goes to package the sub-module. I hope that makes sense. Is this a feature/bug, and how do you think I should go about resolving it so that, no matter what directory I'm in when I execute Maven, the plugin uses the same relative path? Is there a way to obtain the artifact's root directory through the plugin api?
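To make the fix concrete, the pom-side wiring might look like the sketch below; the plugin coordinates and parameter name are hypothetical:

```xml
<plugin>
  <groupId>com.example</groupId>
  <artifactId>file-generator-maven-plugin</artifactId>
  <configuration>
    <!-- ${basedir} is always the directory of the module being built,
         regardless of which directory mvn was launched from -->
    <projectRoot>${basedir}</projectRoot>
  </configuration>
</plugin>
```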
plugins within plugins
I've got a situation on my hands in which I need to convert a Maven assembly system that was built to be Linux-only into something cross-platform. Among its quirks, it generates pom files dynamically so that it can call plugins against them. The first Windows hitch I've come across is the invocation of 'mvn' using Ant within a plugin (written in Groovy); it needs to be mvn.bat on Windows. I can fix this, but in my opinion this system is a little too complex, and it appears the original developer went a little 'maven-happy', trying to do everything through poms that would be better suited to something like Ant scripting. The flow of the assembly plugin is: given an artifact/group/version, 1) generate a pom file with some set dependencies, set up to use the 'dependency-maven-plugin'; 2) run mvn against that pom, using the dependency-maven-plugin to either unpack or copy those dependencies; 3) load another pom artifact from the repository. This pom's only purpose is to hold a bunch of properties that are then parsed, instructing another plugin to copy certain artifacts to different places on the file system. My first question: is this a familiar pattern for anyone writing assemblies? And two: how might you suggest accomplishing this? I guess the crux of the problem is how best to utilize the maven-dependency-plugin from within another plugin... I'm going to begin by seeing if I can just use any of the APIs from that plugin in my plugin, but I may have to end up just rewriting a lot of it.
Re: generated sources convention
Thanks guys. I've already got it working with the build helper plugin, but was hoping there was some convention I could use to avoid it. Hopefully the Maven guys move to add one eventually. On Dec 20, 2007 8:15 AM, Rémy Sanlaville [EMAIL PROTECTED] wrote: Hi, Use the build-helper-maven-plugin to add additional source paths: http://mojo.codehaus.org/build-helper-maven-plugin/ cf. http://www.nabble.com/Second-source-directory-for-generated-code-to6446457s177.html#a6446995 Rémy
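A sketch of the build-helper configuration being discussed, with the add-source goal bound to generate-sources so the extra root is registered before compilation; the generated-sources path below is a placeholder:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>add-generated-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- hypothetical output directory of the generating plugin -->
          <source>${project.build.directory}/generated-sources/my-plugin</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
```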
generated sources convention
I'm working on adapting some source generation code into the maven lifecycle. I've written a plugin that is generating the sources (albeit at random places) and bound it to the generate sources phase. I've read in a couple places that the standard location is target/generated-sources/plugin-id. However, the compiler plugin doesn't check this directory. I could always just ensure these directories are explicitly added for compilation, but I wanted to check if there was any convention (besides generating into src/main/java) that would allow the compilation to occur automatically. Thanks
Re: Assembly plugin - help include all subdirectories
Your Ant path matching needs to change, I believe. '*' means all files in the present directory; '**/*' is what you want: any file in any subdirectory (the present directory included). Easier yet, get rid of the includes tag altogether. If you don't specify any excludes or includes it defaults to taking everything; if you specify an include, it excludes everything except what you specified to be included. On Dec 11, 2007 9:39 AM, tadamski [EMAIL PROTECTED] wrote: Hello all, I am attempting to pull all the contents out of a directory into a specified output directory. The problem I'm running into is that I am only getting the root contents, no files or folders in the first level of subdirectories. The code I am attempting to get to work is the following:

<filesets>
  <fileset>
    <directory>subModule/java/jre</directory>
    <outputDirectory>target/jre</outputDirectory>
    <includes>
      <include>*</include>
    </includes>
  </fileset>
</filesets>

This will not get the contents of the subdirectories... Is there a trick for including all the contents without having to go through and specify all of the subdirectories? Or perhaps unpackage a zip file into a specified output directory? Thank you. -- View this message in context: http://www.nabble.com/Assembly-plugin---help-include-all-subdirectories-tp14275248s177p14275248.html Sent from the Maven - Users mailing list archive at Nabble.com. - To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
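For reference, the corrected fileset with the recursive pattern would look something like:

```xml
<filesets>
  <fileset>
    <directory>subModule/java/jre</directory>
    <outputDirectory>target/jre</outputDirectory>
    <includes>
      <!-- '**/*' matches files at any depth, not just the top level -->
      <include>**/*</include>
    </includes>
  </fileset>
</filesets>
```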
Re: Assembly plugin - help include all subdirectories
Also, what's with the filesets? This looks more like an Ant file than a Maven pom... On Dec 11, 2007 9:39 AM, tadamski [EMAIL PROTECTED] wrote: Hello all, I am attempting to pull all the contents out of a directory into a specified output directory. The problem I'm running into is that I am only getting the root contents, no files or folders in the first level of subdirectories. The code I am attempting to get to work is the following:

<filesets>
  <fileset>
    <directory>subModule/java/jre</directory>
    <outputDirectory>target/jre</outputDirectory>
    <includes>
      <include>*</include>
    </includes>
  </fileset>
</filesets>

This will not get the contents of the subdirectories... Is there a trick for including all the contents without having to go through and specify all of the subdirectories? Or perhaps unpackage a zip file into a specified output directory? Thank you.
Re: JDK 1.5 java.lang.Enum Build Failure using Maven 2.0.8
To verify this you should try switching Eclipse to use your installed JDK compiler instead of the built-in one. William also mentioned that Ant was producing successful builds as well, so unless Ant also uses its own compiler it might be another issue.. On Dec 11, 2007 11:29 AM, Wayne Fay [EMAIL PROTECTED] wrote: I played with your sample a bit and wrote a lengthy response in JIRA. I am reasonably convinced this is simply an example where Eclipse's JDT compiler is doing something extra that allows your code to work, but it really doesn't. You can't always trust Eclipse, as it does not dispatch to your system's JDK for compiling like Maven does (as far as I understand what is happening under the hood in Eclipse and Maven). Wayne On 12/11/07, William Hoover [EMAIL PROTECTED] wrote: Done: http://jira.codehaus.org/browse/PLX-358 complete w/sample project. -----Original Message----- From: Wayne Fay [mailto:[EMAIL PROTECTED]] Sent: Monday, December 10, 2007 4:49 PM To: Maven Users List Subject: Re: JDK 1.5 java.lang.Enum Build Failure using Maven 2.0.8 Zip up a small sample project, create a JIRA issue, and attach it. Then someone can look at your issue more closely. Wayne On 12/10/07, William Hoover [EMAIL PROTECTED] wrote: No takers? -----Original Message----- From: William Hoover [mailto:[EMAIL PROTECTED]] Sent: Saturday, December 08, 2007 8:13 PM To: users@maven.apache.org Subject: JDK 1.5 java.lang.Enum Build Failure using Maven 2.0.8 I am using JDK 1.5 / Maven 2.0.8 and am attempting mvn clean install on a simple project that contains the following snippet:

public final Class<? extends Enum<? extends IDTOPhase>> getDTOPhaseLifeCycleStrategy() {
    return someEnumClass;
}
...
for (java.lang.Enum<? extends IDTOPhase> phase : getDTOPhaseLifeCycleStrategy().getEnumConstants()) {
    ...
}

The problem is that this compiles without a problem using Ant (or an Eclipse build), but fails using Maven. I get the error:

incompatible types
found    : java.lang.Enum<? extends IDTOPhase>
required : java.lang.Enum<? extends IDTOPhase>

I even set the maven-compiler-plugin to ensure compilation under 1.5:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>2.0.2</version>
  <configuration>
    <source>1.5</source>
    <target>1.5</target>
  </configuration>
</plugin>

Any clue??? Thanks!
Game archetype
As much 'fun' as developing enterprise java applications with Maven is, I would like to think that developing games with Maven would be even better :) I'm hoping to convince the jMonkeyEngine project to let me mavenize their project, and in the process create a game archetype supporting 3d acceleration, sound, input, etc. I think it would be great to get a game-dev environment up and running with one maven command, as all the existing tutorials are complex and error prone. I thought I'd mention it on this list to see if there's any interest in such a project, or if anyone has tried anything like it so far.
Re: Game archetype
Hehe, yeah I meant the gaming tutorials. And the tutorials are well written, but getting an environment up and running with the right native libraries etc. doesn't always work as expected, so hopefully Maven can bring some order. On Dec 11, 2007 5:51 PM, Nick Stolwijk [EMAIL PROTECTED] wrote: environment up and running with one maven command, as all the existing tutorials are complex and error prone. all of them, eh? Careful, buddy, many people on this list wrote those tutorials. I think he meant the Java Gaming Tutorials, not the Maven tutorials. ;)
Re: Ant to Maven
You may wish to start by using the maven-antrun-plugin, breaking the Ant up into smaller files that you put alongside each pom of your multi-module project. This way you can avoid writing all the plugins/complex poms that, while ideal, would take more time initially. That is the approach my project took, and we had at least 4 Ant files (in the core) with probably about 1000 lines at least. Hopefully once you go through the process of breaking up the Ant you will become intimately familiar with its content, and have a better idea of how much work will be required to complete a full conversion. As to whether or not it's worth it is another question. What is your motivation for performing the conversion? What problems are you currently experiencing with your Ant-based process that you felt needed alleviation? On Dec 10, 2007 8:14 AM, Luis Roberto P. Paula [EMAIL PROTECTED] wrote: Hi, I work on a huge Java project that has an Ant script of almost 800 lines. For the last two weeks I've been trying to convert this script into a Maven 2 multiproject, in order to simplify the build process, and it's being such a pain in the ass. My questions are: - Is it worth doing this? - I know it is great software, but in what cases is Maven not recommended? Thanks, Luis
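A sketch of the antrun approach suggested above, delegating one lifecycle phase to a per-module Ant file; the file name and target are invented for illustration:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>legacy-assemble</id>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <!-- hypothetical per-module Ant file kept next to the pom -->
          <ant antfile="${basedir}/build-module.xml" target="assemble"/>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
```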
Re: Ant to Maven
Well, I guess some big advantages are:

1. Jars out of SCM. Depending on the number/size of your JARs and your choice of SCM you could see big time benefits on branch/checkout/etc. operations, and a corresponding lack of strain on your SCM server.

2. Reporting/site-plugin. The site stage of Maven allows you to keep your documentation in with your source, which to some degree can ensure consistency between the code and documentation. There are also plugins that produce things like javadoc, extract docs from TLDs, etc. that can automatically show up on a website whenever a new deployment is done.

3. Standardization. This is probably the biggest one. Instead of whatever custom conventions for project structure you have internally, you can always point people to the Maven standards. Where do tests go? Resources? Groovy code? A new dependency? It's nice not having to think about these sorts of things any more, as they really have nothing to do with what you're trying to accomplish.

4. IDE integration. I'm actually in the middle of a Maven conversion of a pretty huge, unwieldy project, and one thing I'm looking forward to is automatic configuration of IDEs (Eclipse/IDEA). We did a partial Maven conversion about a year ago, but left a lot of stuff up to Ant. This meant whenever something new was needed we had to adjust the Maven build (for deployment), the Ant build (for local dev) and the IDE project files. This was horribly time consuming and error prone. I haven't tried it yet, but the Maven IDE plugins purport to handle the generation of modules/classpath management.

All that being said, if you go forward with it my one piece of advice is to take it all the way, so that all developers are using Maven for all tasks. People may be reluctant to learn Maven if they can still 'get by' with Ant, and as such you'll be left doing support for all Maven issues.
If you ensure everyone uses Maven for everything (ok, 99% of things), you'll find yourself surrounded by a lot more Maven experts than if you hadn't. On Dec 10, 2007 8:57 AM, Luis Roberto P. Paula [EMAIL PROTECTED] wrote: As to whether or not it's worth it is another question. What is your motivation for performing the conversion? What problems are you currently experiencing with your Ant-based process that you felt needed alleviation? Actually, our Ant build process works fine and it is well structured. We thought that we could incorporate some advantages from Maven, such as repositories, and work in a more high-level way. But nothing is as simple and wonderful as those Hello World examples. On Dec 10, 2007 11:20 AM, Kallin Nagelberg [EMAIL PROTECTED] wrote: You may wish to start by using the maven-antrun-plugin, breaking the Ant up into smaller files that you put alongside each pom of your multi-module project. This way you can avoid writing all the plugins/complex poms that, while ideal, would take more time initially. That is the approach my project took, and we had at least 4 Ant files (in the core) with probably about 1000 lines at least. Hopefully once you go through the process of breaking up the Ant you will become intimately familiar with its content, and have a better idea of how much work will be required to complete a full conversion. As to whether or not it's worth it is another question. What is your motivation for performing the conversion? What problems are you currently experiencing with your Ant-based process that you felt needed alleviation? On Dec 10, 2007 8:14 AM, Luis Roberto P. Paula [EMAIL PROTECTED] wrote: Hi, I work on a huge Java project that has an Ant script of almost 800 lines. For the last two weeks I've been trying to convert this script into a Maven 2 multiproject, in order to simplify the build process, and it's being such a pain in the ass. My questions are: - Is it worth doing this? - I know it is great software, but in what cases is Maven not recommended? Thanks, Luis
generating-sources missing resources
I'm trying to convert the source-generation of a legacy system into a mavenized project. Basically I need to run a couple of Java classes from an already existing dependency (during the generate-sources phase, I assume) which should populate my source directories. The problem I'm having is that it seems Maven is ignoring my resource declarations during the generate-sources phase. Is this normal? To run the two Java classes required for source generation I'm using the exec-maven-plugin, and it definitely doesn't find my declared resources on its classpath. I've managed to find some hacks around this, like telling the maven-resources-plugin to execute the 'resources' goal during generate-sources, but that doesn't seem so clean to me, as it's probably going to do it again during the generate-resources phase. Any ideas?
Re: generating-sources missing resources
Thanks for the prompt reply. My code-generator (the Java classes anyway) has been packaged as a regular jar artifact. I am using the Maven Exec Plug-In's java goal, http://mojo.codehaus.org/exec-maven-plugin/java-mojo.html. It states 'Executes the supplied java class in the current VM with the enclosing project's dependencies as classpath.' That is accurate, as the plugin has no problems finding the classes in the pom's dependencies. However, it doesn't seem to include the enclosing POM's resources. From the sounds of it this is likely an issue with the Codehaus plugin more than a core Maven issue. I'll pose this question on their mailing list also. Kal. On Dec 8, 2007 2:14 PM, nicolas de loof [EMAIL PROTECTED] wrote: Tell me if I understand correctly: your code-generator has been packaged as a Mojo and is used in another project. It loads some config file from the classpath to generate code. Maven plugins run in isolated classloaders; they have no access to the current project classpath. First option (the Maven way) is to rework the code generator to use a parametrized folder to load config files used in generation. You then just have to add a new @parameter to the Mojo. Second option, if changing the legacy code is too complex, is to set up a new URLClassLoader with the plugin classloader as parent and add the project resources folder. You can then load the generator class using this classloader and invoke the generate() method by reflection. Nico. 2007/12/8, Kallin Nagelberg [EMAIL PROTECTED]: I'm trying to convert the source-generation of a legacy system into a mavenized project. Basically I need to run a couple of Java classes from an already existing dependency (during the generate-sources phase, I assume) which should populate my source directories. The problem I'm having is that it seems Maven is ignoring my resource declarations during the generate-sources phase. Is this normal?
To run the two Java classes required for source generation I'm using the exec-maven-plugin, and it definitely doesn't find my declared resources on its classpath. I've managed to find some hacks around this, like telling the maven-resources-plugin to execute the 'resources' goal during generate-sources, but that doesn't seem so clean to me, as it's probably going to do it again during the generate-resources phase. Any ideas?
Re: generating-sources missing resources
Thank you, I've forwarded the discussion to the Codehaus mailing list to see what their opinion is on the matter. On Dec 8, 2007 3:47 PM, nicolas de loof [EMAIL PROTECTED] wrote: 'enclosing project's dependencies as classpath' does not mean the enclosing project's classpath. You have access to all declared dependencies BUT not to the project's classes/resources. (This may be a valuable enhancement to the plugin.) Nico. 2007/12/8, Kallin Nagelberg [EMAIL PROTECTED]: Thanks for the prompt reply. My code-generator (the Java classes anyway) has been packaged as a regular jar artifact. I am using the Maven Exec Plug-In's java goal, http://mojo.codehaus.org/exec-maven-plugin/java-mojo.html. It states 'Executes the supplied java class in the current VM with the enclosing project's dependencies as classpath.' That is accurate, as the plugin has no problems finding the classes in the pom's dependencies. However, it doesn't seem to include the enclosing POM's resources. From the sounds of it this is likely an issue with the Codehaus plugin more than a core Maven issue. I'll pose this question on their mailing list also. Kal. On Dec 8, 2007 2:14 PM, nicolas de loof [EMAIL PROTECTED] wrote: Tell me if I understand correctly: your code-generator has been packaged as a Mojo and is used in another project. It loads some config file from the classpath to generate code. Maven plugins run in isolated classloaders; they have no access to the current project classpath. First option (the Maven way) is to rework the code generator to use a parametrized folder to load config files used in generation. You then just have to add a new @parameter to the Mojo. Second option, if changing the legacy code is too complex, is to set up a new URLClassLoader with the plugin classloader as parent and add the project resources folder. You can then load the generator class using this classloader and invoke the generate() method by reflection. Nico.
2007/12/8, Kallin Nagelberg [EMAIL PROTECTED]: I'm trying to convert the source-generation of a legacy system into a mavenized project. Basically I need to run a couple of Java classes from an already existing dependency (during the generate-sources phase, I assume) which should populate my source directories. The problem I'm having is that it seems Maven is ignoring my resource declarations during the generate-sources phase. Is this normal? To run the two Java classes required for source generation I'm using the exec-maven-plugin, and it definitely doesn't find my declared resources on its classpath. I've managed to find some hacks around this, like telling the maven-resources-plugin to execute the 'resources' goal during generate-sources, but that doesn't seem so clean to me, as it's probably going to do it again during the generate-resources phase. Any ideas?
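A sketch of the "second option" described in this thread: wrap the plugin's classloader in a URLClassLoader that also sees the project's resources folder, then load the legacy generator through it and call generate() reflectively. The resources path and the generator class name below are hypothetical, and the reflective invocation is left commented out since the legacy class doesn't exist here:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class GeneratorRunner {

    // Builds a classloader that sees resourcesDir in addition to everything
    // visible to the parent (here, the plugin's own classloader).
    static URLClassLoader withProjectResources(File resourcesDir, ClassLoader parent) throws Exception {
        URL[] urls = { resourcesDir.toURI().toURL() };
        return new URLClassLoader(urls, parent);
    }

    public static void main(String[] args) throws Exception {
        URLClassLoader loader = withProjectResources(
                new File("src/main/resources"),
                GeneratorRunner.class.getClassLoader());

        // Hypothetical generator invocation, done by reflection so the
        // plugin needs no compile-time dependency on the legacy code:
        // Class<?> gen = loader.loadClass("com.legacy.SourceGenerator");
        // java.lang.reflect.Method generate = gen.getMethod("generate");
        // generate.invoke(gen.getDeclaredConstructor().newInstance());

        System.out.println(loader.getURLs().length);
    }
}
```

Running it only prints how many extra URLs the loader carries; the commented lines mark where the real generate() call would go.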
Re: dependencies / is there a smart way to declare it?
You can always wrap certain sets of dependencies up in another POM with the 'pom' packaging type to keep the dependencies aspect of your pom isolated from whatever else you're having it do. One thing I wish the pom had was a way to perform composition from common components instead of having to use inheritance. Rigid inheritance hierarchies are more difficult to maintain than delegation, IMHO. For example, let's say I want a lot of my modules to use certain settings for the maven-compiler-plugin. I could just wrap those settings into a pom and inherit from it, but then if there are other common settings I want to isolate I have to put them in the same pom, or have the compiler pom inherit from the source pom or vice versa. Perhaps I've missed something, but I can't find a way to perform this sort of 'composition' as Michael has proposed within Maven. Please let me know what I'm missing! Kallin Nagelberg Consulting Architect On Dec 5, 2007 4:02 PM, Michael McCallum [EMAIL PROTECTED] wrote: refactoring, composition, encapsulation, aggregation - standard techniques for making complex systems easier to understand... On Thu, 06 Dec 2007 09:42:49 Marco Mistroni wrote: hi all, just a quick question to see best practices with maven projects. i have a pom which contains lots of dependencies... so my pom ends up being huge.. i was wondering if there is a smart way to declare them so that i don't end up with a 300-line pom.. again, it's not a real problem, i was wondering how people on the list were dealing with it.. regards marco -- Michael McCallum Enterprise Engineer mailto:[EMAIL PROTECTED]
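For what it's worth, the closest thing to composition here is depending on a 'pom'-packaged artifact, which contributes its declared dependencies transitively (though not its plugin configuration). The coordinates below are invented:

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>common-deps</artifactId>
  <version>1.0</version>
  <!-- pom-type dependencies pull in their own <dependencies> section
       transitively, letting several modules share one dependency set -->
  <type>pom</type>
</dependency>
```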
Re: Maven and Ant
I may be totally off on this, but I'm pretty sure that when using the maven-antrun-plugin Maven uses Ant artifacts from the Maven repo, i.e. some of these: http://www.mvnrepository.com/artifact/org.apache.ant On Dec 5, 2007 11:41 AM, [EMAIL PROTECTED] wrote: Newbie here... pretty much... appreciate help with a basic question or pointers to finding help ;-) How does Maven know which version of Ant to use? We're using Maven to build for WebLogic, and I'm wondering how it finds the various taskdefs BEA provides... I'm puzzled. thanks in advance for any help! Ethan Allen
antLib not producing same classpath as Mvn
I've found what looks like a bug in the Maven Antlib, but I thought I should check here first before sending the issue to JIRA, as I am a relatively new Maven user. I've created a pom that transitively has two paths to the artifact commons-logging. Due to the structure of the poms this resulted in 1.0 being selected instead of what I needed, 1.0.4. In an attempt to resolve this I added the following to my SuperPom, from which all others inherit:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.0.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>

Now, when I run mvn dependency:resolve on my child pom it changes the resolved dependency from 1.0 to 1.0.4 as I desired. Hurrah! Unfortunately, when I reference the classpath produced by this pom in my Ant script it still shows the 1.0 version on the classpath. It seems that the Maven Antlib is not taking the parent's dependencyManagement section into account. I've heard that the Maven Antlib does not always behave exactly as Maven itself does. Is this true? Regards, Kallin Nagelberg.
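For context, the Ant-side resolution being described is typically done with the Maven Ant Tasks along these lines; whether the parent's dependencyManagement is honoured depends on the antlib version in use, and the path ids below are invented:

```xml
<project xmlns:artifact="antlib:org.apache.maven.artifact.ant">
  <!-- read the pom, then resolve its dependencies into an Ant path -->
  <artifact:pom id="mypom" file="pom.xml"/>
  <artifact:dependencies pathId="mvn.classpath">
    <pom refid="mypom"/>
  </artifact:dependencies>
  <path id="build.classpath">
    <path refid="mvn.classpath"/>
  </path>
</project>
```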