How to avoid unzip of large test data on each "mvn test"

2014-07-31 Thread Francois MAROT
Hi all, I'm in the process of switching a large ANT project to Maven and have a question regarding my test data. I'd like those data to be versioned and stored in the Maven repo. But currently those data are huge (let's say 1-2 gigabytes once zipped). So I imagine that if stored in the repo

Re: How to avoid unzip of large test data on each "mvn test"

2014-07-31 Thread Graham Leggett
On 31 Jul 2014, at 2:55 PM, Francois MAROT wrote: > I'm in the process of switching a large ANT project to Maven and have a > question regarding my test data. > I'd like those data to be versioned and stored in the Maven repo. But > currently those data are huge (let's say 1-2 gigabytes on
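
One common way to handle this is to publish the zipped test data as its own artifact and unpack it with the maven-dependency-plugin, which records marker files so the archive is not re-extracted on every build. A minimal sketch, assuming hypothetical coordinates (`com.example:big-test-data`) for the data artifact:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-test-data</id>
      <phase>generate-test-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- hypothetical coordinates for the zipped test data -->
            <groupId>com.example</groupId>
            <artifactId>big-test-data</artifactId>
            <version>1.0</version>
            <type>zip</type>
            <!-- do not re-extract if the output already exists -->
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/test-data</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With `overWrite` set to false and the plugin's marker files in place, only the first `mvn test` pays the extraction cost; later runs reuse the already-unpacked directory until `mvn clean` removes it.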

How to force individual package dependencies?

2014-07-31 Thread Rabe, Jens
Hello, I have a Maven project which needs a library that depends on org.osgi.foundation 1.0.0. That particular artifact re-implements some of the java.io, java.lang, etc. functionality in a very outdated way, e.g., raw Lists. I only need this library for the scope "test", but this dependenc

Re: How to force individual package dependencies?

2014-07-31 Thread Dan Tran
This sounds like a dangerous journey. You may want to repackage the original OSGi jar, remove java.io, give it a new Maven coordinate, and upload it to your Maven repo. On Thu, Jul 31, 2014 at 9:23 AM, Rabe, Jens wrote: > Hello, > > I have a Maven project which needs a library which is dependent on > org.
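
An alternative to repackaging is to exclude the offending transitive dependency where the library is declared. A sketch, assuming hypothetical coordinates for the library you actually need; the exclusion's groupId for org.osgi.foundation is also an assumption and should be checked against `mvn dependency:tree` output:

```xml
<dependency>
  <!-- hypothetical coordinates for the library that drags in org.osgi.foundation -->
  <groupId>com.example</groupId>
  <artifactId>needed-library</artifactId>
  <version>1.0</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <!-- assumed groupId; verify with mvn dependency:tree -->
      <groupId>org.osgi</groupId>
      <artifactId>org.osgi.foundation</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Note that an exclusion removes the jar entirely, so this only works if nothing on the test classpath actually needs the excluded classes at runtime; otherwise repackaging, as suggested above, is the safer route.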

Dependency in profile in pom.xml

2014-07-31 Thread 李响
Dear Maven users and developers, Attached is the pom.xml of Flume. I have some questions as follows; I'd really appreciate your help or guidance, thanks! 1. Why is the profile with id = hadoop-1.0 the default profile? If I do not specify -Px on the mvn command, I find hadoop-1.0 is the default one used. why i

Re: Dependency in profile in pom.xml

2014-07-31 Thread Deng Ching-Mallete
Hi, The ! at the beginning of the hadoop.profile property name in the hadoop-1.0 profile indicates that the profile is activated when the hadoop.profile property is not specified, i.e. absent. So if you didn't specify it as a system property (e.g. -Dhadoop.profile=), that profile would be used by default.
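
The activation mechanism described above can be sketched as a pom.xml fragment; the profile body here is only a placeholder:

```xml
<profile>
  <id>hadoop-1.0</id>
  <activation>
    <property>
      <!-- the leading "!" means: activate when hadoop.profile is NOT set -->
      <name>!hadoop.profile</name>
    </property>
  </activation>
  <!-- Hadoop 1.x dependencies would go here -->
</profile>
```

Running plain `mvn test` leaves hadoop.profile unset, so this profile activates; passing `-Dhadoop.profile=2.0` (or whatever value another profile matches on) deactivates it and selects the other profile instead.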