Re: Maven 3 with newest Checkstyle

2015-02-15 Thread Baptiste Mathus
I guess you mean you don't have any explicit version tag for that plugin in
your pom.xml.

Under build/plugins, declare a plugin block like this to override the
inherited version:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-checkstyle-plugin</artifactId>
    <version>6.3</version>
  </plugin>

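If you also want the check goal bound to the validate phase, as asked
below, a minimal sketch of the execution block (assuming the default
Checkstyle configuration is otherwise fine):

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-checkstyle-plugin</artifactId>
    <version>6.3</version>
    <executions>
      <execution>
        <id>checkstyle-check</id>
        <!-- run checkstyle:check during validate and fail the build on violations -->
        <phase>validate</phase>
        <goals>
          <goal>check</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
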
Btw, you should be aware that always "pinning" the plugin versions is a
strongly recommended practice. Indeed, one of the biggest things Maven
promotes is build reproducibility, and not pinning plugins puts you at risk
of your build suddenly failing (or worse, silently changing behavior)
just because you changed your Maven version, or because some non-standard
plugin has just had a new release...
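
For inherited builds the usual place for that pinning is a pluginManagement
section in the parent POM; a minimal sketch, reusing the version from this
thread:

  <build>
    <pluginManagement>
      <plugins>
        <!-- pins the version once for every module that declares the plugin -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-checkstyle-plugin</artifactId>
          <version>6.3</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>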

Cheers
On 14 Feb 2015 02:24, "Philipp Kraus" wrote:

> Hello,
>
> I'm using Maven 3.2.5 and I would like to use the Checkstyle plugin in its
> newest version (6.3).
> The default version is 5.8, so how can I add the 6.3 version of the plugin
> to my pom.xml and use it for the check goal in the validate phase?
>
> Thanks a lot
>
> Phil


Re: Dependency conflict resolution

2015-02-15 Thread Baptiste Mathus
Do you use dependencyManagement?

And Jason is right: providing a test project makes it quicker to help you.
It might even be quicker for you than explaining it by mail.
Push it to your GitHub account, for example, and post the link here.
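
For the scenario described below (A depends directly on b:1.0-SNAPSHOT while
C pulls in b:1.0-RC2 transitively), a dependencyManagement entry in A forces
a single version for the whole tree; a minimal sketch, with the groupId
invented for illustration:

  <dependencyManagement>
    <dependencies>
      <!-- the version declared here wins over whatever the transitive
           graph (A -> C -> d -> b:1.0-RC2) would otherwise bring in -->
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>b</artifactId>
        <version>1.0-SNAPSHOT</version>
      </dependency>
    </dependencies>
  </dependencyManagement>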

Cheers
On 13 Feb 2015 21:21, "Jason van Zyl" wrote:

> Provide a sample project with what you expect as the result for a given
> project.
>
> The dependency plugin does its own weird resolution, and M2Eclipse also has
> its own resolver (which is closer to what Maven actually does than the
> dependency plugin is), and I don't really want to work from a written
> description of your project. Just give us a sample project.
>
> On Feb 13, 2015, at 3:05 PM, Endo Alejandro <
> alejandro.e...@grassvalley.com> wrote:
>
> > Hello everyone
> >
> > I thought I knew how maven did dependency conflict resolution: always go
> with the version that's closest to the base pom
> >
> > However, it seems that is not the case. Here is my scenario
> >
> > I have a pom of packaging=pom, call it A, which is the base where I
> execute maven.
> >
> > So this is my simplified graph, where -> denotes a normal Maven
> dependency. All POMs of packaging pom are represented in upper case
> and POMs of packaging jar in lower case.
> >
> > A -> b:1.0-SNAPSHOT
> > A -> C -> d -> b:1.0-RC2
> >
> > When I look at the dependency hierarchy in Eclipse, or at the output of
> dependency:tree or dependency:copy-dependencies, Maven is always choosing
> b:1.0-RC2, which means that my understanding of "shortest path wins" is not
> the full story, since in the graph b:1.0-SNAPSHOT is a direct dependency.
> Could someone clear this up for me? Is it related to RC vs snapshot?
> >
> > Thank you,
> >
> > Alejandro
> >
>
> Thanks,
>
> Jason
>
> --
> Jason van Zyl
> Founder, Takari and Apache Maven
> http://twitter.com/jvanzyl
> http://twitter.com/takari_io
> -
>
> the course of true love never did run smooth ...
>
>  -- Shakespeare
>


Re: Question about next maven-fluido-skin release

2015-02-15 Thread Baptiste Mathus
Hi, better ask for a release on the dev list. If the code is actually
already fixed and just needs to be released, there are generally committers
willing to do it.
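
For anyone else prototyping that combination, the skin itself is declared in
src/site/site.xml; a sketch using the coordinates from the mail below (the
sourceLineNumbersEnabled flag is assumed to be the prettify option the bug
report is about):

  <!-- src/site/site.xml, fragment -->
  <project name="example">
    <skin>
      <groupId>org.apache.maven.skins</groupId>
      <artifactId>maven-fluido-skin</artifactId>
      <version>1.3.1</version>
    </skin>
    <custom>
      <fluidoSkin>
        <!-- assumed option name for the prettified source line numbers -->
        <sourceLineNumbersEnabled>true</sourceLineNumbersEnabled>
      </fluidoSkin>
    </custom>
  </project>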

Cheers
On 14 Feb 2015 02:23, "Mikko Kuokkanen" wrote:

> Hi.
>
> We are prototyping following combination for our web site generation:
> * maven-site-plugin 3.4
> * doxia-module-markdown 1.6
> * maven-fluido-skin 1.3.1
>
> According to the Fluido documentation it has a nice source code prettify
> feature with line numbers. Alas, I didn't get it to work. Then I found
> this bug report: http://jira.codehaus.org/browse/MSKINS-86. It seems
> that it is a bug and will be fixed in the upcoming versions 1.3.2 and 1.4.
>
> I did not see any release date attached to those versions. Is there
> any timetable for when they could be released?
>
>
> Thanks,
> Mikko Kuokkanen
>


Re: Easy way to test maven / surefire with only changed classes? javac dependencies?

2015-02-15 Thread Kevin Burton
I agree that dependency insanity could cause problems, as could resources.

However, the Maven dependency side can be partly handled, because we can
also just analyze the pom to see what those dependencies are.

I imagine that as long as resources are loaded from src/test/resources, we
can make ANY change there trigger a full re-test.  It's not as if this
happens often.

And in spare CPU time you can re-run FULL tests (and maybe we should do
this before a release).

But at least this way, if something DEFINITELY fails, you can resolve it
somewhat quickly.
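
One low-tech way to plug such a list into an existing build, flipped around
to a "tests to run" list instead of a skip list (the profile id and property
name below are made up for illustration; Surefire's test parameter accepts a
comma-separated list of class name patterns):

  <profiles>
    <profile>
      <id>affected-only</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.18.1</version>
            <configuration>
              <!-- supplied by the change-analysis step, e.g.
                   mvn test -Paffected-only -Daffected.tests=FooTest,BarTest -->
              <test>${affected.tests}</test>
            </configuration>
          </plugin>
        </plugins>
      </build>
    </profile>
  </profiles>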

On Wed, Feb 11, 2015 at 4:52 AM, Ron Wheeler  wrote:

> How do you manage/allow dependency changes?
> If you are using a parent pom with a dependency management section, you
> have a single pom to watch.
>
> We adopted a policy that developers are not able to change dependencies
> during a release cycle.
> The versions of dependencies are a team decision under the direct control
> of the project manager and a meeting is held at the beginning of the
> release cycle to review the dependencies and settle on a stable set.
> Of course, there are emergencies from time to time but the decision to
> change is a team decision.
>
> This increases the stability of the development environment and makes the
> testing of upgraded dependencies happen at the start of the release cycle
> so that we know at the start that the initial code still works with
> upgraded dependencies.
>
> Nothing is worse or more time-consuming than having a test failure appear
> in the midst of your code changes when it is actually caused by a dependency
> change.
>
> A developer will go crazy trying to figure out why a small change to their
> code caused such a problem.
>
> I am not sure how you can be sure that a change in a transitive dependency
> will not cause errors higher up in the stack or create bad data structures
> that only show up later in code that has no dependency on the original
> culprit.
>
> Ron
>
>
>
> On 11/02/2015 2:57 AM, Andreas Gudian wrote:
>
>> Hi,
>>
>> You can't do that with javac, but the Takari plugins maintain a
>> fine-grained dependency graph in order to do incremental builds.
>>
>> With tests, it is a different thing, though. Their runtime behaviour may
>> depend on more than their class dependencies might tell you: property/xml
>> files, dependency injection - stuff like that.
>> I think Clover (a code coverage tool) has a feature to run only the tests
>> that touch code which has changed, based on what was recorded as used in
>> the previous run.
>>
>> So for real incremental tests out of the box, we'd have to support
>> different strategies: compile-time dependencies, resource dependencies,
>> runtime dependencies. That's quite an undertaking. For Surefire 3 we want
>> to open up the API to allow attaching stuff like that from the outside.
>>
>> Andreas
>>
>> On Wednesday, 11 February 2015, Kevin Burton wrote:
>>
>>  Is there an easy way to build the Java dependency tree from the compiler?
>>>
>>> I was thinking that if you can get the Java dependency tree built, then
>>> you can take a look at a diff and see which files have changed.
>>>
>>> Then from there you could take, say, 1000 tests and reduce that to only
>>> 10 tests if only those ten had their dependencies changed.
>>>
>>> The theory being that if the previous commit already tested the previous
>>> 990, why test them again?
>>>
>>> The epiphany I had was that one could EASILY integrate this into maven by
>>> just passing a list of which tests to skip.
>>>
>>> This could dramatically improve the speed of continuous integration
>>> systems.
>>>
>>> --
>>>
>>> Founder/CEO Spinn3r.com
>>> Location: *San Francisco, CA*
>>> blog: http://burtonator.wordpress.com
>>> … or check out my Google+ profile
>>> 
>>> 
>>>
>>>
>
> --
> Ron Wheeler
> President
> Artifact Software Inc
> email: rwhee...@artifact-software.com
> skype: ronaldmwheeler
> phone: 866-970-2435, ext 102
>
>


-- 

Founder/CEO Spinn3r.com
Location: *San Francisco, CA*
blog: http://burtonator.wordpress.com
… or check out my Google+ profile




Re: Cleaning source code repositories

2015-02-15 Thread Viktor Sadovnikov
Hi Curtis,

Yes, I believe we are concerned about the same challenges, though with a
slightly different approach.

I'm trying to come up with a recipe or recipes for:

   - determining the impact of introducing a backward-incompatible change;
   - determining which delivered (released) versions of the software are
   affected by a discovered defect

Your point, as it seems to me, is that even a MINOR or PATCH change can
break a dependent. The theoretical response would be: ensure your modules
(projects) are not tightly coupled and program against the interfaces which
compose the public API of the dependency. Yes, these are valid
recommendations; however, they are extremely difficult to follow completely.

Instead of "melting pot" I use upgrade builds. All projects (not modules of
multi-module projects) depend on each other released versions and every
project has a separate upgrade build, which is scheduled to run at least
once a day. This build

   - uses the Maven dependency plugin to upgrade the dependencies and the
   parent of the project (with filters to exclude external dependencies);
   - runs a regular "clean install" with the modified POM;
   - if the previous step succeeds, commits the changed POM to the repository

This approach gives the freedom to break backward compatibility in SNAPSHOT
versions and, after a release, just flags the problem (by failing the
upgrade build), allowing the team to continue working on the non-upgraded
version while one of the team members resolves the upgrade problem.
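
A minimal sketch of how such an upgrade build can be wired, assuming the
upgrading is done with the versions-maven-plugin (the goals named below are
its own; swap in whatever mechanism your build actually uses):

  <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>versions-maven-plugin</artifactId>
    <version>2.1</version>
    <configuration>
      <!-- modify the POM in place instead of leaving pom.xml.versionsBackup files -->
      <generateBackupPoms>false</generateBackupPoms>
    </configuration>
  </plugin>

The scheduled job then runs the versions:update-parent and
versions:use-latest-releases goals (their includes/excludes filters can
restrict the upgrade to your own groupIds), follows with a regular "clean
install", and commits the modified POM only if that build passes.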

Hope this helps,
Viktor

Viktor Sadovnikov @ JV-ration
evolution of Joint Vision
Tuinluststraat 14, 2275XZ Voorburg, The Netherlands
vik...@jv-ration.com  | http://jv-ration.com | +31 6
2466 0736

On Fri, Feb 13, 2015 at 5:33 PM, Curtis Rueden  wrote:

> Hi Viktor,
>
> > Do you actually consider this situation as a problem or is it just a
> > perfectionist talking to me? ;-)
>
> I would say it is a very real challenge of managing projects with many
> components.
>
> > how would you approach determining those, which are required for final
> > deliveries, and those, which might break if another module changes
> > (sort of reverse dependency management)?
>
> That "reverse dependency management" need in particular is the most
> important of those on your list, I think. (Detecting obsolete modules and
> build jobs is nice and helps declutter, but often has little consequence
> beyond that.)
>
> My project [1] is still looking for better ways to be notified in advance
> when code changes somehow affect downstream projects. (Of course, you
> cannot be responsible for the entire world -- but you can define a known
> set of downstream code that you want to help maintain.)
>
> We use release couplings for reproducible builds [2], SemVer for
> versioning [3], and Jenkins for CI [4]. Because of the release dependency
> couplings, Jenkins cannot tell you when upstream changes to master (or even
> new release versions) break downstream projects, until such projects
> attempt to update to the new release. What we are working towards creating
> is a "melting pot" Jenkins job that switches everything to snapshot
> couplings using a profile [5] in a giant synthetic multi-module build. Then
> the java compiler would tell you directly if you broke backwards
> compatibility -- at least compile-time compatibility, which is more than
> half the battle.
>
> If anyone knows of a better established best practice for this sort of
> thing, that would be awesome.
>
> Regards,
> Curtis
>
> [1] http://imagej.net/Architecture
> [2] http://imagej.net/Reproducible_builds
> [3] http://imagej.net/Versioning
> [4] http://imagej.net/Jenkins
> [5]
> https://github.com/scijava/pom-scijava/blob/pom-scijava-5.7.0/pom.xml#L1048-L1051
>
> On Fri, Feb 13, 2015 at 7:09 AM, Viktor Sadovnikov 
> wrote:
>
>> Good day,
>>
>> I wonder if this community can provide some hints on handling the
>> following.
>>
>> At my last few projects I was asked to set up (or clean up) automated
>> builds, so they can produce (at least) deployable software package(s) in
>> minimum time after code changes. Starting from the final desired results,
>> I was able to trace down every module which is needed to build the
>> "master CD". It was especially easy for Maven-based projects. However,
>> discovering these modules in the source repositories always highlighted:
>>
>>    - lack of knowledge of whether a certain module in the repository is
>>    needed or even used in multiple products;
>>    - duplication of modules with similar purposes - sometimes a conscious
>>    decision to copy in order to avoid breaking backward compatibility with
>>    unknown dependents;
>>    - existence of build jobs for obsolete modules;
>>    - absence of builds for stable modules, which have not changed during
>>    the last couple of years;
>>    - and things like these
>>
>> Assuming all projects in the repositories are maven-ized, how would you
>> approach determining those, which are required for final deliveries, and
>> those, which might break if another module changes (sort of reverse
>> dependency management)?