Re: missing resources when using --patch-module

2020-04-03 Thread Robert Scholte
Thanks for the explanation, it makes sense:
Patch acts more like an override than an append.

best,
Robert

On 3-4-2020 12:56:10, Alan Bateman  wrote:
On 03/04/2020 10:17, Robert Scholte wrote:
> This issue was registered as SUREFIRE-1768[1]
> It contains a very small Maven project to demonstrate the issue.
>
> That project contains one method executing the following:
>
> Demo.class.getClassLoader().getResources("demo").asIterator()
>     .forEachRemaining(url -> {
>         System.out.println(url.getFile()); // I'd like to see the
>         // target/classes/demo directory here at some point.
>     });
>
>
> After executing the test it shows the following result
> /E:/test-classpath-demo-master/target/test-classes/demo/
> /E:/test-classpath-demo-master/target/test-classes/demo
>
> these are similar, but more worrying: where is
> /E:/test-classpath-demo-master/target/classes/demo
>
> I rewrote it a bit by including a main method  to ensure it is not caused by 
> surefire:
> "%JAVA_HOME%"\bin\java --module-path target/classes --patch-module 
> test.classpath.demo=target/test-classes  --module 
> test.classpath.demo/demo.DemoTest
>
>
> this gave me only one result (where I expected 2):
> /E:/test-classpath-demo-master/target/test-classes/demo/
>
>
> So the question is, where is
> /E:/test-classpath-demo-master/target/classes/demo/
>
Patching is used to replace specific classes or resources in a module
with alternative versions. It can also be used to augment a module with
new classes or resources. In the reproducer, it looks like module
test.classpath.demo has been patched so that "demo" is found in the
patch rather than the original module. This is correct behavior.

If it helps, this will give a list of the resources in the module so you
can see the effect of the patching:

    String name = this.getClass().getModule().getName();
    ModuleLayer.boot()
        .configuration()
        .findModule(name)
        .orElseThrow()
        .reference()
        .open()
        .list()
        .forEach(System.out::println);
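
Since ModuleReader.open() and list() throw IOException, a compilable variant of
that snippet could look like this (just a sketch, assuming it runs inside
demo.DemoTest from the reproducer):

    String name = DemoTest.class.getModule().getName();
    try (java.lang.module.ModuleReader reader = ModuleLayer.boot()
            .configuration()
            .findModule(name)
            .orElseThrow()
            .reference()
            .open()) {
        // one entry per class file or resource known to the (patched) module
        reader.list().forEach(System.out::println);
    } catch (java.io.IOException e) {
        throw new java.io.UncheckedIOException(e);
    }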

-Alan


missing resources when using --patch-module

2020-04-03 Thread Robert Scholte
This issue was registered as SUREFIRE-1768[1]
It contains a very small Maven project to demonstrate the issue.

That project contains one method executing the following:

Demo.class.getClassLoader().getResources("demo").asIterator()
    .forEachRemaining(url -> {
        System.out.println(url.getFile()); // I'd like to see the
        // target/classes/demo directory here at some point.
    });


After executing the test it shows the following result
/E:/test-classpath-demo-master/target/test-classes/demo/
/E:/test-classpath-demo-master/target/test-classes/demo

these are similar, but more worrying: where is 
/E:/test-classpath-demo-master/target/classes/demo

I rewrote it a bit by including a main method  to ensure it is not caused by 
surefire:
"%JAVA_HOME%"\bin\java --module-path target/classes --patch-module 
test.classpath.demo=target/test-classes  --module 
test.classpath.demo/demo.DemoTest


this gave me only one result (where I expected 2):
/E:/test-classpath-demo-master/target/test-classes/demo/


So the question is, where is
/E:/test-classpath-demo-master/target/classes/demo/


thanks,
Robert

[1] https://issues.apache.org/jira/browse/SUREFIRE-1768


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-25 Thread Robert Scholte
Since my request is only about required modules, I'll solely focus on that.

The responsibility of a build tool is to provide the right set of modules as 
described in the module descriptor.
But this module descriptor doesn't specify where these modules are coming from, 
so the build tool should provide some metadata file to define that.

There's no discussion that such a module descriptor is useful for the main
sources.

For the test sources it is unclear.
One reason not to provide a module descriptor is that it will never be a
requirement for other modules.
And what about its name, as it is actually a required module and at the same
time also the same module.

One reason to provide a module descriptor is for the build tool to know which 
modules are required, so it can provide the right set of modules.
But what would such a descriptor for test sources look like? Well, in the end it
must contain everything from the main module descriptor + the test requirements.
I assume that copy/pasting the main module descriptor for a new test module 
descriptor + adding the extra requirements is not the preferred solution.
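
To make that concrete, a sketch (borrowing the foo/JUnit example that appears
elsewhere in this thread) of a main descriptor and the copy/pasted test variant:

    module foo {
        exports foo;
    }

    // test variant: everything from the main descriptor, plus the test-only parts
    open module foo {
        exports foo;
        requires org.junit.jupiter.api;
    }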

Hence, in the case of compiling test sources, Maven does it the other way around.
Its metadata file already contains the modules required for testing, aka test
scoped dependencies.
And with this Maven is capable of constructing the right set of arguments to
compile the test sources, passing proper --add-reads arguments, etc.
But there's one small set of modules that is not defined in this metadata file:
the java.* and jdk.* modules. (This is my only issue!)

In the case of Maven, even if we could find a way to define these modules as
dependencies, this won't simplify the usage of the module system.
Why does one need to define it for test sources and not main sources? For
consistency we could ask users to add them for both main and test sources.
Any solution in the pom file is a workaround: in the end the test sources are just a set of
source files that depend on modules and for that reason deserve a module
descriptor.
But due to its special purpose it should somehow be based on the main module
descriptor.

Without it people will specify these test-required java.*/jdk.* modules in the
only place they know to specify any required modules: the main module
descriptor.
Having a mixture there of required modules for both main and test, just to avoid
compilation errors, is something we should refuse.

regards,
Robert
On 25-2-2020 18:24:51, Alex Buckley  wrote:
On 2/25/2020 9:11 AM, Alex Buckley wrote:
> And there are other ways, where the build tools (or their plugins) take
> responsibility for arranging the test-time module graph. This may
> require more work on the part of build tool/plugin maintainers than
> simply telling their users to write JDK command line options in the
> middle of a config file.

Roy knows what I mean:
https://twitter.com/royvanrijn/status/1232309684788432896


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-14 Thread Robert Scholte
I am aware that the current patch concept is already complicating things.
Apart from the impact of the requested implementation, the information+options 
should all be available already: the patch module descriptor should be 
considered as an alias for some related commandline options.


With a tool like Maven it is already possible to compile and run patched
modules, but when java.* modules are required it is very awkward that this
requires additional configuration in the pom instead of some dedicated module
descriptor, which would feel most natural.
You'll get an error like: "package java.sql is declared in module java.sql,
but module some.module.name does not read it".
And the first thing that comes to mind is: add java.sql to the (main) module
descriptor. But this should not be done.
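
The correct but awkward alternative is to pass the extra options only to the
test compilation (and test execution), roughly like this (a sketch; the module
name is taken from the error message above, all other arguments are elided):

    javac ... --add-modules java.sql --add-reads some.module.name=java.sql ...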

Other options to avoid this would be:
- source code scanning to figure out all used modules. To me not an option: way
too expensive/resource consuming.
- adding all java.* modules by default, since they should be available anyway.
Anything else will be a tool-specific solution, even though all tools
struggle with the same problem.

To me having a correct (main) module descriptor should be the ultimate goal.
People should not be tempted to apply workarounds to this descriptor to make
things work for tests (even though those modules are for sure part of the Java
runtime), especially because of the impact a wrong module descriptor has once
published, when it can no longer be adjusted.

thanks,
Robert
On 13-2-2020 00:12:02, Alex Buckley  wrote:
On 2/12/2020 1:08 PM, Robert Scholte wrote:
> To prevent these workarounds and to provide an easier way to patch a
> module via a dedicated descriptor will help keeping the module
> system cleaner.

It will lead to "split modules" on the modulepath, which will cause just
as many maintenance headaches as split packages on the classpath. Yes,
there is some maintenance benefit if a module explicitly declares that
it patches (i.e. becomes part of) another module (which might then have
to explicitly declare that it allows patching) ... but for a developer
to understand the resulting module graph requires looking at everything
on the modulepath, which is no better than having to look at everything
on the classpath. In Java, a declaration -- whether a module, a class,
or a method -- happens in one source file, and it's up to tools to
rewrite declarations if other interesting source files are known to
those tools.

> However, recently I've been informed of this case: if the test sources
> use one of the java.* modules (that are not used by the main sources)
> the only correct way to solve it now is by adding the required flags
> by hand (and only to the test-compile configuration!). This is hard
> to explain and instead of diving into the specifications,
> understanding what's happening, you'll see that users choose the
> easy workaround: add this "test scoped" module as a required module
> to the module descriptor.

Is there nothing that Maven can do to make the test-compile
configuration easier to create? Maven has all the source code at its
fingertips, including knowledge of module directories which seem to
declare the same module more than once because JUnit recommends it, yet
still Maven makes the user laboriously write out the command line flags
for patching?

Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-12 Thread Robert Scholte
It is kind of unfortunate that this topic has been hijacked for something I 
don't see as an issue.
I've been able to build up the right set of commandline arguments for this case
for both javac and java.

With *just* the dependencies (respecting compile + test scope) from the pom, and
the module descriptor, I've been able to compile the main/test sources and run
the tests.

However, recently I've been informed of this case: if the test sources use one of
the java.* modules (that are not used by the main sources), the only correct way
to solve it now is by adding the required flags by hand (and only to the
test-compile configuration!).
This is hard to explain, and instead of diving into the specifications and
understanding what's happening, you'll see that users choose the easy
workaround: add this "test scoped" module as a required module to the module
descriptor.
In the end this will do more harm than good. A bit off-topic, but with Maven we 
already see that adding dependencies is simple, however cleaning up unused 
dependencies is not. Hence the exclude-option for dependencies is used 
frequently. Unlike Maven dependencies, you cannot exclude required modules, so 
here it is even more important that the list of required modules is the minimum 
required list.

Preventing these workarounds and providing an easier way to patch a module via
a dedicated descriptor will help keep the module system cleaner.
I'm very pleased I've been able to hide most logic behind the required flags.
This is a valid use case and in my opinion it deserves a solution that doesn't require
explicit knowledge of these flags from the end user (a tool like Maven should solve this,
but it can only do so if it has access to this information).
Introducing an extra modifier looks like one solution; maybe there is a better
one.

thanks,
Robert

On 10-2-2020 23:03:28, Alex Buckley  wrote:
Hi Christian,

On 2/7/2020 4:41 AM, Christian Stein wrote:
> This time, I created a project at [0] with a detailed description on its
> front page, i.e the README.md file.
>
> [0]: https://github.com/sormuras/java-module-patching

To restate:

- You're saying that, today, it's brittle to copy directives from
src/org.astro/main/java/module-info.java to
src/org.astro/test/java/module-info.java. (And having copied them, you
still need to `open` the test module and add some `requires`.)

- You suggest that, in future, there will still be a
src/org.astro/test/java/module-info.java file which is of interest to
test frameworks.

What you're hoping is that new syntax will let you invert the patching:

- Today, you set the module-path so that out/modules/test/org.astro.jar
is the "primary" version of the module; then you redefine everything in
it except module-info by overlaying out/modules/main/org.astro.jar.

- In future, you want to have out/modules/main/org.astro.jar as the
"primary" version, and redefine only its module-info by specifying the
sidecar out/modules/test/org.astro.jar. The sidecar would have some
syntax to identify that its declaration of org.astro is strictly
additive to the "primary" version. You would set the module-path to
out/modules/main:out/modules/test:lib so that the module system (1)
finds the "primary" version in out/modules/main and (2) augments its
module-info with sidecars found in out/modules/test and lib. I assume
some new command line option would enable or enumerate the sidecars
explicitly, because obviously the dependences and exports of
out/modules/main/org.astro.jar shouldn't depend on which JAR files
happen to be lurking deep in the module-path.

Stepping back, the core issue is that once the true "primary" version of
a module is built -- out/modules/main/org.astro.jar -- you don't want to
change it. No build or test tool wants to physically rewrite its
module-info to require test-time dependencies at test time, and then to
not require such dependencies at release time. You want the module
system to virtually rewrite the module-info instead. And that's already
possible, as long as you put on your test-colored sunglasses and view
out/modules/test/org.astro.jar as the "primary" version, and
out/modules/main/org.astro.jar as the overlay ... once the tests have
run, go back to viewing out/modules/main/org.astro.jar as the "primary"
version. Introducing a new kind of module descriptor, with merging by
the module system according to new command line options, seems like a
lot of overhead that can already be worked around by tools at test-time.
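
In command-line terms, that test-time view might look roughly like this (a
sketch; the paths and module name come from the example above, everything else
is elided):

    java --module-path out/modules/test:lib \
         --patch-module org.astro=out/modules/main/org.astro.jar \
         ...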

Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-05 Thread Robert Scholte
Hi Simone,

to me the commandline doesn't change, so in your case the library.jar won't 
patch jetty-client.jar.

It will only be patched if you do it explicitly, as you already would right now:
java --module-path jetty-client.jar:library.jar:app.jar --module 
app/com.app.Main --patch-module jetty.client=/path/to/library.jar


For Maven plugins I don't expect a lot of changes. I should probably use the 
name in the patched module descriptor instead of automatically choosing the 
main module descriptor, but these are little adjustments.

Having a patched module descriptor in a jar might be awkward, hence maybe the
packager shouldn't allow adding it, nor should other tools use it.
But these are details open for discussion.

Robert
On 5-2-2020 08:57:53, Simone Bordet  wrote:
Hi Robert,

On Wed, Feb 5, 2020 at 8:38 AM Robert Scholte wrote:
>
> Hi Simone,
>
> I understand your concern, but the patched module descriptor doesn't have to 
> (or should not) replace the --patch-module option. This proposal is about the 
> additional options you now need to put on the commandline, but which already 
> fit in the module descriptor.

I understand it does not replace --patch-module.
I understand it adds the additional "requires", "opens", etc.

But how do you stop a library that uses Jetty from shipping a jar containing
a patched module file that exports and opens things in Jetty that the
Jetty authors did not want to export or open, without users knowing
it?

jetty-client.jar -> contains legit module-info.class
library.jar -> contains patched descriptor that patches jetty-client
app.jar -> my application with a legit module-info.class

java --module-path jetty-client.jar:library.jar:app.jar --module
app/com.app.Main

With this command line, does the Java runtime parse and enable the
patched descriptor contained in library.jar, opening up jetty-client?
If not, how would you enable it in Maven?

Am I missing something?

Thanks!

--
Simone Bordet
---
Finally, no matter how good the architecture and design are,
to deliver bug-free software with optimal performance and reliability,
the implementation technique must be flawless. Victoria Livschitz


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-04 Thread Robert Scholte
Hi Simone,

I understand your concern, but the patched module descriptor doesn't have to 
(or should not) replace the --patch-module option. This proposal is about the 
additional options you now need to put on the commandline, but which already 
fit in the module descriptor.

thanks,
Robert
On 5-2-2020 08:19:21, Simone Bordet  wrote:
Hi,

> With the "patch modifier"-proposal applied as suggested by Robert,
> this manually constructed test module descriptor:
>
> open module foo {
> exports foo;
> requires org.junit.jupiter.api;
> }
>
> from [2] would shrink to:
>
> open patch module foo {
> requires org.junit.jupiter.api;
> }

While I was involved in the original report, I have concerns about its security.

Would not anyone be able to patch an existing module without the
author's consent?
For example:

patch module org.eclipse.jetty.client {
    exports org.eclipse.jetty.client.internal;
    opens org.eclipse.jetty.client;
}

Doing the same on the command line keeps the end user in control,
rather than having the end user possibly scan hundreds of jars to see
if someone snuck in a patched module descriptor.

However, the need for such a "test" module descriptor is evident.

What if patched module descriptors are only effective when a command
line option is present, say "--allow-patch-descriptors", or something
like that?

Thanks!

--
Simone Bordet
---
Finally, no matter how good the architecture and design are,
to deliver bug-free software with optimal performance and reliability,
the implementation technique must be flawless. Victoria Livschitz


Re: ServiceLoader.load(Class, ClassLoader) does not load services exposed in modules and loaded by parent CL

2018-05-24 Thread Robert Scholte

Hi Peter,

you've been hit by MNG-6371[1]. We've tried to solve this in Maven 3.5.1,  
but we faced other unexpected classloader issues with maven-extensions and  
maven-plugins with extensions. So far we haven't been able to fix it.
This is actually THE reason why Maven 3.5.1 was never released; we
reverted the classloader-related changes and successfully released Maven
3.5.2.


If you like a challenge or a bit more info: the main issue is when we  
create a new Realm with null[2]. This will create a new ClassLoader with  
parent null[3], meaning no bootstrap classloader.


I can quote Alan:
Rhino used to be the JS engine in older releases and that may have been in  
rt.jar and so loaded by the boot loader. When Nashorn replaced it (in JDK  
8) then it was configured to be defined to the extension class loader so  
this is why the code snippet doesn't find it.
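
A minimal sketch (not Maven's actual ClassRealm code) of the kind of loader
setup that trips this up: the engine lookup goes through the thread context
class loader, and a loader whose getParent() is null cannot see providers in
modules defined to the platform/extension loader, such as Nashorn:

    ClassLoader isolated = new java.net.URLClassLoader(new java.net.URL[0], null);
    Thread.currentThread().setContextClassLoader(isolated);
    javax.script.ScriptEngine js =
        new javax.script.ScriptEngineManager().getEngineByName("JavaScript");
    System.out.println(js); // likely null in this setup on JDK 9+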


On the Maven mailing list there are several threads trying to define how Maven
classloading should work. So far only a few have mentioned this issue.


IIRC some have worked around it by initializing a new classloader.

thanks,
Robert


[1] https://issues.apache.org/jira/browse/MNG-6371
[2]  
https://github.com/apache/maven/blob/4b95ad9fce6dfe7eec2be88f5837e96c7fbd7292/maven-core/src/main/java/org/apache/maven/classrealm/DefaultClassRealmManager.java#L123
[3]  
https://docs.oracle.com/javase/9/docs/api/java/lang/ClassLoader.html#ClassLoader-java.lang.String-java.lang.ClassLoader-


On Thu, 24 May 2018 12:29:33 +0200, Alan Bateman <alan.bate...@oracle.com>  
wrote:



On 23/05/2018 21:28, Peter Levart wrote:

:

It's not an official plugin. And it seems that the Maven container is  
to blame, not the plugin.
Robert Scholte is on this mailing list and may be able to comment on  
this.



The nonstandard ClassLoader is supplied by the container. The plugin  
just uses the most direct and default API possible to instantiate  
JavaScript engine:


jsEngine = new ScriptEngineManager().getEngineByName("JavaScript");

It is the environment the plugin is executing in that doesn't play well  
with how system service providers are located from JDK 9 on - namely,  
the nonstandard ClassLoader that delegates to system class loader, but  
does not express this also in the .getParent() result. I don't know why
Maven chose this, but closer inspection reveals that its ClassLoader
does have a "parent", but it keeps it in its own field called
"parentClassLoader" and doesn't return it from .getParent(). There must
be a reason for this, but I don't know what it is.


Do other parts of the JDK also use TCCL to bootstrap service lookup by  
default? Isn't it unusual that ScriptEngineManager uses TCCL by default?
I wasn't involved in JSR 223 but it may have envisaged scenarios where  
applications bundle scripting language implementations. This is not too  
unusual and you'll find several APIs do this to allow for cases where an  
application is launched in a container environment. Legacy applet and  
Java EE containers have historically created a class loader per  
"application" and this becomes the TCCL for the threads in that  
application.


-Alan


Re: Explicit file names in module-info - #AutomaticModuleNames

2017-05-05 Thread Robert Scholte
On Fri, 05 May 2017 12:36:38 +0200, Stephen Colebourne  
 wrote:



I think this design addresses some of Roberts's concern too. With this
plan, Maven Central would contain modules depending on automatic
modules, but the dependency names would be sufficiently stable for
this not to be the major issue it has been previously.
While I don't think automatic modules are the best option, were the
above chosen, I think it would be a solution the community could
successfully and easily adopt.


My favorite solution is still support for loose/soft modules. The
pros: it'll only introduce one new keyword; the cons: the code might not
compile/run *upfront* due to missing jars, but I doubt this is
really an issue, assuming most Java projects use tooling to solve this.


Adding mapper information to the module-descriptor is an option, but  
that'll add the automodule name in some way to the descriptor, and I'd  
like to avoid that because it should not be used as inspiration to pick  
your module name.


Re: Can automatic modules be made to work?

2017-05-05 Thread Robert Scholte

Yes, this option has been mentioned in some form.

The good thing about this part is that exports and requires use the same  
structure, i.e. the package is the unit shared between 2 modules.
However, this will probably lead to a long list of requirements, so I
understand the choice of using the name of the bundle of exported packages,
i.e. the module name, instead.


Also keep in mind that one of the fundamental choices made is that the
module descriptor should not have mechanisms for a migration path.
That said, one might think that the partial requirements or soft/loose
modules are there also for migration. I would disagree with that. I expect
that some jars will never become a module, nor ever be re-released just
to specify an Automatic-Module-Name attribute, in which case you must
refer to them as auto modules. In such cases you cannot talk about
migration but about the fact that projects will always depend on
unmodularized jars.
One solution was the soft/loose modules proposal, which I think is valid,
but it all depends on whether this "pattern" will be there just for
migration or not.


Robert

On Fri, 05 May 2017 16:19:38 +0200, Robert J. Saulnier  
 wrote:



I've only somewhat followed this discussion. The following might be
nonsensical or already discussed.

The issue with automatic modules is we don't know what the names of future
modules will be, but what we do know is what packages we require. So
instead of guessing a module name, we could require package(s) until the
actual module becomes available.

module my.module {
    exports ...;
    requires ...;

    requires package org.junit;
    requires package org.junit.rules;
}

So in the above example, it would need to check if a module on the
module-path exports the packages listed; if not, look in the Jar files for
the packages.

Once Junit modularizes their stuff, we can update our module at our leisure:

module my.module {
    exports ...;
    requires ...;

    requires ;
}

Bob


Re: Revised proposal for #AutomaticModuleNames

2017-05-05 Thread Robert Scholte

Hi Mark,

thanks for these adjustments. In general they look good, but it all  
depends on the details:


Define a JAR-file manifest attribute, `Automatic-Module-Name`, whose
value is used as the name of the automatic module defined by that JAR
file when it is placed on the module path, as previously suggested
[2][3].  If a JAR file on the module path does not have such a
manifest attribute then its automatic-module name is computed using
the existing filename-based algorithm.

Just to be clear: does this make it an auto module? In that case we're  
still a bit stuck, because I still insist that jars published to any  
public artifact repository should never refer to an automodule. Published  
jars with this attribute should have received this name from their developer
and this also implies that the jar is Jigsaw-ready, e.g. no more split  
packages. In the future its users should be able to switch to the fully  
modular jar with the same name without any problem. So I need an extra  
state, e.g. isAutomatic() + isNamed(), in which case I have to change my  
insisting line: "Never publish jars to any public artifact repository that  
refers to *unnamed* automatic modules".
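
For reference, the attribute itself is a single line in META-INF/MANIFEST.MF,
e.g. (the module name here is purely a hypothetical example):

    Automatic-Module-Name: org.joda.convert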


A user of a library can suggest a stable name to the library's
maintainer, and easily submit a patch that implements it.  This will
enable the uncoordinated modularization of libraries across the
entire ecosystem.

I don't think this is realistic. Jars live in the local repository and
will stay there even during compilation. If a user patches such jars,
it'll affect all his local Java projects. Only if Maven did both the
compiling and the generation of the distributable/deployable could one isolate such
jars. However, the concept is that plugins don't have any knowledge of
each other and shouldn't share any information. In other words:
maven-war-plugin is not aware of maven-compiler-plugin; the
maven-war-plugin just collects all the content for the web archive,
including the jars as specified in the pom.xml, pulling them from the
local repository.
In case the user is developing an application he might use this, but since
he can already use automodules, I expect him to use that; it's less
painful compared to patching his dependencies' artifacts.
In case the user is developing a library he cannot use this, nor can he
refer to automodules, because he doesn't control the combinations of jars
used by the application developer.


thanks,
Robert

On Thu, 04 May 2017 19:39:06 +0200, <mark.reinh...@oracle.com> wrote:


Thanks to everyone, and especially Stephen Colebourne, Brian Fox, and
Robert Scholte, for the extensive feedback.

  http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2017-May/000687.html

TL;DR: Keep automatic modules, bring back the module-name JAR-file
manifest attribute, and strongly recommend reverse-DNS module names.

Comments?

- Mark


Re: Can automatic modules be made to work?

2017-04-27 Thread Robert Scholte
The recurring question is: how can I, as a *library builder*, participate in
adopting Jigsaw?


The first thing you need to ensure is that there are no split package  
issues.

The next steps can be conflicting:
- I want to name my module, so my customers can refer to it
- I do not want to refer to auto modules, because their names are  
unreliable.

- I do not want to add command line flags to change module access behavior.

The only place where you can specify the module name is in the module  
descriptor. (The proposal to provide a name via a MANIFEST attribute has  
been rejected; having only one location to specify it is reasonable.)
Adding a module descriptor means you have to specify every requirement,
which would probably mean referring to automodules.

Because we don't want to do that, it seems we're blocked.

There is an option which makes it possible for a module to read from the
classpath, i.e. --add-reads <module>=ALL-UNNAMED. This way you don't have
to specify those automodule requirements and the effect will be the same.
However, using these commandline options means you need to specify them
both at compile time and runtime. You cannot expect your customers to
do this by hand. And since this kind of information is lost after
compilation, no tool will be able to automatically add it again at runtime.


If only this kind of information could be stored in the module  
descriptor...


Which made me think of the concept of soft and strict modules. Assuming
'strict' is the preferred default, 'soft' or any equivalent alternative
would be a new keyword which has the same effect as --add-reads
<module>=ALL-UNNAMED, but its information is available at both compile
time and runtime.
With soft modules you can require a subset of modules. This should ease
the migration a lot.
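
A rough illustration of the idea, reusing Stephen's org.joda.convert example
(the 'soft' keyword is of course hypothetical, not valid Java 9 syntax):

    soft module org.joda.convert {
        exports org.joda.convert;
        // no 'requires guava;' needed: classpath types stay readable,
        // the equivalent of --add-reads org.joda.convert=ALL-UNNAMED,
        // but recorded for both compile time and runtime
    }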


Is it bad that the module gets access to all classes on the classpath? I  
don't think so. This was already happening when using *only* the  
classpath. Now you get access to the required (not all) modules on the  
module path and everything on the classpath, which is already much less
than ever before. With the soft module you control the pace of migration,
won't be blocked by the pace of others and you can really participate  
without the possible consequences of referring to automodules.


Does this mean we need to remove automodules? Even though I still think
they can do quite some damage to the ecosystem, I know they are too popular
with some to be removed. So yes, we could keep automodules. In fact, you
could say that (strict) modules should never refer to automodules, but
soft modules can. This still matches the modular setup of Java 9.
Soft modules give the option to 1. refer to automodules and 2. omit
requirements.


So can automatic modules be made to work? Yes, by not requiring to refer  
to them.


Robert


On Wed, 26 Apr 2017 23:19:42 +0200, Stephen Colebourne  
 wrote:



On 26 April 2017 at 17:27,   wrote:
I think I need to reconsider my previous conclusion that explicit  
modules

that depend upon automatic modules should never be published for broad
use [2].
...
The only remaining objection seems to be the aesthetic one, i.e., the
fact that the name of an automatic module is derived from the artifact
that defines it rather than from some intrinsic property of its content
or an explicit declaration by its author.  I understand and share this
concern.  I completely agree that modules are not artifacts, as I've
written before [3], but I don't see a better alternative [4].


OK, so in this thread, I'll outline what changes could potentially
allow distributed modules based on automatic modules to work. This is
not an endorsement of the concept, but a pragmatic take on what could
be done now (other than delay JDK9 or remove the modulepath from
JDK9).

Some basic assertions:
1) Modules != Artifacts [1]
2) Module names should thus be aligned with code, not artifacts [2]
3) Automatic modules currently derive their name from the artifact,
which is almost always wrong wrt #2
4) A module author is forced to choose the artifact name initially,
and change it to the module name when released
5) Since multiple artifact names can represent the same module [1],
there are guaranteed to be problems in large graphs
6) There is no way out when module hell hits

To succeed with distributing modules depending on automatic modules,
it seems to me that we need:
1) a naming strategy that is reliable for the person making the guess
2) an approach for JPMS to link the guessed name to an artifact

I'd argue that the naming strategy can be relatively simple - the
highest package name [2]. Feedback has been pretty much universal in
agreeing to super-package reverse-DNS so far, thus the chances of the
guess being right are definitely increased. While not perfect, it
might just about be good enough.

The second point, what does JPMS do, has yielded various options and
much 

Re: Automatic module names

2017-02-07 Thread Robert Scholte

On Tue, 07 Feb 2017 14:20:07 +0100, Brian Fox  wrote:


On Fri, Feb 3, 2017 at 10:40 AM, Alan Bateman 
wrote:

As regards the example naming clash then these two projects might already
get complaints over their poor choice of artifacts, esp. when artifacts for
both projects are in same directory (say where someone distributes with all
JAR files in a `lib` directory).



This is an incorrect assumption.

At present these artifact names do not conflict, neither in the repository
coordinates, nor in the class package. It's in fact the auto module
algorithm, which ignores both the project's self-chosen full coordinate as
well as any version numbers present in the filename, that is solely
responsible for introducing the conflict.


For Maven, at compile time the absolute paths of the jars in the local  
repository are used. In case of creating a distribution Maven is capable  
of detecting filename collisions, in which case the files will be renamed  
according to a strategy. The most common way is to prefix the name with the
groupId, but the renaming strategy is up to the end-user.
This also made me aware that this would suddenly change the name of the  
automodule, unless the module-info is rewritten.


Robert


Re: Automatic module names

2017-02-06 Thread Robert Scholte
On Fri, 27 Jan 2017 16:54:59 +0100, Robert Scholte <rfscho...@apache.org>  
wrote:


On Fri, 27 Jan 2017 15:11:14 +0100, Stephen Colebourne  
<scolebou...@joda.org> wrote:



Back in October, I raised the issue of modules names generally and for
automatic modules specifically [1]. The short thread came to no
conclusion, but recent threads have again raised similar problems. The
problem is that automatic modules have magical name creation from a
filename, which is brittle and unlike anything else in Java.

I also recently looked at the Joda-Convert and Joda-Beans libraries,
to see if I could add module-info in preparation for Java 9. I quickly
backed away, again because of the same issue. Put simply, I am
unwilling to write a module-info file that refers to a dependency that
is not yet a module. And I have to advise all open source projects to
do the same. Given this, there can be no simple migration to the JPMS
for open source projects. Each open source project must wait for all
its dependencies to migrate to JPMS (by adding a module-info and
publishing to Maven Central).

The issue is clear. If I write this:

module org.joda.convert {
  requires guava;
}

where guava is an automatic module, I am locking in the name of the
guava dependency, something that I do not control. The name "guava" is
just a guess. The guava authors might choose "com.google.guava" or
something else entirely.

In a closed system of modules, ie. a private application, automatic
modules are fine, because the requires clause can be changed if it
turns out the guess was wrong. But once published as an open source
project to Maven Central or elsewhere, the guess cannot be fixed if it
is wrong (without releasing a new version of the library, which is not
an acceptable solution).

I also strongly believe that module names cannot be flat and
unstructured, such as "joda-convert" or "guava". They must have
structure, such as the domain name or a Maven-style group name
"org.joda.convert" or "org.joda:joda-convert". The potential for
clashes has been shown by the Maven team [2].

Some brainstormed possible changes:

- Remove the automatic module concept altogether

This matches proposal #2 and prevents all side effects of auto modules.



- Define a clear mapping from Maven Central co-ordinates to module
name that includes the group, artifact and classifier
Who/where would maintain this? By OpenJDK or by Maven Central? And are
projects in the lead to specify such mapping entries? I have my doubts
whether this will work.




- Provide a text file to JPMS that allows incorrect module names to be
mapped to the correct name
Such information must be bundled as part of the module-info, otherwise  
the next project using this jar as a dependency will face the same
problems.




- Publicly advise against using automatic modules for open source  
projects
With Maven in mind, and especially the maven-compiler-plugin, this makes  
sense when a module name cannot be mapped to a dependency, which means  
it cannot decide if the jar should be on the classpath or modulepath.




- Change rules of Maven Central to prevent modular jars being added
that depend on an automatic module
The quality gateway of Maven Central is a continuously improving set of rules;
I can imagine that analyzing the module-info would become another rule.




- Allow requires clauses to have aliases - requires org.guava.guava OR  
guava.
Keep in mind there can be indirect requirements whose names must be
adjusted as well. I don't think you want to add them as new requirements
to your project just to be able to add an alias.




- Allow modules to have aliases - module org.guava.guava AKA guava
I assume that this means that org.guava.guava is the only official
module name; 'guava' can only be used in case of automodules. So if there
is another jar with "module guava {}", it does not map.





Given that applications can depend on libraries that haven't been
released in years, this has the potential to be a critical problem for
the ecosystem. My preference remains to define a clear mapping from
the widely adopted Maven Central naming strategy to JPMS modules.
Ideally, this would be a formal group concept in the JPMS, something
that I believe is sorely lacking.


There is a conflict between "ease of transition" and "stability". What I  
wonder is:

Is it acceptable to start with:

module org.joda.convert {
  // no module yet:   requires guava;
}


Let me reply to myself. Mark pointed me to a feature of automodules  
compared with classpath jars.
Classes on the classpath are not visible if you are compiling your modular  
project.

I've done a small test to ensure I did understand it correctly.

So my suggestion won't work right now: you need to specify guava, which I
consider a blocker. Developers are forced to refer to unnamed modules,
even though nobody can ensure that t

Re: Automatic module names

2017-02-03 Thread Robert Scholte

Hi Nicolai,

let's consider that my project depends on the following dependencies:
com.foo.bar:library:1.0 and com.acme:library:2.3.1, both unnamed.

I somehow want to have them both as requirements:
module M.N {
  requires static library; // com.foo.bar:library
  requires library; // com.acme:library
}

How can I define that the 'requires static library' should be mapped to  
com.foo.bar:library:1.0 on the modulepath, while 'requires library' should  
be mapped to com.acme:library:2.3.1


One ugly solution would be:
  requires static library containing com.foo.bar.baz.SomeClass;
  requires library containing acme.AnotherClass;

We should really wonder if ease-of-transition is worth the minefield we're  
creating with the introduction of automodules. IMHO all options we're  
trying to add to keep automodules will only over-complicate things, and we're not
even sure all edge cases are covered.


thanks,
Robert

On Thu, 02 Feb 2017 12:28:13 +0100, Nicolai Parlog  wrote:


 Hi everyone,

after thinking about this a little longer, I came to the conclusion that
compile-time/launch-time aliasing might be the only way out of this (at
least the only I could come up with) that keeps automatic modules alive
and does not introduce a conceptual dependency on Maven.

The idea:

A command line option, let's say `--alias-modules A=X`, maps module name
A to module name X. Every dependency on either A or X will be resolved
to X, implying that there must be a module X in the universe of observable
modules. There can be several aliases for the same module
(`--alias-modules A=X,B=X`; X needs to be observable) and they can be
chained (`--alias-modules A=X,X=Y`; Y needs to be observable)
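
Tied to the guava example quoted below, a library that was compiled with
`requires guava;` could then be launched with something like (the option itself
being hypothetical):

    java --module-path mods --alias-modules guava=com.google.guava ...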

Aliasing would of course have to be applied to qualified exports, opens,
and similar mechanisms as well.

It might be worth adding the rule that no observable module must have an
aliased name. So for `--alias-modules A=X` there must be no observable
module A. This prevents ambiguity and would effectively prevent aliasing
platform modules. That might be a good thing because it looks like
aliasing and upgrading modules has quite some overlap (or is even
identical?)

Unfortunately I could not come up with a way to limit aliasing to
automatic module names (in case that were desirable) without somehow
marking dependencies on automatic modules, likely in the module
declaration. If changing module declaration syntax is still on the
table, it could be changed so that dependencies on automatic modules
must be phrased as something like `requires automatic`.

The obvious semantics would be that only such requires clauses can be
fulfilled with automatic modules and that only such dependencies could
be aliased (this might make it prudent to phrase the aliasing option
accordingly, e.g. `--alias-automatic-modules`).

This could also be used to help developers in keeping their module
declarations clean: The compiler could to emit a warning if a `requires
automatic` clause is fulfilled by a regular module.

I would love to hear some thoughts on this idea, even if it is considered
stupid, impractical, etc. :)

 so long ... Nicolai



On 27.01.2017 15:11, Stephen Colebourne wrote:

Back in October, I raised the issue of modules names generally and for
automatic modules specifically [1]. The short thread came to no
conclusion, but recent threads have again raised similar problems. The
problem is that automatic modules have magical name creation from a
filename, which is brittle and unlike anything else in Java.

I also recently looked at the Joda-Convert and Joda-Beans libraries,
to see if I could add module-info in preparation for Java 9. I quickly
backed away, again because of the same issue. Put simply, I am
unwilling to write a module-info file that refers to a dependency that
is not yet a module. And I have to advise all open source projects to
do the same. Given this, there can be no simple migration to the JPMS
for open source projects. Each open source project must wait for all
its dependencies to migrate to JPMS (by adding a module-info and
publishing to Maven Central).

The issue is clear. If I write this:

module org.joda.convert {
  requires guava;
}

where guava is an automatic module, I am locking in the name of the
guava dependency, something that I do not control. The name "guava" is
just a guess. The guava authors might choose "com.google.guava" or
something else entirely.

In a closed system of modules, ie. a private application, automatic
modules are fine, because the requires clause can be changed if it
turns out the guess was wrong. But once published as an open source
project to Maven Central or elsewhere, the guess cannot be fixed if it
is wrong (without releasing a new version of the library, which is not
an acceptable solution).

I also strongly believe that module names cannot be flat and
unstructured, such as "joda-convert" or "guava". They must have
structure, such as the domain name or a Maven-style group name

Re: Reusing module name token `*` in -d

2017-01-21 Thread Robert Scholte
On Sat, 21 Jan 2017 19:35:28 +0100, Stephan Herrmann  
 wrote:



On 01/21/2017 06:52 PM, fo...@univ-mlv.fr wrote:

Robert,
How do you compile these 2 modules with Maven ?

module foo {
  exports foo to bar;
}

module bar {
  requires foo;
}

when compiling 'foo' javac needs to see if 'bar' exists and when  
compiling 'bar', javac will ask to see 'foo'.


I don't think so:

"It is permitted for the to clause of an exports or opens statement to  
specify a module which is not observable."

[lang-vm.html 1.1.2 - 2017/1/9]

I assume this will eventually (when??) become part of JLS, right?

cheers,
Stephan



Confirmed. I've added an integration test to the maven-compiler-plugin;
it works as expected. No need for cross-reference dependencies.


thanks,
Robert


Re: Reusing module name token `*` in -d

2017-01-21 Thread Robert Scholte

On Sat, 21 Jan 2017 14:55:50 +0100, Nicolai Parlog  wrote:


 Hi!


Ah, i see why you have a problem, a jigsaw module != a sub project

A sub project with your layout will contain several modules if you
prefer. A jigsaw module is more lightweight than the other kind of
''modules'' you usually find in Java and features like restricted
export or uses/provides also force several modules to be compiled
together.


That's interesting. As far as I understood up to now a single POM (to
fall back on a known build tool) will by default correspond to a
single module. Isn't that so? Or in other words how would I use Maven
or Gradle to effortlessly create multiple artifacts?

 so long ... Nicolai



Maven is not using module-source-path, but keeps using source-path, so
the current structure can stay as it is, both input and output folders.


And this works, because one pom represents one jar (= one module).
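
In javac terms each pom then boils down to a plain single-module invocation,
roughly (a sketch; the paths follow Maven conventions and the application class
is made up):

    javac -d target/classes --module-path target/dependencies \
          src/main/java/module-info.java src/main/java/com/example/App.java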

Robert




On 21.01.2017 14:32, fo...@univ-mlv.fr wrote:

- Mail original -

De: "Nicolai Parlog"  À: fo...@univ-mlv.fr Cc:
jigsaw-dev@openjdk.java.net Envoyé: Samedi 21 Janvier 2017
13:08:24 Objet: Re: Reusing module name token `*` in -d



Hi Remi.


Hi Nicolai,




My advice is to not try to fight the module layout, it's like
trying to fight ocean waves, it's better to surf on it.


My personal opinion is that the proposed layout with a src folder
at the top is not going to see a lot of adoption.

The main reason for that is that I think each sub-project/module
should have a directory structure just to itself to store
sources, tests, resources, configuration, build scripts,
documentation, source control info, etc. The fact that most build
tool and IDEs understand this structure by default underlines
that thought. And not only does the proposed structure not add
any benefits (that I can see), it also comes at a considerable
costs because (a) all tools have to be taught "the new way" and
(b) a migration is a lot of work.

So I believe the
`/src/{main,test}/{java,resources,whatever}` structure is
here to stay.


Ah, i see why you have a problem, a jigsaw module != a sub project

A sub project with your layout will contain several modules if you
prefer. A jigsaw module is more lightweight than the other kind of
''modules'' you usually find in Java and features like restricted
export or uses/provides also force several modules to be compiled
together.




And yes it means that if you want to modularize an already
existing project, you have to change its layout to be jigsaw
compatible


I disagree.

I'm not sure how essential it is for tools to have the compiled
classes land in  `/target/classes`. If it is important,
they could not have compiled several sub-projects at the same
time anyways (unless I'm missing something). If they already put
all compiled classes into the same folder, then multi-module
builds will work just fine for them.

Not being able to do multi-module builds into
`/target/classes` or similar is hence no new limitation
from Java 9 and compiling modules one by one can be done for
arbitrary directory structures.


see my comment above about not being able to compile multiple
modules in isolation.

also Java has never supported an arbitrary layout, packages has to
be organized in a certain way and now that jigsaw modules are part
of the language. I think the problem is more that what you call a
module may not be what Java calls a module.



At the same time it looks to me that the concept of a module
name token opens up the possibility to create a feature that
didn't exist before and allows tools to compile many modules at
once where they couldn't before.

so long ... Nicolai


cheers, Rémi





On 21.01.2017 11:37, Remi Forax wrote:

Hi Nicolai, the runtime (ModuleFinder) is able to read
exploded modules, .class in folders, not only modules in jars,
so the layout on disk is more or less fixed.

My advice is to not try to fight the module layout, it's like
trying to fight ocean waves, it's better to surf on it. And yes
it means that if you want to modularize an already existing
project, you have to change its layout to be jigsaw compatible,
this is exactly what was done for the jdk.

regards, Rémi

- Mail original -

De: "Nicolai Parlog"  À:
jigsaw-dev@openjdk.java.net Envoyé: Samedi 21 Janvier 2017
11:00:35 Objet: Reusing module name token `*` in -d



Hi!

Another feature request from the trenches regarding
multi-module compilation. (It is possible that there was a
similar thread a couple of days/weeks (?) back but I didn't
find it.)

It would be nice to have the ability to specify module
specific target folders, so they do not automatically end up
in `/`.

It seems obvious (which could very well make it stupid) to
reuse the asterisk here and allow something like

javac --module-path mods --module-source-path
"./*/src/main/java" -d "./*/target/classes" -module
initial.module

I have not thought through how this might 

[ANN] Apache Maven Compiler Plugin 3.6.1 Released

2017-01-16 Thread Robert Scholte
The Apache Maven team is pleased to announce the release of the Apache  
Maven Compiler Plugin, version 3.6.1


Most important change is the support for test-compile when using JDK 9  
build b148+


https://maven.apache.org/plugins/maven-compiler-plugin/

You should specify the version in your project's plugin configuration:


<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.6.1</version>
</plugin>


You can download the appropriate sources etc. from the download page:

https://maven.apache.org/plugins/maven-compiler-plugin/download.cgi

Release Notes - Maven Compiler Plugin - Version 3.6.1

** Bug
* [MCOMPILER-282] - Remove link to non-existing Codehaus wiki
* [MCOMPILER-284] - maven.test.skip doesn't skip test compilation
* [MCOMPILER-287] - Adjust documentation module-info

** Documentation
* [MCOMPILER-281] - Remove reference to Maven 1's Java plugin

** Improvement
* [MCOMPILER-285] - Support test-compile for JDK 9 build b148+

Enjoy,

-The Apache Maven team


Re: modules and tests

2016-11-25 Thread Robert Scholte

On Thu, 24 Nov 2016 15:39:19 +0100, Remi Forax  wrote:

setting command line arguments or using a build tool to fiddle them for  
you is exactly what we do not want here! We want fidelity between the  
compile time configuration and the runtime configuration. Having to play  
with -Xpatch at runtime is conceptually exactly like setting the  
classpath. I don't want to explain to the Java devs that we have  
fidelity between compile-time and runtime on source code but not on test  
code.


I agree on this one. I've been thinking about this a lot and I'm wondering  
if this is a Java issue or a test-tool issue.
What I see with JUnit is that everything is added to the (class)path. I've  
been wondering if having separate arguments for the main classes and test  
classes would make it possible to prevent the patch argument while  
chaining classloaders.

e.g. java -jar junit.jar -DmainPath= -DtestPath= ...

in Maven terms: mainPath will contain all compile-dependencies, testPath  
will contain all test-dependencies WITHOUT the compile-dependencies.


However, is this enough to support split packages?

Robert


Re: onejars under Jigsaw

2016-11-19 Thread Robert Scholte

Hi,

The following topics have been created for this issue:
http://openjdk.java.net/projects/jigsaw/spec/issues/#MultiModuleExecutableJARs
http://openjdk.java.net/projects/jigsaw/spec/issues/#MultiModuleJARs

Once resolved we should improve the maven-shade-plugin according to the new
specifications.


Robert


On Fri, 18 Nov 2016 22:40:13 +0100,  wrote:


Hello!

When I write command line applications, I typically produce an
additional platform-independent "onejar" for convenience. More
specifically, I use the Maven Shade plugin to pack all of the classes
of all of the dependencies into a single jar with MainClass attribute.
The main benefit of doing things this way is that the jar file remains
platform independent (assuming that the code itself is platform
independent). A good example of this is my kstructural package:

  http://io7m.github.io/kstructural/

The main command-line jar program is an amalgamation of all of the
other modules and all dependencies:

  
https://repo1.maven.org/maven2//com/io7m/kstructural/io7m-kstructural-cmdline/0.3.0/io7m-kstructural-cmdline-0.3.0-main.jar

Is there already a facility to do this under Jigsaw? Jlink is not quite
what I'm looking for in this case, because the produced artifacts would
be platform-specific.

Clearly, producing onejars in pre-Jigsaw Java is like taking all of the
problems of the classpath and smashing them into one unfixable lump for
deployment. The fact that we often get builds that appear to work using
this method seems to owe a lot to blind luck.

No doubt doing this sort of transformation is a hell of a lot safer when
there are module boundaries to work with, services declared in module
descriptors, and so on.

I suspect I could address the same question to the Maven Shade list,
but I thought I'd better check here first. :)

M


Re: module-info hygiene

2016-10-17 Thread Robert Scholte
On Mon, 17 Oct 2016 12:59:25 +0200, Alan Bateman   
wrote:



On 17/10/2016 08:32, Peter Levart wrote:


:

Do we need an --exclude-modules (in addition to --add-modules) option  
on javac, java and jlink commands?


--exclude-modules would be different to --limit-modules. If some module  
requires module M and there is no module M on the module path or it is  
not observable because it was not mentioned in the --limit-modules  
option, then an exception is raised. OTOH if some module X requires  
module M and module M is mentioned in the --exclude-modules option,  
then such requires is silently ignored in hope that module X will not  
actually need types from module M.
The module declaration is intended to be authoritative and so we have to  
trust module author when they declare that the module `requires M`. So  
my view is that options such as --exclude-modules that would have the  
effect of dropping requires puts us on the road to anarchy.


That said, I do see Robert's concern that there might be orphaned  
`requires` clauses in some modules.  My module started using the  
preferences API but later the implementation changed to use something  
else. I neglected to remove the `requires java.prefs` from the module  
declaration and the result is that my module cannot compile against or  
run on a run-time image that doesn't include this module. Static  
analysis tools might help here, as might the IDE. We are used to IDEs  
highlighting unused `import` statements and in time then I expect they  
will do the same for apparently unused `requires` clauses in  
module-info.java. If the usage is purely reflective then the module  
author might need to put a comment on the `requires` clause to avoid  
other maintainers from removing it (a bit like "// used by javadoc" in  
comments today when an import is for an unqualified reference in the  
javadoc).
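
A tiny sketch of both situations in one (hypothetical) descriptor:

    module com.example.app {
        requires java.prefs;   // orphaned: the implementation no longer uses it
        requires java.xml;     // used reflectively only; comment it so it isn't removed
    }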


Another part to Robert's mail is the case where something is making use  
of types in modules that it doesn't depend on. Assuming these are static  
references then they will be caught at compile-time. This is a big
improvement compared to today's class path.


A more general comment is that module authors will need to learn a few  
new things about compatibility and refactoring. One example is changing  
`requires transitive M` to `requires M` is an incompatible change.   
Another is splitting a module (several sub-cases) where the module  
author will need to add `requires transitive` to avoid breaking  
consumers. There are lots of opportunities here for authoritative books  
and documentation to help module authors do this right.


-Alan


I understand why *in concept* --exclude-modules is an unwanted option.
The module-info clearly states "requires A.B"; otherwise it should have
been marked as optional or simply removed.
Now the user fully relies on the discipline of the library builders:
users cannot control the modules, nor will the compilation fail in case of
an incorrect module-info.
It is really a matter of hoping that library builders are aware of this,
and maybe it will make libraries more popular based on the quality of the
module-info instead of the quality of the classes. As a user you probably
don't want to be forced to choose based on these facts.
And for smaller and medium applications this will work, but for larger
ones this can really become problematic.


Up until now the compiler was always about "is everything on the classpath  
to compile the classes?". If there is more, well, it'll be ignored.  
"More" was never a problem. And if it was a problem, the user could fix it.


Now we have the module-info, and it is actually a safety-belt for the  
library-builder! Now he can never be blamed (almost): the module-info  
contains at least all info to compile and run this library, maybe even  
more for free.
But with a lot of libraries with their own safety-belts there can be (and  
will be) conflicts and there's nothing you can do right now (apart from  
dropping all safety-belts).
For the end-user all these small safety-belts don't feel very "safe". He  
would feel much better if he had some of the control back (and yes, he's  
very well aware of the possible consequences).


The introduction of the module-info comes with great powers, but that  
comes with great responsibilities as well. I would like to see that the  
compiler could help with controlling those required modules (which would  
mean that "More" is considered to be a problem). Static analysis is IMHO  
just a hint, ignorable, but to me it shouldn't be that way.


Robert


Re: module-info hygiene

2016-10-17 Thread Robert Scholte
I didn't have dead code in mind. And as Remi explained, it cannot be  
detected per jar, only for the application as a whole (which reminds me  
that we also need to have a look at the minimizeJar option of the  
maven-shade-plugin).
My idea was more about collecting all classes required to compile the  
source files and verifying that at least one class of every required module is  
used. In case of "transitive" it should also verify the method signatures.
And yes, in this case I assume that modules whose classes are accessed by  
reflection are optional, which sounds fair enough to me.


Robert

On Mon, 17 Oct 2016 13:06:59 +0200, Remi Forax <fo...@univ-mlv.fr> wrote:


The compiler cannot detect dead code because the code can be a library.
jlink can detect dead code and provide a list of unneeded modules  
because it has the view of the whole application.


Rémi

On October 17, 2016 10:45:26 AM GMT+02:00, Andrew Haley <a...@redhat.com>  
wrote:

On 16/10/16 19:52, Robert Scholte wrote:


To enforce the discipline, the java compiler should IMHO at least
check if all required modules are indeed required and if the
transitive required modules are indeed transitive.


How can the compiler possibly know this?  There are ways of requiring
a module without naming it in a declaration.

Andrew.


module-info hygiene

2016-10-16 Thread Robert Scholte

Hi,

with the introduction of the module-info something interesting is  
happening. Up until now the scope of a Java project was limited to the  
compilation of the classes. In case of Maven the end-user was in full  
control regarding the classpath and the order of entries. With the order  
of the dependencies you can control the order of classpath entries. You  
could add your own dependencies but could also exclude them. The exclude  
is especially powerful in those cases where library builders have made  
mistakes (e.g. junit without test-scope) or simply forgot to remove  
dependencies when refactoring code.
The first project poms are often quite clean, but it requires discipline  
to keep cleaning up your pom, e.g. removing unused dependencies. However,  
you're not really punished if you don't do this.


With the shift to module-info, suddenly every library-builder gets  
control. If the module-info of that library "requires A.B", it means that  
every project using this library MUST have A.B on its module-path. As  
end-user you cannot exclude A.B, nor can you say "B.A replaces A.B for  
x.y.z" in case of duplicate classes as allowed on the classpath. In short:  
the end-users must rely on the discipline of library builders.


This loss of control for the end-user will have a huge impact. Maven has an  
analyze goal in the maven-dependency-plugin which shows both unused  
declared dependencies and used undeclared dependencies (meaning dependencies  
pulled in transitively even though the code uses them directly).  
Almost every time I run this goal on any project it detects dependencies  
in one of the two groups.


To enforce the discipline, the java compiler should IMHO at least check if  
all required modules are indeed required and if the transitive required  
modules are indeed transitive. The role of the module-info is way too  
important to simply allow all content. The scope is not just about the  
classes anymore; the complete module tree is now locked into the jar. Any  
mis-configured module-info down the tree can never be fixed by the  
end-user, which could block the end-user from using the modulepath.


just sharing my concerns,
Robert


Re: Module names - related to #ModuleNameSyntax

2016-10-11 Thread Robert Scholte
One thing that is missing here is that you can only control the direct  
dependencies, you cannot control the transitive dependencies.


To complete the example:
module com.mycompany:foo-utils {
   requires guava; // automodule from google
}

but we also have
module com.acme:bar-utils {
  requires guava; // automodule from acme
}

two modules which both require a guava module, though not the same guava  
module.


With my application:

module myapplication {
  requires com.mycompany:foo-utils;
  requires com.acme:bar-utils;
}

I don't control foo-utils nor bar-utils, those are third party  
dependencies.
This application requires 2 different guava modules, but without the  
"groupId" you cannot decide which one to choose. The proposal to fail when  
2 automodules have the same name implies that myapplication can never  
become a module.


Maven has a project which cannot be transformed to a module with the  
current and proposed specs:
Aether is a project which handles the artifact resolution of Maven. It was  
first developed by Sonatype (groupId org.sonatype.aether; package  
org.sonatype.aether.*), then handed over to Eclipse (groupId  
org.eclipse.aether; package org.eclipse.aether.*). Maven 3.0.x uses  
Sonatype's Aether, Maven 3.1+ uses Eclipse's Aether. (and we know there  
won't be any Aether release by Sonatype nor by Eclipse)
For the Maven3 plugins which do something with artifact resolution and  
transfer we wrote maven-artifact-transfer, which selects the proper group  
of Aether artifacts based on the Maven version.
The current automodule naming is not strong enough to convert  
maven-artifact-transfer to a module; it will simply not compile because of  
the duplicate artifact names.


Robert

On Mon, 10 Oct 2016 13:47:45 +0200, Stephen Colebourne  
 wrote:



"At JavaOne I asked a question about the naming standards for modules
given that we have automatic modules.

The scenario is as follows:
- an open source module foo that depends on Google guava
- guava has not yet been modularised
- guava is provided by jar file guava-19.0.jar and used as an automatic  
module

- foo declares "requires guava", where "guava" is a module name
derived from the jar file name

At some later time, guava is modularised, but chooses the sensible
module name "com.google.guava". Now, the module foo (released as open
source to maven central) has the wrong module name for guava.

Given this scenario, the Jigsaw team suggested that the correct module
name for guava is "guava". (The only alternative strategy at the
moment would be for open source projects published to maven central to
*never* declare requires on an automatic module).

I, and others, have expressed concern about a naming strategy for
modules that is not based on the reverse domain name strategy, eg
com.google or org.joda. It seems common to me that companies will have
modules that have the same name as publicly available modules in maven
central. The migration problem described above also seems overly
restrictive.

In addition, there are related problems to do with projects that fork
but want to retain the same name (maybe not desirable, but GitHub  
forking is common).

Proposal
-
- a new concept *group name* should be introduced above module name
- the combination groupName:moduleName should always match the maven
group:artifact coordinates
- messaging in this way to the community will avoid migration issues
as the standard is already in use and accepted
- in module-info.java, the user must always specify both the group and
module name
- the requires clause may specify or omit the group name but always
has the module name
- best practice would be to include the group name to ensure reliable
configuration
- when depending on automatic modules, the group name would be omitted
- if omitted, the group name is inferred by the system from the
available modules on the module path
- automatic modules are in the unnamed group

With this setup, the migration problem outlined above disappears. The
fully qualified name of guava would be "com.google.guava:guava", as
per maven naming standards [1]. Anybody who had depended on the
automatic module for guava using "requires guava" would still work (in
all except edge cases).

This would look for a module named "guava" in any group (likely to be
an automatic module). If found in two groups, the system fails to
start:
module com.mycompany:foo-utils {
  requires guava;
}

This would look for a module named "guava" in group "com.google.guava":
module com.mycompany:foo-utils {
  requires com.google.guava:guava;
}


This relates to #ModuleNameSyntax, in that the module name will be
more distinct - either with a colon, or being the short form. Ideally,
the convention would be for the module name to use dashes, not dots.
Thus, Joda-Beans would be "org.joda:joda-beans" when fully qualified.

Given the widespread adoption of maven, the combination of
group:artifact is very well known and 

Review Request: Apache Maven Recipe for module-info and older projects

2016-10-04 Thread Robert Scholte

Hi,

I've written a page[1] about the problem related to projects which need to  
be compatible with pre-Java 9 versions, but also want to provide a  
module-info file in case they are used in a Java 9 project. This is mainly an  
issue for library builders; end-application builders can simply push  
everything to Java 9.


Let me also explain why I didn't go for any other options:
- 1 execution-block to rule them all (maven-compiler-plugin magic): this  
would mean you need to introduce several parameters for the compilation of  
the module-info file. And we must assume they cannot always use the same  
JDK. To keep enough strength for the end-user you must provide duplicates  
of source/target/release, jdkToolchain, includes, excludes and maybe more.
  The next issue is result handling: the results of both javac calls  
need to be merged, in all combinations of success and failure. In case of  
failure you must provide the exact information. Whoever tries to implement  
this, just like I did, must admit it results in ugly and hard-to-control  
code.
-  2 separate source folders could be a pattern, but probably not a  
developer-friendly option. This also depends on what IDEs are going to do.
- Multimodule JAR: feels quite expensive to have a Maven MultiModule for  
only one file. Also in this case we might need to wait and see what kind  
of solutions IDEs have.
- auto-ignore module-info in case source/target/release < 9: I've decided  
not to do this. Uninformed developers might think that the module-info is  
automatically compiled. However, in the end the jar won't have the  
module-info file. Better break the build and point them to the module-info  
page[1]


I've written an integration test[2] matching this concept.

As said: this is mainly an issue for library builders. I expect that they  
don't have a real issue with this small amount of extra configuration in  
their pom. And this way they are still in full control.


Any comment is appreciated,
Robert

ps. .apt-files will be transformed to html with the maven-site-plugin as  
part of every release.


[1]  
http://svn.apache.org/repos/asf/maven/plugins/trunk/maven-compiler-plugin/src/site/apt/examples/module-info.apt.vm
[2]  
http://svn.apache.org/repos/asf/maven/plugins/trunk/maven-compiler-plugin/src/it/MCOMPILER-275_separate-moduleinfo/


Re: [MRJAR] Entry order matters?

2016-09-08 Thread Robert Scholte
Confirmed there's an issue; jar should have been recognized as a  
multirelease jar.


https://bugs.openjdk.java.net/browse/JDK-8165723


thanks,
Robert

On Mon, 05 Sep 2016 09:10:56 +0200, Alan Bateman <alan.bate...@oracle.com>  
wrote:





On 04/09/2016 21:56, Robert Scholte wrote:

:


Also if you "unzip <jarfile> -d <dir>" then does everything look okay?



hboutemy$ unzip -t multirelease-0.8-SNAPSHOT_failure.jar | grep  
class

hboutemy testing: base/Base.class OK
hboutemy testing: META-INF/versions/9/mr/A.class OK
hboutemy testing: META-INF/versions/8/mr/A.class OK
hboutemy testing: mr/A.class OK
hboutemy$ unzip -t multirelease-0.8-SNAPSHOT_success.jar | grep  
class

hboutemy testing: base/Base.class OK
hboutemy testing: META-INF/versions/8/mr/A.class OK
hboutemy testing: mr/A.class OK
hboutemy testing: META-INF/versions/9/mr/A.class OK

Looks good to me.
I think I need to see multirelease-0.8-SNAPSHOT_failure.jar to  
understand this as I suspect there is more going on that is obvious  
here. I'll see if I can duplicate it.


-Alan


Re: [MRJAR] Entry order matters?

2016-09-04 Thread Robert Scholte
On Sun, 04 Sep 2016 22:50:00 +0200, Alan Bateman <alan.bate...@oracle.com>  
wrote:



On 04/09/2016 18:01, Robert Scholte wrote:


Hi,

we have this demo application[1] to show how you can generate a  
multirelease JAR right now with Maven.
However, in my case the result for Java9 is very unstable. Most of the  
time I get something like

9-ea+133-jigsaw-nightly-h5435-20160828
BASE

but I would expect
9-ea+133-jigsaw-nightly-h5435-20160828
FROM BASE -> NINE

Once I had both a successful and a failing jar, I compared the content:
Just to double check: for the failure case, are you sure that  
"Multi-Release: true" is in the main manifest?

Yes


Also if you "unzip <jarfile> -d <dir>" then does everything look okay?



hboutemy$ unzip -t multirelease-0.8-SNAPSHOT_failure.jar | grep class
hboutemy testing: base/Base.class OK
hboutemy testing: META-INF/versions/9/mr/A.class OK
hboutemy testing: META-INF/versions/8/mr/A.class OK
hboutemy testing: mr/A.class OK
hboutemy$ unzip -t multirelease-0.8-SNAPSHOT_success.jar | grep class
hboutemy testing: base/Base.class OK
hboutemy testing: META-INF/versions/8/mr/A.class OK
hboutemy testing: mr/A.class OK
hboutemy testing: META-INF/versions/9/mr/A.class OK

Looks good to me.

Robert


-Alan


[MRJAR] Entry order matters?

2016-09-04 Thread Robert Scholte

Hi,

we have this demo application[1] to show how you can generate a  
multirelease JAR right now with Maven.
However, in my case the result for Java9 is very unstable. Most of the  
time I get something like

9-ea+133-jigsaw-nightly-h5435-20160828
BASE

but I would expect
9-ea+133-jigsaw-nightly-h5435-20160828
FROM BASE -> NINE

Once I had both a successful and a failing jar, I compared the content:
success:
META-INF/MANIFEST.MF
META-INF/
base/
mr/
META-INF/maven/
META-INF/maven/multirelease/
META-INF/maven/multirelease/multirelease-base/
META-INF/versions/
META-INF/versions/8/
META-INF/versions/8/mr/
META-INF/versions/9/
META-INF/versions/9/mr/
base/Base.class
META-INF/maven/multirelease/multirelease-base/pom.properties
META-INF/maven/multirelease/multirelease-base/pom.xml
META-INF/versions/8/mr/A.class
mr/A.class
META-INF/versions/9/mr/A.class

failure:
META-INF/MANIFEST.MF
META-INF/
base/
mr/
META-INF/versions/
META-INF/versions/8/
META-INF/versions/8/mr/
META-INF/versions/9/
META-INF/versions/9/mr/
base/Base.class
META-INF/maven/multirelease/
META-INF/maven/multirelease/multirelease-base/pom.properties
META-INF/versions/9/mr/A.class
META-INF/versions/8/mr/A.class
META-INF/maven/multirelease/multirelease-base/pom.xml
mr/A.class
META-INF/maven/multirelease/multirelease-base/
META-INF/maven/

AFAIK JEP 238 doesn't say anything about entry order.
This is an issue on both Windows and Mac.
Anyone who has an explanation?
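
For what it's worth, a rough sketch (Java 9 only) of checking how the runtime interprets such a jar, using the entry names from the listing above:

```
import java.io.File;
import java.util.jar.JarFile;
import java.util.zip.ZipFile;

public class MrJarCheck {
    public static void main(String[] args) throws Exception {
        File jar = new File(args[0]); // e.g. multirelease-0.8-SNAPSHOT.jar
        // Open the jar the way the JDK 9 runtime does, i.e. with a runtime
        // version, so that META-INF/versions/* entries are taken into account.
        try (JarFile jf = new JarFile(jar, true, ZipFile.OPEN_READ, JarFile.runtimeVersion())) {
            System.out.println("Multi-Release: " + jf.isMultiRelease());
            System.out.println("base entry: " + jf.getEntry("mr/A.class"));
            System.out.println("versioned entry: " + jf.getEntry("META-INF/versions/9/mr/A.class"));
        }
    }
}
```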

regards,
Robert Scholte

[1] https://github.com/hboutemy/maven-jep238
[2] http://openjdk.java.net/jeps/238


Re: Building jar targeting multiple Java versions, including 9

2016-08-31 Thread Robert Scholte
I've been working on the implementation of this in the  
maven-compiler-plugin, but I'm not really pleased with the result.
The problem is that in the worst case scenario we have to work with 3  
different versions of Java:

- The Maven Runtime (set as JAVA_HOME)
- JDK for the module-info.java
- JDK for all other source files.

The example below worked because all three were set to JDK9.
But based on the source/target of 1.6 I cannot predict which JDK is used,  
only that it is at least JDK6. Should the plugin switch to another JDK?
And if you want to compile with source/target 1.5 or less, you're in  
trouble. There's something called toolchain, where you can specify the  
JavaHome per version, but in case of maven-compiler-plugin it assumes that  
all java-related plugins and execution blocks want to use the same  
toolchain through the whole Maven project.
The good news is that for the maven-jdeps-plugin I improved this part in  
Maven 3.3.1, since this plugin only works with Java8 and above, which  
doesn't have to be the same JDK the sources are compiled with. Now you can  
simply say: I want the toolchain for version X. This feature needs to be  
added to the plugin.


That said I think I will write a recipe for this. This is first of all an  
issue for library writers who want to have their jar compatible with  
multiple Java versions for their end users.

Result: One javac call per execution block as it was meant to be.

thanks,
Robert

On Fri, 26 Aug 2016 15:31:07 +0200, Oliver Gondža   
wrote:


Thank you all for your suggestions. I managed to get the project to  
build with the following Maven setup:


```
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.5.1</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
      <executions>
        <execution>
          <id>default-compile</id>
          <configuration>
            <excludes>
              <exclude>**/module-info.java</exclude>
            </excludes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

<profiles>
  <profile>
    <id>jigsaw</id>
    <activation>
      <jdk>[1.9,)</jdk>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.5.1</version>
          <configuration>
            <source>1.9</source>
            <target>1.9</target>
          </configuration>
          <executions>
            <execution>
              <id>module-infos</id>
              <phase>compile</phase>
              <goals>
                <goal>compile</goal>
              </goals>
              <configuration>
                <includes>
                  <include>**/module-info.java</include>
                </includes>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

It does compile with older javac versions as a bonus. Given this is  
nothing more than using `-source 9 -target 9` for module-info.java if  
present, I dare to say maven-compiler-plugin can be adapted to figure  
this out on its own.


Re: Building jar targeting multiple Java versions, including 9

2016-08-26 Thread Robert Scholte
On Fri, 26 Aug 2016 13:31:47 +0200, Alan Bateman <alan.bate...@oracle.com>  
wrote:



On 26/08/2016 11:49, Robert Scholte wrote:


Hi,

I'm struggling with this issue too.
I would have liked to see all the files under src/main/java, but since  
module-info cannot be compiled at the same time as the other files we  
need to do some extra things:


Possible solutions:
- Keep them all in the same folder but let Maven do 2 javac executions,  
auto-selecting the right files. I don't like this kind of magic and  
when the configuration of the maven-compiler-plugin becomes complex,  
there's a chance the magic is doing it wrong.
- Introduce an extra sourcefolder. This is by far the most clean  
solution and with good documentation we should be able to explain this.

I've created MCOMPILER-275[1] to implement this new feature.
The intention is that the module-info.java be in the root directory and  
I could imagine it confusing developers if they have to edit it in a  
separate source tree. I also worry that it would give the impression  
that a separate source tree is the right way to structure the source  
code. I'm also not sure how it would work in the IDE. Then we have the  
issue that the src and output tree no longer matching, that could be  
confusing too. Then we have the question of projects targeting >= JDK 9  
where I assume you would have the module-info.java in the root directory.


How difficult would it be for the compiler plugin to special case  
module-info.java when compiling for an older release? Would there be  
something in the POM to indicate whether the project produces a module,  
as a modular JAR?


The maven-compiler-plugin has the option of includes/excludes (the files  
to compile), so yes it is possible to do this.

But it implies that the plugin will be in control.
There is already some logic based on the existence of a module-info file,  
which causes it to switch from classpath to modulepath. So far there was no  
need for extra configuration and I hope it is not required in this case.


I expect the challenge to be a lot bigger for IDEs.

Robert



-Alan.


Re: Building jar targeting multiple Java versions, including 9

2016-08-26 Thread Robert Scholte

Hi,

I'm struggling with this issue too.
I would have liked to see all the files under src/main/java, but since  
module-info cannot be compiled at the same time as the other files we need  
to do some extra things:


Possible solutions:
- Keep them all in the same folder but let Maven do 2 javac executions,  
auto-selecting the right files. I don't like this kind of magic and when  
the configuration of the maven-compiler-plugin becomes complex, there's a  
chance the magic is doing it wrong.
- Introduce an extra sourcefolder. This is by far the most clean solution  
and with good documentation we should be able to explain this.

I've created MCOMPILER-275[1] to implement this new feature.

thanks,
Robert

[1] https://issues.apache.org/jira/browse/MCOMPILER-275



On Fri, 26 Aug 2016 12:02:02 +0200, Alan Bateman   
wrote:



On 26/08/2016 09:52, Oliver Gondža wrote:

I am about to stretch support of my project from java 6, 7 and 8 to  
version 9 as well. It does not work there out of the box as it accesses  
sun.tools.attach.HotSpotVirtualMachine:


```
class MyClass (in unnamed module @0x4d14b6c2) cannot access class  
sun.tools.attach.HotSpotVirtualMachine (in module jdk.attach) because  
module jdk.attach does not export sun.tools.attach to unnamed module

@0x4d14b6c2.
```

Before I had a chance to experiment with introducing modules to my  
application and declaring dependency on jdk.attach, my project refuses  
to compile on java 9 as soon as I add module-info.java as I instruct  
javac to produce java 6 bytecode (-target 1.6 -source 1.6):


```
modules are not supported in -source 1.6; use -source 9 or higher to enable modules
```

You can invoke javac twice, as Uwe mentions. One starting point is:

javac -release 6 -d classes src/p/MyClass.java
javac -d classes src/module-info.java

The resulting classes should work with JDK 6+ on the class path, or as a  
module on the module path with JDK 9. The important thing with the  
second compilation is that you specify the same output directory so that the  
compiler has access to the source or class files when compiling the module  
declaration.


I see multi release JARs have been mentioned. This is also something to  
look at, assuming you end up with classes (beyond module-info) that are  
Java release specific.


In your mail then the class is "MyClass", I'm guessing this isn't really  
your actual class name. If it is then keep in mind that named modules  
require the classes to be in packages, you can't have types in the  
unnamed package in a named module.


On the attach API then the supported/documented API is  
com.sun.tools.attach. It's never been documented to make direct use of  
types in sun.tools.attach. Are you casting the VirtualMachine to a  
HotSpotVirtualMachine and hacking into the implementation? It might be best  
to explain what you are doing. You can of course temporarily break  
encapsulation to allow the above to compile/run with `--add-exports  
jdk.attach/sun.tools.attach=<target>`, where <target> is your module  
name or ALL-UNNAMED if your library is on the class path.


-Alan


Re: Discover modulename

2016-08-25 Thread Robert Scholte

Hi,

JavaOne spoiler alert:
I've been able to add this kind of information to the output of the  
maven-dependency-plugin:


[INFO] --- maven-dependency-plugin:3.0.0-SNAPSHOT:list (default-cli) @  
maven-dependency-plugin ---

[INFO]
[INFO] The following files have been resolved:
[INFO]com.google.code.findbugs:jsr305:jar:2.0.1:compile -- module  
jsr305
[INFO]org.apache.maven.wagon:wagon-provider-api:jar:1.0-beta-6:compile  
-- module wagon.provider.api

[INFO]org.apache.maven:maven-compat:jar:3.0:test -- module maven.compat
[INFO]org.apache.maven.doxia:doxia-decoration-model:jar:1.4:compile --  
module doxia.decoration.model
[INFO]org.apache.maven:maven-settings-builder:jar:3.0:compile --  
module maven.settings.builder
[INFO]org.sonatype.aether:aether-util:jar:1.7:compile -- module  
aether.util

[INFO]org.apache.maven:maven-core:jar:3.0:compile -- module maven.core
etc.

this way users have a relatively simple way to get a complete overview of  
all the modules used.

I might tweak the output, but the info is there.

Robert

On Thu, 25 Aug 2016 12:26:48 +0200, Robert Scholte <rfscho...@apache.org>  
wrote:



Hi,

In an old thread I asked this before and got the following answer:
"One way is `jar --file foo.jar -p`. That will print the module  
descriptor when the JAR file is a modular JAR. There is API support for  
reading the binary form of the module declaration too."


With the renaming of the arguments I assume it is now:
jar --file foo.jar --print-module-descriptor

However, this gives me an exception on the asm-6.0_ALPHA.jar

d:\jdk-9\bin\jar --file asm-6.0_ALPHA.jar --print-module-descriptor
java.lang.module.InvalidModuleDescriptorException: Index into constant  
pool out of range
 at  
java.lang.module.ModuleInfo.invalidModuleDescriptor(java.base@9-ea/ModuleInfo.java:804)



How about jars without module descriptor? Is there a commandline option  
to discover what the name of an automodule will be? I could guess the  
name, but I'd prefer to show the calculated name.


thanks,
Robert


Discover modulename

2016-08-25 Thread Robert Scholte

Hi,

In an old thread I asked this before and got the following answer:
"One way is `jar --file foo.jar -p`. That will print the module descriptor  
when the JAR file is a modular JAR. There is API support for reading the  
binary form of the module declaration too."


With the renaming of the arguments I assume it is now:
jar --file foo.jar --print-module-descriptor

However, this gives me an exception on the asm-6.0_ALPHA.jar

d:\jdk-9\bin\jar --file asm-6.0_ALPHA.jar --print-module-descriptor
java.lang.module.InvalidModuleDescriptorException: Index into constant  
pool out of range
at  
java.lang.module.ModuleInfo.invalidModuleDescriptor(java.base@9-ea/ModuleInfo.java:804)



How about jars without module descriptor? Is there a commandline option to  
discover what the name of an automodule will be? I could guess the name,  
but I'd prefer to show the calculated name.
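
For reference, a rough sketch (it needs a Java 9 runtime) using ModuleFinder, which applies the same rules: it reads module-info.class from a modular jar and derives the automatic module name for a plain jar:

```
import java.lang.module.ModuleDescriptor;
import java.lang.module.ModuleFinder;
import java.nio.file.Paths;

public class PrintModuleName {
    public static void main(String[] args) {
        // args[0] is the path to a jar (or an exploded module directory).
        ModuleFinder.of(Paths.get(args[0])).findAll().forEach(ref -> {
            ModuleDescriptor d = ref.descriptor();
            System.out.println(d.name() + (d.isAutomatic() ? " (automatic)" : ""));
        });
    }
}
```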


thanks,
Robert


Paths in Mutli Release Jar

2016-05-27 Thread Robert Scholte
I noticed that the path is still META-INF/*versions*/ even though the JEP  
has been renamed from Multi Version jar to Multi Release jar. Is that  
intended or should/will that be changed as well?


thanks,
Robert


Javadoc custom Taglets

2016-04-17 Thread Robert Scholte

Hi,

in preparation for the DevoxxFr talk about Maven and Java 9 by Hervé Boutemy  
and Arnaud Héritier, I noticed some issues with custom taglets when  
generating Javadoc reports.


For the developers of Maven plugins we have a set of Annotations or  
DocletTags[1] to generate the plugin descriptor.
The taglets are based on com.sun.tools.doclets.Taglet, but when trying to  
run this with Java 9-ea+113 I get the following exception.


Caused by: org.apache.maven.plugin.MojoExecutionException: An error has  
occurred in JavaDocs report generation:
Exit code: 1 - javadoc: error - Error - Exception  
java.lang.ClassCastException thrown while trying to register Taglet  
org.apache.maven.tools.plugin.javadoc.MojoAggregatorTypeTaglet...


What's the preferred way to write custom Taglets when using Java9 and will  
this also work with older versions of Java?


thanks,
Robert

[1]  
https://maven.apache.org/components/plugin-tools/maven-plugin-tools-javadoc/


Re: modulepath and classpath mixture

2016-03-19 Thread Robert Scholte
On Thu, 17 Mar 2016 21:23:25 +0100, Alan Bateman <alan.bate...@oracle.com>  
wrote:





On 17/03/2016 19:51, Robert Scholte wrote:

:

To me it seems like the need for knowing the module name keeps  
returning.
This increases the need for a proper implementation of the  
maven-compiler-plugin as a multirelease JAR.
The pattern as shown during FOSDEM showed that the idea works, but it  
is not solid enough.
And the next question would be: can Maven (or actually Plexus  
ClassWorld) handle this?


I'll need to work out the things to be done and try to get more Maven  
developers involved.
Would you take it from the source module-info.java or the compiled  
module-info.class? The former would require a small parser. The latter  
is not difficult with ASM.


-Alan


I can do the former with QDox; for the latter I had the JDK APIs in mind, but  
ASM is an interesting option too. I need to figure out how to do that.
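
For reference, a rough sketch of the JDK-API route (its obvious drawback is that it needs a Java 9 runtime, which the ASM route would avoid):

```
import java.io.InputStream;
import java.lang.module.ModuleDescriptor;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadModuleInfo {
    public static void main(String[] args) throws Exception {
        // args[0] points at a compiled descriptor, e.g. target/classes/module-info.class
        try (InputStream in = Files.newInputStream(Paths.get(args[0]))) {
            ModuleDescriptor descriptor = ModuleDescriptor.read(in);
            System.out.println("module " + descriptor.name());
            descriptor.requires().forEach(r -> System.out.println("  requires " + r));
        }
    }
}
```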


Robert


Re: modulepath and classpath mixture

2016-03-08 Thread Robert Scholte
On Mon, 07 Mar 2016 14:53:28 +0100, Sander Mak   
wrote:




I don't think I understand the issue here. Using -Xpatch doesn't change  
the module declaration or export. It can be used to override or augment  
the module content, it just can't override the module declaration. It  
can be used in conjunction with -XaddReads and -XaddExports to read  
additional modules or export additional packages. For example, if a  
patch adds types to a new package then you could export that package  
with -XaddExports. If the patch injects tests into an existing package  
then those tests might have new dependences and requires compiling or  
running with -XaddReads:$MODULE=junit for example.


I was playing around with exactly this yesterday, and this is what I  
ended up with:


javac -Xmodule:javamodularity.easytext.algorithm.naivesyllablecounter \
  -XaddReads:javamodularity.easytext.algorithm.naivesyllablecounter=org.junit  
\

  -mp mods:lib-test \
  -d  
mods-test/javamodularity.easytext.algorithm.naivesyllablecounter $(find  
src-test -name '*.java')


java -Xpatch:mods-test \
 -XaddReads:javamodularity.easytext.algorithm.naivesyllablecounter=org.junit  
\
 -XaddExports:javamodularity.easytext.algorithm.naivesyllablecounter/javamodularity.easytext.algorithm.naivesyllablecounter=org.junit  
\

 -mp mods:lib-test \
 -addmods  
javamodularity.easytext.algorithm.naivesyllablecounter,hamcrestcore \
 -m org.junit/org.junit.runner.JUnitCore  
javamodularity.easytext.algorithm.naivesyllablecounter.NaiveSyllableCounterTest


Which patches my application module to contain a unit test, and then  
exposes my application module to junit at runtime (which is used as  
automatic module here). This works as expected.



-- Sander


When translating this to Maven, it assumes that Maven is aware of the  
module name of the project it is building.
Up until now that's not true. Developers specify the module name in the  
module-info.java, and it doesn't make sense to ask them to add the same  
module name to the pom (if that were possible) or to the maven-compiler-plugin  
configuration. Instead Maven could use some new Java 9 APIs to discover the  
module name, but that would imply that from now on the maven-compiler-plugin  
requires a Java 9 runtime. I don't think that's the way we want to go right  
now. Several Maven plugins have their own kind of multi-release pattern  
where some code blocks depend on a specific Maven version, but we really  
want to avoid this.
I hope we can find a way where Maven can simply refer to the  
classes directory or the jar in those java/javac arguments where one would  
now need to be aware of its module name.
From Maven's point of view the output directories are facts, dependencies  
from the pom.xml too, as is the packaged artifact name & location; the  
content of java files is a mystery and not of any interest (at least in a  
classpath world ;) ).


thanks,
Robert


Re: modulepath and classpath mixture

2016-03-08 Thread Robert Scholte
On Mon, 07 Mar 2016 13:13:38 +0100, Alan Bateman <alan.bate...@oracle.com>  
wrote:



On 06/03/2016 14:01, Robert Scholte wrote:

Hi,

I've asked for some feedback and there seems to be concerns with split  
packages when there are two or more modules on the module path that  
export the same package.
Unless I misunderstand the issue, I'd say you have the same problem  
with -addmods. If you add mod1 and mod2, which both export the same  
package, you have exactly the same issue, right?
Yes, although at least if you specify the module names to -addmods then  
you are being explicit as to the additional modules to resolve. The  
issue with -addmods ALL-NAMED (or whatever the token is) is that it will  
resolve all modules that can be found via the application module path  
and so would need to be used with great care.




I can only speak for how Maven wants to use it. Only the modules used  
to compile the src/main/java sources will end up on the modulePath in this  
case. So the set of modules has already been validated, kind of.




When talking about exports it made me realize there's probably another  
issue: only the exported packages of the "main"-module are accessible,  
right? Since src/test/java has no module-info, the -Xpatch is useless.
An idea that comes to my mind is something like -mainModule <path>,  
where <path> is either a jar or a directory containing a module-info. All its  
classes can be accessed by the classes on the classpath; all other  
modules keep their package access restrictions.
I don't think I understand the issue here. Using -Xpatch doesn't change  
the module declaration or export. It can be used to override or augment  
the module content, it just can't override the module declaration. It  
can be used in conjunction with -XaddReads and -XaddExports to read  
additional modules or export additional packages. For example, if a  
patch adds types to a new package then you could export that package  
with -XaddExports. If the patch injects tests into an existing package  
then those tests might have new dependences and requires compiling or  
running with -XaddReads:$MODULE=junit for example.


This is how I thought that -Xpatch would work, in short: the module-info in  
src/main/java and the one in src/test/java both have the same module name. The  
module-info in src/test/java specifies the extra required modules (like  
junit). With -Xpatch the test-classes have access to the non-exported  
main-classes too.




-Alan


thanks,
Robert


Re: modulepath and classpath mixture

2016-02-27 Thread Robert Scholte

Hi,

I noticed[1] that -addmods already has a special option: ALL-SYSTEM
What I'm looking for is something like ALL-MP or ALL-MODULEPATH, which  
simply exposes all modules on the modulepath to the classpath. The set of  
module entries on the module path has already been chosen with care, and they  
are in the end all required to be able to compile the test classes without  
needing to know the name of the module being compiled.


thanks,
Robert

[1] http://openjdk.java.net/jeps/261


On Tue, 23 Feb 2016 01:52:50 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/22/2016 12:44 PM, Robert Scholte wrote:

Hi,

first of all I'd like to say that I'm very pleased with the new -mp  
options, these matches better with the way Apache Maven would like to  
work with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to  
src/main/java and put all compile-scoped dependencies to the module  
path, all compiles fines.
I assume that developers are less interested in adding a  
module-info.java file to src/test/java, so that's what I'm doing right  
now too.
Now it seems that I *must* add compile + test scoped to the *classpath*  
to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on the  
modulepath and all test-scoped dependencies on the classpath, so the  
modules keeps their inner related structure, but it seems that the  
classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on the  
classpath is indeed the right approach in this case.


thanks,
Robert


Robert,

We definitely need some more detailed notes on setting up javac  
compilations (note to self!) but one thing to note is that by default,  
the unnamed module (i.e. code on the classpath) only has observability  
of the modules in the system image. To make modules on the module path  
observable, you need to use the -addmods option.


-- Jon



--
Using Opera's mail client: http://www.opera.com/mail/


Re: modulepath and classpath mixture

2016-02-24 Thread Robert Scholte
On Wed, 24 Feb 2016 09:52:06 +0100, Alan Bateman <alan.bate...@oracle.com>  
wrote:




On 23/02/2016 21:10, Robert Scholte wrote:

:

If I understand this correctly I need to know the module name.
Has there been any discussion around having the module name in the POM?  
 From the mails then it sounds like the project is mostly "unaware" that  
it is creating a module. The other thing that comes to mind is the  
source layout and whether it will get to the point where the module name  
is in the file path. I'm mostly thinking of multi module projects where  
one might have the source to multiple modules in the same source tree.


-Alan


The name of the module will not end up in pom-4.0.0; it would simply break  
the xsd and would have an effect on all other tools/IDEs/etc. depending on the  
pom.
The only place where one might specify the module name is in the  
configuration of the maven-compiler-plugin.
In Brussels we talked about multimodules, but it makes more sense that 1  
Maven project contains zero or one module-info[1].
And yes, I think a MavenProject will probably be unaware of its own module  
name. It knows its source roots and outputdirectories (for both main/java  
and test/java) and the packaged jar. Based on the availability of the  
module-info in one of these locations it knows how to construct the  
commandline arguments.

At least, this is what I'm hoping to achieve.

thanks,
Robert

[1] https://twitter.com/rfscholte/status/694599731515899904


Re: modulepath and classpath mixture

2016-02-24 Thread Robert Scholte
On Wed, 24 Feb 2016 00:15:26 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/23/2016 01:22 PM, Robert Scholte wrote:
On Tue, 23 Feb 2016 22:14:32 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/23/2016 01:06 PM, Jonathan Gibbons wrote:



On 02/23/2016 12:48 PM, Robert Scholte wrote:
On Tue, 23 Feb 2016 01:52:50 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/22/2016 12:44 PM, Robert Scholte wrote:

Hi,

first of all I'd like to say that I'm very pleased with the new  
-mp options, these matches better with the way Apache Maven would  
like to work with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to  
src/main/java and put all compile-scoped dependencies to the  
module path, all compiles fines.
I assume that developers are less interested in adding a  
module-info.java file to src/test/java, so that's what I'm doing  
right now too.
Now it seems that I *must* add compile + test scoped to the  
*classpath* to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on  
the modulepath and all test-scoped dependencies on the classpath,  
so the modules keeps their inner related structure, but it seems  
that the classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on  
the classpath is indeed the right approach in this case.


thanks,
Robert


Robert,

We definitely need some more detailed notes on setting up javac  
compilations (note to self!) but one thing to note is that by  
default, the unnamed module (i.e. code on the classpath) only has  
observability of the modules in the system image. To make modules  
on the module path observable, you need to use the -addmods option.


-- Jon


Hi Jonathan,

this would indeed explain what I'm facing right now. However, adding  
-addmods gives me the following exception:
Caused by: java.lang.IllegalArgumentException: -addmods requires an  
argument
at  
com.sun.tools.javac.main.Arguments.error(jdk.compiler@9-ea/Arguments.java:708)


Is -addmods followed by the same entries as -modulepath or by the  
modulenames. I really hope it is not the latter, because that would  
mean that I first need to discover and read all module-info files.


thanks,
Robert


Sorry, I should have been more explicit.

Both javac and java (the launcher) accept an option "-addmods  
<module>[,<module>...]" which can be used to name modules to be included  
in the module graph.   Confusingly, for javac, the option is listed  
under javac -X (that's a bug we will fix), but setting that aside,  
here's what the command line help says:


  -addmods <module>[,<module>...]  Root modules to resolve in  
addition to the initial modules


"java -help" says effectively the same.


So yes, the option takes a list of module names, not module paths.

-- Jon







... but that being said, note that you don't have to list all the  
modules on the module path. You only need to list root modules, and  
javac will determine the transitive closure of all the necessary  
modules.


So, if you're writing tests in the unnamed module, to test a module M,  
the chances are that you only need "-addmods M".




This makes sense. And although I still don't like the fact that this  
requires me to read the module-info, this should be possible for the  
target/mods/m (since it is already compiled).

So my response to Alan was probably a bit too fast.
This requires some tricks on our side to stay compatible with lower  
Java versions while adding some code to read the module-info.


thanks,
Robert



When we've had discussions about how these options might work, we've  
generally assumed you might have some a priori knowledge of the module  
name from some other context, rather than having to rely on reading  
module info.


-- Jon



Hmm, would have been nice if I had known about these discussions, because  
I don't think that this is a valid assumption from a Maven perspective.  
Ideally developers simply add module-info.java files to the source-roots  
of their choice and Maven should be able to construct the correct set of  
javac arguments.
I don't expect developers to open a jar to see if there's a module-info  
available. Actually, how can he figure out the module name, since the  
module-info is a compiled file?
Anyhow, Maven is capable of discovering the module name when required, but it  
is not that efficient. Maybe it is time to work on some feedback to  
describe the issues I'm facing regarding some of the javac options.


thanks,
Robert






Alan wrote a separate email about different compilation scenarios.  
Note that in many/most of those cases, no -addmods was necessary.



-- Jon








--
Using Opera's mail client: http://www.opera.com/mail/


Re: modulepath and classpath mixture

2016-02-23 Thread Robert Scholte
On Tue, 23 Feb 2016 22:14:32 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/23/2016 01:06 PM, Jonathan Gibbons wrote:



On 02/23/2016 12:48 PM, Robert Scholte wrote:
On Tue, 23 Feb 2016 01:52:50 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/22/2016 12:44 PM, Robert Scholte wrote:

Hi,

first of all I'd like to say that I'm very pleased with the new -mp  
options, these matches better with the way Apache Maven would like  
to work with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to  
src/main/java and put all compile-scoped dependencies to the module  
path, all compiles fines.
I assume that developers are less interested in adding a  
module-info.java file to src/test/java, so that's what I'm doing  
right now too.
Now it seems that I *must* add compile + test scoped to the  
*classpath* to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on  
the modulepath and all test-scoped dependencies on the classpath, so  
the modules keeps their inner related structure, but it seems that  
the classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on  
the classpath is indeed the right approach in this case.


thanks,
Robert


Robert,

We definitely need some more detailed notes on setting up javac  
compilations (note to self!) but one thing to note is that by  
default, the unnamed module (i.e. code on the classpath) only has  
observability of the modules in the system image. To make modules on  
the module path observable, you need to use the -addmods option.


-- Jon


Hi Jonathan,

this would indeed explain what I'm facing right now. However, adding  
-addmods gives me the following exception:
Caused by: java.lang.IllegalArgumentException: -addmods requires an  
argument
at  
com.sun.tools.javac.main.Arguments.error(jdk.compiler@9-ea/Arguments.java:708)


Is -addmods followed by the same entries as -modulepath or by the  
modulenames. I really hope it is not the latter, because that would  
mean that I first need to discover and read all module-info files.


thanks,
Robert


Sorry, I should have been more explicit.

Both javac and java (the launcher) accept an option "-addmods  
<module>[,<module>...]" which can be used to name modules to be included in  
the module graph.   Confusingly, for javac, the option is listed under  
javac -X (that's a bug we will fix), but setting that aside, here's  
what the command line help says:


  -addmods <module>[,<module>...]  Root modules to resolve in  
addition to the initial modules


"java -help" says effectively the same.


So yes, the option takes a list of module names, not module paths.

-- Jon







... but that being said, note that you don't have to list all the  
modules on the module path. You only need to list root modules, and  
javac will determine the transitive closure of all the necessary modules.


So, if you're writing tests in the unnamed module, to test a module M,  
the chances are that you only need "-addmods M".




This makes sense. And although I still don't like the fact that this  
requires me to read the module-info, this should be possible for the  
target/mods/m (since it is already compiled).

So my response to Alan was probably a bit too fast.
This requires some tricks on our side to stay compatible with lower Java  
versions while adding some code to read the module-info.


thanks,
Robert




Alan wrote a separate email about different compilation scenarios. Note  
that in many/most of those cases, no -addmods was necessary.



-- Jon



--
Using Opera's mail client: http://www.opera.com/mail/


Re: modulepath and classpath mixture

2016-02-23 Thread Robert Scholte
On Tue, 23 Feb 2016 01:30:16 +0100, Alex Buckley <alex.buck...@oracle.com>  
wrote:



Hi Robert,

On 2/22/2016 12:44 PM, Robert Scholte wrote:

Here's my use case: I noticed that if I add a module-info to
src/main/java and put all compile-scoped dependencies to the module
path, all compiles fines.


Sounds good.


I assume that developers are less interested in adding a
module-info.java file to src/test/java, so that's what I'm doing right
now too.


To clarify: you are NOT putting module-info.java in src/test/java.


Now it seems that I *must* add compile + test scoped to the *classpath*
to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on the
modulepath and all test-scoped dependencies on the classpath, so the
modules keeps their inner related structure, but it seems that the
classpath classes cannot access the modulepath classes.


Your first approach sounds preferable. Can you copy-paste a minimized  
invocation of javac that works, and that doesn't work?


Alex


Here's the debug output when calling 'mvn test'.
This doesn't work:

[DEBUG] Classpath:
[DEBUG]   
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\test-classes
[DEBUG]  C:\Users\Robert  
Scholte\.m2\repository\junit\junit\4.12\junit-4.12.jar
[DEBUG]  C:\Users\Robert  
Scholte\.m2\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar

[DEBUG] Modulepath:
[DEBUG]   
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\classes
[DEBUG]  C:\Users\Robert  
Scholte\.m2\repository\org\codehaus\plexus\plexus-utils\3.0.22\plexus-utils-3.0.22.jar
[DEBUG]  C:\Users\Robert  
Scholte\.m2\repository\org\apache\commons\commons-lang3\3.4\commons-lang3-3.4.jar

[DEBUG] Source roots:
[DEBUG]   
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\src\test\java
[DEBUG]   
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\generated-test-sources\test-annotations

[DEBUG] Command line options:
[DEBUG] -d  
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\test-classes  
-classpath  
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\test-classes;C:\Users\Robert  
Scholte\.m2\repository\junit\junit\4.12\junit-4.12.jar;C:\Users\Robert  
Scholte\.m2\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar;  
-modulepath  
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\classes;C:\Users\Robert  
Scholte\.m2\repository\org\codehaus\plexus\plexus-utils\3.0.22\plexus-utils-3.0.22.jar;C:\Users\Robert  
Scholte\.m2\repository\org\apache\commons\commons-lang3\3.4\commons-lang3-3.4.jar;  
-sourcepath  
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\src\test\java;E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\generated-test-sources\test-annotations;  
-s  
E:\java-workspace\apache-maven-maven\maven\maven-builder-support\target\generated-test-sources\test-annotations  
-g -nowarn -target 1.9 -source 1.9 -encoding UTF-8


Result:
[ERROR] COMPILATION ERROR :
[INFO] -
[ERROR]  
/E:/java-workspace/apache-maven-maven/maven/maven-builder-support/src/test/java/org/apache/maven/building/DefaultProblemTest.java:[25,41]  
package org.apache.maven.building.Problem does not exist
[ERROR]  
/E:/java-workspace/apache-maven-maven/maven/maven-builder-support/src/test/java/org/apache/maven/building/DefaultProblemCollectorTest.java:[24,41]  
package org.apache.maven.building.Problem does not exist


followed by a lot of 'cannot find symbol'.

If all modulepath entries were added to the classpath instead, then it  
compiles fine.


thanks,
Robert


Re: modulepath and classpath mixture

2016-02-23 Thread Robert Scholte
On Tue, 23 Feb 2016 13:59:13 +0100, Alan Bateman <alan.bate...@oracle.com>  
wrote:




On 22/02/2016 20:44, Robert Scholte wrote:

Hi,

first of all I'd like to say that I'm very pleased with the new -mp  
options, these matches better with the way Apache Maven would like to  
work with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to  
src/main/java and put all compile-scoped dependencies to the module  
path, all compiles fines.
I assume that developers are less interested in adding a  
module-info.java file to src/test/java, so that's what I'm doing right  
now too.
Now it seems that I *must* add compile + test scoped to the *classpath*  
to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on the  
modulepath and all test-scoped dependencies on the classpath, so the  
modules keeps their inner related structure, but it seems that the  
classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on the  
classpath is indeed the right approach in this case.


For the tests then I assume they are in the same packages as the sources  
under src/main/java, is that right?


In that case I think you will want to compile the tests as if they are  
part of the module:


   javac  -Xmodule:m  -d testclasses/m  -mp m.jar  test/java/...

where m is the module name and the module (with sources in  
src/main/java) has already been compiled and then packaged as m.jar. The  
-Xmodule: option tells the compiler that you are compiling the test  
classes as if they are part of module m. There is no module-info.java in  
the test tree.


The related lifecycle phases of Maven are: compile, test-compile, test,  
package.
So during test there's no m.jar yet, but target/classes or target/mods/m.  
This shouldn't be an issue, though.


If I understand this correctly I need to know the module name. That is  
information defined in the module-info, meaning I need to read that class  
first. When possible I would like to avoid this. Suppose a developer has  
made a syntax error, I would hope that such an error is thrown by javac,  
not by Maven while doing some pre-compile actions on the source-files to  
construct the correct commandline arguments.


I've already talked with Mark about the usage of -Xpatch, but that's  
required if src/test/java is considered a module too.


And maybe this is the key question: if src/main/java is a module, should  
we handle src/test/java as a module too or leave it as a classpath based  
project?


thanks,
Robert



Going further then I expect that JUnit or TestNG is also in the picture,  
I assume the class path. In that case, the command becomes:


   javac  -Xmodule:m  -d testclasses/m  -mp m.jar   \
   -cp junit-4.12.jar  -XaddReads:m=ALL-UNNAMED  \
   test/java/...

where you are compiling test classes as if they are module m and at the  
same time referencing JUnit types on the class path. The  
-XaddReads:m=ALL-UNNAMED augments the module declaration to say that  
module m reads all unnamed modules, just think class path here.



In order to run then you can use -Xpatch to augment the module with the  
test classes:


   java -Xpatch:testclasses  -mp m.jar  -cp junit-4.12.jar  
-XaddReads:m=ALL-UNNAMED   ...


It is as if the test classes are in m.jar. The alternative is of course  
to add the test classes to the packaged module but you would still need  
the -XaddReads because module m does not (and can not) declare that it  
depends on types on the class path.



While on the topic then I should mention that we have a proposal coming  
to support patches as JAR files as I'm sure you will get to the point  
soon where the test classes are in a JAR file.


Hopefully the above is useful. I completely agree with Jon that we need  
to put down detailed notes and examples. In the case of testing then we  
have tried out the popular test frameworks on the class path (as above)  
and also as modules. In the case of JUnit then we have been successful  
with it on a the module path as an automatic module. Definitely  
something to write up.


-Alan



--
Using Opera's mail client: http://www.opera.com/mail/


Re: modulepath and classpath mixture

2016-02-23 Thread Robert Scholte
On Tue, 23 Feb 2016 01:52:50 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:





On 02/22/2016 12:44 PM, Robert Scholte wrote:

Hi,

first of all I'd like to say that I'm very pleased with the new -mp  
options, these matches better with the way Apache Maven would like to  
work with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to  
src/main/java and put all compile-scoped dependencies to the module  
path, all compiles fines.
I assume that developers are less interested in adding a  
module-info.java file to src/test/java, so that's what I'm doing right  
now too.
Now it seems that I *must* add compile + test scoped to the *classpath*  
to be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on the  
modulepath and all test-scoped dependencies on the classpath, so the  
modules keep their internal structure, but it seems that the  
classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on the  
classpath is indeed the right approach in this case.


thanks,
Robert


Robert,

We definitely need some more detailed notes on setting up javac  
compilations (note to self!) but one thing to note is that by default,  
the unnamed module (i.e. code on the classpath) only has observability  
of the modules in the system image. To make modules on the module path  
observable, you need to use the -addmods option.


-- Jon


Hi Jonathan,

this would indeed explain what I'm facing right now. However, adding  
-addmods gives me the following exception:
Caused by: java.lang.IllegalArgumentException: -addmods requires an  
argument
at  
com.sun.tools.javac.main.Arguments.error(jdk.compiler@9-ea/Arguments.java:708)


Is -addmods followed by the same entries as -modulepath, or by the  
module names? I really hope it is not the latter, because that would mean  
that I first need to discover and read all module-info files.


thanks,
Robert


modulepath and classpath mixture

2016-02-22 Thread Robert Scholte

Hi,

first of all I'd like to say that I'm very pleased with the new -mp  
options; these match better with the way Apache Maven would like to work  
with jars and class-folders.


Here's my use case: I noticed that if I add a module-info to src/main/java  
and put all compile-scoped dependencies to the module path, everything  
compiles fine.
I assume that developers are less interested in adding a module-info.java  
file to src/test/java, so that's what I'm doing right now too.
Now it seems that I *must* add compile + test scoped to the *classpath* to  
be able to compile the test classes.
My first approach was to leave the compile-scoped dependencies on the  
modulepath and all test-scoped dependencies on the classpath, so the  
modules keep their internal structure, but it seems that the  
classpath classes cannot access the modulepath classes.


I'm looking for the confirmation that putting all dependencies on the  
classpath is indeed the right approach in this case.


thanks,
Robert


Re: hg: jigsaw/jake/langtools: support module files directly on file manager module paths

2016-02-08 Thread Robert Scholte

Hi Jon,

thanks, I'll give it a try with the next jigsaw-ea.

I also noticed the following change below.
IIRC a .zip file was considered a valid file extension for classpath  
entries (even though we still don't support it with Maven). Has the zip  
extension been dropped in the case of modules?


thanks,
Robert


+    private void checkValidModulePathEntry(Path p) {
+        if (Files.isDirectory(p)) {
+            // either an exploded module or a directory of modules
+            return;
+        }
+
+        String name = p.getFileName().toString();
+        int lastDot = name.lastIndexOf(".");
+        if (lastDot > 0) {
+            switch (name.substring(lastDot)) {
+                case ".jar":
+                case ".jmod":
+                    return;
+            }
+        }
+        throw new IllegalArgumentException(p.toString());
+    }




On Mon, 08 Feb 2016 03:58:37 +0100, Jonathan Gibbons  
<jonathan.gibb...@oracle.com> wrote:



Hi Robert,

Thanks for the report.   This should now have been addressed by
http://hg.openjdk.java.net/jigsaw/jake/langtools/rev/719a1da641c7

-- Jon


On 02/03/2016 09:38 AM, Robert Scholte wrote:

Hi Jonathan,

it seems like this change is not enough, see:  
Locations.ModulePathLocationHandler::setPaths


    @Override
    void setPaths(Iterable<? extends Path> paths) {
        if (paths != null) {
            for (Path p : paths) {
                if (!Files.isDirectory(p))
                    throw new IllegalArgumentException(p.toString());
            }
        }
        super.setPaths(paths);
    }

I still got the IAE.

thanks,
Robert


On Thu, 21 Jan 2016 02:59:30 +0100,  
<jonathan.gibb...@oracle.com> wrote:



Changeset: 546b5fa35f9a
Author:jjg
Date:  2016-01-20 17:58 -0800
URL: http://hg.openjdk.java.net/jigsaw/jake/langtools/rev/546b5fa35f9a

support module files directly on file manager module paths

!  
src/jdk.compiler/share/classes/com/sun/tools/javac/file/Locations.java


Re: hg: jigsaw/jake/langtools: support module files directly on file manager module paths

2016-02-03 Thread Robert Scholte

Hi Jonathan,

it seems like this change is not enough, see:  
Locations.ModulePathLocationHandler::setPaths


    @Override
    void setPaths(Iterable<? extends Path> paths) {
        if (paths != null) {
            for (Path p : paths) {
                if (!Files.isDirectory(p))
                    throw new IllegalArgumentException(p.toString());
            }
        }
        super.setPaths(paths);
    }

I still got the IAE.

thanks,
Robert


On Thu, 21 Jan 2016 02:59:30 +0100, wrote:


Changeset: 546b5fa35f9a
Author:jjg
Date:  2016-01-20 17:58 -0800
URL:
http://hg.openjdk.java.net/jigsaw/jake/langtools/rev/546b5fa35f9a


support module files directly on file manager module paths

! src/jdk.compiler/share/classes/com/sun/tools/javac/file/Locations.java


Re: Specifying module paths

2016-01-15 Thread Robert Scholte

Hi Paul,

no, I'm not talking about multiple versions of the same module, that  
subject is clear to me.
Alan described my issue quite precisely (it's the first described use case,  
although the others are interesting as well). So it seems that if two  
different modules export the same package, there will be a compilation  
error. No chance of class collisions within a modular system. That's a  
very tight safety belt! For me it is too early to jump to conclusions, but  
this might have a bigger impact than the module separation itself.


thanks,
Robert

On Fri, 15 Jan 2016 20:33:32 +0100, Paul Benedict  
<pbened...@apache.org> wrote:



Robert, in the SOTM document, it explicitly calls out that Module systems
are not required to support multiple versions of a module. Correct me if
wrong, but I think you're hinting at that?

Cheers,
Paul

On Fri, Jan 15, 2016 at 3:06 AM, Robert Scholte <rfscho...@apache.org>
wrote:


On Thu, 14 Jan 2016 23:45:32 +0100, Jonathan Gibbons <
jonathan.gibb...@oracle.com> wrote:





On 01/14/2016 12:25 PM, e...@zusammenkunft.net wrote:


Hello,

If I understood it correctly the modules on the MP must be unique and
are not merged; that's why the order inside the directory does not matter
for the named modules.

Bernd



Let me refine that for you ...

The modules in each directory on the module path must be unique.   A
module with a specific name in a directory on the module path will  
shadow
(hide) any other module with the same name in a later directory on the  
path.


So, the order of directories on the module path matters (just like the
order of entries on a class path matters), but the "order" of entries
within any specific directory on the module path does not matter.

-- Jon



Suppose there's a logging module and a fat module, which also contains the
classes of the logging module, but older.
In my module-info I have
requires logging;
requires fat;

These modules are in the same directory. Which class is loaded first and
why? If it is the order in the module-info, then I would also like to see
an example of the strategy with "requires public".

thanks,
Robert





Re: Specifying module paths

2016-01-15 Thread Robert Scholte
On Thu, 14 Jan 2016 23:45:32 +0100, Jonathan Gibbons wrote:





On 01/14/2016 12:25 PM, e...@zusammenkunft.net wrote:

Hello,

If I understood it correctly the modules on the MP must be unique and  
are not merged; that's why the order inside the directory does not  
matter for the named modules.


Bernd



Let me refine that for you ...

The modules in each directory on the module path must be unique.   A  
module with a specific name in a directory on the module path will  
shadow (hide) any other module with the same name in a later directory  
on the path.


So, the order of directories on the module path matters (just like the  
order of entries on a class path matters), but the "order" of entries  
within any specific directory on the module path does not matter.


-- Jon




Suppose there's a logging module and a fat module, which also contains the  
classes of the logging module, but older.

In my module-info I have
requires logging;
requires fat;

These modules are in the same directory. Which class is loaded first and  
why? If it is the order in the module-info, then I would also like to see  
an example of the strategy with "requires public".


thanks,
Robert


Re: Specifying module paths

2016-01-14 Thread Robert Scholte

Hi,

on the maven-dev list I've received a couple of responses.
The following comments are worth mentioning:

Igor Fedorenko says: "This is a very good proposal. My only suggestion is  
to extend javax.tools CompilationTask API to take modulepath map as  
in-memory parameter. Not a big deal, but it'll be silly to write  
properties file to disk for it to be immediately read by the code executed  
in the same jvm."


I agree with Igor on the in-memory option: JDK-8144665 is first of all  
initiated as a request to have a more effective way to handle modules *for  
build-tools*.


Paul Benedict says: "It sounds like Maven will have to generate many  
.properties files in a build.

1) Modules to compile main source
2) Modules to compile test source
3) Modules to execute tests
4) And what about forking?
I am concerned #4 is going to create issues unless the .properties file  
name is unique enough. Perhaps it can be based on process id."


I haven't had a look at surefire (framework for executing tests) yet, so  
we still need to analyze the impact for it. What Tibor Digana told me is  
that it leans on the Classpath element of the MANIFEST file.


My remarks:

There are 2 things important to me:
a. reference must result in a single file
   A Maven Artifact coordinate refers to exactly one file. In the local  
repository the folder containing that artifact often contains other files  
as well and you cannot predict which files were intended to be added to  
the class/module path just by referring to that folder. A properties file  
would at least result in a 1:1 mapping from dependency to file.

b. predictable order
   With the classpath it was the order of the cp-entries specified, and  
Maven calculated that order based on the "distance" of that dependency in  
relation to the Java project, especially important with transitive  
dependencies. Based on what I read about JEP-261 and 'order' it is *not*  
the "requires X" from the module-info which decides the order. Just like  
the cp argument it is based on the mp argument. This is interesting,  
because what's the order of files within a folder? It depends on the OS  
[1].
	"There is no guarantee that the name strings in the resulting array  
will appear in any specific order; they are not, in particular, guaranteed  
to appear in alphabetical order."

Reordering dependencies is a well known trick in case of class collisions.
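
To make the point about [1] concrete, here is a minimal sketch (not from the  
original mail) of why a build tool cannot rely on the order in which a  
directory is listed; only an explicit sort gives a reproducible order:

    import java.io.File;
    import java.util.Arrays;

    public class ListOrder {
        public static void main(String[] args) {
            String[] names = new File(args[0]).list();
            // The API gives no guarantee about the order of 'names'; it may differ per OS/filesystem.
            System.out.println("as returned: " + Arrays.toString(names));
            Arrays.sort(names);
            // Sorting explicitly is the only way to get a predictable, reproducible order.
            System.out.println("sorted:      " + Arrays.toString(names));
        }
    }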

Regarding the extra file, I'm still not sure about it. To me it adds  
unnecessary complexity. We didn't ask for a replacement of the current  
behavior, just to *add* support for jars as -mp argument. For those who  
want to use the commandline, they can still refer to module folders. As for  
Maven (I cannot speak for other tools), we use the CompilerAPI, so  
commandline-length is not the issue.


thanks,
Robert Scholte

[1] https://docs.oracle.com/javase/7/docs/api/java/io/File.html#list()


Re: [Jigsaw] Apache Maven status update

2016-01-05 Thread Robert Scholte
On Tue, 05 Jan 2016 22:55:03 +0100, Jochen Wiedmann  
<jochen.wiedm...@gmail.com> wrote:


On Thu, Dec 31, 2015 at 1:01 PM, Robert Scholte <rfscho...@apache.org>  
wrote:


The next blocking issue requires some work by the Apache Maven team and/or
at the QDox project[2].
QDox is a Java Parser which is used to read javadoc and/or annotations from
source in order to generate xml or other Java files.
With the module-info.java in place we get a ParserException, which is of
course expected at this moment.


Hi, Robert,

wouldn't it be possible to simply ignore **/module-info.java?

Sounds like the obvious solution to me.

Jochen


QDox has no such option, there was never a need for it: if a java-file is  
within reach, just parse it.
I've talked with Hervé Boutemy about this and in this case we don't need  
source parsing anymore. Everything has already been transformed from  
javadoc to Annotations.
However, for extracting this info both strategies were executed. I've  
improved the corresponding maven-plugin so you can choose which  
extractor(s) to use[1].


The result is that I can run "mvn package" with success for the subset of  
this Maven multimodule project.


The next challenge is: "mvn compile". This means there are no jars  
available yet within the reactor, those Maven modules still have the  
(exploded) directory with classes. Up until now the 'target/classes' was  
used, but according to JEP261 it now must look something like  
target/mods/module.name/ (notice the extra directory). Maven uses the  
Artifact.file field, which points to the classes output directory after  
"compile" and points to the jar after "package". A lot of plugins build up  
the classpath based in this information.
So I'm looking for a solution there the compiler can use target/mods as  
-mp argument, but where other plugins can still use it as a classpath. I  
want to prevent that users need to update almost *every* maven-plugin when  
using Java9.


Robert

[1]  
https://github.com/codehaus-plexus/plexus-containers/commits/plexus-containers-1.x/plexus-component-metadata







Re: Draft of The Java Language Specification Java SE 9 Edition

2015-12-30 Thread Robert Scholte

On Wed, 30 Dec 2015 23:30:30 +0100, Remi Forax <fo...@univ-mlv.fr> wrote:


- Original Message -

From: "Robert Scholte" <rfscho...@apache.org>
To: jigsaw-dev@openjdk.java.net
Sent: Wednesday 30 December 2015 23:11:52
Subject: Draft of The Java Language Specification Java SE 9 Edition

Hi,


Hi Robert,



for QDox (Java Parser @ https://github.com/paul-hammant/qdox ) I'm  
looking
for a draft of the new JLS which contains the syntax for the  
module-info.

Is it already available or could I have a preview of it?


the link is on the OpenJDK page for jigsaw
  http://openjdk.java.net/projects/jigsaw/

class format and jls grammar:
  http://cr.openjdk.java.net/~mr/jigsaw/spec/lang-vm.html



Nice. I know there are still some discussions, but this is a good start  
for me.




Besides that, it seems that the syntax chapter of the JLS for Java SE 8 is
incomplete. It would be great if the missing parts could be added before
releasing the official documents for Java 9.


please can you be a little more specific ?


I haven't been able to parse the following file based on the syntax as  
described in Chapter 19

https://github.com/kronar/qdox-bug/blob/master/src/main/java/EnumWithFields.java

The problem is with the argumentList of the enum.

https://docs.oracle.com/javase/specs/jls/se8/html/jls-8.html#jls-EnumConstant  
is my starting point, but I can't find the way to a MethodInvocation,  
unless I'm missing something of course.
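
For concreteness, the construct in question is an enum whose constants pass  
arguments to a constructor, roughly like this minimal sketch (the names are  
illustrative, not taken from the linked reproducer):

    public enum Planet {
        MERCURY(3.303e+23, 2.4397e6),
        VENUS(4.869e+24, 6.0518e6);

        private final double mass;
        private final double radius;

        // The arguments of MERCURY(...) and VENUS(...) are the argument lists
        // that are hard to reach from the EnumConstant production.
        Planet(double mass, double radius) {
            this.mass = mass;
            this.radius = radius;
        }
    }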


thanks,
Robert





Thanks,
Robert Scholte



cheers,
Rémi


Draft of The Java Language Specification Java SE 9 Edition

2015-12-30 Thread Robert Scholte

Hi,

for QDox (Java Parser @ https://github.com/paul-hammant/qdox ) I'm looking  
for a draft of the new JLS which contains the syntax for the module-info.  
Is it already available or could I have a preview of it?


Besides that, it seems that the syntax chapter of the JLS for Java SE 8 is  
incomplete. It would be great if the missing parts could be added before  
releasing the official documents for Java 9.


Thanks,
Robert Scholte


Re: RFE support jar as modulepath argument

2015-12-07 Thread Robert Scholte

Hi Mark,

I wasn't aware of this request, simply because I assumed that classpath  
and modulepath would allow similar arguments. Main difference: in case of  
a modulepath the module-info is used if available, otherwise it is  
considered an automatic module. When using classpath arguments, the module-info  
is ignored here.


We did however talk about auto recognition of modules, because for the  
end-users it shouldn't matter if the jar is a module or not. For them it  
is just another dependency.

My current focus is on javac and it seems like I don't need it here yet.

thanks,
Robert

On Fri, 04 Dec 2015 01:32:56 +0100, wrote:


2015/12/3 11:49 -0800, rfscho...@apache.org:

On behalf of the Apache Maven team I'm working on the Plexus Compiler[1]
to support compilation of modules and I'm struggling with the
specification of the modulepath.

...

To ensure that exactly the dependencies as specified in the pom.xml are
used (not more or less) without the need to copy files it would be my  
wish

to have -mp also support (jar)-files.


Thanks for the reminder about this.  As I may have mentioned to you when
we spoke at JavaOne, several tool maintainers have made this request.

RFE created: https://bugs.openjdk.java.net/browse/JDK-8144665

- Mark


RFE support jar as modulepath argument

2015-12-03 Thread Robert Scholte

Hi,

On behalf of the Apache Maven team I'm working on the Plexus Compiler[1]  
to support compilation of modules and I'm struggling with the  
specification of the modulepath.


According to JEP 261[1] the path argument of -mp is not the same as that of  
-cp. Based on the commandline help of javac I assumed both paths were  
the same, since the parameter argument has the same name.
After reading the specs it seems like I can only refer to a directory  
containing modules. For a dependency specified in the pom.xml I could  
refer to the directory (within the local repository) containing that  
specific artifact.  However, such directory contains more files, so I  
can't be certain the correct file is picked up (e.g.  
cooomons-lang3-3.4[2]. Both commons-lang3-3.4.jar and  
commons-lang3-3.4-tests.jar might contain a module-info.class, but it is  
uncertain if this was the file specified as dependency).
So it seems like for every Maven Project I need to copy the dependencies  
to a specific folder and use this as argument for -mp. My problem with  
this is:
  a. IO is considered slow, so making these copies is expensive. In case  
of a multimodule Maven project this will have a huge negative impact.  
Based on several mailing threads it seems to me that speed is the most  
important requirement for a build tool.
  b. You will get dozens of copies of the same jar all over your system.  
This reminds me of the times where you added a lib-directory to your java  
project and committed all these files into your SCM...


The beauty of the classpath is that you can *refer* to a specific *jar*.  
It is precise and has no overhead.


Executing javac is probably fast, but the preparations the  
maven-compiler-plugin must do to be able to call javac will take much more  
time.


To ensure that exactly the dependencies as specified in the pom.xml are  
used (not more or less) without the need to copy files it would be my wish  
to have -mp also support (jar)-files.


thanks,
Robert Scholte

[1] https://github.com/codehaus-plexus/plexus-compiler/tree/jigsaw-ea
[2] https://repo1.maven.org/maven2/org/apache/commons/commons-lang3/3.4/


JDeps: detecting offending packages

2015-11-16 Thread Robert Scholte

Hi,

for the maven-jdeps-plugin I use the following code to detect offending  
packages:

---
import java.io.IOException;

import org.codehaus.plexus.util.StringUtils;

public class Base64Codec
{
    @SuppressWarnings( "restriction" )
    public static byte[] Base64decode( String input ) throws IOException
    {
        if ( StringUtils.isNotEmpty( input ) )
        {
            return new sun.misc.BASE64Decoder().decodeBuffer( input );
        }
        else
        {
            return null;
        }
    }
}
---

Previous versions of JDK9 gave me the following output:

classes -> java.base
(classes)
  -> java.io
  -> java.lang
  -> sun.misc   JDK internal API (java.base)



The current b86 gives me:

classes -> java.base
(classes)
  -> java.io
  -> java.lang
  -> sun.misc

So why this change? And is there another way to detect the usage of  
non-accessible classes?


The maven-plugin has a parameter called failOnWarning (default:true) which  
should break the build in order to help developers to change their code  
when they rely on internal classes.


regards,
Robert Scholte


Re: JavaxToolsCompiler

2015-09-18 Thread Robert Scholte


On Fri, 18 Sep 2015 10:26:46 +0200, Alan Bateman  
<alan.bate...@oracle.com> wrote:




On 17/09/2015 22:07, Robert Scholte wrote:

I can confirm the fix.

thanks!

Robert
Thanks for the confirmation. Just to double check, this is with the  
ToolProvider fix rather than the patch to the compiler plugin that one  
of Stuart's mails referenced, right?


As we're exchanging mail, I'm curious as to which plugins are exercised  
by your testing, just in case there are other plugins where we might  
still run into issues.


-Alan




Stuart already closed that pull request[1] since "This has been fixed  
upstream:  
http://hg.openjdk.java.net/jigsaw/jake/langtools/rev/948a1770620e"


I started building Maven[2] with Maven.
Next will be the integration tests[3] and the maven-plugins[4].
That should give a good overview of the compatibility of Maven projects  
with jigsaw


Robert

[1]  
https://github.com/codehaus-plexus/plexus-compiler/pull/13#issuecomment-140565416

[2] https://git-wip-us.apache.org/repos/asf/maven.git
[3] https://git-wip-us.apache.org/repos/asf/maven-integration-testing.git
[4] https://svn.apache.org/repos/asf/maven/plugins/trunk


JavaxToolsCompiler

2015-09-14 Thread Robert Scholte

Hi,

On behalf of the Apache Maven team I'd like to ask for advice for changing  
the JavaxToolsCompiler[1]
This implementation is used when java code is being compiled with Maven  
*by default*, so right now when pointing JAVA_HOME to the latest JDK9  
version builds will fail.
There are other ways to compile, e.g. use the fork-parameter[2] or with  
toolchains[3], but what I'd like to know is whether it is still  
possible/valid to use javax.tools.JavaCompiler and is so: how should we  
rewrite this code?


regards,
Robert Scholte

[1]  
https://github.com/codehaus-plexus/plexus-compiler/blob/master/plexus-compilers/plexus-compiler-javac/src/main/java/org/codehaus/plexus/compiler/javac/JavaxToolsCompiler.java
[2]  
http://maven.apache.org/plugins/maven-compiler-plugin/compile-mojo.html#fork

[3] http://maven.apache.org/guides/mini/guide-using-toolchains.html
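
For reference, the kind of javax.tools usage being asked about looks roughly  
like the stand-alone sketch below (an illustration only, not the  
plexus-compiler code); the open question in the thread is whether  
getSystemJavaCompiler() still returns a compiler on the Jigsaw builds:

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class CompileSketch {
        public static void main(String[] args) {
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            if (compiler == null) {
                // This is the failure mode discussed in this thread: no system compiler found.
                System.err.println("No system Java compiler available");
                return;
            }
            // Compile the given .java files using the default in/out/err streams.
            int exitCode = compiler.run(null, null, null, args);
            System.out.println("javac exit code: " + exitCode);
        }
    }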


Re: JavaxToolsCompiler

2015-09-14 Thread Robert Scholte
On Mon, 14 Sep 2015 20:27:39 +0200, Alan Bateman  
<alan.bate...@oracle.com> wrote:



On 14/09/2015 17:40, Robert Scholte wrote:

Hi,

On behalf of the Apache Maven team I'd like to ask for advice for  
changing the JavaxToolsCompiler[1]
This implementation is used when java code is being compiled with Maven  
*by default*, so right now when pointing JAVA_HOME to the latest JDK9  
version builds will fail.
There are other ways to compile, e.g. use the fork-parameter[2] or with  
toolchains[3], but what I'd like to know is whether it is still  
possible/valid to use javax.tools.JavaCompiler and if so: how should we  
rewrite this code?


Thanks for bringing this up as a few people have reported issues with  
Maven not finding the compiler.


Just to be clear, are you seeing this issue with the regular JDK 9 EA  
builds or just the Jigsaw EA builds?


Regular JDK9 EA works fine, I'm only seeing it with Jigsaw EA


Did this start when tools.jar went away?


I guess so.

I just did a quick test to check that  
ToolProvider.getSystemJavaCompiler() returns the system  
JavaCompiler for both builds (and it does). Is the issue that  
you are seeing that getSystemJavaCompiler() is returning null?


Stuart said yes, I think so too.



-Alan.


Robert


Re: Apache Maven JDeps Plugin

2015-02-19 Thread Robert Scholte

Hi Mandy,

based on your proposal I've added 2 parameters:  
dependenciesToAnalyzeIncludes and dependenciesToAnalyzeExcludes
This way it's not an All-Or-Nothing option, but instead you have full  
control over the dependencies to include.
You can select the dependencies by using the pattern groupId:artifactId in  
combination with '*'.


Some configuration examples:
This will select all dependencies for the specified scope (compile or  
test, depending on the goal)

  <dependenciesToAnalyzeIncludes>
    <include>*:*</include>
  </dependenciesToAnalyzeIncludes>

Here are some other patterns, which are allowed
  <dependenciesToAnalyzeIncludes>
    <include>org.foo.*:*</include>
    <include>com.foo.bar:*</include>
    <include>dot.foo.bar:utilities</include>
  </dependenciesToAnalyzeIncludes>

With dependenciesToAnalyzeExcludes you can exclude a subset of  
dependenciesToAnalyzeIncludes.


  <dependenciesToAnalyzeExcludes>
    <exclude>org.foo.test:*</exclude>
  </dependenciesToAnalyzeExcludes>

This should match your requirements.

Regards,
Robert Scholte


On Wed, 18 Feb 2015 05:46:37 +0100, Mandy Chung
mandy.ch...@oracle.com wrote:


Hi Robert,

Indeed this looks very useful.

On 2/16/2015 10:45 AM, Alan Bateman wrote:

On 16/02/2015 18:28, Robert Scholte wrote:

Hi Alan,

if you are referring to the -R / -recursive option of the jdeps tool,  
then yes you can.
See  
http://maven.apache.org/plugins-archives/maven-jdeps-plugin-LATEST/maven-jdeps-plugin/jdkinternals-mojo.html#recursive
I think jdeps is first of all interesting for the classes of the  
current Java project, so I've set the default of this parameter to  
'false'. However, if the majority thinks it is better to activate this  
by default, we will consider to change this value.
I could imagine wanting to run it twice: once for the current project  
where I want the build to fail if it makes direct use of JDK-internal  
APIs, and a second time to run with -R and emit warnings if any of the  
transitive dependences (that I don't control) are using JDK internal  
APIs.


As an alternative to running jdeps -R, the plugin can run jdeps on all  
of the transitive dependences of the current project (all JAR files can  
be put in one jdeps command); that will find out if any of its  
dependences (analyzing all classes) is using JDK internal API.


jdeps -R will only analyze classes that are referenced from the root set  
(i.e. the arguments passed to jdeps, which I assume to be the current  
project) and doesn't analyze any class in the dependences that is not  
referenced transitively.


The default of running jdeps on the current project sounds reasonable to  
me.


Mandy


Re: Apache Maven JDeps Plugin

2015-02-16 Thread Robert Scholte

Hi Alan,

I've added a flag called failOnWarning (default:true), assuming that the  
usage of jdkinternals is considered a warning and not an error.
With the following configuration you'll be able to run jdeps multiple  
times within the same build.


  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jdeps-plugin</artifactId>
    <version>3.0-SNAPSHOT</version>
    <executions>
      <execution>
        <id>classes</id>
        <goals>
          <goal>jdkinternals</goal>
          <goal>test-jdkinternals</goal>
        </goals>
      </execution>
      <execution>
        <id>dependencies</id>
        <goals>
          <goal>jdkinternals</goal>
          <goal>test-jdkinternals</goal>
        </goals>
        <configuration>
          <recursive>true</recursive>
          <failOnWarning>false</failOnWarning>
        </configuration>
      </execution>
    </executions>
  </plugin>

This should match your requirements.

Robert


On Mon, 16 Feb 2015 19:45:16 +0100, Alan Bateman  
alan.bate...@oracle.com wrote:



On 16/02/2015 18:28, Robert Scholte wrote:

Hi Alan,

if you are referring to the -R / -recursive option of the jdeps tool,  
then yes you can.
See  
http://maven.apache.org/plugins-archives/maven-jdeps-plugin-LATEST/maven-jdeps-plugin/jdkinternals-mojo.html#recursive
I think jdeps is first of all interesting for the classes of the  
current Java project, so I've set the default of this parameter to  
'false'. However, if the majority thinks it is better to activate this  
by default, we will consider to change this value.
I could imagine wanting to run it twice: once for the current project  
where I want the build to fail if it makes direct use of JDK-internal  
APIs, and a second time to run with -R and emit warnings if any of the  
transitive dependences (that I don't control) are using JDK internal  
APIs.


-Alan