Re: [External] : Re: Annotation Dependencies and Requires Static Transitive

2021-07-06 Thread Alex Buckley
Presumably this is a javac lint warning about how types referenced from 
an exported API (e.g., the return type of an exported public method) 
should themselves be exported.


It's hard to tell from 
https://docs.oracle.com/en/java/javase/16/docs/specs/man/javac.html#option-Xlint-custom 
exactly which kind of warning it is, but `exports` is plausible.


The types of annotations used on the exported API should probably be excluded 
from that lint check, whatever the retention policy of those annotation 
types. (This feels like something that's come up before...)
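
If it is indeed the `exports` category (the [exports] prefix in the message 
below suggests so), then, as a stopgap, the warning can be switched off when 
compiling the Caffeine module:

    javac -Xlint:-exports ...

That silences the noise but doesn't address the underlying question of whether 
annotation-only dependencies ought to trigger it at all.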


Alex

On 6/7/2021 4:45 AM, Anand Beh wrote:

Thanks to you both for the advice.

Following your suggestions, Caffeine changed to "requires static".
However, javac now produces a warning about the lack of transitivity:

warning: [exports] class Nullable in module
org.checkerframework.checker.qual is not indirectly exported using
requires transitive

Is this a bug in javac, or is it something we should be worried about?

Regards,
Anand

On 6/3/21, Alex Buckley  wrote:

Even without `transitive`, requiring modules with `static` means that
anyone who habitually builds their entire stack from source will still
need the errorprone and checker-qual modules at compile time.

There are no "run-time only" dependencies in module declarations, unless
services come into play, which is not realistic here. As Remi said,
`requires static` is your best bet. Annotation types that are exported
by a third party such as Google, in order for people to annotate their
personal codebases, are not really an API that those personal codebases
need to export -- any fourth party program that wishes to inspect the
annotations in your codebase needs to arrange its own dependency on the
annotation types from the third party.

Alex

On 6/3/2021 1:10 PM, Anand Beh wrote:

Hello,

The cache library Caffeine recently added a full module descriptor. It
has no runtime dependencies, but it depends on metadata annotations
from checker-qual and errorprone, for example @NotNull and
@CanIgnoreReturnValue. The module looks like this:
module com.github.benmanes.caffeine {
   exports com.github.benmanes.caffeine.cache;
   exports com.github.benmanes.caffeine.cache.stats;

   requires static transitive com.google.errorprone.annotations;
   requires static transitive org.checkerframework.checker.qual;
}

The annotations are not required at runtime, hence static. They're
visibly placed on public methods and return types, so that API clients
can benefit from them for the purposes of annotation-based null
analysis, kotlin interop, etc. As the annotations are part of the API,
they're marked transitive.

However, the "transitive" aspect imposes some requirements on users. I
am wondering if there is a more correct way to declare these
annotation dependencies than static transitive.

One user would like to avoid the presence of these annotations at
compile-time. For reference, here's the relevant discussion:
https://github.com/ben-manes/caffeine/issues/535

I'm not a maintainer of caffeine, though I was involved in its
modularization.

Regards,
Anand





Re: Annotation Dependencies and Requires Static Transitive

2021-06-03 Thread Alex Buckley
Even without `transitive`, requiring modules with `static` means that 
anyone who habitually builds their entire stack from source will still 
need the errorprone and checker-qual modules at compile time.


There are no "run-time only" dependencies in module declarations, unless 
services come into play, which is not realistic here. As Remi said, 
`requires static` is your best bet. Annotation types that are exported 
by a third party such as Google, in order for people to annotate their 
personal codebases, are not really an API that those personal codebases 
need to export -- any fourth party program that wishes to inspect the 
annotations in your codebase needs to arrange its own dependency on the 
annotation types from the third party.
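
Concretely, under this advice the descriptor quoted below would just drop the 
`transitive` modifier -- a sketch, not the maintainers' actual change:

    module com.github.benmanes.caffeine {
        exports com.github.benmanes.caffeine.cache;
        exports com.github.benmanes.caffeine.cache.stats;

        requires static com.google.errorprone.annotations;
        requires static org.checkerframework.checker.qual;
    }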


Alex

On 6/3/2021 1:10 PM, Anand Beh wrote:

Hello,

The cache library Caffeine recently added a full module descriptor. It
has no runtime dependencies, but it depends on metadata annotations
from checker-qual and errorprone, for example @NotNull and
@CanIgnoreReturnValue. The module looks like this:
module com.github.benmanes.caffeine {
   exports com.github.benmanes.caffeine.cache;
   exports com.github.benmanes.caffeine.cache.stats;

   requires static transitive com.google.errorprone.annotations;
   requires static transitive org.checkerframework.checker.qual;
}

The annotations are not required at runtime, hence static. They're
visibly placed on public methods and return types, so that API clients
can benefit from them for the purposes of annotation-based null
analysis, kotlin interop, etc. As the annotations are part of the API,
they're marked transitive.

However, the "transitive" aspect imposes some requirements on users. I
am wondering if there is a more correct way to declare these
annotation dependencies than static transitive.

One user would like to avoid the presence of these annotations at
compile-time. For reference, here's the relevant discussion:
https://github.com/ben-manes/caffeine/issues/535

I'm not a maintainer of caffeine, though I was involved in its modularization.

Regards,
Anand



Re: @Generated requires java.compiler / what should my annotation processor do

2020-10-30 Thread Alex Buckley
The type javax.annotation.processing.Generated was the replacement 
introduced in Java SE 9 for the type javax.annotation.Generated.


Why was a replacement needed for the type javax.annotation.Generated? 
Because the package javax.annotation belonged to Java EE, and was 
removed in Java SE 11.


By contrast, the package javax.annotation.processing has always been 
part of Java SE and is the logical home for a "Generated" type 
associated with annotation processors.


Since the type javax.annotation.processing.Generated has source 
retention only, there are no @javax.annotation.processing.Generated 
annotations in any class files compiled from source code emitted by your 
processor, so your downstream users don't need `requires java.compiler;`.


Alex

P.S. Your annotation processor `requires java.annotation;` but there is 
no such module.
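
In other words, the processor's own descriptor quoted below only needs 
something like the following -- a sketch based on the declaration in your 
mail, with the nonexistent `java.annotation` line dropped:

    module io.avaje.inject.generator {
        requires java.compiler;
        requires io.avaje.inject;

        provides javax.annotation.processing.Processor
            with io.avaje.inject.generator.Processor;
    }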


On 10/29/2020 10:49 PM, Rob Bygrave wrote:

Are Java 9+ annotation processors using module path supposed to generate
source code that has a javax.annotation.processing.Generated annotation?
(which is part of java.compiler)

I have an annotation processor that supports modules and is fine except
that when module applications use it the @Generated will not be included in
the generated source unless the application module-info adds a *requires
java.compiler;*  (which seems wrong).

So this is a minor niggle that the @Generated no longer is included in the
generated source when apps go from class path to module path.

Apologies if this has been asked before.  I was unable to find anything on
this issue.

Any pointers or thoughts?


*Background*

I have an annotation processor that generates source code (that now
supports java modules with a module-info via a Multi-Release jar).  The
module-info for the annotation processor is:

module io.avaje.inject.generator {

   requires java.compiler;
   requires io.avaje.inject;
   requires java.annotation;

   provides javax.annotation.processing.Processor with
io.avaje.inject.generator.Processor;
}


The annotation processor adds a @Generated to the generated source to
document that the source code is generated. The annotation processor only
adds the @Generated if javax.annotation.processing.Generated is deemed
available by using:

  *elements.getTypeElement("javax.annotation.processing.Generated") != null*



The annotation processor generates source that includes the @Generated in
the cases of:

- There is no module-info.java  (app is using class path and not module
path)
- The app module-info.java includes:   *requires java.compiler;*

I'm not expecting users of this annotation processor to put a *requires
java.compiler;* into their module-info.

I think I must be doing something wrong or should not try to
use javax.annotation.processing.Generated which is part of java.compiler.


Cheers, Rob.



Re: Force JPMS to add module to boot layer

2020-06-23 Thread Alex Buckley

On 6/23/2020 9:31 AM, Alex Orlov wrote:

From what I can establish, org.hibernate.orm.core is an automatic
module so it doesn't have any way to `requires net.bytebuddy` (an
explicit module). You'll have to help out by adding `--add-modules
net.bytebuddy` to the command line that surefire or failsafe uses.
  
Yes, and this is the work that I say is very silly. We add modules to the
module path, but JPMS IGNORES THEM. It is obvious that it was a wrong solution
to let JPMS decide what modules it should add to the layer and what modules
it shouldn't.


The module path is not like the class path. The class path is chaos; the 
module path is order. Think of the module path as a repository that 
contains tens, hundreds, thousands of modules, all of which sit silently 
until one is required by some module, or resolved due to `--add-modules`.
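
For example, a launch along these lines (paths and the main module are 
illustrative, not the exact surefire command) pulls net.bytebuddy into the 
graph even though no explicit module requires it:

    java --module-path mods --add-modules net.bytebuddy -m my.app/com.example.Main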


You're asking the JDK team how to use the ORM core of Hibernate as an 
automatic module. What did the Hibernate team say when you asked them?


Alex


Re: excluding transitive module

2020-04-15 Thread Alex Buckley

On 4/15/2020 12:00 AM, Jochen Theodorou wrote:

On 14.04.20 19:38, Alex Buckley wrote:
[...]

It's fine for
Library1 to require SharedApi, and for Library2 to require
SharedApiImpl, but if the exported packages overlap, then it's not fine
for Project to require, indirectly, both SharedApi (via Library1) and
SharedApiImpl (via Library2). That is, SharedApi and SharedApiImpl are
not composable.


And what do you do when you are in that situation?


If you had put them both on the classpath in JDK 8, then
you would have had a broken system. If you want to compose an
application that includes both, then one of them has to change.
There is no need to speak of "transitive" modules because `requires
transitive` has not entered the picture.


Without the module system I can just exclude one and be fine. With the
module system I get into a split package problem and have not found a
real solution yet.


In the JDK 8 world:  If SharedApi and SharedApiImpl have different 
names, then (as you implied to Remi) Maven is going to treat them 
independently and put both on the classpath. Yes, you can realize that 
their content overlaps and exclude one of them in your POM, but *every* 
*single* *user* has to travel that road independently! If a user doesn't 
realize, and doesn't exclude, then their classpath has split packages 
and their application is broken. I think it's outrageous that the 
developers of SharedApi and SharedApiImpl imposed this tax on *every* 
*single* *user* and no-one held them to account.


Post JDK 8:  Only when SharedApi and SharedApiImpl are modularized does 
the presence of overlapping exports become clear, and prevent an 
application from being composed unsafely.


There is nothing you can do in the module-info.java of your application 
to express that its module graph indirectly requires the uncomposable 
modules M,N,O and that any `requires M` and `requires N` clauses should 
be overridden with `requires O`. This would be a maintenance nightmare. 
Mail their developers and tell them what they've done. Every library 
including its own copy of a standard API is a bug, not a feature.


(If a library is explicitly written to be loaded in its own class 
loader, such as in the OSGi environment, then things are different: 
private copies of APIs are OK, up to a point. However, the libraries you 
are using are plainly from the JDK 8 classpath era and have been lightly 
modularized -- enough for each vendor to let their module be required 
and jlinked, but not enough to address the bigger architectural issue of 
reuse / composability.)


Alex


Re: excluding transitive module

2020-04-14 Thread Alex Buckley

On 4/14/2020 3:12 AM, Jochen Theodorou wrote:

On 14.04.20 11:09, Alan Bateman wrote:

On 14/04/2020 09:24, Jochen Theodorou wrote:

Hi all,

I am wondering if there is a solution purely in the module-info for 
this:


* Project requires Library1 and Library2
* SomeLibrary requires SharedApi
* OtherLibrary requires SharedApiImpl

The problem is, that it will not compile because SharedApi and
SharedApiImpl implement the same packages but not the same logic. One is
just an API, while the other is an implementation of the API.



How does SharedApi locate the implementation?


It does not, there is no SPI. It just implements the interfaces.



Why is OtherLibrary requiring SharedApiImpl? This suggests that
SharedApiImpl is more than an implementation; does it have an additional
implementation-specific API that OtherLibrary makes use of?


There can be different implementations (it is just not an SPI
architecture). There is one that then builds on top of SharedApi, but it
is not the implementation I need.


All the talk of "implements the same packages" and "an implementation of 
the API" and "just implements the interfaces" is besides the point. 
There are no services here. All the module system has to work with are 
the packages exported by SharedApi and SharedApiImpl. It's fine for 
Library1 to require SharedApi, and for Library2 to require 
SharedApiImpl, but if the exported packages overlap, then it's not fine 
for Project to require, indirectly, both SharedApi (via Library1) and 
SharedApiImpl (via Library2). That is, SharedApi and SharedApiImpl are 
not composable. If you had put them both on the classpath in JDK 8, then 
you would have had a broken system. If you want to compose an 
application that includes both, then one of them has to change.


There is no need to speak of "transitive" modules because `requires 
transitive` has not entered the picture.


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-25 Thread Alex Buckley

On 2/25/2020 10:48 AM, Robert Scholte wrote:

For the test sources it is unclear.
One reason to not provide a module descriptor is because it will never 
be a requirement for other modules.
And what about its name, as it is actually a required module and at the 
same time also the same module.


One reason to provide a module descriptor is for the build tool to know 
which modules are required, so it can provide the right set of modules.
But what would such a descriptor for test sources look like? Well, in the 
end it must contain everything from the main module descriptor + the 
test requirements.
I assume that copy/pasting the main module descriptor for a new test 
module descriptor + adding the extra requirements is not the preferred 
solution.


Correct, that is no-one's preferred solution. My preferred solution (and 
I think Christian's) is for the test module's descriptor to stay 
physically independent of the main module's descriptor. Then, the build 
tool (i) FINDS the test module's descriptor and (ii) MERGES it logically 
with the main module's descriptor. Suppose the merged descriptor says 
that the main module requires org.junit -- the build tool turns that 
into an --add-modules command when compiling and running tests. Maven 
could be doing this today, helping users to focus on their white-box 
test modules rather than asking users to type obscure --add-* commands 
in the POM.


Furthermore, from what Christian has posted, the effect of merging is 
always the same anyway -- require some JUnit modules, and open the main 
module. So, in the common case, people don't even need to write a test 
module descriptor; they just need the Maven test scope to inject the 
customary --add-* commands when compiling and running code in the main 
module.


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-25 Thread Alex Buckley

On 2/25/2020 9:11 AM, Alex Buckley wrote:
And there are other ways, where the build tools (or their plugins) take 
responsibility for arranging the test-time module graph. This may 
require more work on the part of build tool/plugin maintainers than 
simply telling their users to write JDK command line options in the 
middle of a config file.


Roy knows what I mean: 
https://twitter.com/royvanrijn/status/1232309684788432896


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-25 Thread Alex Buckley

On 2/25/2020 8:16 AM, Remi Forax wrote:

No, you can't, because you cannot reconstruct a module-info from the
flags --add-modules, --add-reads, etc.

Those flags have been created to patch (mostly the JDK) module
metadata so an application using the classpath can open/patch the JDK
modules' metadata enough to be able to run. The semantics of a
module-info is richer than the semantics of the flags; for example,
there is no way to define a transitive dependency or to define a
service. I believe that roughly half of the semantics of a
module-info has no equivalent in terms of flags (and for good reason,
those flags have not been created to describe a module-info).


In the context of JUnit, the module-info.java file of a test module does 
not define a transitive dependency or define a service. It opens a 
module and adds some direct dependencies -- clauses that are expressible 
with the current command line options. Other test frameworks may want to 
arrange things that are not expressible with the current command line 
options today, but there is no guarantee that --patch-module-descriptor 
would be able to handle those things either, e.g., if the test module 
wishes to remove dependencies or exports of the main module rather than 
always add them. But for what JUnit needs, a build tool plugin could be 
created today.



No need for --patch-module-descriptor in the JDK.


We need a way to express that the module graph when testing is
different from the one when running in production. 
--patch-module-descriptor is a way to get that.


And there are other ways, where the build tools (or their plugins) take 
responsibility for arranging the test-time module graph. This may 
require more work on the part of build tool/plugin maintainers than 
simply telling their users to write JDK command line options in the 
middle of a config file.


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-19 Thread Alex Buckley

On 2/19/2020 3:46 AM, Christian Stein wrote:

On Tue, Feb 18, 2020 at 6:42 PM Alex Buckley 
wrote:


Good description of how the main module
[...]
etc. Then, the build tool can enrich the main module by passing
--add-modules, --add-reads, and --add-opens to javac/java while in a
test scope. No need for --patch-module-descriptor in the JDK.


But, as Robert already noted, each build tool will come up with
a different notation for their users to configure those additional
elements. There's nothing wrong with that per se -- just that there's
already a common notation for providing modular information:
"module-info.java"

And it's hard to explain why users may write "module-info.java"
files for inter-module testing that build tools pass 1:1 to javac and
friends -- but have to, depending on the tool, supply:

   - command-line flags... [Maven]
   - or a module-info.test file... [Gradle via Plugin]
   - or module-info.java file... [Pro, Bach.java]
   - or some-other-module-layer-magic configuration...

only for in-module testing.


I don't know how you concluded any of the above. I explicitly said (and 
you removed from the quote) that: "If a build tool has detected the 
presence of the test module, then the build tool can use 
javax.lang.model.element.ModuleElement to inspect the test module's 
declaration and infer what it requires, whether it's open, etc." -- 
"test module's declaration" means "org.astro/test/module-info.java".


That is, if a test framework intends that a test module has a 
module-info.java file to augment the main module's module-info.java file 
(and maybe not all test frameworks want to work like that...), then a 
build tool plugin for that test framework is well able to inspect the 
former file and turn it into command line options for javac/java in the 
test scope. The JDK isn't in the business of standardizing test 
framework practices, and already provides the APIs you need to implement 
any practice you like.


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-18 Thread Alex Buckley

On 2/14/2020 9:34 PM, Christian Stein wrote:

Assuming `--patch-module-descriptor` was already available, the
org.astro test story would read:

- org.astro/main/module-info.java:

module org.astro {
   exports org.astro;
   requires java.logging;
}

- org.astro/test/module-info.java:

open /*test*/ module org.astro /*extends main module org.astro*/ {
   requires java.sql; // system-provided module needed by test code
   requires org.junit.jupiter; // framework reflecting entry-points
   requires org.assertj.core; // ...and more test-related modules
}

A build tool now easily may detect that `--patch-module-descriptor`
is required at test compile and run time. It could infer it from the
same module name or simply by seeing a main and a test
`module-info.java` compilation unit within the sources of the
org.astro Maven-/Gradle-/pro-/Bach-/any-build-tool-"module".

Build tools could pass the following argument to javac/java
while being in their test-scope tasks. Of course, implying that
org.astro/test/module-info.java is the current "primary" module
descriptor:

--patch-module-descriptor org.astro=org.astro/main/module-info.java

That's it.

The primary test-relevant modular directives are enriched by
those of the main module as declared in org.astro/main/module-info.java
resulting effectively in a synthetic in-module test module descriptor that
resembles the inter-module test descriptor presented above:

- SYNTHETIC test module for org.astro

open /*test*/ module org.astro /*extends main module org.astro*/ {
   requires java.sql; // system-provided module needed by test code
   requires org.junit.jupiter; // framework reflecting entry-points
   requires org.assertj.core; // ...and more test-related modules
   //
   exports org.astro; // patched from org.astro/main/module-info.java
   requires java.logging; // patched from org.astro/main/module-info.java
}


Good description of how the main module 
(org.astro/main/module-info.java) and the test module 
(org.astro/test/module-info.java) interact to form a test-scoped module. 
If a build tool has detected the presence of the test module, then the 
build tool can use javax.lang.model.element.ModuleElement to inspect the 
test module's declaration and infer what it requires, whether it's open, 
etc. Then, the build tool can enrich the main module by passing 
--add-modules, --add-reads, and --add-opens to javac/java while in a 
test scope. No need for --patch-module-descriptor in the JDK.
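
For the org.astro example quoted above, the options a build tool would inject 
look roughly like this (a sketch of the general shape only; the exact flags 
depend on the framework and on what the test descriptor declares):

    javac --module-path lib \
          --patch-module org.astro=src/org.astro/test/java \
          --add-modules org.junit.jupiter,org.assertj.core,java.sql \
          --add-reads org.astro=org.junit.jupiter,org.assertj.core,java.sql \
          -d out/test \
          $(find src/org.astro/test/java -name '*.java')

with the same --patch-module/--add-modules/--add-reads options, plus
--add-opens org.astro/org.astro=org.junit.platform.commons, passed to java in
the test phase.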


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-14 Thread Alex Buckley

On 2/14/2020 11:29 AM, Christian Stein wrote:

A different way to address those use cases would be to ship the
logic just described for `--patch-module-descriptor` in a launcher
offered by the test framework itself.
 
That's too late in the build game. You already need this merged
descriptor at compile time. And you need the user to define the
additional elements (modifiers and directives), as no build tool
may always infer them correctly by inspecting the test sources...


Fair point, and thanks for reminding me about your "Testing In The 
Modular World" document.


The thing I'm having trouble with is that javac already lets you specify 
the --add-reads options on org.junit.jupiter.api & friends that are 
needed to compile test code. If Maven expects the user to painfully 
configure the POM to pass --patch-module-descriptor to javac, why can't 
the POM offer an easy way to make Maven itself (in the `test-compile` 
phase?) pass some defined-behind-the-scenes --add-reads options?


Similarly, at run time, you propose to always open the main module to 
the same org.junit.platform.commons module -- `module-info.test` is more 
or less the same for every JUnit project -- so why doesn't the POM have 
an easy way to let Maven itself (in the `test` phase?) pass --add-opens 
to java?


There are no new modularity "primitives" here (by which I mean features 
of the module system itself, such as open modules) ... there is just 
detailed configuration which, in the testing context, should be done by 
the build tool rather than the user.


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-14 Thread Alex Buckley

On 2/14/2020 12:02 AM, Robert Scholte wrote:
To me, having the correct (main) module descriptor should be the ultimate 
goal. People should not be tempted to apply workarounds on this 
descriptor to make things work for tests (even though those modules are 
for sure part of the Java runtime), especially because of the impact 
wrong module descriptors would have, which cannot be adjusted afterwards.


Nowhere else in Java is there the ability to "patch" one declaration by 
making further declarations in source code, so I don't think there's a 
way forward for a `patch module ...` declaration. I understand that it 
looks good when viewed through the lens of testing, where there's 
exactly one secondary module declaration (the test module) and it's 
tightly scoped to support white-box tests. However, we have to consider 
if it's a good idea in general for a module to be defined by parts, 
where anyone could throw multiple `patch module foo {..}` declarations 
on the module path and have each of them modify the "master" foo module.


The answer is that it's not a good idea: (i) different `requires` 
clauses might clash (i.e. read modules which export the same packages), 
(ii) one patch might `open` the module while another patch doesn't want 
it open (and there's no way to say non-open), and (iii) the content of 
packages in the unified module is now determined by the order in which 
the "master" module and the "patch" modules are deployed on the module path.


These things are so undesirable that even `--patch-module` is unable to 
override module declarations! That said, command line options can have 
superpowers that would never be expressible in source code declarations, 
so one option might be a `--patch-module-descriptor` option. This would 
be orthogonal to `--patch-module`, which deliberately has leaky 
classpath-style semantics for its specified content, and that's neither 
desirable nor useful here. Within limits, `--patch-module-descriptor` 
would merge the declaration of the "patch" module into the "master" 
module, and fail hard if the limits are broken. What are the limits? A 
starting point is the rules for versioned module descriptors in a 
modular multi-release JAR -- 
http://openjdk.java.net/jeps/238#Modular-multi-release-JAR-files -- and 
perhaps it's reasonable to allow additional _qualified_ exports in the 
"patch" module.


Now, I am not "proposing" `--patch-module-descriptor` for a future JDK. 
I am recognizing that certain use cases involve changing a module's 
dependences and encapsulation in a tightly scoped way. A different way 
to address those use cases would be to ship the logic just described for 
`--patch-module-descriptor` in a launcher offered by the test framework 
itself. If JUnit creates its own module layer, then it can define module 
descriptors at run time however it likes. It's not clear to me if 
Christian & co. have looked into java.lang.ModuleLayer, 
java.lang.module.Configuration, and java.lang.module.ModuleFinder. If 
they have, we'd love to hear their experiences. If they haven't, they 
should.
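
For reference, a minimal sketch of that layer-based approach (module and class 
names here are hypothetical, and the directory layout follows the org.astro 
example discussed earlier in this thread):

    import java.lang.module.Configuration;
    import java.lang.module.ModuleFinder;
    import java.nio.file.Path;
    import java.util.Set;

    public class TestLayerLauncher {
        public static void main(String[] args) throws Exception {
            // Locate modules in the test output directory and the library directory.
            ModuleFinder finder =
                ModuleFinder.of(Path.of("out/modules/test"), Path.of("lib"));

            // Resolve the test module (and bind its services) against the boot layer.
            ModuleLayer boot = ModuleLayer.boot();
            Configuration cf = boot.configuration()
                .resolveAndBind(finder, ModuleFinder.of(), Set.of("org.astro"));

            // Define the resolved modules in a new layer with a single class loader.
            ModuleLayer layer =
                boot.defineModulesWithOneLoader(cf, ClassLoader.getSystemClassLoader());

            // A test framework could now load entry points from the layer.
            Class<?> tests =
                layer.findLoader("org.astro").loadClass("org.astro.WorldTests");
            System.out.println("Loaded " + tests + " from " + tests.getModule());
        }
    }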


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-12 Thread Alex Buckley

On 2/12/2020 1:08 PM, Robert Scholte wrote:

To prevent these workarounds and to provide an easier way to patch a
module via a dedicated descriptor will help keep the module
system cleaner.


It will lead to "split modules" on the modulepath, which will cause just 
as many maintenance headaches as split packages on the classpath. Yes, 
there is some maintenance benefit if a module explicitly declares that 
it patches (i.e. becomes part of) another module (which might then have 
to explicitly declare that it allows patching) ... but for a developer 
to understand the resulting module graph requires looking at everything 
on the modulepath, which is no better than having to look at everything 
on the classpath. In Java, a declaration -- whether a module, a class, 
or a method -- happens in one source file, and it's up to tools to 
rewrite declarations if other interesting source files are known to 
those tools.



However, recently I've been informed of this case: if the test sources
use one of the java.* modules (that are not used by the main sources),
the only correct way to solve it now is by adding the required flags
by hand (and only to the test-compile configuration!). This is hard
to explain, and instead of diving into the specifications and
understanding what's happening, you'll see that they choose the
easy workaround: add this "test scoped" module as a required module
to the module descriptor.


Is there nothing that Maven can do to make the test-compile 
configuration easier to create? Maven has all the source code at its 
fingertips, including knowledge of module directories which seem to 
declare the same module more than once because JUnit recommends it, yet 
still Maven makes the user laboriously write out the command line flags 
for patching?


Alex


Re: RFE simplify usage of patched module [by Robert Scholte, from jdk-dev]

2020-02-10 Thread Alex Buckley

Hi Christian,

On 2/7/2020 4:41 AM, Christian Stein wrote:

This time, I created a project at [0] with a detailed description on its
front page, i.e the README.md file.

[0]: https://github.com/sormuras/java-module-patching


To restate:

- You're saying that, today, it's brittle to copy directives from 
src/org.astro/main/java/module-info.java to 
src/org.astro/test/java/module-info.java. (And having copied them, you 
still need to `open` the test module and add some `requires`.)


- You suggest that, in future, there will still be a 
src/org.astro/test/java/module-info.java file which is of interest to 
test frameworks.


What you're hoping is that new syntax will let you invert the patching:

- Today, you set the module-path so that out/modules/test/org.astro.jar 
is the "primary" version of the module; then you redefine everything in 
it except module-info by overlaying out/modules/main/org.astro.jar.


- In future, you want to have out/modules/main/org.astro.jar as the 
"primary" version, and redefine only its module-info by specifying the 
sidecar out/modules/test/org.astro.jar. The sidecar would have some 
syntax to identify that its declaration of org.astro is strictly 
additive to the "primary" version. You would set the module-path to 
out/modules/main:out/modules/test:lib so that the module system (1) 
finds the "primary" version in out/modules/main and (2) augments its 
module-info with sidecars found in out/modules/test and lib. I assume 
some new command line option would enable or enumerate the sidecars 
explicitly, because obviously the dependences and exports of 
out/modules/main/org.astro.jar shouldn't depend on which JAR files 
happen to be lurking deep in the module-path.


Stepping back, the core issue is that once the true "primary" version of 
a module is built -- out/modules/main/org.astro.jar -- you don't want to 
change it. No build or test tool wants to physically rewrite its 
module-info to require test-time dependencies at test time, and then to 
not require such dependencies at release time. You want the module 
system to virtually rewrite the module-info instead. And that's already 
possible, as long as you put on your test-colored sunglasses and view 
out/modules/test/org.astro.jar as the "primary" version, and 
out/modules/main/org.astro.jar as the overlay ... once the tests have 
run, go back to viewing out/modules/main/org.astro.jar as the "primary" 
version. Introducing a new kind of module descriptor, with merging by 
the module system according to new command line options, seems like a 
lot of overhead that can already be worked around by tools at test-time.
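
In command-line terms, that test-time view is roughly the following (paths 
follow the project layout above; the precise flags are the build tool's 
business):

    java --module-path out/modules/test:lib \
         --patch-module org.astro=out/modules/main/org.astro.jar \
         --add-modules org.astro \
         ...

Once the tests have run, release builds simply put out/modules/main on the 
module path with no patching at all.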


Alex


Re: 8233922: Service binding augments module graph with observable incubator modules

2019-11-18 Thread Alex Buckley

On 11/18/2019 11:10 AM, Alan Bateman wrote:
We could potentially say something about tools that are incubator 
modules here. The launcher for the tool, "jpackage" in the case of JEP 
343, will use the name of the incubator module as the root module to 
resolve.


I updated JEP 11 to allude to incubator modules containing tools, hence 
no exported API but rather a service provider.



During the JDK build, incubator modules must be packaged ...
There's a typo in this paragraph in that the jmod (and jar) tools use 
"--warn-if-resolved=incubating" rather than 
"--warn-if-resolved=incubator" to package an incubator module.


JEP corrected to say `incubating`.

Alex


Re: 8233922: Service binding augments module graph with observable incubator modules

2019-11-18 Thread Alex Buckley

On 11/18/2019 4:34 AM, Alan Bateman wrote:

The summary on this issue is that service binding augments the
module graph with the modules induced by the service-use relation.
... If service binding were to resolve `jdk.incubator.jpackage` then a 
warning would be emitted in all phases that java.base
is resolved (so everywhere).

... The change proposed here means that incubator modules are not 
candidates to add to the module graph during service binding for the
boot layer's configuration. It has no impact on the general case 
where incubating modules may be observable when creating the 
configuration for custom layers.

An incubator module's service providers will now be unavailable by 
default even if a module on the module path says `uses`. I believe that 
JEP 11 should say the following:


-
Incubator modules are part of the JDK run-time image produced by the 
standard JDK build. *However, by default, incubator modules are not 
resolved for applications on the class path. Furthermore, by default, 
incubator modules do not participate in service binding for applications 
on the class path or the module path.*


Applications on the class path must use the --add-modules command-line 
option to request that an incubator module be resolved. Applications 
developed as modules can specify `requires` or `requires transitive` 
dependences upon an incubator module directly. *(Some incubator 
modules do not export packages but rather provide implementations of 
services. It is usually not recommended to require a module that exports 
no packages, but it is necessary in this case in order to resolve the 
incubator module and thus have it participate in service binding.)*


During the JDK build, incubator modules must be packaged ...
-
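
For illustration, a tool frontend shipped as a module would then write 
something like this (the module name here is hypothetical):

    module com.example.toolfrontend {
        // jdk.incubator.jpackage exports no packages; requiring it resolves
        // the incubator module so that its service providers are bound.
        requires jdk.incubator.jpackage;
    }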

Alex


Re: Preparing to Repair Gradle's Malformed Module Main-Class Command Line

2019-10-16 Thread Alex Buckley
I was going to ask if setting `jvmArgs` in the `run` task is the right 
way to configure Gradle, but I see it's done here:



https://guides.gradle.org/building-java-9-modules/#modify_the_code_run_code_task_to_use_modules

Setting aside `-m`: How do you specify the traditional `java -jar 
myapp.jar` to the `run` task? If it's by setting `jvmArgs` then 
presumably the same problem occurs of stray arguments being passed to 
the main class?


BTW your module path doesn't need to spell out the JAR file. Just 
`build/libs` is enough for the Java compiler and the Java runtime to 
find `my.module`. A module path option should be orders of magnitude 
shorter than a class path option!
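
That is, something as short as the following is all the run task needs to 
produce (the main class name is illustrative):

    java --module-path build/libs -m my.module/com.example.app.Main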


Alex

On 10/16/2019 3:03 PM, Plugins wrote:

Hi all,


Please review this issue [1]? It reports a defect in the Gradle build
tool's current construction of malformed java command lines that are
intended to execute a Main-Class of a JPMS module.
  
At first I thought the bug reported there was in either the Dropwizard
application framework I was using at the time, or in the CLI parser
utility (argparse4j) Dropwizard uses under the hood.

I am now convinced — by diagnosis done by those two projects'
maintainers — that the command line construction bug is definitely in
Gradle's court.

I'd like to request from the list you all's take on this. Please? I have
been experimenting with a solution that works so far. But any insight or
suggestions anybody here can offer would be much appreciated! TIA.



[1] http://bit.ly/Issue10825 «Refactor
JvmOptions.getAllImmutableJvmArgs() to support executing the main class
in a JPMS module»





Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module

2019-10-16 Thread Alex Buckley
Good to see ... I'm sure you'll let jigsaw-dev know when a new version 
of Gradle is usable as automatic modules.


Alex

On 10/16/2019 7:21 AM, Plugins wrote:

Hi,

{And sorry about my messages not being connected to the thread. I can't
get my webmail to do the right thing}

Github.com have finally unflagged my account now. Their „spam
blocker“ objected to something of mine somewhere — allegedly.

Anyway, the two issues and the one pull request [1] are visible again.



[1] http://bit.ly/PR11039


 Original Message 
Subject: RE: Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module
From: "Plugins" 
Date: Tue, October 15, 2019 10:54 am
To: "Alex Buckley" ,
jigsaw-dev@openjdk.java.net

Yeah. Sorry about that. I don't know what's going on. But only seconds
after I pushed my change up to my forked gradle/gradle repo yesterday,
for some reason github.com flashed an angry red „Your account has been
flagged. Please contact support...“ message on every page I visit on
github. I contacted github.com support several hours ago. But they still
haven't replied.

They've disappeared from Gradle's issue page. They're titled
„Compliance with Groovy 2.5's Standard Mechanism...“ and
„Compliance with the JDK's Standard Mechanism...“. I can only see
them — with the „...flagged...“ message emblazoned across the top
— when I'm logged into github.com.

I was afraid something like this would happen. So shortly after I first
submitted the Groovy one, I saved it on the Wayback Machine [1]. Stuff
like this isn't good for that conspiracy theorist in me ;)




[1] http://bit.ly/11028bak




 Original Message 
Subject: Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module
From: Alex Buckley 
Date: Tue, October 15, 2019 8:50 am
To: jigsaw-dev@openjdk.java.net

On 10/14/2019 9:07 PM, Plugins wrote:

I've reported to Gradle's issue tracking system, both the
java.security.Provider configuration file issue [1] and the
org.codehaus.groovy.runtime.ExtensionModule configuration file issue [2].


Thanks very much. Strangely, AFAICT, the two issues you submitted have
disappeared -- they return 404 errors, and do not appear in the issue
list:

http://bit.ly/Issue11027
-> https://github.com/gradle/gradle/issues/11027

http://bit.ly/Issue11028
-> https://github.com/gradle/gradle/issues/11028

Alex




Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module

2019-10-15 Thread Alex Buckley

On 10/14/2019 9:07 PM, Plugins wrote:
I've reported to Gradle's issue tracking system, both the 
java.security.Provider configuration file issue [1] and the 
org.codehaus.groovy.runtime.ExtensionModule configuration file issue [2].


Thanks very much. Strangely, AFAICT, the two issues you submitted have 
disappeared -- they return 404 errors, and do not appear in the issue list:


http://bit.ly/Issue11027
-> https://github.com/gradle/gradle/issues/11027

http://bit.ly/Issue11028
-> https://github.com/gradle/gradle/issues/11028

Alex


Re: Using ServiceLoader with modules

2019-10-14 Thread Alex Buckley

// cc'ing the list because best practices should be public, and
// because it's easier to refer people to it in five years time

Hi Christian,

On 10/14/2019 12:16 PM, Christian Stein wrote:

  just read your mail and was wondering if this extrapolation is correct:

    A module that does not export any package should never need to
    declare a "requires" as transitive.


Some qualification needed, I think. There are (at least) two kinds of 
"top level" module that commonly have no `exports` clauses:


1. An "application module" intended to be launched (`java -m bizapp`) 
rather than programmed against. I agree that this kind of module should 
not use `requires transitive`.


2. An "aggregator module" intended to present a facade composed from the 
APIs of other modules. For example, the `java.se` module has no 
`exports`, but makes good use of `requires transitive`. (See 
http://hg.openjdk.java.net/jdk/jdk/file/tip/src/java.se/share/classes/module-info.java.)


For example, I should remove all `transitive` modifiers from this module 


module org.junit.platform.console {
   requires transitive org.apiguardian.api;
   requires transitive org.junit.platform.reporting;
   provides java.util.spi.ToolProvider
       with org.junit.platform.console.ConsoleLauncherToolProvider;
}


This "console" module certainly looks like #1 rather than #2.

Alex


Using ServiceLoader with modules

2019-10-14 Thread Alex Buckley
I came across a recent blog entry about using ServiceLoader in a modular 
environment:


  https://blog.frankel.ch/migrating-serviceloader-java-9-module-system/

Unfortunately, it is completely wrong in its advice on "Migrating to the 
Java Platform Module System". I have seen the same bad advice on 
StackOverflow, so let me correct it here, as simply as possible:


1. The module containing the implementation SHOULD NOT export the 
implementation package.


2. The module containing the client MUST say that it uses the service 
interface.


See the "Deploying service providers as modules" section of 
https://docs.oracle.com/en/java/javase/13/docs/api/java.base/java/util/ServiceLoader.html


The whole point of integrating ServiceLoader with the module system was to 
get the configuration documented explicitly, via `uses` and `provides` 
clauses. Benefits: the implementation is strongly encapsulated (no 
`exports`), and the client's service lookup is speedy because its 
dependency (`uses`) was found cheaply during startup. Eventually, as the 
blog rightly says, "only the configuration changes: the Service Loader 
code itself in the client doesn’t change."
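
In module declarations, the corrected setup looks roughly like this (module 
and type names are invented for illustration):

    module com.example.codec.api {
        exports com.example.codec;              // the service interface
    }

    module com.example.codec.fast {
        requires com.example.codec.api;
        provides com.example.codec.Codec        // implementation package NOT exported
            with com.example.codec.fast.FastCodec;
    }

    module com.example.client {
        requires com.example.codec.api;
        uses com.example.codec.Codec;           // needed for ServiceLoader.load(Codec.class)
    }

The client's lookup code stays exactly as it was on the class path:
ServiceLoader.load(Codec.class).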


Alex


Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module

2019-10-14 Thread Alex Buckley
Replying to the mailing list so that our new friend "lingocoder" can 
take action on the bug (which Alan and Jochen have helpfully diagnosed 
elsewhere in this thread).


Alex

On 10/11/2019 10:05 PM, Cédric Champeau wrote:
Looks like the extension file is located in the old path which Groovy 
used. Would you mind creating an issue for this?


On Sat, Oct 12, 2019 at 01:52, Alex Buckley wrote:


On 10/11/2019 3:41 PM, Plugins wrote:
 > That technically-inappropriate
 > META-INF/services/org.codehaus.groovy.runtime.ExtensionModule entry in
 > Gradle's gradle-api-{version}.jar file is used by Gradle to extend the
 > Groovy language; which Gradle relies on. Apparently, that service entry
 > extends Groovy with Java methods [2].

As I write this email, the most recent comment at [2] is:

-
alan.bateman2 • 2 years ago

META-INF/services is specified in the JAR file spec as the location for
services configuration files. It's not appropriate to put properties
file in this location. Can the Groovy extension mechanism use a
different location?
-

 > The contents of that file...
 >
 >     moduleName=model-core
 >     moduleVersion=1.0
 >     extensionClasses=org.gradle.api.internal.provider.MapPropertyExtensions
 >
 > ...breaks Jigsaw when gradle-api-{version}.jar is in the module path
 > (for Gradle plugin development, say).

Then gradle-api-{version}.jar can't be used as an automatic module.

Since Cedric Champeau was writing to jdk-dev earlier today, I have
taken
the liberty of cc'ing him in the hope that he can share Gradle's plans
for fixing this.

Alex



Re: Unable to derive module descriptor for gradle-api-5.6.2.jar — Provider class moduleName=model-core not in module

2019-10-11 Thread Alex Buckley

On 10/11/2019 3:41 PM, Plugins wrote:

That technically-inappropriate
META-INF/services/org.codehaus.groovy.runtime.ExtensionModule entry in
Gradle's gradle-api-{version}.jar file is used by Gradle to extend the
Groovy language; which Gradle relies on. Apparently, that service entry
extends Groovy with Java methods [2].


As I write this email, the most recent comment at [2] is:

-
alan.bateman2 • 2 years ago

META-INF/services is specified in the JAR file spec as the location for 
services configuration files. It's not appropriate to put properties 
file in this location. Can the Groovy extension mechanism use a 
different location?

-


The contents of that file...

moduleName=model-core
moduleVersion=1.0
   
extensionClasses=org.gradle.api.internal.provider.MapPropertyExtensions


...breaks Jigsaw when gradle-api-{version}.jar is in the module path
(for Gradle plugin development, say).


Then gradle-api-{version}.jar can't be used as an automatic module.

Since Cedric Champeau was writing to jdk-dev earlier today, I have taken 
the liberty of cc'ing him in the hope that he can share Gradle's plans 
for fixing this.


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-05-13 Thread Alex Buckley

On 5/13/2019 6:02 AM, Jan Lahoda wrote:

Could I please get a review on the CSR?

https://bugs.openjdk.java.net/browse/JDK-8222396


Added myself as a reviewer.

Alex


And also on the patch:
http://cr.openjdk.java.net/~jlahoda/8220702/webrev.01/

Thanks!

Jan

On 06. 05. 19 21:06, Alex Buckley wrote:

On 4/12/2019 12:03 PM, Alex Buckley wrote:

On 4/12/2019 5:34 AM, Jan Lahoda wrote:

I've started with the CSR here:
https://bugs.openjdk.java.net/browse/JDK-8222396


Looks pretty good. I made some edits to record both of your
single-module and multi-module invocations of javac.

The use case of injecting test code is clear, but the exact connection
between automatic modules and test code is pretty opaque. Is the goal to
make the automatic test module read the explicit test module so that the
former module's code can access the latter module's code? Is the goal to
make the automatic module read (and therefore test) at least the same
set of modules as the explicit modules `requires`?


Reviewing the CSR again, it seemed like the key scenario is multiple
named modules, where for each named module:

1. We don't really care about its relationship with the other named
modules; but

2. We do care about injecting it with test code, and letting that test
code read other, completely arbitrary, modules (say, an
assertion-building library that's been placed on the module path).

I have refactored the CSR to more strongly separate the problem
(patching an automatic module is possible, but readability is sub-par)
from the solution (precedent for ALL-MODULE-PATH from the unnamed
module scenario).

JEP 261 should be updated to explain the awesome power of
--patch-module at compile time, and that is progressing behind the
scenes, but I don't think it needs to block JDK-8220702 -- the CSR is
"good enough" documentation for now.

Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-05-06 Thread Alex Buckley

On 4/12/2019 12:03 PM, Alex Buckley wrote:

On 4/12/2019 5:34 AM, Jan Lahoda wrote:

I've started with the CSR here:
https://bugs.openjdk.java.net/browse/JDK-8222396


Looks pretty good. I made some edits to record both of your
single-module and multi-module invocations of javac.

The use case of injecting test code is clear, but the exact connection
between automatic modules and test code is pretty opaque. Is the goal to
make the automatic test module read the explicit test module so that the
former module's code can access the latter module's code? Is the goal to
make the automatic module read (and therefore test) at least the same
set of modules as the explicit modules `requires`?


Reviewing the CSR again, it seemed like the key scenario is multiple 
named modules, where for each named module:


1. We don't really care about its relationship with the other named 
modules; but


2. We do care about injecting it with test code, and letting that test 
code read other, completely arbitrary, modules (say, an 
assertion-building library that's been placed on the module path).


I have refactored the CSR to more strongly separate the problem 
(patching an automatic module is possible, but readability is sub-par) 
from the solution (precedent for ALL-MODULE-PATH from the unnamed module 
scenario).


JEP 261 should be updated to explain the awesome power of --patch-module 
at compile time, and that is progressing behind the scenes, but I don't 
think it needs to block JDK-8220702 -- the CSR is "good enough" 
documentation for now.


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-04-12 Thread Alex Buckley

On 4/12/2019 5:34 AM, Jan Lahoda wrote:

I've started with the CSR here:
https://bugs.openjdk.java.net/browse/JDK-8222396


Looks pretty good. I made some edits to record both of your 
single-module and multi-module invocations of javac.


The use case of injecting test code is clear, but the exact connection 
between automatic modules and test code is pretty opaque. Is the goal to 
make the automatic test module read the explicit test module so that the 
former module's code can access the latter module's code? Is the goal to 
make the automatic module read (and therefore test) at least the same 
set of modules as the explicit modules `requires`?


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-04-11 Thread Alex Buckley

On 4/11/2019 12:00 PM, Christian Stein wrote:

I'm not sure if this helps in any regards -- here [1] is a draft PR
that adds explicit module descriptors to all JUnit 5 modules.

This PR uses javac in multi-module mode and --patch-module
to access the already (with --release 8) compiled classes of
each module.


Thanks Christian. Ultimately you're invoking javac like so:

javac --patch-module org.junit.platform.commons=junit-platform-commons/build/classes/java/main \
      --module-version 1.5.0-SNAPSHOT \
      src/modules/org.junit.platform.commons/module-info.java

where it is interesting to see --patch-module indicating the putative 
content of a module before the module-info.java file which declares the 
module into existence has been compiled!


You even get the benefit of error-checking for module-info.java, e.g., 
that its `exports` refer to packages which truly exist in the module. I 
suppose if you just put junit-platform-commons/build/classes/java/main 
on the --module-path then you would not get this checking.


The first point for using --patch-module at compile time goes to Jan for 
building tests as if members of automatic modules; you get the second 
point for building module declarations as if members of ... well, 
themselves.


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-04-11 Thread Alex Buckley

On 4/11/2019 1:19 AM, Jan Lahoda wrote:

Yes, I think JEP 261 may need updates. I'd say this is somewhat
unforeseen interaction between automatic modules and --patch-module.

When patching a (named) module (or a set of named modules), that module
(or modules) become the root module for javac, and the ordinary module
graph building algorithm is then used to build the module graph. So the
set of modules in the module graph may be different from the set of the
modules in the module graph when compiling source in the unnamed module.


OK. It sounds like (i) single-module mode supports source files being 
compiled as if members of a named module (the module being patched), and 
(ii) multi-module mode supports source files being compiled as if 
members of named modules (the modules being patched). Great.


Broadly, JEP 261 is correct when it says this in "Root modules":

-
Otherwise, the default set of root modules depends upon the phase:
- At compile time it is usually the set of modules being compiled (more 
on this below);

-

where "below", it is correct for multi-module mode:

-
The set of root modules is the set of modules for which at least one 
source file is specified.

-

but incorrect for single-module mode: (assuming javac is invoked as 
shown in your first mail)


-
Otherwise source files will be compiled as members of the unnamed 
module, and the root modules will be computed as described above.

-

JEP 261 is the ultimate source of truth for the module system, from 
which all tutorials, blogs, books, etc derive. It's very hard to infer 
from it that javac supports compiling source as if in a patched module. 
That's why a CSR to record what javac does (either with no --add-modules 
or with --add-modules=ALL-MODULE-PATH) will be so valuable. In advance 
of that, can you share some detailed invocations of javac using 
--patch-module in single-module and multi-module mode?


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-04-10 Thread Alex Buckley

On 4/10/2019 2:51 PM, Jonathan Gibbons wrote:

On 4/10/19 11:51 AM, Alex Buckley wrote:

There is a question to be answered: When the compiler compiles code in
an automatic module (due to the code being observed in a directory
that is specified to --patch-module), then what is the _default set of
root modules_ for the automatic module?

I expect the answer is: the same as the default set of root modules
for the unnamed module.


I would not expect the fact that you're compiling code in one or more
automatic modules to have any effect on the module graph. The fact that
the definitions of some types have been "patched in" using source code
via --patch-module is (or should be) irrelevant.


I'm not sure how anyone compiles code as if in an automatic module, 
since the automatic module is solely a JAR file, but Jan suggests that 
it's possible to patch an automatic module's content with the source 
files being compiled. So, I looked in JEP 261 to see how javac would 
treat those source files:


"If a module descriptor in the form of a module-info.java or 
module-info.class file is specified on the command line [NO], or is 
found on the source path [NO] or the class path [NO], then source files 
will be compiled as members of the module named by that descriptor and 
that module will be the sole root module. Otherwise if the --module <name> 
option is present [NO] then source files will be compiled as 
members of <name>, which will be the root module. Otherwise [I GUESS 
WE'RE HERE] source files will be compiled as members of the unnamed 
module, and the root modules will be computed as described above."


Alex


Re: RFR: JDK-8220702: compiling in the context of an automatic module disallows --add-modules ALL-MODULE-PATH

2019-04-10 Thread Alex Buckley

// Adding jigsaw-dev -- this topic is slightly deeper than it appears.

On 4/9/2019 4:43 AM, Jan Lahoda wrote:

Currently, when compiling an unnamed module, --add-modules
ALL-MODULE-PATH adds all modules from the module path to the module
graph, so that the unnamed module can refer to them.


(See also http://openjdk.java.net/jeps/261#Root-modules plus the section 
"Root modules" in the java.lang.module package spec.)



When compiling a named module, this option is not allowed, as it would
be confusing, as it cannot modify the dependencies of the named module.

But, when compiling code in the context of an automatic module using
--patch-module, e.g.:
javac --module-path <path> --patch-module <module>=<directory> ...

The --add-modules ALL-MODULE-PATH is still disallowed, but it may make
sense to use it, as it may affect the set of classes available for the
sources that are being compiled. Compiling in the context of an
automatic module may be useful esp. when patching multiple modules at
once (which is supported by --patch-module), e.g. to build tests. (It seems
that at least the use of javadoc in Maven may lead to a similar
setting as well; see the JBS bug.)


JEP 261 doesn't discuss compiling code "in the context of an automatic 
module", and javac's reference guide doesn't imply that `--patch-module` 
supports it, but it is clearly a plausible and useful thing to do, so 
thanks for calling it out.


There is a question to be answered: When the compiler compiles code in 
an automatic module (due to the code being observed in a directory that 
is specified to --patch-module), then what is the _default set of root 
modules_ for the automatic module?


I expect the answer is: the same as the default set of root modules for 
the unnamed module. And actually, for the `javac` invocation shown 
above, which is single-module mode, this answer is given in the "Compile 
time" section of JEP 261 
(http://openjdk.java.net/jeps/261#Compile-time). Unfortunately, the 
answer is only accidentally right, because the section did not envisage 
compiling code in an automatic module; for the `javac` invocation above, 
the section says "source files will be compiled as members of the 
unnamed module".


So, there's another question to be answered: Does `javac` support 
compiling code in an automatic module in single-module mode, or 
multi-module both, or either? Once we know this, we can update the 
"Compile time" section and answer the set-of-root-modules question.



The proposal is to allow the --add-modules ALL-MODULE-PATH option when
compiling code in the context of an automatic module.


I support this proposal, subject to the questions raised above.

A CSR will be needed; in addition to specifying the things mentioned 
above, it can clarify the description of ALL-MODULE-PATH w.r.t. when it 
can be used at compile time. Currently, the description seems unsure 
about when it can be used.


(For reference, the text: "As a final special case, at both run time and 
link time, if <module> is ALL-MODULE-PATH then all observable modules 
found on the relevant module paths are added to the root set. 
ALL-MODULE-PATH is valid at both compile time and run time. This is 
provided for use by build tools such as Maven, which already ensure that 
all modules on the module path are needed. It is also a convenient means 
to add automatic modules to the root set.")


Alex


Proposed patch: http://cr.openjdk.java.net/~jlahoda/8220702/webrev.01/
JBS: https://bugs.openjdk.java.net/browse/JDK-8220702

What do you think?

Thanks,
 Jan


Re: More flexibility in providing services from modules?

2019-03-04 Thread Alex Buckley

Hi Tom,

Nice to see thoughts about modular services.

FYI Jigsaw's goals in this area were quite limited 
(http://openjdk.java.net/projects/jigsaw/spec/reqs/#services) and the 
static provider method was a relatively late add-on 
(http://openjdk.java.net/projects/jigsaw/spec/issues/#services).


I agree that a static provider method offers a basic level of 
indirection (e.g. multiple `provides` directives can employ the same one 
if its return type implements multiple services) but it doesn't add much 
in the way of abstraction (e.g. over the desired service). This isn't 
the only place where ServiceLoader suffers from a lack of abstraction: 
you can't parameterize services because there's no way to request an 
implementation of a parameterized service (i.e. you can't say 
`ServiceLoader.load(ServiceA<String>.class)`).


Frankly, we are not likely to revisit any of these limitations in the 
near future, but I think that ServiceLoader will always be a handy 
built-in option for limited use cases.
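
(For reference, a minimal sketch of the static provider method pattern -- 
every name here is invented for illustration:

  // In the API module, exported:
  public interface ServiceA { String name(); }

  // In the provider module, NOT exported; the class itself need not
  // implement ServiceA, it only has to declare the factory method:
  public class Providers {
      public static ServiceA provider() { return () -> "acme"; }
  }

  // module-info.java of the provider module:
  module com.example.provider {
      requires com.example.api;
      provides com.example.api.ServiceA with com.example.provider.Providers;
  }
)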


Alex

On 3/2/2019 9:19 AM, Tom De Wolf wrote:

The current ServiceLoader mechanism and Java Modules have a limited set of
options to provide an instance for a service from a module. Either you use
the constructor, or you use a static 'provider' method.

When interacting with frameworks like Spring this does not give a lot of
flexibility. I described the problem statement and some ideas which could
be considered for evolving the services support in the module system in
this blog post:

https://devcreativity.wordpress.com/2019/03/02/java-modules-wish-list-provide-services-using-lambda-functions-and-more/


I would appreciate it if some of the Oracle contributors who worked on the
module system could read the post and give me and the community some
feedback on whether such a change could be a good idea.

Thanks in advance

Tom



Re: Annotation processors and the --processor-module-path

2019-01-30 Thread Alex Buckley

On 1/30/2019 9:40 AM, cowwoc wrote:

Seeing as `--add-modules=java.xml.bind` no longer works in Java 11, what is
the updated recommendation?


The Java SE Platform no longer includes JAXB. JEP 320 discusses where to 
find standalone versions of JAXB (http://openjdk.java.net/jeps/320).


StackOverflow also has a question on this topic, with many suggestions 
and more than 250,000 views 
(https://stackoverflow.com/questions/43574426/how-to-resolve-java-lang-noclassdeffounderror-javax-xml-bind-jaxbexception-in-j).


Alex


Re: Way to bypass "uniquely visible" check for named vs. unnamed module conflict?

2018-12-20 Thread Alex Buckley

Thank you Stephan. We're looking into javac now.

(BTW I enjoyed your comment on SO that "Adding module-info.java is not 
part of the problem, but part of the solution (thereby moving your code 
from classpath to modulepath)".)


Alex

On 12/20/2018 5:21 AM, Stephan Herrmann wrote:

Sorry, if that part was left implicit.
Yes, all versions of javac that I tried accept all four invocations.
This includes the following builds:
- 9.0.4+11
- 10.0.1+10
- 11.0.1+13-LTS
- 12-ea+19

Stephan


On 20.12.18 00:05, Alex Buckley wrote:

I'm seeing multiple claims in the SO item and the Eclipse bug that
javac does the wrong thing, but no-one says what it is. Are you saying
that javac (which version?) compiles Test.java in the second and third
invocations?

Alex

On 12/18/2018 3:21 PM, Stephan Herrmann wrote:

Thanks for confirming!

Stephan

On 18.12.18 21:38, Alex Buckley wrote:

On 12/18/2018 11:04 AM, Stephan Herrmann wrote:

This has been discussed on StackOverflow in threads like
https://stackoverflow.com/q/51094274/4611488

The question can be simplified like this:
//---
import javax.xml.transform.Transformer;

public class Test {
 Transformer transformer;
}
//---

Which of the following compiler invocations should accept / reject the
program:

$ javac Test.java
$ javac -classpath xml-apis-1.4.01.jar Test.java
$ javac -classpath xml-apis-1.4.01.jar
--limit-modules=java.base,java.xml Test.java
$ javac -classpath xml-apis-1.4.01.jar --limit-modules=java.base
Test.java

From my understanding, only the first and the last invocations
should succeed. In both other cases the code in Test.java (associated
to the unnamed module) can access package javax.xml.transform both
from module java.xml and from the unnamed module containing
xml-apis-1.4.01.jar. (I say "the unnamed module" but I don't seen any
impact if an implementation would choose to support several unnamed
modules).


I agree that only the first and last invocations should succeed.

As you described in an Eclipse bug
(https://bugs.eclipse.org/bugs/show_bug.cgi?id=536928), the qualified
type name `javax.xml.transform.Transformer` mentions a package
`javax.xml` that is NOT uniquely visible. In particular, and assuming
a single unnamed module: when compiling an ordinary compilation unit
(Test.java) associated with the unnamed module (the classpath), there
IS an ordinary compilation unit (some file in xml-apis-1.4.01.jar)
associated with the unnamed module which contains a declaration of the
package; but the unnamed module also reads the java.xml module that
exports the package to the unnamed module.

tl;dr A package split between the classpath and the system image is
just as bad as a package split between two modules on the modulepath.

Alex






Re: Way to bypass "uniquely visible" check for named vs. unnamed module conflict?

2018-12-19 Thread Alex Buckley
I'm seeing multiple claims in the SO item and the Eclipse bug that javac 
does the wrong thing, but no-one says what it is. Are you saying that 
javac (which version?) compiles Test.java in the second and third 
invocations?


Alex

On 12/18/2018 3:21 PM, Stephan Herrmann wrote:

Thanks for confirming!

Stephan

On 18.12.18 21:38, Alex Buckley wrote:

On 12/18/2018 11:04 AM, Stephan Herrmann wrote:

This has been discussed on StackOverflow in threads like
https://stackoverflow.com/q/51094274/4611488

The question can be simplified like this:
//---
import javax.xml.transform.Transformer;

public class Test {
 Transformer transformer;
}
//---

Which of the following compiler invocations should accept / reject the
program:

$ javac Test.java
$ javac -classpath xml-apis-1.4.01.jar Test.java
$ javac -classpath xml-apis-1.4.01.jar
--limit-modules=java.base,java.xml Test.java
$ javac -classpath xml-apis-1.4.01.jar --limit-modules=java.base
Test.java

From my understanding, only the first and the last invocations
should succeed. In both other cases the code in Test.java (associated
to the unnamed module) can access package javax.xml.transform both
from module java.xml and from the unnamed module containing
xml-apis-1.4.01.jar. (I say "the unnamed module" but I don't seen any
impact if an implementation would choose to support several unnamed
modules).


I agree that only the first and last invocations should succeed.

As you described in an Eclipse bug
(https://bugs.eclipse.org/bugs/show_bug.cgi?id=536928), the qualified
type name `javax.xml.transform.Transformer` mentions a package
`javax.xml` that is NOT uniquely visible. In particular, and assuming
a single unnamed module: when compiling an ordinary compilation unit
(Test.java) associated with the unnamed module (the classpath), there
IS an ordinary compilation unit (some file in xml-apis-1.4.01.jar)
associated with the unnamed module which contains a declaration of the
package; but the unnamed module also reads the java.xml module that
exports the package to the unnamed module.

tl;dr A package split between the classpath and the system image is
just as bad as a package split between two modules on the modulepath.

Alex




Re: Way to bypass "uniquely visible" check for named vs. unnamed module conflict?

2018-12-18 Thread Alex Buckley

On 12/18/2018 11:04 AM, Stephan Herrmann wrote:

This has been discussed on StackOverflow in threads like
https://stackoverflow.com/q/51094274/4611488

The question can be simplified like this:
//---
import javax.xml.transform.Transformer;

public class Test {
 Transformer transformer;
}
//---

Which of the following compiler invocations should accept / reject the
program:

$ javac Test.java
$ javac -classpath xml-apis-1.4.01.jar Test.java
$ javac -classpath xml-apis-1.4.01.jar
--limit-modules=java.base,java.xml Test.java
$ javac -classpath xml-apis-1.4.01.jar --limit-modules=java.base Test.java

From my understanding, only the first and the last invocations
should succeed. In both other cases the code in Test.java (associated
to the unnamed module) can access package javax.xml.transform both
from module java.xml and from the unnamed module containing
xml-apis-1.4.01.jar. (I say "the unnamed module" but I don't seen any
impact if an implementation would choose to support several unnamed
modules).


I agree that only the first and last invocations should succeed.

As you described in an Eclipse bug 
(https://bugs.eclipse.org/bugs/show_bug.cgi?id=536928), the qualified 
type name `javax.xml.transform.Transformer` mentions a package 
`javax.xml` that is NOT uniquely visible. In particular, and assuming a 
single unnamed module: when compiling an ordinary compilation unit 
(Test.java) associated with the unnamed module (the classpath), there IS 
an ordinary compilation unit (some file in xml-apis-1.4.01.jar) 
associated with the unnamed module which contains a declaration of the 
package; but the unnamed module also reads the java.xml module that 
exports the package to the unnamed module.


tl;dr A package split between the classpath and the system image is just 
as bad as a package split between two modules on the modulepath.


Alex


Re: Qualified exports to an unknown module

2018-12-07 Thread Alex Buckley

On 12/7/2018 8:46 AM, cowwoc wrote:

Back in 2016 you guys had a discussion about qualified exports to a module
that was not available at compile-time:
http://jigsaw-dev.1059479.n5.nabble.com/Issue-with-qualified-exports-td5712839.html

I could not find the conclusion to that discussion. How are users supposed
to use qualified exports to some module that is defined by some future
compilation? See https://stackoverflow.com/q/53670052/14731 for a concrete
example.


The decision was: Qualified exports/opens to a non-observable module is 
NOT an error (see 
https://docs.oracle.com/javase/specs/jls/se11/html/jls-7.html#jls-7.7.2-310). 
javac will give an on-by-default lint warning instead.


The StackOverflow question says "Because I configured the compiler to 
treat warnings as errors (-Werror), the build fails." so the questioner 
should look at disabling the lint warning (see 
https://docs.oracle.com/en/java/javase/11/tools/javac.html#GUID-AEEC9F07-CB49-4E96-8BC7-BCC2C7F725C9).


Alex


Re: Where do empty compilation units belong?

2018-12-03 Thread Alex Buckley

Thanks to Jon and Jay's testing, we can make the following statement:

Compilers ignore a source file that is physically empty (zero length) or 
logically empty (contains only whitespace and/or comments).


(I confirmed this by tweaking Test.java so that the empty case was not 
`""` but rather `" /* Comment */ "`. javac still accepted/ignored it.)


In other words, compilers do not observe an ordinary compilation unit if 
it has no package, import, or type declarations -- a.k.a. a "vacant" 
ordinary compilation unit.


We know this to be true because if compilers did observe a vacant 
ordinary compilation unit, then the lack of a package declaration would 
cause an error when the empty source file is in a modular location; but 
no such error is given.


Compilers are free to take this not-observable stance, per 7.3: "The 
host system determines which compilation units are observable". It would 
be possible to mandate that the host system MUST NOT observe a vacant 
ordinary compilation unit, but such a mandate would probably have 
unintended consequences. It would also be possible to define a vacant 
ordinary compilation unit out of existence, by tweaking 7.3's grammar as 
proposed in the quoted mail below, but again, beware unintended 
consequences. What the JLS should do is affirm the compilers' decision 
to "accept/ignore" a vacant ordinary compilation unit, by clarifying 
that a vacant ordinary compilation unit is exempt from the "part of an 
unnamed package" rule in 7.4.2. I have filed spec bug JDK-8214743; "An 
ordinary compilation unit that has no package declaration, but has at 
least one other kind of declaration, is part of an unnamed package."


Alex

P.S.  In the course of examining 7.3's grammar, I realized that 
OrdinaryCompilationUnit is not congruent with how 2.1 defines a 
production in a context-free grammar as having "a sequence of one or 
more nonterminal and terminal symbols as its right-hand side."


2.1's definition is intended to apply _after_ interpretation of 2.4's 
grammar notation. For example, the production `A: [B]` is really two 
productions, `A: ` and `A: B`. The first has zero symbols as its RHS, so 
the grammar is not context-free -- parsing of an A is possible at any 
time, based on considerations other than the terminals in hand. 
Similarly, the production `C: {D}` is really an infinite number of 
productions `C: ` and `C: D` and `C: D D` and `C: D D D` etc. 
OrdinaryCompilationUnit is significant for being the only production in 
the JLS to allow zero symbols and thus _not_ be context-free. Compilers 
provide the context when they lex an empty source file and decide not to 
observe an ordinary compilation unit therein.


There's nothing good to be done here. We aren't going to change the 
longstanding OrdinaryCompilationUnit production after all, and I don't 
want to complicate 2.1 by special-casing its zero-symbols RHS.


On 12/3/2018 8:29 AM, Jayaprakash Artanareeswaran wrote:

Thanks for the test file Jon. Last week I and Stephan had a discussion
and agreed with the specified behavior and made some changes to our
compiler.

I can also confirm that both the compilers behave the same way for all
the scenarios included in the test file.

Regards,
Jay

--------
*From:* Jonathan Gibbons 
*Sent:* Monday, November 26, 2018 11:22 PM
*To:* Alex Buckley; Jayaprakash Artanareeswaran;
jigsaw-dev@openjdk.java.net; compiler-dev
*Subject:* Re: Where do empty compilation units belong?



On 11/26/2018 01:44 PM, Alex Buckley wrote:

// Adding compiler-dev since the parsing of files into compilation
units is not a Jigsaw issue.

On 11/20/2018 9:14 PM, Jayaprakash Artanareeswaran wrote:

"jigsaw-dev" 
<mailto:jigsaw-dev-boun...@openjdk.java.net> wrote on 21/11/2018
01:56:42 AM:
 > Jon points out that `OrdinaryCompilationUnit` will match an empty
stream
 > of tokens (I dislike the syntax-driven optionality here, but it's
 > longstanding) so the file D.java could be regarded as a
compilation unit
 > with no package declaration, no import declarations, and no type
 > declarations.
 >
 > Per JLS 7.4.2, such a compilation unit is in an unnamed package, and
 > must be associated with an unnamed module.
 >
 > I would prefer 7.4.2 to say only that a compilation unit with no
package
 > declarations _and at least one type declaration_ is in an unnamed
 > package (and must be associated with an unnamed module; 7.3 should
 > enumerate that possibility). A compilation unit with no package
 > declarations _and no type declarations_ would be deemed
unobservable by
 > 7.3, and all these questions about what to do with empty files would
 > disappear.

That would be perfect and make things unambiguous. But for now, the
paragraph above is good enough for me.

Re: Where do empty compilation units belong?

2018-11-26 Thread Alex Buckley
// Adding compiler-dev since the parsing of files into compilation units 
is not a Jigsaw issue.


On 11/20/2018 9:14 PM, Jayaprakash Artanareeswaran wrote:

"jigsaw-dev"  wrote on 21/11/2018
01:56:42 AM:
 > Jon points out that `OrdinaryCompilationUnit` will match an empty stream
 > of tokens (I dislike the syntax-driven optionality here, but it's
 > longstanding) so the file D.java could be regarded as a compilation unit
 > with no package declaration, no import declarations, and no type
 > declarations.
 >
 > Per JLS 7.4.2, such a compilation unit is in an unnamed package, and
 > must be associated with an unnamed module.
 >
 > I would prefer 7.4.2 to say only that a compilation unit with no package
 > declarations _and at least one type declaration_ is in an unnamed
 > package (and must be associated with an unnamed module; 7.3 should
 > enumerate that possibility). A compilation unit with no package
 > declarations _and no type declarations_ would be deemed unobservable by
 > 7.3, and all these questions about what to do with empty files would
 > disappear.

That would be perfect and make things unambiguous. But for now, the
paragraph above is good enough for me.


Unfortunately, import declarations can have side effects (compile-time 
errors) so to be sure that the "no package or type decl === 
unobservable" rule is suitable for a file containing just an import 
decl, we would have to do a case analysis of how javac and ecj handle 
the eight combinations of the three parts allowed in an ordinary 
compilation unit. That's overkill for the situation involving empty 
files that keeps coming up and that I really want to clarify. I don't 
think anyone loves that an ordinary compilation unit matches the empty 
stream, so let's define away that scenario. As Jon said, an empty file 
doesn't present anything to be checked; there is no compilation unit 
there, so let's be unambiguous about that.


We can rule out the empty stream in 7.3 with grammar or with semantics. 
Usually a semantic description is clearest (gives everyone the proper 
terminology and concepts) but in this case we don't want the description 
to wrestle with "consists of one, two, or three parts" when the grammar 
allows zero. So, a new grammatical description is appropriate, and 
straightforward:


  OrdinaryCompilationUnit:
    PackageDeclaration {ImportDeclaration} {TypeDeclaration}
    ImportDeclaration {ImportDeclaration} {TypeDeclaration}
    TypeDeclaration {TypeDeclaration}

The "three parts, each of which is optional" description is still 
accurate. The package decl part is optional (as long as you have the 
import decls part and/or the type decls part); the import decls part is 
optional (as long as you have either the package decl part or ...) ... 
you get the picture.


I would leave 7.4.2 alone; an ordinary compilation unit with no package 
or type decls but with import decls is part of the unnamed package (and 
thus unnamed module) as before, and compilers can handle that, I think.


Any comments?

Alex


Re: jlink /Jigsaw for and with migrated legacy code

2018-11-26 Thread Alex Buckley

// Redirecting to jigsaw-dev

Hi Hannes,

On 11/15/2018 1:41 PM, Hannes H. wrote:

I am new to this list and I apologize in advance if this is not the right
place to bring this in the discussion.


No problem -- jdk-dev is for discussion of how the JDK is developed, 
though you wouldn't guess that from browsing 
http://mail.openjdk.java.net/mailman/listinfo. For issues and queries 
about how _your_ code may be developed in a modular fashion, jigsaw-dev 
is the right place.



One of the reasons to migrate the code base was the idea to benefit from
project Jigsaw by being able to bundle a runtime with the application. I do
understand that a trimmed runtime will be only possible if the code base
and all external dependencies are as well properly modularized. This is not
the case, however, I still want to bundle a runtime with the distribution
of the application even if it is the whole JDK.

The last few days I read a lot about Jigsaw and jlink but the more I read
the more I got the impression that the only possible way would be to run
jdeps for each and every external dependency, create a module-info.java for
them and then put the module-info.java into the JAR.

That leads to my two main questions:

(1) Is my assumption right that there is no other way to bundle legacy
(=not yet modularized) code with a runtime?


Yes. Note that we never recommend creating module-info.java files for 
someone else's JAR -- neither you nor jdeps can be expected to fully 
understand its dependencies.



(2) If (1) is true, why has this - very common - case not be targeted yet?


A goal of Jigsaw was "reliable dependencies". Things wouldn't be very 
reliable if a runtime built with jlink had "dangling references" to code 
that is not in the runtime and whose JARs you have to remember to put on 
the classpath of the runtime.


That's also why you can't jlink automatic modules into a runtime -- they 
might depend on the right JARs being put on the classpath.


You can jlink a runtime consisting of the JDK modules that are needed by 
your application + its dependencies -- that's usually in the realm of 
what jdeps can figure out -- and then run your application on the new 
runtime exactly as if you were running on a newly-downloaded JDK. As 
your dependencies release modular JARs (i.e., containing 
module-info.class, not just Automatic-Module-Name in the manifest), you 
can jlink them into the runtime too.
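
(A hypothetical sequence for that approach -- jar names are invented, and 
the module list is whatever jdeps reports for your code:

  jdeps --print-module-deps myapp.jar lib/*.jar
  // e.g. prints: java.base,java.desktop,java.sql

  jlink --add-modules java.base,java.desktop,java.sql --output myruntime

  myruntime/bin/java -cp "myapp.jar:lib/*" com.example.Main
)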


Alex


Re: Where do empty compilation units belong?

2018-11-20 Thread Alex Buckley

On 11/20/2018 10:54 AM, Alex Buckley wrote:

From the above, it's hard to understand for which file an error is
reported by Eclipse. In any case, as Jon indicated, if the file D.java
is empty, then there is no stream of tokens matching the JLS 7.3
production `CompilationUnit` and thus there is no compilation unit to
discuss.


Jon points out that `OrdinaryCompilationUnit` will match an empty stream 
of tokens (I dislike the syntax-driven optionality here, but it's 
longstanding) so the file D.java could be regarded as a compilation unit 
with no package declaration, no import declarations, and no type 
declarations.


Per JLS 7.4.2, such a compilation unit is in an unnamed package, and 
must be associated with an unnamed module. (The "must" somewhat 
conflicts with the "may" in 
https://docs.oracle.com/javase/specs/jls/se11/html/jls-7.html#jls-7.3-310)


I would prefer 7.4.2 to say only that a compilation unit with no package 
declarations _and at least one type declaration_ is in an unnamed 
package (and must be associated with an unnamed module; 7.3 should 
enumerate that possibility). A compilation unit with no package 
declarations _and no type declarations_ would be deemed unobservable by 
7.3, and all these questions about what to do with empty files would 
disappear.


What do compiler engineers think?

Alex


Re: Where do empty compilation units belong?

2018-11-20 Thread Alex Buckley

On 11/19/2018 9:27 PM, Jayaprakash Artanareeswaran wrote:

I have the following folder structure in a module

my.module
  -module-info.java
  - p
  -- C.java
  -- q
  --- D.java

where C.java contains only the package declaration and D.java is
empty.

When I run javac with --module-source-path and all the individual
files as arguments the compiler is happy and reports no error. This
behavior is different from Eclipse compiler which reports an error
about "declaring a named package because this compilation unit is
associated to the named module". Is this because empty compilation
units are considered to be part of unnamed modules or because the
D.java and hence package q are omitted?


From the above, it's hard to understand for which file an error is 
reported by Eclipse. In any case, as Jon indicated, if the file D.java 
is empty, then there is no stream of tokens matching the JLS 7.3 
production `CompilationUnit` and thus there is no compilation unit to 
discuss.


In contrast, the file C.java does contain a compilation unit and I would 
expect it to be "associated" (JLS 7.3 again) with the `my.module` module 
like any other compilation unit contained in a file in the same 
directory. It is legal to annotate the package declaration in the 
compilation unit contained in C.java (let's not rathole into a 
discussion about package-info.java), and the types that can be used in 
the annotation are determined by what `my.module` reads.


Alex


Re: Error in JLS for module-info.class

2018-10-19 Thread Alex Buckley

On 10/19/2018 4:09 PM, Luke Hutchison wrote:

In JLS section 4.1 ("The ClassFile structure") for SE 11, it states:


You mean JVMS, not JLS. See the text under Table 4.1-B: "The ACC_MODULE 
flag indicates that this class file defines a module, not a class or 
interface. If the ACC_MODULE flag is set, then special rules apply to 
the class file which are given at the end of this section. If the 
ACC_MODULE flag is not set, then the rules immediately below the current 
paragraph apply to the class file."


The rules at the end of the section mandate that super_class is zero for 
an ACC_MODULE class file.


Alex


Re: Deprecation beyond the level of types

2018-04-24 Thread Alex Buckley

On 4/24/2018 11:55 AM, Stephan Herrmann wrote:

On 03.04.2018 02:49, Alex Buckley wrote:

You're right -- the JLS does not mandate a warning when compiling code
on the classpath that uses code in deprecated modules.


Wouldn't it be more consistent to give such warnings?
Why should writing code in an unnamed vs. named module make a
difference in whether or not I'm seeing deprecation warnings?


Consistent yes, usable no. The possibility of a module being deprecated 
is not understood by vast amounts of code on the classpath written 
before Java SE 9. If deprecating a module implicitly deprecated all its 
exported public types, then code on the classpath would start to see 
warnings "out of thin air" -- nothing that the classpath code refers to 
by name would be explicitly deprecated. We thought such "action at a 
distance" was too great a burden to impose on classpath code. (And we 
are consistent in not imposing "action at a distance" -- applying 
@SuppressWarnings to a module declaration does NOT implicitly suppress 
warnings for all the type declarations in the module.)



Would you consider a compiler that issues such warnings as violating JLS?


A compiler can only issue _deprecation warnings_ (and their brother, 
_removal warnings_) where the JLS mandates it. But a compiler is free to 
issue other kinds of warnings anywhere. For example, javac has 
-Xlint:dep-ann which issues warnings related to, but not exactly about, 
deprecation.


Alex


Re: Deprecation beyond the level of types

2018-04-02 Thread Alex Buckley

On 4/1/2018 4:31 AM, Stephan Herrmann wrote:

Packages: JLS & javadoc agree that @Deprecated applied to a package
has no effect. Given that @Target explicitly includes PACKAGE this
looks inconsistent to me. A user adding @Deprecated to a package would
certainly expect *some* effect, no?


I understand that JSR 175 forgot to restrict the targets of the 
Deprecated annotation type, which meant it was possible to write 
@Deprecated on a package declaration. This should have been disallowed, 
because (as you imply) no-one conceived of a warning or an error when 
the package is imported/used. The targets of Deprecated were specified 
in Java SE 7 as part of the background work for JSR 308; PACKAGE was 
included for source compatibility with package declarations that already 
had @Deprecated.



JLS argues about the same programmer developing the package and
managing its export from the enclosing module. I fail to see how this
justifies the decision to let deprecation of packages go without warning.
If a library developer deprecates an exported package, shouldn't this
signal to consumers of the library that they should migrate away from
using types within that package? Currently it doesn't, because consumers
will see no effect of that deprecation.


The JLS "argument" that you mention is a corner case that concerns a 
deprecated package being exported in a qualified fashion. It's not meant 
to be a general justification "to let deprecation of packages go without 
warning". The reason why exporting a deprecated package doesn't generate 
a warning is because no other use of a deprecated package generates a 
warning. (And note JEP 211, which reduced the population of warnings for 
a deprecated package even further, at `import`.)


Of course, we could change this behavior, and have long discussions 
about whether deprecating a package means "deprecate the package, not 
its types" or "deprecate its types, not the package". But it's not on 
the radar at this time.



Modules: I could find no reference in JLS specifying any relation between
modules and deprecation.


I enumerated "program elements" in the first sentence of JLS 9.6.4.6 
specifically to show that "modules" are as subject to deprecation as 
classes, fields, etc. It's easily concluded from the section that, say, 
requiring a deprecated module (directly or indirectly) should cause a 
warning.
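
(Concretely -- module names invented -- the case meant here is:

  // Library:
  @Deprecated
  module com.example.old {
      exports com.example.old.api;
  }

  // Consumer: javac issues a deprecation warning on this directive.
  module com.example.app {
      requires com.example.old;
  }
)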



In the javadoc of @Deprecated, however, I find
some additions (apparently not provided/reviewed via jigsaw), notably:
  "A module being deprecated does not cause warnings to be issued for
   uses of types within the module."
For a named module, this makes sense, since "requires" will already raise
a warning for the deprecated module, and hence flagging each use of a type
from that module adds little value.
For unnamed modules, however, this seems to completely bypass any module-
level deprecation. There is no "requires" that could be flagged, and use
of types from the deprecated modules isn't flagged either.

Ergo: most of today's consumers of libraries that deprecate either a module
or a package will not be informed about this deprecation by compilers.

...

I see two consequences from the above:
There's little to no incentive to use @Deprecated at any level greater than
types, because it is not reliably communicated to users. We keep thinking
in terms of types, not modules (not even packages).
Second, developers of unnamed modules will see fewer deprecation warnings.
Hence, migrating from unnamed to named has an additional burden of having
to deal with warnings that could have been dealt with before, but simply
weren't visible.


You're right -- the JLS does not mandate a warning when compiling code 
on the classpath that uses code in deprecated modules. But it's 
short-sighted to say that deprecation warnings mandated by 
modularization are a burden; they communicate important information 
which couldn't be communicated by the compiler before.


Alex


Re: Adding module causes classloading issues

2017-11-27 Thread Alex Buckley

On 11/27/2017 5:46 PM, Michael Hall wrote:

Thanks again, for the time.

It is a little difficult for me to track these things. For now I will probably 
just omit java.corba and hope it isn’t really my problem with JMX attach.
I will take a closer look at jconsole to try and figure that out.

I will maybe check back on the newer JTA later.


Thank you, for investigating how your app relates to CORBA, JTA, and 
Attach. I hope http://openjdk.java.net/jeps/8189188 was informative at 
least. Alan will probably be along shortly with some points I missed.


Alex


Re: Adding module causes classloading issues

2017-11-27 Thread Alex Buckley

On 11/27/2017 5:22 PM, Michael Hall wrote:

On Nov 27, 2017, at 7:15 PM, Alex Buckley wrote:

--add-modules java.transaction


Tried to simplify.

java -cp . --patch-module java.transaction=jta.jar --add-modules
java.transaction ModuleForClass javax.transaction.UserTransaction
Error occurred during initialization of boot layer
java.lang.LayerInstantiationException: Package javax.transaction.xa in
both module java.transaction and module java.sql


Oh yes, jta.jar includes the XA package, so force-patching that package 
into java.transaction will conflict with the same package in java.sql.


If your app doesn't use JDBC, then you could prevent java.sql from being 
resolved by passing the JDK modules that you DO want to be resolved to 
the --limit-modules option. Being precise about your app's use of JDK 
modules is a down payment on writing its module declaration.
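
(A hypothetical shape for that, reusing the earlier command -- the module 
list depends on what your app actually needs:

  java --limit-modules java.corba,java.desktop \
       --patch-module java.transaction=jta.jar \
       --add-modules java.transaction \
       -cp . ModuleForClass javax.transaction.UserTransaction

With java.sql left out of the limited universe, the javax.transaction.xa 
package patched into java.transaction no longer clashes with it.)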


But most likely, you need the new, XA-less JTA jar which is coming soon 
from the JSR 907 Maintenance Lead.


Alex


Re: Adding module causes classloading issues

2017-11-27 Thread Alex Buckley

On 11/27/2017 5:03 PM, Michael Hall wrote:

The application usually runs as an OS X application. So I don’t use
-jar. I do have a shell script launch for testing…

#!/bin/bash

APP_ROOT=HalfPipe7.app
JAVA=${APP_ROOT}/Contents/Java
export R_HOME=/Library/Frameworks/R.framework/Resources
java --patch-module java.transaction=${JAVA}/jta.jar -XX:+CITime -Xms32M
-Xmx256M -Xdock:name=HalfPipe
-Dcom.apple.mrj.application.apple.menu.about.name=HalfPipe
-Dapple.laf.useScreenMenuBar=true
-Djava.library.path=$APP_ROOT/Contents/MacOS -Djava.security.manager
-Djava.security.policy=$APP_ROOT/Contents/JavaApp/all.policy
-Dapp.lib=$APP_ROOT/Contents/JavaApp -Dconsole=pane -cp
.:..:hp_jshell.jar:${JAVA}/halfpipe.jar:${JAVA}/log4j-1.2.16.jar:${JAVA}/quartz-2.2.2.jar:${JAVA}/quartz-jobs-2.2.2.jar:${JAVA}/httpcore-4.1.jar:${JAVA}/httpclient-4.1.jar:${JAVA}/commons-logging-1.1.1.jar:${JAVA}/slf4j-api-1.7.7.jar:${JAVA}/slf4j-log4j12-1.7.7.jar:${JAVA}/antlr-2.7.7.jar:${JAVA}/AppleScriptEngine.jar:${JAVA}/Classes:${JAVA}/groovy-all-2.4.5.jar:${JAVA}/JRI.jar:${JAVA}/JRIEngine.jar:${JAVA}/JRS.jar:${JAVA}/REngine.jar:${JAVA}/RserveEngine.jar:${JAVA}/jta.jar:${JAVA}/macnio2.jar:${JAVA}/stringtemplate-3.2.1.jar:${JAVA}/weka.jar:${JAVA}/js.jar
us.hall.hp.common.LoaderLaunchStub

Note --patch-module at the start; if I am doing that correctly, it gets…
WARNING: Unknown module: java.transaction specified to --patch-module

This is the installed jvm, not the embedded application one (I could
point it at that if useful), but it should have all modules included?


You said that you jlinked an image to include java.corba, which means 
the image will contain java.transaction as well. I don't know what the 
quartz scheduler that you mentioned is doing when you run it on your 
custom image, but evidently java.corba was being resolved at run time 
because it triggered resolution of java.transaction with its miniature 
javax.transaction package.


If you're running a classpath application on the JDK out of the box, 
then java.corba will not even be resolved, and nor will 
java.transaction. Hence the warning that patching it is fishy. You can 
use --add-modules java.transaction to force its resolution.


Please note that java.corba and java.transaction are both deprecated FOR 
REMOVAL. There will soon be a modular version of JTA which you can 
deploy on the upgrade module path rather than via patching. See the very 
end of http://openjdk.java.net/jeps/8189188.


Alex


Re: Adding module causes classloading issues

2017-11-27 Thread Alex Buckley

On 11/27/2017 3:16 PM, Michael Hall wrote:

JMX attach keeps telling me that RMI is not an accepted protocol.
I wondered if possibly this was a modular issue, so I checked my main app jar…
jdeps halfpipe.jar
...
halfpipe.jar -> java.corba
…

which seems to say I need java.corba which I didn’t have.
If I add that with jlink I get…

java.lang.NoClassDefFoundError: javax/transaction/UserTransaction

from the quartz scheduler. This class should come from the included jta.jar, 
and usually does without the java.corba module.


java --list-modules
// Will show java.corba in your jlinked image

java --describe-module java.corba
// Will show 'requires java.transaction'
// Since this is an implementation dependency, it's not listed in 
https://docs.oracle.com/javase/9/docs/api/java.corba-summary.html


java --show-module-resolution -jar halfpipe.jar
// Will show java.transaction being resolved in support of java.corba

Because java.transaction is resolved, the miniature javax.transaction 
package that it exports will "win", and the full-strength 
javax.transaction package in jta.jar on the classpath will "lose".


The story of java.transaction is unfortunate and complicated (see 
http://openjdk.java.net/jeps/8189188) but you can augment it with the 
stuff in jta.jar:


java --patch-module java.transaction=jta.jar -jar halfpipe.jar
// See http://openjdk.java.net/jeps/261#Patching-module-content

Alex


Re: JDK-8191112: javac OutOfMemoryError caused by "-Xlint:exports" option

2017-11-14 Thread Alex Buckley

On 11/14/2017 11:16 AM, Jan Lahoda wrote:

With the proposed patch, the warning would be printed for all "requires
transitive ext1", "requires ext1" and "requires dep", but not for
"requires transitive dep", as in the last case depending on "mod" will
make exported types from dep available to the client ...


> If "ext1" would be an explicit module, which would

have "requires transitive dep", then the warning would not be printed
for "requires transitive ext1".


> If ext1 would contain only "requires

dep", then the warning would be printed.


All sounds good!

Alex


Re: JDK-8191112: javac OutOfMemoryError caused by "-Xlint:exports" option

2017-11-14 Thread Alex Buckley

On 11/14/2017 5:17 AM, Jan Lahoda wrote:

---
module mod {
requires transitive ext1;
exports api;
}
---

"dep.Dep" is accessible using requires transitive edges of the automatic
modules, hence the warning/lint is not printed.

That does not seem quite right; an explicit "requires transitive dep"
dependency would be better, otherwise changes in the module path can
lead to surprising behavior.

The suggested patch is to simply ignore the dependencies of automatic
modules in the check - that should avoid the infinite loop and print the
warning in the "requires transitive ext1" case.


Good idea to make -Xlint:exports be more sophisticated w.r.t. automatic 
modules ... but for the very last point, are you saying that a warning 
is given for "requires transitive ext1" because of "transitive" or 
because of "ext1"? That is, would a warning be printed for just 
"requires ext1" ? I think it should be; the "transitive" in "requires 
transitive ext1" is for the benefit of consumers of mod, which is not 
germane to mod's barely-visible consumption of dep via ext1's implicit 
"requires transitive dep1".


Alex


Re: Fwd: Module naming for logging implementations

2017-10-26 Thread Alex Buckley

On 10/26/2017 1:03 PM, Stephen Colebourne wrote:

(previously posted on core-libs-dev, moved by request)


(Thanks!)


Option 1:
All modules that implement a particular logging API must have the same
module name
eg. every module that implements "org.slf4j" (the API) must be named
"org.slf4j.impl"

Option 2:
The module name of the implementation can be whatever name makes sense.


For most service providers, option 2 is obvious; however, for logging
it is generally the case that only one implementation should be
present. If all the jar files that implement a specific logging API
had the same module name (option 1) then the module system could
ensure that only one was loaded. This is a particular concern as it is
not uncommon for a jar file in Maven Central to depend on a specific
implementation of logging when it should only be depending on the API.


I'm leaning towards option 2, as it is simpler and does not require
all implementations to have the same module name (which would be
difficult to require). Any other considerations I'm missing?


Option 1 opens the door to multiple modules with the same name being 
placed in different directories of the modulepath. Not a good place to 
be, even if no-one is targeting them via 'requires'.


(I think ServiceLoader does not care about duplicate module names when 
scanning modules on the modulepath, and will inspect their 'provides' 
directives regardless. However, I confess that I cannot figure out from 
the ServiceLoader spec which modules are observable during binding.)


Stepping back, there are two big anti-patterns in the world of 
ServiceLoader. First, it is an anti-pattern for a provider module to 
'exports' its provider implementation. Second, it is an anti-pattern for 
a consumer module to 'requires' a particular provider module. Option 2 
fights the second anti-pattern by making provider modules not "stand 
out" more than other modules. This in turn fights the first 
anti-pattern, because a provider module that is not expecting to be 
mentioned in 'requires' will not hurry to 'exports' anything. (Yes, this 
is all a bit soft, but programming with services is primarily about 
mindset.)
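
(In module-declaration terms -- all names invented -- a provider that 
follows both points looks like:

  module com.example.fastlogger {
      requires com.example.logging.api;
      // no 'exports' of the implementation package...
      provides com.example.logging.api.Logger
          with com.example.fastlogger.FastLogger;
  }

  // ...and consumers require only the API module; whichever module calls
  // ServiceLoader declares 'uses com.example.logging.api.Logger'.
)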


Alex


Re: module-info file location

2017-09-22 Thread Alex Buckley
See "Modular multi-release JAR files" in http://openjdk.java.net/jeps/238

- khmarba...@gmx.de wrote:

> From: khmarba...@gmx.de
> To: jigsaw-dev@openjdk.java.net
> Sent: Friday, September 22, 2017 12:43:38 AM GMT -08:00 US/Canada Pacific
> Subject: module-info file location
>
> Hi,
> 
> I have seen an example of a multi-version jar file which contained the
> module-info.class at the following location:
> 
> META-INF/versions/9/module-info.class
> 
> based on the information I have I would have assumed that this
> is only allowed having the module-info.class file at the root of the
> jar.
> 
> So the question is: Is it allowed to have a module-info.class file in
> that location, or is it only allowed to have a single
> module-info.class file at the root level of a jar file?
> 
> 
> 
> Kind regards
> Karl Heinz Marbaise


Re: Become an early Java 9 expert: AJUG + vJUG + JUGs Worldwide Hackday Feedback on JDK 9 EA

2017-09-19 Thread Alex Buckley

Hi Mani,

On 9/19/2017 2:55 PM, Mani Sarkar wrote:

Last month (19th August 2017) AJUG and a number of JUGs worldwide, with the
help and support of vJUG, re-ran the "Become an early Java 9 expert"
hackday.


Thank you AJUG and vJUG!


You can find the feedback gathered in http://bit.ly/J9HackDay-AJUG-feedback;
we have been trialling JDK 9 EA b181 (RC1).


A lot of feedback seems to boil down to "JDK command line tools are not 
so easy to use; I want my IDE!". I don't mean to make light of people's 
usability issues, but the module-related paths and flags in JDK 9 tools 
tend to operate along similar lines as the paths and flags in JDK 8 
tools -- it's just that a lot of people haven't set ANY paths and flags 
for a long time.


I see there were some more open-ended questions and this one in 
particular caught my attention:


-
Do I need to convert a legacy Java program to use named modules in order 
to take advantage of the smaller images that jlink can create?


Mani: You will have to convert your applications to use Java 9’s module
system in order to take advantage of JLink fully, although please play 
around with older legacy apps to see what JLink produces (most likely 
the whole JDK and not modularised pieces).


Simon: create an empty module with module-info.java and handcraft the 
dependencies using requires and have jlink compile it. This is 
experimental, would need to be tested to see how it works.

-

The direct answer to the question is "No, you do not need to convert a 
legacy Java program to use named modules in order to take advantage of 
the smaller images that jlink can create."


The Java runtime that's present in even the smallest image (just 
java.base) still lets you to put your pre-existing JAR files on the 
classpath and run them with java -cp. You do not need to turn your JAR 
files into named modules. Even as traditional JAR files, they have 
access to all the APIs that you would expect from such an image. 
(Obviously if your JAR files try to use Swing on an image built from 
just java.base, that won't work.) The reduced footprint and security 
surface of the smaller image is plainly an advantage from jlink.


Alex


Re: Moving to Java 9 - module-info.class not found for module

2017-09-01 Thread Alex Buckley

On 9/1/2017 1:21 PM, Rahman USTA wrote:

java --module-path
%JAVA_HOME%/jmods;target\terminalfx.jar;target\dependency --add-modules
terminalfx -m terminalfx/com.terminalfx.AppStarter


(You shouldn't need the --add-modules, since terminalfx is already the 
main module.)



It works normally. Then, I want to generate a jlink image with the
following script

jlink --module-path
%JAVA_HOME%/jmods;target\terminalfx.jar;target\dependency --add-modules
terminalfx --launcher terminalfx=terminalfx/com.terminalfx.AppStarter
--output target/release

However it gives me the following error;

Error: module-info.class not found for jackson.databind module


I suspect jackson.databind is an automatic module. jlink does not 
support linking of automatic modules because they can rely on the 
arbitrary content of the classpath, which goes against the idea of a 
self-contained Java runtime.


Alex


Re: Jigsaw related questions emerged during Java 9 Jigsaw Hack Day

2017-08-14 Thread Alex Buckley

On 8/14/2017 1:35 AM, Oleg Tsal-Tsalko wrote:

Recently we ran a Java 9 Jigsaw Hack Day in Kiev, Ukraine, and several
questions have been raised:

1. When declaring 'exports' in a module descriptor, why are wildcards not
supported, like com.abs.*?


The packages exported by a module are meant to be a stable API that 
consumers can rely on. For this reason, we make the module author spell 
out the exported packages explicitly. This also dials down the 
likelihood of multiple modules needlessly exporting the same package. 
Additionally, it avoids the confusion that would occur if com.abs.* was 
exported without qualification while com.abs.foo was exported with 
qualification.



2. Why does compilation not fail when several modules with the same name
are put on the module path, but it only fails at runtime on startup (

https://github.com/AdoptOpenJDK/jdk9-jigsaw/tree/master/session-1-jigsaw-intro/08_ModulesExportConflict
)?


When compiling src/com.greetings/module-info.java, javac resolves 
'requires org.astro;' by


1) searching the modulepath for a directory called org.astro, then

2) seeking a module-info.class file in that directory which declares a 
module called org.astro.


The directory is found successfully (mods/org.astro), and a 
module-info.class file is found in that directory which declares 'module 
org.astro {..}'.


javac is uninterested in the existence of another directory 
(mods/org.astro2) which contains another declaration for a module called 
org.astro. If we assume that directories are named sensibly, that is, in 
accordance with the modules they declare, then it's reasonable for javac 
to ignore mods/org.astro2. javac observes just enough to compile what it 
was given on the command line -- it does not validate the world. In 
contrast, the Java runtime is in a good position to validate the world, 
so it's reasonable for the runtime to detect and complain about 
duplicate modules called org.astro. This should all be documented in JEP 
261.
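
(For reference, the layout in that example:

  mods/
    org.astro/
      module-info.class    // declares 'module org.astro { ... }'
    org.astro2/
      module-info.class    // also declares 'module org.astro { ... }'

javac resolves 'requires org.astro;' against mods/org.astro and ignores 
mods/org.astro2; the java launcher, scanning the whole module path at 
startup, reports the duplicate.)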



3. What is the best practice for using `transitive` dependencies? Where
should they be used, apart from cases where types of one module are used in
the API of another module, like return types, etc.?


Apart from API dependencies, 'requires transitive' is useful for 
declaring "aggregator" modules such as java.se.



4. If there are 3 conflicting modules on the module path, the error reports
only the conflict between the first 2 modules. Why is that? Fail fast?


This is really a request for better traceability of which modules are 
observed and resolved, which is fair.


Alex


Re: package hierarchy?

2017-07-17 Thread Alex Buckley
The quote from 6.5.3.2 is incorrect -- the paragraph beginning "If Q 
does not name ..." is deleted. We discussed this in May, where I 
explained why the section is the way it is. See 
http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-May/012779.html.


I have also explained that the meaning of the qualified package name 
Q.Id is NOT specified in terms of the simple package name Q. See 
http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-May/012488.html. 
That's why there is no requirement in 6.5.3.2 that Q be uniquely 
visible, only Q.Id.


Stepping back, you seem to have a concern that the spec no longer relies 
on hierarchy to give meaning to packages yet continues to present 
subpackages as a feature of the language; and you believe this is a 
tension that will have deleterious effects on ordinary developers. I do 
not believe it will, because ordinary developers have always understood 
that a package-access type in package p1.p2 is NOT accessible from the 
"related, but not really" package p1.p2.p3. In the modular world, I do 
not believe that a developer who exports p1.p2 has any expectation that 
p1.p2.p3 is exported too. We deliberately designed the syntax of exports 
-- one package per 'exports', no wildcards, no duplicates -- to ensure 
explicit, methodical enumeration of the packages which represent the 
module's contract with the outside world.
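
(In code -- using the package names from the discussion:

  module com.example.m {
      exports p1.p2;   // p1.p2 is accessible to other modules;
                       // p1.p2.p3 is a distinct package and stays internal
  }
)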


Alex

On 7/17/2017 3:59 AM, Stephan Herrmann wrote:

This continues a discussion we had in January, but still isn't
fully resolved in JLS 2017-06-26:

Does the parent-sub relation of packages bear any meaning
from the JLS p.o.v.?

On 2017-01-10 Alex conceded:
   "... you can consider all packages as unrelated to each other."

Still JLS 6.5.3.2 has this (my emphasis):

"If a package name is of the form Q.Id, then Q must also be a package name.
  The package name Q.Id names a package that is the member named Id
  *within the package named by Q*.
  If Q does not name an observable package (§7.4.3), or Id is not the
simple name
  of an observable subpackage of that package, then a compile-time error
occurs.

  If Q.Id does not name a package that is uniquely visible to the current
  module (§7.4.3), then a compile-time error occurs."

Now, what's the meaning of "the package named by Q"?

Technically, we need to consider Q as a package name, whose meaning
is determined by 6.5.3.1 or 6.5.3.2.
This inevitably implies that Q must be uniquely visible.

In the end, this requires, e.g., that package "java" be uniquely visible in
every Java 9 program. Or: every Java 9 program (even JDK itself) is
illegal.



I *imagine*, the spec authors intended two different definitions of
PackageName: an internal definition that is applied whenever JLS
itself refers to a package name, and an external, applicable to
type or package references in the source code.
I imagine, only the external definition has to apply the second
paragraph requiring unique visibility, whereas internal references
to PackageName should be unaffected by this addition.

*Perhaps*, the phrase "If Q does not name an observable package"
intends to override the normal interpretation of package names, to
require only observability, not unique visibility, but this doesn't
work if first name resolution is obliged to check unique visibility
and possibly fail compilation.

Now correcting JLS in its current form will be quite tedious, like:
"within a package that is identified by the name Q, by applying
6.5.3 recursively while ignoring any requirement of unique visibility".
Yikes.

Am I missing anything?
Stephan

Ceterum censeo ...
I'm really looking forward to the day, when JLS is refactored to
specify the language with no parent-sub package relation whatsoever,
leaving it entirely to any host system, to use prefixes of package
names for grouping stuff in containers like folders of a file system.
Wouldn't that make for a much cleaner specification?

OTOH, wherever JLS and compiler messages surface the concept of
a package "hierarchy", it will influence user expectations: Having
learned about package hierarchies users may likely expect that
exporting package "p1.p2" will also make "p1.p2.p3" accessible
just like "p1.p2.MyClass".


Re: Use classes in unnamed module that are also contained in a JDK platform/runtime module

2017-07-07 Thread Alex Buckley

On 7/6/2017 11:37 PM, Langer, Christoph wrote:

On 7/4/2017 12:02 AM, Langer, Christoph wrote:

I have some piece of software that we ship as a jar file and
which will hence run on a JDK 9 in the unnamed module. However,
this jar file contains a package that is also contained in our
JDK image in a module that is always part of the runtime.


It would be remiss of me if I didn't ask for some details about why
this scenario has arisen. From "always part of the runtime", I
would have guessed the package is in java.base, but you also
mention "our JDK image" so perhaps the package is in a module that
SAP always jlinks in?


Thanks for your interest in the details. Here it is: We have a
separate module that exposes an augmenting SAP specific API which is
publicly exported. This module is part of our JDK image.


The default set of root modules for the unnamed module is specified by 
JEP 261 rather than a JSR, but I guess the SAP JDK implementation uses 
the same default set as the OpenJDK implementation. Then, since your 
separate module exports a package without qualification, the separate 
module is in the default set.


You could mark the separate module as DoNotResolveByDefault so that it 
is observable *but not readable* by the unnamed module which holds your 
JAR file's tool classes + copies of SAP-specific API/impl classes.


Or, given that the JAR file is meant to be cross-JDK and so should never 
be aware of the separate module, your tool's launch script could make 
the separate module unobservable (--limit-modules).
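
(A sketch of that second option -- 'com.sap.extras' is a stand-in name for 
the separate module, and the launch script would list whatever JDK modules 
the tool really needs:

  java --limit-modules java.se -cp tool.jar com.example.Tool

With the observable universe limited to java.se and its dependences, 
com.sap.extras is never resolved, so the copies of those classes inside 
tool.jar on the class path are the ones that get loaded.)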


Alex


Re: Use classes in unnamed module that are also contained in a JDK platform/runtime module

2017-07-06 Thread Alex Buckley

Hi Christoph,

On 7/4/2017 12:02 AM, Langer, Christoph wrote:

I have some piece of software that we ship as a jar file and which
will hence run on a JDK 9 in the unnamed module. However, this jar
file contains a package that is also contained in our JDK image in a
module that is always part of the runtime.


It would be remiss of me if I didn't ask for some details about why this 
scenario has arisen. From "always part of the runtime", I would have 
guessed the package is in java.base, but you also mention "our JDK 
image" so perhaps the package is in a module that SAP always jlinks in?


Alex


Re: 8182482: Module System spec updates

2017-06-21 Thread Alex Buckley

On 6/21/2017 5:34 PM, Hamlin Li wrote:

Got it, Thank you for planing to update it, especially for "load" and
"instantiate", in several places they are used in a mixed way.


Yes, it is critical to be clear about:

- when a service provider is discovered (search a family of loaders or 
layers, depending on how the service loader was created), versus


- when a service provider is loaded (not by the load* methods! and maybe 
the provider is found in the cache...), versus


- when a service provider is instantiated (streaming lets you defer 
that, iteration doesn't).


Lots of puzzle pieces that need to snap together.
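
For a concrete (and hedged) illustration, here is a minimal sketch; the service
type MyService and the "Fast" filter are invented, but the ServiceLoader calls
are the Java 9 API being discussed:

import java.util.ServiceLoader;

public class ProviderStages {
    public interface MyService { void run(); }

    public static void main(String[] args) {
        // Streaming: discovery yields Provider descriptions; a provider class
        // is instantiated only when get() is called on its Provider.
        ServiceLoader.load(MyService.class).stream()
                .filter(p -> p.type().getSimpleName().startsWith("Fast"))
                .findFirst()
                .ifPresent(p -> p.get().run());

        // Iteration: the iterator loads and instantiates each provider as it advances.
        for (MyService s : ServiceLoader.load(MyService.class)) {
            s.run();
        }
    }
}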

Alex


Re: 8182482: Module System spec updates

2017-06-20 Thread Alex Buckley

Hi Remi,

On 6/20/2017 6:29 AM, fo...@univ-mlv.fr wrote:

ok, let's focus on abstract class defining a service.


I would be happy for the "Designing services" section to give more 
advice about the tradeoffs between an interface and an abstract class. 
Two sentences, written in a style that leads a junior developer but does 
not judge them if they don't follow the advice. Can you write it? :-)


-
A service is a single type, usually an interface or abstract class. 
***REMI'S TEXT HERE*** A concrete class can be used, but this is not 
recommended. The type may have any accessibility.


The methods of a service are highly ...
-
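
As a hedged illustration of the tradeoff being asked about (type and member
names invented, kept in separate source files), the two usual shapes of a
service type look like this:

// Codec.java -- service declared as an interface: providers stay free to
// extend whatever class they like.
public interface Codec {
    String name();
}

// BaseCodec.java -- the same service declared as an abstract class instead:
// the author can ship shared behavior, but it claims the provider's superclass slot.
public abstract class BaseCodec {
    public abstract String name();
    public String describe() { return "codec: " + name(); }
}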

Alex



Re: "The unnamed module reads every other module."

2017-06-06 Thread Alex Buckley

On 6/6/2017 2:55 PM, Stephan Herrmann wrote:

On 06.06.2017 23:40, Alex Buckley wrote:

On 6/6/2017 2:28 PM, Stephan Herrmann wrote:

SOTMS says: "The unnamed module reads every other module."

I am unable to find any similar statement in any specification.
Where should I be looking?


The unnamed module is never resolved, so it doesn't feature in the
reworked spec for Resolution in java.lang.module. Instead, the
dependences and exports of unnamed modules are specified in new text
in JLS 7.7.5 and JVMS 5.3.6.

Alex


Show me.

I don't see anything relating to "dependences and exports of unnamed
modules"
in either document, versions 2017-05-25 and 2017-02-22 respectively,
which for me are the current versions linked from
http://cr.openjdk.java.net/~mr/jigsaw/spec/


It's new text, it hasn't been published yet.


Re: "The unnamed module reads every other module."

2017-06-06 Thread Alex Buckley

On 6/6/2017 2:28 PM, Stephan Herrmann wrote:

SOTMS says: "The unnamed module reads every other module."

I am unable to find any similar statement in any specification.
Where should I be looking?


The unnamed module is never resolved, so it doesn't feature in the 
reworked spec for Resolution in java.lang.module. Instead, the 
dependences and exports of unnamed modules are specified in new text in 
JLS 7.7.5 and JVMS 5.3.6.


Alex


Re: What does a qualified name mean for a module?

2017-06-06 Thread Alex Buckley

On 6/6/2017 10:14 AM, Stephan Herrmann wrote:

Normally, a qualified name denotes two things: a parent element and a
child. The package name "java.lang" has a qualifier "java" which
denotes a top-level package and "lang" can be used relative to that
package to denote a member package etc.

For a module - say "java.base" - the qualifier "java" denotes
nothing. And hence, the simple name "base" cannot be resolved in any
context.

So the question is: should ModuleElement.getSimpleName() answer the
totally useless last segment of the name, or should it answer the
same as getQualifiedName()?


When Joe asked for feedback on this API two months ago [1], I made 
essentially the same point [2], and a bug was filed [3].


Alex

[1] 
http://mail.openjdk.java.net/pipermail/compiler-dev/2017-April/010896.html
[2] 
http://mail.openjdk.java.net/pipermail/compiler-dev/2017-April/010905.html

[3] https://bugs.openjdk.java.net/browse/JDK-8163989


Re: What does a qualified name mean for a module?

2017-06-06 Thread Alex Buckley
A module name has the same structure as a package name, so ModuleElement 
has the same shape as PackageElement: each inherits getSimpleName() from 
Element, and getQualifiedName() from QualifiedNameable.
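
For a hedged illustration (the processor name is hypothetical), a trivial
annotation processor can print both names for java.base so the distinction
under discussion is visible:

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.ModuleElement;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("*")
@SupportedSourceVersion(SourceVersion.RELEASE_9)
public class ModuleNameProbe extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Look up java.base and report both of its names.
        ModuleElement base = processingEnv.getElementUtils().getModuleElement("java.base");
        if (base != null) {
            processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE,
                    "qualified=" + base.getQualifiedName() + ", simple=" + base.getSimpleName());
        }
        return false;
    }
}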


Alex

On 6/6/2017 7:24 AM, Jayaprakash Artanareeswaran wrote:

Hello,


The newly introduced ModuleElement has two APIs to get a module's name, namely 
getQualifiedName() and getSimpleName(). The JLS, though, says a module only has 
one name.


"A module name consists of one or more Java identifiers (§3.8) separated by "."
tokens."


I also see this in the "JPMS: Modules in the Java Language and JVM":

ModuleName:
   Identifier
   ModuleName . Identifier

I am not really sure what a qualifier for a module is. In the given example

module M.N {}

are 'M' and 'N' separate names and if so, what do they denote?

Jay



Re: JPMS Access Checks, Verification and the Security Manager

2017-06-01 Thread Alex Buckley

On 5/24/2017 12:13 AM, Volker Simonis wrote:

OK, so from what you say I understand that the verification errors I
see with the Security Manager enabled are an implementation detail of
HotSpot (because verification uses the same class loading mechanism
as the runtime), which is not required but still acceptable under the
JVMS. Is that correct?


The JVMS is precise about which exceptions are allowed to be thrown by a 
JVM implementation during verification, and AccessControlException is 
not one of them. However, the JVMS is only one part of the Java SE 
Platform Specification. It is quite proper if another part specifies an 
AccessControlException when a class in a restricted package is 
referenced by a class without permission.


I'm thinking in particular of the API specification for 
SecurityManager::checkPackageAccess. It states, "This method is called 
by the loadClass method of class loaders." Plainly, the intention is 
that a class (Tricky) which initiates the loading of another class 
(com.sun.crypto.provider.SunJCE) can do so only if it has permission to 
reference the other class. Unfortunately, the statement as written is 
only guaranteed to be true for the built-in class loaders of the Java SE 
Platform and not for user-defined class loaders. Accordingly, we will 
update the API specification to clarify how a JVM implementation may 
support the Security Manager in checking permissions when classes are 
loaded and resolved. But to answer your original question, an 
application CAN fail because the verifier can't load classes due to 
Security Manager restrictions; you may have to grant additional 
permissions if application classes wish to reference certain JDK 9 packages.


Alex


Re: Compiling with automatic modules

2017-05-30 Thread Alex Buckley

On 5/30/2017 2:08 PM, Jochen Theodorou wrote:

On 30.05.2017 21:42, Alex Buckley wrote:

On 5/26/2017 4:12 AM, Jochen Theodorou wrote:

On 26.05.2017 01:04, Alex Buckley wrote: [...]

The semantics of an observed JAR without module-info.class are
 specified as part of JPMS resolution, and JLS 7.3 explicitly
defers to that, so I believe it is clear how a compiler must
behave when a modular compilation unit 'requires' a module that
turns out to be automatic. (Of course a big part of the
migration story is that the requirer is unaware of whether the
requiree is automatic or explicit.)


Isn't the consequence that I can write a compiler which does only
allow named modules?


You mean a compiler that understands named module and does not
understand unnamed modules?


Actually I was wondering more about automatic modules and was inexact in
my question.


No, per JLS 7.7.5: "An implementation of the Java SE Platform must
support at least one unnamed module."  The mandates for unnamed
modules in 7.7.5 are essentially identical to the historical
mandates for unnamed packages in 7.4.2.


""" An implementation of the Java SE Platform must support at least
one unnamed module. An implementation may support more than one
unnamed module, but is not required to do so. Which ordinary
compilation units are associated with each unnamed module is
determined by the host system.

The host system may associate ordinary compilation units in a named
package with an unnamed module. """

OK, from this I understand there must be at least one unnamed module.
 Nothing about automatic modules.


Correct. Automatic modules are named modules known to the JPMS, just 
declared implicitly by the JPMS rather than explicitly in the Java 
language. Where a named module IS declared explicitly in the Java 
language, it may reference, in its 'requires' directives, any other 
named module known to the JPMS, regardless of whether that other named 
module is declared implicitly or explicitly.
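
For example (module names invented), the declaration below is written the same
way whether org.example.util turns out to be an explicit module or an automatic
one derived from a plain JAR:

module com.example.app {
    // Same directive whether org.example.util is explicit (module-info.class
    // present) or automatic (plain JAR, possibly named via Automatic-Module-Name).
    requires org.example.util;
}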



What comes after that is a bit confusing to me. Could I, for example,
say that only compilation units that declare themselves to be part of a package
named "unnamed" will be part of the unnamed module?


Yes, the host system can choose to associate those compilation units 
with an unnamed module if it wishes. See JLS9 7.2 and 7.3, paying 
attention to the flexibility granted for ordinary compilation units (the 
ones in your paragraph) versus no flexibility for modular compilation units:


-
Each host system determines which compilation units are observable in a
particular compilation (§7.3). Each host system also determines which 
observable compilation units are associated with a module.


...

The host system also determines which observable ordinary compilation 
units are associated with a module, except <>.


The host system must determine that an observable modular compilation 
unit is associated with the module declared by the modular compilation unit.

-


I mean, I understand that the "which" refers to the way the files are
given to javac... But it feels like the JLS allows many other
variants here as well.


Correct. The JLS is a language spec, not a compiler spec. With the 
exception of the rule for 'public' types in 7.6, the JLS has 
historically imposed very few constraints on a compiler ("host system").


Alex


Re: Compiling with automatic modules

2017-05-30 Thread Alex Buckley

On 5/26/2017 4:12 AM, Jochen Theodorou wrote:

On 26.05.2017 01:04, Alex Buckley wrote:
[...]

The semantics of an observed JAR without module-info.class are specified
as part of JPMS resolution, and JLS 7.3 explicitly defers to that, so I
believe it is clear how a compiler must behave when a modular
compilation unit 'requires' a module that turns out to be automatic. (Of
course a big part of the migration story is that the requirer is unaware
of whether the requiree is automatic or explicit.)


Isn't the consequence that I can write a compiler which does only allow
named modules?


You mean a compiler that understands named module and does not 
understand unnamed modules? No, per JLS 7.7.5: "An implementation of the 
Java SE Platform must support at least one unnamed module."  The 
mandates for unnamed modules in 7.7.5 are essentially identical to the 
historical mandates for unnamed packages in 7.4.2.


Alex


Re: Package name semantics

2017-05-25 Thread Alex Buckley

On 5/23/2017 1:37 PM, Stephan Herrmann wrote:

To tersely illustrate my confusion: "the" _could_ imply that the
existence of several packages named Q.Id forces separate parent
packages Q, so that each Q indeed contains exactly one member named
Id.

From your mail I infer that this is not your intention.

I read you as saying: even a single package named Q can contain
arbitrarily many member packages named Id, provided that (at the
location of each package reference) exactly one of those packages is
visible (or a compile time error occurs), right?


Yes.

Alex


Re: Compiling with automatic modules

2017-05-25 Thread Alex Buckley

On 5/23/2017 1:47 PM, Stephan Herrmann wrote:

On 23.05.2017 22:30, Alex Buckley wrote:

Automatic modules are not a source artifact, so their specification is
found in the API portion of the JPMS spec rather than in the JLS. The
JLS has traditionally not specified how a compiler interprets
non-source artifacts, e.g., the JLS says nothing about whether the
host system understands a "class path" or that such a thing might
identify non-source artifacts like JAR files. The analog in the Java
SE 9 era is that it's up to a compiler to choose to support a thing
called the "module path" and to identify non-source artifacts on it; I
do not see how the JLS can say anything for that. Once a compiler has
made the choice, then the JPMS specifies that certain non-source
artifacts must be interpreted as automatic modules, at which point
they're "just" named modules that a modular compilation unit's
'requires' directive can refer to.


There's a decisive difference: for .class files (in the file system
or in a .jar) the compiler "knows" what they mean, because .class
files are squarely in the realm of compilers. There are no hidden
semantics.


I agree that JLS 13.1 mandates a compiler to have detailed knowledge of 
.class files, but I do not agree that the JLS mandates a compiler to 
understand JAR files. It is merely convention that motivates a compiler 
to (i) understand a thing called "classpath" and (ii) open JAR files 
thereon in search of .class files. In Java SE 9, the analog of (i) 
remains -- it is convention that a compiler understands a thing called 
"modulepath" -- but we go beyond convention for (ii) -- JPMS resolution 
actually specifies that observable JAR files without module-info.class 
declare automatic modules. So, assuming that a compiler has a way to 
observe JAR files in the first place, JLS 7.3 specifies that 'requires 
M' in a modular compilation unit triggers JPMS resolution to read a 
module called M, and JPMS resolution will abstract away whether M is an 
explicit module (observed in M.jar with module-info.class) or an 
automatic module (observed in M.jar [or Foo.jar with 
Automatic-Module-Name: M] without module-info.class).



A jar-as-automatic-module has semantics that are not known to the compiler,
unless the JLS specifies something to this end.


The semantics of an observed JAR without module-info.class are specified 
as part of JPMS resolution, and JLS 7.3 explicitly defers to that, so I 
believe it is clear how a compiler must behave when a modular 
compilation unit 'requires' a module that turns out to be automatic. (Of 
course a big part of the migration story is that the requirer is unaware 
of whether the requiree is automatic or explicit.)


Alex


Re: Compiling with automatic modules

2017-05-23 Thread Alex Buckley

On 5/23/2017 12:54 PM, Stephan Herrmann wrote:

The 2017-05-18 draft of JLS indicates that automatic modules are beyond the
scope of JLS.
I'm puzzled what that should mean for a compiler.
At face value it seems to say that compilers need not care about automatic
modules. Instead I'd expect JLS to state that the host system must be able
to discover automatic modules "as if by invocation of
ModuleFinder.of(Path...)." or similar.

Which is it?


Can you quote the text of concern?

Automatic modules are not a source artifact, so their specification is 
found in the API portion of the JPMS spec rather than in the JLS. The 
JLS has traditionally not specified how a compiler interprets non-source 
artifacts, e.g., the JLS says nothing about whether the host system 
understands a "class path" or that such a thing might identify 
non-source artifacts like JAR files. The analog in the Java SE 9 era is 
that it's up to a compiler to choose to support a thing called the 
"module path" and to identify non-source artifacts on it; I do not see 
how the JLS can say anything for that. Once a compiler has made the 
choice, then the JPMS specifies that certain non-source artifacts must 
be interpreted as automatic modules, at which point they're "just" named 
modules that a modular compilation unit's 'requires' directive can refer to.


Alex


Re: Package name semantics

2017-05-23 Thread Alex Buckley

On 5/23/2017 1:04 PM, Stephan Herrmann wrote:

The 2017-05-18 update of JLS 6.5.3.2 introduces the concept of
unique visibility, but still has this unchanged sentence:

"The package name Q.Id names a package that is the member named Id
   within the package named by Q."

If "the" in "the member named Id" is to be taken literally, then
the specification still doesn't work, as I may elaborate if needed.
But I assume, dropping "the" in favor of "a" or a similar change
easily fixes this to reflect the intention, right?


The full text is:

-
If a package name is of the form Q.Id, then Q must also be a package 
name. The package name Q.Id names a package that is the member named Id 
within the package named by Q.


[DELETED]If Q does not name an observable package (§7.4.3), or Id is not 
the simple name of an observable subpackage of that package, then a 
compile-time error occurs.[/DELETED]


[ADDED]If Q.Id does not name a package that is uniquely visible to the 
current module (§7.4.3), then a compile-time error occurs.[/ADDED]

-

The editorial style of this section has historically been rather odd, 
because the first paragraph makes assertions that are true only if the 
second paragraph's compile-time error doesn't occur. I have continued 
with that style in JLS9: the first paragraph can say "_the_ member named 
Id" because the second paragraph assures a unique Q.Id.


Alex



Re: JPMS Access Checks, Verification and the Security Manager

2017-05-23 Thread Alex Buckley

On 5/23/2017 7:44 AM, Volker Simonis wrote:

So maybe I rephrase my question a little more generally:

Is it required for the verifier to do security and/or access checks
during the verification phase or could/should these checks be
postponed to runtime? The issue with verification errors due to
missing classes from Remi's previous answer is probably a corner case
of this question.


Verification must perform class loading in order to check subtyping, but 
verification does not check access to the loaded classes. To be precise, 
verification in JVMS 4.10.1 does not appeal to class resolution (JVMS 
5.4.3.1) nor to access control (JVMS 5.4.4). Nor does verification in 
JVMS 4.10.1 know what the package.access file is.


What you are seeing when the Security Manager is enabled is that class 
loading fails (due to a package.access check in Hotspot) and so 
verification fails. The verifier is not performing the package.access 
check per se.


Alex


Re: private and non-final fields in Java 9 interfaces

2017-05-11 Thread Alex Buckley

compiler-dev is the right list to query javac's behavior.

Support for private methods in interfaces came via JEP 213, and it 
sounds like you're saying private fields are allowed accidentally. 
Please give example source code when you write to compiler-dev.


Alex

On 5/11/2017 3:45 PM, Ess Kay wrote:

(This is not a Jigsaw-specific question but I could not find
a more appropriate mailing list. The COIN list is archived. If there
is a more appropriate mailing list then please let me know.)

The Java 9 compiler currently allows
1) private static and instance fields and
2) non-final static and instance fields.
Is this a bug?  If not then are these changes specified or mentioned anywhere?



Re: Need help implementing Java modules

2017-05-10 Thread Alex Buckley

On 5/9/2017 5:20 PM, Ralph Goers wrote:

Log4j already has a robust plugin approach for users to implement
their own Appenders, Filters, Layouts, and other Log4j components. We
are not going to modify that as it would severely impact our users
who have already implemented custom components and what we have works
very well. Although the FlumeAppender I mentioned previously is
provided by Log4j, it, and all other Log4j components, are located via
an annotation processor provided by Log4j. The processor runs at
compile time and generates a file named
META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat.
All of these files are then located at runtime using
ClassLoader.getResources(). As we parse the configuration provided by
the user we instantiate the plugins using the class names provided in
those files and the static builder or static factory method provided
in the plugin class. From what I am reading in the javadoc we should
not have any trouble with this because META-INF is not a valid
package name so the resource file will not be hidden. Log4j uses
reflection to call the static builder or static method to create the
plugin.

With all this in mind, if users create modules will they be required
to declare the packages where they have created plugins as “open” to
log4j for this to work? I am assuming that Log4j will be able to
access this module without having to declare a dependency on it?


Basically you've got it.

- The Log4j module does not need to 'requires' any user module, nor 
invoke addReads to "read" any user module.


- Log4j code can continue to invoke Class::forName to obtain a Class 
object for a class in a user module. This is based purely on class 
loader visibility, no changes from the module system.


- Log4j code can continue to invoke Class::getMethod et al to obtain a 
Method object for a static builder/factory method. There are no access 
control rules when simply "inspecting" the Class object via getMethod, 
getMethods, getFields, etc.


- Log4j code can attempt to invoke Method::invoke on its preferred 
Method object, but for the attempt to succeed, the user module must open 
or export the class's package. If you mandate that a plugin class must 
be public with a public builder/factory method, then exporting the 
package will be enough; otherwise the package needs to be open.


- I recommend that you recommend that users should open the package to 
specifically Log4j, rather than opening the package to everyone (or 
opening the entire module). You said that your JARs will probably be 
automatic modules for some time, so you can use the 
Automatic-Module-Name manifest entry to introduce a stable name that 
user modules can open to.
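
A hedged sketch of what such a user module declaration might look like (the
package name is invented; the Log4j module name is assumed here to be the
stable name chosen via Automatic-Module-Name):

module com.example.app {
    requires org.apache.logging.log4j.core;

    // Open the plugin package to Log4j only, not to the whole world.
    opens com.example.app.plugins to org.apache.logging.log4j.core;
}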


Alex


Re: Need help implementing Java modules

2017-05-10 Thread Alex Buckley

On 5/9/2017 11:50 PM, Remi Forax wrote:

On May 10, 2017 2:20:31 AM GMT+02:00, Ralph Goers 
wrote:

With all this in mind, if users create modules will they be
required to declare the packages where they have created plugins as
“open” to log4j for this to work?


It depends on whether you need to bypass encapsulation when you call
the static method, i.e. if your current code uses setAccessible, then
yes, the plugin's module has to be opened.


Yes; I'll talk about that in a parallel mail.


In any case, you need to add a read edge at runtime from log4j to the
plugin, otherwise you will not find the plugin's classes.


No need for a reads edge. If you're using Core Reflection to instantiate 
classes and access their members, then you're subject to class loader 
visibility (which is the same as in JDK 8) and module accessibility 
(hence the need to open or export the plugin package), but readability 
comes for free.
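
A minimal sketch of that reflective path (class and method names invented);
note that no addReads call appears anywhere:

import java.lang.reflect.Method;

public class PluginFactory {
    static Object create(String className) throws Exception {
        Class<?> pluginClass = Class.forName(className);        // class loader visibility, as in JDK 8
        Method factory = pluginClass.getMethod("newInstance");   // inspecting members needs no access check
        return factory.invoke(null);                             // module accessibility is checked here
    }
}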



I am assuming that Log4j will be able to access this module without
having to declare a dependency on it?


yes, if you add a read edge at runtime.


No need for that.

Alex


Re: Need help implementing Java modules

2017-05-09 Thread Alex Buckley

On 5/9/2017 3:04 PM, Ralph Goers wrote:

Pardon me for being dense, but my reading said that Java modules
disallowed runtime cycles as well as compile-time ones. Once LoggerFinder
binds with the module that provides that service, does that not create
a runtime dependency?  I don’t recall seeing anything describing what
the behavior of that is.


The module system disallows cycles in the 'requires' directives of 
module declarations. The module system allows cycles in the "reads" 
relation of run-time modules.


When java.base 'uses LoggerFinder;' and invokes ServiceLoader to find 
providers, there is indeed a "reads" cycle created between the provider 
module and java.base. ServiceLoader is not special in creating this 
cycle -- you can create them yourself with the two addReads methods in 
the API, and all automatic modules have cyclic readability. But there is 
no 'requires' cycle.
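
For illustration, a hypothetical provider module for exactly this case:
java.base never 'requires' it, so there is no 'requires' cycle even though a
reads cycle appears once the provider is bound at run time.

module com.example.logging {
    provides java.lang.System.LoggerFinder
        with com.example.logging.MyLoggerFinder;
}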


Alex


Re: Need help implementing Java modules

2017-05-09 Thread Alex Buckley

On 5/9/2017 11:51 AM, Ralph Goers wrote:

I am attempting to modularize Log4j and am running into some trouble
understanding how this can work.

Log4j supports many Appenders, such as the FlumeAppender,
KafkaAppender, CassandraAppender, and JPAAppender.


I don't know how Log4j is currently discovering Appenders, but I guess 
it's through reflective means (Class.forName, classpath scanning, etc). 
Services are a better approach because then the JDK does the discovery 
while you focus on the type-safe protocol between Log4j and Appenders. 
Consider this:


- The Log4j module should not require the FlumeAppender module et al.

- The Log4j module should export a service interface AppenderIntf.

- The FlumeAppender module should i) require the Log4j module to gain 
access to AppenderIntf, and ii) provide its Flume-specific 
implementation of the service interface ('provides AppenderIntf with 
FlumeAppenderImpl;'). If the FlumeAppender module requires helper 
modules, and they in turn require the Log4j module, that's fine.


- The Log4j module should use its own service interface ('uses 
AppenderIntf;') and discover appender implementations via ServiceLoader.
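
A hedged sketch of that wiring as two module declarations (all names invented,
one module-info.java per module):

// module-info.java of the Log4j module
module org.apache.logging.log4j.core {
    exports org.apache.logging.log4j.core.appender;   // package containing AppenderIntf
    uses org.apache.logging.log4j.core.appender.AppenderIntf;
}

// module-info.java of the FlumeAppender module
module com.example.flume {
    requires org.apache.logging.log4j.core;
    provides org.apache.logging.log4j.core.appender.AppenderIntf
        with com.example.flume.FlumeAppenderImpl;
}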


Alex


Re: Reminder to update JVMS

2017-05-08 Thread Alex Buckley

On 5/6/2017 10:13 AM, Stephan Herrmann wrote:

I just happened to search for the specification of the class file
representation of modules. I was quite surprised to find nothing in JVMS.


To clarify, you are referring to the Public Review Specification of JSR 376.


Then I found:
"The changes to the class-file chapter in support of module declarations
  are not included in this draft of the JVMS; they will be included in
  a future version of this specification."


That sentence contains a hyperlink to Chapter 2 of the document "JPMS: 
Modules in the Java Language and JVM". This document is a normative part 
of the Public Review Specification, so the class file format supported 
by Java SE 9 is the union of the JVMS9 draft + the document's Chapter 2.


As to precisely how Chapter 2 will affect the final JVMS9 in Java SE 9, 
the document clearly states: "The sections below will contribute to 
chapter 4 of JVMS9."



Makes me wonder if this is still work in progress?


As Alan noted, the class file changes are stable.

Alex


Re: Views on JSR 376 from the Eclipse JDT team

2017-05-05 Thread Alex Buckley

On 5/4/2017 11:51 AM, Markus Keller wrote:

The other big thing is that the JLS draft specifies the syntax of the
module-info.java file, but it is quite vague about the semantics of
modules.

JLS9 12 "Execution" explains how class loading is supposed to work in the
JVM, but it's unclear how modules and their access restrictions should
come into the picture here. Layers are not even mentioned anywhere. Before
Java 9, classloading and discovery of .class files was only a run-time
concern. During compilation, the assumption was that  all dependencies are
available and accessible from a flat source-/classpath.
=> Since a Java compiler is now also supposed to check access restrictions
imposed by module declarations, the JLS also needs to specify this in
depth, or it at least needs to point to JavaSE-9 APIs that contain the
necessary specifications. See e.g. Stephan Herrmann's questions about the
meaning of a qualified name.


Chapter 12 of the JLS contains a variety of material:

- 12.1, 12.2, and 12.3 cover run time behavior, not compile time 
behavior, so I'm not sure why the Eclipse compiler relies on these 
sections. For example, layers are not part of the Java language so there 
is nothing to mandate about them for a compiler. Looking forward, there 
are enhancements to the corresponding JVMS9 sections (5.2 and 5.3) which 
may yet result in enhancements to these JLS sections, but that is 
"editorial"; it will not affect the behavior of a compiler.


- 12.4, 12.5, and 12.6 are normative for a compiler but are unrelated to 
the module system.


- 12.7 and 12.8 are normative but for a JVM implementation not a compiler.


Some examples:
- "The ordinary compilation units that are visible to M are the
observable ordinary compilation units associated with modules read by M.
The host
system must use the Java Platform Module System to determine which modules
are read by M (§7.7.1)."
=> Neither "read" nor "Java Platform Module System" are specified
anywhere.


This was raised in the jigsaw-dev thread "Java Platform Module System" 
[1] and discussed at length there. The Java Platform Module System is as 
much part of the Java SE Platform as the Java language, so the Java 
Language Specification is able to depend upon the specification of the 
Java Platform Module System (which happens to be presented in API form 
rather than as a narrative document).


[1] http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-April/012301.html


- "An implementation of the Java SE Platform must keep track of types
within
packages by the combination of their enclosing module names and their
binary
names (§13.1). Multiple ways of naming a type must be expanded to binary
names
to make sure that such names are understood as referring to the same
type."
=>What should happen if there are multiple types with the same binary name
but different enclosing modules? Can they coexist or is this a compile
error? JLS9 7.6 "Top Level Type Declarations" doesn't mention modules when
it says: "It is a compile-time error if the name of a top level type
appears as the name of any other top level class or interface type
declared in the same package."


This was discussed in the same thread, and I made concrete suggestions 
for how to clarify the matter [2].


[2] http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-May/012488.html


Such things are relevant if you want to write a compliant compiler.

E.g. JLS9 7.7 "Module Declarations" informally talks about "the
modulepath" and "automatic modules", but neither of these concepts are
explained any further. Automatic modules, unnamed modules, and their
semantics must be specified in the JLS. The outdated
http://openjdk.java.net/projects/jigsaw/spec/sotms/ has some more
explanations, but since this is not part of the spec, it's irrelevant for
a vote on JSR 376.

=> The JLS must either be self-contained or it must link to relevant other
documents that are declared as equally dependable parts of the spec.


This was discussed in the same thread, and appeared to be clarified [3].

[3] http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-April/012425.html


The grammar for the module-info.java with its "restricted keywords" is
highly problematic, since the language it defines is not processable by
established compiler technology. Hacks are possible, but they are costly
and prevent established error recovery techniques from working.


This was discussed in the same thread, and appeared to be non-blocking 
for JDT [4]. The JLS has not guaranteed for many years that the grammar 
of the Java language is aligned with the capabilities of pre-existing 
tools. I recall that parsing lambda expressions presented a similar kind 
of challenge to JDT, and that admirable solutions were found [5].


[4] http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-May/012500.html
[5] 
https://www.eclipsecon.org/na2014/session/jdt-embraces-lambda-expressions.html


Alex


Re: Java Platform Module System

2017-05-03 Thread Alex Buckley

On 5/2/2017 3:39 PM, Alex Buckley wrote:

On 5/2/2017 7:07 AM, Jayaprakash Arthanareeswaran wrote:

Chapter 2 in [1] describes context-free grammars. The addition to "3.9
Keywords" defines "restricted keywords", which prevent the grammar for
ModuleDeclaration from being context-free. This prevents compilers from
using common parser generators, since those typically only support
context-free grammars. The lexical/syntactic grammar split defined in
chapter 2 is not of much use for actual implementations of
module-info.java parsers.
The spec at least needs to point out that the given grammar for
ModuleDeclaration is not actually context-free.


The syntactic grammar in JLS8 was not context-free either; the opening
line of Chapter 2 has been false for years. For JLS9, I will remove the
claim that the lexical and syntactic grammars are context-free, and
perhaps a future JLS can discuss the difficulties in parsing the


Jan Lahoda pointed out privately that the syntactic grammar in JLS8 and 
JLS9 is in fact context-free -- it's just not LL(1). What I should have 
said is that the grammar has not been LL(1) for a long time.


Alex


Re: Java Platform Module System

2017-05-02 Thread Alex Buckley

On 5/2/2017 7:07 AM, Jayaprakash Arthanareeswaran wrote:

Chapter 2 in [1] describes context-free grammars. The addition to "3.9
Keywords" defines "restricted keywords", which prevent the grammar for
ModuleDeclaration from being context-free. This prevents compilers from
using common parser generators, since those typically only support
context-free grammars. The lexical/syntactic grammar split defined in
chapter 2 is not of much use for actual implementations of
module-info.java parsers.
The spec at least needs to point out that the given grammar for
ModuleDeclaration is not actually context-free.


The syntactic grammar in JLS8 was not context-free either; the opening 
line of Chapter 2 has been false for years. For JLS9, I will remove the 
claim that the lexical and syntactic grammars are context-free, and 
perhaps a future JLS can discuss the difficulties in parsing the 
syntactic grammar.


Alex


Re: Java Platform Module System

2017-05-02 Thread Alex Buckley

On 5/2/2017 5:13 AM, Stephan Herrmann wrote:

Thanks, Alex, for promising improvements in various places of the spec.


Re: Multiple packages with the same name can be "visible" (helpful 
terminology for program analysis) but exactly one of these packages must 
be identified as the meaning of the name. First, let's define what we want:


7.4.3 Package Observability and Visibility
...
A package is _uniquely visible_ to a module M if and only if either (i) 
an ordinary compilation unit associated with M contains a declaration of 
the package, and M does not read any other module that exports the 
package to M; or (ii) no ordinary compilation unit associated with M 
contains a declaration of the package, and M reads exactly one other 
module that exports the package to M.


Then we can pick the "right" package to go with the name:

6.5.3.1  Simple Package Names

If a package name consists of a single Identifier, then the identifier 
must occur in the scope of exactly one declaration of a top level 
package with this name (§6.3), and that package must be uniquely visible 
to the current module (§7.4.3), or a compile-time error occurs. The 
meaning of the package name is that package.


6.5.3.2  Qualified Package Names

If a package name is of the form Q.Id, then Q must also be a package 
name. The package name Q.Id names a package that is the member named Id 
within the package named by Q. If Q.Id does not name a package that is 
uniquely visible to the current module (§7.4.3), then a compile-time 
error occurs.


(Note that 6.5.3.2 did not, and does not, recurse into 6.5.3.1 when the 
Q in the qualified name Q.Id is simple. 6.5.3.1 relies on scope, which 
means it applies only when someone has a declaration of a top level 
package, but plainly 6.5.3.2 has always been able to give meaning to the 
name P.Q based on a declaration of 'package P.Q;' -- no-one needed to 
declare a top level 'package P;'. The 6.5.3.2 phrase "Q must also be a 
package name" in JLS8 was a trailer for the phrase about Q being an 
observable package, and in JLS9 is a trailer for the phrase about Q.Id 
being a uniquely visible package.)



I recall several threads on this list ending in you saying "still being
clarified" [1][2]. Are those issues settled by now and just need to be
penned down? Otherwise it would be very helpful just to see the list of
open questions, so we don't bang our heads against walls that are still
subject to change (I'm not speaking about the general "Issue Summary",
but something more focused on the JLS and its dependencies in the JPMS spec).

Looking forward to an updated spec version, allowing us to double check
if those changes raise any follow-up questions,
Stephan

[1]
http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-January/010866.html


This was about multiple packages with the name P being visible at once. 
So that's covered by the "uniquely visible" invocation in package names.



[2]
http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-March/011544.html


The open question there was about multiple modules, directly required by 
a root module, containing the same package, but without any exports that 
would cause any module to see a split package. This module graph is 
legal, so it will resolve in JPMS; the JLS defers to that.


javac gives a lint warning (on-by-default, suppressible) if the modules 
being compiled cannot be mapped to the boot layer.


Alex


Re: Java Platform Module System

2017-05-01 Thread Alex Buckley

On 5/1/2017 1:48 PM, Stephan Herrmann wrote:

On 01.05.2017 22:17, Alex Buckley wrote:

- Another reference links "automatic modules" into JLS and will
probably
   link to ModuleFinder.of(Path...), right?


This text is also an informative note. Automatic modules are
discovered through ModuleFinder.of, sure, and they appear in other
places in the java.lang.module API too -- but none of that is the
point of the note. The point of the note is that the developer
doesn't specify 'requires' any differently for an automatic module
than for an explicit module.


You make it sound as if automatic modules are relevant only at runtime.


Huh? The JPMS is assumed to be present at compile time, not just run
time. And automatic modules are a feature of the JPMS.

Alex


Wait, are you expecting compilers to actually use the implementation behind
ModuleFinder etc. in order to discover modules etc.?

When you said
   "The host system must use the Java Platform Module System (as if by
execution of the 'resolve' method of java.lang.module.Configuration)"
I read this as requesting a compiler to perform the same operation as is
specified in that API.

Was the intention behind "must use the Java Platform Module System"
to say something like "must call methods of the API in java.lang.module"?


A compiler is not required to physically invoke methods of the JPMS API. 
A compiler is required to determine which modules are read by each 
module *as if* by execution of Configuration::resolve, a Java SE API. 
The as-if term is common in Java SE Platform specs to mean "Implement it 
any way you like, but the result must agree with the result from <the thing after as-if>."
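
A hedged sketch of that "as if" computation (the directory and root module
name are invented); a compiler need not run this code, but its answer must
agree with it:

import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.nio.file.Paths;
import java.util.List;
import java.util.Set;

public class ResolveDemo {
    public static void main(String[] args) {
        Configuration cf = Configuration.resolve(
                ModuleFinder.of(Paths.get("mods")),              // "before" finder
                List.of(ModuleLayer.boot().configuration()),     // parent configuration
                ModuleFinder.of(),                               // "after" finder
                Set.of("com.example.app"));                      // root modules
        cf.modules().forEach(m -> System.out.println(m.name())); // the resolved module graph
    }
}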


Alex


Re: Java Platform Module System

2017-05-01 Thread Alex Buckley

On 5/1/2017 12:40 PM, Stephan Herrmann wrote:

Asked differently: when it says
  "Generally, the rules of the Java programming language are
   more interested in dependences than dependencies."
which are the aspects where the rules of the Java programming language
*are* interested in dependencies?


I'm sorry that this attempt to be helpful -- to distinguish the popular 
term "dependency" from the actual term "dependence" used by the JLS -- 
is causing so much confusion. The JLS has always contained rather 
open-ended text intended to add color rather than to specify an 
implementation. But to answer your question: When we fix up readability 
in the API, I expect the JLS will then speak more about the dependencies 
which result from resolution and are more numerous than the dependences 
expressed with static 'requires' directives.



- Another reference links "automatic modules" into JLS and will probably
   link to ModuleFinder.of(Path...), right?


This text is also an informative note. Automatic modules are
discovered through ModuleFinder.of, sure, and they appear in other
places in the java.lang.module API too -- but none of that is the
point of the note. The point of the note is that the developer
doesn't specify 'requires' any differently for an automatic module
than for an explicit module.


You make it sound as if automatic modules are relevant only at runtime.


Huh? The JPMS is assumed to be present at compile time, not just run 
time. And automatic modules are a feature of the JPMS.


Alex


Re: Java Platform Module System

2017-05-01 Thread Alex Buckley

On 5/1/2017 12:23 PM, Stephan Herrmann wrote:

Meanwhile I've come to the interpretation that the main weakness of JLS
concerns
handling of same-named packages in different modules.

Trouble seems to start at a difference between 6.5.5.2 and 6.5.3.1/2:
To identify a type, that type must be accessible from the current module,
but for identifying a package, the package only needs to be visible.
Ergo: identifying a package does not consider exports.

Furthermore, 6.5.3.2 implies that each qualified name uniquely defines a
package,
when it speaks of "the member named Id within the package named by Q".
Note the two occurrences of "the".


Understood. I need to clarify 6.5.* to accept that multiple packages may 
be visible by the same name. But when we get to accessibility, only one 
of the visible packages should matter.



This finally undermines the definition of accessibility (6.6.1), when it
speaks
of the "module to which the package is exported". I read this as follows:
When M1 exports p1 to M2, this makes all public members of p1 accessible
in M2,
even those that belong to totally unrelated modules, which may not
export p1.

I recall Alex answering "this is still being clarified / discussed" to
several questions in this area.

As a result I can only conclude: JLS still doesn't tell us which module
system to implement.


If this were just a minor omission, why then would it still be subject to
discussion, this late in the game? I see one possible explanation:
changing the
spec may involve much more trouble than meets the eye. Changes
concerning packages
are very much focusing on the hierarchical structure of packages and sub
packages,
despite the fact that 7.1. has always been describing this structure as
having
"no significance in itself".
7.4.3 already jumps through hoops, trying to balance the hierarchy-based
notion of
"technically observable" with the concept of "really observable" which
disregards
the hierarchy.
In my view, a forest rooted at toplevel packages is not suitable for
specifying
the rules of accessibility, where each module may have a different
interpretation
of a given package name, in a way that is completely unrelated to
hierarchy.
Since "exports" refers to a package, this notion must be better aligned
with modules.


It's hard to respond to the same point in multiple sub-threads. Please 
see my other mail where I accept that package visibility is unrelated to 
'exports'.


Alex


Re: Java Platform Module System

2017-05-01 Thread Alex Buckley

On 4/30/2017 4:10 AM, Stephan Herrmann wrote:

No. (B) may be true for your example, but it is not for the following
(which is similar to examples we had in our January thread):

//-- M/module-info.java
module M { exports pm; }

//-- M/impl/Other.java
package impl;
public class Other { }

//-- M/pm/C1.java
package pm;
import impl.Other;
public class C1 extends Other {
 public void m1(Other o) {}
}
//--
//-- O/module-info.java
module O { requires M; }

//-- O/impl/Other.java
package impl;
public class Other { }

//-- O/po/Client.java
package po;
import pm.C1;
public class Client {
 void test1(C1 one) {
 one.m1(one);
 }
}
//--

Looking at O, and trying to determine whether the method invocation
one.m1(one)
is legal, M's type impl.Other is *relevant*, because analysis must ...
- detect that the type reference "Other" in the signature of m1 refers
to the
   type defined in M, not to the same-named type in O.
- similarly detect that the type reference in C1's class header (or
superclass
   classfile attribute) refers to M's impl.Other.
- conclude from the above, that C1 is compatible to m1's parameter.

Ergo, the set of types relevant from the perspective of O contains two
same-named types.


Per 7.3, it's true that when compiling any observable ordinary 
compilation units associated with O (such as O/po/Client.java), the host 
system must limit the ordinary compilation units that would otherwise be 
observable, to only those that are visible to O. Since O requires M, and 
since M/impl/Other.java is an observable ordinary compilation unit 
associated with M, we have that M/impl/Other.java is visible to O.


Then, by 6.3, the scope of M's top-level impl package is all observable 
compilation units in O.


Then, we get the difficulty in 6.5.3.2, because two top-level packages 
called impl are visible to code in O.


I specified package visibility in 7.3 in the way I did -- not 
considering exports -- in order to encourage compilers to take a "wide 
view" of what packages are physically present in required modules, even 
if a package isn't exported (M's package impl) and thus its types won't 
be accessible (M's type impl.Other isn't accessible to code in O).


For example, if M's author forgets to export impl (quite possible when M 
is first declared), I'd like a tool processing O to have the words to 
say: "impl is /visible/ in M, but not exported, so none of its types -- 
not even its 'public' types -- can be accessed".



If Java 9 permits this situation, it not only hugely increases the
complexity
of all tools that need to "understand" this program, it also wastes a
unique
opportunity that JPMS has, which existing module systems did not have:

Java 9 could make "API leaks" either illegal or ineffective and thus
rule out
an entire category of ill-formed programs, which to-date must unfortunately
be accepted by module-unaware compilers:

   (A) Java 9 has the opportunity to say that the declaration of m1 is
illegal,
   because it publicly exposes a non-accessible type, which is broken in
   every regard.

   (B) Alternatively, Java 9 has the opportunity to say that any attempt to
   invoke m1 from outside M is illegal, because clients would need to know
   about an inaccessible type.


Understood, but we didn't take those directions. More here: 
https://bugs.openjdk.java.net/browse/JDK-8153362


Alex


Re: Java Platform Module System

2017-05-01 Thread Alex Buckley

On 4/30/2017 3:25 AM, Stephan Herrmann wrote:

For the question at hand, this is what we learn from that improved
reference:
   "A readability graph is constructed"

Now we only need a link to the specification that *defines* what is a
readability graph and what is the meaning of "m1 reads m2".
I assume, you want to add a further reference to the "Resolution"
section of the package specification for java.lang.module?


Yes, the API spec for java.lang.module will be updated to define the 
readability relation. But instead of waiting for that, I recommend you 
watch https://youtu.be/Vxfd3ehdAZc?t=18m15s because readability has been 
stable for a long time.



 BTW: while pondering if the given package specification is sufficient,
 I wonder if "requires transitive" should work through multiple levels:
   M1
   M2 requires transitive M1
   M3 requires transitive M2
   M4 requires M3
 Does M4 read M1?


It does work through multiple levels, in order to support arbitrary 
amounts of refactoring: once you've released a module that someone else 
reuses (via 'requires'), then you've committed to your module's name and 
API but are free to refactor its content into deeper modules which your 
original module "reuses" (via 'requires transitive') for the benefit of 
consumers. There is no "re-exporting", just modules being made to read 
one another.


So, going top down (because resolution starts from a set of root 
modules): M4 requires and thus reads M3, and M3 requires transitive 
M2, so M4 reads M2. Since M4 reads M2, and M2 requires transitive M1, we 
have M4 reads M1.
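
Written out as module declarations (one module-info.java per module, names as
given in the question above):

module M1 { }
module M2 { requires transitive M1; }
module M3 { requires transitive M2; }
module M4 { requires M3; }   // reads M3, and via the transitive edges also M2 and M1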



 Looking at 7.7.1:
   "The requires keyword may be followed by the modifier transitive.
This causes
any module which depends on the current module to have an
implicitly declared
dependence on the module specified by the requires transitive
directive."
 Am I right in assuming that "depends" should cover explicitly and
implicitly
 declared dependences? Taking into consideration the subtlety about
dependence
vs. dependency, may I suggest adding something like
   "A module M1 is said to depend on another module M2, if it has an
explicitly
or implicitly declared dependence on M2."
 (this also makes "depends" a technical term rather than the general
(fuzzy)
  English word).


I understand the point; when we clarify the API spec for readability, 
I'll make sure the JLS usage of "depends" is explicitly aligned.



Revisiting other references to "Java Platform Module System" inside JLS,
what about the two occurrences in the body of 7.7:

- One reference is used to discriminate "dependence" from "dependency":
   From a quick scan, I believe this sentence:
 "Generally, the  rules  of  the  Java  programming  language  are
  more  interested  in  dependences  than dependencies."
   can probably be made stronger:
 "The rules of the Java programming language are not interested in
  dependencies, only in dependences.".
   Or perhaps the paragraph about dependencies could be removed entirely.
   If this interpretation is wrong, another reference to detailed
specification
   would be needed. Perhaps it is only JLS, that is agnostic to
dependencies,
   whereas the API specification part indeed uses this concept?


This text is an informative note distinguishing the "dependence" 
expressed rather statically in the Language, from the "dependency" 
module determined rather dynamically by the JPMS. I see no reason to 
change it.



- Another reference links "automatic modules" into JLS and will probably
   link to ModuleFinder.of(Path...), right?


This text is also an informative note. Automatic modules are discovered 
through ModuleFinder.of, sure, and they appear in other places in the 
java.lang.module API too -- but none of that is the point of the note. 
The point of the note is that the developer doesn't specify 'requires' 
any differently for an automatic module than for an explicit module.


Alex


Re: Java Platform Module System

2017-04-28 Thread Alex Buckley

On 4/27/2017 12:38 PM, Stephan Herrmann wrote:

On 25.04.2017 19:02, Alex Buckley wrote:

JPMS semantics (notably, dependency resolution) are defined by the API
specification (not the implementation) of
java.lang.module.Configuration and friends. JLS references to JPMS are
references to this Java SE API.


Got it. Since now JLS is no longer self-contained it would tremendously
help if we could get a list of which parts of the API specification are
expected to be considered at compile time. I understand that we need to
apply the naming rules for automatic modules. Is there more that should
be respected / validated / enforced at compile time?


The JLS was never self-contained as it always referenced a variety of 
java.lang and java.io types (and more recently java.lang.annotation and 
java.lang.invoke types). I have changed 7.3 to state:


"The host system must use the Java Platform Module System (as if by 
execution of the 'resolve' method of java.lang.module.Configuration) to 
determine which modules are read by M (§7.7.1). It is a compile-time 
error if the Java Platform Module System is unable to determine which 
modules are read by M."


That is, if a compiler processes a module declaration mentioning 
"requires X;", and the "as if" JPMS resolution fails because no module 
called "X" is found (whether an explicitly declared module with that 
name, or an implicitly declared i.e. automatic module with that name), 
then compilation fails too. The mapping from a JAR filename to an 
implicitly declared i.e. automatic module name is part of JPMS 
resolution. And even if a module called "X" is found, there are other 
reasons why JPMS resolution (and hence compilation) can fail, e.g. the 
module requiring X also requires Y and both X and Y export the same 
package. The JLS, as is traditional, allows a compiler to be as helpful 
or as terse as it likes w.r.t. the content of the compile-time error 
message.



Let me add a friendly reminder that we are still waiting for a
specification that unambiguously tells us which module system to implement.
For illustration:

(A) Is JPMS a module system that keeps the semantics of qualified names as
they are in Java 8 and only superimposes encapsulation boundaries?
(i.e., each type is globally uniquely identified by its qualified name).

(B) Is JPMS a module system that maintains the assumption that from the
perspective of each module all relevant types can be distinguished using
their qualified name?
(i.e. admitting identical qualified names as long as no compilation of one
module will ever encounter several candidates at once).

(C) Is JPMS a module system that establishes a separate namespace for each
module, where types with identical qualified name - but defined in
different modules - need to be distinguished?
(i.e., uniqueness is primarily required for module-prefixed qualified
names).

Despite some efforts I fail to find a definite answer in JLS (and Alex
mentioned that some of this is still being discussed). Still JLS as of
today sounds mostly like (A). To me (B) sounds like the natural choice, but I
understood Alex as saying it *should* be (C). I don't see, however, how the
conceptual framework of JLS could possibly support such design.


(B) and (C) are not mutually exclusive because (B) was worded from the 
perspective of each module while (C) was not.


(B) is true. Assume two modules M and N each contain the type P.C, but 
neither module exports P (or, M exports P and N doesn't, or, N exports P 
and M doesn't). Then, a third module O can require M and N. If code in 
any module refers statically to a type P.C, then JPMS resolution 
guarantees that P.C is either defined by that module or is exported to 
the module by exactly one other module which the module reads.


At run time, when 'java' is run with M+N+O on the modulepath, the system 
will stop -- M+N+O will pass resolution (i.e. a Configuration will be 
constructed) but they can't all be mapped to the application class 
loader. javac will produce a lint warning to this effect. However, M and 
N and O are by no means "bad" modules, either individually or jointly, 
as M+N+O will work if mapped to a multi-loader layer. So, (C) is true too.
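
A sketch of that scenario as module declarations (one per module-info.java):
both M and N contain a type P.C, and neither exports P.

module M { }                          // contains P.C, package P not exported
module N { }                          // contains P.C, package P not exported
module O { requires M; requires N; }  // legal; code in O cannot access either P.C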


The JLS, as is traditional with classes and packages, does not restrict 
the modules which can be given as input for an invocation of a compiler. 
A compiler is assumed to be able to process multiple modules at once. In 
the case of M and N, a compiler will encounter P.C in M and P.C in N, 
and is expected to distinguish them -- code in M refers to P.C in M 
while code in N refers to P.C in N. This (C)-style property is now 
expressed in 4.3.4:


"Two reference types are the same compile-time type if they are declared 
in compilation units associated with the same module (§7.3), and they 
have the same binary name (§13.1), and their type arguments, if any, are the same."

Re: Java Platform Module System

2017-04-25 Thread Alex Buckley

On 4/25/2017 1:20 AM, Stephan Herrmann wrote:

On 25.04.2017 03:50, Alex Buckley wrote:

Dependency resolution in JPMS is accomplished by the static 'resolve'
method of java.lang.module.Configuration.


Interesting.
Are you saying the semantics of JPMS depends on the implementation
of one or more methods in java.lang.module.Configuration and friends?
Are all mentions of JPMS inside JLS intended as references into JDK API?
How are compiler engineers expected to use this information?


JPMS semantics (notably, dependency resolution) are defined by the API 
specification (not the implementation) of java.lang.module.Configuration 
and friends. JLS references to JPMS are references to this Java SE API.


Alex


Re: Java Platform Module System

2017-04-24 Thread Alex Buckley

On 4/24/2017 5:22 PM, Stephan Herrmann wrote:

Obviously, defining JPMS is not done in index.html itself but delegated
to individual documents.

One of the linked documents is a version of JLS with changes on behalf
of JSR 376.

Jay's question was triggered by the observation that this exact version
of JLS contains references like these:
  - "the host system must use the Java Platform Module System to
determine ..."
  - "A 'dependency' is the module resolved by the Java Platform Module
System
for a given requires directive."
  - "The Java programming language does not distinguish between named
modules
specified explicitly in module declarations versus named modules
specified by the
Java Platform Module System when it detects a JAR file on the
modulepath
('automatic modules')"

This creates the impression that for implementing a compiler for JPMS
another document must be consulted in addition to JLS, but the
reference "specified by the JPMS" gives no clue were to look, as it
appears inside the specification ofJPMS.


Dependency resolution in JPMS is accomplished by the static 'resolve' 
method of java.lang.module.Configuration.


Alex


Re: Issue with JavaFX and Jigsaw

2017-04-17 Thread Alex Buckley

On 4/17/2017 12:56 PM, Kevin Rushforth wrote:

The current implementation effectively requires that the containing
package of the type in question be exported unconditionally, since
javafx.beans uses the sun.reflect.misc.Trampoline class to reflectively
call the module. It happens to work if the user exports or opens the
package to ALL-UNNAMED on the command line, but that isn't something we
would want to document or recommend. Without changing the implementation
(which would be quite risky for JDK 9), it seems best to just require
the application's package to be unconditionally exported in JDK 9 and
then relax this requirement in JDK 10 to allow it to be exported (or
opened) just to javafx.base.


OK, so for the JDK 9 docs, "unconditionally exported (or unconditionally 
opened)" would be clear.


Alex


Re: Issue with JavaFX and Jigsaw

2017-04-17 Thread Alex Buckley

On 4/10/2017 3:56 PM, Kevin Rushforth wrote:

The short version is that JavaFX beans is (mostly) working as expected,
except for the misleading exception message. In JDK 9, any object that is
reflected on by JavaFX beans (specifically the items passed to TableView,
which are accessed via a PropertyValueFactory) needs to be in a package
that is exported unconditionally. In JDK 10 we can look into relaxing this
requirement such that the package only needs to be exported to javafx.beans.


"exported unconditionally" -- sorry to be picky but this guidance needs 
to be a bit tighter. Can't the user export to a (possibly large) set of 
javafx.* modules, rather than unconditionally? Is it OK to _export_ the 
package (rather than open it) because beans will access only its public 
types/members? What if the user opens the package anyway (rather than 
exports it)? I don't know the relationship between the 
PropertyValueFactory type and the type of the items passed to TableView, 
but I think the high-level requirement is: some user type must be 
accessible to JavaFX code, and that is achieved by exporting or opening 
the type's package to at least the javafx.* modules.
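
As a sketch of that tighter guidance in a hypothetical application module 
(module and package names here are made up, not taken from the JavaFX docs):

module com.example.tableapp {
    requires javafx.controls;

    // JDK 9 guidance in this thread: export the items' package unconditionally ...
    exports com.example.tableapp.model;

    // ... or, if qualified access turns out to be sufficient, restrict it to
    // the JavaFX modules that actually reflect on the item types, e.g.:
    // opens com.example.tableapp.model to javafx.base;
}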


Alex


Re: Progress report on SLF4J project modularization

2017-03-24 Thread Alex Buckley

On 3/24/2017 1:12 PM, Ceki Gülcü wrote:

To be more precise, I am trying to have a single jar file which will
be seen as modular on Java 9 and as a regular jar in older JVMs,
with module-info.class ignored.


Isn't that just the first two javac invocations in Alan's mail at 
http://mail.openjdk.java.net/pipermail/jigsaw-dev/2017-February/011306.html 
? Ignore the stuff about multi-release JARs in that scenario.


Alex


Re: Issue with JavaFX and Jigsaw

2017-03-24 Thread Alex Buckley

Trisha, many thanks for forwarding. Here's the fun part of the trace:

On 3/24/2017 2:34 PM, Trisha Gee wrote:

Caused by: java.lang.IllegalAccessException: class
sun.reflect.misc.Trampoline cannot access class
com.mechanitis.demo.sense.client.user.TwitterUser (in module
com.mechanitis.demo.sense.client) because module
com.mechanitis.demo.sense.client does not export
com.mechanitis.demo.sense.client.user to unnamed module @779d6cc6
at
java.base/jdk.internal.reflect.Reflection.throwIllegalAccessException(Reflection.java:423)
at
java.base/jdk.internal.reflect.Reflection.throwIllegalAccessException(Reflection.java:414)
at
java.base/jdk.internal.reflect.Reflection.ensureMemberAccess(Reflection.java:112)
at
java.base/java.lang.reflect.AccessibleObject.slowCheckMemberAccess(AccessibleObject.java:632)
at
java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:624)
at java.base/java.lang.reflect.Method.invoke(Method.java:539)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
at jdk.internal.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:547)
at java.base/sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:261)
at
javafx.base/com.sun.javafx.property.PropertyReference.getProperty(PropertyReference.java:198)
... 48 more
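
Assuming the demo's module declaration can be edited, a minimal sketch of 
one way to satisfy that access check, in line with the "exported 
unconditionally (or opened)" guidance elsewhere in this thread (the 
requires directive below is an assumption, not taken from the trace):

module com.mechanitis.demo.sense.client {
    requires javafx.controls;   // assumed

    // Makes TwitterUser's public properties reachable from javafx.base's
    // reflective code (via sun.reflect.misc.Trampoline); an unconditional
    // 'opens' of the same package is the other option mentioned in the thread.
    exports com.mechanitis.demo.sense.client.user;
}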


Re: ALL-UNNAMED module does not export all packages from classpath

2017-03-20 Thread Alex Buckley
I can't figure out which classes are on which path, and why you think 
ALL-UNNAMED should export FROM the classpath when its purpose is to 
export TO the classpath.


Please clarify your configuration in a few short sentences, rather than 
asking us to open a zip file on an unknown host.


Alex

On 3/20/2017 2:44 PM, Pavel Bucek wrote:

// moving from jdk9-dev, as suggested.

Hi Jon,

Thanks for clarification of the error message.

The main point here is that adding "import ... " fixes the issue, which
doesn't feel correct.

When dependencies are put on the classpath, the import statement is not
required.

Regards,
Pavel

On 20/03/2017 22:26, Jonathan Gibbons wrote:

If nothing else, the javac error message needs work.


(package org.mockito.stubbing is declared in module , which does not
export it)


The space between "module" and "," means there's an "empty" module
name there, for the unnamed module, which should have been stated
explicitly (i.e. "declared in the unnamed module").

Follow-ups would be better on jigsaw-dev or compiler-dev.

-- Jon



On 03/20/2017 02:15 PM, Libor Kramolis wrote:

Hello.

I have a problem compiling the following unit test:

import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class TestCase {

    @Test
    public void test() {
        Context context = mock(Context.class);
        when(context.test(any())) // returns org.mockito.stubbing.OngoingStubbing
            .thenReturn("mock");

        assertEquals("mock", context.test("any"));
    }

    interface Context {
        String test(String value);
    }
}
with the following error:

src/test/java/tst/TestCase.java:15: error:
OngoingStubbing.thenReturn(T,T...) in package org.mockito.stubbing is
not accessible
 .thenReturn("mock");
 ^
   (package org.mockito.stubbing is declared in module , which does
not export it)
   where T is a type-variable:
 T extends Object declared in interface OngoingStubbing
1 error

The interface org.mockito.stubbing.OngoingStubbing is returned by the when(…)
method. Whenever I explicitly import that interface (no other change in the
code is necessary), compilation works.
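
In other words, the workaround is this single extra line at the top of the
test file, with everything else unchanged:

import org.mockito.stubbing.OngoingStubbing;   // explicit import of the inferred return type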

A fully reproducible project is available as a zip file at
http://anise.cz/~paja/liba/reproducer.zip. It contains the javac
commands. It can also be built with Maven.

What do you think about this behaviour? It looks like a bug to me. The
import statement is very artificial in this case.

Thanks in advance for your help.

Best regards,
Libor






Re: No JLS assertion specifying a compile-time error

2017-03-03 Thread Alex Buckley

On 3/3/2017 3:01 PM, Georgiy Rakov wrote:

Currently javac from JDK 9 build 159 fails to compile the following modules:

module m1 { exports p; }
module m2 { exports p; }
module m3 {
    requires m1;
    requires m2;
}

./modules/m3/module-info.java:1: error: module m3 reads package p from
both m1 and m2
module m3 {
^
1 error

Currently the lang-vm document doesn't specify this behavior; however, the
API documentation specifies that such an error can occur during resolution.

Should lang-vm specify it explicitly as a compile-time error?


This is partly covered by the requirement, found in the JLS draft in the 
JSR 376 EDR, that "The host system must use the Java Platform Module 
System to determine which modules are read by M (§7.7.1)."


To complete the picture, I agree the JLS draft must mandate a 
compile-time error if the JPMS fails to determine which modules are read 
by M, typically because the JPMS could not resolve M. There is also the 
matter of what should happen for m3/module-info.java when m1 (or m2) 
contains _but does not export_ p; this is still under discussion.


Alex


Re: Confusing error message for inner non-public service provider

2017-02-09 Thread Alex Buckley

// Rewording and resending to avoid confusion.

On 2/9/2017 4:21 PM, Jonathan Gibbons wrote:

On 2/9/17 3:07 PM, Alex Buckley wrote:

All the JLS wants is for the class to be 'public'.

Does that just apply locally to the declaration of the class itself, or
does it also indirectly apply to any enclosing classes, in the case of a
nested class?


Just the declaration of the class itself. The JLS does NOT want the 
specified class to be accessible from . That is, the 
JLS does not care about a chain of access from  to the 
provider class, which might conceivably allow the provider class to have 
default (package) access. The JLS just wants the 'public' modifier on 
the class declaration, end of story.
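
Concretely, a minimal sketch of the case under discussion (module and type 
names are made up): the nested provider itself carries 'public' even though 
its enclosing class does not, and per the above that is all the JLS asks for.

// module-info.java
module com.example.provider {
    exports com.example.api;
    provides com.example.api.Service
        with com.example.internal.Outer.ServiceImpl;
}

// com/example/api/Service.java
package com.example.api;

public interface Service { String name(); }

// com/example/internal/Outer.java
package com.example.internal;

class Outer {                                    // enclosing class is not public
    public static class ServiceImpl implements com.example.api.Service {
        public ServiceImpl() {}                  // public no-arg constructor used by ServiceLoader
        public String name() { return "impl"; }
    }
}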


Alex


Re: Confusing error message for inner non-public service provider

2017-02-09 Thread Alex Buckley

On 2/9/2017 4:21 PM, Jonathan Gibbons wrote:

On 2/9/17 3:07 PM, Alex Buckley wrote:

All the JLS wants is for the class to be 'public'.

Does that just apply locally to the declaration of the class itself, or
does it also indirectly apply to any enclosing classes, in the case of a
nested class?


Just the declaration of the class itself. The JLS does NOT require that the 
specified class be accessible from , which would imply a chain of 
accessibility from  to the provider class. The JLS just wants the 'public' 
modifier on the class declaration, end of story.


Alex


Re: Confusing error message for inner non-public service provider

2017-02-09 Thread Alex Buckley

On 2/9/2017 2:49 PM, Vicente Romero wrote:

Just to double check, the right javac behavior in this case should be to
issue similar errors in both cases like:

some position here: error: ServiceImpl is not public in
com.example.internal; cannot be accessed from outside package


It's correct to give an error, but the clause "cannot be accessed from 
outside package" should be dropped (it's not relevant to ServiceLoader).



some other position here: error: Outer.ServiceImpl is not public in
com.example.internal; cannot be accessed from outside package


It's not correct to give an error at all. The JLS (acting on behalf of 
ServiceLoader) is not interested in the class Outer.ServiceImpl being 
"accessible" by some arbitrary client. All the JLS wants is for the 
class to be 'public'.



without mentioning anything about visibility in either case, right?


Correct. All the types we're discussing are in the same module so they 
(and their packages) are all visible to each other; package visibility 
is irrelevant to this example.


Alex

