On 03/01/2015 07:08 PM, Peter Levart wrote:

Hi Peter,
You can see the whole thing the opposite way, which I think is less disruptive in terms of tooling.

You only maintain one module, which depends on the latest version.
When you ship the module, say as a jar, you also ship the way to downgrade it as executable code. At install time, when all the modules are known and before trying to create the images,
the code that downgrades the module is executed if necessary.
This code can rewrite the bytecode of all the classes of the module, thus ensuring backward compatibility without having to provide binary backward compatibility.

It's a win-win-win strategy: your code doesn't live in the past, the bytecode rewriter can fail before runtime by reporting a problem (if, for example, a method is missing), and you can break binary backward compatibility if necessary while still staying compatible.

You can see this mechanism as a kind of annotation processor that will be executed at install time instead of at compile time.
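As a rough sketch of what such an install-time step could look like (the class name, the deny-list and the failure policy below are made up for illustration, not an existing tool), one could use the ASM bytecode library to scan every call site and fail as soon as a class that is missing on the target platform is referenced; a real downgrader would chain a ClassWriter and emit a replacement call site instead of throwing:

// DowngradeChecker.java -- hypothetical install-time check built on ASM
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

import java.io.IOException;
import java.io.InputStream;
import java.util.Collections;
import java.util.Set;

public class DowngradeChecker {

    // Hypothetical deny-list: internal names of classes absent on the target platform.
    private static final Set<String> MISSING_ON_TARGET =
            Collections.singleton("java/util/Base64");

    public static void check(InputStream classFile) throws IOException {
        ClassReader reader = new ClassReader(classFile);
        reader.accept(new ClassVisitor(Opcodes.ASM5) {
            @Override
            public MethodVisitor visitMethod(int access, String name, String desc,
                                             String signature, String[] exceptions) {
                return new MethodVisitor(Opcodes.ASM5) {
                    @Override
                    public void visitMethodInsn(int opcode, String owner, String mname,
                                                String mdesc, boolean itf) {
                        if (MISSING_ON_TARGET.contains(owner)) {
                            // Fail at install time, before the image is created.
                            throw new IllegalStateException(owner + "." + mname + mdesc
                                    + " is not available on the target platform");
                        }
                    }
                };
            }
        }, 0);
    }
}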

cheers,
Rémi


On 03/01/2015 12:53 PM, Remi Forax wrote:
Currently, there are two ways to solve the Base64 issue:
- by loading at runtime either the class that uses java.util.Base64 or the class that uses sun.misc.BASE64Decoder, and using it through an interface

And this, I think, is also the cleanest way to approach the problem that multi-version JAR wants to solve. The only argument against it is that it perhaps requires more work and planning, but that doesn't mean it is less maintainable. In general one has to split the software into "modules":

- base module (platform-independent code) + facade interfaces to access platform-dependent functionality
- platform-dependent module(s) that depend on the base module and contain implementations of the facade interfaces, published as services through ServiceLoader; one module per "target platform version" / "platform dependent feature" combination
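To illustrate that layout (the package and class names below are invented for the example, not taken from any particular project), the base module could contain something like:

// Base64Codec.java -- facade interface in the base module (platform-independent)
package org.example.base;

public interface Base64Codec {
    String encode(byte[] data);
}

// Base64Codecs.java -- also in the base module: returns whichever implementation
// the deployed platform-dependent module registered via ServiceLoader
package org.example.base;

import java.util.Iterator;
import java.util.ServiceLoader;

public final class Base64Codecs {
    private Base64Codecs() {}

    public static Base64Codec load() {
        Iterator<Base64Codec> it = ServiceLoader.load(Base64Codec.class).iterator();
        if (it.hasNext()) {
            return it.next();
        }
        throw new IllegalStateException("no Base64Codec implementation deployed");
    }
}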

The base module is compiled first (it does not depend on the platform version, so it should be compiled with the lowest-version javac/rt.jar to be able to run on any target platform).

The platform-dependent module(s) are compiled against the classes of the base module and with the javac/rt.jar of the desired target platform(s).
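Continuing the made-up example above, each platform-dependent module would then contain an implementation compiled against the matching rt.jar and register it as a service:

// Jdk8Base64Codec.java -- in the module targeting JDK 8+, compiled against a JDK 8 rt.jar
package org.example.jdk8;

import org.example.base.Base64Codec;

public class Jdk8Base64Codec implements Base64Codec {
    public String encode(byte[] data) {
        return java.util.Base64.getEncoder().encodeToString(data);
    }
}

// LegacyBase64Codec.java -- in the module targeting older JDKs, compiled against an older rt.jar
package org.example.legacy;

import org.example.base.Base64Codec;

public class LegacyBase64Codec implements Base64Codec {
    public String encode(byte[] data) {
        return new sun.misc.BASE64Encoder().encode(data);
    }
}

// Each platform-dependent module registers its implementation in a
// provider-configuration file named after the interface, e.g.
//   META-INF/services/org.example.base.Base64Codec
// containing a single line: org.example.jdk8.Jdk8Base64Codec (or the legacy class).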


Now, if the module system of JDK9 had a feature that enabled/disabled a deployed module conditionally, then that condition could be an exact target platform version.

But I know, the multi-version JAR wants to solve a similar problem for pre-JDK9 target platforms too, which don't yet have modules. The approach to organizing sources and compiling can be the same as described; the packaging can then just use multi-version JARs instead of "conditional" modules.

One problem I see with requiring that in multi-version JARs each version sub-path must have the same public classes with the same public APIs is that IDEs don't like that. They don't like to see multiple versions of the same class and they paint them red. Refactoring doesn't work correctly if IDEs see multiple versions of the same class, and it can never be made to work correctly, as it would have to keep the versions always in sync. I always have problems with IDEA when I present it the unix/windows versions of classes in the JDK build at the same time.

This can be solved nicely by not requiring that and simply relying on the programmer knowing what (s)he is doing. With interfaces as services and implementations locatable via ServiceLoader, one does not have to use the same class names for implementations of interfaces targeting different platforms, and IDEs will be happy.

Regards, Peter

