On 03/01/2015 11:41 AM, Florian Weimer wrote:
On 02/27/2015 06:16 PM, Paul Sandoz wrote:
On Feb 27, 2015, at 4:47 PM, Florian Weimer <fwei...@redhat.com> wrote:
I really don't think this tooling support will provide sufficient
enticement to developers to maintain separate 7/8/9 source branches of
their libraries. Isn't that the main obstacle, and not the way the bits
are delivered?
What if all the source for 7/8/9 bits were under one project?
Tool support for that is still worse than for separate branches of the
same project.
In general, VCS support for branches is still quite poor because the
sensible thing to do is to develop a feature on the master branch and
then backport it to older release branches as needed. But such
selective backwards-merging very much lacks VCS support because the
least common ancestor logic built into almost all current tools does not
apply in this scenario. From a tool perspective, developing the feature
in the oldest relevant release and then merging forward is the only
supported way. But it's usually bad engineering because you want new
features to use current facilities of the code base. So the only thing
you have left is to cherry-pick individual patches (I think that's what
“hg transplant” does in Mercurial).
Anyway, you lose the tiny bit of tool support you have for backporting
if you have everything in the same source repository.
For example, say there is a source file:
src/main/java/foo/Foo.java
whose content is:
import sun.misc.BASE64Decoder;

public class Foo {
    // does something with sun.misc.BASE64Decoder
}
There might be another source file, located in the 8 area, that overrides the
one in the unversioned area:
src/main-8/java/foo/Foo.java
whose content is:
import java.util.Base64;

public class Foo {
    // does something equivalent with java.util.Base64
}
This really screams preprocessor. :-(
The public contract of Foo should remain identical across the versions
targeting the different major Java platforms; more strictly, the public
signatures in the byte code should be identical (the jar tool has been modified
to check this).
If that's the goal, something like Retroweaver seems far more
appropriate than forcing library authors to maintain separate sources.
Yes, I fully agree.
In that case you need to bundle
- a jar compatible with the latest version
- and some metadata to help the 'retroweaver' transform the new code to the
old code.
Then you need a special step during the install process where you can inspect
the dependencies and choose whether to run the 'retroweaver' or not.
Multi-version JAR files seem useful because they will eventually allow
evolution of the class file format. But I think the current use case
isn't really there, I'm afraid.
I think being able to ship several versions in one jar is a valid use case,
but the solution seems to cause more trouble than just solving this one use
case is worth.
Currently, there are two ways to solve the Base64 issue:
- by loading at runtime either a class that uses java.util.Base64 or a class
that uses sun.misc.BASE64Decoder, and using whichever was loaded through a
common interface (see the sketch below)
- by using only one class, with code like this:
try {
    // use java.util.Base64
} catch (NoClassDefFoundError e) {
    // use sun.misc.BASE64Decoder
}
At compile time, you need to provide a fake empty class sun.misc.BASE64Decoder
that will not be bundled in the destination jar.
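For completeness, here is a minimal sketch of the first approach (one
interface, two implementations, only one of which is ever loaded at runtime);
the names Decoder, Java8Decoder, LegacyDecoder and the version check are made
up for the sake of the example:

// Hypothetical abstraction over the two decoders.
interface Decoder {
    byte[] decode(String data);
}

// Variant compiled against java.util.Base64 (Java 8 and later).
class Java8Decoder implements Decoder {
    public byte[] decode(String data) {
        return java.util.Base64.getDecoder().decode(data);
    }
}

// Variant compiled against sun.misc.BASE64Decoder (Java 7 and earlier).
class LegacyDecoder implements Decoder {
    public byte[] decode(String data) {
        try {
            return new sun.misc.BASE64Decoder().decodeBuffer(data);
        } catch (java.io.IOException e) {
            throw new IllegalArgumentException(e);
        }
    }
}

class Decoders {
    // Pick the implementation at runtime, via Class.forName, so that only the
    // matching class is ever loaded by the VM.
    static Decoder create() {
        String name = isJava8OrLater() ? "Java8Decoder" : "LegacyDecoder";
        try {
            return (Decoder) Class.forName(name).getConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new AssertionError(e);
        }
    }

    private static boolean isJava8OrLater() {
        // Crude version check, good enough for the purpose of the sketch.
        return !System.getProperty("java.specification.version").startsWith("1.7");
    }
}

Callers would then simply write something like
byte[] data = Decoders.create().decode("SGVsbG8=");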
Rémi