On Wed, Mar 30, 2011 at 02:21:04AM +0100, sebb wrote:
> On 30 March 2011 01:15, Gilles Sadowski <gil...@harfang.homelinux.org> wrote:
> > Hi.
> >
> >> We have been talking about moving away from interfaces as the
> >> preferred way to support people plugging in alternative
> >> implementations because they have in several places gotten "behind"
> >> due to the fact that adding anything to them breaks compatibility.
> >> We should probably continue that discussion in a different thread.
> >
> > [This is the different thread.]
> >
> > From the comments posted to the other thread, I gather that the main
> > trend is that, because some interfaces needed an upgrade, the "interface"
> > design tool is becoming "evil". Did I get this right?
> 
> It's not as clear-cut as that.
> Interfaces have their place, but have drawbacks if the original
> interface is later found wanting.

I have no problem with that; especially, I am against creating interfaces
just to follow the "programming by interface" paradigm.
This was done in a few places in CM, and I wholeheartedly welcome the
simplification brought by removing all interfaces for which there is only
one implementation.
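As a sketch of that simplification (class and method names here are invented, only loosely modeled on RandomData/RandomDataImpl, and are not the actual CM code):

```java
import java.util.Random;

// Before: an interface with exactly one meaningful implementation.
// The interface adds indirection without enabling any real polymorphism.
interface RandomData {
    double nextUniform(double lo, double hi);
}

class RandomDataImpl implements RandomData {
    private final Random rng = new Random(42);
    public double nextUniform(double lo, double hi) {
        return lo + rng.nextDouble() * (hi - lo);
    }
}

// After: the interface is dropped; one concrete class *is* the API,
// and alternative behaviour (if ever needed) can come from subclassing.
class RandomDataGenerator {
    private final Random rng = new Random(42);
    public double nextUniform(double lo, double hi) {
        return lo + rng.nextDouble() * (hi - lo);
    }
}

public class Demo {
    public static void main(String[] args) {
        // Same seed, same algorithm: both styles produce the same values.
        System.out.println(new RandomDataImpl().nextUniform(0, 10));
        System.out.println(new RandomDataGenerator().nextUniform(0, 10));
    }
}
```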

> > I guess that you refer to "RandomData" and "RandomDataImpl". This is indeed
> > the typical example of abusing the "interface" tool. When only one
> > implementation is meaningful, an "interface" need not be defined.
> >
> > The "interface" is not the way (preferred or not) to support alternative
> > implementations. As was already said, this is done with (abstract or not)
> > classes which an alternative implementation can inherit from.
> > Rather, the (Java) "interface" is supposed to represent the abstraction
> > (from the "real" world object) of everything that is needed to interact with
> > that object (i.e. its interface). It makes it possible to treat different
> > objects on an equal footing (from the caller's point-of-view).
> > But you all know that...
> >
> > So what's the problem? Is it the binary compatibility, again? This is a
> > configuration management issue. When the compatibility is broken, you change
> > the major version number and/or the base package name. That was settled, or
> > not?
> 
> That solves the problem, but at the cost of forcing all users to edit
> and recompile, so should not be undertaken lightly.

I'm sorry: I may still be missing something fundamental, because I continue
not to understand.
Either the user wants to use new features, in which case he *has* to
recompile, or he doesn't want to be bothered by incompatible changes, and he
keeps using the previous release.
The other case is when a bug has been discovered, so that the user might
rightly want a drop-in replacement with the bug fixed. That is a release and
support policy issue: the right thing would be to provide a compatible
release with all bugs removed. As I see it, the problem in CM is one of
lacking resources. IMHO, erecting binary compatibility as the ideal goal is
not a substitute for supporting old releases.
As a mitigating measure, minor releases will be binary compatible. For the
user, there remains the risk that a bug is fixed just before a major
release: if he wants to benefit from the fix, he'll have to edit and
recompile. That's the balance between the user's occasional slight
discomfort and a project stuck in dead ends.
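To make the compatibility concern concrete, here is a minimal Java sketch (all names hypothetical, not taken from CM) of why adding a method to a released interface is a breaking change, while an abstract base class can grow compatibly:

```java
// Released in version M.m.
interface Solver {
    double solve(double x);
    // Adding "double solveAccurately(double x);" here would break every
    // third-party implementation compiled against M.m: their sources would
    // no longer compile, and their existing binaries would throw
    // AbstractMethodError at runtime when the new method is called.
}

// An abstract base class can grow compatibly: a new method gets a body,
// and old subclasses inherit it unchanged.
abstract class AbstractSolver implements Solver {
    // Added in N.n; still binary compatible with M.m subclasses.
    public double solveAccurately(double x) {
        return solve(x);   // default behaviour falls back to solve()
    }
}

// A user's subclass, written and compiled against M.m.
class UserSolver extends AbstractSolver {
    public double solve(double x) { return x * x; }
}

public class CompatDemo {
    public static void main(String[] args) {
        UserSolver s = new UserSolver();
        // The old subclass picks up the new method without recompilation.
        System.out.println(s.solveAccurately(3.0)); // prints 9.0
    }
}
```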

> > It would be a pity to sacrifice a tool aimed at improving the design because
> > of such considerations as keeping backward compatibility with versions that
> > nobody here is going to support.
> > If some user is happy with version "M.m", he can use it forever. If he wants
> > to use a newer version "N.n", he should not expect it to be compatible. It
> > does not have to be! Non-compatible modifications are not performed out of
> > some urge for change but stem from the desire to get a better product, bits
> > by bits.
> >
> > Yes, it's not easy to get the interfaces right; so what? If you find that
> > you can improve the design, you do it and bump the major version number.
> > As someone pointed out, it's not as if we'll run out of numbers.
> 
> But users could run out of patience if every release requires them to
> edit and recompile.

I'm not advocating making each new release incompatible with the previous
one. It's releasing too rarely that leads to this situation!
Rather, I'm in favour (and I'm not the only one in the Commons community) of
releasing often, because there will then be a higher probability that a
backward-compatible official release exists containing the bug fixes which a
user might want.

> > Part of the real problem (as shown by the amazing amount of done and undone
> > work for 2.2) is that you (collectively) want to do too many things at the
> > same time (lots of changes *and* few releases).
> 
> I don't think that's fair.
> 
> Making lots of releases all of which may be binary incompatible with
> each other just compounds the problem.
> 
> IMO it's important to minimise the amount of user disruption, so it
> make sense to bundle as many API breaking changes into a single
> release as possible.

I agree with that. I just don't think that anyone (and not even all of us
together) can make sure to get it right in one fell swoop. We fix what we
see, and make releases (preferably at more or less fixed dates). It is
counter-productive to work endlessly for fear of having missed something
(which we will anyway).
[Luc did that for release 2.2, changing yet another little thing to be nice
to users, and then everything was reverted because the ultimate goal could
not be achieved that way.]

> > To be clear, the problem is
> > not the "lots of changes" part (which you would like to "solve" by vetoing
> > future compatibility-breaking improvements). Note that the following is not
> > a criticism of CM (which has many features etc.) but some of the code that
> > I've seen (in the parts which I've more closely looked at) does not make me
> > think that it is mature enough that one can say "That's it!" and stick with
> > the design forever.
> 
> > Again, all this (e.g. removing duplicate code and refactoring the design)
> > can be considered unimportant and swept under the carpet but IMO *any*
> > cleanup is good as it will contribute to the overall robustness and
> > maintainability.
> 
> > Half-baked design will continue to itch.
> 
> Which is why it's important to spend enough time on development, and
> be prepared to restart if things turn out wrong.

That's *exactly* my point; but I still suspect that we don't mean the same
thing. IMO, we cannot decide that things *will* go wrong just because we try
to answer all "What if ...?" questions that come to mind. It is impossible
to foresee all uses of CM; so CM's development cannot be based on the
assumption that we can cover all users' requests. IMO, it must be based on
internal consistency. If that is achieved, and _afterwards_ it turns out
that some use case cannot be supported, it will be time to restart. But the
refactoring will be easier, thanks to the internal consistency.


Regards,
Gilles

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@commons.apache.org
For additional commands, e-mail: dev-h...@commons.apache.org
