On Apr 15, 2008, at 9:09 PM, Greg Ewing wrote:

> David Cournapeau wrote:
>> Greg Ewing wrote:
>>> If they really do need different versions, this is insoluble.
>> But that's by far the most significant problem of packages with a lot
>> of dependencies!

> But if your application really does depend on two libraries that
> have conflicting requirements like this, the application itself
> is screwed to begin with. There's *no* way of making it work
> on *any* system, whether that system has library versioning or
> not.
>
> Consequently, the developer will be unable to make it work on
> his own machine, and will therefore never attempt to deploy it!

That's all well and good when a single application is deployed onto its
own Python installation, but that isn't a very realistic situation.
There is usually more than one application/script/whatever running off
of a given Python installation.

Here's an example of how you can get into an unworkable situation:

Application A depends on:
library B v 1.1
library C v 2.3

Library B v 1.1 has no dependencies.

Now application D comes along.  It depends upon:
library B v 1.4

But library B v 1.4 depends upon library C v 2.6.

Application A *breaks* with library C v 2.6.

Which does the user give up?  Application A (their
storage management client) or application D (their
monitoring agent)?

This sort of stuff happens all the time.

This is what Versioning (with a capital V) attempts to solve.
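
To make the example above concrete, here is a minimal sketch of why a
single flat installation cannot host both applications. The package
names and the toy resolver are hypothetical, not any real tool:

    # Each requirement set maps package -> exact version needed.
    app_a = {"B": "1.1", "C": "2.3"}
    app_d = {"B": "1.4"}
    lib_b_1_4 = {"C": "2.6"}  # library B v 1.4 drags in library C v 2.6

    def flat_resolve(*requirement_sets):
        """Merge requirement sets into one flat environment, or fail."""
        chosen = {}
        for reqs in requirement_sets:
            for pkg, version in reqs.items():
                if pkg in chosen and chosen[pkg] != version:
                    raise RuntimeError("conflict: %s needed at both %s and %s"
                                       % (pkg, chosen[pkg], version))
                chosen[pkg] = version
        return chosen

    # Application D's full requirements include B 1.4's own dependency.
    app_d_full = dict(app_d, **lib_b_1_4)

    flat_resolve(app_a)              # fine alone: {'B': '1.1', 'C': '2.3'}
    flat_resolve(app_d_full)         # fine alone: {'B': '1.4', 'C': '2.6'}
    flat_resolve(app_a, app_d_full)  # RuntimeError: B needed at both 1.1
                                     # and 1.4 (C would clash too)

Either application resolves cleanly on its own; it is only when both
must share one flat namespace that the conflict appears.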

> That's because the program would say something like "Please
> give me gtk-2.3", and the system would know that anything
> called gtk-2.x with x >= 3 is compatible with that, and would
> pick the one with the greatest x out of those available.
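
The selection rule being described amounts to something like the
following sketch (illustrative only, not any real system's resolver):

    def pick_compatible(requested, available):
        """Pick the newest version with the same major number and a
        minor number >= the requested one, per the rule above."""
        req_major, req_minor = requested
        candidates = [(major, minor) for (major, minor) in available
                      if major == req_major and minor >= req_minor]
        return max(candidates) if candidates else None

    installed = [(1, 9), (2, 2), (2, 3), (2, 6), (3, 0)]
    print(pick_compatible((2, 3), installed))  # (2, 6): newest 2.x >= 2.3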

That's a great theory, but that's not how the real world works. Python
packages form an ecosystem in which there will be inconsistencies
between different minor versions of the same package. A legitimate bug
fix may break behavior that some packages depend upon even as it
corrects that behavior for others. That's just the way stuff happens.

We can encourage 10,000 disparate developers to adhere to best
practices, but it's fantasy to imagine that they all will. Most of the
people putting together packages are donating their 80% solutions to
the rest of the world.

They will develop these packages over time. The packages will be
changed to suit their *own* needs as time goes on. Those changes will
be released back to the public eventually (we hope).

Some of those changes are going to break compatibility with the old
version for one reason or another. Perhaps 100% backwards
compatibility wasn't important to the author, or perhaps they didn't
have the time to do exhaustive backwards-compatibility testing. Why
didn't they expend the effort? Because she's working on her thesis, or
maybe because his wife just had triplets. Perhaps people came to
depend on functionality that the author never expected them to use,
never anticipated, and has never even heard of.

What we have to figure out is how to allow *those* packages to work
with *these* applications no matter what.


>> IOW, enabling version requirement without strong API compatibility
>> is the path to dependency hell.
>
> I fully agree. However, this is a social issue, not a
> technical one. Everyone using the versioning system would
> need to understand the rules and be willing to follow them.

What is needed is a Real Versioning System: a system in which a
library or application consists of:

1) files
2) a dependency link map

When the system loads a package, it links in the dependent modules at
the versions the link map names. Then you can get a system where:

Application A imports B v 1.1 and C v 2.3

and

Application D imports B v 1.4 and C v 2.6.

At least that's the holy grail as I see it.
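
As a toy illustration of that holy grail (all names hypothetical; a
sketch of the idea, not a real loader): each application carries its
own link map, and imports are resolved through it rather than through
one shared site-packages:

    # Versioned "store": (package, version) -> the module to be loaded.
    store = {
        ("B", "1.1"): "<module B 1.1>",
        ("B", "1.4"): "<module B 1.4>",
        ("C", "2.3"): "<module C 2.3>",
        ("C", "2.6"): "<module C 2.6>",
    }

    # Each application's dependency link map: package -> pinned version.
    link_maps = {
        "application_A": {"B": "1.1", "C": "2.3"},
        "application_D": {"B": "1.4", "C": "2.6"},
    }

    def load(app, package):
        """Resolve an import through the application's own link map."""
        version = link_maps[app][package]
        return store[(package, version)]

    print(load("application_A", "C"))  # <module C 2.3>
    print(load("application_D", "C"))  # <module C 2.6> -- both coexist

Because each application resolves imports through its own map, both
versions of C live side by side in one installation.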

-jeff
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig
