On Friday, September 21, 2012 at 12:57 AM, PJ Eby wrote:
> As far as the practicality vs. purity question, Python has already had
> Provides/Requires in the metadata format for several years, and it
> contained all the features that were needed for a "pure" dependency
> resolution system. In *theory* you could meet everyone's use cases
> with that, if you're going to assume that all packages live in PyPI
> and everyone has a compiler. And then there wouldn't be any need for
> any of the new metadata specs.
That field only allows importable names, which many distributions do not
have (e.g. you can't put `django-tastypie` into Requires).
The exact wording is:
The format of a requirement string is identical to that of a
module or package name usable with the 'import' statement,
optionally followed by a version declaration within parentheses.
These fields were _not_ for saying that you require a particular
distribution/project; they _were_ for saying that you require a particular
module or package (in the import sense).
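
To make that concrete, here is a rough sketch (project names are made up) of
what the old distutils `requires` keyword accepts versus what you'd actually
want to say about django-tastypie:

    # Sketch of the old distutils metadata (hypothetical project).
    # `requires` takes importable module/package names per PEP 314,
    # so this is accepted and ends up as "Requires: tastypie (>=0.9)"
    # in PKG-INFO:
    from distutils.core import setup

    setup(
        name="myproject",
        version="1.0",
        requires=["tastypie (>=0.9)"],
        # requires=["django-tastypie"],  # not a module name; distutils rejects it
    )

    # What people actually need is a *distribution* requirement, which is
    # what setuptools' install_requires expresses instead:
    #
    #     install_requires=["django-tastypie>=0.9"]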
>
> In practice, however, not everything's in PyPI and not everybody has
> compilers, and not every package is instantly updated or maintained
> indefinitely. If you don't like how dependency links solve the
> problem, perhaps you could propose some other way of handling those
> use cases, such that developer B's dependency maintenance burdens are
> not propagated to developer A and his peer/parent dependers?
I don't see why pushing the maintenance burden down the stack isn't an OK
thing to do. We already do it for external requirements that either aren't
Python packages or are packaged in a way that standard tools cannot handle.
Obviously you have to draw the line somewhere between packages that you can
depend on automatically and ones you can't.
My problem with dependency_links has less to do with guessing and more to do
with the fact that it ties dependency installation to a particular host.
Because dependencies are abstract requirements (a distribution named foo at
version 2.0) and not a concrete location (distribution foo version 2.0 found
at http://example.com/dists/), you get to choose at install time where your
requirements come from. So a user could easily host their own intranet
PyPI-style server and install their dependencies from it. A dependency that
comes from dependency_links can't be installed from your own internal PyPI
(or another public one) without rewriting the setup.py scripts (unless my
understanding of dependency links is wrong).
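
For illustration, a minimal sketch of the contrast (package and host names
are invented):

    # setup.py using setuptools' dependency_links: the concrete download
    # location is baked into the package by its author.
    from setuptools import setup

    setup(
        name="myapp",
        version="1.0",
        install_requires=["foo>=2.0"],                  # abstract requirement
        dependency_links=[
            "http://example.com/dists/foo-2.0.tar.gz",  # concrete, host-specific
        ],
    )

With only the abstract requirement, the person installing can point at
whatever index they like; with the dependency_links URL above, that host is
consulted no matter which index was chosen (again, as far as I understand
how dependency links work).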
Additionally, by tying a dependency to an external system you decrease the
overall installability of a set of requirements* and you increase the number
of locations that someone can use to exploit developers installing Python
packages. These are important considerations when deciding to install or
depend on a package, and the dependency_links system hides them and makes it
non-obvious to end users what is entailed in installing their set of
requirements. (Note: I have the same basic problem with external links on
PyPI.)
I don't think it's entirely unreasonable for a packaging tool to have the
ability to install from locations other than PyPI or a PyPI-like server, and
I think a packaging tool probably should have a dependency_links-like
system. However, the choice to use it should be in the hands of the person
_installing_ the package, not the person creating the package.
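
For example (hypothetical hosts), the installer can already express that
choice themselves today with pip's index options, rather than having a
location hard-coded by the package author:

    pip install --index-url http://pypi.internal.example.com/simple/ myapp
    pip install --find-links http://mirror.example.com/dists/ myapp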
* If you depend on 5 different hosts for installing any particular set of
requirements, and each of those 5 servers has 99% uptime, then the total
combined uptime would be (0.99 * 0.99 * 0.99 * 0.99 * 0.99) = ~95%. In an
average month you'd go from ~7 hours of uninstallable time (for one server
at 99% uptime) to over a day of uninstallable time (for 5 servers at 99%
uptime each).
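
A quick back-of-the-envelope check of those numbers (Python):

    hours_per_month = 730                        # roughly 365 * 24 / 12
    one_host = 0.99
    five_hosts = one_host ** 5                   # ~0.951, i.e. ~95% combined uptime
    print((1 - one_host) * hours_per_month)      # ~7.3 hours uninstallable
    print((1 - five_hosts) * hours_per_month)    # ~35.8 hours, i.e. over a day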