On Fri, Dec 17, 2010 at 4:48 PM, Jim Fulton <j...@zope.com> wrote:
>> Although specifying a dependency
>> without a version shouldn't be a good practice,
>
> Actually, it's best practice.
I don't think so. It's what people do, and removing packages from PyPI is common, but it's awful. As a library maintainer, I can never break an API, not even if I release 2.0 after working on 1.0, because projects depending on my lib might just break. Hence we get "project", "project2", "project3", i.e. versioning through project naming. Is that a good idea?

Consider the Java world, where Maven is a de-facto standard. Once a package with a certain version is uploaded, it's usually never removed. You can decide to depend on the latest 3.x.x, on the latest 3.5.x, on version ranges, or on a very specific version. This makes it possible for developers to depend on the versions they really need. Dependency problems may still arise and usually must be fixed manually, but in this case the Python way seems to say "hey, let's pretend the problem doesn't exist and the API never changes in a backwards-incompatible way".

>> it seems quite common
>> - mostly because many people have got full control of the pypi section
>> on their own repo and they can decide what's offered there, so they
>> just specify "mydependency" on their other projects. I think it's
>> quite a wide assumption, by the way.
>
> It's best practice. Requiring a specific version is extremely inflexible.
> If everyone did it, you'd never be able to assemble anything of any size.

You might require a version range - I think I asked something about version specification, which is quite fuzzy in setuptools/distribute, some weeks ago. BTW, if you require the API exposed by a project at version, let's say, 1.4 (or "at least 1.4 but less than 2.0"), it won't do any good not to specify that version: you're effectively depending on the state of an external entity which is not under your control.

Scenario: your project A today works 100% with a certain version of lib FOO, let's say 1.0.0, but you don't explicitly write that version requirement.
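As an aside, a range like "at least 1.4 but less than 2.0" is a purely mechanical check. A minimal stdlib-only sketch of the idea (this is not setuptools' actual resolver; `satisfies` is a hypothetical helper for illustration):

```python
def parse_version(v):
    """Turn a dotted version string like '1.4.2' into a comparable tuple (1, 4, 2)."""
    return tuple(int(part) for part in v.split("."))

def satisfies(version, lower, upper):
    """True if lower <= version < upper -- the '>=1.4,<2.0' pattern."""
    v = parse_version(version)
    return parse_version(lower) <= v < parse_version(upper)

print(satisfies("1.4.0", "1.4", "2.0"))  # True: inside the range
print(satisfies("2.0.1", "1.4", "2.0"))  # False: past the upper bound
```

With setuptools/distribute the same constraint is spelled declaratively in setup.py, e.g. install_requires=["FOO>=1.4,<2.0"].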
In a year, you get back to project A, run the tests, and everything fails, because today's lib FOO 1.2.0 API has changed. Of course you could use a private repository as a backend instead of PyPI, but that means you'd need an additional, stateful indirection level for what could be declaratively expressed just by version numbers.

> Maybe system packing tools use better dependency-resolution
> mechanisms. It could happen! I have certainly been stymied by
> version conflicts in system packaging systems, so I don't think their
> algorithms are that great.

Version conflicts can arise, of course. If there's no possible dependency path, neither yum nor apt will be able to do anything.

> The way to work around this with buildout is to use a buildout versions
> section:
>
> http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used
>
> It would be interesting to see if a breadth first strategy would provide
> better behavior.

I'll take a look at this and see what happens. But I think we need to rethink the way versioning is handled.

--
Alan Franzoni
--
contact me at pub...@[mysurname].eu

_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig
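P.S. For anyone following along, the versions-section workaround Jim mentions is a buildout.cfg along these lines (the pinned package name and number here are made up for illustration):

```ini
[buildout]
parts = myapp
; point buildout at the section holding the pins
versions = versions

[versions]
; pin every egg buildout picks to an exact, known-good version
FOO = 1.0.0
```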