[Distutils] Question about easy-install.pth

2008-09-30 Thread Brian Cameron


I am having trouble working with setuptools on Solaris.  The Solaris 
operating system normally installs modules as packages which contain
binaries.  This is unlike Linux distributions where, for
example, an RPM would download the source and build it on the user's
machine when the user installs the RPM.

So, to create packages on Solaris we normally install a module to
a temporary directory such as /var/tmp/pkgbuild-foo/usr, package up
the files that are built, and when the user installs the package
these files are then installed to their system.

However, we are having problems with figuring out how to properly
create the /usr/lib/python2.4/site-packages/easy-install.pth file
using our build system.  What happens is that each package ends up
with its own easy-install.pth file which only contains the
information for that one module.  Users can't install two such
packages at the same time because that creates a file conflict: two
packages can't install the same file.

Solaris packages do have pre-install, post-install, pre-uninstall
and post-uninstall scripts, so we could do something like avoid
installing the file as part of the package and instead generate it
on-the-fly via scripting.  However, setuptools doesn't seem to provide
any mechanism for doing this easily.  Then again, I'm not very familiar
with setuptools or easy_install, so I hope that I'm missing something.
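
To illustrate the sort of on-the-fly generation I mean, a post-install
script could in principle rebuild the file itself.  The sketch below is
purely hypothetical: the site-packages path is an assumption, and the
real easy-install.pth that setuptools writes also carries sys.path
bookkeeping lines that this ignores.

#!/usr/bin/python
# Hypothetical post-install sketch: rebuild easy-install.pth from the
# .egg files/directories actually present in site-packages.
import os

SITE = "/usr/lib/python2.4/site-packages"   # assumed target directory

def rebuild_pth(site_dir):
    eggs = sorted(name for name in os.listdir(site_dir)
                  if name.endswith(".egg"))
    pth_file = open(os.path.join(site_dir, "easy-install.pth"), "w")
    try:
        for name in eggs:
            pth_file.write("./%s\n" % name)   # setuptools uses relative entries
    finally:
        pth_file.close()

if __name__ == "__main__":
    rebuild_pth(SITE)

But maintaining something like that ourselves, for every possible
combination of packages, is exactly the kind of thing I'd like to avoid.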

Aside from writing our own code to manually manage adding and removing
entries to/from this file, is there any way that setuptools allows you
to manage this file when installing binaries to a system, or does
setuptools assume the file only needs to be managed when you build a
module?

I'm hoping to avoid a hacky solution where we edit the easy-install.pth
file by hand when users install or uninstall packages.  Since users can
install or uninstall any combination of packages that may need an
easy-install.pth entry, I can imagine that it would get complicated to
manage properly via our own scripts.

Or is there some other solution that might avoid the need for multiple
packages to share this file?  For example, could different modules use
differently named .pth files instead of a single, commonly named
easy-install.pth?

Brian

___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "Python Package Management Sucks"

2008-09-30 Thread Toshio Kuratomi
Ian Bicking wrote:
> Rick Warner wrote:
>>> Actually, PyPI is replicated.  See, for example,
>>> http://download.zope.org/simple/.
>>>
>>> It may be that some of the mirrors should be better advertised.
>>
>> A half-hearted effort, at best, after the problems last year.  When I
>> configure a CPAN client (once per user) I create a list of replicas I
>> want to search for any query from a list of hundreds of  replicas
>> distributed around the world. 
> 
> Can someone suggest the best way to search among repositories?  For
> instance, try to connect to one, then stop if it gives Connection
> Refused?  If it gives any unexpected error (5xx)?  Timing out is a
> common failure, and a pain in the butt, but I guess there's that too.
> What does the CPAN client do?
> 
> 
I don't know what CPAN does but Linux distributions have also solved
this problem.  We send out massive numbers of updates and new packages
to users every day so we need a mirror network that works well.

In Fedora we have a server that gives out a list of mirrors with GeoIP
data used to try and assemble a list of mirrors near you (country, then
continent (with special cases, for instance, for certain middle eastern
countries that connect better to Europe than to Asia) and then global).

This server gives the mirror list out (randomized among the close
mirrors) and the client goes through the list, trying to retrieve
package metadata.  If it times out or otherwise fails, then it goes on
to the next mirror until it gets data.  (Note, some alternate clients
are able to download from multiple servers at the same time if multiple
packages are needed.)
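
In rough pseudo-Python, the client-side loop looks like this (a
hypothetical sketch, not the actual yum code; the URLs and file names
are placeholders):

# Walk the (already randomized) mirror list and take metadata from the
# first mirror that answers in time.
import socket
import urllib2

def fetch_metadata(mirrors, path="repodata/repomd.xml", timeout=30):
    socket.setdefaulttimeout(timeout)       # fail fast on a dead mirror
    for base in mirrors:
        try:
            return urllib2.urlopen(base.rstrip("/") + "/" + path).read()
        except (urllib2.URLError, socket.error):
            continue                         # timed out or failed: try the next mirror
    raise IOError("no mirror could be reached")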

The mirrorlist server is a pretty neat application
(https://fedorahosted.org/mirrormanager).  It has a TurboGears front end
that allows people to add a new mirror
(https://admin.fedoraproject.org/mirrormanager) for public availability
or restricted to a subset of IPs.  It allows you to only mirror a subset
of the whole content.  And it has several methods of telling if the
mirror is in sync or outdated.  The latter is important to us for making
sure we're giving users the latest updates that we've shipped; the checks
range from a script that the mirror admin can run from a cron job to
examine the available data and report back, to a process run on our
servers that checks that the mirrors have up-to-date content.  The
mirrorlist itself is cached and served from a mod_python script (soon to
be mod_wsgi) for speed.

You might also be interested in the way that we work with package
metadata.  In Fedora and many other rpm-based distributions (Some
Debian-based distros talked about this as well but I don't know if it
was ever implemented there) we create static xml files (and recently,
sqlite dbs as well) that live on the mirrors.  The client hits the
mirror and downloads at least two of these files.  The repomd.xml file
describes the other files with checksums and is used to verify that the
other metadata is up to date and whether anything has changed.  The
primary.xml file stores the information that is generally needed for
doing depsolving on the packages.  Then we have several other xml files
that collectively contain the complete metadata for the packages but are
usually overkill... by separating this stuff out, we save clients from
having to download it in the common case.  This stuff could provide some
design ideas for constructing a pypi metadata repository and is
documented here:  http://createrepo.baseurl.org/
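
As a rough illustration of how a client can use repomd.xml (the element
and namespace names here are from memory of the createrepo format, so
treat them as assumptions):

# Parse repomd.xml to learn which metadata files exist and what their
# checksums should be, then verify a cached copy against its checksum.
import hashlib
import xml.etree.ElementTree as ET

REPO_NS = "{http://linux.duke.edu/metadata/repo}"

def wanted_files(repomd_xml):
    """Map metadata type (e.g. 'primary') -> (href, checksum)."""
    out = {}
    for data in ET.fromstring(repomd_xml).findall(REPO_NS + "data"):
        href = data.find(REPO_NS + "location").get("href")
        checksum = data.find(REPO_NS + "checksum").text
        out[data.get("type")] = (href, checksum)
    return out

def is_current(local_path, expected_checksum):
    digest = hashlib.sha1(open(local_path, "rb").read()).hexdigest()
    return digest == expected_checksum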

Note: the reason we went with static metadata rather than some sort of
cgi script is that static data can be mirrored without the mirror being
required to run anything beyond a simple rsync cron job.  This makes
finding mirrors much easier.

-Toshio



signature.asc
Description: OpenPGP digital signature
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Greg Ewing

Ian Bicking wrote:

FWIW, pyinstall can collect all the packages before installing any of 
them.  You do have to download all packages, though, as that's the only 
way to get the metadata.


This may be something to make sure is on the requirements
list for a metadata standard: Make sure there is a defined
way of getting just the metadata from a repository without
having to download the whole package.
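
For example (method names taken from PyPI's XML-RPC interface as I
understand it, so treat the details as assumptions), something like the
following should be all a client has to run, without fetching any sdist:

# Hypothetical sketch: ask the index for metadata only.
import xmlrpclib

def show_metadata(name):
    pypi = xmlrpclib.ServerProxy("http://pypi.python.org/pypi")
    for version in pypi.package_releases(name):
        data = pypi.release_data(name, version)
        print name, version, data.get("summary")
        # Nothing like data["requires"] is reliably populated today,
        # which is exactly the gap being discussed.

show_metadata("SQLObject")

The catch, of course, is that the index actually has to hold the
dependency data for this to answer the interesting question.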

--
Greg
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Greg Ewing

Josselin Mouette wrote:

if you try to
build a package of baz, there is no way to express correctly that you
depend on python-bar (>= 1.3) or python-foo (>= 1.2).


Seems to me that baz shouldn't have to say that -- all it
should have to say is that it requires bar version 1.3.
It's up to the package manager to know how to look inside
packages to see what versions of other packages they contain,
if such a thing is going to be allowed.

Otherwise, whenever one package is moved inside another,
then in order to take advantage of that, all other packages
that use it would have to have their dependencies updated,
which doesn't seem reasonable.

I can't see the point of nesting packages like this anyway.
If bar really is usable independently of foo, then why not
just leave it in a separate tar file and let foo declare
a dependency on it?

They can be bundled together for convenience of manual
distribution if desired, but when installed, such a bundle
should be split out into separate packages as far as the
package manager sees them.

If they're being automatically retrieved from a repository,
it makes more sense to keep them separate. There's no
more to download that way, and there may be less, since
you can just download the packages actually needed.

--
Greg
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [issue48] ignores --build-directory= option if argument is a local file

2008-09-30 Thread Phillip J. Eby

This is as-documented.  Per the docs:

"""If a package is built from a source distribution or checkout, it 
will be extracted to a subdirectory of the specified directory."""


That is, --build-dir applies only to SVN checkouts (done by 
easy_install itself) and source distributions (i.e. sdist zipfiles 
and tarballs).


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Just downloading an egg

2008-09-30 Thread Jeff Rush

Pascoe, S (Stephen) wrote:

What I often just want to do with easy_install is download an egg from PyPI without installing 
it.  I've studied the easy_install documentation and never found a way to do it.  Even 
giving it the "-d" option results in easy-install.pth being created and other 
unwanted stuff.

Looking at the setuptools pydoc I worked out a way to do it:


import setuptools
d = setuptools.Distribution()
d.fetch_build_egg(requirement)


Voila! The egg is downloaded into the cwd.  It even seems to build an egg from a 
tarball.

My question is, can I rely on this feature and is it the best way of doing what 
I want?  I'd like to use it in my code and hope it stays.  It would be ideal if 
I could do this through easy_install.


Actually this command will download the egg and just the egg:

   easy_install -zmaxd . -N SQLObject

or if you want the egg -and- its dependencies (as eggs too):

   easy_install -zmaxd . SQLObject

-Jeff

___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Dave Peterson

Josselin Mouette wrote:

On Tuesday 30 September 2008 at 15:46 -0500, Dave Peterson wrote:
  
Josselin Mouette wrote: 


No, please stop here. That’s not OK. If a new version of HardJSON breaks
your application, it is friggin’ broken. If that new version is not
compatible, it should be called HardJSON2, and nothing will break.
  

I disagree with your assertion that the name HAS to imply API
compatibility.   There ought to be something that specifies API / ABI
compatibility, such as the combination of name and some portion of a
version number,  but too many people depend on a name for marketing or
other purposes for us to impose that it indicate technical aspects.



The marketing name does not have to be the same as the name of the
module you import. The situation where they differ is even quite common.
  


But we already have a separation between project name and module names 
that are contained within that project.   We don't currently declare 
dependencies on the module names but on the project name.   i.e. a 
dependency on HardJSON > 2.0 does not say anything about what modules 
you're expecting to import or use, only that you expect to use version 2 
of a project called HardJSON.   Were you suggesting that change?


I think the rest of the comments are easily resolved after the above is 
clear.



-- Dave

___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Jean-Paul Calderone

On Tue, 30 Sep 2008 23:32:14 +0200, Josselin Mouette <[EMAIL PROTECTED]> wrote:

[snip]

The marketing name does not have to be the same as the name of the
module you import. The situation where they differ is even quite common.

You can also argue for separating the name from the API version, like
the soname of a library, and I’ll agree, but in the end it is very
similar.


Do you think this is practical for non-trivial libraries?  For any
library which has more than one API, the possibility exists for one
API to change incompatibly and the other to remain compatible.  With
larger libraries, the value of changing the module name because one
(or some other small fraction of the whole) API changed incompatibly
decreases as compared to the cost of updating all software which uses
the library to use the new name (much of which may well be unaffected
by the incompatible change).

I am a huge fan of backward compatibility.  You may not find a bigger
one (at least in the Python community).  I can't understand how this
approach can be made feasible though.  Should the next release of Twisted
include a Python package named "twisted2" instead of "twisted"?  And the
one after that "twisted3"?  There are thousands of APIs in Twisted, and
we do change them incompatibly (after giving notice programmatically for
no less than 12 months).  Should we instead give up on this and make all
users of Twisted update their code to reflect the new name with each
release?

Jean-Paul
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] Just downloading an egg

2008-09-30 Thread Pascoe, S (Stephen)

What I often just want to do with easy_install is download an egg from PyPI without 
installing it.  I've studied the easy_install documentation and never found a 
way to do it.  Even giving it the "-d" option results in easy-install.pth being 
created and other unwanted stuff.

Looking at the setuptools pydoc I worked out a way to do it:

>>> import setuptools
>>> d = setuptools.Distribution()
>>> d.fetch_build_egg(requirement)

Voila! The egg is downloaded into the cwd.  It even seems to build an egg from a 
tarball.

My question is, can I rely on this feature and is it the best way of doing what 
I want?  I'd like to use it in my code and hope it stays.  It would be ideal if 
I could do this through easy_install.

Thanks,
Stephen.
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 15:46 -0500, Dave Peterson wrote:
> Josselin Mouette wrote: 
> > No, please stop here. That’s not OK. If a new version of HardJSON breaks
> > your application, it is friggin’ broken. If that new version is not
> > compatible, it should be called HardJSON2, and nothing will break.
> 
> I disagree with your assertion that the name HAS to imply API
> compatibility.   There ought to be something that specifies API / ABI
> compatibility, such as the combination of name and some portion of a
> version number,  but too many people depend on a name for marketing or
> other purposes for us to impose that it indicate technical aspects.

The marketing name does not have to be the same as the name of the
module you import. The situation where they differ is even quite common.

You can also argue for separating the name from the API version, like
the soname of a library, and I’ll agree, but in the end it is very
similar.

> If your OS distribution chooses to do things that way, then fine, when
> your OS builds the distribution, it can rename it to HardJSON2 but
> that shouldn't be required of every platform.

We can do that, but we won’t as long as it is possible to do otherwise.
It completely breaks compatibility with third-party packages or modules,
and it is unnecessarily hard to maintain.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-our own. Resistance is futile.


signature.asc
Description: This is a digitally signed message part
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] [issue48] ignores --build-directory= option if argument is a local file

2008-09-30 Thread Zooko O'Whielacronx

New submission from Zooko O'Whielacronx <[EMAIL PROTECTED]>:

It seems like if the argument to easy_install is a local file instead of a
distribution name, then it ignores the --build-directory option:

easy_install -v -v -v --prefix=./instdir --build-directory=./builddir
./pyutil-1.3.21.tar.gz | grep ^Running | head -1
Running pyutil-1.3.21/setup.py -vvv bdist_egg --dist-dir
/tmp/easy_install-2fiWD4/pyutil-1.3.21/egg-dist-tmp-PU48xQ

$ mkdir -p instdir/lib/python2.5/site-packages ;
PYTHONPATH=./instdir/lib/python2.5/site-packages easy_install -v -v -v
--prefix=./instdir --build-directory=./builddir ./pyutil-1.3.21.tar.gz | grep
^Running | head -1
Running pyutil-1.3.21/setup.py -vvv bdist_egg --dist-dir
/tmp/easy_install-8CDXG9/pyutil-1.3.21/egg-dist-tmp-V3Tmzo

versus

$ mkdir -p instdir/lib/python2.5/site-packages ;
PYTHONPATH=./instdir/lib/python2.5/site-packages easy_install -v -v -v
--prefix=./instdir --build-directory=./builddir pyutil | grep ^Running | head -1
Running setup.py -vvv bdist_egg --dist-dir ./builddir/pyutil/egg-dist-tmp-sFObAa

--
messages: 184
nosy: zooko
priority: bug
status: unread
title: ignores --build-directory= option if argument is a local file

___
Setuptools tracker <[EMAIL PROTECTED]>

___
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Martin v. Löwis
Chris Withers wrote:
>> Contributions are welcome. The source code of PyPI is available
>> publically, 
> 
> Where?

https://svn.python.org/packages/trunk/

>> and I'm willing to accept patches. I won't have time
>> to work on this in the next 12 months myself.
> 
> These two don't seem to go hand in hand and don't really seem to fit my
> experience of the catalog-sig :-(

Saying what? That I do have time in the next twelve months? That's nice
to hear.

Regards,
Martin

___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Dave Peterson

Josselin Mouette wrote:

On Tuesday 30 September 2008 at 11:37 -0500, Ian Bicking wrote:
  

Say I have a package that represents an application.


[snip]

  
Then HardJSON 2.0 is released, and Turplango only required 
HardJSON>=1.2, so new installations start installing HardJSON 2.0.  But 
my app happens not to be compatible with that library, and so it's 
broken.  OK...



No, please stop here. That’s not OK. If a new version of HardJSON breaks
your application, it is friggin’ broken. If that new version is not
compatible, it should be called HardJSON2, and nothing will break.
  



I disagree with your assertion that the name HAS to imply API 
compatibility.   There ought to be something that specifies API / ABI 
compatibility, such as the combination of name and some portion of a 
version number,  but too many people depend on a name for marketing or 
other purposes for us to impose that it indicate technical aspects.


If your OS distribution chooses to do things that way, then fine, when 
your OS builds the distribution, it can rename it to HardJSON2 but that 
shouldn't be required of every platform.



-- Dave


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 8:21 PM, Rob Cakebread <[EMAIL PROTECTED]> wrote:
> On Tue, Sep 30, 2008 at 10:51 AM, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
>> At 12:25 PM 9/30/2008 -0400, A.M. Kuchling wrote:
>>>
>>> On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
>>> > FWIW, pyinstall can collect all the packages before installing any of
>>> > them.  You do have to download all packages, though, as that's the only
>>> > way to get the metadata.
>>>
>>> Does the DOAP output for a package not contain enough metadata?
>>
>> Nope.  And it can't possibly do so, unless it contains dependency data for
>> every possible variation of the package.  For example, a package might
>> dynamically declare dependency on ctypes, depending on whether you're
>> installing it for Python 2.4 or Python 2.5.  (Dependencies can also be
>> platform-specific and build-option-specific, as well as
>> Python-version-specific.)
>>
>
> Not to mention the DOAP vocabulary lacks a way to describe dependency
> information. This is planned but it has to be well thought out because of
> all the variations Phillip mentions.

Out of curiosity:

- can an RDF-based database possibly handle such a graph?
- would it make sense for PyPI to query the DOAP server to get that
dependency info?


>
> The good news is much of this dependency info is already in existence in
> Linux distributions. Take a Gentoo ebuild, for example. It has separate
> run-time, build-time and test dependency info, dependencies based on
> enabled features, and dependencies based on the version of Python used.
>
> Ebuilds also have metadata mapping the PyPI name to the Gentoo
> package name, so it'll be easy enough to create a database with all this info.
>
> I'm working on this now at http://doapspace.org/ where you can find DOAP
> for Python packages with a bit more metadata than the DOAP supplied on
> PyPI ( http://doapspace.org/doap/py/PYPI_PKG_NAME )
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> http://mail.python.org/mailman/listinfo/distutils-sig
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Chris Withers

Martin v. Löwis wrote:

That said, I didn't see any indication of what I consider to be a
critical failure in PyPI: No dependency metadata prior to downloading
the package.


Contributions are welcome. The source code of PyPI is available
publically, 


Where?


and I'm willing to accept patches. I won't have time
to work on this in the next 12 months myself.


These two don't seem to go hand in hand and don't really seem to fit my 
experience of the catalog-sig :-(


Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] more thoughts on python package management

2008-09-30 Thread kiorky
Chris Withers wrote:
> > Hi All,
> >
> > I've been trying to catch up on all the packaging discussions but
> > couldn't find the right place to reply so thought I'd just do so
> > separately...
Maybe we all have our own definition ...
> >
> > Probably the biggest thing that strikes me now is that
> > distutils/setuptools/distribute/pacman/whatever should aim to do much
> > less...
I don't totally agree: we must provide a tool that can build our package
from start to end, even if we also provide the means (e.g. via complete
metadata) to do the job another way, with other tools.
> >
> > In fact, I get the feeling what we really need is a way for package
> > maintainers to provide the following metadata:
> >
> > - where the docs are
> >
> > - where the tests are and how they're run
> >
> > - how anything not-python should be built
> >
> > - what the dependencies are
> >   (maybe even what the non-python dependencies are!)
> >
> > - what version of the package this is
> >
> > This should be in a build-tool independent fashion such that any build
> > tools, but especially those of operating system maintainers, can run
> > over the same metadata and build their packages.
+1 for more explicit metadata on Python packages,
but I don't think our only target is the "OS" and its package manager.
I really can't see how to get a lot of projects with inter-related
incompatibilities running on the same system without some sort of
isolation.
So we cannot say we will be able to install whatever we want alongside
everything else at the same time, system wide. And selecting versions by
hand, in each project, can be painful. I'd prefer to have a "project" or
"environment-specific" approach like buildout, or virtualenv, or the
combination of these tools, maybe even a wrap-up tool like minitage [1].

> >
> > The only other critical thing for me is that *all* of the above metadata
> >  should be available post-install.
> >
I'd prefer to know what I am installing before installing it. For me, the
metadata is something predictable, which must be there before installing any
part of a project.

> > With the above in place, we free up the evolution of build tools and let
> > the OS-specific packaging tools play nicely.
> >
> > I think a good aim would also be to have some "one-way-to-do-it" python
> > tools too for:
> >
> > - installing a package
> >
> > - uploading a package to PyPI
> >
> > - getting a package from PyPI
> >
> >
+1, but see under
> > ...without any silly big plugin system in the way distutils currently
> > works.
> >
> > What do other people feel?
> >
> > cheers,
> >
> > Chris
> >

IMHO, a good management suite for Python packages is a collection of tools,
plugins and API bindings which:
 - Can do all of its job in total isolation (installation into /prefix even
where our only privilege is filesystem write access)
   There are two levels I can see there:
 - Isolation from the OS
 - Isolation in the "target environment" (/package1/the-stuff and
/package2/the-stuff)
 - Is OS independent (maybe the hardest thing to do)
 - Is failure tolerant:
- Handles offline mode
- Knows how to roll back
- Uses a sandbox before (un)installing
 - Lets us add, remove and override package repositories and even the
packages themselves
 - Can use a download cache
 - Deals with dependencies
 - Can distribute the package
 - Can fetch the package
 - Can build the package
 - Can test the package
 - Can install the package from source or binary
 - Traces and stores what is installed
 - Can uninstall the package
 - Knows how to deal with reverse-dependency problems, even if it is just a
tool that rebuilds them like Gentoo's ones [2]
 - Provides good metadata
 - Has all the classical features CLI tools like apt, emerge ... have:
 - Search over installed and available software
 - List installed files
 - Know which package a file belongs to
 - Has an API to let us build, query, distribute, get, ... our packages
 - Has some hook/plugin registration system like egg entry points

And the corresponding package metadata must contain:
 - package name
 - package version
 - Detailed dependency requirements, which also include incompatibilities
(like OS-specific ones)
 - Maybe some knobs system like the USE flags in Gentoo
 - Where we get the package
 - The way we build/install/uninstall it
 - Author/website/license and so on
 - Where the tests are
 - How the tests are run
 - hook registration information

There is nothing new there; those are classical package manager needs. Maybe
more "source package manager" needs, because we are dealing with something
which can be built on the target. What is different here is that I try to
introduce an "environment" approach more than a "system-centric" one.

[1] http://www.minitage.org/doc/rst
[2]
 - emerge @preserved-rebuild:
http://r0bertz.blogspot.com/2008/06/portage-22-preserve-libs-features.html
 - revdep-rebuild : http://gentoo-wiki.com/TIP_Control_revdep-rebuild

--
Cordiale

Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Martin v. Löwis
> That said, I didn't see any indication of what I consider to be a
> critical failure in PyPI: No dependency metadata prior to downloading
> the package.

Contributions are welcome. The source code of PyPI is available
publically, and I'm willing to accept patches. I won't have time
to work on this in the next 12 months myself.

Regards,
Martin
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 11:37 -0500, Ian Bicking wrote:
> Say I have a package that represents an application.
[snip]

> Then HardJSON 2.0 is released, and Turplango only required 
> HardJSON>=1.2, so new installations start installing HardJSON 2.0.  But 
> my app happens not to be compatible with that library, and so it's 
> broken.  OK...

No, please stop here. That’s not OK. If a new version of HardJSON breaks
your application, it is friggin’ broken. If that new version is not
compatible, it should be called HardJSON2, and nothing will break.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-our own. Resistance is futile.


signature.asc
Description: This is a digitally signed message part
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] more thoughts on python package management

2008-09-30 Thread Nicolas Chauvat
> What do other people feel?

Open Standards. Standardizing data format rather than tools. Well
defined public PyPI API... of course I agree with you!

-- 
Nicolas Chauvat

logilab.fr - services in scientific computing and knowledge management
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Rob Cakebread
On Tue, Sep 30, 2008 at 10:51 AM, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> At 12:25 PM 9/30/2008 -0400, A.M. Kuchling wrote:
>>
>> On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
>> > FWIW, pyinstall can collect all the packages before installing any of
>> > them.  You do have to download all packages, though, as that's the only
>> > way to get the metadata.
>>
>> Does the DOAP output for a package not contain enough metadata?
>
> Nope.  And it can't possibly do so, unless it contains dependency data for
> every possible variation of the package.  For example, a package might
> dynamically declare dependency on ctypes, depending on whether you're
> installing it for Python 2.4 or Python 2.5.  (Dependencies can also be
> platform-specific and build-option-specific, as well as
> Python-version-specific.)
>

Not to mention the DOAP vocabulary lacks a way to describe dependency
information. This is planned but it has to be well thought out because of
all the variations Phillip mentions.

The good news is much of this dependency info is already in existence in
Linux distributions. Take a Gentoo ebuild, for example. It has separate
run-time, build-time and test dependency info, dependencies based on
enabled features, and dependencies based on the version of Python used.

Ebuilds also have metadata mapping the PyPI name to the Gentoo
package name, so it'll be easy enough to create a database with all this info.

I'm working on this now at http://doapspace.org/ where you can find DOAP
for Python packages with a bit more metadata than the DOAP supplied on
PyPI ( http://doapspace.org/doap/py/PYPI_PKG_NAME )
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Phillip J. Eby

At 12:25 PM 9/30/2008 -0400, A.M. Kuchling wrote:

On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
> FWIW, pyinstall can collect all the packages before installing any of
> them.  You do have to download all packages, though, as that's the only
> way to get the metadata.

Does the DOAP output for a package not contain enough metadata?


Nope.  And it can't possibly do so, unless it contains dependency 
data for every possible variation of the package.  For example, a 
package might dynamically declare dependency on ctypes, depending on 
whether you're installing it for Python 2.4 or Python 
2.5.  (Dependencies can also be platform-specific and 
build-option-specific, as well as Python-version-specific.)
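
A hypothetical setup.py fragment makes this concrete (the project name
is invented for the example):

import sys
from setuptools import setup

# The dependency list is computed when setup.py runs, so no static
# record on the index can capture every variant.
requires = []
if sys.version_info < (2, 5):
    requires.append("ctypes")   # ctypes only joined the stdlib in 2.5

setup(name="example", version="1.0", install_requires=requires)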


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Gael Varoquaux
On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
>> There is nothing that I hate more than easy_install failing after having
>> half-installed a package because of a missing dependency. This is one of
>> the reasons I am never too happy when I have to run easy_install.
>
> FWIW, pyinstall can collect all the packages before installing any of them. 
>  You do have to download all packages, though, as that's the only way to 
> get the metadata.

Yes, I have seen that. I was very happy to witness the release of this
tool. Thank you.

Gaël
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Ian Bicking

Tarek Ziadé wrote:

So... that's the kind of thing I encountered with just a couple
dependencies, but in practice it was much worse because there were a lot
more than 3 libraries involved.  I now think it is best to only use version
requirements to express known conflicts.  For future versions of packages
you can't really know if they will cause conflicts until they are released.


Exactly, you can't control everything from your package unless you
work in an isolated environment like virtualenv or zc.buildout
provides, so I can't see any solution unless someone is taking care of
it at a higher level :(

Maybe PyPI, though, could automate this when a package is uploaded, by
browsing all the dependencies and
finding the relevant conflicts?  PyPI "knows" all the packages out there.

At least display those conflicts somehow, or warn about them?


Yes, keeping this version information separate from packages would help, 
I think.  If you find out more information about a conflict it shouldn't 
require a new release -- new releases take a while to do, and have 
cascading effects.  This kind of metadata isn't so much about the 
package, as about how the package relates to other packages.  If we 
could somewhat safely have collaborative conflict information that would 
be nice, though there's different kinds of conflicts so it might be 
infeasible.  It's all too common for a person to just poke around with 
version stuff until something works, but in a way that is only accurate 
for the context of their application, and if they submit that 
information upstream they could easily break other people's setups 
unnecessarily.


--
Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 6:37 PM, Ian Bicking <[EMAIL PROTECTED]> wrote:
> Chris Withers wrote:
>>
>> Tarek Ziadé wrote:

 Tarek Ziade wrote:
>
> For KGS I agree that this is a big work, but there's the need to work
> at a
> higher level than in your package

 Why? You really need to explain to me why the dependency information in
 each
 of the packages isn't enough?
>>>
>>> Because you can keep up with the dependency changes, removed, or
>>> introduced
>>> by a package you depend on.
>>
>> Why can this not be expressed in the dependency information in the
>> package?
>
> I tried this briefly for a while when Setuptools first came out, and I found
> it completely unmaintainable.
>
> Say I have a package that represents an application.  We'll call it FooBlog.
>  I release version 1.0.  It uses the Turplango web framework (1.5 at the
> time of release) and the Storchalmy ORM (0.4), and Turplango uses HardJSON
> (1.2.1).
>
> I want my version 1.0 to keep working.  So, I figure I'll add the
> dependencies:
>
>  Turplango==1.5
>  Storchalmy==0.4
>
> Then HardJSON 2.0 is released, and Turplango only required HardJSON>=1.2, so
> new installations start installing HardJSON 2.0.  But my app happens not to
> be compatible with that library, and so it's broken.  OK... so, I could add
> HardJSON==1.2.1 in my requirements.
>
> But then a small bug fix, HardJSON 1.2.2 comes out, that fixes a security
> bug.  Turplango releases version 1.5.1 that requires HardJSON>=1.2.2.  I now
> have to update FooBlog to require both Turplango==1.5.1 and
> HardJSON==1.2.2.
>
> Later on, I decide that Turplango 1.6 fixes some important bugs, and I want
> to try it with my app.  I can install Turplango 1.6, but I can't start my
> app because I'll get a version conflict.  So to even experiment with a new
> version of the app, I have to check out FooBlog, update setup.py, reinstall
> (setup.py develop) the package, and then I can start using it.  But if I've
> made other hard requirements of packages like HardJSON, I'll have to update
> all those too.
>
> So... that's the kind of thing I encountered with just a couple
> dependencies, but in practice it was much worse because there were a lot
> more than 3 libraries involved.  I now think it is best to only use version
> requirements to express known conflicts.  For future versions of packages
> you can't really know if they will cause conflicts until they are released.

Exactly, you can't control everything from your package unless you
work in an isolated environment like virtualenv or zc.buildout
provides, so I can't see any solution unless someone is taking care of
it at a higher level :(

Maybe PyPI, though, could automate this when a package is uploaded, by
browsing all the dependencies and
finding the relevant conflicts?  PyPI "knows" all the packages out there.

At least display those conflicts somehow, or warn about them?
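
Just to sketch the idea (hypothetical code; it leans on pkg_resources
only because it already knows how to parse requirement specs):

import pkg_resources

def conflicting(req_a, req_b, known_versions):
    # Hypothetical check PyPI could run at upload time: two requirements
    # on the same project conflict if no known version of that project
    # satisfies both of them.
    a = pkg_resources.Requirement.parse(req_a)
    b = pkg_resources.Requirement.parse(req_b)
    if a.key != b.key:
        return False
    return not [v for v in known_versions if v in a and v in b]

# e.g. conflicting("HardJSON==1.2.1", "HardJSON>=2.0", ["1.2.1", "1.2.2", "2.0"])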

(I am pushing this to catalog-sig as well, sorry for the cross-post. I
do think, though, that these mailing lists should merge)


>
>
> --
> Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Ian Bicking

Chris Withers wrote:

Tarek Ziadé wrote:

Tarek Ziade wrote:
For KGS I agree that this is a big work, but there's the need to work
at a higher level than in your package

Why? You really need to explain to me why the dependency information
in each of the packages isn't enough?

Because you can keep up with the dependency changes, removed or
introduced by a package you depend on.

Why can this not be expressed in the dependency information in the package?


I tried this briefly for a while when Setuptools first came out, and I 
found it completely unmaintainable.


Say I have a package that represents an application.  We'll call it 
FooBlog.  I release version 1.0.  It uses the Turplango web framework 
(1.5 at the time of release) and the Storchalmy ORM (0.4), and Turplango 
uses HardJSON (1.2.1).


I want my version 1.0 to keep working.  So, I figure I'll add the 
dependencies:


  Turplango==1.5
  Storchalmy==0.4

Then HardJSON 2.0 is released, and Turplango only required 
HardJSON>=1.2, so new installations start installing HardJSON 2.0.  But 
my app happens not to be compatible with that library, and so it's 
broken.  OK... so, I could add HardJSON==1.2.1 in my requirements.


But then a small bug fix, HardJSON 1.2.2 comes out, that fixes a 
security bug.  Turplango releases version 1.5.1 that requires 
HardJSON>=1.2.2.  I now have to update FooBlog to require both 
Turplango==1.5.1 and HardJSON==1.2.2.
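
To make the churn concrete, a hypothetical setup.py for FooBlog at this
point in the story would look something like this (all the project names
are the made-up ones above), and every pinned line is something I have to
touch and re-release whenever an upstream project moves:

from setuptools import setup, find_packages

setup(
    name="FooBlog",            # the made-up application from this example
    version="1.0.1",
    packages=find_packages(),
    install_requires=[
        "Turplango==1.5.1",    # was ==1.5 until the HardJSON 1.2.2 fix
        "Storchalmy==0.4",
        "HardJSON==1.2.2",     # pinned only to dodge the 2.0 break
    ],
)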


Later on, I decide that Turplango 1.6 fixes some important bugs, and I 
want to try it with my app.  I can install Turplango 1.6, but I can't 
start my app because I'll get a version conflict.  So to even experiment 
with a new version of the app, I have to check out FooBlog, update 
setup.py, reinstall (setup.py develop) the package, and then I can start 
using it.  But if I've made other hard requirements of packages like 
HardJSON, I'll have to update all those too.


So... that's the kind of thing I encountered with just a couple 
dependencies, but in practice it was much worse because there were a lot 
more than 3 libraries involved.  I now think it is best to only use 
version requirements to express known conflicts.  For future versions of 
packages you can't really know if they will cause conflicts until they 
are released.



--
Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread Ian Bicking

A.M. Kuchling wrote:

On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
FWIW, pyinstall can collect all the packages before installing any of  
them.  You do have to download all packages, though, as that's the only  
way to get the metadata.


Does the DOAP output for a package not contain enough metadata?


No.  It probably could hold that information, but currently PyPI doesn't 
keep any record of requirements, and so the DOAP file it generates 
doesn't indicate requirements either.


--
Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [Catalog-sig] PEP for distutils

2008-09-30 Thread A.M. Kuchling
On Tue, Sep 30, 2008 at 10:41:11AM -0500, Ian Bicking wrote:
> FWIW, pyinstall can collect all the packages before installing any of  
> them.  You do have to download all packages, though, as that's the only  
> way to get the metadata.

Does the DOAP output for a package not contain enough metadata?

--amk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 17:08 +0100, Chris Withers wrote:
> Josselin Mouette wrote:
> > It doesn’t have to go away per se, but we need proper ways to deal with
> > incompatible changes in the interfaces.
> 
> Well, the generally accepted way seems to be to increase the major 
> version number...

This information is not accessible directly at import time. If you want
to rely on it to check the API compatibility, you’ll end up doing the
horrible things pygtk and gst-python did. And believe me, that will not
be helpful.

> > In Python libraries, this is not possible without changing the code,
> > since the file name and the module name are the same. 
> 
> The distribution name and package/module name do not have to be the same...

Indeed, but if we change the package name without changing the file
name, we have to make both packages conflict with each other. This works
for distribution packages, but it doesn’t help for third-party addons,
and it can make things complicated if two packages need different APIs.

> > It will suffice, but we will not be able to manage it in distributions
> > if you allow too many weird things to be specified in these
> > dependencies.
> 
> Explain "too many weird things"...

I already showed two examples: versioned provides and exact
dependencies. That’s just after thinking about it for 5 minutes; if we
want it to really work, we need to thoroughly think of what exact kind
of information we are able to use.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-our own. Resistance is futile.


signature.asc
Description: This is a digitally signed message part
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Chris Withers

Ian Bicking wrote:
FWIW, pyinstall can collect all the packages before installing any of 
them.  You do have to download all packages, though, as that's the only 
way to get the metadata.


...yes, and this is why PyPI should change!

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Chris Withers

Tarek Ziadé wrote:

Tarek Ziade wrote:

For KGS I agree that this is a big work, but there's the need to work at a
higher level than in your package

Why? You really need to explain to me why the dependency information in each
of the packages isn't enough?


Because you can keep up with the dependency changes, removed, or introduced
by a package you depend on.


Why can this not be expressed in the dependency information in the package?


How do you decide that version 1.2 of bar is the one you should use,
when you use the foo package, which can work with any version of bar?


If you are using no other packages that have a dependency on bar that 
has a specific version requirement, then the answer is that you can use 
any version of bar you desire.



You can define the version of foo, but can't describe all the versions of
the packages foo uses. 


Why?


You'd end up building your own KGS in a way..


...you mean like the [versions] section with buildout?
Yes, I agree us paranoid people may want to do that, but we really 
shouldn't need to, provided the packages each correctly define their 
dependencies...



So a general list of versions can help


I would be happy to wager that this would never successfully be maintained.


Bigger eggs wouldn't let you reuse things like you can now, IMHO


That doesn't explain the majority of eggs that end up dragging down a 
whole load of other eggs with them. In that case, they should all be 
packaged as one egg...


cheers,

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Chris Withers

Josselin Mouette wrote:

On Tuesday 30 September 2008 at 16:36 +0100, Chris Withers wrote:

No, the problem we have today is that some developers are providing
modules without API stability, which means you cannot simply depend on a
module, you need a specific version.

This problem is never going away, it's the nature of software.


It doesn’t have to go away per se, but we need proper ways to deal with
incompatible changes in the interfaces.


Well, the generally accepted way seems to be to increase the major 
version number...



In Python libraries, this is not possible without changing the code,
since the file name and the module name are the same. 


The distribution name and package/module name do not have to be the same...


It will suffice, but we will not be able to manage it in distributions
if you allow too many weird things to be specified in these
dependencies.


Explain "too many weird things"...

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] more thoughts on python package management

2008-09-30 Thread Chris Withers

Hi All,

I've been trying to catch up on all the packaging discussions but 
couldn't find the right place to reply so thought I'd just do so 
separately...


Probably the biggest thing that strikes me now is that 
distutils/setuptools/distribute/pacman/whatever should aim to do much 
less...


In fact, I get the feeling what we really need is a way for package 
maintainers to provide the following metadata:


- where the docs are

- where the tests are and how they're run

- how anything not-python should be built

- what the dependencies are
  (maybe even what the non-python dependencies are!)

- what version of the package this is

This should be in a build-tool independent fashion such that any build 
tools, but especially those of operating system maintainers, can run 
over the same metadata and build their packages.
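
Purely as an illustration of the shape I mean (the field names are
invented, and it's a Python dict only because that's easy to read; the
concrete format matters far less than it being declarative and
tool-independent):

PACKAGE_METADATA = {
    "name": "example-package",
    "version": "1.0",
    "docs": "docs/",
    "tests": {"path": "tests/", "runner": "python tests/runtests.py"},
    "build": {"extensions": ["src/_speedups.c"]},
    "requires": ["SomeLib>=1.2"],
    "system_requires": ["libxml2"],
}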


The only other critical thing for me is that *all* of the above metadata 
 should be available post-install.


With the above in place, we free up the evolution of build tools and let 
the OS-specific packaging tools play nicely.


I think a good aim would also be to have some "one-way-to-do-it" python 
tools too for:


- installing a package

- uploading a package to PyPI

- getting a package from PyPI


...without any silly big plugin system in the way distutils currently works.

What do other people feel?

cheers,

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 16:36 +0100, Chris Withers wrote:
> >> No, the problem we have today is that some developers are providing
> >> modules without API stability, which means you cannot simply depend on a
> >> module, you need a specific version.
> 
> This problem is never going away, it's the nature of software.

It doesn’t have to go away per se, but we need proper ways to deal with
incompatible changes in the interfaces.

> >> Again, when a C library changes its ABI, we do not allow it to keep the
> >> same name. It's as simple as that.
> 
> That's insane, and I bet without trying too hard, I could find examples 
> of violation of this supposed practice.

Of course, Python developers don’t have the monopoly on misunderstanding
maintainability requirements. It even happens more often in C, where
the ABI can change without any incompatibility in the API. When this
happens without a soname change, we either change the soname ourselves
(diverging from upstream) or change the package name, making it
impossible to install two conflicting versions at once.

In Python libraries, this is not possible without changing the code,
since the file name and the module name are the same. If a Python module
changes its API incompatibly, we are forced to update all reverse
dependencies and add versioned conflicts, without being able to ensure
none is forgotten, and without enforcing the change for third-party
packages.

[snip]
> Besides, accurately specified dependency information, including 
> versions, within a package should suffice. 

It will suffice, but we will not be able to manage it in distributions
if you allow too many weird things to be specified in these
dependencies.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-our own. Resistance is futile.


signature.asc
Description: This is a digitally signed message part
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 5:55 PM, Chris Withers <[EMAIL PROTECTED]> wrote:
> Tarek Ziade wrote:
>>
>> For KGS I agree that this is a big work, but there's the need to work at a
>> higher level than in your package
>
> Why? You really need to explain to me why the dependency information in each
> of the packages isn't enough?
>

Because you can keep up with the dependency changes, removed, or introduced
by a package you depend on.

How do you decide that version 1.2 of bar is the one you should use,
when you use the foo package, which can work with any version of bar?

You can define the version of foo, but can't describe all the versions of
the packages foo uses. You'd end up building your own KGS in a way..

So a general list of versions can help


>> Python frameworks are exploding into a myriad of packages: a Python
>> installation now needs to handle up to a hundred public packages to run
>> a Plone site, for example
>
> Yes, Plone and Zope both got the wrong end of the stick by making myriads of
> eggs rather than a few big ones...

I think it is a good opportunity to re-use things. Right now I can
work on projects
that use packages from pylons AND plone AND zope 3.

Bigger eggs wouldn't let you reuse things like you can now, IMHO

>
> Chris
>
> --
> Simplistix - Content Management, Zope & Python Consulting
>   - http://www.simplistix.co.uk
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "Python Package Management Sucks"

2008-09-30 Thread Chris Withers

Matthias Klose wrote:

Install debian and get back to productive tasks.

This is an almost troll-like answer.
See page 35 of the presentation.


I disagree. You could think of "Packages are Pythons Plugins" (taken
from page 35) as a troll-like statement as well.


You're welcome to your (incorrect) opinion ;-)
Debian packages could just as easily be seen as Debian's plugins.


and a subset of all these operating systems (linux or other) do have
the need to distribute python and a set of python modules and
extensions. they cannot rely on a plugin system outside the (os)
distribution.


OK, you guys have persuaded me of this at least...

- all the package management systems behave differently and expect 
packages to be set up differently for them


correct, but again they share common requirements.


...but all have different implementations.


some people prefer to name this "stable releases" instead of
"bitrot". 


I'll call bullshit on this one. The most common problem I have as a 
happy Debian user and advocate, when I go to try and get help for a 
packaged application (I use packages because I perhaps mistakenly assume 
this is the best way to get security-fixed software), such as postfix, 
postgres, or Zope if I was foolish enough to take that path, is "why 
are you using that ancient and buggy version of the software?!", shortly 
before it's pointed out how all the issues I'm facing are solved in newer 
(stable) releases.


The problem is that first the application needs to be tested and 
released by its community, then Debian needs to re-package, patch, 
generally mess around with it, etc before it eventually gets a "Debian 
release". It's bad enough with apps with huge support bases like 
postgres; imagine trying to do this "properly" for the 4000-odd packages 
on PyPI...



Speaking of extensions "maintained by the entity originating the
python package": this much too often is a way of bitrot. is the
shipped library up to date? does it have security fixes? how many
duplicates are shipped in different extensions? does the library need
to be shipped at all (because some os does ship it)?


So what do you propose doing when projectA depends on version 1.0 of libC 
and projectB depends on version 2.0 of libC?



this is known trouble for os distributors, and your statement is
generally wrong. firefox plugins are packaged in distributions and the
plugin system is able to cope with packaged plugins.


I guess since my desktop OS is still windows, this is not something I've 
had to fight with ;-)



Packages are Python's "plugins" and so should get the same type of
consistent, cross-platform package management targetted at the
application in question, which is Python in this case.


No, as explained above. 


While I'll buy the argument that python packaging tools should make life 
easier for production of os-specific packages, I still don't think 
you're correct ;-)



Considering an extension interfacing a library
shipped with the os, you do want to use this library, not add another
copy. 


libxml2 seems to be a good example to use here...

I guess on debian I'd need to likely install libxml2-dev before I could 
install the lxml package...


...what about MacOS X?

...what about Windows?


An upstream
extension maintainer cannot provide this unless he builds this
extension for every (os) distribution and maintains it during the os'
lifecycle.


...or just says in the docs "hey, you need libxml2 for this, unless 
you're on Windows, in which case the binary includes it".



 - os distributors usually try to minimize the versions they include,
   trying to just ship one version.  


...which is fair enough for the "system python", but many of us have a 
collection of apps, some of which require Python 2.4, some Python 2.5, 
and on top of each of those, different versions of different packages 
for each app.


In my case, I do source (alt-)installs of Python rather than trusting 
the broken stuff that ships with Debian, and use buildout to make sure I 
get the right versions of the right packages for each project.



 - setuptools has the narrow minded view of a python package being
   contained in a single directory, which doesn't fit well when you
   do have common locations for include or doc files. 


Python packages have no idea of "docs" or "includes", which is certainly 
a deficiency.



way packaging the python module with rpm or dpkg. E.g. namespace
packages are a consequence of how setuptools distributes and installs
things. Why force this on everybody?


being able to break a large lump (say zope.*) into separate 
distributions is a good idea, which setuptools implements very badly 
using namespace packages...
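
For readers unfamiliar with the mechanism under discussion, here is a minimal
sketch of how a setuptools namespace package is declared (project and file
names are illustrative, not taken from the thread):

    # zope.foo/setup.py -- each distribution claims a slice of the "zope" namespace
    from setuptools import setup, find_packages

    setup(
        name='zope.foo',
        version='1.0',
        packages=find_packages(),
        namespace_packages=['zope'],   # tell setuptools that "zope" is shared
    )

    # zope.foo/zope/__init__.py -- boilerplate repeated in every such distribution
    __import__('pkg_resources').declare_namespace(__name__)

Every distribution sharing the namespace has to carry the same __init__.py
boilerplate, and the pieces only merge correctly at install time, which is
presumably part of the "implements very badly" complaint.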



A big win could be a modularized setuptools where you are able to only
use the things you do want to use, e.g.

 - version specifications (not just the heuristics shipped with
   setuptools).


not sure what you mean by this.
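
For context, the "heuristics" presumably refers to setuptools' version
parsing in pkg_resources.parse_version, which imposes its own ordering rules
on arbitrary version strings; a quick illustration (assuming setuptools is
installed):

    from pkg_resources import parse_version

    # pre-releases and dev versions sort before the final release...
    assert parse_version('1.0a1') < parse_version('1.0')
    assert parse_version('1.0.dev456') < parse_version('1.0')
    # ...and numeric components compare numerically, not lexically
    assert parse_version('1.10') > parse_version('1.9')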


 - specification of dependencies.

 - resource management


?


 - a mod

Re: [Distutils] "just use debian"

2008-09-30 Thread Chris Withers

Tarek Ziade wrote:
For KGS I agree that this is a big piece of work, but there's a need to work 
at a higher level than in your package


Why? You really need to explain to me why the dependency information in 
each of the packages isn't enough?


Python frameworks are exploding into a myriad of packages: a Python 
installation needs to handle up to hundreds of public packages now to 
run a Plone site, for example


Yes, Plone and Zope both got the wrong end of the stick by making 
myriads of eggs rather than a few big ones...


Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Jean-Philippe CAMGUILHEM
On Tue, Sep 30, 2008 at 5:41 PM, Ian Bicking <[EMAIL PROTECTED]> wrote:

> Gael Varoquaux wrote:
>
>> On Tue, Sep 30, 2008 at 04:01:01PM +0100, Chris Withers wrote:
>>
>>> That said, I didn't see any indication of what I consider to be a
>>> critical failure in PyPI: No dependency metadata prior to downloading the
>>> package.
>>>
>>
>> +1. I want to be able to list all the packages an easy_install run will
>> download without running it. Something like the "-s" option of apt-get.
>> In addition, I want this information to be available programmatically (ie
>> with a good api, not something that expects to be called from the command
>> line) to be able to use it to build dependency graphs, generate conflicts
>> list, or simply tell me that I have requested something that is
>> impossible.
>>
>> There is nothing that I hate more than easy_install failing after having
>> half-installed a package because of a missing dependency. This is one of
>> the reasons I am never too happy when I have to run easy_install.
>>
>
> FWIW, pyinstall can collect all the packages before installing any of them.
>  You do have to download all packages, though, as that's the only way to get
> the metadata.

is a "simple catalog "db storage for metadata like /usr/ports/ on freebsd a
bad idea ?
the idea is to not download all packages to get the metadata, but just query
the catalog/db
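
PyPI does already expose a queryable interface over XML-RPC, so a rough
sketch of such a metadata-only query looks something like the following
(method names from the PyPI XML-RPC interface; the catch, as noted elsewhere
in the thread, is that dependency information is generally not part of what
it returns):

    import xmlrpclib  # Python 2

    pypi = xmlrpclib.ServerProxy('http://pypi.python.org/pypi')

    name = 'lxml'                            # example package
    versions = pypi.package_releases(name)   # e.g. ['2.1.2']
    if versions:
        # dict of PKG-INFO-style fields: summary, author, classifiers, ...
        info = pypi.release_data(name, versions[0])
        print info.get('summary')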

Cheers,

Jean-Philippe

>
>
>
> --
> Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
>
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> http://mail.python.org/mailman/listinfo/distutils-sig
>
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Tarek Ziade
2008/9/30 Chris Withers <[EMAIL PROTECTED]>

> Tarek Ziadé wrote:
>
>> In other words the problem we have today with an OS-based installation is
 that
 you cannot really have two versions of the same package installed,
 which would make two Python applications happy.

>>>
> Right, which is why dependencies can often be best matched by a
> project-based tool like buildout rather than having to have one python setup
> support all use cases.
>
>  No, the problem we have today is that some developers are providing
>>> modules without API stability, which means you cannot simply depend on a
>>> module, you need a specific version.
>>>
>>
> This problem is never going away, it's the nature of software.
>
>  Again, when a C library changes its ABI, we do not allow it to keep the
>>> same name. It's as simple as that.
>>>
>>
> That's insane, and I bet without trying too hard, I could find examples of
> violations of this supposed practice.
>
>  My convention is to :
>>  - keep the old API and the new API in the new version, let's say
>> "2.0"
>>  - mark the old API as deprecated (we have this "warnings" module in
>> Python to do so)
>>  - remove the old API in the next release, like "2.1"
>>
>
> Right.
>
>  But I don't want to change the package name.
>>
>
> Right.
>
>  The setuptools project has partly improved this by providing a way to
 install several
 versions of the same package in Python and give a way to select which
 one is active.

>>> This is not an improvement, it is a nightmare for the sysadmin.
>>>
>>
> Absolutely. This multi-version rubbish is totally and utterly insanely
> wrong.
>
>  I have an idea: what about having a "known good set" (KGS) like what
>> Zope has built on its
>> side.
>>
>> a Known Good Set is a set of python package versions, that are known to
>> provide
>> the good execution context for a given version of Python.
>>
>
> Given how poorly maintained Zope's "KGS" is, I think this is a pipe dream.
>
> Besides, accurately specified dependency information, including versions,
> within a package should suffice. It would be handy if you could also specify
> python version compatibility in this, something that setuptools does not
> currently support AFAIK.


you can use the Requires-Python metadata though.
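
For reference, Requires-Python is a PEP 345-style metadata field; in a
package's PKG-INFO it would look something like this (values illustrative,
and, as the thread suggests, tool support for producing and honouring it was
thin at the time):

    Metadata-Version: 1.2
    Name: example-package
    Version: 1.0
    Requires-Python: >=2.4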

For KGS I agree that this is a big piece of work, but there's a need to work at a
higher level than in your package

that is what zc.buildout brought, in a way, but at the application level,
and without regard to the OS level.
So we should find a way to generalize this at the Python level imho: being able
to develop your package in a known environment,
and being able to give that info to the OS.

Python frameworks are exploding into a myriad of packages: a Python
installation needs to handle up to hundreds of public packages now to run
a Plone site, for example



>
> cheers,
>
> Chris
>
> --
> Simplistix - Content Management, Zope & Python Consulting
>   - http://www.simplistix.co.uk
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> http://mail.python.org/mailman/listinfo/distutils-sig
>



-- 
Tarek Ziadé - Directeur Technique
INGENIWEB (TM) - SAS 5 Euros - RC B 438 725 632
Bureaux de la Colline - 1 rue Royale - Bâtiment D - 9ème étage
92210 Saint Cloud - France
Phone : 01.78.15.24.00 / Fax : 01 46 02 44 04
http://www.ingeniweb.com - une société du groupe Alter Way
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 17:20 +0200, Tarek Ziadé wrote:
> > Again, when a C library changes its ABI, we do not allow it to keep the
> > same name. It's as simple as that.
> 
> I see, so there's no deprecation process for a package?

Not per se. It is the job of the package manager to propose removing
deprecated packages when they are no longer available in the repository.

> I mean, if you change a public API of your package , you *have* to
> change its name ?

Yes, this is the requirement for C libraries, and we try to enforce it
as well for other languages.

> My convention is to :
>  - keep the old API and the new API in the new version, let's say "2.0"
>  - mark the old API as deprecated (we have this "warnings" module in
> Python to do so)
>  - remove the old API in the next release, like "2.1"
> 
> But I don't want to change the package name.
> 
> And the development cycles in a python package are really short
> compared to OS systems, in fact
> we can have quite a few releases before a package is really stable.

I don’t think the requirements are different from those of C library
developers. There are, of course, special cases for libraries that are
in development; generally we take a snapshot, give it a specific soname
and enforce the ABI compatibility in the Debian package. The other
possibility is to distribute the library only in a private directory.
Nothing in this process is specific to C; the technical details are
different for python modules, but we should be able to handle it in a
similar way.

> > This is not an improvement, it is a nightmare for the sysadmin. You
> > cannot install things as simple (and as critical) as security updates if
> > you allow several versions to be installed together.
>
> mmm... unless the version is "part of the name" in a way

Yes, this is what C libraries do with the SONAME, for which the
convention is to postfix it with a number, which changes when the ABI is
changed in an incompatible way. I don’t know whether it would be
possible to do similar things with python modules, but it is certainly
something to look at.
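
A purely hypothetical sketch of what that soname analogy could look like for
Python modules (nothing in the thread proposes this exact layout):

    # Hypothetical soname-style layout: each incompatible API generation gets
    # its own importable top-level name, the way libfoo.so.1 and libfoo.so.2
    # coexist on a C system:
    #
    #   site-packages/foo1/__init__.py   <- the 1.x API, kept until unused
    #   site-packages/foo2/__init__.py   <- the 2.x API
    #
    # An application then pins the API generation at import time:
    import foo2 as foo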

> > Two conflicting versions must not use the same module namespace.
> 
> I have an idea: what about having a "known good set" (KGS) like what
> Zope has built on its
> side.
> 
> a Known Good Set is a set of python package versions, that are known to 
> provide
> the good execution context for a given version of Python.
> 
> Maybe the Python community could maintain a known good set of python
> packages at PyPI, with a real work on its integrity, like any
> OS-vendor does I believe.

Having a body that enforces API stability for a number of packages would
probably prevent such issues from happening in those packages. However,
that means relying too much on this body, and experience shows it will
quickly lag behind. Furthermore, the need to add packages that are not
in the KGS to distributions will arise sooner or later.

> And maybe this KGS could be used by Debian as the reference of package 
> versions.

We will always need, for some cases, more recent packages or packages
that are not in the KGS.

> -> if a package is listed in this KGS, it defines the version, for a
> given version of Python

You don’t have to define it so strictly. There is no reason why a new
version couldn’t be accepted in the KGS for an existing python version,
if it has been checked that it will not break existing applications
using this module.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-    our own. Resistance is futile.


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 5:41 PM, Ian Bicking <[EMAIL PROTECTED]> wrote:
> Gael Varoquaux wrote:
>>
>> On Tue, Sep 30, 2008 at 04:01:01PM +0100, Chris Withers wrote:
>>>
>>> That said, I didn't see any indication of what I consider to be a
>>> critical failure in PyPI: No dependency metadata prior to downloading the
>>> package.
>>
>> +1. I want to be able to list all the packages an easy_install run will
>> download without running it. Something like the "-s" option of apt-get.
>> In addition, I want this information to be available programmatically (ie
>> with a good api, not something that expects to be called from the command
>> line) to be able to use it to build dependency graphs, generate conflicts
>> list, or simply tell me that I have requested something that is
>> impossible.
>>
>> There is nothing that I hate more than easy_install failing after having
>> half-installed a package because of a missing dependency. This is one of
>> the reasons I am never too happy when I have to run easy_install.
>
> FWIW, pyinstall can collect all the packages before installing any of them.
>  You do have to download all packages, though, as that's the only way to get
> the metadata.

Yes, so having them at PyPI would be a good idea indeed,
I am adding that to that small PEP


>
>
> --
> Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> http://mail.python.org/mailman/listinfo/distutils-sig
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Ian Bicking

Gael Varoquaux wrote:

On Tue, Sep 30, 2008 at 04:01:01PM +0100, Chris Withers wrote:
That said, I didn't see any indication of what I consider to be a critical 
failure in PyPI: No dependency metadata prior to downloading the package.


+1. I want to be able to list all the packages an easy_install run will
download without running it. Something like the "-s" option of apt-get.
In addition, I want this information to be available programmatically (ie
with a good api, not something that expects to be called from the command
line) to be able to use it to build dependency graphs, generate conflicts
list, or simply tell me that I have requested something that is
impossible.

There is nothing that I hate more than easy_install failing after having
half-installed a package because of a missing dependency. This is one of
the reasons I am never too happy when I have to run easy_install.


FWIW, pyinstall can collect all the packages before installing any of 
them.  You do have to download all packages, though, as that's the only 
way to get the metadata.



--
Ian Bicking : [EMAIL PROTECTED] : http://blog.ianbicking.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Gael Varoquaux
On Tue, Sep 30, 2008 at 04:01:01PM +0100, Chris Withers wrote:
> That said, I didn't see any indication of what I consider to be a critical 
> failure in PyPI: No dependency metadata prior to downloading the package.

+1. I want to be able to list all the packages an easy_install run will
download without running it. Something like the "-s" option of apt-get.
In addition, I want this information to be available programmatically (ie
with a good api, not something that expects to be called from the command
line) to be able to use it to build dependency graphs, generate conflicts
list, or simply tell me that I have requested something that is
impossible.
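
For what is already installed locally, pkg_resources can at least provide
this programmatically; a rough sketch of walking declared dependencies
follows (it does not solve the before-download problem, since it only sees
distributions present on the local system):

    import pkg_resources

    def dependency_graph(project_name, graph=None):
        # Recursively collect {project: [requirement strings]} for installed dists.
        if graph is None:
            graph = {}
        if project_name in graph:
            return graph
        try:
            dist = pkg_resources.get_distribution(project_name)
        except pkg_resources.DistributionNotFound:
            graph[project_name] = None   # declared, but not installed locally
            return graph
        reqs = dist.requires()
        graph[project_name] = [str(r) for r in reqs]
        for req in reqs:
            dependency_graph(req.project_name, graph)
        return graph

    # e.g.: print dependency_graph('zope.component')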

There is nothing that I hate more than easy_install failing after having
half-installed a package because of a missing dependency. This is one of
the reasons I am never too happy when I have to run easy_install.

Gaël
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] "just use debian"

2008-09-30 Thread Chris Withers

Tarek Ziadé wrote:

In other words the problem we have today with an OS-based installation is that
you cannot really have two versions of the same package installed,
which would make two Python applications happy.


Right, which is why dependencies can often be best matched by a 
project-based tool like buildout rather than having to have one python 
setup support all use cases.



No, the problem we have today is that some developers are providing
modules without API stability, which means you cannot simply depend on a
module, you need a specific version.


This problem is never going away, it's the nature of software.


Again, when a C library changes its ABI, we do not allow it to keep the
same name. It's as simple as that.


That's insane, and I bet without trying too hard, I could find examples 
of violations of this supposed practice.



My convention is to :
 - keep the old API and the new API in the new version, let's say "2.0"
 - mark the old API as deprecated (we have this "warnings" module in
Python to do so)
 - remove the old API in the next release, like "2.1"


Right.


But I don't want to change the package name.


Right.


The setuptools project has partly improved this by providing a way to
install several
versions of the same package in Python and give a way to select which
one is active.
This is not an improvement, it is a nightmare for the sysadmin. 


Absolutely. This multi-version rubbish is totally and utterly insanely 
wrong.



I have an idea: what about having a "known good set" (KGS) like what
Zope has built on its
side.

a Known Good Set is a set of python package versions, that are known to provide
the good execution context for a given version of Python.


Given how poorly maintained Zope's "KGS" is, I think this is a pipe dream.

Besides, accurately specified dependency information, including 
versions, within a package should suffice. It would be handy if you 
could also specify python version compatibility in this, something that 
setuptools does not currently support AFAIK.


cheers,

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PEP for distutils

2008-09-30 Thread Chris Withers

Tarek Ziadé wrote:
I started to write a new PEP (well a wiki page in fact...) that 
describes a new package called "pypi" that would be dedicated to package 
registering and uploading mechanisms.
It would also provide enhancements like a proper password hash, or 
deeper metadata controls


http://wiki.python.org/moin/A_new_pypi_module

Any opinions about this PEP ? I tried to include all the problems people 
are having with register  and upload.


I think that catalog-sig would be interested in this.

That said, I didn't see any indication of what I consider to be a 
critical failure in PyPI: No dependency metadata prior to downloading 
the package.


cheers,

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Distutils Sprint

2008-09-30 Thread Chris Withers

Tarek Ziadé wrote:

http://wiki.python.org/moin/DistributeSprint_#1


The dates on this make no sense...

cheers,

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Distutils Sprint

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 5:02 PM, Chris Withers <[EMAIL PROTECTED]> wrote:
> Tarek Ziadé wrote:
>>
>> http://wiki.python.org/moin/DistributeSprint_#1
>
> The dates on this make no sense...

I fixed the typo, thanks. Please propose some other dates over there then,
if you wish.

Keep in mind that people from Paris, Tokyo and maybe the SF bay area are
interested, so the right moment in the day is hard to find :)


>
> cheers,
>
> Chris
>
> --
> Simplistix - Content Management, Zope & Python Consulting
>   - http://www.simplistix.co.uk
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] python v. perl - cpan v. pypi

2008-09-30 Thread Chris Withers

Michael wrote:

Now, with python there's the general ethos:
   There should be one-- and preferably only one --obvious way to do it.

And with perl there's the general ethos:
   There's more than one way to do it

Anyone who's written extensive amounts of code in both languages will know 
that the latter ethos does cause major problems in practice.


However for packaging, with python the rule is
   * "There's more than one way to do it"

And for perl the rule is:
   * Use CPAN

I've always found this difference amusing, 


Me too...

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 4:27 PM, Josselin Mouette <[EMAIL PROTECTED]> wrote:
>> In other words the problem we have today with an OS-based installation is 
>> that
>> you cannot really have two versions of the same package installed,
>> which would make two Python applications happy.
>
> And this is not a problem, but something that is desired.
>
> No, the problem we have today is that some developers are providing
> modules without API stability, which means you cannot simply depend on a
> module, you need a specific version.
>
> Again, when a C library changes its ABI, we do not allow it to keep the
> same name. It's as simple as that.

I see, so there's no deprecation process for a package?

I mean, if you change a public API of your package, you *have* to
change its name?

My convention is to :
 - keep the old API and the new API in the new version, let's say "2.0"
 - mark the old API as deprecated (we have this "warnings" module in
Python to do so)
 - remove the old API in the next release, like "2.1"

But I don't want to change the package name.
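
A minimal sketch of that convention with the standard warnings module
(module and function names invented for illustration):

    # foo/api.py, version 2.0: old and new APIs coexist for one release cycle
    import warnings

    def new_way(data):
        # the 2.x API
        return sorted(data)

    def old_way(data):
        # the 1.x API, kept around but deprecated; removed in 2.1
        warnings.warn("old_way() is deprecated, use new_way() instead; "
                      "it will be removed in 2.1",
                      DeprecationWarning, stacklevel=2)
        return new_way(data)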

And the development cycles in a python package are really short
compared to OS systems, in fact
we can have quite a few releases before a package is really stable.

>
>> The setuptools project has partly improved this by providing a way to
>> install several
>> versions of the same package in Python and give a way to select which
>> one is active.
>
> This is not an improvement, it is a nightmare for the sysadmin. You
> cannot install things as simple (and as critical) as security updates if
> you allow several versions to be installed together.
>

mmm... unless the version is "part of the name" in a way

[cut]
>> Interesting.. That would mean you would do version conflict resolution
>> at the OS level,   That makes me think about the previous point: how
>> two applications that use conflicting versions that are not compatible
>> with each other (you have to choose one of them) can cohabit ?
>
> Two conflicting versions must not use the same module namespace. The
> real, fundamental issue, that generates even more brokenness when you
> accept it and work around it, is here. It is a nightmare for the
> developer (who can't rely on a defined API after "import foo"), a
> nightmare for the distributor (who has to use broken-by-design selection
> methods), and a nightmare for the system administrator (who cannot
> easily track what is installed on the system). Forbid that strictly, and
> you'll see that methods that work today for a Linux distribution (where
> we already forbid it) will work just as nicely for all other
> distribution mechanisms.


I have an idea: what about having a "known good set" (KGS) like what
Zope has built on its
side.

a Known Good Set is a set of python package versions, that are known to provide
the good execution context for a given version of Python.

Maybe the Python community could maintain a known good set of python
packages at PyPI, with a real work on its integrity, like any
OS-vendor does I believe.

And maybe this KGS could be used by Debian as the reference of package versions.

-> if a package is listed in this KGS, it defines the version, for a
given version of Python

then, application developers in Python could work with this KGS, in their code.
And if they can't get a package added in the official KGS, they will
have to be on their own,
inside their application, and maintain their own modules.


>
> Cheers,
> --
>  .''`.
> : :' :  We are debian.org. Lower your prices, surrender your code.
> `. `'   We will add your hardware and software distinctiveness to
>  `-    our own. Resistance is futile.
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Chris Withers

Nicolas Chauvat wrote:

Baseline is "no problem with providing egg-info metadata, but pretty
please Python developers, do not code *for* distutils/setuptools/etc.
Just find a way to provide useful dependency/meta information then let
your users choose how they install your code on *their* system".


Right, now this I agree with, and it seems a lot of other people do too...

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Chris Withers

Marius Gedminas wrote:

On Tue, Sep 23, 2008 at 09:24:00PM -0700, Kevin Teague wrote:
Or they can just use debian!  Any debian developers out there up for  
the task of packaging up the 1500+ odd packages released from the Zope  
community?


The SchoolTool guys made a tool and built .debs for all of Zope 3 that
SchoolTool needs.  The resulting packages are here:
https://launchpad.net/~schooltool-owners/+archive


Yes, but how many of these have made it into an official debian release?

Chris

--
Simplistix - Content Management, Zope & Python Consulting
   - http://www.simplistix.co.uk
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 15:49 +0200, Tarek Ziadé wrote:
> The  "Obsoletes" info could be used maybe. But the main problem I can
> see is that
> in any case several versions of the same module can be needed to build
> one application.

This is indeed a problem, and when it happens, it needs fixing instead
of trying to work with it.

> That is what tools like zc.buildout or virtualenv exist: they build
> an isolated environment where they install the packages so that a given
> Python application
> can run.
> 
> In other words the problem we have today with an OS-based installation is that
> you cannot really have two versions of the same package installed,
> which would make two Python applications happy.

And this is not a problem, but something that is desired.

No, the problem we have today is that some developers are providing
modules without API stability, which means you cannot simply depend on a
module, you need a specific version.

Again, when a C library changes its ABI, we do not allow it to keep the
same name. It’s as simple as that.

> The setuptools project has partly improved this by providing a way to
> install several
> versions of the same package in Python and give a way to select which
> one is active.

This is not an improvement, it is a nightmare for the sysadmin. You
cannot install things as simple (and as critical) as security updates if
you allow several versions to be installed together.

> From your point of view, how could we solve it at the Debian level? By
> kind of isolating a group
> of packages that fit the needs of one given application?

I think we need to enforce even more the habit of moving unstable and
private-use modules to private directories. It is not viable to add them
to public directories. This is something that is done occasionally in some
Debian packages, but it should become mandatory for all cases where
there is no API stability. A tool that eases installation and use of
modules in private directories would certainly encourage developers to
do so and improve the situation in this matter.

> (btw A recent change in Python has allowed us to define per-user site-packages
>  http://mail.python.org/pipermail/python-dev/2008-January/076108.html)

This is definitely a nice improvement for those on multi-user systems
without administrative rights, and for those who wish to install a more
recent version of a specific module. However, I don’t think we should
rely on it as the normal way of installing python modules. And
especially, we should not rely on on-demand download/installation of
modules like setuptools does.

> Interesting.. That would mean you would do version conflict resolution
> at the OS level,   That makes me think about the previous point: how
> two applications that use conflicting versions that are not compatible
> with each other (you have to choose one of them) can cohabit ?

Two conflicting versions must not use the same module namespace. The
real, fundamental issue, that generates even more brokenness when you
accept it and work around it, is here. It is a nightmare for the
developer (who can’t rely on a defined API after "import foo"), a
nightmare for the distributor (who has to use broken-by-design selection
methods), and a nightmare for the system administrator (who cannot
easily track what is installed on the system). Forbid that strictly, and
you’ll see that methods that work today for a Linux distribution (where
we already forbid it) will work just as nicely for all other
distribution mechanisms. 

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-    our own. Resistance is futile.


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Josselin Mouette
On Tuesday 30 September 2008 at 14:05 +0200, Tarek Ziadé wrote:
> On Tue, Sep 30, 2008 at 10:42 AM, Nicolas Chauvat
> <[EMAIL PROTECTED]> wrote:
> > For example, if you require a minimal version of 1.4, you can
> > translate this to a package version of 1.4; it is a bit hackish but
> > will work if you handle epochs correctly. But if the package you
> > depend on has a Provides: blah (1.4), you have no way to map that to a
> > dependency, because you can't know what other versions of the package
> > will provide.
> 
> I am not sure I fully understand, could you provide a real-world example?

Let’s say you have module bar, contained in the package python-bar. The
last version is 1.4. After that version, it is decided to distribute it
in the same tarball as module foo. It is therefore moved to the package
python-foo, which is at version 1.2. In this case, you can specify in
the metadata :
Provides: foo
Provides: bar (1.4)
This is the typical use case for versioned provides.

Let’s say application baz requires module bar with minimal version 1.3,
it will have as dependency:
Requires: bar >= 1.3
This way it will be happy to find the versioned provides if module foo
is installed, and everyone is happy. Well, except that, if you try to
build a package of baz, there is no way to express correctly that you
depend on python-bar (>= 1.3) or python-foo (>= 1.2).

This is why I’d prefer to have versioned provides simply not part of the
specification. 

Another thing that can cause issues is exact dependencies. If you
require strictly version 1.1 of foo, there is no good way of translating
it into a package dependency. All the following will have serious
drawbacks when facing the real world:
python-foo (>= 1.1), python-foo (<< 1.1.~)
python-foo (>= 1.1), python-foo (<< 1.2)
python-foo (= 1.1-1)

If you allow specifying requires and provides in a sophisticated way,
people will use it and we will run into unmanageable situations when
converting them to packages. If a module provides an API at version 1.2,
it will have to still provide it at version 1.3, otherwise the module
should remain private and never be installed in a public python module
directory. Just like we rename C libraries when their ABI changes, we
need to reach a situation where we can make the same assumptions about
python modules.

> > In all cases, it will be necessary to manually add shlibs-like
> > information to the packages; they could be partly autogenerated like
> > symbol files, but you need a mapping between provided modules and the
> > first version of the package that provides it.
> 
> Is this related ? http://lists.debian.org/debian-dpkg/2006/01/msg00118.html

Yes. The thread you point to did not lead to something actually being
implemented, because at that moment, we lacked the necessary metadata.
Since then, setuptools appeared, but it does not provide it in a sane
way and it is not universal. Which is why I’m interested in the
metadata format that’s discussed here.

From this metadata, we will be able to generate some files that express
what is provided and the required version. Something like:
foo 1.0-1
bar 1.2~beta3

This way, if another package requires foo (> 1.1) and bar (without a
version requirement), we can convert this dependency into:
python-foo (>= 1.1), python-foo (>= 1.2~beta3)
which can then be factorized, of course.

Cheers,
-- 
 .''`.
: :' :  We are debian.org. Lower your prices, surrender your code.
`. `'   We will add your hardware and software distinctiveness to
  `-    our own. Resistance is futile.


___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 3:17 PM, Josselin Mouette <[EMAIL PROTECTED]> wrote:
> On Tuesday 30 September 2008 at 14:05 +0200, Tarek Ziadé wrote:
>> On Tue, Sep 30, 2008 at 10:42 AM, Nicolas Chauvat
>> <[EMAIL PROTECTED]> wrote:
>> > For example, if you require a minimal version of 1.4, you can
>> > translate this to a package version of 1.4; it is a bit hackish but
>> > will work if you handle epochs correctly. But if the package you
>> > depend on has a Provides: blah (1.4), you have no way to map that to a
>> > dependency, because you can't know what other versions of the package
>> > will provide.
>>
>> I am not sure I fully understand, could you provide a real-world example?
>
> Let's say you have module bar, contained in the package python-bar. The
> last version is 1.4. After that version, it is decided to distribute it
> in the same tarball as module foo. It is therefore moved to the package
> python-foo, which is at version 1.2. In this case, you can specify in
> the metadata :
>Provides: foo
>Provides: bar (1.4)
> This is the typical use case for versioned provides.
>
> Let's say application baz requires module bar with minimal version 1.3,
> it will have as dependency:
>Requires: bar >= 1.3
> This way it will be happy to find the versioned provides if module foo
> is installed, and everyone is happy. Well, except that, if you try to
> build a package of baz, there is no way to express correctly that you
> depend on python-bar (>= 1.3) or python-foo (>= 1.2).
>
> This is why I'd prefer to have versioned provides simply not part of the
> specification.
>

The  "Obsoletes" info could be used maybe. But the main problem I can
see is that
in any case several versions of the same module can be needed to build
one application.

That is what tools like zc.buildout or virtualenv exist: they build
an isolated environment where they install the packages so that a given
Python application
can run.

In other words the problem we have today with an OS-based installation is that
you cannot really have two versions of the same package installed,
which would make two Python applications happy.

The setuptools project has partly improved this by providing a way to
install several
versions of the same package in Python and give a way to select which
one is active.

From your point of view, how could we solve it at the Debian level? By
kind of isolating a group
of packages that fit the needs of one given application?

(btw A recent change in Python has allowed us to define per-user site-packages
 http://mail.python.org/pipermail/python-dev/2008-January/076108.html)

>> Is this related ? http://lists.debian.org/debian-dpkg/2006/01/msg00118.html
>
> Yes. The thread you point to did not lead to something actually being
> implemented, because at that moment, we lacked the necessary metadata.
> Since then, setuptools appeared, but it does not provide it in a sane
> way and it is not universal. Which is why I'm interested in the
> metadata format that's discussed here.
>
> From this metadata, we will be able to generate some files that express
> what is provided and the required version. Something like:
>foo 1.0-1
>bar 1.2~beta3
>
> This way, if another package requires foo (> 1.1) and bar (without a
> version requirement), we can convert this dependency into:
>python-foo (>= 1.1), python-foo (>= 1.2~beta3)
> which can then be factorized, of course.
>

Interesting... That would mean you would do version conflict resolution
at the OS level. That makes me think about the previous point: how can
two applications that use conflicting versions, that are not compatible
with each other (you have to choose one of them), cohabit?

Cheers

> Cheers,
> --
>  .''`.
> : :' :  We are debian.org. Lower your prices, surrender your code.
> `. `'   We will add your hardware and software distinctiveness to
>  `-    our own. Resistance is futile.
>



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 2:38 PM, zooko <[EMAIL PROTECTED]> wrote:
> On Sep 29, 2008, at 6:09 AM, Tarek Ziadé wrote:
>
>> Now, the question is,  what would debian miss in here to install:
>>
>> http://www.python.org/dev/peps/pep-0345/
>
> It really seems to me that PEP-345's specification of dependency metadata is
> the wrong starting point.
>
> There are not, to my knowledge, any Python packages in existence which use
> this form of dependency metadata, and there are not, to my knowledge, any
> Python tools which are capable of producing or consuming it.
>
> In contrast, there are a large number of packages already in existence that
> declare their dependencies in their EGG-INFO/depends.txt.  There are many
> tools -- I don't even know how many -- which already know how to produce and
> consume that dependency metadata.
>
> In fact, one such tool has a patch that I contributed myself to use that
> dependency metadata to automatically produce the Debian "Depends:"
> information [1].  I learned yesterday that there is a tool by David Malcolm
> to do likewise for Fedora RPM packages.
>
> We would gain power by continuing to use the format that is already
> implemented and deployed, instead of asking everyone to switch to a
> different format.

The point is not to switch to a different format but to make sure:

- we are able to read it without a setup.py magic call
- we do have everything needed in this metadata for OS vendors to
work with the package,
  and otherwise propose some extensions

>
> So it seems like the next step is to write a PEP that supersedes the parts
> of PEP-345 which are about dependency metadata and instead says that the
> standard way to encode Python dependency metadata is in the
> EGG-INFO/requires.txt file.
>

I would go further and say that we shouldn't have to run a command
to generate the EGG-INFO or PKG-INFO or whatever;

they should be available in the package, directly, in a flat file,

maybe in a "package_info.py" file, I don't know, or a .cfg file. But we
shouldn't depend
on a setup.py command call to read them or on a directory built by a command.
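
As a purely illustrative sketch of what such a flat, statically readable
file could contain (no such format is specified anywhere yet; the field
names simply mirror the metadata being discussed):

    # package_info.cfg (hypothetical), shipped at the root of the sdist:
    #
    #   [metadata]
    #   name = foo
    #   version = 2.0
    #   requires = bar >= 1.3
    #
    # which any tool, OS packaging included, could read without running setup.py:
    import ConfigParser   # Python 2

    cfg = ConfigParser.ConfigParser()
    cfg.read('package_info.cfg')
    print cfg.get('metadata', 'name'), cfg.get('metadata', 'version')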

That is one simple evolution I'd like to propose in the PEP I am
working on.


Regards,


> Regards,
>
> Zooko
>
> [1] https://code.launchpad.net/~astraw/stdeb/autofind-depends
> ---
> http://allmydata.org -- Tahoe, the Least-Authority Filesystem
> http://allmydata.com -- back up all your files for $5/month



-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread zooko

On Sep 29, 2008, at 6:09 AM, Tarek Ziadé wrote:


Now, the question is,  what would debian miss in here to install:

http://www.python.org/dev/peps/pep-0345/


It really seems to me that PEP-345's specification of dependency  
metadata is the wrong starting point.


There are not, to my knowledge, any Python packages in existence  
which use this form of dependency metadata, and there are not, to my  
knowledge, any Python tools which are capable of producing or  
consuming it.


In contrast, there are a large number of packages already in  
existence that declare their dependencies in their EGG-INFO/ 
depends.txt.  There are many tools -- I don't even know how many --  
which already know how to produce and consume that dependency metadata.


In fact, one such tool has a patch that I contributed myself to use  
that dependency metadata to automatically produce the Debian  
"Depends:" information [1].  I learned yesterday that there is a tool  
by David Malcolm to do likewise for Fedora RPM packages.


We would gain power by continuing to use the format that is already  
implemented and deployed, instead of asking everyone to switch to a  
different format.


So it seems like the next step is to write a PEP that supersedes the  
parts of PEP-345 which are about dependency metadata and instead says  
that the standard way to encode Python dependency metadata is in the  
EGG-INFO/requires.txt file.
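
For reference, requires.txt as setuptools writes it is just a flat list of
requirement specifiers, with per-extra sections in brackets; something like
this (contents illustrative):

    zope.interface >= 3.3
    simplejson

    [test]
    nose >= 0.10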


Regards,

Zooko

[1] https://code.launchpad.net/~astraw/stdeb/autofind-depends
---
http://allmydata.org -- Tahoe, the Least-Authority Filesystem
http://allmydata.com -- back up all your files for $5/month
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] Distutils Sprint

2008-09-30 Thread Tarek Ziadé
Hello

In order to continue the effort started here, we are organizing a
distutils "PEP sprint" with people that work differently to deliver
Python applications.

http://wiki.python.org/moin/DistributeSprint_#1

Some Python developers from the Debian world and a SCons specialist will
join. I'll bring the zc.buildout/setuptools point of view.

I would like to come up with enough material to write a meta-PEP and a
series of PEP. These PEPs will then be submitted here for further
discussions.

Please join !

Tarek

-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Tarek Ziadé
On Tue, Sep 30, 2008 at 10:42 AM, Nicolas Chauvat
<[EMAIL PROTECTED]> wrote:
> For example, if you require a minimal version of 1.4, you can
> translate this to a package version of 1.4; it is a bit hackish but
> will work if you handle epochs correctly. But if the package you
> depend on has a Provides: blah (1.4), you have no way to map that to a
> dependency, because you can't know what other versions of the package
> will provide.

I am not sure I fully understand, could you provide a real-world example?

>
> In all cases, it will be necessary to manually add shlibs-like
> information to the packages; they could be partly autogenerated like
> symbol files, but you need a mapping between provided modules and the
> first version of the package that provides it.


Is this related ? http://lists.debian.org/debian-dpkg/2006/01/msg00118.html
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] [pyconuk] "just use debian"

2008-09-30 Thread Nicolas Chauvat
Hi,

On Mon, Sep 29, 2008 at 02:09:15PM +0200, Tarek Ziadé wrote:
> That is exactly what was brought in the other thread in distutils-SIG,
> providing the package metadata in a simple way for os-vendors, without
> having to deal with things like setup.py
> 
> Then having third party applications that knows how to use them
> to install things in debian, or whatever the system is.
> 
> Now, the question is,  what would debian miss in here to install:
> 
> http://www.python.org/dev/peps/pep-0345/
> 
> If you can come up with a list of missing elements, we could
> probably start to work on a PEP together.

I started a thread on debian-python to ask for help:
http://lists.debian.org/debian-python/2008/09/msg00025.html

Here is the answer from Josselin Mouette, who is the author of
python-support, a tool that dramatically eases the packaging of Python
code for Debian.

-
On Monday 29 September 2008 at 15:12 +0200, Nicolas Chauvat wrote:
> Here is where we stand today:
>http://mail.python.org/pipermail/distutils-sig/2008-September/010126.html

This looks like a step in the right direction if we want to generate
inter-module dependencies.

Most things defined in the PEP will not be useful for packaging,
except for making something like a dh_make_python almost trivial to
write. The one thing that we'd almost certainly use is the Requires
and Provides fields.

However you should be careful with the notion of version. It is nice
to have a lot of flexibility in specifying versioned dependencies, but
the more stuff the standard allows, the more complicated it will be to
translate this into inter-package dependencies.

For example, if you require a minimal version of 1.4, you can
translate this to a package version of 1.4; it is a bit hackish but
will work if you handle epochs correctly. But if the package you
depend on has a Provides: blah (1.4), you have no way to map that to a
dependency, because you can't know what other versions of the package
will provide.

In all cases, it will be necessary to manually add shlibs-like
information to the packages; they could be partly autogenerated like
symbol files, but you need a mapping between provided modules and the
first version of the package that provides it.
-

Good to see this is moving forward :)

-- 
Nicolas Chauvat

logilab.fr - services in scientific computing and knowledge management
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig