Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/22/12 7:05 AM, Nick Coghlan wrote:

..

- I reject setup.cfg, as I believe ini-style configuration files are
not appropriate for a metadata format that needs to include file
listings and code fragments

I don't understand what the problem is with ini-style files, as they are
suitable for multi-line variables etc. (see zc.buildout).

yaml vs ini vs xxx seems to be an implementation detail, and my take on this
is that we have ConfigParser in the stdlib.
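
Tarek's point can be checked with the stdlib alone: ConfigParser already
handles the indented multi-line values that zc.buildout and setup.cfg rely
on. A minimal sketch (the section and option names are illustrative, not
from any real setup.cfg):

```python
from configparser import ConfigParser

# Multi-line values in ini files: continuation lines are simply indented,
# which is the zc.buildout / setup.cfg convention alluded to above.
SAMPLE = """\
[files]
packages =
    foo
    foo.bar
scripts =
    bin/tool
"""

parser = ConfigParser()
parser.read_string(SAMPLE)

# split() turns the indented continuation lines back into a list
packages = parser.get("files", "packages").split()
```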

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 4:25 PM, Stephen J. Turnbull  wrote:
> Paul Moore writes:
>
>  > End users should not need packaging tools on their machines.
>
> I think this desideratum is close to obsolete these days, with webapps
> in "the cloud" downloading resources (including, but not limited to,
> code) on an as-needed basis.

There's still a lot more to the software world than what happens on
the public internet.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 3:20 PM, Donald Stufft  wrote:
> I don't want to argue over implementation details as I think that is
> premature right now, so this concept has a big +1 from me. RPM, deb,
> etc. have a long history and a lot of shared knowledge, so looking at
> them and adapting them to work cross-platform is likely to be a huge win.

Right, much of what I wrote in that email should be taken as "this is
one way I think it *could* work", rather than "this is the way I think
it *should* work". In particular, any realistic attempt should also
look at what Debian based systems do differently from RPM based
systems.

I think the key elements are recognising that:
- an "sdist" contains three kinds of file:
  - package metadata
  - files to be installed directly on the target system
  - files needed to build other files
- a "bdist" also contains three kinds of file:
  - package metadata
  - files to be installed directly on the target system
  - files needed to correctly install and update other files

That means the key transformations to be defined are:

- source checkout -> sdist
  - need to define contents of sdist
  - need to define where any directly installed files are going to end up
- sdist -> bdist
  - need to define contents of bdist
  - need to define how to create the build artifacts
  - need to define where any installed build artifacts are going to end up
- bdist -> installed software
  - need to allow application developers to customise the installation process
  - need to allow system packages to customise where certain kinds of
file end up

The one *anti-pattern* I think we really want to avoid is a complex
registration system where customisation isn't as simple as saying
either:
- run this inline piece of code; or
- invoke this named function or class that implements the appropriate interface

The other main consideration is that we want the format to be easy to
read with general purpose tools, and that means something based on a
configuration file standard. YAML is the obvious choice at that point.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Stephen J. Turnbull
Paul Moore writes:

 > End users should not need packaging tools on their machines.

I think this desideratum is close to obsolete these days, with webapps
in "the cloud" downloading resources (including, but not limited to,
code) on an as-needed basis.

If you're *not* obtaining resources as-needed, but instead installing
an everything-you-could-ever-need SUMO, I don't see the problem with
including packaging tools as well.

Not to mention that "end user" isn't a permanent property of a person,
but rather a role that they can change at will and sometimes may be
forced to.

What is desirable is that such tools be kept in the back of a closet
where people currently in the "end user" role don't need to see them
at all, but developers can get them immediately when needed.



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Donald Stufft
On Friday, June 22, 2012 at 1:05 AM, Nick Coghlan wrote:
> 
> - I reject setup.cfg, as I believe ini-style configuration files are
> not appropriate for a metadata format that needs to include file
> listings and code fragments
> 
> - I reject bento.info (http://bento.info), as I think if we accept
> yet-another-custom-configuration-file-format into the standard library
> instead of just using YAML, we're even crazier than is already
> apparent
> 
> - I shall use "dist.yaml" as my proposed name for my "I wish I could
> define packages like this" format (and yes, that means adding yaml
> support to the standard library is part of the wish)
> 
> - many of the details below will be flawed, but I want to give a clear
> idea for how a concept like this might work in practice
> 
> - we need to define a clear set of build phases, and then design the
> dist metadata format accordingly. For example:
> - source
> - uses a "source" section in dist.yaml
> - "source/install" maps source files directly to desired
> install locations
> - essentially what the setup.cfg Resources section tries to do
> - used for pure Python code, documentation, etc
> - See below for example
> - "source/files" defines a list of extra files to be included
> - "source/exclude" defines the list of files to be excluded
> - "source/run" defines a Python fragment to be executed
> - serves a similar purpose to the "files" section in setup.cfg
> - creates a temporary directory (and sets it as the working directory)
> - dist.yaml is copied to the temporary directory
> - all files to be installed are copied to the temporary directory
> - all extra files are copied to the temporary directory
> - the Python fragment in "source/run" is executed (which can
> thus easily add more files)
> - if sdist archive creation is requested, entire contents of
> temporary directory are included
> - build
> - uses a "build" section in dist.yaml
> - "build/install" maps built files to desired install locations
> - like source/install, but for build artifacts
> - compiled C extensions, .pyc and .pyo files, etc would all go here
> - "build/run" defines a Python fragment to be executed
> - "build/files" defines the list of files to be included
> - "build/exclude" defines the list of files to be excluded
> - "build/requires" defines extra dependencies not needed at runtime
> - starting environment is a source directory that is either:
> - preexisting (e.g. to allow building in-place in the source tree)
> - created by running source first
> - created by unpacking an sdist archive
> - the Python fragment in "build/run" is executed to trigger the build
> - if the build succeeds (i.e. doesn't throw an exception)
> - create a temporary directory
> - copy dist.yaml
> - copy all specified files
> - this is the easiest way to exclude build artifacts from
> the distribution, while still keeping them around to enable
> incremental builds
> - if bdist_simple archive creation is requested, entire
> contents of temporary directory are included
> - other bdist formats (such as bdist_rpm) will have their own
> rules for getting from the bdist_simple format to the platform
> specific format
> - install
> - uses an "install" section in dist.yaml
> - "install/pre" defines a Python fragment to be executed
> before copying files
> - "install/post" defines a Python fragment to be executed
> after copying files
> - starting environment is a bdist_simple directory that is either:
> - preexisting (e.g. to allow creation by system packaging tools)
> - created by running build first
> - created by unpacking a bdist_simple archive
> - end result is a fully installed and usable piece of software
> - test
> - uses a "test" section in dist.yaml
> - "test/run" defines a Python fragment to be executed to start the tests
> - "test/requires" defines extra dependencies needed to run the
> test suite
> 
I dislike some of the (implementation) details, but in general I think this
is a good direction to go in. Less trying to force tools to work together by
hijacking setup.py or something, and more "this is a package, it contains
the data you need to install it, and how to install it; your installation
tool can use this data however it pleases to make sure it is installed." I
feel like this is (one of?) the missing pieces of the puzzle needed to
define a set of standards that _any_ package creation or installation tool
can implement and gain interoperability.

I don't want to argue over implementation details as I think that is
premature right now, so this concept has a big +1 from me. RPM, deb, etc.
have a long history and a lot of shared knowledge, so looking at them and
adapting them to work cross-platform is likely to be a huge win.


Re: [Python-Dev] PEP 362 - request for pronouncement

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 11:56 AM, Yury Selivanov
 wrote:
> Hello,
>
> On behalf of Brett, Larry, and myself, I'm requesting pronouncement of
> PEP 362.
>
> The PEP has been updated with all feedback from the python-dev list
> discussions. I'm posting the latest version of it with this message.
> The PEP is also available here: http://www.python.org/dev/peps/pep-0362/
> The python issue tracking the patch: http://bugs.python.org/issue15008
>
> The reception of the PEP was very positive, the API is minimalistic and
> future-proof; the implementation is stable, well tested, reviewed and should
> be ready to merge.

I'll also note that I've explicitly recused myself from being the
BDFL-Delegate for this one, as I had too much of a hand in the API
design.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 10:01 AM, Donald Stufft  wrote:
> The idea I'm hoping for is to stop worrying about one implementation over
> another, and instead to create a common format that all the tools can
> agree upon and create/install.

Right, and this is where it encouraged me to see in the Bento docs
that David had cribbed from RPM in this regard (although I don't
believe he has cribbed *enough*).

A packaging system really needs to cope with two very different levels
of packaging:
1. Source distributions (e.g. SRPMs). To get from this to useful
software requires developer tools.
2. "Binary" distributions (e.g. RPMs). To get from this to useful
software mainly requires a "file copy" utility (well, that and an
archive decompressor).

An SRPM is *just* a SPEC file and source tarball. That's it. To get
from that to an installed product, you have a bunch of additional
"BuildRequires" dependencies, along with %build and %install scripts
and a %files definition that define what will be packaged up and
included in the binary RPM. The exact nature of the metadata format
doesn't really matter, what matters is that it's a documented standard
that multiple tools can read.

An RPM includes files that actually get installed on the target
system. An RPM can be arch specific (if they include built binary
bits) or "noarch" if they're platform neutral.

distutils really only plays at the SRPM level - there is no defined OS
neutral RPM equivalent. That's why I brought up the bdist_simple
discussion earlier in the thread - if we can agree on a standard
bdist_simple format, then we can more cleanly decouple the "build"
step from the "install" step.

I think one of the key things to learn from the SPEC file format is
the configuration language it uses for the various build phases: sh
(technically, any shell on the system, but almost everyone just uses
the default system shell).

This is why you can integrate whatever build system you like with it:
so long as you can invoke the build from the shell, then you can use
it to make your RPM.

Now, there's an obvious problem with this: it's completely useless
from a *cross-platform* building point of view. Isn't it a shame
there's no language we could use that would let us invoke build
systems in a cross platform way? Oh, wait...
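
The implied answer is Python itself: a build hook that merely invokes the
chosen build tool is trivially portable. A hedged sketch of that idea
(`run_build_step` is a hypothetical helper, not a proposed API; the
"build tool" here is just a trivial Python one-liner so the example is
self-contained):

```python
import subprocess
import sys

def run_build_step(command):
    """Run one build phase the way an RPM %build script would: invoke an
    external tool and fail loudly on a nonzero exit status."""
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(result.stderr)
    return result.stdout

# Stand-in for "invoke your build system of choice" (waf, make, ...)
output = run_build_step([sys.executable, "-c", "print('built')"])
```

Because the hook is ordinary Python, the same fragment runs unchanged on
any platform with an interpreter, which is exactly what sh cannot offer.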

So here's some sheer pie-in-the-sky speculation. If people like
elements of this idea enough to run with it, great. If not... oh well:

- I believe the "egg" term has way too much negative baggage (courtesy
of easy_install), and find the full term Distribution to be too easily
confused with "Linux distribution". However, "Python dist" is
unambiguous (since the more typical abbreviation for an aggregate
distribution is "distro"). Thus, I attempt to systematically refer to
the objects used to distribute Python software from developers to
users as "dists". In practice, this terminology is already used in
many places (distutils, sdist, bdist_msi, bdist_rpm, the .dist-info
format in PEP 376 etc). Thus, Python software is distributed as dists
(either sdists or bdists), which may in turn be converted to distro
packages (e.g. SRPMs and RPMs) for deployment to particular
environments.

- I reject setup.cfg, as I believe ini-style configuration files are
not appropriate for a metadata format that needs to include file
listings and code fragments

- I reject bento.info, as I think if we accept
yet-another-custom-configuration-file-format into the standard library
instead of just using YAML, we're even crazier than is already
apparent

- I shall use "dist.yaml" as my proposed name for my "I wish I could
define packages like this" format (and yes, that means adding yaml
support to the standard library is part of the wish)

- many of the details below will be flawed, but I want to give a clear
idea for how a concept like this might work in practice

- we need to define a clear set of build phases, and then design the
dist metadata format accordingly. For example:
- source
  - uses a "source" section in dist.yaml
  - "source/install" maps source files directly to desired install locations
    - essentially what the setup.cfg Resources section tries to do
    - used for pure Python code, documentation, etc
    - See below for example
  - "source/files" defines a list of extra files to be included
  - "source/exclude" defines the list of files to be excluded
  - "source/run" defines a Python fragment to be executed
    - serves a similar purpose to the "files" section in setup.cfg
  - creates a temporary directory (and sets it as the working directory)
  - dist.yaml is copied to the temporary directory
  - all files to be installed are copied to the temporary directory
  - all extra files are copied to the temporary directory
  - the Python fragment in "source/run" is executed (which can thus
    easily add more files)
  - if sdist archive creation is requested, entire contents of the
    temporary directory are included
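
As a rough illustration of the proposed "source" phase, here is a hedged
Python sketch. The dist.yaml content is represented as a plain dict (the
format itself is hypothetical and yaml support is part of the wish, so no
YAML parsing is attempted), and every name and path below is illustrative:

```python
import shutil
import tempfile
from pathlib import Path

# Stand-in for a parsed dist.yaml; "{purelib}" is a placeholder for an
# install location category, unused in this staging-only sketch.
DIST = {
    "source": {
        "install": {"src/foo.py": "{purelib}/foo.py"},
        "files": ["README"],
        "run": "print('source hook ran')",
    },
}

def run_source_phase(dist, project_dir):
    """Sketch of the 'source' phase: stage the listed files into a
    temporary directory, then execute the inline Python fragment."""
    staging = Path(tempfile.mkdtemp())
    for relpath in list(dist["source"]["install"]) + dist["source"]["files"]:
        target = staging / relpath
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy(Path(project_dir) / relpath, target)
    # The "source/run" fragment may add more files to the staging tree
    exec(dist["source"]["run"], {})
    return staging

# Minimal demo project to run the phase against
proj = Path(tempfile.mkdtemp())
(proj / "src").mkdir()
(proj / "src" / "foo.py").write_text("x = 1\n")
(proj / "README").write_text("demo\n")

sdist_dir = run_source_phase(DIST, proj)
```

Archiving `sdist_dir` wholesale would then yield the sdist described above.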

[Python-Dev] PEP 362 - request for pronouncement

2012-06-21 Thread Yury Selivanov
Hello,

On behalf of Brett, Larry, and myself, I'm requesting pronouncement of
PEP 362.

The PEP has been updated with all feedback from the python-dev list
discussions. I'm posting the latest version of it with this message.
The PEP is also available here: http://www.python.org/dev/peps/pep-0362/
The python issue tracking the patch: http://bugs.python.org/issue15008

The reception of the PEP was very positive, the API is minimalistic and
future-proof; the implementation is stable, well tested, reviewed and should
be ready to merge.


Thank you,
Yury


PEP: 362
Title: Function Signature Object
Version: $Revision$
Last-Modified: $Date$
Author: Brett Cannon , Jiwon Seo ,
Yury Selivanov , Larry Hastings 

Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 21-Aug-2006
Python-Version: 3.3
Post-History: 04-Jun-2012


Abstract
========

Python has always supported powerful introspection capabilities,
including introspecting functions and methods (for the rest of
this PEP, "function" refers to both functions and methods).  By
examining a function object you can fully reconstruct the function's
signature.  Unfortunately this information is stored in an inconvenient
manner, and is spread across a half-dozen deeply nested attributes.

This PEP proposes a new representation for function signatures.
The new representation contains all necessary information about a function
and its parameters, and makes introspection easy and straightforward.

However, this object does not replace the existing function
metadata, which is used by Python itself to execute those
functions.  The new metadata object is intended solely to make
function introspection easier for Python programmers.


Signature Object
================

A Signature object represents the call signature of a function and
its return annotation.  For each parameter accepted by the function
it stores a `Parameter object`_ in its ``parameters`` collection.

A Signature object has the following public attributes and methods:

* return_annotation : object
The "return" annotation for the function. If the function
has no "return" annotation, this attribute is not set.

* parameters : OrderedDict
An ordered mapping of parameters' names to the corresponding
Parameter objects.

* bind(\*args, \*\*kwargs) -> BoundArguments
Creates a mapping from positional and keyword arguments to
parameters.  Raises a ``TypeError`` if the passed arguments do
not match the signature.

* bind_partial(\*args, \*\*kwargs) -> BoundArguments
Works the same way as ``bind()``, but allows the omission
of some required arguments (mimics ``functools.partial``
behavior.)  Raises a ``TypeError`` if the passed arguments do
not match the signature.

* replace(parameters, \*, return_annotation) -> Signature
Creates a new Signature instance based on the instance
``replace`` was invoked on.  It is possible to pass different
``parameters`` and/or ``return_annotation`` to override the
corresponding properties of the base signature.  To remove
``return_annotation`` from the copied ``Signature``, pass in
``Signature.empty``.

Signature objects are immutable.  Use ``Signature.replace()`` to
make a modified copy:
::

>>> def foo() -> None:
... pass
>>> sig = signature(foo)

>>> new_sig = sig.replace(return_annotation="new return annotation")
>>> new_sig is not sig
True
>>> new_sig.return_annotation != sig.return_annotation
True
>>> new_sig.parameters == sig.parameters
True

>>> new_sig = new_sig.replace(return_annotation=new_sig.empty)
>>> hasattr(new_sig, "return_annotation")
False

There are two ways to instantiate a Signature class:

* Signature(parameters, \*, return_annotation)
Default Signature constructor.  Accepts an optional sequence
of ``Parameter`` objects, and an optional ``return_annotation``.
Parameters sequence is validated to check that there are no
parameters with duplicate names, and that the parameters
are in the right order, i.e. positional-only first, then
positional-or-keyword, etc.
* Signature.from_function(function)
Returns a Signature object reflecting the signature of the
function passed in.
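
The constructor path eventually shipped in the stdlib ``inspect`` module
(where ``from_function`` was later superseded by ``inspect.signature``); a
sketch using today's stdlib names rather than the PEP's draft spelling:

```python
from inspect import Parameter, Signature

# Build a Signature directly from Parameter objects, supplying them in
# valid order: positional-or-keyword 'a', then 'b' with a default.
params = [
    Parameter("a", Parameter.POSITIONAL_OR_KEYWORD),
    Parameter("b", Parameter.POSITIONAL_OR_KEYWORD, default=2),
]
sig = Signature(params)
```

Passing the parameters out of order (e.g. a defaulted parameter before a
required one of the same kind) raises ``ValueError`` during validation.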

It's possible to test Signatures for equality.  Two signatures are
equal when their parameters are equal, their positional and
positional-only parameters appear in the same order, and they
have equal return annotations.

Changes to the Signature object, or to any of its data members,
do not affect the function itself.

Signature also implements ``__str__``:
::

>>> str(Signature.from_function((lambda *args: None)))
'(*args)'

>>> str(Signature())
'()'
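
For completeness, here is how ``bind()`` behaves with the API as it landed
in the ``inspect`` module (``apply_defaults`` is a later stdlib addition,
not part of this PEP's text; the ``connect`` function is illustrative):

```python
import inspect

def connect(host, port=8080, *, timeout=None):
    return (host, port, timeout)

sig = inspect.signature(connect)

# bind() maps call arguments onto parameters without calling the function;
# apply_defaults() then fills in the parameters left unbound.
bound = sig.bind("localhost", timeout=5)
bound.apply_defaults()
```

A call that does not match the signature, such as ``sig.bind()`` with the
required ``host`` missing, raises ``TypeError`` as the PEP specifies.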


Parameter Object
================

Python's expressive syntax means functions can accept many different
kinds of parameters with many subtle semantic differences.  We
propose a rich Parameter object designed to represent any possible
function parameter.

A P

Re: [Python-Dev] Accepting PEP 397

2012-06-21 Thread Mark Hammond

On 22/06/2012 8:14 AM, Brian Curtin wrote:

On Wed, Jun 20, 2012 at 11:54 AM, Brian Curtin  wrote:

As the PEP czar for 397, after Martin's final updates, I hereby
pronounce this PEP "accepted"!

Thanks to Mark Hammond for kicking it off, Vinay Sajip for writing up
the code, Martin von Loewis for recent updates, and everyone in the
community who contributed to the discussions.

I will begin integration work this evening.


It's in. http://hg.python.org/cpython/rev/a7ecbb2ad967

Thanks all!


Awesome - thank you!

Mark


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Donald Stufft
On Thursday, June 21, 2012 at 7:34 PM, Alex Clark wrote:
> Hi,
> 
> On 6/21/12 5:38 PM, Donald Stufft wrote:
> > On Thursday, June 21, 2012 at 4:01 PM, Paul Moore wrote:
> > > End users should not need packaging tools on their machines.
> > 
> > Sort of riffing on this idea, I cannot seem to find a specification
> > for what a Python package actually is.
> >
> 
> 
> 
> FWIW according to distutils[1], a package is: a module or modules inside 
> another module[2]. So e.g.::
> 
> 
> foo.py is a module
> 
> 
> and:
> 
> foo/__init__.py
> foo/foo.py
> 
> is a simple package containing the following modules:
> 
> import foo, foo.foo
> 
> 
> Alex
> 
> 
> [1] 
> http://docs.python.org/distutils/introduction.html#general-python-terminology
> 
> [2] And a distribution is a compressed archive of a package, in case 
> that's not clear.
> 
Right, I'm actually talking about distributions (as is everyone else in
this thread). And a definition is not a specification.

What I'm trying to get at is a standard package format where all the
metadata can be read without the packaging lib (distutils/setuptools cannot
read metadata without using distutils or setuptools). It would need to be
required that this serves as the one true source of metadata, and that
other tools can add certain types of metadata to this format.

If, say, distutils2 wrote a package that adhered to a certain standard, and
wrote all the information that distutils2 knows about how to install said
package (what files, names, versions, dependencies, etc.) to a file (say
PKG-INFO) that contained only "common" standard information, then another
tool (say bento) could take that package and install it.

The idea I'm hoping for is to stop worrying about one implementation over
another, and instead to create a common format that all the tools can agree
upon and create/install.
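
The PKG-INFO idea above already works with nothing but the stdlib, since
the file is a set of RFC 822-style headers that the ``email`` machinery can
read without importing any packaging tool. A small sketch (the field values
are made up):

```python
from email.parser import HeaderParser

# PKG-INFO files are RFC 822-style headers, so no packaging lib is needed
PKG_INFO = """\
Metadata-Version: 1.2
Name: example-dist
Version: 1.0
Requires-Dist: requests
Requires-Dist: lxml
"""

meta = HeaderParser().parsestr(PKG_INFO)
name = meta["Name"]
requires = meta.get_all("Requires-Dist")  # repeated fields become a list
```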



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Alex Clark

Hi,

On 6/21/12 5:38 PM, Donald Stufft wrote:

On Thursday, June 21, 2012 at 4:01 PM, Paul Moore wrote:

End users should not need packaging tools on their machines.


Sort of riffing on this idea, I cannot seem to find a specification for
what a Python package actually is.



FWIW according to distutils[1], a package is: a module or modules inside 
another module[2]. So e.g.::



  foo.py is a module


and:

  foo/__init__.py
  foo/foo.py

is a simple package containing the following modules:

  import foo, foo.foo


Alex


[1] 
http://docs.python.org/distutils/introduction.html#general-python-terminology


[2] And a distribution is a compressed archive of a package, in case 
that's not clear.





Maybe the first effort should focus on this instead of arguing one
implementation over another.

As a packager:
  I should not (in general) care what tool (pip, pysetup, easy_install,
  buildout, whatever) is used to install my package. My package should
  just describe what to do to install itself.

As an end user:
  I should not (in general) care what tool was used to create a package
  (setuptools, bento, distutils, whatever). My tool of choice should look
  at the package and perform the operations that the package says are
  needed for install.

Ideally the package could have some basic primitives that are enough to
tell the package installer tool what to do to install it. These primitives
should be enough to cover the common cases (pure Python modules at the
very least, maybe additionally some C modules). Now, as others have
remarked, it would be insane to attempt to do this in every case, as it
would involve writing a build system more advanced than anything else in
existence; so a required primitive would be something that allows calling
out to a package-specified build system (waf, make, whatever) to handle
the build configuration.

The eventual end goal here is to turn a package from something that varies
from implementation to implementation into a standardized format that any
number of tools can build on top of. It would likely include some things
defining where metadata MUST be defined.

For instance, if metadata in setuptools were "compiled" down to a static
file, and easy_install, pip et al. used that static file to install from
instead of executing setup.py, then the end user would not have required
setuptools installed, and instead any number of tools could have been
created that utilized that data.





--
Alex Clark · http://pythonpackages.com





Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 11:55 PM, David Cournapeau wrote:


I think there is a misunderstanding of what bento is: bento is not a 
compiler or anything like that. It is a set of libraries that work 
together to configure, build and install a python project.


Concretely, in bento, there is:
  - a part that builds a package description (Distribution-like in
distutils parlance) from a bento.info (a bit like setup.cfg)
  - a set of tools and commands around this package description
  - a set of "backends" to e.g. use waf to build C extensions with full
and automatic dependency analysis (rebuild this if this other thing is
out of date), parallel builds and configuration. Bento scripts build
numpy more efficiently and reliably while being 50% shorter than our
setup.py.
  - a small library to build a distutils-compatible Distribution, so
that you can write a 3-line setup.py that takes all its info from
bento.info and allows pip to work.


Now, you could produce a similar package description from the
setup.cfg to be fed to bento, but I don't really see the point since,
AFAIK, bento.info is strictly more powerful as a format than setup.cfg.




So that means that *today*, Bento can consume Distutils2 projects and
compile them, just by reading their setup.cfg, right?


And the code you have to convert setup.cfg into bento.info is what I was 
talking about.


It means that I can create a project without a setup.py file, and just 
setup.cfg, and have it working with distutils2 *or* bento


That's *exactly* what I was talking about. the setup.cfg is the *common* 
standard, and is planned to be published at PyPI statically.


Let people out there use their tool of their choice to install a project 
defined by a setup.cfg


So, 2 questions:

1/ does Bento install things following PEP 376?

2/ how do the setup.cfg hooks work wrt Bento?

And one last proposal: how would a PEP that defines a setup.cfg standard
that is Bento-friendly, but still distutils2-friendly, sound?




Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread David Cournapeau
On Thu, Jun 21, 2012 at 11:00 PM, Antoine Pitrou wrote:

> On Thu, 21 Jun 2012 22:46:58 +0200
> Dag Sverre Seljebotn  wrote:
> > > The other thing is, the folks in distutils2 and myself, have zero
> > > knowledge about compilers. That's why we got very frustrated not to see
> > > people with that knowledge come and help us in this area.
> >
> > Here's the flip side: If you have zero knowledge about compilers, it's
> > going to be almost impossible to have a meaningful discussion about a
> > compilation PEP.
>
> If a PEP is being discussed, even a packaging PEP, it involves all of
> python-dev, so Tarek and Éric not being knowledgeable in compilers is
> not a big problem.
>
> > The necessary prerequisites in this case is not merely "knowledge of
> > compilers". To avoid repeating mistakes of the past, the prerequisites
> > for a meaningful discussion is years of hard-worn experience building
> > software in various languages, on different platforms, using different
> > build tools.
>
> This is precisely the kind of knowledge that a PEP is aimed at
> distilling.
>

What would you imagine such a PEP would contain? If you don't need to
customize the compilation, then I would say refactoring what's in distutils
is good enough. If you need customization, then I am convinced one should
just use one of the existing build tools (waf, fbuild, scons, etc…). Python
has more than enough of them already.

By refactoring, I mean extracting it completely from the command machinery,
and having an API similar to e.g. fbuild
(https://github.com/felix-lang/fbuild/blob/master/examples/c/fbuildroot.py),
i.e. you basically have a class with a
PythonBuilder.build_extension(name, sources, options) method. The key point
is to remove any dependency on commands. If fbuild were not
python3-specific, I would say just use that. It would cover most use cases.
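
A hedged sketch of what such a command-free API might look like.
``PythonBuilder`` and its method are hypothetical names taken from the
paragraph above (only the ``sysconfig`` calls are real), and the sketch
merely assembles a compile command rather than running one:

```python
import sysconfig


class PythonBuilder:
    """Hypothetical command-free build API in the spirit of fbuild's
    builders: one explicit method, no distutils Command objects."""

    def __init__(self, cc="cc"):
        self.cc = cc
        # Directory containing Python.h, needed to compile extensions
        self.include = sysconfig.get_path("include")

    def build_extension_command(self, name, sources, options=()):
        """Return the compile command for one extension module; a real
        implementation would execute it and track dependencies."""
        ext = sysconfig.get_config_var("EXT_SUFFIX") or ".so"
        return [self.cc, "-shared", "-fPIC", f"-I{self.include}",
                *options, *sources, "-o", f"{name}{ext}"]


builder = PythonBuilder()
cmd = builder.build_extension_command("spam", ["spammodule.c"], ["-O2"])
```

The point of the design is visible in the call site: everything the build
needs is passed in explicitly, so any front end could drive it.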

Actually,




> Regards
>
> Antoine.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/22/2012 12:05 AM, Dag Sverre Seljebotn wrote:

On 06/21/2012 11:04 PM, Tarek Ziadé wrote:

On 6/21/12 10:46 PM, Dag Sverre Seljebotn wrote:
...

I think we should, as you proposed, list a few projects w/ compilation
needs -- from the simplest to the more complex, then see how a standard
*description* could be used by any tool


It's not clear to me what you mean by description. Package metadata,
install information or description of what/how to build?

I hope you don't mean the latter, that would be insane...it would
effectively amount to creating a build tool that's both more elegant
and more powerful than any option that's currently already out there.

Assuming you mean the former, that's what David did to create Bento.
Reading and understanding Bento and the design decisions going into it
would be a better use of time than redoing a discussion, and would at
least be a very good starting point.


What I mean is : what would it take to use Bento (or another tool) as
the compiler in a distutils-based project, without having to change the
distutils metadata.


As for current distutils/setuptools/distribute metadata, the idea is you
run the bento conversion utility to convert it to Bento metadata, then
use Bento.

Please read

http://cournape.github.com/Bento/

There may be packages where this doesn't work and you'd need to tweak
the results yourself though.


Here's the flip side: If you have zero knowledge about compilers, it's
going to be almost impossible to have a meaningful discussion about a
compilation PEP. It's very hard to discuss standards unless everybody
involved has the necessary prerequisite knowledge. You don't go
discussing details of the Linux kernel without some solid C experience
either.

Consider me as the end user who wants to have his 2 C modules compiled
in his Python project.


OK, so can I propose that you kill off distutils2 and use bento
wholesale instead?

Obviously not. So you're not just an end-user. That illusion would wear
rather thin very quickly.


I regret this comment, it's not helpful to the discussion. Trying again:

David's numscons project was a large effort and it tried to integrate a 
proper build system (scons) with distutils. That effort didn't in the 
end go anywhere.


But I think it did show that everything is coupled to everything, and 
that build system integration (and other "special" needs of the scipy 
community) affects everything in the package system.


It's definitely not as simple as having somebody with compiler 
experience chime in on the isolated topic of how to build extensions. 
It's something that needs to drive the entire design process. Which is 
perhaps why it is difficult to have a package system designed by people 
who don't know compilers to be usable by people who need to use them in 
non-trivial ways.


Dag





The necessary prerequisites in this case are not merely "knowledge of
compilers". To avoid repeating mistakes of the past, the prerequisites
for a meaningful discussion are years of hard-won experience building
software in various languages, on different platforms, using different
build tools.

Look, these problems are really hard to deal with. Myself I have
experience with building 2-3 languages using 2-3 build tools on 2
platforms, and I consider myself a complete novice and usually decide
to trust David's instincts over trying to make up an opinion of my own
-- simply because I know he's got a lot more experience than I have.

Theoretically it is possible to separate and isolate concerns so that
one set of people discuss build integration and another set of people
discuss installation. Problem is that all the problems tangle -- in
particular when the starting point is distutils!

That's why *sometimes*, not always, design by committee is the wrong
approach, and one-man-shows is what brings technology forwards.


I am not saying this should be designed by a committee, but rather - if
such a tool can be made compatible with simple Distutils project, the
guy behind this tool can probably help on a PEP with feedback from a
larger audience than the sci community.

What bugs me is to say that we live in two separate worlds and cannot
build common pieces. This is not True.


I'm not saying it's *impossible* to build common pieces, I'm suggesting
that it's not cost-effective in terms of man-hours going into it. And
the problem isn't technical as much as social and the mix of people and
skill sets involved.

But David really made that decision for me when he left distutils-sig,
I'm not going to spend my own time and energy trying to get decent
builds shoehorned into distutils2 when he is busy working on a solution.

(David already spent loads of time on trying to integrate scons with
distutils (the numscons project) and maintained numpy.distutils and
scipy builds for years; I trust his judgement above pretty much anybody
else's.)


So, I reiterate my proposal, and it could also be expressed like this:

1/ David writes a PEP where he describes how Bento interacts with a
project -- metadata, description files, etc
2/ Someone from distutils2 completes the PEP by describing how setup.cfg
works wrt Extensions
3/ we see if we can have a common standard even if it's a subset of
bento capabilities

Re: [Python-Dev] Accepting PEP 397

2012-06-21 Thread Brian Curtin
On Wed, Jun 20, 2012 at 11:54 AM, Brian Curtin  wrote:
> As the PEP czar for 397, after Martin's final updates, I hereby
> pronounce this PEP "accepted"!
>
> Thanks to Mark Hammond for kicking it off, Vinay Sajip for writing up
> the code, Martin von Loewis for recent updates, and everyone in the
> community who contributed to the discussions.
>
> I will begin integration work this evening.

It's in. http://hg.python.org/cpython/rev/a7ecbb2ad967

Thanks all!


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/21/2012 11:04 PM, Tarek Ziadé wrote:

On 6/21/12 10:46 PM, Dag Sverre Seljebotn wrote:
...

I think we should, as you proposed, list a few projects w/ compilation
needs -- from the simplest to the more complex, then see how a standard
*description* could be used by any tool


It's not clear to me what you mean by description. Package metadata,
install information or description of what/how to build?

I hope you don't mean the latter, that would be insane...it would
effectively amount to creating a build tool that's both more elegant
and more powerful than any option that's currently already out there.

Assuming you mean the former, that's what David did to create Bento.
Reading and understanding Bento and the design decisions going into it
would be a better use of time than redoing a discussion, and would at
least be a very good starting point.


What I mean is : what would it take to use Bento (or another tool) as
the compiler in a distutils-based project, without having to change the
distutils metadata.


As for current distutils/setuptools/distribute metadata, the idea is you 
run the bento conversion utility to convert it to Bento metadata, then 
use Bento.


Please read

http://cournape.github.com/Bento/

There may be packages where this doesn't work and you'd need to tweak 
the results yourself though.



Here's the flip side: If you have zero knowledge about compilers, it's
going to be almost impossible to have a meaningful discussion about a
compilation PEP. It's very hard to discuss standards unless everybody
involved has the necessary prerequisite knowledge. You don't go
discussing details of the Linux kernel without some solid C experience
either.

Consider me as the end user who wants to have his 2 C modules compiled
in his Python project.


OK, so can I propose that you kill off distutils2 and use bento 
wholesale instead?


Obviously not. So you're not just an end-user. That illusion would wear 
rather thin very quickly.




The necessary prerequisites in this case are not merely "knowledge of
compilers". To avoid repeating mistakes of the past, the prerequisites
for a meaningful discussion are years of hard-won experience building
software in various languages, on different platforms, using different
build tools.

Look, these problems are really hard to deal with. Myself I have
experience with building 2-3 languages using 2-3 build tools on 2
platforms, and I consider myself a complete novice and usually decide
to trust David's instincts over trying to make up an opinion of my own
-- simply because I know he's got a lot more experience than I have.

Theoretically it is possible to separate and isolate concerns so that
one set of people discuss build integration and another set of people
discuss installation. Problem is that all the problems tangle -- in
particular when the starting point is distutils!

That's why *sometimes*, not always, design by committee is the wrong
approach, and one-man-shows is what brings technology forwards.


I am not saying this should be designed by a committee, but rather - if
such a tool can be made compatible with simple Distutils project, the
guy behind this tool can probably help on a PEP with feedback from a
larger audience than the sci community.

What bugs me is to say that we live in two separate worlds and cannot
build common pieces. This is not True.


I'm not saying it's *impossible* to build common pieces, I'm suggesting 
that it's not cost-effective in terms of man-hours going into it. And 
the problem isn't technical as much as social and the mix of people and 
skill sets involved.


But David really made that decision for me when he left distutils-sig, 
I'm not going to spend my own time and energy trying to get decent 
builds shoehorned into distutils2 when he is busy working on a solution.


(David already spent loads of time on trying to integrate scons with 
distutils (the numscons project) and maintained numpy.distutils and 
scipy builds for years; I trust his judgement above pretty much anybody 
else's.)



So, I reiterate my proposal, and it could also be expressed like this:

1/ David writes a PEP where he describes how Bento interact with a
project -- metadata, description files, etc
2/ Someone from distutils2 completes the PEP by describing how setup.cfg
works wrt Extensions
3/ we see if we can have a common standard even if it's a subset of
bento capabilities


bento isn't a build tool, it's a packaging tool, competing directly
with distutils2. It can deal with simple distutils-like builds using a
bundled build tool, and currently has integration with waf for
complicated builds; integration with other build systems will
presumably be added later as people need it (the main point is that
bento is designed for it).

I am not interested in Bento-the-tool. I am interested in what such a
tool needs from a project to use it =>


Again, you should read the elevator pitch at 
http://cournape.github.com/Bento/ + the Bento documentation.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 22:46:58 +0200
Dag Sverre Seljebotn  wrote:
> > The other thing is, the folks in distutils2 and myself, have zero
> > knowledge about compilers. That's why we got very frustrated not to see
> > people with that knowledge come and help us in this area.
> 
> Here's the flip side: If you have zero knowledge about compilers, it's 
> going to be almost impossible to have a meaningful discussion about a 
> compilation PEP.

If a PEP is being discussed, even a packaging PEP, it involves all of
python-dev, so Tarek and Éric not being knowledgeable in compilers is
not a big problem.

> The necessary prerequisites in this case is not merely "knowledge of 
> compilers". To avoid repeating mistakes of the past, the prerequisites 
> for a meaningful discussion is years of hard-worn experience building 
> software in various languages, on different platforms, using different 
> build tools.

This is precisely the kind of knowledge that a PEP is aimed at
distilling.

Regards

Antoine.




Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread David Cournapeau
On Thu, Jun 21, 2012 at 10:04 PM, Tarek Ziadé  wrote:

> On 6/21/12 10:46 PM, Dag Sverre Seljebotn wrote:
> ...
>
>  I think we should, as you proposed, list a few projects w/ compilation
>>> needs -- from the simplest to the more complex, then see how a standard
>>> *description* could be used by any tool
>>>
>>
>> It's not clear to me what you mean by description. Package metadata,
>> install information or description of what/how to build?
>>
>> I hope you don't mean the latter, that would be insane...it would
>> effectively amount to creating a build tool that's both more elegant and
>> more powerful than any option that's currently already out there.
>>
>> Assuming you mean the former, that's what David did to create Bento.
>> Reading and understanding Bento and the design decisions going into it
>> would be a better use of time than redoing a discussion, and would at least
>> be a very good starting point.
>>
>
> What I mean is : what would it take to use Bento (or another tool) as the
> compiler in a distutils-based project, without having to change the
> distutils metadata.


I think there is a misunderstanding of what bento is: bento is not a
compiler or anything like that. It is a set of libraries that work together
to configure, build and install a Python project.

Concretely, in bento, there is
  - a part that builds a package description (Distribution-like in
distutils-parlance) from a bento.info (a bit like setup.cfg)
  - a set of command tools around this package description.
  - a set of "backends" to e.g. use waf to build C extensions with full and
automatic dependency analysis (rebuild this if this other thing is out of
date), parallel builds and configuration. Bento scripts build numpy more
efficiently and reliably while being 50 % shorter than our setup.py.
  - a small library to build a distutils-compatible Distribution so that
you can write a 3-line setup.py that takes all its info from
bento.info and allows pip to work.

Now, you could produce a similar package description from the setup.cfg to
be fed to bento, but I don't really see the point since AFAIK,
bento.info is strictly more powerful as a format than setup.cfg.

Another key point is that the commands around this package description are
almost entirely decoupled from each other: this is the hard part, and
something that is not really possible to do with the current distutils
design in an incremental way.

  - Commands don't know about each other, and dependencies between commands
are *external* to commands. You say command "build" depends on command
"configure", and those dependencies are resolved at runtime. This allows
3rd parties to insert new commands without interfering with each other.
  - options are registered and handled outside commands as well: each
command can query any other command's options. I believe something similar
is now available in distutils2, though. Bento allows adding arbitrary
configure options to customize library directories (a la autoconf).
  - bento internally has an explicit "database" of built files, with
associated categories, and the build command produces a build "manifest".
The build manifest + the build tree completely define the input for the
install and installer commands. The different binary installers use the
same build manifest, and the build manifest is actually designed so as to
allow lossless conversion between different installers (e.g. wininst <->
msi, egg <-> mpkg on mac, etc…). This is what in principle allows using
make, gyp, etc… to produce this build manifest
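To make the first point concrete, here is a minimal illustration of commands whose ordering constraints live in an external table and are resolved at runtime. This is my own sketch of the design idea, not bento's actual API; the decorator, registry, and `run` function are invented for illustration.

```python
# Sketch of externally declared command dependencies. Commands never
# name each other; ordering lives in a separate table resolved at run
# time, so third parties can add commands without touching existing ones.
registry = {}   # command name -> callable
depends = {}    # command name -> list of prerequisite command names

def command(name, requires=()):
    def register(fn):
        registry[name] = fn
        depends[name] = list(requires)
        return fn
    return register

def run(name, done=None):
    # Run prerequisites first, each at most once.
    done = set() if done is None else done
    for dep in depends[name]:
        if dep not in done:
            run(dep, done)
    registry[name]()
    done.add(name)

log = []

@command("configure")
def configure():
    log.append("configure")

@command("build", requires=["configure"])
def build():
    log.append("build")

run("build")
print(log)  # ['configure', 'build']
```

A third-party tool could insert, say, a code-generation command between "configure" and "build" by editing only the dependency table.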


>
> "It can deal with simple distutils-like builds using a bundled build tool"
>  => If I understand this correctly, does that mean that Bento can build a
> distutils project with the distutils Metadata ?
>

I think Dag meant that bento has a system where you can basically do

# setup.py
from distutils.core import setup
import bento.distutils
bento.distutils.monkey_patch()
setup()

and this setup.py will automatically build a distutils Distribution
populated from bento.info. This allows a bento package to be installable
with pip or anything that expects a setup.py.

This allows for interoperability without having to depend on all the
distutils issues.

David


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Donald Stufft
On Thursday, June 21, 2012 at 4:01 PM, Paul Moore wrote:
> End users should not need packaging tools on their machines.
> 
Sort of riffing on this idea, I cannot seem to find a specification for what a 
Python
package actually is. Maybe the first effort should focus on this instead of 
arguing one
implementation or another.  

As a packager:
I should not (in general) care what tool (pip, pysetup, easy_install,
buildout, whatever) is used to install my package. My package should just
describe what to do to install itself.

As a end user:
   I should not (in general) care what tool was used to create a package 
(setuptools, bento, distutils,
   whatever). My tool of choice should look at the package and perform the
operations that the package
   says are needed for install.

Ideally the package could have some basic primitives that are enough to tell
the package installer tool what to do to install it. These primitives should
be enough to cover the common cases (pure Python modules at the very least,
maybe additionally some C modules). Now as others have remarked it would
be insane to attempt to do this in every case as it would involve writing a
build system that is more advanced than anything else existing, so a required
primitive would be something that allows calling out to a package-specified
build system (waf, make, whatever) to handle the build configuration.

The eventual end goal here is to turn a package from something that varies
from implementation to implementation into a standardized format that any
number of tools can build on top of. It would likely include some things
defining where metadata MUST be defined.

For instance, if metadata in setuptools was "compiled" down to a static
file, and easy_install, pip, et al. used that static file to install from
instead of executing setup.py, then the end user would not need setuptools
installed, and instead any number of tools could have been created that
utilized that data.
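As an illustration of that last idea, metadata could be "compiled" once at build time into a static file that any installer reads without executing Python. The field names and file name below are invented for illustration, not a proposed standard.

```python
# Sketch: the packaging tool writes metadata to a static file once...
import json
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "PKG-METADATA.json")

metadata = {
    "name": "example",
    "version": "1.0",
    "requires": ["requests"],
}
with open(path, "w") as f:
    json.dump(metadata, f)

# ...and any installer can later read it without running setup.py.
with open(path) as f:
    meta = json.load(f)
print(meta["name"], meta["requires"])  # example ['requests']
```

The format itself barely matters; what matters is that the installer side needs only a parser, not the packaging tool that produced the file.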


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 10:46 PM, Dag Sverre Seljebotn wrote:
...

I think we should, as you proposed, list a few projects w/ compilation
needs -- from the simplest to the more complex, then see how a standard
*description* could be used by any tool


It's not clear to me what you mean by description. Package metadata, 
install information or description of what/how to build?


I hope you don't mean the latter, that would be insane...it would 
effectively amount to creating a build tool that's both more elegant 
and more powerful than any option that's currently already out there.


Assuming you mean the former, that's what David did to create Bento. 
Reading and understanding Bento and the design decisions going into it 
would be a better use of time than redoing a discussion, and would at 
least be a very good starting point.


What I mean is : what would it take to use Bento (or another tool) as 
the compiler in a distutils-based project, without having to change the 
distutils metadata.





But anyway, some project types from simple to advanced:

 - Simple library using Cython + NumPy C API
 - Wrappers around HPC codes like mpi4py, petsc4py
 - NumPy
 - SciPy (uses Fortran compilers too)
 - Library using code generation, Cython, NumPy C API, Fortran 90 
code, some performance tuning with CPU characteristics (instruction 
set, cache size, optimal loop structure) decided compile-time


I'd add:

- A Distutils project with a few Extensions

The other thing is, the folks in distutils2 and myself, have zero
knowledge about compilers. That's why we got very frustrated not to see
people with that knowledge come and help us in this area.


Here's the flip side: If you have zero knowledge about compilers, it's 
going to be almost impossible to have a meaningful discussion about a 
compilation PEP. It's very hard to discuss standards unless everybody 
involved has the necessary prerequisite knowledge. You don't go 
discussing details of the Linux kernel without some solid C experience 
either.
Consider me as the end user who wants to have his 2 C modules compiled 
in his Python project.




The necessary prerequisites in this case are not merely "knowledge of 
compilers". To avoid repeating mistakes of the past, the prerequisites 
for a meaningful discussion are years of hard-won experience building 
software in various languages, on different platforms, using different 
build tools.


Look, these problems are really hard to deal with. Myself I have 
experience with building 2-3 languages using 2-3 build tools on 2 
platforms, and I consider myself a complete novice and usually decide 
to trust David's instincts over trying to make up an opinion of my own 
-- simply because I know he's got a lot more experience than I have.


Theoretically it is possible to separate and isolate concerns so that 
one set of people discuss build integration and another set of people 
discuss installation. Problem is that all the problems tangle -- in 
particular when the starting point is distutils!


That's why *sometimes*, not always, design by committee is the wrong 
approach, and one-man-shows is what brings technology forwards.


I am not saying this should be designed by a committee, but rather - if 
such a tool can be made compatible with simple Distutils project, the 
guy behind this tool can probably help on a PEP with feedback from a 
larger audience than the sci community.


What bugs me is to say that we live in two separate worlds and cannot 
build common pieces. This is not True.





So, I reiterate my proposal, and it could also be expressed like this:

1/ David writes a PEP where he describes how Bento interact with a
project -- metadata, description files, etc
2/ Someone from distutils2 completes the PEP by describing how setup.cfg
works wrt Extensions
3/ we see if we can have a common standard even if it's a subset of
bento capabilities


bento isn't a build tool, it's a packaging tool, competing directly 
with distutils2. It can deal with simple distutils-like builds using a 
bundled build tool, and currently has integration with waf for 
complicated builds; integration with other build systems will 
presumably be added later as people need it (the main point is that 
bento is designed for it).
I am not interested in Bento-the-tool. I am interested in what such a 
tool needs from a project to use it =>


"It can deal with simple distutils-like builds using a bundled build 
tool"  => If I understand this correctly, does that mean that Bento can 
build a distutils project with the distutils Metadata ?


If this is the case, it means that there is a piece of functionality that 
translates Distutils metadata into something Bento deals with.


That's the part I am interested in for interoperability.





Dag

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/21/2012 09:05 PM, Tarek Ziadé wrote:

On 6/21/12 4:26 PM, Dag Sverre Seljebotn wrote:



If you sit down and ask yourself: "what is the information a Python
project should give me so I can compile its extensions?" I think this
has nothing to do with the tools/implementations.

I'm not sure if I understand. A project can't "give the information
needed to build it". The build system is an integrated piece of the
code and package itself. Making the build of library X work on some
ugly HPC setup Y is part of the development of X.

To my mind a solution looks something like (and Bento is close to this):

Step 1) "Some standard" to do configuration of a package (--prefix and
other what-goes-where options, what libraries to link with, what
compilers to use...)

Step 2) Launch the package's custom build system (may be Unix shell
script or makefile in some cases (sometimes portability is not a
goal), may be a waf build)

Step 3) "Some standard" to be able to cleanly
install/uninstall/upgrade the product of step 2)

An attempt to do Step 2) in a major way in the packaging framework
itself, and have the package just "declare" its C extensions, would
not work. It's fine to have a way in the packaging framework that
works for trivial cases, but it's impossible to create something that
works for every case.


I think we should, as you proposed, list a few projects w/ compilation
needs -- from the simplest to the more complex, then see how a standard
*description* could be used by any tool


It's not clear to me what you mean by description. Package metadata, 
install information or description of what/how to build?


I hope you don't mean the latter, that would be insane...it would 
effectively amount to creating a build tool that's both more elegant and 
more powerful than any option that's currently already out there.


Assuming you mean the former, that's what David did to create Bento. 
Reading and understanding Bento and the design decisions going into it 
would be a better use of time than redoing a discussion, and would at 
least be a very good starting point.


But anyway, some project types from simple to advanced:

 - Simple library using Cython + NumPy C API
 - Wrappers around HPC codes like mpi4py, petsc4py
 - NumPy
 - SciPy (uses Fortran compilers too)
 - Library using code generation, Cython, NumPy C API, Fortran 90 code, 
some performance tuning with CPU characteristics (instruction set, cache 
size, optimal loop structure) decided compile-time



And if we're able to write this down in a PEP, e.g. the information a
compiler is looking for to do its job, then any tool out there (waf,
scons, cmake, jam, ant, etc.) can do the job, no?




Anyway: I really don't want to start a flame-war here. So let's accept
up front that we likely won't agree here; I just wanted to clarify my
position.

After 4 years I still don't understand what "we won't agree" means in
this context. *NO ONE* ever ever came and told me : here's what I want a
Python project to describe for its extensions.


That's unfortunate. To be honest, it's probably partly because it's
easier to say what won't work than to come up with a constructive
suggestion. A lot of people (me included) just use
waf/cmake/autotools, and forget about making the code installable
through PyPI or any of the standard Python tools. Just because that
works *now* for us, but we don't have any good ideas for how to make
this into something that works on a wider scale.

I think David is one of the few who has really dug into the matter and
tried to find something that can both do builds and work through
standard install mechanisms. I can't answer for why you haven't been
able to understand one another.

It may also be an issue with how much one can constructively do on
mailing lists. Perhaps the only route forward is to to bring people
together in person and walk distutils2 people through some hairy
scientific HPC builds (and vice versa).


Like the versioning scheme, I think it's fine if you guys have a more complex
system to build software. But there should be a way to share a common
standard for compilation, even if people that use distutils2 or xxx are
just doing the dumbest things, like simple C libs compilation.




Just "we won't agree" or "distutils sucks" :)


Gosh I hope we will overcome this lock one day, and move forward :D


Well, me too.

The other thing is, the folks in distutils2 and myself, have zero
knowledge about compilers. That's why we got very frustrated not to see
people with that knowledge come and help us in this area.


Here's the flip side: If you have zero knowledge about compilers, it's 
going to be almost impossible to have a meaningful discussion about a 
compilation PEP. It's very hard to discuss standards unless everybody 
involved has the necessary prerequisite knowledge. You don't go 
discussing details of the Linux kernel without some solid C experience 
either.


The necessary prerequisites in this case are not merely "knowledge of 
compilers". To avoid repeating mistakes of the past, the prerequisites 
for a meaningful discussion are years of hard-won experience building 
software in various languages, on different platforms, using different 
build tools.

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Thu, Jun 21, 2012 at 4:01 PM, Paul Moore  wrote:

> End users should not need packaging tools on their machines.
>

Well, unless they're developers.  ;-)  Sometimes, the "end user" is a
developer making use of a library.


Development tools like distutils2, distribute/setuptools, bento would
> *only* be needed on developer machines, and would be purely developer
> choice. They would all interact with end users via the
> stdlib-supported standard formats. They could live outside the stdlib,
> and developers could use whichever tool suited them.
>

AFAIK, this was the goal behind setup.cfg in packaging, and it's a goal I
agree with.



> This is a radical idea in that it does not cater for the "zipped up
> development directory as a distribution format" mental model that
> current Python uses. That model could still work, but only if all the
> tools generated a stdlib-supported build definition


Again, packaging's setup.cfg is, or should be, this.  I think there are
some technical challenges with the current state of setup.cfg, but AFAIK
they aren't anything insurmountable.

(Background: the general idea is that setup.cfg contains "hooks", which
name Python callables to be invoked at various stages of the process.
These hooks can dynamically add to the setup.cfg data, e.g. to list
newly-built files, binaries, etc., as well as to do any actual building.)
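A rough sketch of that mechanism follows. The section/option layout loosely follows the packaging (distutils2) setup.cfg convention of naming a dotted callable; the hook lookup is simplified to a dict stand-in instead of actually importing the dotted name.

```python
# Simplified sketch of setup.cfg hooks: the config names a callable,
# the tool invokes it, and the hook may extend the config data
# (e.g. to list newly built files).
import configparser

SETUP_CFG = """\
[global]
setup_hooks = myhooks.add_generated_files
"""

def add_generated_files(config):
    # A real hook might run a code-generation step and record its output.
    config["files"] = {"modules": "generated_module"}
    return config

# Stand-in for importing the dotted name given in the config:
HOOKS = {"myhooks.add_generated_files": add_generated_files}

cfg = configparser.ConfigParser()
cfg.read_string(SETUP_CFG)
config = {section: dict(cfg[section]) for section in cfg.sections()}

for hook_name in config["global"]["setup_hooks"].split():
    config = HOOKS[hook_name](config)

print(config["files"]["modules"])  # generated_module
```

The key property is that the static file stays declarative for the common case, while hooks provide the dynamic escape hatch for builds.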


PS I know that setuptools includes some end-user aspects -
> multi-versioning, entry points and optional dependencies, for example.
> Maybe these are needed - personally, I have never had a need for any
> of these, so I'm not the best person to comment.
>

Entry points are a developer tool, and cross-project co-ordination
facility.  They allow packages to advertise classes, modules, functions,
etc. that other projects may wish to import and use in a programmatic way.
For example, a web framework may say, "if you want to provide a page
template file format, register an entry point under this naming convention,
and we will automatically use it when a template has a matching file
extension."  So entry points are not really consumed by end users;
libraries and frameworks use them as ways to dynamically co-ordinate with
other installed libraries, plugins, etc.

Optional dependencies ("extras"), OTOH, are for end-user convenience: they
allow an author to suggest configurations that might be of interest.
Without them, people have to do things like this:

  http://pypi.python.org/pypi/celery-with-couchdb

in order to advertise what else should be installed.  If Celery were
instead to list its couchdb and SQLAlchemy requirements as "extras" in
setup.py, then one could "easy_install celery[couchdb]" or "easy_install
celery[sqla]" instead of needing to register separate project names on PyPI
for each of these scenarios.
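The mechanics behind extras are simple: named lists of extra requirements layered on top of the base ones, which is the shape setuptools' extras_require keyword expresses. A self-contained sketch (the requirement lists here are illustrative, not Celery's real dependencies):

```python
# Sketch of "extras" resolution: optional dependency sets keyed by name,
# matching the shape of setuptools' extras_require argument.
BASE_REQUIRES = ["kombu"]
extras_require = {
    "couchdb": ["couchdb"],
    "sqla": ["SQLAlchemy"],
}

def requirements(extras=()):
    """Expand a 'celery[couchdb]'-style request into a flat list."""
    reqs = list(BASE_REQUIRES)
    for extra in extras:
        reqs.extend(extras_require[extra])
    return reqs

# 'easy_install celery[couchdb]' resolves roughly to:
print(requirements(["couchdb"]))  # ['kombu', 'couchdb']
```

This also shows why extras cost a project so little to declare: the dummy-project-on-PyPI workaround exists only because installers didn't read the table.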

As it happens, however, two of the most popular setuptools add-ons (pip and
buildout) either did not or still do not support "extras", because they
were not frequently used.  Unfortunately, this meant that projects had to
do things like setup dummy projects on PyPI, because the popular tools
didn't support the scenario.

In short, nobody's likely to mourn the passing of extras to any great
degree.  They're a nice idea, but hard to bootstrap into use due to the
chicken-and-egg problem.  If you don't know what they're for, you won't use
them, and without common naming conventions (like mypackage[c_speedups] or
mypackage[test_support]), nobody will get used to asking for them.  I think
at some point we will end up reinventing them, but essentially the
challenge is that they are a generalized solution to a variety of small
problems that are not individually very motivating to anybody.  They were
only motivating to me in the aggregate because I saw lots of individual
people being bothered by their particular variation on the theme of
auxiliary dependencies or recommended options.

As for multi-versioning, it's pretty clearly a dead duck, a
proof-of-concept that was very quickly obsoleted by buildout and
virtualenv.  Buildout is a better implementation of multi-versioning for
actual scripts, and virtualenvs work fine for people who haven't yet
discovered the joys of buildout.  (I'm a recent buildout convert, in case
you can't tell.  ;-) )
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Paul Moore
Can I take a step back and make a somewhat different point.

Developer requirements are very relevant, sure. But the most important
requirements are those of the end user. The person who simply wants to
*use* a distribution, couldn't care less how it was built, whether it
uses setuptools, or whatever.

End users should not need packaging tools on their machines.

At the moment, to install from source requires the tools the developer
chooses to use for his convenience (distribute/setuptools,
distutils2/packaging, bento) to be installed on the target machine.
And binary installers are only normally available for code that needs
a C extension, and in that case the developer's choice is still
visible in terms of the binary format provided.

I would argue that we should only put *end user* tools in the stdlib.
- A unified package format, suitable for binaries, but also for pure
Python code that wants to ship that way.
- Installation management tools (download, install, remove, list, and
dependency management) that handle the above package format
- Maybe support in the package format and/or installation tools for
managing "wrapper executables" for executable scripts in distributions

Development tools like distutils2, distribute/setuptools, bento would
*only* be needed on developer machines, and would be purely developer
choice. They would all interact with end users via the
stdlib-supported standard formats. They could live outside the stdlib,
and developers could use whichever tool suited them.

This is a radical idea in that it does not cater for the "zipped up
development directory as a distribution format" mental model that
current Python uses. That model could still work, but only if all the
tools generated a stdlib-supported build definition (which could
simply be a Python script that runs the various compile/copy commands,
plus some compiler support classes in the stdlib) in the same way that
lex/yacc generate C, and projects often distribute the generated C
along with the grammar files.

Legacy support in the form of distutils, converters from bdist_xxx
formats to the new binary format, and maybe pip-style "hide the
madness under a unified interface" tools could support this, either in
the stdlib or as 3rd party tools.

I realise this is probably too radical to happen, but at least, it
might put the debate into context if people try to remember that end
users, as well as package developers, are affected by this (and there
are a lot more end users than package developers...).

Paul.

PS I know that setuptools includes some end-user aspects -
multi-versioning, entry points and optional dependencies, for example.
Maybe these are needed - personally, I have never had a need for any
of these, so I'm not the best person to comment.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 7:56 PM, Chris McDonough wrote:

...

Yeah the biggest difference is Py3 compat, other than that afaik I don't
think any API has been removed or modified.

In my opinion, distribute is the only project that should go forward
since it's actively maintained and does not suffer from the bus factor.

I'm not too interested in the drama/history of the fork situation

You are the one currently adding drama by asking for a new setuptools 
release and saying distribute is diverging.



so I don't care whether setuptools has the fix or distribute has it or 
both have it, but being able to point at some package which doesn't 
prevent folks from overriding sys.path ordering using PYTHONPATH would 
be a good thing.



It has to be in Distribute if we want it in most major Linux distros.

And as I proposed to PJE I think the best thing would be to have a 
single project code base, working with Py3 and receiving maintenance 
fixes with several maintainers.


Since it's clear we're not going to add features in any of the projects, 
I think we can safely trust a larger list of maintainers, and just keep 
the project working until the replacement is in use



- C


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 4:26 PM, Dag Sverre Seljebotn wrote:



If you sit down and ask yourself: "what is the information a Python
project should give me so I can compile its extensions?", I think this
has nothing to do with the tools/implementations.

I'm not sure if I understand. A project can't "give the information 
needed to build it". The build system is an integrated piece of the 
code and package itself. Making the build of library X work on some 
ugly HPC setup Y is part of the development of X.


To my mind a solution looks something like (and Bento is close to this):

 Step 1) "Some standard" to do configuration of a package (--prefix 
and other what-goes-where options, what libraries to link with, what 
compilers to use...)


 Step 2) Launch the package's custom build system (may be Unix shell 
script or makefile in some cases (sometimes portability is not a 
goal), may be a waf build)


 Step 3) "Some standard" to be able to cleanly 
install/uninstall/upgrade the product of step 2)


An attempt to do Step 2) in a major way in the packaging framework 
itself, and have the package just "declare" its C extensions, would 
not work. It's fine to have a way in the packaging framework that 
works for trivial cases, but it's impossible to create something that 
works for every case.
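For the trivial cases step 1) mentions, a declarative description of a C extension could look something like the following. The syntax and field names here are invented for illustration (loosely in the spirit of setup.cfg and bento.info, but matching neither exactly):

```ini
; Hypothetical declarative description of a simple C extension -- the kind
; of common subset a configuration standard could cover.  Field names are
; illustrative, not actual setup.cfg or bento syntax.
[extension: _speedups]
sources = src/_speedups.c
include-dirs = src/include
libraries = z
define-macros = USE_FAST_PATH
```

Anything beyond this (HPC toolchains, cross-compilation, generated sources) would fall through to the package's own build system, as step 2) describes.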


I think we should, as you proposed, list a few projects w/ compilation 
needs -- from the simplest to the more complex, then see how a standard 
*description* could be used by any tool







And if we're able to write down in a PEP this, e.g. the information a
compiler is looking for to do its job, then any tool out there waf,
scons, cmake, jam, ant, etc, can do the job, no ?




Anyway: I really don't want to start a flame-war here. So let's accept
up front that we likely won't agree here; I just wanted to clarify my
position.

After 4 years I still don't understand what "we won't agree" means in
this context. *NO ONE* has ever come and told me: here's what I want a
Python project to describe for its extensions.


That's unfortunate. To be honest, it's probably partly because it's 
easier to say what won't work than come with a constructive 
suggestion. A lot of people (me included) just use 
waf/cmake/autotools, and forget about making the code installable 
through PyPI or any of the standard Python tools. Just because that 
works *now* for us, but we don't have any good ideas for how to make 
this into something that works on a wider scale.


I think David is one of the few who has really dug into the matter and 
tried to find something that can both do builds and work through 
standard install mechanisms. I can't answer for why you haven't been 
able to understand one another.


It may also be an issue with how much one can constructively do on 
mailing lists. Perhaps the only route forward is to to bring people 
together in person and walk distutils2 people through some hairy 
scientific HPC builds (and vice versa).


Like the versioning scheme, I think it's fine if you guys have a more 
complex system to build software. But there should be a way to share a 
common standard for compilation, even if people who use distutils2 or xxx 
are just doing the dumbest things, like simple C lib compilation.





Just "we won't agree" or "distutils sucks" :)


Gosh I hope we will overcome this lock one day, and move forward :D


Well, me too.
The other thing is, the folks in distutils2 and I have zero knowledge 
about compilers. That's why we got very frustrated not to see people 
with that knowledge come and help us in this area.


So, I reiterate my proposal, and it could also be expressed like this:

1/ David writes a PEP where he describes how Bento interacts with a 
project -- metadata, description files, etc.
2/ Someone from distutils2 completes the PEP by describing how setup.cfg 
works wrt Extensions
3/ we see if we can have a common standard even if it's a subset of 
bento capabilities







Dag


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 7:49 PM, PJ Eby wrote:
On Thu, Jun 21, 2012 at 1:20 PM, Tarek Ziadé wrote:


telling us no one that is willing to maintain setuptools is able
to do so. (according to him)


Perhaps there is some confusion or language barrier here: what I said 
at that time was that the only people who I already *knew* to be 
capable of taking on full responsibility for *continued development* 
of setuptools, were not available/interested in the job, to my knowledge.


Specifically, the main people I had in mind were Ian Bicking and/or 
Jim Fulton, both of whom had developed extensions to or significant 
chunks of setuptools' functionality themselves, during which they 
demonstrated exemplary levels of understanding both of the code base 
and the wide variety of scenarios in which that code base had to 
operate.  They also both demonstrated conservative, user-oriented 
design choices, that made me feel comfortable that they would not do 
anything to disrupt the existing user base, and that if they made any 
compatibility-breaking changes, they would do so in a way that avoided 
disruption.  (I believe I also gave Philip Jenvey as an example of 
someone who, while not yet proven at that level, was someone I 
considered a good potential candidate as well.)


This was not a commentary on anyone *else's* ability, only on my 
then-present *knowledge* of clearly-suitable persons and their 
availability, or lack thereof.
Yes, so I double-checked my sentence, and I think we are in agreement: you 
would not let the folks that *wanted* to maintain it back then do it.  
Sorry if this was not clear to you.


But let's forget about this, old story I guess.




I would guess that the pool of qualified persons is even larger now, 
but the point is moot: my issue was never about who would "maintain" 
setuptools, but who would *develop* it.


And I expect that we would at this point agree that future 
*development* of setuptools is not something either of us are seeking. 
Rather, we should be seeking to develop tools that can properly 
supersede it.


This is why I participated in Distutils-SIG discussion of the various 
packaging PEPs, and hope to see more of them there.


I definitely agree, and I think your feedback on the various PEPs were 
very important.


My point is just that we could (and, in my opinion, *should*) merge 
setuptools and distribute back together, just to have a py3-enabled 
setuptools that is in maintenance mode,


and work on the new stuff in packaging besides it.

the merged setuptools/distribute project could also be the place where we 
start to do the work to be compatible with the new standards


That's my proposal.


Tarek


Re: [Python-Dev] import too slow on NFS based systems

2012-06-21 Thread PJ Eby
On Thu, Jun 21, 2012 at 10:08 AM, Daniel Braniss wrote:

> > On Thu, 21 Jun 2012 13:17:01 +0300
> > Daniel Braniss  wrote:
> > > Hi,
> > > when lib/python/site-packages/ is accessed via NFS, open/stat/access
> is very
> > > expensive/slow.
> > >
> > > A simple solution is to use an in memory directory search/hash, so I
> was
> > > wondering if this has been concidered in the past, if not, and I come
> > > with a working solution for Unix (at least Linux/Freebsd) will it be
> > > concidered.
> >
> > There is such a thing in Python 3.3, although some stat() calls are
> > still necessary to know whether the directory caches are fresh.
> > Can you give it a try and provide some feedback?
>
> WOW!
> with a sample python program:
>
> in 2.7 there are:
>stats   open
>27369037
> in 3.3
>288 57
>
> now I have to fix my 2.7 to work with 3.3 :-)
>
> any chance that this can be backported to 2.7?
>

As Antoine says, not in the official release.  You can, however, speed
things up substantially in 2.x by zipping the standard library and placing
it in the location given in the default sys.path, e.g.:

# python2.7
Python 2.7 (r27:82500, May  5 2011, 11:50:25)
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> [p for p in sys.path if p.endswith('.zip')]
['/usr/lib/python27.zip']

If you include a compiled 'sitecustomize.py' in this zipfile, you would
also be able to implement a caching importer based on the default one in
pkgutil, to take up the rest of the slack.  I've previously posted sketches
of such importers; they're not that complicated to implement.  It's just
that if you don't *also* zip up the standard library, your raw interpreter
start time won't get much benefit.

(To be clear, creating the zipfile will only speed up stdlib imports,
nothing else; you'll need to implement a caching importer to get any
benefit for site-packages imports.)
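The effect is easy to demonstrate in miniature: importing from a zip on sys.path goes through zipimport, which reads the archive's table of contents once instead of stat()'ing candidate filenames per directory. A small self-contained sketch (module name and paths are invented):

```python
import os
import sys
import tempfile
import zipfile

# Build a one-module zip and import from it.  zipimport caches the zip's
# central directory, so repeated imports avoid the per-directory
# stat()/open() calls that make NFS-mounted sys.path entries slow --
# the same effect as zipping the stdlib into /usr/lib/python27.zip.
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, "lib.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("cached_mod.py", "VALUE = 42\n")

sys.path.insert(0, zip_path)
import cached_mod  # resolved by zipimport, not a filesystem scan

print(cached_mod.VALUE)  # -> 42
```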


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 01:20 PM, Tarek Ziadé wrote:

On 6/21/12 6:44 PM, Chris McDonough wrote:




Yes. At the very least, there will be updated development snapshots
(which are what buildout uses anyway).

(Official releases are in a bit of a weird holding pattern.
distribute's versioning scheme leads to potential confusion: if I
release e.g. 0.6.1, then it sounds like it's a lesser version than
whatever distribute is up to now. OTOH, releasing a later version
number than distribute implies that I'm supporting their feature
enhancements, and I really don't want to add new features to 0.6... but
don't have time right now to clean up all the stuff I started in the 0.7
line either, since I've been *hoping* that the work on packaging would
make 0.7 unnecessary. And let's not even get started on the part where
system-installed copies of distribute can prevent people from
downloading or installing setuptools in the first place.)



Welp, I don't want to get in the middle of that whole mess. But maybe
the distribute folks would be kind enough to do a major version bump
in their next release; e.g. 1.67 instead of 0.67. That said, I don't
think anyone would be confused by overlapping version numbers between
the two projects.

Oh yeah no problem, if Philip backports all the things we've done, like
Py3 compat, and blesses more people to maintain setuptools, we can even
discontinue distribute!

If not, I think you are just joking here -- we don't want to go
back into the lock situation we've suffered for many years, where PJE is
the only maintainer and then suddenly disappears for a year, telling us no
one who is willing to maintain setuptools is able to do so (according
to him).



It's known that they have been diverging for a while.

Yeah the biggest difference is Py3 compat, other than that afaik I don't
think any API has been removed or modified.


In my opinion, distribute is the only project that should go forward
since it's actively maintained and does not suffer from the bus factor.


I'm not too interested in the drama/history of the fork situation so I 
don't care whether setuptools has the fix or distribute has it or both 
have it, but being able to point at some package which doesn't prevent 
folks from overriding sys.path ordering using PYTHONPATH would be a good 
thing.


- C


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Thu, Jun 21, 2012 at 1:20 PM, Tarek Ziadé  wrote:

> telling us no one that is willing to maintain setuptools is able to do so.
> (according to him)


Perhaps there is some confusion or language barrier here: what I said at
that time was that the only people who I already *knew* to be capable of
taking on full responsibility for *continued development* of setuptools,
were not available/interested in the job, to my knowledge.

Specifically, the main people I had in mind were Ian Bicking and/or Jim
Fulton, both of whom had developed extensions to or significant chunks of
setuptools' functionality themselves, during which they demonstrated
exemplary levels of understanding both of the code base and the wide
variety of scenarios in which that code base had to operate.  They also
both demonstrated conservative, user-oriented design choices, that made me
feel comfortable that they would not do anything to disrupt the existing
user base, and that if they made any compatibility-breaking changes, they
would do so in a way that avoided disruption.  (I believe I also gave
Philip Jenvey as an example of someone who, while not yet proven at that
level, was someone I considered a good potential candidate as well.)

This was not a commentary on anyone *else's* ability, only on my
then-present *knowledge* of clearly-suitable persons and their
availability, or lack thereof.

I would guess that the pool of qualified persons is even larger now, but
the point is moot: my issue was never about who would "maintain"
setuptools, but who would *develop* it.

And I expect that we would at this point agree that future *development* of
setuptools is not something either of us are seeking. Rather, we should be
seeking to develop tools that can properly supersede it.

This is why I participated in Distutils-SIG discussion of the various
packaging PEPs, and hope to see more of them there.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Alex Clark

Hi,

On 6/21/12 1:20 PM, Tarek Ziadé wrote:

On 6/21/12 6:44 PM, Chris McDonough wrote:




Yes. At the very least, there will be updated development snapshots
(which are what buildout uses anyway).

(Official releases are in a bit of a weird holding pattern.
distribute's versioning scheme leads to potential confusion: if I
release e.g. 0.6.1, then it sounds like it's a lesser version than
whatever distribute is up to now.  OTOH, releasing a later version
number than distribute implies that I'm supporting their feature
enhancements, and I really don't want to add new features to 0.6...  but
don't have time right now to clean up all the stuff I started in the 0.7
line either, since I've been *hoping* that the work on packaging would
make 0.7 unnecessary.  And let's not even get started on the part where
system-installed copies of distribute can prevent people from
downloading or installing setuptools in the first place.)



Welp, I don't want to get in the middle of that whole mess.  But maybe
the distribute folks would be kind enough to do a major version bump
in their next release; e.g. 1.67 instead of 0.67.  That said, I don't
think anyone would be confused by overlapping version numbers between
the two projects.

Oh yeah no problem, if Philip backports all the things we've done, like
Py3 compat, and blesses more people to maintain setuptools, we can even
discontinue distribute!

If not, I think you are just joking here -- we don't want to go
back into the lock situation we've suffered for many years, where PJE is
the only maintainer and then suddenly disappears for a year, telling us no
one who is willing to maintain setuptools is able to do so (according
to him).



It's known that they have been diverging for a while.

Yeah the biggest difference is Py3 compat, other than that afaik I don't
think any API has been removed or modified.


In my opinion, distribute is the only project that should go forward
since it's actively maintained and does not suffer from the bus factor.


+1. I can't help but cringe when I read this (sorry, PJ Eby!):

"Official releases are in a bit of a weird holding pattern." due to 
distribute.


Weren't they in a bit of a weird holding pattern before distribute? 
Haven't they always been in a bit of a weird holding pattern?


Let's let setuptools be setuptools and distribute be distribute i.e. as 
long as distribute exists, I don't care at all about setuptools' release 
schedule (c.f. PIL/Pillow) and I like it that way :-). If one day 
setuptools or packaging/distutils2 comes along and fixes everything, 
then distribute can cease to exist.




Alex




--
Alex Clark · http://pythonpackages.com





Re: [Python-Dev] [Python-checkins] peps: The latest changes from Yury Selivanov. I can almost taste the acceptance!

2012-06-21 Thread Yury Selivanov
On 2012-06-21, at 11:34 AM, Eric Snow wrote:

> On Thu, Jun 21, 2012 at 2:44 AM, larry.hastings
>  wrote:
>> http://hg.python.org/peps/rev/1edf1cecae7d
>> changeset:   4472:1edf1cecae7d
>> user:Larry Hastings 
>> date:Thu Jun 21 01:44:15 2012 -0700
>> summary:
>>  The latest changes from Yury Selivanov.  I can almost taste the acceptance!
>> 
>> files:
>>  pep-0362.txt |  159 +++---
>>  1 files changed, 128 insertions(+), 31 deletions(-)
>> 
>> 
>> diff --git a/pep-0362.txt b/pep-0362.txt
>> --- a/pep-0362.txt
>> +++ b/pep-0362.txt
>> @@ -42,23 +42,58 @@
>>  A Signature object has the following public attributes and methods:
>> 
>>  * return_annotation : object
>> -The annotation for the return type of the function if specified.
>> -If the function has no annotation for its return type, this
>> -attribute is not set.
>> +The "return" annotation for the function. If the function
>> +has no "return" annotation, this attribute is not set.
>> +
>>  * parameters : OrderedDict
>> An ordered mapping of parameters' names to the corresponding
>> -Parameter objects (keyword-only arguments are in the same order
>> -as listed in ``code.co_varnames``).
>> +Parameter objects.
>> +
>>  * bind(\*args, \*\*kwargs) -> BoundArguments
>> Creates a mapping from positional and keyword arguments to
>> parameters.  Raises a ``TypeError`` if the passed arguments do
>> not match the signature.
>> +
>>  * bind_partial(\*args, \*\*kwargs) -> BoundArguments
>> Works the same way as ``bind()``, but allows the omission
>> of some required arguments (mimics ``functools.partial``
>> behavior.)  Raises a ``TypeError`` if the passed arguments do
>> not match the signature.
>> 
>> +* replace(parameters, \*, return_annotation) -> Signature
> 
> Shouldn't it be something like this:
> 
> * replace_(*parameters, [return_annotation]) -> Signature
> 
> Or is parameters supposed to be a dict/OrderedDict of replacements/additions?

No, it's a regular list.  I'd keep the 'parameters' argument plain
due to the usual use-cases (see the tests)

>> +Creates a new Signature instance based on the instance
>> +``replace`` was invoked on.  It is possible to pass different
>> +``parameters`` and/or ``return_annotation`` to override the
>> +corresponding properties of the base signature.  To remove
>> +``return_annotation`` from the copied ``Signature``, pass in
>> +``Signature.empty``.
> 
> Can you likewise remove parameters this way?

No, you have to create a new list (keep it simple?)

>> +
>> +Signature objects are immutable.  Use ``Signature.replace()`` to
>> +make a modified copy:
>> +::
>> +
>> +>>> sig = signature(foo)
>> +>>> new_sig = sig.replace(return_annotation="new return annotation")
>> +>>> new_sig is not sig
>> +True
>> +>>> new_sig.return_annotation == sig.return_annotation
>> +True
> 
> Should be False here, right?

There is a new version of PEP checked-in where this is fixed.
> 
>> +>>> new_sig.parameters == sig.parameters
>> +True
> 
> An example of replacing parameters would also be good here.
> 
>> +
>> +There are two ways to instantiate a Signature class:
>> +
>> +* Signature(parameters, *, return_annotation)
> 
> Same here as with Signature.replace().
> 
>> +Default Signature constructor.  Accepts an optional sequence
>> +of ``Parameter`` objects, and an optional ``return_annotation``.
>> +Parameters sequence is validated to check that there are no
>> +parameters with duplicate names, and that the parameters
>> +are in the right order, i.e. positional-only first, then
>> +positional-or-keyword, etc.
>> +* Signature.from_function(function)
>> +Returns a Signature object reflecting the signature of the
>> +function passed in.
>> +
>>  It's possible to test Signatures for equality.  Two signatures are
>>  equal when their parameters are equal, their positional and
>>  positional-only parameters appear in the same order, and they
>> @@ -67,9 +102,14 @@
>>  Changes to the Signature object, or to any of its data members,
>>  do not affect the function itself.
>> 
>> -Signature also implements ``__str__`` and ``__copy__`` methods.
>> -The latter creates a shallow copy of Signature, with all Parameter
>> -objects copied as well.
>> +Signature also implements ``__str__``:
>> +::
>> +
>> +>>> str(Signature.from_function((lambda *args: None)))
>> +'(*args)'
>> +
>> +>>> str(Signature())
>> +'()'
>> 
>> 
>>  Parameter Object
>> @@ -80,20 +120,22 @@
>>  propose a rich Parameter object designed to represent any possible
>>  function parameter.
>> 
>> -The structure of the Parameter object is:
>> +A Parameter object has the following public attributes and methods:
>> 
>>  * name : str
>> -The name of the parameter as a string.
>> +The name of the parameter as a string.  Must be a valid
>> +python identifier name (with the exce

[Python-Dev] PEP 362 6th edition

2012-06-21 Thread Yury Selivanov
Hello,

The new revision of PEP 362 has been posted:
http://www.python.org/dev/peps/pep-0362/

Summary:

1. Signature & Parameter objects are now immutable

2. Signature.replace() and Parameter.replace()

3. Signature has a new default constructor, which
accepts parameters list and a return_annotation.
Parameters list is checked for the correct order
(i.e. keyword-only before var-keyword, not vice-versa)

The second way to instantiate Signatures is to use 
'from_function', which creates a Signature object
for the passed function.

4. Parameter.__str__

5. Positional-only arguments are rendered in '<>'

6. PEP was updated to include new documentation &
small examples.


The implementation is updated and 100% test covered.
Please see the issue: http://bugs.python.org/issue15008


Open questions:
Just one - Should we rename 'replace()' to 'new()'?  I like
'new()' a bit better - it suggests that we'll get a new object.

-
Yury


PEP: 362
Title: Function Signature Object
Version: $Revision$
Last-Modified: $Date$
Author: Brett Cannon , Jiwon Seo ,
Yury Selivanov , Larry Hastings 

Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 21-Aug-2006
Python-Version: 3.3
Post-History: 04-Jun-2012


Abstract
========


Python has always supported powerful introspection capabilities,
including introspecting functions and methods (for the rest of
this PEP, "function" refers to both functions and methods).  By
examining a function object you can fully reconstruct the function's
signature.  Unfortunately this information is stored in an inconvenient
manner, and is spread across a half-dozen deeply nested attributes.

This PEP proposes a new representation for function signatures.
The new representation contains all necessary information about a function
and its parameters, and makes introspection easy and straightforward.

However, this object does not replace the existing function
metadata, which is used by Python itself to execute those
functions.  The new metadata object is intended solely to make
function introspection easier for Python programmers.


Signature Object
================


A Signature object represents the call signature of a function and
its return annotation.  For each parameter accepted by the function
it stores a `Parameter object`_ in its ``parameters`` collection.

A Signature object has the following public attributes and methods:

* return_annotation : object
The "return" annotation for the function. If the function
has no "return" annotation, this attribute is not set.

* parameters : OrderedDict
An ordered mapping of parameters' names to the corresponding
Parameter objects.

* bind(\*args, \*\*kwargs) -> BoundArguments
Creates a mapping from positional and keyword arguments to
parameters.  Raises a ``TypeError`` if the passed arguments do
not match the signature.

* bind_partial(\*args, \*\*kwargs) -> BoundArguments
Works the same way as ``bind()``, but allows the omission
of some required arguments (mimics ``functools.partial``
behavior.)  Raises a ``TypeError`` if the passed arguments do
not match the signature.

* replace(parameters, \*, return_annotation) -> Signature
Creates a new Signature instance based on the instance
``replace`` was invoked on.  It is possible to pass different
``parameters`` and/or ``return_annotation`` to override the
corresponding properties of the base signature.  To remove
``return_annotation`` from the copied ``Signature``, pass in
``Signature.empty``.

Signature objects are immutable.  Use ``Signature.replace()`` to
make a modified copy:
::

>>> def foo() -> None:
... pass
>>> sig = signature(foo)

>>> new_sig = sig.replace(return_annotation="new return annotation")
>>> new_sig is not sig
True
>>> new_sig.return_annotation != sig.return_annotation
True
>>> new_sig.parameters == sig.parameters
True

>>> new_sig = new_sig.replace(return_annotation=new_sig.empty)
>>> hasattr(new_sig, "return_annotation")
False

There are two ways to instantiate a Signature class:

* Signature(parameters, \*, return_annotation)
Default Signature constructor.  Accepts an optional sequence
of ``Parameter`` objects, and an optional ``return_annotation``.
Parameters sequence is validated to check that there are no
parameters with duplicate names, and that the parameters
are in the right order, i.e. positional-only first, then
positional-or-keyword, etc.
* Signature.from_function(function)
Returns a Signature object reflecting the signature of the
function passed in.
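A minimal sketch of constructing a Signature by hand, as implemented in the ``inspect`` module (using the modern ``inspect.signature()`` entry point in place of ``Signature.from_function()``):

```python
import inspect
from inspect import Parameter, Signature

# Build a signature equivalent to "(x, y=42) -> int" by hand.
params = [
    Parameter('x', Parameter.POSITIONAL_OR_KEYWORD),
    Parameter('y', Parameter.POSITIONAL_OR_KEYWORD, default=42),
]
sig = Signature(params, return_annotation=int)
print(str(sig))  # (x, y=42) -> int

# The same signature obtained by introspecting a real function.
def f(x, y=42) -> int:
    return x + y

print(inspect.signature(f) == sig)  # True
```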

It's possible to test Signatures for equality.  Two signatures are
equal when their parameters are equal, their positional and
positional-only parameters appear in the same order, and they
have equal return annotations.

Changes to the Signature object, or to any of its data members,
do not affect the function itself.

Signature also implements ``__str__``:
::

    >>> str(Signature.from_function((lambda *args: None)))
    '(*args)'

    >>> str(Signature())
    '()'

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 6:44 PM, Chris McDonough wrote:




Yes. At the very least, there will be updated development snapshots
(which are what buildout uses anyway).

(Official releases are in a bit of a weird holding pattern.
distribute's versioning scheme leads to potential confusion: if I
release e.g. 0.6.1, then it sounds like it's a lesser version than
whatever distribute is up to now.  OTOH, releasing a later version
number than distribute implies that I'm supporting their feature
enhancements, and I really don't want to add new features to 0.6...  but
don't have time right now to clean up all the stuff I started in the 0.7
line either, since I've been *hoping* that the work on packaging would
make 0.7 unnecessary.  And let's not even get started on the part where
system-installed copies of distribute can prevent people from
downloading or installing setuptools in the first place.)



Welp, I don't want to get in the middle of that whole mess.  But maybe 
the distribute folks would be kind enough to do a major version bump 
in their next release; e.g. 1.67 instead of 0.67.  That said, I don't 
think anyone would be confused by overlapping version numbers between 
the two projects. 
Oh yeah, no problem -- if Philip backports all the things we've done, like 
Py3 compat, and blesses more people to maintain setuptools, we can even 
discontinue distribute!


If not, I think you are just joking here -- we don't want to go back into 
the locked situation we suffered from for many years, where PJE was the 
only maintainer and then suddenly disappeared for a year, telling us that 
no one willing to maintain setuptools was able to do so (according 
to him).




It's known that they have been diverging for a while.
Yeah, the biggest difference is Py3 compat; other than that, afaik no API 
has been removed or modified.



In my opinion, distribute is the only project that should go forward 
since it's actively maintained and does not suffer from the bus factor.

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 12:26 PM, PJ Eby wrote:

On Thu, Jun 21, 2012 at 11:50 AM, Chris McDonough  wrote:

On 06/21/2012 11:37 AM, PJ Eby wrote:


On Jun 21, 2012 11:02 AM, "Zooko Wilcox-O'Hearn"
mailto:zo...@zooko.com>
>> wrote:
 >
 > Philip J. Eby provisionally approved of one of the patches,
except for
 > some specific requirement that I didn't really understand how
to fix
 > and that now I don't exactly remember:
 >
 >

http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html


 >

I don't remember either; I just reviewed the patch and
discussion, and
I'm not finding what the holdup was, exactly.  Looking at it now, it
looks to me like a good idea...  oh wait, *now* I remember the
problem,
or at least, what needs reviewing.

Basically, the challenge is that it doesn't allow an .egg in a
PYTHONPATH directory to take precedence over that *specific*
PYTHONPATH
directory.

With the perspective of hindsight, this was purely a transitional
concern, since it only *really* mattered for site-packages; anyplace
else you could just delete the legacy package if it was a
problem.  (And
your patch works fine for that case.)

However, for setuptools as it was when you proposed this, it was a
potential backwards-compatibility problem.  My best guess is
that I was
considering the approach for 0.7...  which never got any serious
development time.

(It may be too late to fix the issue, in more than one sense.
  Even if
the problem ceased to be a problem today, nobody's going to
re-evaluate
their position on setuptools, especially if their position
wasn't even
based on a personal experience with the issue.)


A minor backwards incompat here to fix that issue would be
appropriate, if only to be able to say "hey, that issue no longer
exists" to folks who condemn the entire ecosystem based on that bug.
  At least, that is, if there will be another release of setuptools.
  Is that likely?


Yes. At the very least, there will be updated development snapshots
(which are what buildout uses anyway).

(Official releases are in a bit of a weird holding pattern.
distribute's versioning scheme leads to potential confusion: if I
release e.g. 0.6.1, then it sounds like it's a lesser version than
whatever distribute is up to now.  OTOH, releasing a later version
number than distribute implies that I'm supporting their feature
enhancements, and I really don't want to add new features to 0.6...  but
don't have time right now to clean up all the stuff I started in the 0.7
line either, since I've been *hoping* that the work on packaging would
make 0.7 unnecessary.  And let's not even get started on the part where
system-installed copies of distribute can prevent people from
downloading or installing setuptools in the first place.)


Welp, I don't want to get in the middle of that whole mess.  But maybe 
the distribute folks would be kind enough to do a major version bump in 
their next release; e.g. 1.67 instead of 0.67.  That said, I don't think 
anyone would be confused by overlapping version numbers between the two 
projects.  It's known that they have been diverging for a while.


- C



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Thu, Jun 21, 2012 at 11:50 AM, Chris McDonough  wrote:

> On 06/21/2012 11:37 AM, PJ Eby wrote:
>
>>
>> On Jun 21, 2012 11:02 AM, "Zooko Wilcox-O'Hearn" > > wrote:
>>  >
>>  > Philip J. Eby provisionally approved of one of the patches, except for
>>  > some specific requirement that I didn't really understand how to fix
>>  > and that now I don't exactly remember:
>>  >
>>  > http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html
>>  >
>>
>> I don't remember either; I just reviewed the patch and discussion, and
>> I'm not finding what the holdup was, exactly.  Looking at it now, it
>> looks to me like a good idea...  oh wait, *now* I remember the problem,
>> or at least, what needs reviewing.
>>
>> Basically, the challenge is that it doesn't allow an .egg in a
>> PYTHONPATH directory to take precedence over that *specific* PYTHONPATH
>> directory.
>>
>> With the perspective of hindsight, this was purely a transitional
>> concern, since it only *really* mattered for site-packages; anyplace
>> else you could just delete the legacy package if it was a problem.  (And
>> your patch works fine for that case.)
>>
>> However, for setuptools as it was when you proposed this, it was a
>> potential backwards-compatibility problem.  My best guess is that I was
>> considering the approach for 0.7...  which never got any serious
>> development time.
>>
>> (It may be too late to fix the issue, in more than one sense.  Even if
>> the problem ceased to be a problem today, nobody's going to re-evaluate
>> their position on setuptools, especially if their position wasn't even
>> based on a personal experience with the issue.)
>>
>
> A minor backwards incompat here to fix that issue would be appropriate, if
> only to be able to say "hey, that issue no longer exists" to folks who
> condemn the entire ecosystem based on that bug.  At least, that is, if
> there will be another release of setuptools.  Is that likely?
>

Yes. At the very least, there will be updated development snapshots (which
are what buildout uses anyway).

(Official releases are in a bit of a weird holding pattern.  distribute's
versioning scheme leads to potential confusion: if I release e.g. 0.6.1,
then it sounds like it's a lesser version than whatever distribute is up to
now.  OTOH, releasing a later version number than distribute implies that
I'm supporting their feature enhancements, and I really don't want to add
new features to 0.6...  but don't have time right now to clean up all the
stuff I started in the 0.7 line either, since I've been *hoping* that the
work on packaging would make 0.7 unnecessary.  And let's not even get
started on the part where system-installed copies of distribute can prevent
people from downloading or installing setuptools in the first place.)

Anyway, changing this in a snapshot release shouldn't be a big concern; the
main user of snapshots is buildout, and buildout doesn't use .pth files
anyway, it just writes scripts that do sys.path manipulation.  (A better
approach, for everything except having stuff importable from the standard
interpreter.)

Of course, the flip side is that it means there won't be many people
testing the fix.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 5:50 PM, Chris McDonough wrote:
A minor backwards incompat here to fix that issue would be 
appropriate, if only to be able to say "hey, that issue no longer 
exists" to folks who condemn the entire ecosystem based on that bug.  
At least, that is, if there will be another release of setuptools.  Is 
that likely?
or simply do that fix in distribute since it's Python 3 compatible -- 
and have setuptools officially discontinued for the sake of clarity.





Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 11:45 AM, PJ Eby wrote:


On Jun 21, 2012 10:12 AM, "Chris McDonough" mailto:chr...@plope.com>> wrote:
 > - Install "package resources", which are non-Python source files that
 >  happen to live in package directories.

I love this phrasing, by the way ("non-Python source files").

A pet peeve of mine is the insistence by some people that such files are
"data" and don't belong in package directories, despite the fact that if
you gave them a .py extension and added data="""...""" around them,
they'd be considered part of the code.  A file's name and internal
format aren't what distinguishes code from data; it's the way it's
*used* that matters.

I think "packaging" has swung the wrong way on this particular point,
and that resources and data files should be distinguished in setup.cfg,
with sysadmins *not* being given the option to muck about with resources
-- especially not to install them in locations where they might be
mistaken for something editable.


+1.  A good number of the "package resource" files we deploy are not 
data files at all.  In particular, a lot of them are files which 
represent HTML templates.  These templates are exclusively the domain of 
the software being installed, and considering them explicitly "more 
editable" than the Python source they sit next to in the package 
structure is a grave mistake.  They have exactly the same editability 
candidacy as the Python source files they are mixed in with.


- C


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 11:37 AM, PJ Eby wrote:


On Jun 21, 2012 11:02 AM, "Zooko Wilcox-O'Hearn" mailto:zo...@zooko.com>> wrote:
 >
 > Philip J. Eby provisionally approved of one of the patches, except for
 > some specific requirement that I didn't really understand how to fix
 > and that now I don't exactly remember:
 >
 > http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html
 >

I don't remember either; I just reviewed the patch and discussion, and
I'm not finding what the holdup was, exactly.  Looking at it now, it
looks to me like a good idea...  oh wait, *now* I remember the problem,
or at least, what needs reviewing.

Basically, the challenge is that it doesn't allow an .egg in a
PYTHONPATH directory to take precedence over that *specific* PYTHONPATH
directory.

With the perspective of hindsight, this was purely a transitional
concern, since it only *really* mattered for site-packages; anyplace
else you could just delete the legacy package if it was a problem.  (And
your patch works fine for that case.)

However, for setuptools as it was when you proposed this, it was a
potential backwards-compatibility problem.  My best guess is that I was
considering the approach for 0.7...  which never got any serious
development time.

(It may be too late to fix the issue, in more than one sense.  Even if
the problem ceased to be a problem today, nobody's going to re-evaluate
their position on setuptools, especially if their position wasn't even
based on a personal experience with the issue.)


A minor backwards incompat here to fix that issue would be appropriate, 
if only to be able to say "hey, that issue no longer exists" to folks 
who condemn the entire ecosystem based on that bug.  At least, that is, 
if there will be another release of setuptools.  Is that likely?


- C


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 12:59 AM, Chris McDonough  wrote:
> On 06/21/2012 10:30 AM, Nick Coghlan wrote:
>> That will give at least 3 mechanisms for Python code to get onto a system:
>>
>> 1. Python dist ->  converter ->  system package ->  system Python path
>>
>> 2. Python dist ->  system Python installer ->  system Python path
>>
>> 3. Python dist ->  venv Python installer ->  venv Python path
>>
>> While I agree that path 2 should be discouraged for production
>> systems, I don't think it should be prevented altogether (since it can
>> be very convenient on personal systems).
>
>
> I'm not sure under what circumstance 2 and 3 wouldn't do the same thing.  Do
> you have a concrete idea?

Yep, this is what I was talking about in terms of objecting to
installation of *.pth files: I think automatically installing *.pth
files into the system Python path is *wrong* (just like globally
editing PYTHONPATH), and that includes any *.pth files needed for egg
installation.

In a venv however, I assume the entire thing is application specific,
so using *.pth files and eggs for ease of management makes a lot of
sense and I would be fine with using that style of installation by
default.

If the *same* default was going to the used in both places, my
preference would be to avoid *.pth files by default and require them
to be explicitly requested regardless of the nature of the target
environment. I really just wanted to be clear that I don't mind *.pth
files at all in the venv case, because they're not affecting the
runtime state of other applications.
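For context, the mechanism in question is small: a *.pth file is a plain-text file in a site directory whose lines name extra directories (or start with ``import``), and the ``site`` module processes it at interpreter startup. A minimal sketch of the effect, using a throwaway directory rather than the real site-packages:

```python
import os
import site
import sys
import tempfile

# A throwaway "site" directory containing one .pth file whose
# single line names another directory.
base = tempfile.mkdtemp()
extra = os.path.join(base, 'extra')
os.mkdir(extra)
with open(os.path.join(base, 'demo.pth'), 'w') as f:
    f.write(extra + '\n')

# site.addsitedir() processes the .pth file the same way the
# startup machinery processes site-packages.
site.addsitedir(base)
print(extra in sys.path)  # True
```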

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Jun 21, 2012 10:12 AM, "Chris McDonough"  wrote:
> - Install "package resources", which are non-Python source files that
>  happen to live in package directories.

I love this phrasing, by the way ("non-Python source files").

A pet peeve of mine is the insistence by some people that such files are
"data" and don't belong in package directories, despite the fact that if
you gave them a .py extension and added data="""...""" around them, they'd
be considered part of the code.  A file's name and internal format aren't
what distinguishes code from data; it's the way it's *used* that matters.

I think "packaging" has swung the wrong way on this particular point, and
that resources and data files should be distinguished in setup.cfg, with
sysadmins *not* being given the option to muck about with resources --
especially not to install them in locations where they might be mistaken
for something editable.
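In code, the usual way to load such a resource is through the package itself, not through a separate data-file location; a minimal sketch with ``pkgutil.get_data`` (the ``demopkg`` package here is created on the fly purely for illustration):

```python
import os
import pkgutil
import sys
import tempfile

# A tiny package with a non-Python "source" file next to its code.
base = tempfile.mkdtemp()
pkg = os.path.join(base, 'demopkg')
os.mkdir(pkg)
open(os.path.join(pkg, '__init__.py'), 'w').close()
with open(os.path.join(pkg, 'page.html'), 'w') as f:
    f.write('<h1>hello</h1>')

sys.path.insert(0, base)

# get_data() finds the resource relative to the package, however
# and wherever the package happens to be installed.
print(pkgutil.get_data('demopkg', 'page.html'))  # b'<h1>hello</h1>'
```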


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Jun 21, 2012 11:02 AM, "Zooko Wilcox-O'Hearn" 
wrote:
>
> Philip J. Eby provisionally approved of one of the patches, except for
> some specific requirement that I didn't really understand how to fix
> and that now I don't exactly remember:
>
> http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html
>

I don't remember either; I just reviewed the patch and discussion, and I'm
not finding what the holdup was, exactly.  Looking at it now, it looks to
me like a good idea...  oh wait, *now* I remember the problem, or at least,
what needs reviewing.

Basically, the challenge is that it doesn't allow an .egg in a PYTHONPATH
directory to take precedence over that *specific* PYTHONPATH directory.

With the perspective of hindsight, this was purely a transitional concern,
since it only *really* mattered for site-packages; anyplace else you could
just delete the legacy package if it was a problem.  (And your patch works
fine for that case.)

However, for setuptools as it was when you proposed this, it was a
potential backwards-compatibility problem.  My best guess is that I was
considering the approach for 0.7...  which never got any serious
development time.

(It may be too late to fix the issue, in more than one sense.  Even if the
problem ceased to be a problem today, nobody's going to re-evaluate their
position on setuptools, especially if their position wasn't even based on a
personal experience with the issue.)


Re: [Python-Dev] [Python-checkins] peps: The latest changes from Yury Selivanov. I can almost taste the acceptance!

2012-06-21 Thread Eric Snow
On Thu, Jun 21, 2012 at 2:44 AM, larry.hastings
 wrote:
> http://hg.python.org/peps/rev/1edf1cecae7d
> changeset:   4472:1edf1cecae7d
> user:        Larry Hastings 
> date:        Thu Jun 21 01:44:15 2012 -0700
> summary:
>  The latest changes from Yury Selivanov.  I can almost taste the acceptance!
>
> files:
>  pep-0362.txt |  159 +++---
>  1 files changed, 128 insertions(+), 31 deletions(-)
>
>
> diff --git a/pep-0362.txt b/pep-0362.txt
> --- a/pep-0362.txt
> +++ b/pep-0362.txt
> @@ -42,23 +42,58 @@
>  A Signature object has the following public attributes and methods:
>
>  * return_annotation : object
> -    The annotation for the return type of the function if specified.
> -    If the function has no annotation for its return type, this
> -    attribute is not set.
> +    The "return" annotation for the function. If the function
> +    has no "return" annotation, this attribute is not set.
> +
>  * parameters : OrderedDict
>     An ordered mapping of parameters' names to the corresponding
> -    Parameter objects (keyword-only arguments are in the same order
> -    as listed in ``code.co_varnames``).
> +    Parameter objects.
> +
>  * bind(\*args, \*\*kwargs) -> BoundArguments
>     Creates a mapping from positional and keyword arguments to
>     parameters.  Raises a ``TypeError`` if the passed arguments do
>     not match the signature.
> +
>  * bind_partial(\*args, \*\*kwargs) -> BoundArguments
>     Works the same way as ``bind()``, but allows the omission
>     of some required arguments (mimics ``functools.partial``
>     behavior.)  Raises a ``TypeError`` if the passed arguments do
>     not match the signature.
>
> +* replace(parameters, \*, return_annotation) -> Signature

Shouldn't it be something like this:

* replace_(*parameters, [return_annotation]) -> Signature

Or is parameters supposed to be a dict/OrderedDict of replacements/additions?

> +    Creates a new Signature instance based on the instance
> +    ``replace`` was invoked on.  It is possible to pass different
> +    ``parameters`` and/or ``return_annotation`` to override the
> +    corresponding properties of the base signature.  To remove
> +    ``return_annotation`` from the copied ``Signature``, pass in
> +    ``Signature.empty``.

Can you likewise remove parameters this way?

> +
> +Signature objects are immutable.  Use ``Signature.replace()`` to
> +make a modified copy:
> +::
> +
> +    >>> sig = signature(foo)
> +    >>> new_sig = sig.replace(return_annotation="new return annotation")
> +    >>> new_sig is not sig
> +    True
> +    >>> new_sig.return_annotation == sig.return_annotation
> +    True

Should be False here, right?

> +    >>> new_sig.parameters == sig.parameters
> +    True

An example of replacing parameters would also be good here.

> +
> +There are two ways to instantiate a Signature class:
> +
> +* Signature(parameters, *, return_annotation)

Same here as with Signature.replace().

> +    Default Signature constructor.  Accepts an optional sequence
> +    of ``Parameter`` objects, and an optional ``return_annotation``.
> +    Parameters sequence is validated to check that there are no
> +    parameters with duplicate names, and that the parameters
> +    are in the right order, i.e. positional-only first, then
> +    positional-or-keyword, etc.
> +* Signature.from_function(function)
> +    Returns a Signature object reflecting the signature of the
> +    function passed in.
> +
>  It's possible to test Signatures for equality.  Two signatures are
>  equal when their parameters are equal, their positional and
>  positional-only parameters appear in the same order, and they
> @@ -67,9 +102,14 @@
>  Changes to the Signature object, or to any of its data members,
>  do not affect the function itself.
>
> -Signature also implements ``__str__`` and ``__copy__`` methods.
> -The latter creates a shallow copy of Signature, with all Parameter
> -objects copied as well.
> +Signature also implements ``__str__``:
> +::
> +
> +    >>> str(Signature.from_function((lambda *args: None)))
> +    '(*args)'
> +
> +    >>> str(Signature())
> +    '()'
>
>
>  Parameter Object
> @@ -80,20 +120,22 @@
>  propose a rich Parameter object designed to represent any possible
>  function parameter.
>
> -The structure of the Parameter object is:
> +A Parameter object has the following public attributes and methods:
>
>  * name : str
> -    The name of the parameter as a string.
> +    The name of the parameter as a string.  Must be a valid
> +    python identifier name (with the exception of ``POSITIONAL_ONLY``
> +    parameters, which can have it set to ``None``.)
>
>  * default : object
> -    The default value for the parameter, if specified.  If the
> -    parameter has no default value, this attribute is not set.
> +    The default value for the parameter.  If the parameter has no
> +    default value, this attribute is not set.
>
>  * annotation : object
> -    The annotation for the parameter if s

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris Lambacher
Nick Coghlan  writes:

> 
> The Python community covers a broad spectrum of use cases, and I
> suspect that's one of the big reasons packaging can get so contentious
> - the goals end up being in direct conflict. Currently, I've
> identified at least half a dozen significant communities with very
> different needs (the names aren't meant to be all encompassing, just
> good representatives of each category, and many individuals will span
> multiple categories depending on which hat they're wearing at the
> time):
> 

One set of users not covered by your list is people who need to Cross-Compile
Python to another CPU architecture (i.e. x86 to ARM/PowerPC) for use with 
embedded computers. Distutils does not handle this very well. If you want a 
recent overview of what these users go through you should see my talk from 
PyCon 2012: 
http://pyvideo.org/video/682/cross-compiling-python-c-extensions-for-embedde

-Chris



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 12:02:58 -0300
"Zooko Wilcox-O'Hearn"  wrote:
> 
> Fortunately, this issue is fixable! I opened a bug report, and I and
> others have provided patches that make setuptools stop doing this
> behavior. This makes the above documentation true again. The negative
> impact on features or backwards-compatibility doesn't seem to be
> great.
> 
> http://bugs.python.org/setuptools/issue53
> 
> Philip J. Eby provisionally approved of one of the patches, except for
> some specific requirement that I didn't really understand how to fix
> and that now I don't exactly remember:
> 
> http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html

These days, I think you should really target distribute, not setuptools.

Regards

Antoine.


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Zooko Wilcox-O'Hearn
On Thu, Jun 21, 2012 at 12:57 AM, Nick Coghlan  wrote:
>
> Standard assumptions about the behaviour of site and distutils cease to be 
> valid once setuptools is installed
…
> - advocacy for the "egg" format and the associated sys.path changes that 
> result for all Python programs running on a system
…
> System administrators (and developers that think like system administrators 
> when it comes to configuration management) *hate* what setuptools (and 
> setuptools based installers) can do to their systems.

I have extensive experience with this, including quite a few bug
reports and a few patches in setuptools and distribute, plus
maintaining my own fork of setuptools to build and deploy my own
projects, plus interviewing quite a few Python developers about why
they hated setuptools, plus supporting one of them who hates
setuptools even though he and I use it in a build system
(https://tahoe-lafs.org).

I believe that 80% to 90% of the hatred alluded to above is due to a
single issue: the fact that setuptools causes your Python interpreter
to disrespect the PYTHONPATH, in violation of the documentation in
http://docs.python.org/release/2.7.2/install/index.html#inst-search-path
, which says:

"""
The PYTHONPATH variable can be set to a list of paths that will be
added to the beginning of sys.path. For example, if PYTHONPATH is set
to /www/python:/opt/py, the search path will begin with
['/www/python', '/opt/py']. (Note that directories must exist in order
to be added to sys.path; the site module removes paths that don’t
exist.)
"""

Fortunately, this issue is fixable! I opened a bug report, and I and
others have provided patches that make setuptools stop doing this
behavior. This makes the above documentation true again. The negative
impact on features or backwards-compatibility doesn't seem to be
great.

http://bugs.python.org/setuptools/issue53

Philip J. Eby provisionally approved of one of the patches, except for
some specific requirement that I didn't really understand how to fix
and that now I don't exactly remember:

http://mail.python.org/pipermail/distutils-sig/2009-January/010880.html

Regards,

Zooko


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 10:30 AM, Nick Coghlan wrote:

A tool to generate an OS-specific system package from a Python library
project should be unrelated to a Python distribution *installer*. Instead,
you'd use related tools that understood how to unpack the distribution
packaging format to build one or more package structures. The resulting
structures will be processed and then eventually installed by native OS
install tools.  But the Python distribution installer (e.g easy_install,
pip, or some future similar tool) would just never come into play to create
those structures.  The Python distribution installer and the OS-specific
build tool might share code to introspect and unpack files from the
packaging format, but they'd otherwise have nothing to do with one another.

This seems like the most reasonable separation of concerns to me anyway, and
I'd be willing to work on the code that would be shared by both the
Python-level installer and by OS-level packaging tools.


Right, but if the standard library grows a dist installer (and I think
it eventually should), we're going to need to define how it should
behave when executed with the *system* Python.

That will give at least 3 mechanisms for Python code to get onto a system:

1. Python dist ->  converter ->  system package ->  system Python path

2. Python dist ->  system Python installer ->  system Python path

3. Python dist ->  venv Python installer ->  venv Python path

While I agree that path 2 should be discouraged for production
systems, I don't think it should be prevented altogether (since it can
be very convenient on personal systems).


I'm not sure under what circumstance 2 and 3 wouldn't do the same thing. 
 Do you have a concrete idea?



As far as the scope of the packaging utilities and what they can
install goes, I think the distutils2 folks have done a pretty good job
of defining that with their static metadata format:
http://alexis.notmyidea.org/distutils2/setupcfg.html#files


Yeah definitely a good start.

- C


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 10:23:25 -
"Armin Ronacher"  wrote:
> Due to a user error on my part, I was not using os.readlink correctly. 
> Since links can be relative to their location, I think it would make sense
> to provide an os.path.resolve helper that automatically returns the
> absolute path:
> 
> import errno
> import os
> from os.path import abspath, dirname, join, normpath
> 
> def resolve(filename):
>     try:
>         target = os.readlink(filename)
>     except OSError as e:
>         if e.errno == errno.EINVAL:
>             return abspath(filename)
>         raise
>     return normpath(join(dirname(filename), target))

Note that abspath() is buggy in the face of symlinks, for example it
will happily collapse /etc/foo/../bar into /etc/bar, even
though /etc/foo might be a link to /usr/lib/foo

The only safe way to collapse ".." elements is to resolve symlinks.
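A minimal sketch of the difference, recreating the ``/etc/foo`` -> ``/usr/lib/foo`` situation inside a temporary directory:

```python
import os
import tempfile

# Recreate "/etc/foo -> /usr/lib/foo" inside a temp directory.
base = tempfile.mkdtemp()
etc = os.path.join(base, 'etc')
usrlib = os.path.join(base, 'usr', 'lib')
os.makedirs(os.path.join(usrlib, 'foo'))
os.mkdir(etc)
os.symlink(os.path.join(usrlib, 'foo'), os.path.join(etc, 'foo'))

path = os.path.join(etc, 'foo', '..', 'bar')

# Lexical collapse: "foo/.." simply vanishes -> .../etc/bar
print(os.path.normpath(path) == os.path.join(etc, 'bar'))  # True

# Symlink-aware resolution: ".." applies to usr/lib/foo -> .../usr/lib/bar
expected = os.path.join(os.path.realpath(base), 'usr', 'lib', 'bar')
print(os.path.realpath(path) == expected)  # True
```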

Regards

Antoine.




Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Phil Vandry

On 2012-06-21 06:23, Armin Ronacher wrote:

Due to a user error on my part I was not using os.readlink correctly.
Since links can be relative to their location I think it would make sense
to provide an os.path.resolve helper that automatically returns the
absolute path:

 def resolve(filename):
     try:
         target = os.readlink(filename)
     except OSError as e:
         if e.errno == errno.EINVAL:
             return abspath(filename)
         raise
     return normpath(join(dirname(filename), target))

The above implementation also does not fail if an entity exists but is not
a link and just returns the absolute path of the given filename in that
case.


It's expensive (not to mention racy) to do this correctly, when any 
component of the pathname (not just the component after the last slash) 
might be a symlink. For example:


mkdir -p foo1/foo2
touch bar
ln -s ../../bar foo1/foo2/symlink
ln -s foo1/foo2 foo

Now try to resolve "foo/symlink" using your function. It produces 
"../bar", which doesn't exist.
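
Translating the shell setup into Python makes the failure concrete (a
self-contained sketch for a POSIX system; `resolve` is the helper proposed
above, reproduced verbatim), and shows that os.path.realpath() handles the
intermediate symlink correctly:

```python
import errno
import os
import tempfile
from os.path import abspath, dirname, join, normpath

def resolve(filename):
    # The proposed helper, reproduced for the demonstration.
    try:
        target = os.readlink(filename)
    except OSError as e:
        if e.errno == errno.EINVAL:
            return abspath(filename)
        raise
    return normpath(join(dirname(filename), target))

# Recreate the shell example inside a scratch directory.
work = join(tempfile.mkdtemp(), "work")
os.makedirs(join(work, "foo1", "foo2"))
os.chdir(work)
open("bar", "w").close()
os.symlink("../../bar", join("foo1", "foo2", "symlink"))
os.symlink(join("foo1", "foo2"), "foo")

# The link target is joined relative to "foo/", not "foo1/foo2/".
naive = resolve(join("foo", "symlink"))
print(naive)                                  # ../bar -- points one level too high
print(os.path.exists(naive))                  # False
print(os.path.exists(os.path.realpath(join("foo", "symlink"))))  # True
```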


Why not just work with the pathname you're given and let the kernel 
worry about resolving it?


-Phil


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Fri, Jun 22, 2012 at 12:12 AM, Chris McDonough  wrote:
> On 06/21/2012 09:29 AM, Nick Coghlan wrote:
>>>
>>> My only comment on that is this: Since this is a problem related to the
>>> installation of Python distributions, it should deal with the problems
>>> that
>>> Python developers have more forcefully than non-Python developers and
>>> non-programmers.
>>
>>
>> Thanks to venv, there's an alternative available that may be able to
>> keep both of us happy: split the defaults. For system installs, adopt
>> a vendor-centric, multi-language,
>> easy-to-translate-to-language-neutral-packaging mindset (e.g. avoiding
>> *.pth files by unpacking eggs to the file system). For venv installs,
>> do whatever is most convenient for pure Python developers (e.g.
>> leaving eggs packed and using *.pth files to extend sys.path within
>> the venv).
>
>
> I'd like to agree with this, but I think there's a distinction that needs to
> be made here that's maybe not obvious to everyone.
>
> A tool to generate an OS-specific system package from a Python library
> project should be unrelated to a Python distribution *installer*. Instead,
> you'd use related tools that understood how to unpack the distribution
> packaging format to build one or more package structures. The resulting
> structures will be processed and then eventually installed by native OS
> install tools.  But the Python distribution installer (e.g. easy_install,
> pip, or some future similar tool) would just never come into play to create
> those structures.  The Python distribution installer and the OS-specific
> build tool might share code to introspect and unpack files from the
> packaging format, but they'd otherwise have nothing to do with one another.
>
> This seems like the most reasonable separation of concerns to me anyway, and
> I'd be willing to work on the code that would be shared by both the
> Python-level installer and by OS-level packaging tools.

Right, but if the standard library grows a dist installer (and I think
it eventually should), we're going to need to define how it should
behave when executed with the *system* Python.

That will give at least 3 mechanisms for Python code to get onto a system:

1. Python dist -> converter -> system package -> system Python path

2. Python dist -> system Python installer -> system Python path

3. Python dist -> venv Python installer -> venv Python path

While I agree that path 2 should be discouraged for production
systems, I don't think it should be prevented altogether (since it can
be very convenient on personal systems).

As far as the scope of the packaging utilities and what they can
install goes, I think the distutils2 folks have done a pretty good job
of defining that with their static metadata format:
http://alexis.notmyidea.org/distutils2/setupcfg.html#files

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/21/2012 03:23 PM, Tarek Ziadé wrote:

On 6/21/12 2:45 PM, Dag Sverre Seljebotn wrote:


Guido was asked about build issues and scientific software at PyData
this spring, and his take was that "if scientific users have concerns
that are that special, perhaps you just need to go and do your own
thing". Which is what David is doing.

Trailing Q&A session here: http://www.youtube.com/watch?v=QjXJLVINsSA


if you know what you want and have a tool that does it, why bother using
distutils?

But then, what will your community do with the guy who creates packages
with distutils? Just tell him he sucks?

The whole idea is *interoperability*, not the tool used.



Generalizing a bit I think it's "web developers" and "scientists"
typically completely failing to see each others' usecases. I don't
know if that bridge can be crossed through mailing list discussion
alone. I know that David tried but came to a point where he just had
to unsubscribe to distutils-sig.

I was there, and sorry to be blunt, but he came to tell us we had to
drop distutils because it sucked, and left because we did not follow
that path




Sometimes design by committee is just what you want, and sometimes
design by committee doesn't work. ZeroMQ, for instance, is a great
piece of software resulting from dropping out of the AQMP committee.



That will not work. And I will say here again what I think we should do
imho:

1/ take all the packaging PEPs and rework them until everyone is happy
(compilation sucks in distutils? write a PEP!!!)


I think the only way of making scientists happy is to make the build
tool choice arbitrary (and allow the use of waf, scons, cmake, jam,
ant, etc. for the build). After all, many projects contains more C++
and Fortran code than Python code. (Of course, one could make a PEP
saying that.)

Right now things are so horribly broken for the scientific community
that I'm not sure if one *can* sanely specify PEPs. It's more a
question of playing around and throwing things at the wall and see
what sticks -- 5 years from now one is perhaps in a position where the
problem is really understood and one can write PEPs.

Perhaps the "web developers" are at the PEP-ing stage already. Great
for you. But the usecases are really different.

If you sit down and ask yourself: "what information should a Python
project give me so I can compile its extensions?", I think this
has nothing to do with the tools/implementations.


I'm not sure if I understand. A project can't "give the information 
needed to build it". The build system is an integrated piece of the code 
and package itself. Making the build of library X work on some ugly HPC 
setup Y is part of the development of X.


To my mind a solution looks something like (and Bento is close to this):

 Step 1) "Some standard" to do configuration of a package (--prefix and 
other what-goes-where options, what libraries to link with, what 
compilers to use...)


 Step 2) Launch the package's custom build system (may be Unix shell 
script or makefile in some cases (sometimes portability is not a goal), 
may be a waf build)


 Step 3) "Some standard" to be able to cleanly 
install/uninstall/upgrade the product of step 2)


An attempt to do Step 2) in a major way in the packaging framework 
itself, and have the package just "declare" its C extensions, would not 
work. It's fine to have a way in the packaging framework that works for 
trivial cases, but it's impossible to create something that works for 
every case.




And if we're able to write this down in a PEP, e.g. the information a
compiler is looking for to do its job, then any tool out there (waf,
scons, cmake, jam, ant, etc.) can do the job, no?




Anyway: I really don't want to start a flame-war here. So let's accept
up front that we likely won't agree here; I just wanted to clarify my
position.

After 4 years I still don't understand what "we won't agree" means in
this context. *NO ONE* ever came and told me: here's what I want a
Python project to describe for its extensions.


That's unfortunate. To be honest, it's probably partly because it's 
easier to say what won't work than to come up with a constructive suggestion. 
A lot of people (me included) just use waf/cmake/autotools, and forget 
about making the code installable through PyPI or any of the standard 
Python tools. Just because that works *now* for us, but we don't have 
any good ideas for how to make this into something that works on a wider 
scale.


I think David is one of the few who has really dug into the matter and 
tried to find something that can both do builds and work through 
standard install mechanisms. I can't answer for why you haven't been 
able to understand one another.


It may also be an issue with how much one can constructively do on 
mailing lists. Perhaps the only route forward is to bring people 
together in person and walk distutils2 people through some hairy 
scientific HPC builds (and vice versa).




Re: [Python-Dev] import too slow on NFS based systems

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 17:08:09 +0300
Daniel Braniss  wrote:
> > There is such a thing in Python 3.3, although some stat() calls are
> > still necessary to know whether the directory caches are fresh.
> > Can you give it a try and provide some feedback?
> 
> WOW!
> with a sample python program:
> 
> in 2.7 there are:
>   stats   open
>   2736    9037
> in 3.3
>   288 57
> 
> now I have to fix my 2.7 to work with 3.3 :-)
> 
> any chance that this can be backported to 2.7?

Not a chance. It is all based on using importlib as the default import
mechanism, and that's a gory piece of work that we wouldn't port in a
bugfix release.
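
The idea behind the 3.3 speedup can be sketched roughly like this (a
simplified illustration of importlib's per-directory caching, not the
actual CPython code):

```python
import os

# Cache one listdir() per directory, keyed by the directory's mtime.
# Checking freshness costs a single stat(); membership tests against the
# cached listing then replace the per-suffix stat()/open() probes that
# make imports so expensive over NFS.
_dir_cache = {}  # path -> (mtime, frozenset of entries)

def cached_listdir(path):
    mtime = os.stat(path).st_mtime
    entry = _dir_cache.get(path)
    if entry is None or entry[0] != mtime:
        entry = (mtime, frozenset(os.listdir(path)))
        _dir_cache[path] = entry
    return entry[1]

def candidate_exists(directory, name):
    # e.g. candidate_exists(path_entry, "spam.py") instead of stat()ing
    # spam.py, spam.pyc, spam.so, ... individually.
    return name in cached_listdir(directory)
```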

Regards

Antoine.


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 11:16 PM, Antoine Pitrou  wrote:
> On Thu, 21 Jun 2012 15:04:17 +0200
> Christian Heimes  wrote:
>>
>> How about adding keyword support to OSError and derive the strerror from
>> errno if the second argument is not given?
>
> That's not the original behaviour:
>
> Python 3.2.2+ (3.2:9ef20fbd340f, Oct 15 2011, 21:22:07)
> [GCC 4.5.2] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> e = OSError(5)
> >>> e.errno
> >>> e.strerror
> >>> str(e)
> '5'
>
>
> I don't mind making this particular compatibility-breaking change,
> though.

+1 from me. Existing code that just passes errno will now get strerror
set automatically, and existing code *can't* just be passing the errno
and filename, since OSError doesn't yet support keyword arguments.
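
The behaviour under discussion could look something like this (a
hypothetical helper sketching the derivation, not the eventual OSError
signature):

```python
import errno
import os

def make_os_error(num, strerror=None, filename=None):
    # Derive the message from the error number when none is supplied,
    # which is what the proposed change would let OSError itself do.
    if strerror is None:
        strerror = os.strerror(num)
    if filename is not None:
        return OSError(num, strerror, filename)
    return OSError(num, strerror)

e = make_os_error(errno.ENOENT)
print(e.errno == errno.ENOENT)                  # True
print(e.strerror == os.strerror(errno.ENOENT))  # True
```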

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 11:57 PM, Barry Warsaw  wrote:
> On Jun 21, 2012, at 07:48 AM, Chris McDonough wrote:
>
>>I don't know about Red Hat but both Ubuntu and Apple put all kinds of stuff
>>on the default sys.path of the system Python of the box that's related to
>>their software's concerns only.  I don't understand why people accept this
>>but get crazy about the fact that installing a setuptools distribution using
>>easy_install changes the default sys.path.
>
> Frankly, I've long thought that distros like Debian/Ubuntu which rely so much
> on Python for essential system functions should basically have two Python
> stacks.  One would be used for just those system functions and the other would
> be for application deployment.  OTOH, I often hear from application developers
> on Ubuntu that they basically have to build up their own stack *anyway* if
> they want to ensure they've got the right suite of dependencies.  This is
> where tools like virtualenv and buildout on the lower end and chef/puppet/juju
> on the higher end come into play.

Yeah, I liked Hynek's method for blending a Python-centric application
development approach with a system packaging centric configuration
management approach: take an entire virtualenv and package *that* as a
single system package.

Another strategy that can work is application specific system package
repos, but you have to be very committed to a particular OS and
packaging system for that approach to make a lot of sense :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 09:29 AM, Nick Coghlan wrote:

My only comment on that is this: Since this is a problem related to the
installation of Python distributions, it should deal with the problems that
Python developers have more forcefully than non-Python developers and
non-programmers.


Thanks to venv, there's an alternative available that may be able to
keep both of us happy: split the defaults. For system installs, adopt
a vendor-centric, multi-language,
easy-to-translate-to-language-neutral-packaging mindset (e.g. avoiding
*.pth files by unpacking eggs to the file system). For venv installs,
do whatever is most convenient for pure Python developers (e.g.
leaving eggs packed and using *.pth files to extend sys.path within
the venv).


I'd like to agree with this, but I think there's a distinction that 
needs to be made here that's maybe not obvious to everyone.


A tool to generate an OS-specific system package from a Python library 
project should be unrelated to a Python distribution *installer*. 
Instead, you'd use related tools that understood how to unpack the 
distribution packaging format to build one or more package structures. 
The resulting structures will be processed and then eventually installed 
by native OS install tools.  But the Python distribution installer (e.g. 
easy_install, pip, or some future similar tool) would just never come 
into play to create those structures.  The Python distribution installer 
and the OS-specific build tool might share code to introspect and unpack 
files from the packaging format, but they'd otherwise have nothing to do 
with one another.


This seems like the most reasonable separation of concerns to me anyway, 
and I'd be willing to work on the code that would be shared by both the 
Python-level installer and by OS-level packaging tools.



One of Python's great virtues is its role as a glue language, and part
of being an effective glue language is playing well with others. That
should apply to packaging & distribution as well, not just to runtime
bindings to tools written in other languages.

When we add the scientific users into the mix, we're actually getting
to a *third* audience: multi-language developers that want to use
*Python's* packaging utilities for their source and binary
distribution formats.

The Python community covers a broad spectrum of use cases, and I
suspect that's one of the big reasons packaging can get so contentious
- the goals end up being in direct conflict. Currently, I've
identified at least half a dozen significant communities with very
different needs (the names aren't meant to be all encompassing, just
good representatives of each category, and many individuals will span
multiple categories depending on which hat they're wearing at the
time):

Library authors: just want to quickly and easily publish their work on
the Python package index in a way that is discoverable by others and
allows feedback to reach them at their development site

Web developers: creators of Python applications, relying primarily on
other Python software and underlying OS provided functionality,
potentially with some native extensions, that may need to run on
multiple platforms, but can require installation using a language
specific mechanism by technical staff

Rich client developers: creators of Python applications relying
primarily on other Python software and underlying OS provided
functionality, potentially with native extensions, that need to run on
multiple platforms, but must be installed using standard system
utilities for the benefit of non-technical end users

Enterprise developers: creators of Python or mixed language
applications that need to integrate with corporate system
administration policies (including packaging, auditing and
configuration management)

Scientists: creators of Python data analysis and modelling
applications, with complex dependencies on software written in a
variety of other languages and using various build systems

Python embedders: developers that embed a Python runtime inside a
larger application


I think we'll also need to put some limits on the goal independent of 
the union of everything all the audiences require.


Here's some scope suggestions that I believe could be shared by all of 
the audiences you list above except for embedders; I think that use case 
is pretty much separate.  It might also leave "rich client developers" 
wanting, but no more than they're already wanting.


- Install code that can *later be imported*.  This could be pure Python
  code or C code which requires compilation.  But it's not for the
  purpose of compiling and installing completely arbitrary C code to
  arbitrary locations; it's just for compiling C code which then
  *lives in the installed distribution*, providing an importable
  Python module alongside the distribution's Python logic.

- Install "console scripts" which are shell-scripts/batch-files
  that cause some logic written in Python to get run

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Barry Warsaw
On Jun 21, 2012, at 08:51 AM, Chris McDonough wrote:

>The reason it's disappointing to see OS vendors mutating the default sys.path
>is because they put *very old versions of very common non-stdlib packages*
>(e.g. zope.interface, lxml) on sys.path by default.  The path is tainted out
>of the box for anyone who wants to use the system Python for development of
>newer software.  So at some point they invariably punt to virtualenv or a
>virtualenv-like system where the OS-vendor-provided path is not present.
>
>If Python supported the installation of multiple versions of the same module
>and versioned imports, both PYTHONPATH and virtualenv would be much less
>important.  But given lack of enthusiasm for that, I don't think it's
>reasonable to assume there is only one sys.path on every system.

This is really the key insight that should be driving us IMO.  From the system
vendor point of view, my job is to ensure the *system* works right, and that
everything written in Python that provides system functionality is compatible
with whatever versions of third party Python packages I provide in a
particular OS version.  That's already a hard enough problem, that frankly any
illusions that I can also provide useful versions for higher level
applications that people will deploy on my OS are just madness.

This is why I get lots of people requesting versioned imports, or simply
resorting to venv/buildout/chef/puppet/juju to deploy *their* applications on
the OS.  There's just no other sane way to do it.

I do think Python could do better, but obviously it's a difficult problem.
I suspect that having venv support out of the box in 3.3 will go a long way to
solving some class of these problems.  I don't know if that will be the *only*
answer.

-Barry


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Alex Clark

Hi,

On 6/21/12 7:56 AM, Tarek Ziadé wrote:

On 6/21/12 11:08 AM, Dag Sverre Seljebotn wrote:

...
David Cournapeau's Bento project takes the opposite approach,
everything is explicit and without any magic.

http://cournape.github.com/Bento/

It had its 0.1.0 release a week ago.

Please, I don't want to reopen any discussions about Bento here --
distutils2 vs. Bento discussions have been less than constructive in
the past -- I just wanted to make sure everybody is aware that
distutils2 isn't the only horse in this race. I don't know if there
are others too?


That's *exactly* the kind of approach that has made me not want to
continue.

People are too focused on implementations, and 'how distutils sucks'
'how setuptools sucks' etc 'I'll do better' etc

Instead of having all the folks involved in packaging sit down together
and try to fix the issues together by building PEPs describing what
would be a common set of standards, they want to create their own tools
from scratch.

That will not work.



But you can't tell someone or some group of folks that, and expect them 
to listen. Most times NIH is pejorative[1], but sometimes something 
positive comes out of it.




And I will say here again what I think we should do
imho:

1/ take all the packaging PEPs and rework them until everyone is happy
(compilation sucks in distutils? write a PEP!!!)

2/ once we have a consensus, write as many tools as you want, if they
rely on the same standards => interoperability => win.

But I must be naive, because every time I tried to reach people who were
building their own tools to ask them to work with us on the PEPs, all I
was getting was "distutils sucks!'



And that's the best you can do: give your opinion. I understand the 
frustration, but we have to let people succeed and/or fail on their own[2].





It worked with the OS packaging guys though: we have built a great data
files management system in packaging, plus the versioning scheme (PEP 386)



Are you referring to "the" packaging/distutils2 or something else?


Alex


[1] http://en.wikipedia.org/wiki/Not_invented_here
[2] 
http://docs.pythonpackages.com/en/latest/advanced.html#buildout-easy-install-vs-virtualenv-pip


--
Dag Sverre Seljebotn





--
Alex Clark · http://pythonpackages.com





Re: [Python-Dev] import too slow on NFS based systems

2012-06-21 Thread Daniel Braniss
> On Thu, 21 Jun 2012 13:17:01 +0300
> Daniel Braniss  wrote:
> > Hi,
> > when lib/python/site-packages/ is accessed via NFS, open/stat/access is very
> > expensive/slow. 
> > 
> > A simple solution is to use an in memory directory search/hash, so I was
> > wondering if this has been concidered in the past, if not, and I come
> > with a working solution for Unix (at least Linux/Freebsd) will it be 
> > concidered.
> 
> There is such a thing in Python 3.3, although some stat() calls are
> still necessary to know whether the directory caches are fresh.
> Can you give it a try and provide some feedback?

WOW!
with a sample python program:

in 2.7 there are:
	stats   open
	2736    9037
in 3.3
288 57

now I have to fix my 2.7 to work with 3.3 :-)

any chance that this can be backported to 2.7?

cheers,
danny




Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 11:31 PM, PJ Eby  wrote:
> So, if we are to draw any lesson from the past, it would seem to be, "make
> sure that the people who'll be doing the work are actually going to be
> available through to the next Python version".

Thanks for that write-up - I learned quite a few things I didn't know,
even though I was actually around for 2.5 development (the fact I had
less of a vested interest in packaging issues then probably made a big
difference, too).

> After all, if they are not, it may not much matter whether the code is in
> the stdlib or not.  ;-)

Yeah, I think Tarek had the right idea with working through the slow
painful process of reaching consensus from the bottom up, feature by
feature - we just got impatient and tried to skip to the end without
working through the rest of the list.

It's worth reflecting on the progress we've made so far, and looking
ahead to see what else remains:

In the standard library for 3.3:
- native namespace packages (PEP 420)
- native venv support (PEP 405)

Packaging tool interoperability standards as Accepted PEPs (may still
require further tweaks):
- updated PyPI metadata standard (PEP 345)
- PyPI enforced orderable dist versioning standard (PEP 386)
- common dist installation database format (PEP 376)

As I noted earlier in the thread, it would be good to see the
components of distutils2/packaging aimed at this interoperability
level split out as a separate utility library that can more easily be
shared between projects (distmeta was my suggested name for such a
PyPI project)

Other components where python-dev has a role to play as an
interoperability clearing house:

- improved command and compiler extension API

Other components where python-dev has a role to play in smoothing the
entry of beginners into the Python ecosystem:
- a package installer shipped with Python to reduce bootstrapping issues
- a pypi client for the standard library
- dependency graph builder
- reduced boilerplate in package definition (setup.cfg should help there)

Other components where standard library inclusion is a "nice-to-have"
but not critical:
- most of the other convenience features in setuptools

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Barry Warsaw
On Jun 21, 2012, at 07:48 AM, Chris McDonough wrote:

>I don't know about Red Hat but both Ubuntu and Apple put all kinds of stuff
>on the default sys.path of the system Python of the box that's related to
>their software's concerns only.  I don't understand why people accept this
>but get crazy about the fact that installing a setuptools distribution using
>easy_install changes the default sys.path.

Frankly, I've long thought that distros like Debian/Ubuntu which rely so much
on Python for essential system functions should basically have two Python
stacks.  One would be used for just those system functions and the other would
be for application deployment.  OTOH, I often hear from application developers
on Ubuntu that they basically have to build up their own stack *anyway* if
they want to ensure they've got the right suite of dependencies.  This is
where tools like virtualenv and buildout on the lower end and chef/puppet/juju
on the higher end come into play.

-Barry


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Vinay Sajip
Chris McDonough  plope.com> writes:

> On 06/21/2012 04:45 AM, Nick Coghlan wrote:
> > A packaging PEP needs to explain:
> > - what needs to be done to eliminate any need for monkeypatching
> > - what's involved in making sure that *.pth are *not* needed by default
> > - making sure that executable code in implicitly loaded *.pth files
> > isn't used *at all*
> 
> I'll note that these goals are completely sideways to any actual 
> functional goal.  It'd be a shame to have monkeypatching going on, but 
> the other stuff I don't think are reasonable goals.  Instead they 
> represent fears, and those fears just need to be managed.

Managed how? Whose functional goals? It's good to have something that works here
and now, but surely there's more to it. Presumably distutils worked for some
value of "worked" up until the point where it didn't, and setuptools needed to
improve on it. Oscar's example shows how setuptools is broken for some use
cases. Nor does it consider, for example, the goals of OS distro packagers in
the same way that packaging has tried to. You're encouraging core devs to use
setuptools, but as most seem to agree that distutils is (quick-)sand and
setuptools is built on sand, it's hard to see setuptools as anything other than
a stopgap, the best we have until something better can be devised.

The command-class based design of distutils and hence setuptools doesn't seem to
be something to bet the future on. As an infrastructure concern, this area of
functionality definitely needs to be supported in the stdlib, even if it's a
painful process getting there. The barriers seem more social than technical, but
hopefully the divide-and-conquer-with-multiple-PEPs approach will prevail.

Regards,

Vinay Sajip



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread PJ Eby
On Wed, Jun 20, 2012 at 11:57 PM, Nick Coghlan  wrote:
>
> Right - clearly enumerating the features that draw people to use
> setuptools over just using distutils should be a key element in any
> PEP for 3.4
>
> I honestly think a big part of why packaging ended up being incomplete
> for 3.3 is that we still don't have a clearly documented answer to two
> critical questions:
> 1. Why do people choose setuptools over distutils?

Some of the reasons:

* Dependencies
* Namespace packages
* Less boilerplate in setup.py (revision control, data files support,
find_packages(), etc.)
* Entry points system for creating extensible applications and frameworks
that need runtime plugin discovery
* Command-line script wrappers
* Binary plugin installation system for apps (i.e. dump eggs in a directory
and let pkg_resources figure out what to put on sys.path)
* "Test" command
* Easy distribution of (and runtime access to) static data resources

Of these, automatic dependency resolution with as close to 100% backward
compatibility for installing other projects on PyPI was almost certainly
the #1 factor driving setuptools' initial adoption.  The 20% that drives
the 80%, as it were.  The rest are the 80% that brings in the remaining 20%.

>
> 2. What's wrong with setuptools that meant the idea of including it
> directly in the stdlib was ultimately dropped and eventually replaced
> with the goal of incorporating distutils2?

Based on the feedback from Python-Dev, I withdrew setuptools from 2.5
because of what I considered valid concerns raised regarding:

1. Lack of available persons besides myself familiar with the code base and
design
2. Lack of design documents to remedy #1
3. Lack of unified end-user documentation

And there was no time for me to fix all of that before 2.5 came out,
although I did throw together the EggFormats documentation.  After that,
the time window where I was being paid (by OSAF) for setuptools
improvements came to an end, and other projects started taking precedence.

Since then, setuptools *itself* has become stable legacy code in much the
same way that the distutils has: pip, buildout, and virtualenv all built on
top of it, as it built on top of the distutils.  Problem #3 remains, but at
least now there are other people working on the codebase.

>   If the end goal is "the bulk of the setuptools feature set
> without the problematic features and default behaviours that make
> system administrators break out the torches and pitchforks", then we
> should *write that down* (and spell out the implications) rather than
> assuming that everyone knows the purpose of the exercise.

That's why I brought this up.  ISTM that far too much of the knowledge of
what those use cases and implications are, has been either buried in my
head or spread out among diverse user communities in the past.

Luckily, a lot of people from those communities are now getting
considerably more involved in this effort.  At the time of, say, the 2.5
setuptools question, there wasn't anybody around but me who was able to
argue the "why eggs are good and useful" side of the discussion, for
example.

(If you look back to the early days of setuptools, I often asked on
distutils-sig for people who could help assemble specs for various
things...  which I ended up just deciding for myself, because nobody was
there to comment on them.  It took *years* of setuptools actually being in
the field and used before enough people knew enough to *want* to take part
in the design discussions.  The versioning and metadata PEPs were things I
asked about many years prior, but nobody knew what they wanted yet, or even
knew yet why they should care.)

Similarly, in the years since then, MvL -- who originally argued against
all things setuptools at 2.5 time -- actually proposed the original
namespace package PEP.

So I don't think it's unfair to say that, seven years ago, the ideas in
setuptools were still a few years ahead of their "time".  Today, console
script generation, virtual environments, namespace packages, entry point
discovery, setup.py-driven testing tools, static file inclusion, etc. are
closer to "of course we should have that/everybody uses that" features,
rather than esoteric oddities.

That being said, setuptools *itself* is not such a good thing.  It was
originally a *private* add-on to distutils (like numpy's distutils
extensions) and a prototyping sandbox for additions to the distutils.
(E.g. setuptools features were added to distutils in 2.4 and 2.5.)  I
honestly didn't think at the time that I was writing those features (or
even the egg stuff), that the *long term* goal would be for those things to
be maintained in a separate package.

Instead, I (rather optimistically) assumed that the value of the approaches
would be self-evident, and copied the way the other setuptools features
were.  (To this day, there are an odd variety of other little experimental
"future distutils enhancements" still living in the setuptools code base,
like

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 10:51 PM, Chris McDonough  wrote:
> Is it reasonable to even assume there is only one-sys.path-to-rule-them-all?
> And that users install "the set of libraries they need" into a common place?
>  This quickly turns into failure, because Python is used for many, many
> tasks, and those tasks sometimes *require conflicting versions of
> libraries*.  This is the root cause of why virtualenv exists and is popular.

And why I'm very happy to see pyvenv make its way into the standard library :)

> I care about deploying Python-based applications to many platforms.  You
> care about deploying multilanguage-based applications to a single platform.
>  There's going to be conflict there.
>
> My only comment on that is this: Since this is a problem related to the
> installation of Python distributions, it should deal with the problems that
> Python developers have more forcefully than non-Python developers and
> non-programmers.

Thanks to venv, there's an alternative available that may be able to
keep both of us happy: split the defaults. For system installs, adopt
a vendor-centric, multi-language,
easy-to-translate-to-language-neutral-packaging mindset (e.g. avoiding
*.pth files by unpacking eggs to the file system). For venv installs,
do whatever is most convenient for pure Python developers (e.g.
leaving eggs packed and using *.pth files to extend sys.path within
the venv).
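The venv half of that split is already scriptable from the standard library in 3.3; a minimal sketch (the directory name is arbitrary):

```python
# Create a throwaway virtual environment; installs into it cannot
# affect the interpreter-wide site-packages.
import os
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), "appenv")
venv.create(target)  # the same machinery the pyvenv script uses

# The environment carries its own configuration marker.
print(os.path.exists(os.path.join(target, "pyvenv.cfg")))
```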

One of Python's great virtues is its role as a glue language, and part
of being an effective glue language is playing well with others. That
should apply to packaging & distribution as well, not just to runtime
bindings to tools written in other languages.

When we add the scientific users into the mix, we're actually getting
to a *third* audience: multi-language developers that want to use
*Python's* packaging utilities for their source and binary
distribution formats.

The Python community covers a broad spectrum of use cases, and I
suspect that's one of the big reasons packaging can get so contentious
- the goals end up being in direct conflict. Currently, I've
identified at least half a dozen significant communities with very
different needs (the names aren't meant to be all encompassing, just
good representatives of each category, and many individuals will span
multiple categories depending on which hat they're wearing at the
time):

Library authors: just want to quickly and easily publish their work on
the Python package index in a way that is discoverable by others and
allows feedback to reach them at their development site

Web developers: creators of Python applications, relying primarily on
other Python software and underlying OS provided functionality,
potentially with some native extensions, that may need to run on
multiple platforms, but can require installation using a language
specific mechanism by technical staff

Rich client developers: creators of Python applications relying
primarily on other Python software and underlying OS provided
functionality, potentially with native extensions, that need to run on
multiple platforms, but must be installed using standard system
utilities for the benefit of non-technical end users

Enterprise developers: creators of Python or mixed language
applications that need to integrate with corporate system
administration policies (including packaging, auditing and
configuration management)

Scientists: creators of Python data analysis and modelling
applications, with complex dependencies on software written in a
variety of other languages and using various build systems

Python embedders: developers that embed a Python runtime inside a
larger application

>> setuptools (or, perhaps, easy_install, although I've seen enough posts
>> about eggs being uploaded to PyPI to suspect otherwise), encourages
>> the deployment of system configuration changes that alter the runtime
>> environment of every single Python application executed on the system.
>> That's simply not cool.
>
> Again, it would help if you tried it in anger.  What's the worst that could
> happen?  You might like it! ;-)

Oh, believe me, if I ever had distribution needs that required the
power and flexibility of setuptools, I would reach for it in a
heartbeat (in fact, I already use it today, albeit for tasks that
ordinary distutils could probably handle). That said, I do get to
cheat though - since I don't need to worry about cross-platform
deployment, I can just use the relevant RPM hooks directly :)

You're right that most of my ire should be directed at the default
behaviour of easy_install rather than at setuptools itself, though. I
shall moderate my expressed opinions accordingly.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 2:45 PM, Dag Sverre Seljebotn wrote:


Guido was asked about build issues and scientific software at PyData 
this spring, and his take was that "if scientific users have concerns 
that are that special, perhaps you just need to go and do your own 
thing". Which is what David is doing.


Trailing Q&A session here: http://www.youtube.com/watch?v=QjXJLVINsSA


if you know what you want and have a tool that does it, why bother using 
distutils?


But then, what will your community do with the guy who creates packages 
with distutils? Just tell him he sucks?


The whole idea is *interoperability*, not the tool used.



Generalizing a bit I think it's "web developers" and "scientists" 
typically completely failing to see each others' usecases. I don't 
know if that bridge can be crossed through mailing list discussion 
alone. I know that David tried but came to a point where he just had 
to unsubscribe to distutils-sig.
I was there, and sorry to be blunt, but he came to tell us we had to 
drop distutils because it sucked, and he left because we did not follow 
that path





Sometimes design by committee is just what you want, and sometimes 
design by committee doesn't work. ZeroMQ, for instance, is a great 
piece of software resulting from dropping out of the AMQP committee.




That will not work. And I will say here again what I think we should do
imho:

1/ take all the packaging PEPs and rework them until everyone is happy
(compilation sucks in distutils ? write a PEP !!!)


I think the only way of making scientists happy is to make the build 
tool choice arbitrary (and allow the use of waf, scons, cmake, jam, 
ant, etc. for the build). After all, many projects contain more C++ 
and Fortran code than Python code. (Of course, one could make a PEP 
saying that.)


Right now things are so horribly broken for the scientific community 
that I'm not sure if one *can* sanely specify PEPs. It's more a 
question of playing around and throwing things at the wall and see 
what sticks -- 5 years from now one is perhaps in a position where the 
problem is really understood and one can write PEPs.


Perhaps the "web developers" are at the PEP-ing stage already. Great 
for you. But the usecases are really different.
If you sit down and ask yourself: "what information should a Python 
project give me so I can compile its extensions?"  I think this 
has nothing to do with the tools/implementations.


And if we're able to write this down in a PEP, e.g. the information a 
compiler is looking for to do its job, then any tool out there (waf, 
scons, cmake, jam, ant, etc.) can do the job, no?





Anyway: I really don't want to start a flame-war here. So let's accept 
up front that we likely won't agree here; I just wanted to clarify my 
position.
After 4 years I still don't understand what "we won't agree" means in 
this context.  *NO ONE* ever came and told me: here's what I want 
a Python project to describe for its extensions.


Just "we won't agree" or "distutils sucks"  :)


Gosh I hope we will overcome this lock one day, and move forward :D



Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 15:04:17 +0200
Christian Heimes  wrote:
> 
> How about adding keyword support to OSError and derive the strerror from
> errno if the second argument is not given?

That's not the original behaviour:

Python 3.2.2+ (3.2:9ef20fbd340f, Oct 15 2011, 21:22:07) 
[GCC 4.5.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> e = OSError(5)
>>> e.errno
>>> e.strerror
>>> str(e)
'5'


I don't mind making this particular compatibility-breaking change,
though.

Regards

Antoine.


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Christian Heimes
Am 21.06.2012 14:55, schrieb Nick Coghlan:
> On Thu, Jun 21, 2012 at 9:26 PM, Christian Heimes  wrote:
>> BTW Is there a better way than raise OSError(errno.ELOOP,
>> os.strerror(errno.ELOOP), filename) to raise a correct OSError with
>> errno, errno message and filename? A classmethod like
>> "OSError.from_errno(errno, filename=None) -> proper subclass of OSError
>> with strerror set" would reduce the burden for developers. The PEP mentions
>> a similar idea at
>> http://www.python.org/dev/peps/pep-3151/#implementation but this was
>> never implemented.
> 
> According to the C code, it should be working at least for recognised
> errno values:
> 
> http://hg.python.org/cpython/file/009ac63759e9/Objects/exceptions.c#l890
> 
> I can't get it to trigger properly in my local build, though :(

Me neither with the one argument variant:

Python 3.3.0a4+ (default:c3616595dada+, Jun 19 2012, 23:12:25)
[GCC 4.6.3] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import errno
[73872 refs]
>>> type(OSError(errno.ENOENT))
<class 'OSError'>
[73877 refs]

It works with two arguments, but it doesn't set strerror and filename
correctly:

>>> exc = OSError(errno.ENOENT, "filename")
[73948 refs]
>>> exc
FileNotFoundError(2, 'filename')
[73914 refs]
>>> exc.strerror
'filename'
[73914 refs]
>>> exc.filename
[73914 refs]

OSError doesn't accept keyword args:

>>> OSError(errno.ENOENT, filename="filename")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: OSError does not take keyword arguments


How about adding keyword support to OSError and derive the strerror from
errno if the second argument is not given?
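Until something like that lands, a small helper can paper over the trap shown above. OSError.from_errno does not exist; the function name below is invented:

```python
import errno
import os

def oserror_from_errno(err, filename=None):
    # Hypothetical helper (not a stdlib API): always pass errno *and* a
    # strerror derived from it, so the two-argument trap shown above
    # (the filename landing in strerror) cannot happen.
    if filename is not None:
        return OSError(err, os.strerror(err), filename)
    return OSError(err, os.strerror(err))

exc = oserror_from_errno(errno.ENOENT, "example.txt")
print(type(exc).__name__, exc.errno, exc.strerror, exc.filename)
```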

Christian


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 9:26 PM, Christian Heimes  wrote:
> BTW Is there a better way than raise OSError(errno.ELOOP,
> os.strerror(errno.ELOOP), filename) to raise a correct OSError with
> errno, errno message and filename? A classmethod like
> "OSError.from_errno(errno, filename=None) -> proper subclass of OSError
> with strerror set" would reduce the burden for developers. The PEP mentions
> a similar idea at
> http://www.python.org/dev/peps/pep-3151/#implementation but this was
> never implemented.

According to the C code, it should be working at least for recognised
errno values:

http://hg.python.org/cpython/file/009ac63759e9/Objects/exceptions.c#l890

I can't get it to trigger properly in my local build, though :(

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 08:21 AM, Nick Coghlan wrote:



Installing a distribution will change behavior whether or not sys.path is
changed as a result.  That's its purpose.


No it won't. An ordinary package will only change the behaviour of
Python applications that import a package by that name. Other Python
applications will be completely unaffected (as it should be).


If a Python application is affected by a change to sys.path which 
doesn't impact modules it uses, then that Python application is plain 
broken, because the developer of that application cannot make 
assumptions about what a user does to sys.path unrelated to the modules 
it requires.  This is completely independent of easy_install.


Any Python application is going to be affected by the installation of a 
distribution that does impact modules it imports, whether sys.path is 
used to change the working set of modules or not.


So what concrete situation are we actually talking about here?


  The code that runs in the .pth
*file* (there's only one that matters: easy_install.pth) just mutates
sys.path.  The end result is this: if you understand how sys.path works, you
understand how eggs work.  Each egg is added to sys.path.  That's all there
is to it.  It's the same as manually mutating a global PYTHONPATH, except
you don't need to do it.


Yes, it's the same as mutating PYTHONPATH. That's a similarly bad
system global change. Individual libraries do not have the right to
change the sys.path seen on initialisation by every other Python
application on that system.


Is it reasonable to even assume there is only 
one-sys.path-to-rule-them-all? And that users install "the set of 
libraries they need" into a common place?  This quickly turns into 
failure, because Python is used for many, many tasks, and those tasks 
sometimes *require conflicting versions of libraries*.  This is the root 
cause of why virtualenv exists and is popular.


The reason it's disappointing to see OS vendors mutating the default 
sys.path is because they put *very old versions of very common 
non-stdlib packages* (e.g. zope.interface, lxml) on sys.path by default. 
 The path is tainted out of the box for anyone who wants to use the 
system Python for development of newer software.  So at some point they 
invariably punt to virtualenv or a virtualenv-like system where the 
OS-vendor-provided path is not present.


If Python supported the installation of multiple versions of the same 
module and versioned imports, both PYTHONPATH and virtualenv would be 
much less important.  But given lack of enthusiasm for that, I don't 
think it's reasonable to assume there is only one sys.path on every system.


I sympathize, however, with Oscar's report that PYTHONPATH can't override the 
setuptools-derived path.  That's indeed a mistake that a future tool 
should not make.



And note that this is not "setuptools" in general.  It's easy_install in
particular.  Everything you've brought up so far I think is limited to
easy_install.  It doesn't happen when you use pip.  I think it's a mistake
that pip doesn't do it, but I think you have to make more accurate
distinctions.


What part of "PR problem" was unclear? setuptools and easy_install are
inextricably linked in everyone's minds, just like pip and distribute.


Hopefully for the purposes of the discussion, folks here can make the 
mental separation between setuptools and easy_install.  We can't help 
what other folks think in the meantime, certainly not solely by making 
technological compromises anyway.



A packaging PEP needs to explain:
- what needs to be done to eliminate any need for monkeypatching
- what's involved in making sure that *.pth are *not* needed by default
- making sure that executable code in implicitly loaded *.pth files
isn't used *at all*


I'll note that these goals are completely sideways to any actual functional
goal.  It'd be a shame to have monkeypatching going on, but the other stuff
I don't think are reasonable goals.  Instead they represent fears, and those
fears just need to be managed.


No, they reflect the mindset of someone with configuration management
and auditing responsibilities for shared systems with multiple
applications installed which may be written in a variety of languages,
not just Python. You may not care about those people, but I do.


I care about deploying Python-based applications to many platforms.  You 
care about deploying multilanguage-based applications to a single 
platform.  There's going to be conflict there.


My only comment on that is this: Since this is a problem related to the 
installation of Python distributions, it should deal with the problems 
that Python developers have more forcefully than non-Python developers 
and non-programmers.



It'd also be useful if other core developers actually tried to use
setuptools in anger.  That'd be a good start towards understanding some of
its tradeoffs.  People can write this stuff down til they're blue in the
face, but if core devs don't try the stuff, they'll always fear it.

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/21/2012 01:56 PM, Tarek Ziadé wrote:

On 6/21/12 11:08 AM, Dag Sverre Seljebotn wrote:

...
David Cournapeau's Bento project takes the opposite approach,
everything is explicit and without any magic.

http://cournape.github.com/Bento/

It had its 0.1.0 release a week ago.

Please, I don't want to reopen any discussions about Bento here --
distutils2 vs. Bento discussions have been less than constructive in
the past -- I just wanted to make sure everybody is aware that
distutils2 isn't the only horse in this race. I don't know if there
are others too?


That's *exactly* the kind of approach that has made me not want to
continue.

People are too focused on implementations, and 'how distutils sucks'
'how setuptools sucks' etc 'I'll do better' etc

Instead of having all the folks involved in packaging sit down together
and try to fix the issues together by building PEPs describing what
would be a common set of standards, they want to create their own tools
from scratch.


Guido was asked about build issues and scientific software at PyData 
this spring, and his take was that "if scientific users have concerns 
that are that special, perhaps you just need to go and do your own 
thing". Which is what David is doing.


Trailing Q&A session here: http://www.youtube.com/watch?v=QjXJLVINsSA

Generalizing a bit I think it's "web developers" and "scientists" 
typically completely failing to see each others' usecases. I don't know 
if that bridge can be crossed through mailing list discussion alone. I 
know that David tried but came to a point where he just had to 
unsubscribe to distutils-sig.


Sometimes design by committee is just what you want, and sometimes 
design by committee doesn't work. ZeroMQ, for instance, is a great piece 
of software resulting from dropping out of the AMQP committee.




That will not work. And I will say here again what I think we should do
imho:

1/ take all the packaging PEPs and rework them until everyone is happy
(compilation sucks in distutils ? write a PEP !!!)


I think the only way of making scientists happy is to make the build 
tool choice arbitrary (and allow the use of waf, scons, cmake, jam, ant, 
etc. for the build). After all, many projects contain more C++ and 
Fortran code than Python code. (Of course, one could make a PEP saying 
that.)


Right now things are so horribly broken for the scientific community 
that I'm not sure if one *can* sanely specify PEPs. It's more a question 
of playing around and throwing things at the wall and seeing what sticks -- 
5 years from now one is perhaps in a position where the problem is 
really understood and one can write PEPs.


Perhaps the "web developers" are at the PEP-ing stage already. Great for 
you. But the usecases are really different.


Anyway: I really don't want to start a flame-war here. So let's accept 
up front that we likely won't agree here; I just wanted to clarify my 
position.


(Some context: I might have funding to work 2 months full-time on 
distributing Python software on HPC clusters this autumn. It's not 
really related to Bento or distutils, though; more of a client tool 
using those libraries.)


Dag Sverre Seljebotn


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 10:19 PM, David Cournapeau  wrote:
>
>
> On Thu, Jun 21, 2012 at 12:58 PM, Nick Coghlan  wrote:
>>
>> On Thu, Jun 21, 2012 at 7:28 PM, David Cournapeau 
>> wrote:
>> > If specifying install dependencies is the killer feature of setuptools,
>> > why
>> > can't we have a very simple module that adds the necessary 3 keywords to
>> > record it, and let 3rd party tools deal with it as they wish ? That
>> > would
>> > not even require speciying the format, and would let us more time to
>> > deal
>> > with the other, more difficult questions.
>>
>> That low level role is filled by PEP 345 (the latest PyPI metadata
>> format, which adds the new fields), PEP 376 (local installation
>> database) and PEP 386 (version numbering schema).
>>
>> The corresponding packaging submodules are the ones that were being
>> considered for retention as a reference implementation in 3.3, but are
>> still slated for removal along with the rest of the package (the
>> reference implementations will remain available as part of distutils2
>> on PyPI).
>
>
> I understand the code is already implemented, but I meant that it may be a
> good idea to have a simple, self-contained module that does just provide the
> necessary bits for the "setuptools killer feature", and let competing tools
> deal with it as they please.

If you're genuinely interested in that prospect, I suggest
collaborating with the distutils2 team to extract the four identified
modules (and any necessary support code) as a "distmeta" project on
PyPI:

distmeta.version — Version number classes
distmeta.metadata — Metadata handling
distmeta.markers — Environment markers
distmeta.database — Database of installed distributions

That will allow faster iteration on the core interoperability
standards prior to reincorporation in 3.4, and explicitly decouple
them from the higher level (more contentious) features.
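As a taste of why a standalone version module matters, here is a toy sketch of the ordering problem PEP 386 addresses. This is not the distutils2 reference implementation; it only handles plain dotted release numbers:

```python
# Naive numeric ordering for dotted release numbers; plain string
# comparison gets "1.10" < "1.9" wrong, which is exactly the kind of
# mistake PEP 386's NormalizedVersion is meant to prevent (it also
# covers pre/post/dev releases, which this toy key does not).
def version_key(version):
    return tuple(int(part) for part in version.split("."))

assert "1.10" < "1.9"  # lexicographic comparison is wrong
assert version_key("1.10") > version_key("1.9")
print(sorted(["1.0", "0.9", "1.10"], key=version_key))  # → ['0.9', '1.0', '1.10']
```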

>> Whatever UI a Python packaging solution presents to a user, it needs
>> to support those 3 PEPs on the back end for interoperability with
>> other tools (including, eventually, the packaging module in the
>> standard library).
>>
>> Your feedback on the commands/compilers design sounds valuable, and I
>> would be very interested in seeing a PEP targeting that aspect of the
>> new packaging module (if you look at the start of this thread, the
>> failure to improve the compiler API is one of the reasons for pulling
>> the code from 3.3).
>
>
> The problem with compilation is not just the way the compiler classes work.
> It it how they interact with commands and the likes, which ends up being
> most of the original distutils code. What's wrong with  distutils is the
> whole underlying model, if one can call that. No PEP will fix the issue if
> the premise is to work within that model.

I don't accept the premise that the 3.4 packaging solution must be
restricted to the distutils semantic model. However, no alternative
strategy has been formally presented to python-dev.

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 9:48 PM, Chris McDonough  wrote:
> On 06/21/2012 04:45 AM, Nick Coghlan wrote:
>> And, like it or not, setuptools has a serious PR problem due to the
>> fact it monkeypatches the standard library, uses *.pth files to alter
>> sys.path for every installed application by default, actually *uses*
>> the ability to run code in *.pth files and has hard to follow
>> documentation to boot. I *don't* trust that I fully understand the
>> import system on any machine with setuptools installed, because it is
>> demonstrably happy to install state to the file system that will
>> affect *all* Python programs running on the machine.
>
>
> I don't know about Red Hat but both Ubuntu and Apple put all kinds of stuff
> on the default sys.path of the system Python of the box that's related to
> their software's concerns only.  I don't understand why people accept this
> but get crazy about the fact that installing a setuptools distribution using
> easy_install changes the default sys.path.

Because the vendor gets to decide what goes into the base install of
the OS. If I'm using the system Python, then I expect sys.path to
contain the system paths, just as I expect gcc to be able to see the
system include paths. If I don't want that, I'll use virtualenv or a
completely separate Python installation.

However, when I install a new Python package into site-packages it
*should* just sit there and have zero impact on other Python
applications that don't import that package. As soon as someone
installs a *.pth file, however, that's *no longer the case* - every
Python application on that machine will now be scanning additional
paths for modules whether it wants to or not. It's unnecessary
coupling between components that *should* be completely independent of
each other.

Now, *.pth support in the interpreter certainly cannot be blamed on
setuptools, but encouraging use of a packaging format that effectively
requires them certainly can be.
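For readers unfamiliar with the mechanism being criticised, a minimal sketch of how a *.pth file extends sys.path (file and directory names are invented):

```python
# site.addsitedir processes *.pth files the same way site-packages is
# processed at interpreter startup: each non-comment line naming an
# existing path is appended to sys.path.
import os
import site
import sys
import tempfile

d = tempfile.mkdtemp()
egg_dir = os.path.join(d, "Example-1.0.egg")
os.mkdir(egg_dir)
with open(os.path.join(d, "demo.pth"), "w") as f:
    f.write("Example-1.0.egg\n")  # resolved relative to the .pth file's directory

site.addsitedir(d)
print(egg_dir in sys.path)
```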

It's similar to the reason why monkeypatching and global environment
variable modifications (including PYTHONPATH) are a problem: as soon
as you start doing that kind of thing, you're introducing coupling
that *shouldn't exist*. If there is no better solution, then sure, do
it as a near term workaround, but that isn't the same as accepting it
as the long term answer.

> Installing a distribution will change behavior whether or not sys.path is
> changed as a result.  That's its purpose.

No it won't. An ordinary package will only change the behaviour of
Python applications that import a package by that name. Other Python
applications will be completely unaffected (as it should be).

> The code that runs in the .pth
> *file* (there's only one that matters: easy_install.pth) just mutates
> sys.path.  The end result is this: if you understand how sys.path works, you
> understand how eggs work.  Each egg is added to sys.path.  That's all there
> is to it.  It's the same as manually mutating a global PYTHONPATH, except
> you don't need to do it.

Yes, it's the same as mutating PYTHONPATH. That's a similarly bad
system global change. Individual libraries do not have the right to
change the sys.path seen on initialisation by every other Python
application on that system.

> And note that this is not "setuptools" in general.  It's easy_install in
> particular.  Everything you've brought up so far I think is limited to
> easy_install.  It doesn't happen when you use pip.  I think it's a mistake
> that pip doesn't do it, but I think you have to make more accurate
> distinctions.

What part of "PR problem" was unclear? setuptools and easy_install are
inextricably linked in everyone's minds, just like pip and distribute.

>> A packaging PEP needs to explain:
>> - what needs to be done to eliminate any need for monkeypatching
>> - what's involved in making sure that *.pth are *not* needed by default
>> - making sure that executable code in implicitly loaded *.pth files
>> isn't used *at all*
>
> I'll note that these goals are completely sideways to any actual functional
> goal.  It'd be a shame to have monkeypatching going on, but the other stuff
> I don't think are reasonable goals.  Instead they represent fears, and those
> fears just need to be managed.

No, they reflect the mindset of someone with configuration management
and auditing responsibilities for shared systems with multiple
applications installed which may be written in a variety of languages,
not just Python. You may not care about those people, but I do.

> It'd also be useful if other core developers actually tried to use
> setuptools in anger.  That'd be a good start towards understanding some of
> its tradeoffs.  People can write this stuff down til they're blue in the
> face, but if core devs  don't try the stuff, they'll always fear it.

setuptools (or, perhaps, easy_install, although I've seen enough posts
about eggs being uploaded to PyPI to suspect otherwise), encourages
the deployment of system configuration changes.

Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread David Cournapeau
On Thu, Jun 21, 2012 at 12:58 PM, Nick Coghlan  wrote:

> On Thu, Jun 21, 2012 at 7:28 PM, David Cournapeau 
> wrote:
> > If specifying install dependencies is the killer feature of setuptools,
> why
> > can't we have a very simple module that adds the necessary 3 keywords to
> > record it, and let 3rd party tools deal with it as they wish ? That would
> > not even require specifying the format, and would leave us more time to deal
> > with the other, more difficult questions.
>
> That low level role is filled by PEP 345 (the latest PyPI metadata
> format, which adds the new fields), PEP 376 (local installation
> database) and PEP 386 (version numbering schema).
>
> The corresponding packaging submodules are the ones that were being
> considered for retention as a reference implementation in 3.3, but are
> still slated for removal along with the rest of the package (the
> reference implementations will remain available as part of distutils2
> on PyPI).
>

I understand the code is already implemented, but I meant that it may be a
good idea to have a simple, self-contained module that just provides
the necessary bits for the "setuptools killer feature", and let competing
tools deal with it as they please.



> Whatever UI a Python packaging solution presents to a user, it needs
> to support those 3 PEPs on the back end for interoperability with
> other tools (including, eventually, the packaging module in the
> standard library).
>
> Your feedback on the commands/compilers design sounds valuable, and I
> would be very interested in seeing a PEP targeting that aspect of the
> new packaging module (if you look at the start of this thread, the
> failure to improve the compiler API is one of the reasons for pulling
> the code from 3.3).


The problem with compilation is not just the way the compiler classes work.
It is how they interact with commands and the like, which ends up being
most of the original distutils code. What's wrong with distutils is the
whole underlying model, if one can call that. No PEP will fix the issue if
the premise is to work within that model.

There are similar kind of arguments around the extensibility of distutils:
it is not just about monkey-patching, but what kind of API you offer to
allow for extensibility, and I think the only way to design this sensibly
is to work on real packages and iterate, not writing a PEP as a first step.

David


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Oscar Benjamin
On 21 June 2012 12:48, Chris McDonough  wrote:

> On 06/21/2012 04:45 AM, Nick Coghlan wrote:
>
>> On Thu, Jun 21, 2012 at 2:44 PM, Chris McDonough
>>  wrote:
>>
>>> All of these are really pretty minor issues compared with the main
>>> benefit
>>> of not needing to ship everything with everything else. The killer
>>> feature
>>> is that developers can specify dependencies and users can have those
>>> dependencies installed automatically in a cross-platform way.  Everything
>>> else is complete noise if this use case is not served.
>>>
>>
>> Cool. This is the kind of thing we need recorded in a PEP - there's a
>> lot of domain knowledge floating around in the heads of packaging
>> folks that needs to be captured so we can know *what the addition of
>> packaging to the standard library is intended to fix*.
>>
>> And, like it or not, setuptools has a serious PR problem due to the
>> fact it monkeypatches the standard library, uses *.pth files to alter
>> sys.path for every installed application by default, actually *uses*
>> the ability to run code in *.pth files and has hard to follow
>> documentation to boot. I *don't* trust that I fully understand the
>> import system on any machine with setuptools installed, because it is
>> demonstrably happy to install state to the file system that will
>> affect *all* Python programs running on the machine.
>>
>
> I don't know about Red Hat but both Ubuntu and Apple put all kinds of
> stuff on the default sys.path of the system Python out of the box that's
> related to their software's concerns only.  I don't understand why people
> accept this but get crazy about the fact that installing a setuptools
> distribution using easy_install changes the default sys.path.
>

I don't like the particular way that easy_install modifies sys.path so that
it can no longer be overridden by PYTHONPATH. For a discussion, see:
http://stackoverflow.com/questions/5984523/eggs-in-path-before-pythonpath-environment-variable

The fact that ubuntu does this for some system ubuntu packages has never
bothered me, but the fact that it happens for packages that I install with
easy_install has. The typical scenario would be that I:

1) Install some package X with easy_install.
2) Find a bug or some aspect of X that I want to change and checkout the
latest version from e.g. github.
3) Try to use PYTHONPATH to test the checked out version and find that
easy_install's path modification prevents me from doing so.
4) Run the quickfix script in the stackoverflow question above and consider
not using easy_install for X in future.

Oscar


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/20/12 2:53 PM, Nick Coghlan wrote:

On Wed, Jun 20, 2012 at 9:31 PM, Tarek Ziadé  wrote:

Yeah maybe this subset could be left in 3.3

and we'd remove packaging-the-installer part (pysetup, commands, compilers)

I think it's a good idea !

OK, to turn this into a concrete suggestion based on the packaging docs.





Declare stable, include in 3.3
--
 packaging.version — Version number classes
 packaging.metadata — Metadata handling
 packaging.markers — Environment markers
 packaging.database — Database of installed distributions

I think that's a good subset.

+1 on all of the things you said after

If you succeed in getting the sci people working on "PEP: Distutils 
replacement: Compiling Extension Modules" it will be a big win.




Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 7:28 PM, David Cournapeau  wrote:
> If specifying install dependencies is the killer feature of setuptools, why
> can't we have a very simple module that adds the necessary 3 keywords to
> record it, and let 3rd party tools deal with it as they wish ? That would
> not even require specifying the format, and would leave us more time to deal
> with the other, more difficult questions.

That low level role is filled by PEP 345 (the latest PyPI metadata
format, which adds the new fields), PEP 376 (local installation
database) and PEP 386 (version numbering schema).
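The key property PEP 386 standardises is a total ordering on version strings, with pre-releases sorting before the matching final release. A toy illustration of that ordering for a tiny subset of the scheme (not the distutils2 implementation; `version_key` is a made-up helper name):

```python
import re

def version_key(v):
    """Toy sort key for a tiny subset of PEP 386 versions, e.g. '1.0',
    '1.0a1', '1.0b2', '1.0rc1'. Not the real NormalizedVersion class."""
    m = re.match(r"^(\d+(?:\.\d+)*)(a|b|c|rc)?(\d+)?$", v)
    release, pre, pre_num = m.groups()
    nums = tuple(int(x) for x in release.split("."))
    if pre is None:
        return (nums, "z", 0)            # final release sorts after pre-releases
    return (nums, pre, int(pre_num or 0))  # 'a' < 'b' < 'c' < 'rc' < 'z'

versions = ["1.0", "1.0a1", "1.0rc1", "0.9.1", "1.0b2"]
ordered = sorted(versions, key=version_key)
```

The point of having this in a PEP rather than in any one tool is exactly the interoperability argument above: every tool that agrees on the ordering can compare versions produced by any other tool.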

The corresponding packaging submodules are the ones that were being
considered for retention as a reference implementation in 3.3, but are
still slated for removal along with the rest of the package (the
reference implementations will remain available as part of distutils2
on PyPI).

Whatever UI a Python packaging solution presents to a user, it needs
to support those 3 PEPs on the back end for interoperability with
other tools (including, eventually, the packaging module in the
standard library).

Your feedback on the commands/compilers design sounds valuable, and I
would be very interested in seeing a PEP targeting that aspect of the
new packaging module (if you look at the start of this thread, the
failure to improve the compiler API is one of the reasons for pulling
the code from 3.3).

If python-dev ends up playing referee on multiple competing PEPs,
that's not necessarily a bad thing. If a consensus solution doesn't
meet the needs of key parties that aren't well served by existing
approaches (specifically, the scientific community, and enterprise
users that want to be able to translate the plethora of language
specific packaging systems to a common format for internal use to
simplify system administration and configuration management and
auditing), then we may as well not bother and let the status quo
continue indefinitely.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Tarek Ziadé

On 6/21/12 11:08 AM, Dag Sverre Seljebotn wrote:

...
David Cournapeau's Bento project takes the opposite approach, 
everything is explicit and without any magic.


http://cournape.github.com/Bento/

It had its 0.1.0 release a week ago.

Please, I don't want to reopen any discussions about Bento here -- 
distutils2 vs. Bento discussions have been less than constructive in 
the past -- I just wanted to make sure everybody is aware that 
distutils2 isn't the only horse in this race. I don't know if there 
are others too?


That's *exactly* the kind of approach that has made me not want to 
continue.


People are too focused on implementations, and on 'how distutils sucks', 
'how setuptools sucks', 'I'll do better', etc.


Instead of having all the folks involved in packaging sit down together 
and try to fix the issues together by building PEPs describing what 
would be a common set of standards, they want to create their own tools 
from scratch.


That will not work. And I will say here again what I think we should do 
imho:


1/ take all the packaging PEPs and rework them until everyone is happy 
(compilation sucks in distutils ?  write a PEP !!!)


2/ once we have a consensus, write as many tools as you want, if they 
rely on the same standards => interoperability => win.


But I must be naive, because every time I tried to reach the people that were 
building their own tools to ask them to work with us on the PEPs, all I 
was getting was "distutils sucks!"


It worked with the OS packager guys though: we have built a great data 
files management system in packaging, plus the versions PEP (386)








Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Chris McDonough

On 06/21/2012 04:45 AM, Nick Coghlan wrote:

On Thu, Jun 21, 2012 at 2:44 PM, Chris McDonough  wrote:

All of these are really pretty minor issues compared with the main benefit
of not needing to ship everything with everything else. The killer feature
is that developers can specify dependencies and users can have those
dependencies installed automatically in a cross-platform way.  Everything
else is complete noise if this use case is not served.


Cool. This is the kind of thing we need recorded in a PEP - there's a
lot of domain knowledge floating around in the heads of packaging
folks that needs to be captured so we can know *what the addition of
packaging to the standard library is intended to fix*.

And, like it or not, setuptools has a serious PR problem due to the
fact it monkeypatches the standard library, uses *.pth files to alter
sys.path for every installed application by default, actually *uses*
the ability to run code in *.pth files and has hard to follow
documentation to boot. I *don't* trust that I fully understand the
import system on any machine with setuptools installed, because it is
demonstrably happy to install state to the file system that will
affect *all* Python programs running on the machine.


I don't know about Red Hat but both Ubuntu and Apple put all kinds of 
stuff on the default sys.path of the system Python out of the box that's 
related to their software's concerns only.  I don't understand why 
people accept this but get crazy about the fact that installing a 
setuptools distribution using easy_install changes the default sys.path.


Installing a distribution will change behavior whether or not sys.path 
is changed as a result.  That's its purpose.  The code that runs in the 
.pth *file* (there's only one that matters: easy_install.pth) just 
mutates sys.path.  The end result is this: if you understand how 
sys.path works, you understand how eggs work.  Each egg is added to 
sys.path.  That's all there is to it.  It's the same as manually 
mutating a global PYTHONPATH, except you don't need to do it.
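For readers who haven't poked at the mechanism: the `.pth` processing described here lives in the stdlib `site` module, and both behaviours (plain lines become sys.path entries, lines starting with `import` are executed) can be seen in isolation. A minimal sketch using a toy `demo.pth` and a hypothetical `some_egg` directory (not the real easy_install.pth):

```python
import os
import site
import sys
import tempfile

# site.addsitedir() is what site.py runs for each site directory at startup.
sitedir = tempfile.mkdtemp()
egg_dir = os.path.join(sitedir, "some_egg")   # hypothetical egg directory
os.mkdir(egg_dir)

with open(os.path.join(sitedir, "demo.pth"), "w") as f:
    f.write("some_egg\n")                             # plain line -> sys.path entry
    f.write("import sys; sys.demo_pth_ran = True\n")  # "import" line -> executed!

site.addsitedir(sitedir)

added = egg_dir in sys.path                      # the egg dir is now on sys.path
executed = getattr(sys, "demo_pth_ran", False)   # arbitrary code ran during scan
```

The second line is the controversial part of the thread: any code after `import` on such a line runs at every interpreter startup on a machine where the `.pth` file is installed.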


And note that this is not "setuptools" in general.  It's easy_install in 
particular.  Everything you've brought up so far I think is limited to 
easy_install.  It doesn't happen when you use pip.  I think it's a 
mistake that pip doesn't do it, but I think you have to make more 
accurate distinctions.



A packaging PEP needs to explain:
- what needs to be done to eliminate any need for monkeypatching
- what's involved in making sure that *.pth are *not* needed by default
- making sure that executable code in implicitly loaded *.pth files
isn't used *at all*


I'll note that these goals are completely sideways to any actual 
functional goal.  It'd be a shame to have monkeypatching going on, but 
the other stuff I don't think are reasonable goals.  Instead they 
represent fears, and those fears just need to be managed.



I *think* trying to achieve this is actually the genesis of the
original distribute fork, that subsequently became distutils2 as Tarek
discovered how much of the complexity in setuptools was actually due
to the desire to *not* officially fork distutils (and instead
monkeypatch it, effectively creating a runtime fork).

However, for those of us that weren't directly involved, this is all
still a strange mystery dealt with by other people. I've cribbed
together bits and pieces just from following the fragments of the
discussions that have happened on python-dev and at PyCon US, but if
we want the madness to ever stop, then *the problems with the status
quo* need to be written down so that other core developers can
understand them.


It'd also be useful if other core developers actually tried to use 
setuptools in anger.  That'd be a good start towards understanding some 
of its tradeoffs.  People can write this stuff down til they're blue in 
the face, but if core devs  don't try the stuff, they'll always fear it.



In fact, I just remembered that Tarek *has* written a lot of this
down, just not in PEP form: http://www.aosabook.org/en/packaging.html



Cool.

- C


[Python-Dev] Fwd: Re: Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Calvin Spealman
-- Forwarded message -- (whoops from my phone)

On Jun 21, 2012 6:32 AM, "Armin Ronacher" 
wrote:
>
> Due to a user error on my part I was not using os.readlink correctly.
> Since links can be relative to their location I think it would make sense
> to provide an os.path.resolve helper that automatically returns the
> absolute path:
>
> def resolve(filename):
>     try:
>         target = os.readlink(filename)
>     except OSError as e:
>         if e.errno == errno.EINVAL:
>             return abspath(filename)
>         raise
>     return normpath(join(dirname(filename), target))
>
> The above implementation also does not fail if an entity exists but is not
> a link and just returns the absolute path of the given filename in that
> case.
>

Does it need to be an absolute path, and what is the advantage of that? Can
it return absolute if that's what you gave it, and relative otherwise?

>
> Regards,
> Armin
>


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Oleg Broytman
On Thu, Jun 21, 2012 at 11:10:44AM -, Armin Ronacher 
 wrote:
> would have to check the POSIX spec for a
> reasonable value

   POSIX allows 8 links:

http://infohost.nmt.edu/~eweiss/222_book/222_book/0201433079/ch02lev1sec5.html

_POSIX_SYMLOOP_MAX - number of symbolic links that can be traversed
during pathname resolution: 8

   The constant _POSIX_SYMLOOP_MAX from unistd.h:

#define _POSIX_SYMLOOP_MAX 8

Oleg.
-- 
 Oleg Broytman            http://phdru.name/            p...@phdru.name
   Programmers don't die, they just GOSUB without RETURN.


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Christian Heimes
On 21.06.2012 13:10, Armin Ronacher wrote:
Hello Armin,

> No, but that's a good point.  It should attempt to resolve these in a loop
> until it either loops too often (would have to check the POSIX spec for a
> reasonable value) or until it terminates by finding an actual file or
> directory.

The specs mention sysconf(SYMLOOP_MAX) / _POSIX_SYMLOOP_MAX for the
maximum count of lookups. The limit is lower than I expected. On my
system it's defined as 8 in
/usr/include/x86_64-linux-gnu/bits/posix1_lim.h. The limit would also
handle self referencing loops correctly.
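Combining that loop limit with Armin's original sketch, a chain-following variant might look like the following (an untested sketch, not a patch; the `maxlinks` parameter name is made up, and 8 is the _POSIX_SYMLOOP_MAX minimum discussed above):

```python
import errno
import os
from os.path import abspath, dirname, join, normpath

def resolve(filename, maxlinks=8):   # 8 == _POSIX_SYMLOOP_MAX
    """Loop-based sketch of the proposed os.path.resolve().

    Follows chains of absolute and relative symlinks, raising ELOOP
    once more than `maxlinks` links have been traversed.
    """
    for _ in range(maxlinks):
        try:
            target = os.readlink(filename)
        except OSError as e:
            if e.errno == errno.EINVAL:   # not a symlink: fully resolved
                return abspath(filename)
            raise
        # each link is interpreted relative to the directory containing it
        filename = normpath(join(dirname(filename), target))
    raise OSError(errno.ELOOP, os.strerror(errno.ELOOP), filename)
```

Like the original sketch, a non-link simply resolves to its absolute path, while a self-referencing link now fails with ELOOP instead of looping forever.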

BTW Is there a better way than raise OSError(errno.ELOOP,
os.strerror(errno.ELOOP), filename) to raise a correct OSError with
errno, errno message and filename? A classmethod like
"OSError.from_errno(errno, filename=None) -> proper subclass of OSError
with strerror() set" would reduce the burden for developers. PEP 3151
mentions a similar idea at
http://www.python.org/dev/peps/pep-3151/#implementation but this was
never implemented.
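For what it's worth, the errno-to-subclass dispatch half of this already works in 3.3: calling OSError with an errno argument selects the appropriate subclass. So a helper along the suggested lines (hypothetical name, not an actual stdlib API) is nearly a one-liner:

```python
import errno
import os

def oserror_from_errno(code, filename=None):
    # Hypothetical spelling of the suggested OSError.from_errno() helper.
    # On 3.3+, OSError.__new__ already picks the right subclass (PEP 3151),
    # so only filling in strerror() and filename remains.
    return OSError(code, os.strerror(code), filename)

exc = oserror_from_errno(errno.ENOENT, "missing.txt")
```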

Christian


Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 11:10:44 -
"Armin Ronacher"  wrote:
> Hi,
> 
> > On 21.06.2012 12:23, Armin Ronacher wrote:
> > Does the code handle a chain of absolute and relative symlinks
> > correctly, for example a relative symlink that points to another
> > relative symlink in a different directory that points to a file in a
> > third directory?
> No, but that's a good point.  It should attempt to resolve these in a loop
> until it either loops too often (would have to check the POSIX spec for a
> reasonable value) or until it terminates by finding an actual file or
> directory.

You could take a look at the resolve() algorithm in pathlib:
http://pypi.python.org/pypi/pathlib/

Regards

Antoine.




Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Armin Ronacher
Hi,

> On 21.06.2012 12:23, Armin Ronacher wrote:
> Does the code handle a chain of absolute and relative symlinks
> correctly, for example a relative symlink that points to another
> relative symlink in a different directory that points to a file in a
> third directory?
No, but that's a good point.  It should attempt to resolve these in a loop
until it either loops too often (would have to check the POSIX spec for a
reasonable value) or until it terminates by finding an actual file or
directory.


Regards,
Armin



Re: [Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Christian Heimes
On 21.06.2012 12:23, Armin Ronacher wrote:
> Due to a user error on my part I was not using os.readlink correctly. 
> Since links can be relative to their location I think it would make sense
> to provide an os.path.resolve helper that automatically returns the
> absolute path:
> 
> def resolve(filename):
>     try:
>         target = os.readlink(filename)
>     except OSError as e:
>         if e.errno == errno.EINVAL:
>             return abspath(filename)
>         raise
>     return normpath(join(dirname(filename), target))
> 
> The above implementation also does not fail if an entity exists but is not
> a link and just returns the absolute path of the given filename in that
> case.

+1

Does the code handle a chain of absolute and relative symlinks
correctly, for example a relative symlink that points to another
relative symlink in a different directory that points to a file in a
third directory?

Christian


Re: [Python-Dev] import too slow on NFS based systems

2012-06-21 Thread Antoine Pitrou
On Thu, 21 Jun 2012 13:17:01 +0300
Daniel Braniss  wrote:
> Hi,
> when lib/python/site-packages/ is accessed via NFS, open/stat/access is very
> expensive/slow. 
> 
> A simple solution is to use an in-memory directory search/hash, so I was
> wondering if this has been considered in the past; if not, and I come
> up with a working solution for Unix (at least Linux/FreeBSD), will it be
> considered?

There is such a thing in Python 3.3, although some stat() calls are
still necessary to know whether the directory caches are fresh.
Can you give it a try and provide some feedback?
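The gist of the 3.3 approach: do one listdir() per directory and reuse the cached listing until the directory's mtime changes, so probing a module name costs a set lookup instead of a round of stat/open calls against NFS. A standalone sketch of that caching strategy (not the actual importlib code):

```python
import os

class DirCache:
    """Sketch of the 3.3 import system's caching strategy (not the actual
    importlib code): one listdir() per directory, invalidated by mtime."""

    def __init__(self):
        self._cache = {}  # dirpath -> (mtime, frozenset of entries)

    def contains(self, dirpath, name):
        mtime = os.stat(dirpath).st_mtime
        cached = self._cache.get(dirpath)
        if cached is None or cached[0] != mtime:
            # Directory changed (or never seen): refresh with a single listdir
            cached = (mtime, frozenset(os.listdir(dirpath)))
            self._cache[dirpath] = cached
        return name in cached[1]
```

The remaining stat() per lookup is the freshness check Antoine mentions; on filesystems with coarse mtime granularity the cache can briefly miss very recent changes, which is the usual trade-off with this scheme.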

Regards

Antoine.




[Python-Dev] Add os.path.resolve to simplify the use of os.readlink

2012-06-21 Thread Armin Ronacher
Due to a user error on my part I was not using os.readlink correctly. 
Since links can be relative to their location I think it would make sense
to provide an os.path.resolve helper that automatically returns the
absolute path:

def resolve(filename):
    try:
        target = os.readlink(filename)
    except OSError as e:
        if e.errno == errno.EINVAL:
            return abspath(filename)
        raise
    return normpath(join(dirname(filename), target))

The above implementation also does not fail if an entity exists but is not
a link and just returns the absolute path of the given filename in that
case.


Regards,
Armin



Re: [Python-Dev] import too slow on NFS based systems

2012-06-21 Thread Oleg Broytman
On Thu, Jun 21, 2012 at 01:17:01PM +0300, Daniel Braniss  
wrote:
> when lib/python/site-packages/ is accessed via NFS, open/stat/access is very
> expensive/slow. 
> 
> A simple solution is to use an in-memory directory search/hash, so I was
> wondering if this has been considered in the past; if not, and I come
> up with a working solution for Unix (at least Linux/FreeBSD), will it be
> considered?

   I'm sure it'll be considered, provided that the solution doesn't slow
down local FS access.

Oleg.
-- 
 Oleg Broytman            http://phdru.name/            p...@phdru.name
   Programmers don't die, they just GOSUB without RETURN.


[Python-Dev] import too slow on NFS based systems

2012-06-21 Thread Daniel Braniss
Hi,
when lib/python/site-packages/ is accessed via NFS, open/stat/access is very
expensive/slow. 

A simple solution is to use an in-memory directory search/hash, so I was
wondering if this has been considered in the past; if not, and I come
up with a working solution for Unix (at least Linux/FreeBSD), will it be
considered?

thanks,
danny




Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread David Cournapeau
On Thu, Jun 21, 2012 at 9:45 AM, Nick Coghlan  wrote:

> On Thu, Jun 21, 2012 at 2:44 PM, Chris McDonough  wrote:
> > All of these are really pretty minor issues compared with the main
> benefit
> > of not needing to ship everything with everything else. The killer
> feature
> > is that developers can specify dependencies and users can have those
> > dependencies installed automatically in a cross-platform way.  Everything
> > else is complete noise if this use case is not served.
>
> Cool. This is the kind of thing we need recorded in a PEP - there's a
> lot of domain knowledge floating around in the heads of packaging
> folks that needs to be captured so we can know *what the addition of
> packaging to the standard library is intended to fix*.
>
> And, like it or not, setuptools has a serious PR problem due to the
> fact it monkeypatches the standard library, uses *.pth files to alter
> sys.path for every installed application by default, actually *uses*
> the ability to run code in *.pth files and has hard to follow
> documentation to boot. I *don't* trust that I fully understand the
> import system on any machine with setuptools installed, because it is
> demonstrably happy to install state to the file system that will
> affect *all* Python programs running on the machine.
>
> A packaging PEP needs to explain:
> - what needs to be done to eliminate any need for monkeypatching
> - what's involved in making sure that *.pth are *not* needed by default
> - making sure that executable code in implicitly loaded *.pth files
> isn't used *at all*
>

It is not a PEP, but here are a few reasons why extending distutils is
difficult (taken from our experience in the scipy community, which has by
far the biggest extension of distutils AFAIK):

http://cournape.github.com/Bento/html/faq.html#why-not-extending-existing-tools-distutils-etc

While I believe setuptools has been a net negative for the scipy community
because of the way it works and for the reason you mentioned, I think it is
fair to say it is not really possible to do any differently if you rely on
distutils.

If specifying install dependencies is the killer feature of setuptools, why
can't we have a very simple module that adds the necessary 3 keywords to
record it, and let 3rd party tools deal with it as they wish ? That would
not even require specifying the format, and would leave us more time to deal
with the other, more difficult questions.

David


Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Dag Sverre Seljebotn

On 06/21/2012 05:57 AM, Nick Coghlan wrote:

On Thu, Jun 21, 2012 at 3:29 AM, PJ Eby  wrote:

On Wed, Jun 20, 2012 at 9:02 AM, Nick Coghlan  wrote:


On Wed, Jun 20, 2012 at 9:46 PM, Antoine Pitrou
wrote:

Agreed, especially if the "proven in the wild" criterion is required
(people won't rush to another third-party distutils replacement, IMHO).


The existence of setuptools means that "proven in the wild" is never
going to fly - a whole lot of people use setuptools and easy_install
happily, because they just don't care about the downsides it has in
terms of loss of control of a system configuration.



Um, this may be a smidge off topic, but what "loss of control" are we
talking about here?  AFAIK, there isn't anything it does that you can't
override with command line options or the config file.  (In most cases,
standard distutils options or config files.)  Do you just mean that most
people use the defaults and don't care about there being other options?  And
if that's the case, which other options are you referring to?


No, I mean there are design choices in setuptools that explain why
many people don't like it and are irritated when software they want to
use depends on it without a good reason. Clearly articulating the
reasons that "just include setuptools" is no longer being considered
as an option should be one of the goals of any PEPs associated with
adding packaging back for 3.4.

The reasons I'm personally aware of:
- it's a unilateral runtime fork of the standard library that bears a
lot of responsibility for the ongoing feature freeze in distutils.
Standard assumptions about the behaviour of site and distutils cease
to be valid once setuptools is installed
- overuse of "*.pth" files and the associated sys.path changes for all
Python programs running on a system. setuptools gleefully encourages
the inclusion of non-trivial code snippets in *.pth files that will be
executed by all programs.
- advocacy for the "egg" format and the associated sys.path changes
that result for all Python programs running on a system
- too much magic that is enabled by default and is hard to switch off
(e.g. http://rhodesmill.org/brandon/2009/eby-magic/)

System administrators (and developers that think like system
administrators when it comes to configuration management) *hate* what
setuptools (and setuptools based installers) can do to their systems.
It doesn't matter that package developers don't *have* to do those
things - what matters is that the needs and concerns of system
administrators simply don't appear to have been anywhere on the radar
when setuptools was being designed. (If those concerns actually were
taken into account at some point, it's sure hard to tell from the end
result and the choices of default behaviour)


David Cournapeau's Bento project takes the opposite approach, everything 
is explicit and without any magic.


http://cournape.github.com/Bento/

It had its 0.1.0 release a week ago.

Please, I don't want to reopen any discussions about Bento here -- 
distutils2 vs. Bento discussions have been less than constructive in the 
past -- I just wanted to make sure everybody is aware that distutils2 
isn't the only horse in this race. I don't know if there are others too?


--
Dag Sverre Seljebotn


Re: [Python-Dev] Cannot find the main Python library during installing some app.

2012-06-21 Thread Oleg Broytman
Hello.

   We are sorry but we cannot help you. This mailing list is to work on
developing Python (adding new features to Python itself and fixing bugs);
if you're having problems learning, understanding or using Python, please
find another forum. Probably python-list/comp.lang.python mailing list/news
group is the best place; there are Python developers who participate in it;
you may get a faster, and probably more complete, answer there. See
http://www.python.org/community/ for other lists/news groups/fora. Thank
you for understanding.

On Thu, Jun 21, 2012 at 01:43:37AM -0700, Van Gao  wrote:
> 
>ERROR!
>You probably have to install the development version of the Python
> package
>for your distribution.  The exact name of this package varies among them.
> 

   This is the key. You have to install the development version of the
Python package *for your distribution*, not python from sources.
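
For illustration only (package names vary between distributions; these
are the common ones, not an exhaustive list), the development headers
are typically installed with something like:

```shell
# Debian/Ubuntu: headers and config files for building against Python 2.x
sudo apt-get install python-dev

# Fedora/RHEL/CentOS equivalent
sudo yum install python-devel
```

After that, configure should find Python.h and libpython without extra
LDFLAGS.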

Oleg.
-- 
 Oleg Broytman            http://phdru.name/            p...@phdru.name
   Programmers don't die, they just GOSUB without RETURN.


Re: [Python-Dev] Cannot find the main Python library during installing some app.

2012-06-21 Thread Amaury Forgeot d'Arc
Hi,

This mailing list is for the development *of* Python.
Development *with* Python should be discussed on the python-list mailing
list, or the comp.lang.python usenet group.
There will be many people there willing to answer your question...

2012/6/21 Van Gao 

> hi,
>
> I got the error below during installing the libmobiledevice:
> checking consistency of all components of *python development
> environment...
> no*
> configure: error:
>  Could not link test program to Python. Maybe the main Python library has
> been
>  installed in some non-standard library path. If so, pass it to configure,
>  via the LDFLAGS environment variable.
>  Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"
>
>
> 
>   ERROR!
>   You probably have to install the development version of the Python
> package
>   for your distribution.  The exact name of this package varies among them.
>
>
> 
>
> I have installed Python 2.7, but I cannot find the lib under
> /usr/local/lib/python2.7, *so where can I get the development version for
> python*? I downloaded the Python-2.7.3.tgz from python.org; is there any
> difference between the development version and the tgz file? Thanks.
>
> --
> View this message in context:
> http://python.6.n6.nabble.com/Cannot-find-the-main-Python-library-during-installing-some-app-tp4979076.html
> Sent from the Python - python-dev mailing list archive at Nabble.com.
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> http://mail.python.org/mailman/options/python-dev/amauryfa%40gmail.com
>



-- 
Amaury Forgeot d'Arc


[Python-Dev] Cannot find the main Python library during installing some app.

2012-06-21 Thread Van Gao
hi,

I got the error below during installing the libmobiledevice:
checking consistency of all components of *python development environment...
no*
configure: error: 
  Could not link test program to Python. Maybe the main Python library has
been
  installed in some non-standard library path. If so, pass it to configure,
  via the LDFLAGS environment variable.
  Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"
 

   ERROR!
   You probably have to install the development version of the Python
package
   for your distribution.  The exact name of this package varies among them.
 


I have installed Python 2.7, but I cannot find the lib under
/usr/local/lib/python2.7, *so where can I get the development version for
python*? I downloaded the Python-2.7.3.tgz from python.org; is there any
difference between the development version and the tgz file? Thanks.



Re: [Python-Dev] Status of packaging in 3.3

2012-06-21 Thread Nick Coghlan
On Thu, Jun 21, 2012 at 2:44 PM, Chris McDonough  wrote:
> All of these are really pretty minor issues compared with the main benefit
> of not needing to ship everything with everything else. The killer feature
> is that developers can specify dependencies and users can have those
> dependencies installed automatically in a cross-platform way.  Everything
> else is complete noise if this use case is not served.

Cool. This is the kind of thing we need recorded in a PEP - there's a
lot of domain knowledge floating around in the heads of packaging
folks that needs to be captured so we can know *what the addition of
packaging to the standard library is intended to fix*.

And, like it or not, setuptools has a serious PR problem due to the
fact it monkeypatches the standard library, uses *.pth files to alter
sys.path for every installed application by default, actually *uses*
the ability to run code in *.pth files, and has hard-to-follow
documentation to boot. I *don't* trust that I fully understand the
import system on any machine with setuptools installed, because it is
demonstrably happy to install state to the file system that will
affect *all* Python programs running on the machine.

A packaging PEP needs to explain:
- what needs to be done to eliminate any need for monkeypatching
- what's involved in making sure that *.pth are *not* needed by default
- making sure that executable code in implicitly loaded *.pth files
isn't used *at all*
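
One way to make the current *.pth situation auditable is a small helper
like the following sketch (function name invented; the startswith check
mirrors the rule site.py uses when deciding whether to exec a line):

```python
import os

def find_code_pth_lines(site_dir):
    """Return (filename, line) pairs for *.pth lines that site.py would exec.

    site.py executes any *.pth line beginning with "import" followed by a
    space or a tab; every other non-comment line is treated as a path entry.
    """
    hits = []
    for name in sorted(os.listdir(site_dir)):
        if not name.endswith(".pth"):
            continue
        with open(os.path.join(site_dir, name)) as f:
            for line in f:
                if line.startswith(("import ", "import\t")):
                    hits.append((name, line.rstrip("\n")))
    return hits
```

Running it over the directories in site.getsitepackages() shows exactly
which installed distributions have registered startup code on a machine.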

I *think* trying to achieve this is actually the genesis of the
original distribute fork, that subsequently became distutils2 as Tarek
discovered how much of the complexity in setuptools was actually due
to the desire to *not* officially fork distutils (and instead
monkeypatch it, effectively creating a runtime fork).

However, for those of us that weren't directly involved, this is all
still a strange mystery dealt with by other people. I've cribbed
together bits and pieces just from following the fragments of the
discussions that have happened on python-dev and at PyCon US, but if
we want the madness to ever stop, then *the problems with the status
quo* need to be written down so that other core developers can
understand them.

In fact, I just remembered that Tarek *has* written a lot of this
down, just not in PEP form: http://www.aosabook.org/en/packaging.html

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia