I've posted about this idea to the list before, but this time I've
finally started working on it and have a concrete plan to discuss :)

The basic idea:

* I want to progressively move the active interoperability
specifications out of PEPs and into a subsection of
packaging.python.org
* packaging PEPs would then become a tool for changing those
specifications, rather than the specifications themselves
* the description of this process would be captured along with the
rest of the PyPA operational docs at pypa.io
* there's a draft PR to start down this path at
https://github.com/pypa/pypa.io/pull/12

That PR illustrates the potential benefits of this approach: it's
able to state which parts of PEP 345 have been superseded by other
PEPs, and to note that the "Provides-Extra" field exists as a de
facto standard, despite never being formally adopted through the PEP
process.
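
To make the "Provides-Extra" case concrete: a built distribution's
metadata declares an optional feature with a pair of fields along
these lines (the project name here is made up, and I've left out the
usual Metadata-Version and other headers, but the field syntax is the
part tools already treat as the de facto standard):

    Name: example-project
    Version: 1.0
    Provides-Extra: test
    Requires-Dist: pytest; extra == 'test'

That's exactly the kind of existing practice a specifications section
would let us document directly, instead of leaving it implicit in
tool behaviour.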

However, having written the draft PR entirely against pypa.io, I've
now gone back to thinking packaging.python.org would be a better fit
for actually hosting the agreed specifications - the "python.org"
domain is far better known than "pypa.io", and being on a subdomain
of python.org more clearly establishes these interoperability
specifications as the Python packaging counterparts to the Python
Language Reference and Python Library Reference.

So my next iteration will be split into two PRs: one for pypa.io
defining the specification management process, and one for
packaging.python.org adding a specifications section.

Once those changes are merged, we'll end up with additional, lower
overhead ways to handle minor updates to the specifications: pure
clarifications can be handled through PRs and issues against
packaging.python.org; minor updates, or changes bringing the
specifications into line with real world practice, can be handled
through a distutils-sig discussion; while larger, more complex (or
more controversial) updates will still need to go through the full
PEP process.

Some additional background:

For folks who haven't used the PEP process to work on CPython itself,
here's how it works there:

- major or controversial proposals get a standards track PEP
- that gets debated/discussed on python-dev (perhaps with a
preliminary discussion on python-ideas or one of the SIGs)
- if Accepted, the relevant changes get made to CPython, including the
language and library reference
- the PEP is marked Final
- at this point, CPython and its docs are the authoritative source of
information, NOT the PEP
- future minor updates are handled as tracker issues, with the full
PEP process only invoked again for major or controversial changes

The key point there is that once the PEP is marked Final it becomes a
*historical document*, so there's no need to have a
meta-change-management process for the PEP itself.

Distutils originally used PEPs in at least something resembling that
way, since the standard library's distutils was the reference
implementation, and packaging standards evolved at the same pace as
the rest of the standard library. We broke that model when we moved
to using the independently developed pip and setuptools as the
reference implementations.

We've since been using the PEP process in a way that's a bit more
like how IETF RFCs work, and I think we can all agree that's been a
pretty clumsy and horrible way to run things - the PEP process really
wasn't designed to be used that way, and it shows.

The approach I'm proposing we switch to gets us back to something
much closer to the way CPython uses the PEP process, which should
help both the folks trying to figure out the *current* approaches to
interoperability handling, and those of us trying to work on
improvements to those standards.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia