Re: [Distutils] Deferring metadata hooks

2014-03-02 Thread Oscar Benjamin
On 2 March 2014 07:25, Nick Coghlan ncogh...@gmail.com wrote:

 I've just posted updated versions of PEP 426 and 459 that defer the
 metadata hooks feature. The design and behaviour of that extension
 is still way too speculative for me to approve in its current form,
 but I also don't want to hold up the rest of the changes in metadata
 2.0 while we thrash out the details of a hook system.

The other idea that was discussed a few times but hasn't made it into
PEP 426 is the idea of compatibility tags. These are mentioned in the
(deferred) metabuild section:
http://legacy.python.org/dev/peps/pep-0426/#metabuild-system
but nowhere else in the PEP.

I certainly understand the desire to defer dealing with something as
complex as hooks but simple string compatibility tags are a much
simpler thing to include in the metadata and could be very useful. I'm
thinking of a situation where you can indicate things like ABI
compatibility for C/Fortran compiled code (e.g. libc, gfortran vs g77)
but there could easily be many other uses once wheel takes off.
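Since compatibility tags of this kind were never standardised, the following is a purely illustrative sketch of how such tags might be compared; the key names and values (`compatibility_tags`, `fortran_abi`, `libc`) are invented for the example:

```python
# Purely illustrative: compatibility tags were never standardised, so the
# key names and values below are invented for the sake of the example.

wheel_metadata = {
    "name": "numpy",
    "compatibility_tags": {
        "fortran_abi": "gfortran",   # vs "g77": incompatible calling conventions
        "libc": "glibc-2.17",
    },
}

installed_metadata = {
    "name": "scipy",
    "compatibility_tags": {"fortran_abi": "g77"},
}

def abi_conflicts(a, b):
    """Return tags declared by both distributions with differing values."""
    ta = a.get("compatibility_tags", {})
    tb = b.get("compatibility_tags", {})
    # dict.keys() views support set intersection directly
    return sorted(k for k in ta.keys() & tb.keys() if ta[k] != tb[k])

print(abi_conflicts(wheel_metadata, installed_metadata))  # ['fortran_abi']
```

An installer could refuse to proceed (or warn) when any conflict is reported, which is exactly the gfortran-vs-g77 mismatch described above.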

 That said, I still don't want us to get into a situation where someone
 later publishes a wheel file that expects metadata hook support and
 older tools silently install it without running the hooks.

 Accordingly, the revised PEP 426 adds a single simpler feature to the
 extensions system: the idea of a required extension.

 If a project sets that flag for an extension (by including
 "required_extension": true in the extension metadata), and an
 installation tool doesn't understand it, then the tool is required to
 either fail the installation attempt entirely or else fall back to
 installing from source.

 That way, project authors will be able to distinguish between "these
 metadata hooks are just an optimisation; things will still work if you
 don't run them" and "if you don't run these hooks, your installation
 will be broken".

Is there some need for metadata extensions to be optional by default?

 I think this approach may also encourage a design where projects do
 something sensible *by default* (e.g. NumPy defaulting to SSE2) and
 then use the (not yet defined) post-installation hooks to potentially
 *change away* from the default to something more optimised for that
 particular system (e.g. NumPy overwriting itself with an SSE3
 version), while still *allowing* developers to refuse to let the
 software install if the metadata hooks won't be run.

I'm not sure but there does seem to be some discussion and movement
toward the idea of numpy distributing openblas binaries (which would
solve the SSE problem). See the threads starting here for more:
http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069186.html
http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069106.html

(Note that shipping openblas binaries does not solve the ABI mismatch
problems that compatibility tags could address).


Oscar
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Deferring metadata hooks

2014-03-02 Thread Nick Coghlan
On 3 Mar 2014 04:34, Oscar Benjamin oscar.j.benja...@gmail.com wrote:

 On 2 March 2014 07:25, Nick Coghlan ncogh...@gmail.com wrote:
 
  I've just posted updated versions of PEP 426 and 459 that defer the
  metadata hooks feature. The design and behaviour of that extension
  is still way too speculative for me to approve in its current form,
  but I also don't want to hold up the rest of the changes in metadata
  2.0 while we thrash out the details of a hook system.

 The other idea that was discussed a few times but hasn't made it into
 PEP 426 is the idea of compatibility tags. These are mentioned in the
 (deferred) metabuild section:
 http://legacy.python.org/dev/peps/pep-0426/#metabuild-system
 but nowhere else in the PEP.

 I certainly understand the desire to defer dealing with something as
 complex as hooks but simple string compatibility tags are a much
 simpler thing to include in the metadata and could be very useful. I'm
 thinking of a situation where you can indicate things like ABI
 compatibility for C/Fortran compiled code (e.g. libc, gfortran vs g77)
 but there could easily be many other uses once wheel takes off.

There's an issue for a python.expect extension on the metadata tracker
(name TBD; python.constraints would also work). I'll list that as a
deferred feature as well, but similar to the metadata hooks, I'd prefer to
give people a chance to try out the baseline version before we lock in
additional extensions.

  That said, I still don't want us to get into a situation where someone
  later publishes a wheel file that expects metadata hook support and
  older tools silently install it without running the hooks.
 
  Accordingly, the revised PEP 426 adds a single simpler feature to the
  extensions system: the idea of a required extension.
 
  If a project sets that flag for an extension (by including
  "required_extension": true in the extension metadata), and an
  installation tool doesn't understand it, then the tool is required to
  either fail the installation attempt entirely or else fall back to
  installing from source.
 
  That way, project authors will be able to distinguish between "these
  metadata hooks are just an optimisation; things will still work if you
  don't run them" and "if you don't run these hooks, your installation
  will be broken".

 Is there some need for metadata extensions to be optional by default?

Yes, it's because I expect most extensions to be along the lines of the
standard ones, providing additional info for developers and tools *other
than* the installer. "If you can't process this extension, don't install
the software" should be relatively rare.

  I think this approach may also encourage a design where projects do
  something sensible *by default* (e.g. NumPy defaulting to SSE2) and
  then use the (not yet defined) post-installation hooks to potentially
  *change away* from the default to something more optimised for that
  particular system (e.g. NumPy overwriting itself with an SSE3
  version), while still *allowing* developers to refuse to let the
  software install if the metadata hooks won't be run.

 I'm not sure but there does seem to be some discussion and movement
 toward the idea of numpy distributing openblas binaries (which would
 solve the SSE problem). See the threads starting here for more:
 http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069186.html
 http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069106.html

 (Note that shipping openblas binaries does not solve the ABI mismatch
 problems that compatibility tags could address).

So long as NumPy defines and publishes an extension with the relevant
details in its metadata, the metadata constraints extension would
eventually be able to automate consistency checks.

However, I'm starting to think you may be right and it will be worth having
that defined from the beginning, specifically to help ensure we keep the
NumPy dependent wheels on PyPI consistent with each other.

Cheers,
Nick.



 Oscar


Re: [Distutils] Deferring metadata hooks

2014-03-02 Thread Oscar Benjamin
On 2 March 2014 21:05, Nick Coghlan ncogh...@gmail.com wrote:
 
  I think this approach may also encourage a design where projects do
  something sensible *by default* (e.g. NumPy defaulting to SSE2) and
  then use the (not yet defined) post-installation hooks to potentially
  *change away* from the default to something more optimised for that
  particular system (e.g. NumPy overwriting itself with an SSE3
  version), while still *allowing* developers to refuse to let the
  software install if the metadata hooks won't be run.

 I'm not sure but there does seem to be some discussion and movement
 toward the idea of numpy distributing openblas binaries (which would
 solve the SSE problem). See the threads starting here for more:
 http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069186.html
 http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069106.html

 (Note that shipping openblas binaries does not solve the ABI mismatch
 problems that compatibility tags could address).

 So long as NumPy defines and publishes an extension with the relevant
 details in its metadata, the metadata constraints extension would eventually
 be able to automate consistency checks.

 However, I'm starting to think you may be right and it will be worth having
 that defined from the beginning, specifically to help ensure we keep the
 NumPy dependent wheels on PyPI consistent with each other.

I expect that those involved in distributing wheels for the scipy
stack would coordinate and quickly converge on a consistent set of
wheels for Windows/OSX on PyPI so I doubt that it would be an issue in
that sense.

Where it is an issue is for people who install different pieces from
different places i.e. mixing source builds, .exe installers, wheels
from PyPI, and perhaps even conda packages and wheels from other
places. If a mechanism were provided to prevent broken installs and
give helpful error messages, I'm sure it would be taken advantage of.

If it were also possible to upload and select between multiple
variants of a distribution then that might lower the bar for numpy to
distribute e.g. openblas wheels. A user who was unhappy with openblas
could just as easily install an alternative. (Changing the BLAS library
is a big deal for numpy.)


Oscar


[Distutils] Deferring metadata hooks

2014-03-01 Thread Nick Coghlan
I've just posted updated versions of PEP 426 and 459 that defer the
metadata hooks feature. The design and behaviour of that extension
is still way too speculative for me to approve in its current form,
but I also don't want to hold up the rest of the changes in metadata
2.0 while we thrash out the details of a hook system.

That said, I still don't want us to get into a situation where someone
later publishes a wheel file that expects metadata hook support and
older tools silently install it without running the hooks.

Accordingly, the revised PEP 426 adds a single simpler feature to the
extensions system: the idea of a required extension.

If a project sets that flag for an extension (by including
"required_extension": true in the extension metadata), and an
installation tool doesn't understand it, then the tool is required to
either fail the installation attempt entirely or else fall back to
installing from source.
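To make that requirement concrete, here is a minimal sketch of the check an installer might perform. The metadata layout follows the PEP 426 draft, but the helper function and the set of known extensions are hypothetical:

```python
# Hypothetical sketch of an installer honouring "required_extension".
# KNOWN_EXTENSIONS and the surrounding logic are invented for illustration;
# only the metadata layout follows the PEP 426 draft.

KNOWN_EXTENSIONS = {"python.details", "python.exports"}

def unsupported_required_extensions(metadata):
    """Names of required extensions that this tool does not understand."""
    return sorted(
        name
        for name, ext in metadata.get("extensions", {}).items()
        if ext.get("required_extension") and name not in KNOWN_EXTENSIONS
    )

metadata = {
    "name": "example-dist",
    "extensions": {
        "python.hooks": {"required_extension": True},
    },
}

missing = unsupported_required_extensions(metadata)
if missing:
    # Per the PEP: fail the install outright, or fall back to a source build.
    print("cannot install wheel; unsupported required extensions:", missing)
```

Extensions without the flag would simply be skipped, which is the "optional by default" behaviour discussed below.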

That way, project authors will be able to distinguish between "these
metadata hooks are just an optimisation; things will still work if you
don't run them" and "if you don't run these hooks, your installation
will be broken".

I think this approach may also encourage a design where projects do
something sensible *by default* (e.g. NumPy defaulting to SSE2) and
then use the (not yet defined) post-installation hooks to potentially
*change away* from the default to something more optimised for that
particular system (e.g. NumPy overwriting itself with an SSE3
version), while still *allowing* developers to refuse to let the
software install if the metadata hooks won't be run.

Regards,
Nick.

P.S. The draft PEP for metadata hooks is still available at
https://bitbucket.org/pypa/pypi-metadata-formats/src/default/metadata-hooks.rst

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia