On Fri, Mar 10, 2017 at 7:55 AM, Nick Coghlan <[email protected]> wrote:
> On 11 March 2017 at 00:52, Nathaniel Smith <[email protected]> wrote:
>>
>> On Fri, Mar 10, 2017 at 1:26 AM, Nick Coghlan <[email protected]> wrote:
>> > Hi folks,
>> >
>> > After a few years of dormancy, I've finally moved the metadata 2.0
>> > specification back to Draft status:
>> >
>> > https://github.com/python/peps/commit/8ae8b612d4ea8b3bf5d8a7b795ae8aec48bbb7a3
>>
>> We have lots of metadata files in the wild that already claim to be
>> version 2.0. If you're reviving this I think you might need to change
>> the version number?
>
> They're mostly in metadata.json files, though. That said, version
> numbers are cheap, so I'm happy to skip straight to 3.0 if folks think
> it makes more sense.

AFAICT bdist_wheel produces METADATA files with Metadata-Version: 2.0
by default, and has for some time; certainly the one I just
spot-checked does.
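(If anyone wants to spot-check their own wheels, here's a quick sketch;
the wheel filename is made up, but a wheel is just a zip file so this
works on any of them:)

    import zipfile

    # Open a wheel and print the Metadata-Version declared in its
    # .dist-info/METADATA file.
    with zipfile.ZipFile("numpy-1.12.0-cp36-cp36m-manylinux1_x86_64.whl") as whl:
        metadata_name = next(n for n in whl.namelist()
                             if n.endswith(".dist-info/METADATA"))
        for line in whl.read(metadata_name).decode("utf-8").splitlines():
            if line.startswith("Metadata-Version:"):
                print(line)  # e.g. "Metadata-Version: 2.0"
                break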
>> > Based on our last round of discussion, I've culled a lot of the
>> > complexity around dependency declarations, cutting it back to just
>> > 4 pre-declared extras (dev, doc, build, test),
>>
>> I think we can drop 'build' in favor of pyproject.toml?
>
> No, as that's a human edited input file, not an output file from the
> sdist generation process.
>
>> Actually all of the pre-declared extras are really relevant for
>> sdists rather than wheels. Maybe they should all move into
>> pyproject.toml?
>
> Think "static release metadata in an API response from PyPI" for this
> particular specification, rather than something you'd necessarily
> check into source control. That's actually one of the big benefits of
> doing this post pyproject.toml - with that taking care of the build
> system bootstrapping problem, it frees up pydist.json to be entirely
> an artifact of the sdist generation process (and then copying it along
> to the wheel archives and the installed package as well).
>
> That said, that's actually an important open question: is pydist.json
> always preserved unmodified through the sdist->wheel->install and
> sdist->install process?
>
> There's a lot to be said for treating the file as immutable, and
> instead adding *other* metadata files as a component moves through the
> distribution process. If so, then it may actually be more appropriate
> to call the rendered file "pysdist.json", since it contains the sdist
> metadata specifically, rather than arbitrary distribution metadata.

I guess there are three possible kinds of build dependencies:

- (A) those that are known statically
- (B) those that are determined by running some code at sdist creation
  time
- (C) those that are determined by running some code at build time

But all the examples I can think of fall into either bucket (A), which
pyproject.toml handles, or bucket (C), which pydist.json can't handle.
So it seems like the metadata here is either going to be redundant or
wrong?

I'm not sure I understand the motivation for wanting wheels to have a
file which says "here's the metadata describing the sdist that you
would have, if you had an sdist (which you don't)". I guess it doesn't
hurt anything, but it seems odd.

> I'd also be fairly strongly opposed to converting extras from an
> optional dependency management system to a "let multiple PyPI packages
> target the same site-packages subdirectory" scheme, because we already
> know that's a nightmare from the Linux distro experience (having a
> clear "main" package that owns the parent directory with optional
> subpackages solves *some* of the problems, but my main reaction is
> still "Run awaaay").

The "let multiple PyPI packages target the same site-packages
directory" problem is orthogonal to the reified extras proposal. I
actually think we can't avoid handling the same-site-packages-directory
problem, but the solution there is namespace packages and/or better
Conflicts: metadata.

An example illustrating why the site-packages conflict problem arises
independently of reified extras: people want to distribute numpy built
against different BLAS backends, especially MKL (which is better, but
proprietary, albeit zero-cost) versus OpenBLAS (which is not as good,
but free). Right now that's possible by distributing separate 'numpy'
and 'numpy-mkl' packages, but of course ugly stuff happens if you try
to install both; some sort of Conflicts: metadata would help.
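(For concreteness, that might look something like this in numpy-mkl's
metadata; Conflicts-Dist is a hypothetical field here, not something
any current spec defines:)

    Name: numpy-mkl
    Version: 1.12.0
    Conflicts-Dist: numpy

i.e., a plain declaration that installers could use to refuse to put
both packages into the same environment.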
If we instead have the packages be named 'numpy' and 'numpy[mkl]', then
they're in exactly the same position with respect to conflicts. The
very significant advantage is that we know that 'numpy[mkl]' "belongs
to" the numpy project, so 'numpy[mkl]' can say 'Provides-Dist: numpy'
without all the security issues that Provides-Dist otherwise runs into.

An example illustrating why reified extras are useful totally
independently of site-packages conflicts: it would be REALLY NICE if
numpy could say 'Provides-Dist: numpy[abi=7]' and then packages could
depend on 'numpy[abi=7]' and have that do something sensible. This
would be a pure virtual package.

> It especially isn't needed just to solve the "pip forgets what extras
> it installed" problem - that technically doesn't even need a PEP to
> resolve, it just needs pip to drop a pip-specific file into the PEP
> 376 dist-info directory that says what extras to request when doing
> future upgrades.

But that breaks if people use a package manager other than pip, which
is something we want to support, right? And in any case it requires a
bunch more redundant special-case logic inside pip, to basically make
extras act like virtual packages.
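(To make the extras-as-virtual-packages idea concrete, the two examples
above might render as metadata along these lines; all of this syntax is
hypothetical, none of it is in any accepted spec:)

    In numpy[mkl]'s metadata:    Provides-Dist: numpy
    In numpy's own metadata:     Provides-Dist: numpy[abi=7]
    In a downstream package:     Requires-Dist: numpy[abi=7]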
-n

--
Nathaniel J. Smith -- https://vorpus.org