Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On 7 October 2015 at 22:53, Nathaniel Smith wrote:
>> I think I'm as confused by what you're saying here as Donald is. Could
>> you give a few examples of such projects? I'd like to go & take a look
>> at them and try to understand what they are doing that is so
>> incompatible with what Donald and I are thinking of as a "source
>> wheel".
>
> An example would be flit itself:
> https://github.com/takluyver/flit
> https://pypi.python.org/pypi/flit
>
> It's not that you couldn't support a "source wheel" here, it's just
> that forcing them to go checkout -> source wheel -> wheel would be
> adding pointless hassle while accomplishing nothing useful. pip would
> never actually touch the source wheel, and for the remaining use cases
> for source distribution, a classic "source release" that's basically a
> tarball of a VCS checkout + static version number would be more
> familiar and useful.

I'm not sure I follow. If you have a binary wheel of flit, "pip install flit" won't need a source wheel, certainly (that's just as true for flit as for something complex like numpy). But distro packagers would still want a source wheel to build their packages.

If you mean that flit itself wouldn't use a source wheel, then while that may well be true, it's hardly relevant - whether flit chooses to use a source wheel is its own choice. But I'd hope flit *could* use a source wheel, as otherwise I couldn't use it to build wheels for other projects which want to use it and distribute source wheels. Should any such exist - this is pretty hypothetical at this point, and so not likely to be very productive.

I am inclined to think that we're basically in agreement, we're just confused over terminology, and/or worrying about hypothetical cases. Would it help if I said that the *only* distinction between "source release" and source wheel that I care about is that in a source wheel the metadata must be static?
We can discuss what metadata precisely, and we can thrash out other differences that might make more use of the fact that conceptually a "source release" is for humans to work with whereas a source wheel is for tools to consume, but those are details. I'm not clear if you think I have some more complicated picture than that, but really I don't [1].

Paul

[1] I'd like a source wheel to have a defined format, but even that's not a killer. A zipfile with 2 directories - "metadata" containing machine-readable static metadata, and "source" with the complete contents of a source release - would do me. Of course when you build, if the metadata the build produces doesn't match the static data, that's a bug in the project's packaging and we'd want to guard against it (the fact that we can't rely on the static data is the main reason it's useless in the current sdist format :-(). We can thrash this sort of stuff out, though.

___
Distutils-SIG maillist - Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
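[Editor's note: Paul's footnote describes a layout that is easy to prototype. Below is a minimal, purely illustrative sketch of that two-directory zip; the "metadata"/"source" directory names and the PKG-INFO-style fields come from the footnote, not from any agreed spec, and the helper name is invented.]

```python
import io
import zipfile


def make_source_wheel(name, version, source_files):
    """Build an in-memory zip with the layout Paul sketches:
    static metadata under metadata/, and the complete source
    release under source/. A sketch only, not a real format."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        # Hypothetical machine-readable static metadata.
        zf.writestr("metadata/METADATA",
                    f"Name: {name}\nVersion: {version}\n")
        for path, data in source_files.items():
            zf.writestr(f"source/{path}", data)
    buf.seek(0)
    return buf


archive = make_source_wheel("flit", "0.1", {"flit/__init__.py": ""})
print(zipfile.ZipFile(archive).namelist())
# ['metadata/METADATA', 'source/flit/__init__.py']
```

The point of the layout is that a consumer can read metadata/METADATA without ever running a build step from source/.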
Re: [Distutils] pbr issues (was: Where should I put tests when packaging python modules?)
On 8 October 2015 at 04:32, Erik Bray wrote:
> Starting a sub-thread since issues with pbr are orthogonal to the
> original discussion.
>
> But one point I'd like to raise about this is that when I originally
> designed d2to1, on which a chunk of pbr is based, it was *explicitly*
> designed to never be installed in site-packages (with the exception of
> downstream packaging systems which can do what they want and are more
> controlled). This is exactly because I knew different packages might
> have dependencies on different versions of d2to1 as features are
> added, and that if some version is installed in site-packages it can
> lead to VersionConflict issues (this is in part exacerbated by a
> bug/misfeature in setuptools--I fixed that bug a while ago but the fix
> had to be rolled back due to a regression [1]).

So yes - that principle makes a lot of sense. There are two factors for pbr.

a) as Donald mentions, there's a runtime API too - but we could indeed split that out into a separate package, if it would help things.

b) more importantly, in OpenStack infra we can't use easy_install - its inconsistency with pip and its lack of updated handling for wheels, HTTPS and separate configuration cause headaches every single time it runs. So many years ago we put in place pre-installation of all known build-time dependencies - we just install them statically, because we find the effort required to keep them compatible is less than the headaches from easy_install. As such pbr has a hard API requirement: thou shalt be backwards compatible.

Clearly b) can hit VersionConflicts if pbr (and any other build dependencies like setuptools itself) are out of date, but that is easily handled for automated environments (pip install -U pip setuptools wheel pbr && echo YAY), and it errors cleanly enough for hand-use by folk that it's a decent enough tradeoff in our experience.
> I don't know what features pbr has grown that might make someone want
> it to be a runtime dependency (the only entry-points I noticed were
> for adding egg-info writers but that should only be needed at
> build-time too...), but maybe something like that should be split off
> as a separate module or something...

It's all about easy_install. This is why I put a proof-of-concept static-setup-requires thing together for pip (and it's on our team's roadmap to deliver a production version of it via patches to all the pypa projects; we're just not at that point yet - the resolver is first, and we have to finish rolling out constraints within OpenStack before that gets large timeslices).

-Rob

--
Robert Collins
Distinguished Technologist
HP Converged Cloud
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On 7 October 2015 at 20:36, Oscar Benjamin wrote:
> Currently I can take the code from the numpy release and compile it in
> different incompatible ways. For example I could make a wheel that bundles a
> BLAS library. Or I could make a wheel that expects to use a system BLAS
> library that should be installed separately somehow, or I could build a wheel
> against pyopenblas and make a wheel that depends on pyopenblas. Or I could
> link a BLAS library statically into numpy.
>
> A numpy release supports being compiled and linked in many different ways
> and will continue to do so regardless of any decisions made by PyPA. What
> that means is that there is not a one-to-one correspondence between a numpy
> release and a binary wheel. If there must be a one-to-one correspondence
> between a source wheel and a binary wheel then it follows that there cannot
> be a one-to-one correspondence between the source release and a source
> wheel.
>
> Of course numpy could say that they will only upload one particular source
> wheel and binary wheel to PyPI but people need to be able to use the source
> release in many different ways. So only releasing a source wheel that maps
> one-to-one to a particular way of compiling numpy is not an acceptable way
> for numpy to release its code.

The disconnect here seems to be that I view all of those wheels as being numpy 1.9.X wheels (or whatever). They differ in terms of compatibility details, but they are all wheels for the same project/version. So there's no problem with them all being built from the same source wheel. I also have no problem with it being possible to configure the build differently from a single source wheel, to generate all those wheels. The configuration isn't metadata, it's "just" settings for the build.

Of course, there *is* an unsolved issue here, which is how we manage compatibility for wheels at the level needed for numpy. But I thought the discussion on that was ongoing?
I'm concerned that this proposal is actually about bypassing that discussion, and instead trying to treat incompatibly linked wheels as "different" in terms of project metadata, which I think is the wrong way of handling things. I note that Christoph Gohlke's numpy builds are tagged with a "+mkl" local version modifier - that's presumably intended to mark the fact that they are built with an incompatible runtime - but that's a misuse of local versions (and I've found it causes niggling issues with how pip recognises upgrades, etc).

So, in summary: your points above don't seem to me to preclude in any way having a single numpy source wheel, and a number of (mutually incompatible, but the same in terms of project and version) binary wheels.

Paul
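[Editor's note: for context, the "+mkl" suffix Paul mentions is a PEP 440 local version identifier - a "+" separates the public version from a local segment, and the public part of "1.9.2+mkl" is the same release as "1.9.2", which is exactly why it is a poor channel for expressing real compatibility differences. A minimal sketch of the split (hand-rolled here rather than using the `packaging` library):]

```python
def split_local_version(version):
    """Split a PEP 440-style version string into (public, local).
    The local segment, if present, follows a single '+'."""
    public, sep, local = version.partition("+")
    return public, (local if sep else None)


print(split_local_version("1.9.2+mkl"))  # ('1.9.2', 'mkl')
print(split_local_version("1.9.2"))      # ('1.9.2', None)
```

Since both strings share the public version "1.9.2", version-ordering tools see them as variants of one release rather than as packages with different dependencies - hence the upgrade-detection niggles Paul describes.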
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On Wed, Oct 7, 2015 at 11:14 AM, Paul Moore wrote:
> On 7 October 2015 at 18:27, Nathaniel Smith wrote:
>> There are projects on PyPI right now, today, that have no way to
>> generate sdists and will never have any need for "source wheels"
>
> I think I'm as confused by what you're saying here as Donald is. Could
> you give a few examples of such projects? I'd like to go & take a look
> at them and try to understand what they are doing that is so
> incompatible with what Donald and I are thinking of as a "source
> wheel".

An example would be flit itself:

https://github.com/takluyver/flit
https://pypi.python.org/pypi/flit

It's not that you couldn't support a "source wheel" here, it's just that forcing them to go checkout -> source wheel -> wheel would be adding pointless hassle while accomplishing nothing useful. pip would never actually touch the source wheel, and for the remaining use cases for source distribution, a classic "source release" that's basically a tarball of a VCS checkout + static version number would be more familiar and useful.

-n

--
Nathaniel J. Smith -- http://vorpus.org
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On 7 October 2015 at 22:28, Nathaniel Smith wrote:
> Maybe I have misunderstood: does it actually help pip at all to have
> static access to name and version, but not to anything else? I've been
> assuming not, but I don't think anyone's pointed to any examples yet
> of the problems that pip is encountering due to the lack of static
> metadata -- would this actually be enough to solve them?

The principle I am working on is that *all* metadata in a source wheel should be statically available - that's not just for pip, but for all other consumers, including distro packagers. What's not set in stone is precisely what (subsets of) metadata are appropriate for source wheels as opposed to (binary) wheels.

So I'd counter your question with the converse - what metadata specifically are you unwilling to include statically in source wheels? My feeling is that there isn't anything you'd be unwilling to include that I'd consider as "source wheel metadata".

Possibly the nearest we'd have to an issue is over allowing the build process to *add* dependencies to a binary wheel (e.g. some builds depend on a currently-hypothetical MKL wheel, which provides needed DLLs). I don't in principle object to that, but I'd like to see a fleshed-out proposal on how wheels containing just DLLs (as opposed to Python packages) would work in practice - until we have a mechanism for building/distributing such wheels, I think it's premature to worry about specifying dependencies.

But whatever comes out of this, the Metadata 2.0 spec should ultimately be updated to note which metadata is mandated in source wheels, and which in binary wheels only.

Paul
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On Wed, Oct 7, 2015 at 1:28 PM, Paul Moore wrote:
> The disconnect here seems to be that I view all of those wheels as
> being numpy 1.9.X wheels (or whatever). They differ in terms of
> compatibility details, but they are all wheels for the same
> project/version. So there's no problem with them all being built from
> the same source wheel. I also have no problem with it being possible
> to configure the build differently from a single source wheel, to
> generate all those wheels. The configuration isn't metadata, it's
> "just" settings for the build.

But the different builds for the different configurations end up with different metadata. If I'm understanding right, the whole point of "source wheels" is that they have all the static metadata that pip needs in order to make decisions, and this has to match the resulting wheels -- right?

The way I'm imagining it is that there are multiple levels of metadata staticness:

* package name, author, description, ...: static in VCS checkouts, source releases, source wheels, wheels
* package version: static in source releases, source wheels, wheels
* package dependencies: static in source wheels, wheels
* environment tag: static in wheels

> Of course, there *is* an unsolved issue here, which is how we manage
> compatibility for wheels at the level needed for numpy. But I thought
> the discussion on that was ongoing? I'm concerned that this proposal
> is actually about bypassing that discussion, and instead trying to
> treat incompatibly linked wheels as "different" in terms of project
> metadata, which I think is the wrong way of handling things. I note
> that Christoph Gohlke's numpy builds are tagged with a "+mkl" local
> version modifier - that's presumably intended to mark the fact that
> they are built with an incompatible runtime - but that's a misuse of
> local versions (and I've found it causes niggling issues with how pip
> recognises upgrades, etc).
Yeah, that's not a good long-term solution -- it needs to be moved into the metadata (probably by creating an MKL wheel and then making the numpy wheel depend on it). That's exactly the problem :-)

> So, in summary: Your points above don't seem to me to in any way
> preclude having a single numpy source wheel, and a number of (mutually
> incompatible, but the same in terms of project and version) binary
> wheels.

Maybe I have misunderstood: does it actually help pip at all to have static access to name and version, but not to anything else? I've been assuming not, but I don't think anyone's pointed to any examples yet of the problems that pip is encountering due to the lack of static metadata -- would this actually be enough to solve them?

-n

--
Nathaniel J. Smith -- http://vorpus.org
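[Editor's note: Nathaniel's "levels of staticness" table earlier in this message is simple enough to express as data. A sketch of that idea - the field names and artifact ordering are taken straight from his table, not from any spec:]

```python
# Artifact kinds ordered from least to most "baked down".
ARTIFACTS = ["vcs checkout", "source release", "source wheel", "wheel"]

# Field -> first artifact in which it becomes static, per the table
# in the email (an illustration, not a standardised mapping).
STATIC_FROM = {
    "name": "vcs checkout",
    "version": "source release",
    "dependencies": "source wheel",
    "environment tag": "wheel",
}


def static_in(field, artifact):
    """True if `field` is already static by the time we reach `artifact`."""
    return ARTIFACTS.index(artifact) >= ARTIFACTS.index(STATIC_FROM[field])


print(static_in("version", "vcs checkout"))  # False
print(static_in("dependencies", "wheel"))    # True
```

Reading the mapping this way makes the disagreement concrete: the question in the thread is whether "dependencies" really can be pinned one level earlier than "environment tag" for projects like numpy.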
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 4:42 PM, Ionel Cristian Mărieș wrote:
> On Wed, Oct 7, 2015 at 3:18 PM, Donald Stufft wrote:
>
>> tox and setup.py test are not really equivalent. There’s no way (to my
>> knowledge) to test the item outside of a virtual environment. This is
>> important for downstreams who want to test that the package builds and the
>> tests execute successfully in their environment, not within some
>> virtual environment.
>
> Hmm ... you're right. But making Tox not use virtualenvs is not
> impossible - much like how Detox works, we could have a "Tax"
> (just made that up) that just skips making any virtualenv. It's a matter of
> making two subclasses and a console_scripts entrypoint (I think). I think
> it's a good name: ``use Tax instead of Tox if you wanna "tax" your global
> site-packages`` :-)

Just for kicks, I verified this; it's not hard at all: https://pypi.python.org/pypi/tax

Barry may want to look at it, in case he has too many tox.ini files to copy-paste from :-)

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On October 7, 2015 at 5:28:54 PM, Nathaniel Smith (n...@pobox.com) wrote:
> Yeah, that's not a good long term solution -- it needs to be moved
> into the metadata (probably by creating an MKL wheel and then making
> the numpy wheel depend on it). That's exactly the problem :-)

Are you available on IRC or for a video call or something? I feel like there's something foundational from both sides that we're each missing here, and it'd be easier to just hash it out in real time rather than lobbing random emails coming from places of confusion (at least on my side). I'm not sure if Paul (or anyone else!) would want to jump in on it too, though I feel like if it's me and you then the two "sides" will probably be reasonably well represented, so if more folks don't want to join that's probably OK too, particularly since we wouldn't be making any actual decisions there :D

- Donald Stufft
PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
Re: [Distutils] tests location: Use case: new comers and docs.
On Tue, Oct 6, 2015 at 10:38 PM, Thomas Güttler <guettl...@thomas-guettler.de> wrote:
> Yes, there is no generic "one right way" here.
>
> Yes, let's consider individual use cases.
>
> My use case is the docs for newcomers:
>
> - https://github.com/pypa/sampleproject
> - https://packaging.python.org/en/latest/distributing/
>
> That's why I started the thread.

Unfortunately, that isn't a use case -- every newcomer has a different use case.

I was happy to see this thread, because I thought maybe I'd learn what I should teach my students - new to Python. But alas - there clearly really is no consensus.

What I've told newbies in the past is something like:

"""
If you want your user to be able to install your package, and then run something like:

    import my_package
    my_package.test()

then put your tests inside the package. If you are fine with only being able to run the tests from the source tree -- then put your tests outside the package.
"""

But really, newbies have no idea how to make this decision. Maybe we could come up with a decision tree for this -- some guidance for knowing what to do, when?

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R  (206) 526-6959 voice
7600 Sand Point Way NE  (206) 526-6329 fax
Seattle, WA 98115  (206) 526-6317 main reception

chris.bar...@noaa.gov
Re: [Distutils] pbr issues (was: Where should I put tests when packaging python modules?)
On Wed, Oct 7, 2015 at 3:39 PM, Robert Collins wrote:
[...]
> Its all about easy-install. This is why I put a proof of concept
> static-setup-requires thing together for pip (and its on our teams
> roadmap to deliver a production version of it via patches to all the
> pypa projects, we're just not at that point yet - the resolver is
> first, and we have to finish rolling out constraints within OpenStack
> before that gets large timeslices).

I remembered Nick saying something about this at PyCon, but I couldn't find anything when I looked -- could you point me to the PoC?

-n

--
Nathaniel J. Smith -- http://vorpus.org
Re: [Distutils] Where should I put tests when packaging python modules?
On Oct 07, 2015, at 09:46 AM, Ben Finney wrote:
> So “I'm a big fan of putting tests inside the [Python] package
> [directory]” can't be motivated by “Having the tests there in the
> installed package”. The two aren't related, AFAICT.

It makes it easier, for sure. When the tests are inside the package, nothing special has to be done; you just install the package and the tests subdirectories come along for the ride. If the tests are outside the package then you first have to figure out where they're going to go when they're installed, and then do something special to get them there.

Cheers,
-Barry
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On Wed, 7 Oct 2015 19:42, Donald Stufft wrote:
> On October 7, 2015 at 2:31:03 PM, Oscar Benjamin (oscar.j.benja...@gmail.com) wrote:
>> Your idea of an sdist as something that has fully static build/runtime
>> dependency metadata and a one to one correspondence with binary
>> wheels is not a usable format when releasing the code for e.g.
>> numpy 1.10. It's fine to say that pip/PyPI should work with the
>> source in some other distribution format and numpy could produce
>> that but it means that the standard tarball release needs to be
>> supported somehow separately. Numpy should be able to use PyPI
>> in order to host the tarball even if pip ignores the file.
>>
>> If numpy released only source wheels then there would be more
>> than one source wheel for each release corresponding to e.g.
>> the different ways that numpy is linked. There still needs to
>> be a way to release a single file representing the code for the
>> release as a whole.
>
> Can you expand on this please? I've never used numpy for anything
> serious and I'm trying to figure out why and what parts of what I'm
> thinking of wouldn't work for it.

Currently I can take the code from the numpy release and compile it in different incompatible ways. For example I could make a wheel that bundles a BLAS library. Or I could make a wheel that expects to use a system BLAS library that should be installed separately somehow, or I could build a wheel against pyopenblas and make a wheel that depends on pyopenblas. Or I could link a BLAS library statically into numpy.

A numpy release supports being compiled and linked in many different ways and will continue to do so regardless of any decisions made by PyPA. What that means is that there is not a one-to-one correspondence between a numpy release and a binary wheel. If there must be a one-to-one correspondence between a source wheel and a binary wheel then it follows that there cannot be a one-to-one correspondence between the source release and a source wheel.
Of course numpy could say that they will only upload one particular source wheel and binary wheel to PyPI but people need to be able to use the source release in many different ways. So only releasing a source wheel that maps one to one to a particular way of compiling numpy is not an acceptable way for numpy to release its code.

-- Oscar
Re: [Distutils] Where should I put tests when packaging python modules?
On Oct 7, 2015 12:44 AM, "Marius Gedminas" wrote:
>
> On Tue, Oct 06, 2015 at 05:21:27PM -0400, Barry Warsaw wrote:
> > On Oct 06, 2015, at 06:21 AM, Donald Stufft wrote:
> >
> > >FreeBSD relies on ``python setup.py test`` as its preferred test invocation,
> > >so it apparently doesn't find it useful either.
> >
> > Oh how I wish there was a standard way to *declare* how to run the test suite,
> > such that all our automated tools (or the humans :) didn't have to guess.

make test

> I have hopes for 'tox.ini' becoming the standard way to test a Python
> project.

* https://tox.readthedocs.org/en/latest/config.html
* https://github.com/docker/docker-registry/blob/master/tox.ini #flake8
* dox = docker + tox | PyPI: https://pypi.python.org/pypi/dox | Src: https://git.openstack.org/cgit/stackforge/dox/tree/dox.yml
* docker-compose.yml | Docs: https://docs.docker.com/compose/ | Docs: https://github.com/docker/compose/blob/master/docs/yml.md
* https://github.com/kelseyhightower/kubernetes-docker-files/blob/master/docker-compose.yml
* https://github.com/kubernetes/kubernetes/blob/master/docs/user-guide/pods.md#alternatives-considered
* https://github.com/docker/docker/issues/8781 ( pods ( containers ) )
* http://docs.buildbot.net/latest/tutorial/docker.html
* http://docs.buildbot.net/current/tutorial/docker.html#building-and-running-buildbot

tox.ini often is not sufficient:

* [Makefile: make test/tox]
* setup.py
* tox.ini
* docker/platform-ver/Dockerfile
* [dox.yml]
* [docker-compose.yml]
* [CI config]
  * http://docs.buildbot.net/current/manual/configuration.html
  * jenkins-kubernetes, jenkins-mesos

> Marius Gedminas
> --
> "Actually, the Singularity seems rather useful in the entire work avoidance
> field.
> "I _could_ write up that report now but if I put it off, I may well
> become a weakly godlike entity, at which point not only will I be able to
> type faster but my comments will be more on-target." - James Nicoll
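[Editor's note: for readers following this thread, a minimal tox.ini of the kind Marius hopes becomes standard might look like the following. The env names and the pytest dependency are illustrative assumptions, not anything mandated by tox.]

```ini
[tox]
envlist = py27, py34

[testenv]
deps = pytest
commands = pytest {posargs}
```

With a file like this, "how do I run the tests?" has one answer - `tox` - regardless of the project's internal layout, which is exactly the declarative property Barry wishes for above.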
Re: [Distutils] Where should I put tests when packaging python modules?
On Oct 7, 2015 6:58 AM, "Ionel Cristian Mărieș" wrote:
>
> On Wed, Oct 7, 2015 at 8:12 AM, Thomas Güttler <guettl...@thomas-guettler.de> wrote:
>>
>> I thought "easy_install" is a very old and deprecated method.
>
> Indeed it is. That's why people put all sorts of custom "test" commands in
> their setup.py to work around the deficiencies of the "test" command
> setuptools provides. So we end up with lots of variations of "how to use
> pytest to run tests via `setup.py test`", "how to use pip to install deps,
> instead of what `setup.py test` normally does" and so on.
>
> If you're gonna implement a test runner in your setup.py you might as well
> use a supported and well maintained tool: tox.
>
>> Why not use `setup.py test`?
>
> Because:
>
> 1. There's Tox, which does exactly that, and more. It's maintained. It gets
> features. Tox rocks.

* detox can run concurrent processes: https://pypi.python.org/pypi/detox/
* TIL timeit.default_timer measures **wall time** by default and not CPU time: concurrent test timings are likely different from linear tests run on a machine with load

> 2. The "test" command will install the "test_requires" dependencies as
> eggs. You will end up with multiple versions of the same eggs right in your
> source checkout.

* is there no way around this?
* is this required / spec'd / fixable?

> 3. The "test" command will install the "test_requires" dependencies with
> easy_install. That means wheels cannot be used.

would it be possible to add this to wheel? as if, after package deployment, in-situ tests are no longer relevant. (I think it wise to encourage TDD here)

> 4. Because the builtin "test" command is so bare people tend to implement a
> custom one. Everyone does something slightly different, and slightly buggy.
* README.rst test invocation examples (all, subset, one)
* Makefile (make test; [vim] :make)
* python setup.py nosetests http://nose.readthedocs.org/en/latest/api/commands.html
* python setup.py [test] https://pytest.org/latest/goodpractises.html#integrating-with-setuptools-python-setup-py-test

> 5. There's no established tooling that relies on `setup.py test`. There
> isn't even a test result protocol like TAP [1] for it. Why use something so
> limited and outdated if there's no practical advantage?

* xUnit XML: https://westurner.org/wiki/awesome-python-testing#xunit-xml
  * https://en.wikipedia.org/wiki/XUnit
  * https://nose.readthedocs.org/en/latest/plugins/xunit.html
  * http://nosexunit.sourceforge.net/
  * https://pytest.org/latest/usage.html#creating-junitxml-format-files
  * https://github.com/xmlrunner/unittest-xml-reporting
  * https://github.com/zandev/shunit2/compare/master...jeremycarroll:master
* TAP protocol

> [1] https://testanything.org/
>
> Thanks,
> -- Ionel Cristian Mărieș, http://blog.ionelmc.ro
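[Editor's note: the pytest "good practices" link above described a pattern along the following lines - a custom "test" command that shells out to pytest rather than letting setuptools easy_install eggs into the checkout. This is a hedged sketch of that pattern (the class name is illustrative, and later guidance moved away from `setup.py test` entirely):]

```python
import subprocess
import sys

from setuptools import Command


class PyTest(Command):
    """A 'setup.py test' command that delegates to pytest."""
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # Re-use the current interpreter so the tests see the same
        # environment the package was built/installed into.
        errno = subprocess.call([sys.executable, "-m", "pytest"])
        raise SystemExit(errno)


# Wired up in setup() via: cmdclass={"test": PyTest}
```

This is exactly the "everyone writes their own slightly different, slightly buggy version" situation Ionel's point 4 complains about.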
Re: [Distutils] Where should I put tests when packaging python modules?
On October 7, 2015 at 7:58:55 AM, Ionel Cristian Mărieș (cont...@ionelmc.ro) wrote:
> On Wed, Oct 7, 2015 at 8:12 AM, Thomas Güttler wrote:
>> Why not use `setup.py test`?
>
> Because:
>
> 1. There's Tox, which does exactly that, and more. It's maintained. It
> gets features.

tox and setup.py test are not really equivalent. There’s no way (to my knowledge) to test the item outside of a virtual environment. This is important for downstreams who want to test that the package builds and the tests execute successfully in their environment, not within some virtual environment.

- Donald Stufft
PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 3:20 PM, Wes Turner wrote:
>> 2. The "test" command will install the "test_requires" dependencies as
>> eggs. You will end up with multiple versions of the same eggs right in your
>> source checkout.
>
> * is there no way around this?
> * is this required / spec'd / fixable?

It's not that bad now; recent setuptools puts the eggs in a ".eggs" dir - so it's not as messy as before.

>> 3. The "test" command will install the "test_requires" dependencies with
>> easy_install. That means wheels cannot be used.
>
> would it be possible to add this to wheel?

It's up to the maintainers of wheel/setuptools to figure this one out (or not), I think. Either way, you should search through the distutils-sig archives for clues/intentions, e.g.: https://mail.python.org/pipermail/distutils-sig/2014-December/thread.html#25482

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 3:18 PM, Donald Stufft wrote:
> tox and setup.py test are not really equivalent. There’s no way (to my
> knowledge) to test the item outside of a virtual environment. This is
> important for downstreams who want to test that the package builds and the
> tests execute successfully in their environment, not within some
> virtual environment.

Hmm ... you're right. But making Tox not use virtualenvs is not impossible - much like how Detox works, we could have a "Tax" (just made that up) that just skips making any virtualenv. It's a matter of making two subclasses and a console_scripts entrypoint (I think). I think it's a good name: ``use Tax instead of Tox if you wanna "tax" your global site-packages`` :-)

We only need someone to do it.

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 6:13 PM, Erik Bray wrote:
>> Lets not use `setup.py test`. It's either bad or useless.
>
> Says who? Many of the projects I'm involved in use `setup.py test`
> exclusively and for good reason--they all have C and/or Cython
> extension modules that need to be built for the tests to even run.
> Only setup.py knows about those extension modules and how to find and
> build them. Using `setup.py test` ensures that everything required to
> run the package (including runtime dependencies) is built and ready.

Well ok, then it's not useless. :-)

> For pure Python packages I think it's less important and can usually
> rely on "just run 'nose', or 'py.test'" (or "tox" but that's true
> regardless of how the tests are invoked outside of tox).

That implies you would be testing code that you didn't install. That allows preventable mistakes, like publishing releases on PyPI that don't actually work, or do not even install at all (because you didn't test that). `setup.py test` doesn't really allow you to fully test that part, but Tox does.

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Re: [Distutils] pbr issues (was: Where should I put tests when packaging python modules?)
On Tue, Oct 6, 2015 at 7:46 PM, Ionel Cristian Mărieș wrote: > > On Wed, Oct 7, 2015 at 2:23 AM, Robert Collins > wrote: >> >> >> Hang on, there's clearly a *huge* gap in understanding here. >> >> pbr does *not* modify *anyone's* setup.py output unless it's enabled. > > > Unless it's >=1.7.0. You can't blame setuptools having entrypoints for pbr > doing weird stuff to distributions by abusing said entrypoints. > > For reference: https://bugs.launchpad.net/pbr/+bug/1483067 > > There's nothing special about pbr here. It's not like it's the first package > doing dangerous stuff (distribute, subprocess.run, pdbpp). I really like > pdbpp, but you don't put that in production. Starting a sub-thread since issues with pbr are orthogonal to the original discussion. But one point I'd like to raise about this is that when I originally designed d2to1, on which a chunk of pbr is based, it was *explicitly* designed to never be installed in site-packages (with the exception of downstream packaging systems which can do what they want and are more controlled). This is exactly because I knew different packages might have dependencies on different versions of d2to1 as features are added, and that if some version is installed in site-packages it can lead to VersionConflict issues (this is in part exacerbated by a bug/misfeature in setuptools--I fixed that bug a while ago but the fix had to be rolled back due to a regression [1]). So TL;DR unless you know what you're doing, d2to1 should *never* be "installed"--it was only meant to be used with setup_requires, where the appropriate d2to1 used in building/installing a package is only temporarily enabled on sys.path via a temporary egg install. If some project is making it a *runtime* requirement, that's a mistake. 
I don't know what features pbr has grown that might make someone want it to be a runtime dependency (the only entry-points I noticed were for adding egg-info writers but that should only be needed at build-time too...), but maybe something like that should be split off as a separate module or something... Best, Erik [1] https://bitbucket.org/tarek/distribute/pull-requests/20/fixes-and-adds-a-regression-test-for-323/diff
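For reference, the setup_requires-only pattern Erik describes looks roughly like this (a sketch of typical d2to1 usage; the actual metadata lives in setup.cfg, and d2to1 is fetched as a temporary egg at build time rather than being installed into site-packages):

```python
# setup.py -- sketch of the intended d2to1 pattern. setup_requires pulls
# d2to1 onto sys.path only for the duration of the build; it never becomes
# a runtime dependency of the package. The project's real metadata
# (name, version, dependencies) lives in setup.cfg.
from setuptools import setup

setup(
    setup_requires=["d2to1"],
    d2to1=True,
)
```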
Re: [Distutils] Where should I put tests when packaging python modules?
On Tue, Oct 6, 2015 at 6:08 PM, Ionel Cristian Mărieș wrote: > > On Wed, Oct 7, 2015 at 12:51 AM, Ben Finney > wrote: >> >> I think the above describes the standard way of declaring the test >> runner: The ‘setup.py test’ command. >> >> Now, I lament that more Python projects don't *conform to* that >> standard, but at least it exists. > > > There's a very simple answer to that: easy_install (that's what `setup.py > test` will use to install deps). It has several design issues wrt how > packages are installed and how dependencies are managed. > > Let's not use `setup.py test`. It's either bad or useless. Says who? Many of the projects I'm involved in use `setup.py test` exclusively and for good reason--they all have C and/or Cython extension modules that need to be built for the tests to even run. Only setup.py knows about those extension modules and how to find and build them. Using `setup.py test` ensures that everything required to run the package (including runtime dependencies) is built and ready, and then the tests can start. Without it, we would have to tell developers to go through a build process first and then make sure they're running the tests on the built code. `setup.py test` makes it a no-brainer. For pure Python packages I think it's less important and can usually rely on "just run 'nose', or 'py.test'" (or "tox" but that's true regardless of how the tests are invoked outside of tox). Best, Erik
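The workflow Erik describes looks roughly like this (a sketch only; the package, module, and file names are made up):

```python
# setup.py -- sketch of a package where "python setup.py test" first builds
# the C extension that the tests import, then runs the suite. All names
# here are illustrative.
from setuptools import setup, Extension

setup(
    name="example",
    version="1.0",
    packages=["example", "example.tests"],
    ext_modules=[Extension("example._speedups", ["example/_speedups.c"])],
    # "python setup.py test" builds ext_modules in place, puts the built
    # package on sys.path, and only then runs this suite -- so the tests
    # always see a buildable, importable package.
    test_suite="example.tests",
)
```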
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On Mon, Oct 5, 2015 at 6:51 AM, Donald Stufft wrote: [...] > I also don't think it will be confusing. They'll associate the VCS thing (a > source release) as something focused on development for most everyone. Most > people won't explicitly make one and nobody will be uploading it to PyPI. The > end goal in my mind is someone produces a source wheel and uploads that to > PyPI and PyPI takes it from there. Mucking around with manually producing > binary wheels or producing source releases other than what's checked into vcs > will be something that I suspect only advanced users will do. Of course people will make source releases, and should be able to upload them to PyPI. The end goal is that *pip* will not use source releases, but PyPI is not just there for pip. If it was, it wouldn't even show package descriptions :-). There are projects on PyPI right now, today, that have no way to generate sdists and will never have any need for "source wheels" (because they don't use distutils and they build "none-any" wheels directly from their source). It should still be possible for them to upload source releases for all the other reasons that having source releases is useful: they form a permanent record of the whole project state (including potentially docs, tests, working notes, etc. that don't make it into the wheels), human users may well want to download those archives, Debian may prefer to use that as their orig.tar.gz, etc. etc. And on the other end of the complexity scale, there are projects like numpy where it's not clear to me whether they'll ever be able to support "source wheels", and even if they do they'll still need source releases to support user configuration at build time. -n -- Nathaniel J. Smith -- http://vorpus.org
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On October 7, 2015 at 1:27:31 PM, Nathaniel Smith (n...@pobox.com) wrote: > On Mon, Oct 5, 2015 at 6:51 AM, Donald Stufft wrote: > [...] > > I also don't think it will be confusing. They'll associate the VCS thing (a > > source release) > as something focused on development for most everyone. Most people won't > explicitly > make one and nobody will be uploading it to PyPI. The end goal in my mind is > someone produces > a source wheel and uploads that to PyPI and PyPI takes it from there. Mucking > around with > manually producing binary wheels or producing source releases other than > what's checked > into vcs will be something that I suspect only advanced users will do. > > Of course people will make source releases, and should be able to > upload them to PyPI. The end goal is that *pip* will not use source > releases, but PyPI is not just there for pip. If it was, it wouldn't > even show package descriptions :-). > > There are projects on PyPI right now, today, that have no way to > generate sdists and will never have any need for "source wheels" > (because they don't use distutils and they build "none-any" wheels > directly from their source). It should still be possible for them to > upload source releases for all the other reasons that having source > releases is useful: they form a permanent record of the whole project > state (including potentially docs, tests, working notes, etc. that > don't make it into the wheels), human users may well want to download > those archives, Debian may prefer to use that as their orig.tar.gz, > etc. etc. > > And on the other end of the complexity scale, there are projects like > numpy where it's not clear to me whether they'll ever be able to > support "source wheels", and even if they do they'll still need source > releases to support user configuration at build time. 
We must have different ideas of what a source release vs. a source wheel would look like, because I'm having a hard time squaring what you've said here with what it looks like in my head. In my head, source releases (outside of the VCS use case) will be rare and only for very complex packages that are doing very complex things. Source wheels will be something that will be semi-mandatory for being a well-behaved citizen (for Debian and such to download) and binary wheels will be something that you'll want to have but aren't required. I don't see any reason why source wheels wouldn't include docs, tests, and other misc files. I picture building a binary wheel directly being something similar to using fpm to build binary .deb packages directly, totally possible but unadvised. Having talked to folks who deal with Debian/Fedora packages, they won't accept a binary wheel as the input source and (given how I explained it to them) they are excited about the concept of source wheels and moving away from dynamic metadata and towards static metadata. - Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
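The "static metadata" distinction Donald draws can be made concrete with a sketch. No such format had been standardized at the time, so the field names below are purely illustrative -- the point is only that a consumer (pip, PyPI, a Debian packager) could read everything it needs without executing any project code:

```python
import json

# Hypothetical static metadata for a "source wheel". The field names are
# made up; what matters is that the values are fixed at build time rather
# than computed by running setup.py.
SOURCE_WHEEL_METADATA = {
    "name": "example",
    "version": "1.0",  # pinned when the source wheel is built, not dynamic
    "build_requires": ["setuptools", "wheel"],
    "run_requires": ["requests>=2.0"],
}

# A tool can consume this as plain data -- serialize and parse it back
# without importing or running anything from the project:
metadata = json.loads(json.dumps(SOURCE_WHEEL_METADATA))
```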
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 6:37 PM, Erik Bray wrote: > Which, incidentally, is a great reason for installable tests :) > Not really. It doesn't matter where you have the tests; it matters where you have the code being tested. Tests being installed is a mere consequence of the location of tests. > Running in the source tree is great for development. But when > preparing a release it's great to be able to create an sdist, install > that into a virtualenv, and run `package.test()` or `python -m > package.tests` or whatever. Occasionally catches problems with the > source dist if nothing else. > As I said, I like the idea. It's just that it's not feasible right now. Let's go over the issues again:

* Tests are too bulky (pyca/cryptography).
* Tests can't be installed at all: https://github.com/getpelican/pelican/issues/1409
* It's not clear how to install test dependencies. tests_require? extras? no deps? What about version conflicts and way too many deps being installed? Dependencies are like cars: very useful, but too many of them create problems.
* Real problems like standardized test output or a run protocol are not solved at all. There's little benefit in doing it like this if you can't build good CI tools around it.
* Workflows are under-specified. Users are not guided to make quality releases on PyPI.

Maybe we should have a PEP that would specify/propose some concrete solutions to all those? Thanks, -- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Re: [Distutils] Where should I put tests when packaging python modules?
On Wed, Oct 7, 2015 at 11:31 AM, Ionel Cristian Mărieș wrote: > > On Wed, Oct 7, 2015 at 6:13 PM, Erik Bray wrote: >> >> > Let's not use `setup.py test`. It's either bad or useless. >> >> Says who? Many of the projects I'm involved in use `setup.py test` >> exclusively and for good reason--they all have C and/or Cython >> extension modules that need to be built for the tests to even run. >> Only setup.py knows about those extension modules and how to find and >> build them. Using `setup.py test` ensures that everything required to >> run the package (including runtime dependencies) is built and ready, > > > Well ok, then it's not useless. :-) > >> For pure Python packages I think it's less important and can usually >> rely on "just run 'nose', or 'py.test'" (or "tox" but that's true >> regardless of how the tests are invoked outside of tox). > > > That implies you would be testing code that you didn't install. That allows > preventable mistakes, like publishing releases on PyPI that don't actually > work, or do not even install at all (because you didn't test that). > `setup.py test` doesn't really allow you to fully test that part, but Tox > does. Which, incidentally, is a great reason for installable tests :) Running in the source tree is great for development. But when preparing a release it's great to be able to create an sdist, install that into a virtualenv, and run `package.test()` or `python -m package.tests` or whatever. Occasionally catches problems with the source dist if nothing else. Best, Erik
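A minimal sketch of what "installable tests" can look like: a tests subpackage that ships inside the installed package, with a `test()` entry point so users can exercise the installed code rather than the source tree. The package name and test body are made up:

```python
import unittest

# Sketch: in a real project this would live in e.g. mypkg/tests/__init__.py
# so that the tests are installed alongside the package and run against
# the installed code, not the source checkout. "mypkg" is hypothetical.
class SmokeTest(unittest.TestCase):
    def test_truth(self):
        # a real suite would import mypkg here and exercise its API
        self.assertEqual(1 + 1, 2)

def test():
    """Entry point so users can run `import mypkg.tests; mypkg.tests.test()`
    (or wire it up behind `python -m mypkg.tests` via a __main__.py)."""
    suite = unittest.TestLoader().loadTestsFromTestCase(SmokeTest)
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    return result.wasSuccessful()
```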
Re: [Distutils] Where should I put tests when packaging python modules?
On Oct 07, 2015, at 08:18 AM, Donald Stufft wrote: >tox and setup.py test are not really equivalent. There’s no way (to my >knowledge) to test the item outside of a virtual environment. This is >important for downstreams who want to test that the package build and the >tests successfully are executed in their environment, not within some virtual >environment. I usually do not use tox to test a package when building it for Debian. It's pretty easy to extract the actual command used to run the test suite from the tox.ini, and that's what I put in the debian/rules file. It can make things build a little more reliably, and also eliminates a build dependency on tox. Cheers, -Barry
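The extraction Barry describes is mechanical, since tox.ini is plain INI. A sketch of how a downstream packager might pull the raw test command out of a tox.ini so the suite can be run outside any virtualenv (the tox.ini contents here are a made-up example):

```python
import configparser

# A made-up tox.ini; a real one would come from the project being packaged.
TOX_INI = """
[testenv]
deps = pytest
commands = pytest -v tests/
"""

# tox.ini is INI-format, so the stdlib parser can read the common cases.
# (tox also supports substitutions like {posargs}, which this sketch
# deliberately ignores.)
cp = configparser.ConfigParser()
cp.read_string(TOX_INI)
command = cp["testenv"]["commands"].strip()
# "command" is what would go into debian/rules, run against the real
# environment instead of a tox-managed virtualenv.
```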
Re: [Distutils] Where should I put tests when packaging python modules?
On Oct 07, 2015, at 08:35 AM, Marius Gedminas wrote: >I have hopes for 'tox.ini' becoming the standard way to test a Python >project. As do I, modulo outliers of course. I'd like to see 90% of PyPI packages have a tox.ini. Cheers, -Barry
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On 7 October 2015 at 18:27, Nathaniel Smith wrote: > There are projects on PyPI right now, today, that have no way to > generate sdists and will never have any need for "source wheels" I think I'm as confused by what you're saying here as Donald is. Could you give a few examples of such projects? I'd like to go & take a look at them and try to understand what they are doing that is so incompatible with what Donald and I are thinking of as a "source wheel". Paul
Re: [Distutils] pbr issues (was: Where should I put tests when packaging python modules?)
On October 7, 2015 at 11:32:32 AM, Erik Bray (erik.m.b...@gmail.com) wrote: > I don't know what features pbr has grown that might make someone want > it to be a runtime dependency (the only entry-points I noticed were > for adding egg-info writers but that should only be needed at > build-time too...), but maybe something like that should be split off > as a separate module or something… It has runtime utilities for dealing with versions. - Donald Stufft
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On Wed, 7 Oct 2015 18:51 Donald Stufft wrote: On October 7, 2015 at 1:27:31 PM, Nathaniel Smith (n...@pobox.com) wrote: > On Mon, Oct 5, 2015 at 6:51 AM, Donald Stufft wrote: > [...] > > I also don't think it will be confusing. They'll associate the VCS thing (a source release) > as something focused on development for most everyone. Most people won't explicitly > make one and nobody will be uploading it to PyPI. The end goal in my mind is someone produces > a source wheel and uploads that to PyPI and PyPI takes it from there. Mucking around with > manually producing binary wheels or producing source releases other than what's checked > into vcs will be something that I suspect only advanced users will do. > > Of course people will make source releases, and should be able to > upload them to PyPI. The end goal is that *pip* will not use source > releases, but PyPI is not just there for pip. If it was, it wouldn't > even show package descriptions :-). > > There are projects on PyPI right now, today, that have no way to > generate sdists and will never have any need for "source wheels" > (because they don't use distutils and they build "none-any" wheels > directly from their source). It should still be possible for them to > upload source releases for all the other reasons that having source > releases is useful: they form a permanent record of the whole project > state (including potentially docs, tests, working notes, etc. that > don't make it into the wheels), human users may well want to download > those archives, Debian may prefer to use that as their orig.tar.gz, > etc. etc. > > And on the other end of the complexity scale, there are projects like > numpy where it's not clear to me whether they'll ever be able to > support "source wheels", and even if they do they'll still need source > releases to support user configuration at build time. 
We must have different ideas of what a source release vs. a source wheel would look like, because I'm having a hard time squaring what you've said here with what it looks like in my head. In my head, source releases (outside of the VCS use case) will be rare and only for very complex packages that are doing very complex things. Source wheels will be something that will be semi-mandatory for being a well-behaved citizen (for Debian and such to download) and binary wheels will be something that you'll want to have but aren't required. I don't see any reason why source wheels wouldn't include docs, tests, and other misc files. I picture building a binary wheel directly being something similar to using fpm to build binary .deb packages directly, totally possible but unadvised. Having talked to folks who deal with Debian/Fedora packages, they won't accept a binary wheel as the input source and (given how I explained it to them) they are excited about the concept of source wheels and moving away from dynamic metadata and towards static metadata. Your idea of an sdist as something that has fully static build/runtime dependency metadata and a one-to-one correspondence with binary wheels is not a usable format when releasing the code for e.g. numpy 1.10. It's fine to say that pip/PyPI should work with the source in some other distribution format and numpy could produce that, but it means that the standard tarball release needs to be supported somehow separately. Numpy should be able to use PyPI in order to host the tarball even if pip ignores the file. If numpy released only source wheels then there would be more than one source wheel for each release, corresponding to e.g. the different ways that numpy is linked. There still needs to be a way to release a single file representing the code for the release as a whole. -- Oscar
Re: [Distutils] Towards a simple and standard sdist format that isn't intertwined with distutils
On October 7, 2015 at 2:31:03 PM, Oscar Benjamin (oscar.j.benja...@gmail.com) wrote: > > > Your idea of an sdist as something that has fully static build/runtime > dependency metadata and a one to one correspondence with binary > wheels is not a usable format when releasing the code for e.g. > numpy 1.10. It's fine to say that pip/PyPI should work with the > source in some other distribution format and numpy could produce > that but it means that the standard tarball release needs to be > supported some how separately. Numpy should be able to use PyPI > in order to host the tarball even if pip ignores the file. > > > If numpy released only source wheels then there would be more > than one source wheel for each release corresponding to e.g. > the different ways that numpy is linked. There still needs to > be a way to release a single file representing the code for the > release as a whole. > Can you expand on this please? I've never used numpy for anything serious and I'm trying to figure out why and what parts of what I'm thinking of wouldn't work for it. - Donald Stufft