[Distutils] Re: Make an ordered list of sdists to be installed?

2018-07-23 Thread Tzu-ping Chung
I just happened to be looking into this area, and may be able to offer some 
help.

Some background information (optional reading): 
https://github.com/pypa/pipenv/issues/2596#issuecomment-405656806

Sébastien (@sdispater) is the author of Poetry, including its resolver, called 
Mixology. As he mentioned in the issue, he recently rewrote the resolver, and 
has yet to decide when and how to extract it from the other inner workings of 
Poetry.

The old version of Mixology (before the rewrite), however, is still available 
on PyPI, and is good enough for this particular use with some additional work.

I made a quick, very inefficient implementation, available as a GitHub Gist:
https://gist.github.com/uranusjr/a7a9f20c6e43810bd19f0c73e9617182

Hopefully this helps with your need 😊

TP


From: distutils-sig-requ...@python.org
Sent: 24 July 2018 00:00
To: distutils-sig@python.org

I don’t know the details, but I did read that Poetry has a sophisticated 
dependency resolver. 

https://github.com/sdispater/poetry

I don’t know if there is a way to access the resolver independently of the 
tool, but perhaps it would provide a handy reference. 
On Mon, 23 Jul 2018 at 05:49, Thomas Kluyver wrote:
Hi all,

Do we know of any tool that can, given the name of one or more packages, follow 
dependency chains and produce a list of packages in the order they need to be 
installed, assuming every package needed will be built from source?

Running "pip download --no-binary :all: ipython" gets me a set of sdists to be 
installed, but I lose any information about the order. I assume some packages 
will fail to build if their dependencies are not installed first, so the order 
is significant.

Pip appears to keep track of the ordering internally: if I run "pip install 
--no-binary :all: ipython", all the dependencies are downloaded, and then the 
collected packages are installed starting from those with no dependencies and 
finishing with the package I requested. But I don't know of any way to get this 
information out of pip. Is there an option that I'm overlooking? Or some other 
tool that can do this?

The use case I'm thinking about is to automatically generate instructions for a 
build system which separates the downloading and installing steps, so for each 
step it expects one or more URLs to download, along with instructions for how 
to install that piece. The installation steps shouldn't download further data. 
I could work around the issue by telling it to download all the sdists in a 
single step and then install in one shot with --no-index and --find-links. But 
it's more elegant - and better for caching - if we can install each package as 
a single step.
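For what it's worth, once the dependency graph has been extracted from each 
sdist's metadata, the ordering step itself is a plain topological sort. A 
minimal sketch, using graphlib from the standard library (Python 3.9+) and a 
made-up dependency mapping (the real one would come from package metadata):

```python
from graphlib import TopologicalSorter

# Hypothetical dependency mapping: package -> packages it depends on.
# In practice this would be read from each sdist's metadata.
deps = {
    "ipython": {"traitlets", "pygments"},
    "traitlets": {"six"},
    "pygments": set(),
    "six": set(),
}

# static_order() yields each package only after all of its
# dependencies, i.e. a safe build/install order.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

The hard part remains getting the dependency data out in the first place, 
since sdists may need to be built before their metadata is known.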

Thanks,
Thomas
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/LGTH3IYBMVKBS4PYGFJ6A7N5GW5ZKFUY/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/VX734J3S4EBKDSZR5BIRQPOH5LVHTHHE/


[Distutils] Re: pipenv and pip

2018-08-21 Thread Tzu-ping Chung
Hi,

Dan and I have been doing most of the maintenance work for Pipenv recently, 
and as Dan mentioned, we have been working on some related projects that poke 
into pip internals significantly, so I feel I should voice some opinions. I 
have significantly less experience messing with pip than Dan, and might be 
able to offer a slightly different perspective.

Pipenv mainly interacts with pip for two things: installing/uninstalling/
upgrading packages, and gaining information about a package (what versions are 
available, what dependencies a particular version has, etc.). For the former, 
we currently drive pip via subprocesses, which is likely the intended way of 
interacting with it. I have to say, however, that the experience is not 
flawless. pip has a significant startup time, and does not offer chances for 
interaction once it starts running, so we really don’t have a good way to, for 
example, provide an installation progress bar for the user, unless we parse 
pip’s stdout directly. These are not essential to Pipenv’s functionality, 
however, so they are more of an annoyance than glaring problems.
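The subprocess interaction looks roughly like this; the only feedback channel 
available to the parent process is pip's textual output:

```python
import subprocess
import sys

# Drive pip as a subprocess -- the supported way to use it from
# another tool. Once started, the only visibility we get into its
# progress is whatever it prints to stdout/stderr.
result = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())
```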

The other thing Pipenv uses pip for—getting package information—is more 
troubling (to me, personally). Pipenv has a slightly different need from pip 
regarding dependency resolution. pip can (and does) freely drop dependencies 
that do not match the current environment, but Pipenv needs to generate a lock 
file for an abstract platform that works for, say, both macOS and Windows. 
This means pip’s resolver is not useful for us, and we need to implement our 
own. Our own resolver, however, still needs to know about the packages it 
gets, and we are left with two choices: a. try to re-implement the same logic, 
or b. use pip internals to cobble something together.
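To illustrate the difference: pip evaluates environment markers against the 
machine it runs on, while a cross-platform lock file has to consider 
hypothetical environments too. A sketch using the packaging library (which pip 
vendors), with a made-up marker:

```python
from packaging.markers import Marker

# A dependency guarded by a marker. pip running on macOS would simply
# drop it; a lock file for an abstract platform has to keep it and
# record the condition instead.
marker = Marker('os_name == "nt"')

# Evaluate against two hypothetical environments rather than the
# interpreter we happen to be running under.
on_windows = marker.evaluate({"os_name": "nt"})
on_posix = marker.evaluate({"os_name": "posix"})
print(on_windows, on_posix)
```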

We tried to go for a. for a while, but as you’d easily imagine, our own 
implementation is buggy, cannot handle edge cases nearly as well, and fielded 
a lot of complaints along the lines of “I can do this in pip, why can’t I do 
the same in Pipenv”. One example is how package artifacts are discovered. At 
first glance, I thought to myself this wouldn’t be that hard—we have a simple 
API, and the naming conventions are there, so as long as we specify sources in 
Pipfile (we do), we should be able to discover them no problem. I couldn’t 
have been more wrong. There are find_links, dependency_links, pip.conf for the 
user and for the machine, all sorts of things, and for every quirk in pip we 
don’t replicate 100%, issues are filed urging us to fix it. In the end we gave 
up and used pip’s internal PackageFinder instead.
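The "naming conventions" part really is the easy, well-specified bit: a wheel 
filename alone encodes name, version, and compatibility tags. A sketch using 
packaging.utils with a made-up filename (everything around this—find_links, 
pip.conf, and the rest—is where the quirks live):

```python
from packaging.utils import parse_wheel_filename

# A wheel filename encodes name, version, build tag, and compatibility
# tags per the wheel spec -- the tidy part of artifact discovery.
name, version, build, tags = parse_wheel_filename(
    "colorama-0.4.0-py2.py3-none-any.whl"
)
print(name, version, sorted(str(t) for t in tags))
```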

This is a big problem going forward, and we are fully aware of that. The 
strategy we are taking at the moment is to try to limit the surface area of 
our pip internals usage. Dan mentioned we have been building a resolver for 
Pipenv[1], and we took the chance to work toward centralising the things that 
interface with pip internals. Those are still internals, of course, but we now 
have a relatively good idea of what we actually need from pip, and I’d be 
extremely happy if some parts of pip could come out as standalone packages 
with official blessing. The things I am particularly interested in (since they 
would be beneficial for Pipenv) are:

* VcsSupport
* PackageFinder
* WheelBuilder (and everything that comes with it like the wheel cache, 
preparer, unpack_url, etc.)

Sorry for the very long post, but I want to get everything out so it might be 
easier to paint a complete picture
of the state we are currently in.


[1]: https://github.com/sarugaku/passa



Yours,

TP

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com

> On 21/8, 2018, at 00:00, distutils-sig-requ...@python.org wrote:
> 
> Send Distutils-SIG mailing list submissions to
>   distutils-sig@python.org
> 
> To subscribe or unsubscribe via the World Wide Web, visit
>   https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> or, via email, send a message with subject or body 'help' to
>   distutils-sig-requ...@python.org
> 
> You can reach the person managing the list at
>   distutils-sig-ow...@python.org
> 
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Distutils-SIG digest..."
> 
> Today's Topics:
> 
>   1. Re: pipenv and pip (Dan Ryan)
>   2. Re: pipenv and pip (Dan Ryan)
> 
> From: Dan Ryan 
> Subject: [Distutils] Re: pipenv and pip
> Date: 20 August 2018 at 22:04:11 GMT+8
> To: Chris Jerdonek 
> Cc: distutils sig 
> 
> 
> The truth is that it’s basically impossible to gauge bugs in pip vs bugs in 
> our patches to it which are often a lot more likely — reproductions of edge 
> cases can be impossible but there are specific things I know we broke (like 
> parsing certain kinds of extras, previously) — mostly bugs land in pips issue 
> tr

[Distutils] Re: pipenv and pip

2018-08-21 Thread Tzu-ping Chung
Hi Chris,

From my understanding (it is totally possible I am missing something), 
get_vcs_deps() is specifically used only for resolution (to populate the Git 
hash etc. in the lock file, if I’m not mistaken), not installation. This is 
why I mentioned, at the very beginning, the two main aspects Pipenv interacts 
with pip for—the situations of the two parts are very different; the latter 
(resolution) is significantly worse than the former.

I admit Pipenv has not been a good citizen in that regard. I do hope to clean 
most
(if not all) of those up with the new resolver implementation.

Thanks for the feedback, and sorry for the disturbance.

TP


Sent from Mail for Windows 10

From: Chris Jerdonek
Sent: 21 August 2018 19:58
To: Tzu-ping Chung
Cc: distutils sig
Subject: Re: [Distutils] Re: pipenv and pip

On Tue, Aug 21, 2018 at 4:02 AM, Tzu-ping Chung  wrote:
>
> Pipenv mainly interacts with pip for two things: install/uninstall/upgrade
> packages, and to gain information
> about a package (what versions are available, what dependencies does a
> particular version has, etc.).
> For the former case, we are currently using it with subprocesses, and it is
> likely the intended way of
> interaction.

I just want to say that this didn't appear to be the case when I
looked at the code. For example, the pipenv function obtain_vcs_req()
called inside get_vcs_deps() uses internal pip API's directly --
apparently for its installs of "VCS dependencies" in editable mode:
https://github.com/pypa/pipenv/blob/5da09fd24e3b63f28f77454594649bd2912fb17d/pipenv/utils.py#L1187-L1196

The function obtain_vcs_req() seems to bypass a lot of the logic
inside VersionControl.obtain() (which I know because this is an area
of the code that I've actively been working on improving). I also
noticed that pipenv's code here seems to result in the installation
routine unnecessarily being called twice in succession in some cases,
since it calls update() even after obtain() is called (and a
RevOptions object shouldn't be getting passed to is_commit_id_equal()
there -- that method invocation will always return False).

It was a little frustrating to see these methods being called in this
way, because it made it appear to me that new, different problems
might be getting introduced in pipenv, even as bugs are getting fixed
in pip.

--Chris



[Distutils] Re: pipenv and pip

2018-08-21 Thread Tzu-ping Chung
Speaking of Zazo[1], I actually found its abstraction extremely similar to our 
current abstraction, ResolveLib[2]. I’m not sure whether it’s coincidence, 
whether we took inspiration from similar sources, or whether this is simply 
the right way to do it, but how closely the logic matches is quite surprising 
to me.

I’m quite sure we will be able to find some way to consolidate efforts once we
find the chance, but for the moment, progress in ResolveLib (and by extension
Pipenv) would likely benefit pip in developing a proper resolver (be it Zazo or
otherwise).

TP

[1]: https://github.com/pradyunsg/zazo
[2]: https://github.com/sarugaku/resolvelib

Sent from Mail for Windows 10

From: Dan Ryan
Sent: 21 August 2018 22:03
To: Tzu-ping Chung
Cc: Chris Jerdonek; distutils sig
Subject: Re: [Distutils] Re: pipenv and pip

There was a specific bug related to pipenv-only functionality, which is why 
the VCS ref is obtained. pip now prefers ephemeral wheels by default, while 
pipenv currently maintains a copy of editable repos. The work Tzu-ping has 
been discussing and the current work in pipenv are separate. You also can’t 
simply read some lines of pipenv and assume there should be a 1-1 functional 
equivalence. Sometimes we will have overlap and sometimes we will have bugs. 
In the specific case you mentioned, we simply make sure to check out whatever 
version is requested in the Pipfile before doing resolution and handing off 
the results to pip for installation. So while it may seem like we are simply 
redoing things that pip already handles, we have different motivations, and 
while we very likely have plenty of bugs, there is more context here than “we 
did something that pip also does”.

In any event, I’m not sure this mailing list is the right place to do code 
reviews; we would happily accept any feedback if you make it on the project 
itself.  Pipenv has a lot of cleanup work to do. We are in the process of 
tackling that now. If you have ideas for how to tackle that, we would love to 
hear them :)

With regard to Zazo — it’s been mentioned to us many times, but we’ve been in 
touch with Pradyun as I mentioned, and he said he was very busy and couldn’t 
coordinate efforts at all on the resolver front. Zazo isn’t actually an 
implementation; it’s an abstraction layer. We built a directed graph and 
layered a resolver on top of it. Since this is a primary piece of 
functionality pipenv has always wanted to offer (as far as my understanding 
goes), and has always basically failed at because of various tooling issues, 
we have been working for the last 4-8 weeks pretty much in seclusion trying to 
tackle this.
Dan Ryan // pipenv maintainer
gh: @techalchemy


[Distutils] Editable requirement parsing in pip

2018-08-31 Thread Tzu-ping Chung
Hi,

I was verifying my Pipenv resolver work, and found a quirk in pip when handling
-e requirements.

With a requirements.txt:

# Works as expected
colorama ; os_name == "nt"

# Works as expected
./colorama ; os_name == "nt"

# Drops markers; always installs disregarding the OS (!)
-e ./colorama ; os_name == "nt"

This by itself is okay, since it doesn’t really make sense to me to specify a
requirement as editable without intending it to always be installed, although 
it is somewhat surprising and (AFAICT) undocumented.

The behaviour on the command line is a little more perplexing:

# Works as expected
$ pip install "./colorama ; os_name == 'nt'"

# Does not work
$ pip install -e "./colorama ; os_name == 'nt'"
./colorama ; os_name == "nt" should either be a path to a local project or
a VCS url beginning with svn+, git+, hg+, or bzr+

# Neither does this
$ pip install "-e ./colorama ; os_name == 'nt'"
./colorama ; os_name == "nt" should either be a path to a local project or
a VCS url beginning with svn+, git+, hg+, or bzr+

Since pip’s documentation says a requirements.txt “is just a list of pip install
arguments placed in a file” [1], this is surprising to me.

[1]: https://pip.pypa.io/en/stable/user_guide/#id1

So my questions are

* Is this behaviour intentional, or is it an oversight?
* Is there specification or documentation on how pip is supposed to work?

Given that PEP 508 (although pip does not actually use its URL lookup syntax) 
allows
requirements to specify markers in all cases, I would expect pip to allow and 
honour
them in all cases. Or, if that’s not the case (since editable is not really a
standardised thing), I’d expect pip to at least be consistent with itself :)
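For what it’s worth, the marker grammar itself is unambiguous: parsing a 
marker-carrying requirement with the packaging library (which implements 
PEP 508) works fine, so the quirk really does seem specific to pip’s handling 
of -e lines. A sketch:

```python
from packaging.requirements import Requirement

# PEP 508 happily attaches a marker to a requirement; nothing in the
# grammar distinguishes editable from non-editable installs.
req = Requirement('colorama ; os_name == "nt"')
print(req.name)
print(req.marker)
```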


--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/3B4W7ZCANHBQLYHLHYF5Y6FG757HVYX6/


[Distutils] Re: Environment markers for GPU/CUDA availibility

2018-08-31 Thread Tzu-ping Chung
I’m not knowledgeable about GPUs, but from limited conversations with others,
it is important to first decide what exactly the problem area is. Unlike the 
existing environment markers, there is currently no very reliable way to
programmatically determine even whether there is a GPU, let alone what that 
GPU can actually do (not every GPU can be used by Tensorflow, for example).

IMO it would likely be a good route to first implement some interface for GPU
environment detection in Python. This interface could then be used in projects 
like tensorflow-auto-detect. Projects like Tensorflow could also directly 
detect which implementation they should use, the way many projects do 
platform-specific things by inspecting os.name or sys.platform. Once we’re 
sure we have all the things needed for detection, markers can be drafted based 
on the detection interface.
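As a strawman for what such a detection interface could look like (entirely 
hypothetical, not an existing API; the nvidia-smi check is just one crude, 
best-effort signal):

```python
import shutil

def detect_gpu_backend():
    """Hypothetical detection hook a future marker could build on.

    Analogous to inspecting os.name or sys.platform, but answering
    the question "what GPU stack, if any, is usable here?".
    """
    # Best-effort heuristic: presence of the NVIDIA driver CLI on PATH.
    if shutil.which("nvidia-smi") is not None:
        return "cuda"
    return "none"

print(detect_gpu_backend())
```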

TP


> On 01/9/2018, at 03:57, Dustin Ingram  wrote:
> 
> Hi all, trying to pull together a few separate discussions into a
> single thread here.
> 
> The main issue is that currently PEP 508 does not provide environment
> markers for GPU/CUDA availability, which leads to problems for
> projects that want to provide distributions for environments with and
> without GPU support.
> 
> As far as I can tell, there's been multiple suggestions to bring this
> issue to distutils-sig, but no one has actually done it.
> 
> Relevant issues:
> 
> (closed) "How should Python packages depending on TensorFlow structure
> their requirements?"
> https://github.com/tensorflow/tensorflow/issues/7166
> 
> (closed) "Adding gpu or cuda specification in PEP 508"
> https://github.com/python/peps/issues/581
> 
> (closed) "More support for conditional installation"
> https://github.com/pypa/pipenv/issues/1353
> 
> (no response) "Adding gpu or cuda markers in PEP 508"
> https://github.com/pypa/interoperability-peps/issues/68
> 
> There is now a third-party project which attempts to amend this for
> tensorflow (https://github.com/akatrevorjay/tensorflow-auto-detect)
> but this approach is somewhat fragile (depends on version numbers
> being in sync), doesn't directly scale to all similar projects, and
> would require maintainers for a given project to maintain _three_
> separate projects, instead of just one.
> 
> I'm not intimately familiar with PEP 508, so my questions for this list:
> 
> * Is the demand sufficient to justify supporting this use case?
> * Is it possible to add support for GPU Environment markers?
> * If so, what would need to be done?
> * If implemented, what should the transition look like for projects
> like tensorflow?
> 
> Thanks!
> D.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/LXLF4YSC4WUZOYRX65DW7CESIX7UUBK5/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/3VLRDFVQS7E7BBAYF6WQQ5TM2QJMVEDX/


[Distutils] Re: manylinux1 guidelines for zlib?

2018-09-05 Thread Tzu-ping Chung
Isn’t zlib only required for compression? It is my impression that zipfile’s 
decompressor is pure Python,
and only depends on zlib if the archive is encrypted (but wheels are never 
encrypted).

zlib also does not provide decompression at its core; for that you need 
zlib/contrib/minizip, but I don’t
think CPython depends on minizip either.

zlib is technically optional, and not even required to install wheels.


> On 05/9/2018, at 19:35, Donald Stufft  wrote:
> 
> 
> 
>> On Sep 4, 2018, at 6:06 AM, Alex Walters  wrote:
>> Since zlib is a dependency of python, the assumption has to be that it is
>> already present.
> 
> 
> It is technically an optional dependency of Python, though I don’t think you 
> can install wheels without zlib present since wheels are zip files and 
> generally include compression. Although maybe it could work in niche 
> scenarios if someone built wheels using ZIP_STORED instead of ZIP_DEFLATE?
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/FFB5XQBHX6N26FD5ZELMRIBRNOCNQ62Z/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/QA37CKADM7YOT6DZPY4IDBJ73L3ZLSXU/


[Distutils] Re: manylinux1 guidelines for zlib?

2018-09-06 Thread Tzu-ping Chung
I see. Thanks for the clarification!


> On 05/9, 2018, at 22:54, Donald Stufft  wrote:
> 
> 
> 
>> On Sep 5, 2018, at 9:30 AM, Tzu-ping Chung <uranu...@gmail.com> wrote:
>> 
>> Isn’t zlib only required for compression? It is my impression that zipfile’s 
>> decompressor is pure Python,
>> and only depends on zlib if the archive is encrypted (but wheels are never 
>> encrypted).
>> 
>> zlib also does not provide decompression at its core; for that you need 
>> zlib/contrib/minizip, but I don’t
>> think CPython depends on minizip either.
>> 
>> zlib is technically optional, and not even required to install wheels.
>> 
>> 
> 
> 
> I don’t think that’s accurate; if you look at zipfile.py, it does: 
> 
> https://github.com/python/cpython/blob/874809ea389e6434787e773a6054a08e0b81f734/Lib/zipfile.py#L17-L22
> 
> Which attempts to import the zlib module, and if it fails sets zlib to None. 
> Then later one this is used:
> 
> https://github.com/python/cpython/blob/874809ea389e6434787e773a6054a08e0b81f734/Lib/zipfile.py#L682-L683
> 
> Which doesn’t guard the expression at all, so I believe if you attempt to 
> unzip a wheel that uses ZIP_STORED without the zlib module built, then you’ll 
> get an AttributeError ('NoneType' object has no attribute 'decompressobj').
> 
> 
> The Python stdlib zlib module does not appear to have any fallback at all if 
> the zlib library is not available in a recent enough version.
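The ZIP_STORED corner case discussed above is easy to poke at; the sketch 
below builds an uncompressed zip in memory (it does not, of course, exercise a 
zlib-less interpreter, which would require a custom CPython build):

```python
import io
import zipfile

# A wheel is just a zip archive; ZIP_STORED writes entries without
# compression, the one case where zlib would in theory be unnecessary.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_STORED) as zf:
    zf.writestr("demo/__init__.py", "")

with zipfile.ZipFile(buf) as zf:
    info = zf.getinfo("demo/__init__.py")
    print(info.compress_type == zipfile.ZIP_STORED)
```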

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/COH66FOLE5UW5HCMGGQJKLP2B34U5L6R/


[Distutils] Re: Adopting virtualenv package maintenance

2018-09-06 Thread Tzu-ping Chung
I don’t have the authority to do it, but I would really love to see virtualenv 
getting maintenance.

Pipenv still depends on virtualenv, and not only for Python 2. I am already 
working on switching to venv for Python 3.5+, but we will continue to need 
virtualenv for quite a while due to some compatibility issues, even on Python 
3. We occasionally get bug reports about virtual environment creation, and 
those are generally due to some long-standing bugs in virtualenv. It would be 
awesome if we could point reporters somewhere to actually get things fixed, 
instead of saying sorry with a shrug.

I am not suggesting in particular who the maintainer(s) should be (but I am 
also not objecting to Sorin’s proposal); virtualenv simply really needs a 
maintainer.

TP


> On 05/9, 2018, at 16:56, Sorin Sbarnea  wrote:
> 
> It seems that the virtualenv package is in need of some maintenance effort, 
> focused mostly on doing reviews, closing or merging pull requests, and 
> eventually doing a new release once a month.
> 
> I know that virtualenv is in deprecation mode, as it would no longer be 
> needed once Python 2 is no longer used. The reality is that Python 2.x 
> will still be in production after January 1st, 2020, because there are 
> deployed products with LTS contracts which will need some time to get updated 
> to newer versions that use py3. This automatically translates to the need to 
> have a working virtualenv for testing them. I am part of the OpenStack team 
> and I am sure that, whether I like it or not, I will have to deal with some 
> amount of py2 even after the magic date.
> 
> The current situation with virtualenv is pretty bad because there are lots of 
> open pull requests which are not reviewed or merged, mostly because there is 
> nobody available to do that boring extra work. I had a few changes improving 
> the CI testing of virtualenv which will soon be one year old, most of them 
> without any feedback. Even finding whom to ping by email or IRC was a 
> challenge, as I got two responses: no response at all, or someone else 
> telling me that they are not maintainers of the virtualenv package. Example: 
> https://groups.google.com/forum/#!topic/pypa-dev/YMVsRbNoVpg
> 
> For these reasons I would like to become a maintainer for virtualenv, 
> preferably working with two others on keeping it alive for a couple of years 
> till we could organize a big wake ceremony for it.
> 
> It would be preferable if two others would join the maintenance "taskforce" 
> because merging a change should almost always involve at least two reviewers.
> 
> While I cannot make any guarantees regarding dealing with all reported bugs, 
> I can commit to assuring that no PRs go unreviewed for longer than 30 days 
> (aiming for one week). There are currently ~75 open PRs. I have been doing 
> open source for a long time, and I respect all the time and effort put in by 
> project maintainers; at the same time, I have always tried to do my best 
> dealing with incoming PRs, because if someone spent their time making a 
> contribution that is passing CI, they probably deserve at least a review. 
> 
> https://github.com/pypa/virtualenv/pulls 
> 
> Thanks
> Sorin Sbarnea
> @ssbarnea on irc/github/...
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/EOXOL3742HVDLAIQDODL36UNRGU4R6SG/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/TFUYTDYSOLT3KJ2SRPM7Y7K6GNDK5TMW/


[Distutils] Re: Adopting virtualenv package maintenance

2018-09-07 Thread Tzu-ping Chung

> On 08/9/2018, at 02:23, Brett Cannon  wrote:
> 
> 
> 
> On Fri, 7 Sep 2018 at 11:18 Nathaniel Smith  wrote:
> On Fri, Sep 7, 2018, 10:48 Brett Cannon  wrote:
> 
> 
> On Thu, 6 Sep 2018 at 13:44 Alex Becker  wrote:
> Another +1 to the utility of a maintainer. I am also working on package 
> management and have found that venv is not a full replacement for 
> virtualenv--for example I don't believe the environment can be entered 
> programmatically, while virtualenv provides activate_this.py which can be 
> exec'd. I'm sure there are many other limitations, so I don't think python 
> can give up on virtualenv soon.
> 
> But are those inherent limitations of venv or simply a lack of a provided API 
> or library to have the equivalent abilities? I assume there's a difference 
> between keeping virtualenv running versus developing a small library on top 
> of venv to backfill some things.
> 
> I guess venv being in the stdlib means that any limitations it has are going 
> to keep limiting "python -m venv" for quite a while.
> 
> Depends on the type of limitation. Alex mentioned an activate_this.py script 
> that makes activation a generic thing. That doesn't strike me as something 
> that requires changes in the stdlib and instead something on PyPI that 
> provided the equivalent for venv (and potentially also being in the stdlib).

Just want to mention that adding activate_this.py to venv has been proposed, 
and rejected.
https://bugs.python.org/issue21496 

Also there is an interoperation problem when you want to create a venv out of a 
virtualenv.
https://bugs.python.org/issue30811 

While I agree with Vinay that this isn’t really a venv problem, it does still 
require a solution
in some way. If venv doesn’t change, virtualenv needs to.

No matter what the future of virtualenv is (rewritten on top of venv or not), I
think it is at least reasonable to say that virtualenv needs *some* work, and
someone needs to work on it? Because virtualenv is still being used at a very
high volume while we are exchanging words, and there are a lot of simple
improvements people can benefit from right now.
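For readers who haven't used it, activate_this.py is a script shipped by virtualenv that is meant to be exec'd inside an already-running interpreter. A self-contained sketch of roughly what exec'ing it does — the environment path is hypothetical, and the real script handles more edge cases:

```python
import os
import site
import sys

def activate(venv_path):
    """Roughly what exec'ing virtualenv's activate_this.py does:
    put the env's script directory on PATH, record VIRTUAL_ENV, and
    prepend the env's site-packages to sys.path."""
    bin_dir = os.path.join(venv_path, "Scripts" if os.name == "nt" else "bin")
    os.environ["PATH"] = bin_dir + os.pathsep + os.environ.get("PATH", "")
    os.environ["VIRTUAL_ENV"] = venv_path
    site_packages = os.path.join(
        venv_path, "lib",
        "python%d.%d" % sys.version_info[:2], "site-packages",
    )
    previous = list(sys.path)
    site.addsitedir(site_packages)  # also processes any .pth files there
    # Move the newly added entries to the front so the env wins lookups.
    new = [p for p in sys.path if p not in previous]
    sys.path[:] = new + previous

activate("/tmp/demo-venv")  # hypothetical environment path
```

This is exactly the kind of behaviour that, per the rejected bpo-21496 proposal, venv does not provide out of the box.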


>  
> 
> If we want to work around these limits on something other than the Python 
> release cycle, then it means training users to not run "python -m venv", and 
> instead run "python -m somethingelse".
>  
> 
> So long as that's necessary, the "somethingelse" might as well be 
> "virtualenv", which is what everyone is already trained to do anyway...
> 
> There have been plans at various points to rewrite virtualenv on top of venv, 
> and as far as I know the limiting factor was time/energy, not that they hit 
> some intrinsic limitation. 
> 
> As is everything in open source. ;) I'm just trying to understand if there's 
> something that specifically needs to change in venv long-term or not.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/MIF3VTCHQIKRNS2UJEGTSBTW2SMCWTT4/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/NEC322PDDNIWDYHJNLESZQQ5WWGSNEAC/


[Distutils] Re: PEP 518 and the pyproject.toml format

2018-09-10 Thread Tzu-ping Chung
One minor clarification—Poetry is both an application and a library packaging
tool. It however relies on the old, non-isolated Setuptools approach to do the
building part under the hood, so (from my own understanding)

Poetry ≒ Pipenv + Flit - PEP 517 - PEP 518

Whether this is a good approach is another topic (well, I guess my position is
implied, as a Pipenv maintainer), and Paul’s explanation is still completely valid.


> On 10/9, 2018, at 18:47, Paul Moore  wrote:
> 
> On Mon, 10 Sep 2018 at 10:13, Melvyn Sopacua  wrote:
> 
>> First post, so short introduction:
> 
> Hi, and welcome.
> 
>> I'm mostly working on Django projects, currently for 3yourminD in Berlin.
>> Working with python / Django for 6 going on 7 years. Also experienced in PHP,
>> some Js and build systems like docker.
>> We're currently migrating everything to docker and pip / requirements.txt
>> isn't cutting it anymore and I've tasked myself with looking for or building
>> an alternative tool.
>> 
>> This weekend I did quite a bit of research into the packaging tools and came
>> across PEP 518. What it doesn't address is that there now is a file reserved
>> that declares very little. It does not declare and define sections to serve as
>> a deployment file for a project, and apparently it is unclear to tool builders
>> what the one section it does define is supposed to be used for.
> 
> It is true, PEP 518 doesn't define a lot by itself. But as it notes,
> "Tables not specified in this PEP are reserved for future use by other
> PEPs" - in other words, it's possible to add further details, but they
> should be backed by a formal specification, so that tools have a
> common understanding of what to do with the data.
> 
>> Case in point: poetry doesn't add itself to the build-system table when
>> writing the pyproject.toml file. Which, as I read it, it should. Ideally,
>> every tool.${name} should be in the build system?
> 
> Well, poetry isn't (as I understand it) a build system. It doesn't
> define *how* to build a wheel from project sources, rather it
> orchestrates the process of building an application from a set of
> projects (as I understand it from a brief read of the project page).
> That puts it in the same area as pipenv, I guess? That process
> (building *applications*) isn't something that's been really looked at
> yet in terms of standardisation, so yes, it'd still be very much fair
> game for something to be discussed and agreed.
> 
> One reason this hasn't yet been looked at, as I understand things, is
> that there are a lot of very different scenarios to consider - web
> applications (which I guess are your area of interest), standalone
> command line utilities like Mercurial or the Leo editor, orchestration
> tools like Ansible, background OS service processes, etc. And it's
> unlikely that a "one size fits all" approach would work, so there's a
> whole question of how any standard would establish its scope and
> boundaries. One note of caution - the attempts I've seen in the past
> to define "build processes" tend to promote a certain amount of
> confusion by assuming all projects are similar, and then getting
> bogged down when people with very different requirements come along
> with different assumptions and mistaken expectations.
> 
>> So, can we improve the file specification and introduce sections that allows 
>> a
>> project (not a library) to specify how it should be built? Right now, this is
>> all off-loaded to the tools that use the file and one cannot rely on 
>> anything.
>> It also makes migration from one tool to another difficult and cooperation
>> between tools hard. Or is this intentional: is this supposed to be tied to
>> specific tools as build system are generally tightly coupled? And should we
>> not change that?
> 
> You say "a project (not a library)". I presume by that you mean
> something like the application building process I described above?
> Yes, there's not been any standardisation effort in this area yet, so
> the situation is as you describe. It's not intentional as such, simply
> that no-one has really stepped up to propose anything yet. And maybe
> because there aren't that many formal tools in this area - a lot of
> what goes on appears to be simply workflow and local standards. But
> certainly it's something that might be worth exploring.
> 
>> It would help if the PEP had some explanation of the choices made to reserve 
>> a
>> file that partially sets out to replace (all?) other files, but does not do
>> anything to accomplish that.
> 
> PEP 518 defines a means of "Specifying Minimum Build System
> Requirements for Python Projects". It defines a new, flexible, file
> format to hold that information, and defers further definition of the
> file contents to future PEPs, but it was never (to my knowledge) a
> goal of that PEP to look at "all other files" and fit them into the
> pyproject.toml format. It did reserve a namespace "tool.XXX" for
> projects that choose to consider the

[Distutils] Re: pip installing scripts into another virtualenv

2018-09-14 Thread Tzu-ping Chung
I am confused. If you’re installing things with subprocess, and are using 
virtual environments anyway, wouldn’t it be simpler to use the pip inside the 
virtual environment directly?

Instead of using a (random) pip to install into the environment, you can 
instead do

[virtualenv-path]/[script-prefix]/python -m pip install …

The script prefix would be bin in most cases, or Scripts on Windows. You can 
dig into the source of venv or virtualenv to find the exact logic and use it, 
but in practice it is basically enough to test both locations until you find 
the correct Python executable. (This is also Pipenv’s approach, essentially.)
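To make the probing described above concrete, here is a minimal sketch (the function names are mine, not Pipenv's) of locating the in-environment interpreter by testing both layouts, then delegating to its pip:

```python
import os
import subprocess
import sys

def find_env_python(env_path):
    """Probe the two common script-prefix layouts (bin/ on POSIX,
    Scripts/ on Windows) for the environment's Python executable."""
    for prefix in ("bin", "Scripts"):
        for name in ("python", "python.exe"):
            candidate = os.path.join(env_path, prefix, name)
            if os.path.isfile(candidate):
                return candidate
    raise FileNotFoundError("no python found in {}".format(env_path))

def pip_install(env_path, *requirements):
    """Run the environment's own pip, so artifact selection and script
    shebangs match the target interpreter rather than the caller's."""
    python = find_env_python(env_path)
    return subprocess.run([python, "-m", "pip", "install", *requirements])
```

Note that invoking pip as `python -m pip` (rather than the pip script) also sidesteps the shebang problem below, since the interpreter is named explicitly.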

I wouldn’t say it’s crazy to pip-install into a foreign virtual environment, 
but can’t recommend it either. pip does a lot of runtime detection when 
choosing an artifact to install a package. So unless you’re installing directly 
from URL, there is no guarantee you are installing the correct artifact for a 
given package, or even choosing the correct version (when you lock the 
requirements). Pipenv actually used a similar approach in its initial design, 
but found it to be very error-prone. Reading the project description, however, 
it seems like you’re locking “specific distributions” for a given package, so I 
guess this is not necessarily a problem? I am not sure how you can choose 
the correct distribution/artifact without using the in-environment pip though 
(I didn’t read the implementation).

Finally, if all you really want is to download/unpack/install the locked 
artifact to use, maybe you can even do away with virtual environments 
altogether. There was a discussion around ditching virtual environments 
altogether in favour of a node/npm-like setup at this year’s PyCon (I didn’t 
attend)[1] that you might find interesting.

[1]: https://lwn.net/Articles/757354/ 


TP


> On 14/9, 2018, at 15:15, Alex Becker  wrote:
> 
> As part of a package management tool , 
> I'm trying to use pip to install python packages into a virtualenv, from 
> python code (via subprocess), into a different virtualenv than the virtualenv 
> my python process is running in (if any). I have this *mostly* working with:
> 
> pip install --prefix [virtualenv-path] --ignore-installed --no-build-isolation
> 
> However, installed scripts break because the scripts automatically get 
> prepended with a shebang pointing to the python interpreter pip was run under.
> 
> Is there a way around this behavior? Am I crazy to even try to install into a 
> different virtualenv? Or do I have to re-architect my code to call pip in the 
> target virtualenv (which may require me forcing pip to be installed, 
> depending on what versions of python I choose to support)?
> 
> Sincerely,
> 
> Alex Becker
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/C5Y2KKW2YQGFE74LZXQLXE32RJOABVEE/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/CXHKLJNDV5OV5ZFMYZSHZR5V6CEY37Z7/


[Distutils] Re: disable building wheel for a package

2018-09-14 Thread Tzu-ping Chung
I’m wondering though, why? The wheel format already offers a data directory, 
couldn’t it be used for most of the things mentioned?
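To answer the question quoted below about relative data_files paths: a relative entry such as `("etc/jupyter", [...])` is stored by bdist_wheel under the archive's `{name}-{version}.data/data/` directory, and installers unpack it relative to the installation prefix. A tiny runnable sketch of that mapping (directory names follow the quoted Jupyter example):

```python
import os
import sys

def install_target(relative_dir, prefix=sys.prefix):
    """Where a relative data_files directory ends up after a wheel
    install: joined onto the install prefix. Inside a virtual
    environment, sys.prefix *is* $VIRTUAL_ENV."""
    return os.path.join(prefix, relative_dir)

print(install_target("etc/jupyter"))
print(install_target("share/jupyter"))
```

So, inside an activated environment, both paths land under the environment root, which is what the internal-use package below relies on.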


> On 14/9/2018, at 22:25, Daniel Holth  wrote:
> 
> No one wants wheel to be able to install things outside of the virtualenv. 
> What people have repeatedly asked for is the ability to install things 
> somewhere besides $VIRTUAL_ENV/lib/python#.#/site-packages/, places like 
> $VIRTUAL_ENV/etc/ for example.
> Should all the config files, documentation, data, man pages, licenses, 
> images, go into $VIRTUAL_ENV/lib/python#.#/site-packages/? Or can we do a 
> better job letting people put files in $VIRTUAL_ENV/xyz?
> 
> On Fri, Sep 14, 2018 at 9:51 AM sashk  > wrote:
>  
>  
> 14.09.2018, 08:37, "Paul Moore"  >:
>> On Fri, 14 Sep 2018 at 12:43, Jeroen Demeyer > > wrote:
>> 
>>  On 2018-09-14 12:55, Alex Grönholm wrote:
>>  > I'm curious: what data does it attempt to install and where? Have you
>>  > created a ticket for this somewhere?
>> 
>>  The OP mentioned absolute paths. However, it really sounds like a bad
>>  idea to hard-code an absolute installation path. Let's consider it a
>>  feature that wheel doesn't support that.
>> 
>> The OP hasn't said, but I assumed that it was expecting to install
>> something in a "standard" Unix location like /etc.
>> 
> No, I'm not installing anything into standard unix locations. My package is 
> for internal use, so we had the luxury of writing it specifically for use with 
> a virtual environment.  
>  
> We need to install Jupyter kernels (and other files) into 
> $VIRTUAL_ENV/etc/jupyter and $VIRTUAL_ENV/share/jupyter paths. This was done 
> with the help of data_files, and works unless we build wheel, because of use 
> of absolute paths.
>  
> Do I understand correctly that, when using relative paths in data_files and 
> installing the package into a virtual environment, the installation prefix is 
> sys.prefix, which is the same as $VIRTUAL_ENV?
>  
> Thanks.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org 
> 
> To unsubscribe send an email to distutils-sig-le...@python.org 
> 
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/ 
> 
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/AR65F7SMLRN54FKZ6EI6LKZZDCVFNKUX/
>  
> 
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/YSMYRZDS3DESU6YOCVSGP7ZMGQJUAK77/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/HKAME45XWGEPGWRJ7VYWZ3UQV3B63WJ5/


[Distutils] Re: Adopting virtualenv package maintenance

2018-09-16 Thread Tzu-ping Chung
Thanks for pointing out the PR, I didn’t know that existed :D

I made a thin wrapper around virtualenv/venv a while ago too. My intention is 
to use it to abstract away library differences so I can bring native venv 
support to Pipenv more easily, so the library has some special quirks (e.g. 
uses virtualenv on Python 3.3, even though venv is technically available) to 
suit its needs, but the idea is similar—create with venv if the Python 
installation supports it, otherwise fall back to virtualenv. Just throwing it out 
here in case anyone is interested.
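A minimal sketch of that create-with-venv-else-fall-back-to-virtualenv idea (the real virtenv library handles many more quirks; `--without-pip` and `--no-pip` are the actual venv and virtualenv CLI flags, respectively):

```python
import subprocess
import sys

def create_environment(path, python=sys.executable, with_pip=True):
    """Create a virtual environment at `path`: use the stdlib venv
    module when the target interpreter has it, otherwise fall back to
    the third-party virtualenv package."""
    has_venv = subprocess.run(
        [python, "-c", "import venv"],
        capture_output=True,
    ).returncode == 0
    if has_venv:
        args = [python, "-m", "venv"]
        if not with_pip:
            args.append("--without-pip")   # venv's spelling of the flag
    else:
        args = [python, "-m", "virtualenv"]
        if not with_pip:
            args.append("--no-pip")        # virtualenv's spelling
    args.append(path)
    return subprocess.run(args, check=True)
```

Probing the *target* interpreter (rather than checking `sys.version_info` of the running one) matters because the environment may be created for a different Python than the one running the tool.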


> On 16/9/2018, at 22:06, Nick Coghlan  wrote:
> 
> On Sat, 8 Sep 2018 at 13:34, Brett Cannon  > wrote:
> On Fri., Sep. 7, 2018, 16:32 Dan Ryan,  > wrote:
> I’m thinking (and correct me if I’m wrong here) that Brett’s message might be 
> motivated more by a desire to standardize and centralize effort by first 
> assessing what the issue is. 
> 
> Yep. I try to take the decade view of things so solving stuff that will take 
> a while to trickle out to the community doesn't faze me. 😉
> 
> 
> Note that Donald created a draft virtualenv rewrite that worked as a thin 
> shell around venv on the versions that provided venv, but otherwise worked in 
> much the same as it always had: https://github.com/pypa/virtualenv/pull/697 
> 
> 
> Around this time last year, he noted that he wasn't sure when he would be 
> able to find the time to follow up on it, and if someone else wanted to 
> pursue a different strategy he'd be OK with that: 
> https://github.com/pypa/virtualenv/pull/697#issuecomment-333937166 
> 
> 
> So if folks are still interested in the general idea of improving virtualenv 
> and venv interoperability, then my last message to that thread and Paul's 
> follow up would be a decent place to start: 
> https://github.com/pypa/virtualenv/pull/697#issuecomment-333437537 
> 
> 
> However, eliminating virtualenv as a project/component is a definite 
> *non*-goal. While venv is valuable because it integrates more tightly into 
> the interpreter startup sequence than virtualenv itself was ever able to (and 
> provides a common API for doing virtualenv-like things across different 
> Python implementations), virtualenv remains valuable as a cross-version 
> compatibility and enhancement layer for the parts that don't need to be as 
> tightly integrated with the core interpreter implementation. We just hope to 
> eventually be able to delete most of the code from it, and have it just 
> handle the parts that venv itself doesn't cover in any given version of 
> Python :)
> 
> Cheers,
> Nick.
> 
> -- 
> Nick Coghlan   |   ncogh...@gmail.com    |   
> Brisbane, Australia
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/RXBKWLQOMS2OM56VIVRZ6O3ALYZTRKHT/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/THSTP27OFOHVZIAAXMHJ64XW2EE4M5UE/


[Distutils] Re: Adopting virtualenv package maintenance

2018-09-16 Thread Tzu-ping Chung
Sorry, hit the send button too soon. Here’s the project:
https://github.com/sarugaku/virtenv



> On 16/9/2018, at 22:50, Tzu-ping Chung  wrote:
> 
> Thanks for pointing out the PR, I didn’t know that exists :D
> 
> I made a thin wrapper around virtualenv/venv a while ago too. My intention is 
> to use it to abstract away library differences so I can bring native venv 
> support to Pipenv more easily, so the library has some special quirks (e.g. 
> uses virtualenv on Python 3.3, even though venv is technically available) to 
> suit its needs, but the idea is similar—create with venv if the Python 
> installation supports it, otherwise fall back to virtualenv. Just throwing it out 
> here in case anyone is interested.
> 
> 
>> On 16/9/2018, at 22:06, Nick Coghlan > <mailto:ncogh...@gmail.com>> wrote:
>> 
>> On Sat, 8 Sep 2018 at 13:34, Brett Cannon > <mailto:br...@python.org>> wrote:
>> On Fri., Sep. 7, 2018, 16:32 Dan Ryan, > <mailto:d...@danryan.co>> wrote:
>> I’m thinking (and correct me if I’m wrong here) that Brett’s message might 
>> be motivated more by a desire to standardize and centralize effort by first 
>> assessing what the issue is. 
>> 
>> Yep. I try to take the decade view of things so solving stuff that will take 
>> a while to trickle out to the community doesn't faze me. 😉
>> 
>> 
>> Note that Donald created a draft virtualenv rewrite that worked as a thin 
>> shell around venv on the versions that provided venv, but otherwise worked 
>> in much the same as it always had: 
>> https://github.com/pypa/virtualenv/pull/697 
>> <https://github.com/pypa/virtualenv/pull/697>
>> 
>> Around this time last year, he noted that he wasn't sure when he would be 
>> able to find the time to follow up on it, and if someone else wanted to 
>> pursue a different strategy he'd be OK with that: 
>> https://github.com/pypa/virtualenv/pull/697#issuecomment-333937166 
>> <https://github.com/pypa/virtualenv/pull/697#issuecomment-333937166>
>> 
>> So if folks are still interested in the general idea of improving virtualenv 
>> and venv interoperability, then my last message to that thread and Paul's 
>> follow up would be a decent place to start: 
>> https://github.com/pypa/virtualenv/pull/697#issuecomment-333437537 
>> <https://github.com/pypa/virtualenv/pull/697#issuecomment-333437537>
>> 
>> However, eliminating virtualenv as a project/component is a definite 
>> *non*-goal. While venv is valuable because it integrates more tightly into 
>> the interpreter startup sequence than virtualenv itself was ever able to 
>> (and provides a common API for doing virtualenv-like things across different 
>> Python implementations), virtualenv remains valuable as a cross-version 
>> compatibility and enhancement layer for the parts that don't need to be as 
>> tightly integrated with the core interpreter implementation. We just hope to 
>> eventually be able to delete most of the code from it, and have it just 
>> handle the parts that venv itself doesn't cover in any given version of 
>> Python :)
>> 
>> Cheers,
>> Nick.
>> 
>> -- 
>> Nick Coghlan   |   ncogh...@gmail.com <mailto:ncogh...@gmail.com>   |   
>> Brisbane, Australia
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org 
>> <mailto:distutils-sig@python.org>
>> To unsubscribe send an email to distutils-sig-le...@python.org 
>> <mailto:distutils-sig-le...@python.org>
>> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/ 
>> <https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/>
>> Message archived at 
>> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/RXBKWLQOMS2OM56VIVRZ6O3ALYZTRKHT/
> 

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/IKRNSK7QUSTFCB2576EUVNWTVLB2U7ET/


[Distutils] Re: disable building wheel for a package

2018-09-19 Thread Tzu-ping Chung

> On 19/9, 2018, at 16:02, Paul Moore  wrote:
> 
> On Wed, 19 Sep 2018 at 00:52, Dan Ryan  wrote:
>> 
>>> so the people benefiting
>>> are those who want a supported API for that functionality, and it
>>> seems only reasonable to expect them to do the job of moving the code,
>>> rather than expecting the pip developers to do so.
>> 
>> This is where I think we disagree and I feel the rhetoric is a bit harmful 
>> -- personally I don't benefit much at all, I actually don't think any 
>> individual maintainer inside the PyPA benefits much beyond having a new 
>> project to maintain, so the 'helps me vs helps you' framing isn't really the 
>> point.  If it strictly helped me to add a project to my list of things to 
>> maintain I would have done that already. The real issue here is that we all 
>> have different implementations and they create non-uniform / disjointed user 
>> experiences. Converging on a set of common elements to extract seems like 
>> step 1.
>> 
>> I am fairly new to the PyPA, and I don't know how any of these processes 
>> actually work.  But I do know that painting this as "us vs you" when my 
>> interest actually in helping the user of packaging tools is causing a 
>> disconnect for me anytime we engage on this -- and I'm not asking you to 
>> tackle any of this yourself, except possibly review someone's PR down the 
>> road to swap out some internals.
> 
> Apologies. I misread your email, and so I was mostly addressing the
> issues we've seen posted to pip asking for us to simply expose the
> internal functions, not your comment about multiple projects
> implementing the logic. Sorry for that. Agreed if we already have
> multiple implementations, merging them is a useful thing, but the
> benefits are diffuse and long term, so it's the sort of thing that
> tends to remain on the back burner indefinitely. (One of the problems
> with open source is that unless something is *already* available as a
> library, we tend to reimplement rather than refactoring existing code
> out of a different project, because the cost of that interaction is
> high - which unfortunately I demonstrated above by my comment "people
> needing an API should do the work" :-().

Risking thread hijacking, I want to take this chance and ask about one 
particular multiple implementation problem I found recently.

What is the current situation regarding distlib vs packaging and various pieces 
in pip? Many parts of distlib seem to have duplicates in either packaging or 
pip/setuptools internals. I understand this is a historical artifact, but what 
is the plan going forward, and what strategy, if any, should a person take if 
they are to make the attempt of merging, or collecting pieces from existing 
code bases into a workable library?

From what I can tell (very limited), distlib seems to contain a good baseline 
design of a library fulfilling the intended purpose, but is currently missing 
parts to be fully usable on its own. Would it be a good idea to extend it with 
picked parts from pip? Should I contribute directly to it, or make a (higher 
level) wrapper around it with those parts? Should I actually use parts from it, 
or from other projects (e.g. distlib.version vs packaging.version, 
distlib.locator or pip’s PackageFinder)? It would be extremely helpful if there 
were a somewhat general, high-level view of the whole situation.
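As a concrete instance of the duplication in question: packaging's PEP 440 implementation covers the same ground as distlib.version (distlib's locator additionally needs network access, so only packaging is exercised in this sketch):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# packaging.version is the reference implementation of PEP 440
# ordering; distlib.version exposes a parallel, older API for the
# same job.
assert Version("1.0.post1") > Version("1.0")   # post-releases sort later
assert Version("1.0.dev1") < Version("1.0")    # dev releases sort earlier

# PEP 440 specifier semantics, as used in requirement strings:
spec = SpecifierSet(">=1.0,<2.0")
assert Version("1.4.2") in spec
assert Version("2.0") not in spec
```

For package *finding*, pip's PackageFinder is internal-only, and `distlib.locators.locate("name")` is the closest public equivalent, which is exactly the overlap asked about above.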

TP


> 
> Paul
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/XJC3BXEX5N4PCLNQ3XKKCGOIMCMP3LH4/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/25T4YPMHJ2Z4T5MDDVCP2LM3K4VEOZOD/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-19 Thread Tzu-ping Chung
I have the same experience with Pipenv as Nick. I would also guess
another reason is the lack of knowledge—this was certainly true for me before
I got involved in Pipenv. There is barely any guide on how I should
implement such a thing, and my developer’s instinct would tell me to look
at a known implementation, i.e. pip. This also ties back to the problem that
pip barely uses distlib internally at all—had it used more of it, people might
be pointed in the right direction.

Migrating Pipenv’s internals from pip to distlib is actually the exact
thing I was thinking about when I raised the question. There are, as mentioned,
a lot of pieces missing in distlib. For example, distlib knows how to find a
distribution, and how to install wheels, but not how a non-wheel distribution
can be turned into a wheel. [1] It also has no functionality for
uninstallation.
If I’m to glue together a working thing, I would likely need to copy/reimplement
parts of pip, but where should they live? Do I add yet another layer above
distlib to include them, or do I try to include them in distlib?

Although distlib provides a nice basis, I feel it is still one layer below what
most people want to do, e.g. install a thing by name (or URL). But would a
three-layer design be too much, or should distlib have a high-level API as well?


[1]: Also, while I’m having your attention—I’m trying to use the pep517 library
as part of the solution to build an sdist into a wheel, but I’m hitting a bug.
Could you help review my PR? :p https://github.com/pypa/pep517/pull/15


TP


> On 19/9, 2018, at 17:17, Paul Moore  wrote:
> 
> On Wed, 19 Sep 2018 at 09:39, Tzu-ping Chung  wrote:
>> Risking thread hijacking, I want to take this chance and ask about one 
>> particular multiple implementation problem I found recently.
> 
> I changed the subject to keep things easier to follow. Hope that's OK.
> 
>> 
>> What is the current situation regarding distlib vs packaging and various 
>> pieces in pip? Many parts of distlib seems to have duplicates in either 
>> packaging or pip/setuptools internals. I understand this is a historical 
>> artifact, but what is the plan going forward, and what strategy, if any, 
>> should a person take if they are to make the attempt of merging, or 
>> collecting pieces from existing code bases into a workable library?
> 
> Note: This is my personal view of the history only, Vinay and Donald
> would be better able to give definitive answers
> 
>> 
>> From what I can tell (very limited), distlib seems to contain a good 
>> baseline design of a library fulfilling the intended purpose, but is 
>> currently missing parts to be fully usable on its own. Would it be a good 
>> idea to extend it with picked parts from pip? Should I contribute directly 
>> to it, or make a (higher level) wrapper around it with those parts? Should I 
>> actually use parts from it, or from other projects (e.g. distlib.version vs 
>> packaging.version, distlib.locator or pip’s PackageFinder)? It would be 
>> extremely helpful if there is a somewhat general, high-level view to the 
>> whole situation.
> 
> Distlib was created as a place to experiment with making a
> library-style interface to various pieces of packaging functionality.
> At the time it was created, there were not many standardised parts of
> the packaging ecosystem, so while it followed the standards where they
> existed, it also implemented a number of pieces of functionality that
> *weren't* backed by standards (obvious examples being the script
> creation stuff and the package finder).
> 
> Packaging, in the other hand, was designed to focus strictly on
> implementations of agreed standards, providing reference APIs for
> projects to use.
> 
> Pip uses both libraries, but as far as I'm aware, we'd use an API from
> packaging in preference to distlib. The only distlib API we use is the
> script maker API. Pretty much everything else in distlib, we already
> had an internal implementation for by the time distlib was written, so
> there was no benefit in changing (in contrast, the benefit in
> switching to packaging is "by design conformance to the relevant
> standards").
> 
> My recommendations would be:
> 
> 1. Use packaging APIs always where they exist, even if a distlib
> equivalent exists.
> 2. Never use pip APIs, they are internal use only (Paul bangs on that
> old drum again :-))
> 3. Consider using distlib APIs for things like the locator API,
> because it's better than writing your own code, but be aware of the
> risks.
> 
> When I say risks here, the things I'd consider are:
> 
> * Distlibs APIs aren't used in many projects, so they are l

[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-19 Thread Tzu-ping Chung
I feel the plan is quite solid. This however leaves us (who want a Python
implementation and interface to do what pip does) in an interesting place. As far
as I can tell, there are a couple of principles:

1. Do not use pip internals
2. pip won’t be using either distlib or setuptools, so they might not match 
what pip does, in the long run

Does this leave us with only one option, to implement a library that matches
what pip does (follows the standards) but is not pip? That feels quite
counter-productive to me, but if that’s how things are, I’d accept it.

The next step (for me) in that case would then be to start working on that 
library. Since existing behaviours in setuptools and pip (including the part it 
uses distlib for) are likely to be standardised, I can rely on distlib for 
script creation, setuptools for some miscellaneous things (editable installs?), 
and pull (or reimplement) parts out of pip for others. Are there caveats I
should look out for?
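
To illustrate one of the pieces that would need standardising, here is a
minimal stdlib-only sketch of the PEP 425/427 tag machinery such a library
would formalise. It is deliberately simplified (no ABI or platform tag
computation), and the function names are my own, not from any real library:

```python
import sys

def interpreter_tag():
    """Build the Python-implementation part of a PEP 425 tag,
    e.g. "cp37" for CPython 3.7. Simplified sketch: real tools also
    compute ABI and platform tags and handle more interpreters."""
    impl = {"cpython": "cp", "pypy": "pp", "jython": "jy"}.get(
        sys.implementation.name, sys.implementation.name[:2])
    return "{}{}{}".format(impl, sys.version_info[0], sys.version_info[1])

def wheel_filename(name, version, pyver="py3", abi="none", plat="any"):
    """Compose a wheel filename per PEP 427 from its tag components."""
    return "{}-{}-{}-{}-{}.whl".format(name, version, pyver, abi, plat)

print(interpreter_tag())
print(wheel_filename("example_pkg", "1.0"))  # example_pkg-1.0-py3-none-any.whl
```

Even this much shows why tag support needs a spec before extraction: the
implementation mapping above is convention, not standard.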

TP

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
Sent from my iPhone

> On 20 Sep 2018, at 00:39, Donald Stufft  wrote:
> 
> 
>>> On Sep 19, 2018, at 5:17 AM, Paul Moore  wrote:
>>> 
>>> 
>>> 
>>> What is the current situation regarding distlib vs packaging and various 
>>> pieces in pip? Many parts of distlib seem to have duplicates in either 
>>> packaging or pip/setuptools internals. I understand this is a historical 
>>> artifact, but what is the plan going forward, and what strategy, if any, 
>>> should a person take if they are to make the attempt of merging, or 
>>> collecting pieces from existing code bases into a workable library?
>> 
>> Note: This is my personal view of the history only, Vinay and Donald
>> would be better able to give definitive answers
> 
> I’ve personally always planned on pulling out the last bits of what we do use 
> distlib for in pip, and not relying on it any longer.
> 
> My general plan for extracting stuff from pip and/or setuptools has always 
> been to first standardize in a PEP (if a sufficient one doesn’t already 
> exist) anything that makes sense to be as a standard, and then start either 
> reimplementing or pulling code out of pip (or setuptools if pip is using 
> setuptools). When doing that I had always planned on spending a lot of effort 
> ensuring that the behavior matches what pip is already doing (or have known, 
> specific divergences).
> 
> Now some of this already exists in distlib, but I don’t plan on using it. 
> Part of that is because I find it easier to identify things that should be 
> standardized but aren’t if I’m not using a big bundle of already implemented 
> stuff already (for instance, script support needs to be standardized, but it 
> didn’t occur to us at the time because we just used what distlib had). It 
> also has a bunch of functionality that exists only in distlib, like 
> attempting to use JSON to find packages (at one point there was even a 
> locator that implicitly used a non PyPI server, no idea if there still is) 
> that I felt made it harder to use the library in a way that didn’t basically 
> create a new set of implementation defined semantics. It also copied APIs and 
> semantics that I think *shouldn’t* have been (for instance, it has an 
> implicitly caching resource API like setuptools does… a generally bad idea 
> IMO, whereas using the new importlib.resources is a much saner API).
> 
> So in general, there are things that currently only exist in distlib or 
> setuptools, but my personal long term plan for pip is that we should get 
> solid implementations of those things out of those libraries, but generally 
> my mind puts distlib and setuptools in largely the same boat.
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/A42FTYI2Y3TEMEJK7VW25DBVLEAPYYYE/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-20 Thread Tzu-ping Chung
 some
> additional logic / validation
> https://github.com/sarugaku/shellingham 
> <https://github.com/sarugaku/shellingham> -- this is a shell detection library
> made up of some tooling we built in pipenv for environment detection
> https://github.com/sarugaku/pythonfinder 
> <https://github.com/sarugaku/pythonfinder> -- this is a library for finding
> python (pep 514 compliant) by version and for finding any other executables
> (cross platform)
> https://github.com/sarugaku/virtenv <https://github.com/sarugaku/virtenv> -- 
> python api for virtualenv creation
> 
> Happy to provide access or take advice as needed on any of those.  Thanks
> all for the receptiveness and collaboration
> 
> Dan Ryan
> gh: @techalchemy // e: d...@danryan.co <mailto:d...@danryan.co>
> 
> From: Donald Stufft [mailto:don...@stufft.io <mailto:don...@stufft.io>] 
> Sent: Wednesday, September 19, 2018 1:52 PM
> To: Tzu-ping Chung
> Cc: Distutils
> Subject: [Distutils] Re: Distlib vs Packaging (Was: disable building wheel
> for a package)
> 
> My general recommendation if you want a Python implementation/interface for
> something pip does, is:
> 
> - Open an issue on the pip repository to document your intent and to make
> sure that there is nobody there who is against having that functionality
> split out. This might also give a chance for people with familiarity in that
> API to mention pain points that you can solve in a new API. We can also
> probably give you a good sense if the thing you want in a library is
> something that probably has multiple things that are dependent on getting
> split out first (for instance, if you said you wanted a library for
> installing wheels, we'd probably tell you that there is a dependency on PEP
> 425 tags, pip locations, maybe others that need to be resolved first) and also
> whether this is something that should have a PEP first or not. Getting some
> rough agreement on the plan to split X thing out before you start is overall
> a good thing.
> 
> - Create or update a PEP if required, and get it into the provisional state.
> 
> - Make the library, either as a PR to packaging or as it's own independent
> library. If there are questions that come up while creating that library/PR
> that have to do with specific pip behaviors, go back to that original issue
> and ask for clarification etc. Ideally at some point you'll open a PR on pip
> that uses the new library (my suggestion is to not bundle the library in the
> initial PR, and just import it normally so that the PR diff doesn't include
> the full bundled library until there's agreement on it). If there's another
> tool (pipenv, whatever) that is looking to use that same functionality, open
> a WIP PR there too that switches it to using that. Use feedback and what you
> learn from trying to integrate in those libraries to influence back the
> design of the API itself.
> 
> Creating a PEP and creating the library and the PRs can happen in parallel,
> but at least for pip if something deserves a PEP, we're not going to merge a
> PR until that PEP is generally agreed on. However it can be supremely useful
> to have them all going at the same time, because you run into things that
> you didn't really notice until you went to actually implement it.
> 
> My other big suggestion would be to be careful about how much you bite off at
> one time. Pip's internal code base is not the greatest, so pulling out
> smaller chunks at a time rather than trying to start right off pulling out a
> big topic is more likely to meet with success. Be cognizant of what the
> dependencies are for the feature you want to implement, because if it has
> dependencies, you'll need to pull them out first before you can pull it out
> OR you'll need to design the API to invert those dependencies so they get
> passed in instead.
> 
> I personally would be happy to at a minimum participate on any issue where
> someone was trying to split out some functionality from pip into a re-usable
> library if not follow the develop of that library directly to help guide it
> more closely. My hope for pip is that it ends up being the glue around a
> bunch of these libraries, and that it doesn't implement most of the stuff
> itself anymore.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org 
> <mailto:distutils-sig@python.org>
> To unsubscribe send an email to distutils-sig-le...@python.org 
> <mailto:distutils-sig-le...@python.org>
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/ 
> <https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/>
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/IQVZVVWX2BLEP6D4WQMKNXZHBF2NZINU/
>  
> <https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/IQVZVVWX2BLEP6D4WQMKNXZHBF2NZINU/>
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/CIB4OWV2CWR3RVOPTS55WWL4ISFDGTSC/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-20 Thread Tzu-ping Chung
Pipenv’s resolver does not use trove classifiers as far as I am aware. I could
be missing something, though; the current resolver implementation is not the
best code base to reason with.
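
For reference, a small stdlib-only sketch contrasting the two metadata sources
under discussion: trove classifiers can be mined for advertised versions, but
only python_requires (Requires-Python) is the authoritative field an installer
should consult. The helper name and the sample metadata dict are mine:

```python
def versions_from_classifiers(classifiers):
    """Extract X.Y versions advertised via trove classifiers.
    Classifiers are informational only; python_requires is the
    authoritative compatibility field installers should consult."""
    prefix = "Programming Language :: Python :: "
    found = []
    for c in classifiers:
        tail = c[len(prefix):] if c.startswith(prefix) else None
        # Keep only dotted X.Y entries, skipping bare "3" and non-version tails.
        if tail and tail[0].isdigit() and "." in tail:
            found.append(tail)
    return found

meta = {
    "classifiers": [
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
    ],
    "requires_python": ">=3.6",  # the field a resolver should actually check
}
print(versions_from_classifiers(meta["classifiers"]))  # ['3.6', '3.7']
```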

Can you provide some ideas on this, Dan?

TP


> On 20/9, 2018, at 15:09, Bernat Gabor  wrote:
> 
> One such difference is pipenv apparently trying to work from the trove 
> classifiers for version compatibility instead of the existing python_requires 
> (see https://github.com/tox-dev/tox/pull/1005 
> <https://github.com/tox-dev/tox/pull/1005>). Why? Case in point of how people 
> start to think pipenv is pip. At this point, pipenv is a completely different 
> beast from pip, I think. 
> 
> Such a check would at best fall under the ``twine check`` command at PyPI 
> upload time, not at install time. 
> 
> On Thu, Sep 20, 2018 at 8:03 AM Tzu-ping Chung  <mailto:uranu...@gmail.com>> wrote:
> 
> 
>> On 20/9, 2018, at 13:22, Chris Jerdonek > <mailto:chris.jerdo...@gmail.com>> wrote:
>> 
>> On Wed, Sep 19, 2018 at 8:54 PM, Dan Ryan > <mailto:d...@danryan.co>> wrote:
>> I should clarify that we have already implemented a number of these as
>> libraries over the last several months (and I am super familiar with pip's
>> internals by now and I'm sure TP is getting there as well). More on this
>> below
>> ...
>> We are super cognizant of that aspect as I am pretty sure we are hitting
>> this wall in a full (nearly) pip-free reimplementation of all of the pipenv
>> internals from the ground up, including wheel building/installation, but we
>> basically had to start by calling pip directly, then slowly reimplement each
>> aspect of the underlying logic using various elements in distlib/setuptools
>> or rebuilding those.
>> 
>> Is the hope or game plan then for pipenv not to have to depend on pip? This 
>> is partly what I was trying to learn in my email to this list a month ago 
>> (on Aug. 20, with subject: "pipenv and pip"):
>> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/thread/2QECNWSHNEW7UBB24M2K5BISYJY7GMZF/
>>  
>> <https://mail.python.org/mm3/archives/list/distutils-sig@python.org/thread/2QECNWSHNEW7UBB24M2K5BISYJY7GMZF/>
>> 
>> Based on the replies, I wasn't getting that impression at the time (though I 
>> don't remember getting a clear answer), but maybe things have changed since 
>> then.
> 
> The resolution side of Pipenv really needs a Python API, and also cannot 
> really
> use the CLI because it needs something slightly different than pip’s 
> high-level
> logic (Nick mentioned this briefly). If we can’t use pip internals, then yes, 
> the plan
> is to not depend on pip. The hope is we can share those internals with pip 
> (either
> following the same standards, or using the same implementation), hence my 
> series
> of questions.
> 
> The installation side of Pipenv will continue to use pip directly, at least 
> for a while
> more even after the resolution side breaks away, since “pip install” is 
> adequate
> enough for our purposes. There are some possible improvements if there is a
> lower-layer library (e.g. to avoid pip startup overhead), but that is far 
> less important.
> 
> 
>> 
>> It should certainly be a lot easier for pipenv to move fast since there is 
>> no legacy base of users to maintain compatibility with. However, I worry 
>> about the fracturing this will cause. In creating these libraries, from the 
>> pip tracker it doesn't look like any effort is going into refactoring pip to 
>> make use of them. This relates to the point I made earlier today about how 
>> there won't be an easy way to cut pip over to using a new library unless an 
>> effort is made from the beginning. Thus, it's looking like things could be 
>> on track to split the user and maintainer base in two, with pip bearing the 
>> legacy burden and perhaps not seeing the improvements. Are we okay with that 
>> future?
> 
> I’m afraid the new implementation will still need to deal with compatibility 
> issues.
> Users expect Pipenv to work exactly as pip, and get very angry if it does not,
> especially when they see it is under the PyPA organisation on GitHub. The last
> time Pipenv tried to explain why it does whatever arbitrary things it does, we
> got labelled as “toxic” (there are other issues in play, but this is IMO the
> ultimate cause). Whether the image is fair or not, I would most definitely want
> to avoid similar incidents from happening again.
> 
> I think Pipenv would be okay to maintain a different (from scratch) 
> implementation
> than pip’

[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-20 Thread Tzu-ping Chung
Thanks for the advice; it is really helpful. Incidentally (or maybe not? I wonder
wonder
if there is an underlying pattern here) the two areas I do want to work on 
first are
a) how to find a package, and b) how to choose an artifact for a given package.

I think I’m starting with the package discovery part first and work my way from
there. I’ll create an issue in pypa/pip and try to outline the intention (and 
summarise
this thread), but there’s a couple of things I wish to clarify first:

1. Should dependency link processing be included? Since it is un-deprecated 
now, I
guess the answer is yes?
2. (Maybe not an immediate issue) What formats should I include? Wheels and 
.tar.gz
sdists, of course, what others? Eggs? .zip sdists? Are there other formats?
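
As a rough sketch of the discovery side, a finder could at minimum classify
candidate artifacts by filename. This is stdlib-only and the heuristics are
simplified assumptions for illustration, not pip’s actual logic:

```python
def classify_artifact(filename):
    """Guess the distribution format from an artifact's filename.
    A rough sketch; a real finder would also parse out name/version
    and, for wheels, the PEP 425 compatibility tags."""
    if filename.endswith(".whl"):
        return "wheel"
    if filename.endswith((".tar.gz", ".tar.bz2", ".zip")):
        return "sdist"  # .zip sdists still exist on PyPI
    if filename.endswith(".egg"):
        return "egg"  # legacy setuptools binary format
    return "unknown"

for name in ("pip-18.0-py2.py3-none-any.whl", "ipython-6.5.0.tar.gz",
             "lxml-4.2.5.zip", "some_pkg-1.0-py3.6.egg"):
    print(name, "->", classify_artifact(name))
```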

TP


> On 20/9, 2018, at 02:40, Paul Moore  wrote:
> 
> On Wed, 19 Sep 2018 at 18:52, Donald Stufft  wrote:
>> 
>> On Sep 19, 2018, at 1:14 PM, Tzu-ping Chung  wrote:
>> 
>> I feel the plan is quite solid. This however leaves us (who want a Python 
>> implementation and interface to do what pip does) in an interesting place. 
>> So I can tell there are a couple of principles:
>> 
>> 1. Do not use pip internals
>> 2. pip won’t be using either distlib or setuptools, so they might not match 
>> what pip does, in the long run
>> 
>> Does this leave us with only one option, to implement a library that 
>> matches what pip does (follows the standards), but is not pip? That feels 
>> quite counter-productive to me, but if it’s what things would be, I’d accept 
>> it.
>> 
>> The next step (for me) in that case would then be to start working on that 
>> library. Since existing behaviours in setuptools and pip (including the part 
>> it uses distlib for) are likely to be standardised, I can rely on distlib 
>> for script creation, setuptools for some miscellaneous things (editable 
>> installs?), and pull (or reimplement) parts out of pip for others. Are there 
>> caveats I should look out for?
>> 
>> 
>> My general recommendation if you want a Python implementation/interface for 
>> something pip does, is:
>> 
>> - Open an issue on the pip repository to document your intent and to make 
>> sure that there is nobody there who is against having that functionality 
>> split out. This might also give a chance for people with familiarity in that 
>> API to mention pain points that you can solve in a new API. We can also 
>> probably give you a good sense if the thing you want in a library is 
>> something that probably has multiple things that are dependent on getting 
>> split out first (for instance, if you said you wanted a library for 
>> installing wheels, we’d probably tell you that there is a dependency on PEP 
>> 425 tags, pip locations, maybe others that need to be resolved first) and also 
>> whether this is something that should have a PEP first or not. Getting some 
>> rough agreement on the plan to split X thing out before you start is overall 
>> a good thing.
>> 
>> - Create or update a PEP if required, and get it into the provisional state.
>> 
>> - Make the library, either as a PR to packaging or as it’s own independent 
>> library. If there are questions that come up while creating that library/PR 
>> that have to do with specific pip behaviors, go back to that original issue 
>> and ask for clarification etc. Ideally at some point you’ll open a PR on pip 
>> that uses the new library (my suggestion is to not bundle the library in the 
>> initial PR, and just import it normally so that the PR diff doesn’t include 
>> the full bundled library until there’s agreement on it). If there’s another 
>> tool (pipenv, whatever) that is looking to use that same functionality, open 
>> a WIP PR there too that switches it to using that. Use feedback and what you 
>> learn from trying to integrate in those libraries to influence back the 
>> design of the API itself.
>> 
>> Creating a PEP and creating the library and the PRs can happen in parallel, 
>> but at least for pip if something deserves a PEP, we’re not going to merge a 
>> PR until that PEP is generally agreed on. However it can be supremely useful 
>> to have them all going at the same time, because you run into things that 
>> you didn’t really notice until you went to actually implement it.
>> 
>> My other big suggestion would be to be careful about how much you bite off at 
>> one time. Pip’s internal code base is not the greatest, so pulling out 
>> smaller chunks at a time rather than trying to start right off pulling out a 
>> big topic is more likely to meet with success. Be cognizant of what the 
>> dependencie

[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-20 Thread Tzu-ping Chung

> On 21 Sep 2018, at 02:01, Bert JW Regeer  wrote:
> 
> 
> 
>> On Sep 19, 2018, at 23:22, Chris Jerdonek  wrote:
>> 
>> Thus, it's looking like things could be on track to split the user and 
>> maintainer base in two, with pip bearing the legacy burden and perhaps not 
>> seeing the improvements. Are we okay with that future?
> 
> This'll be a sad day. pip is still used as an installer by other build system 
> where using pipenv is simply not a possibility.

I am not quite sure I understand why you’d think so. pip has been bearing the
legacy burden for years, and if this is the future (not saying it is), it would
be more like just another day at the office for pip users, since nothing is
changing.
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/IUGYOCCIOJ7PRKNYEFW5FDEZBXYGOSPM/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel fora package)

2018-09-21 Thread Tzu-ping Chung
I agree with you about Pipfile. It is likely not something pip would directly
install packages from. pip could potentially add a “lock” command that generates
a Pipfile.lock from a Pipfile, or even start to work in a fashion like npm etc.,
but conceptually pip would only install things based on Pipfile.lock; if it
takes a Pipfile, that is used to generate a Pipfile.lock (and maybe install from
that).

Regarding the format of Pipfile.lock, the proposal of it being less 
tool-specific is interesting to me. Brett also touched on a similar proposition 
a while ago that maybe we could standardise a common lock file format (shared 
by Pipenv and Poetry in the context of the time), and I think it is a nice 
idea, too.

On the other hand, there are many other application dependency management tools 
out there, and as far as I know none of them actually have a lock file format 
with interoperability. JavaScript, for example, has maybe the most bipartisan 
state in that area (in npm and Yarn), and I don’t recall reading anything of 
this nature at all. I’m not saying this is wrong, but it’s interesting that 
Python, being relatively behind in this particular area, has this somewhat 
unique proposal here. (Again, this does not imply it’s either good or bad, just 
unique.)

An extremely generic name like requirements.lock is probably not a good idea, 
since it is not uncommon for a project to require multiple package managers 
(e.g. for multiple languages), and it would be a disaster if everyone uses 
generic names. If not tool-specific (e.g. yarn.lock), the name should at least 
be context-specific, like… I don’t know, pyproject? But that is taken :p (This 
is intentionally rhetorical, to touch on the we-should-use-pyproject-for-this 
camp. To be clear: I am not in that camp; that’s likely a bad idea unless we 
rethink the whole application-library distinction Python packaging makes.)

TP

Sent from Mail for Windows 10

From: Paul Moore
Sent: 21 September 2018 20:16
To: Nick Coghlan
Cc: Michael Merickel; Bert JW Regeer; Distutils
Subject: [Distutils] Re: Distlib vs Packaging (Was: disable building wheel fora 
package)

On Fri, 21 Sep 2018 at 11:41, Nick Coghlan  wrote:
>
> On Fri, 21 Sep 2018 at 05:47, Donald Stufft  wrote:
> > On Sep 20, 2018, at 3:35 PM, Paul Moore  wrote:
> > I don't think anyone's even spoken to the pip maintainers (yet?) about
> > supporting the pipfile format
> >
> > That comes from me, I initially wrote the Pipfile as a proof of concept / 
> > sketch of an API for replacing the requirements.txt format, which Kenneth 
> > took and created pipenv from. At some point I plan on trying to push 
> > support for those ideas back into pip (not the virtual environment 
> > management bits though). That’s obviously my personal goal though, and 
> > doesn’t represent an agreed upon direction for pip.
>
> And it's one where I think there are a couple of different levels of
> support that are worth considering:
>
> Q. Should pip support installing from Pipfile.lock files as well as
> requirements.txt files?
>
> A. Once the lock file format stabilises, absolutely, since this is
> squarely in pip's "component installer" wheelhouse.
>
> Q. Should "pip install" support saving the installed components to a
> Pipfile, and then regenerating Pipfile.lock?
>
> A. This is far less of a clearcut decision, as managing updates to a
> file that's intended to be checked in to source control is where I
> draw the line between "component installer" and
> "application/environment dependency manager".

Speaking as a pip developer:

Where's there a good summary of the pipfile format, the pipfile.lock
format, and their relationship and contrast with requirements.txt? I
don't view https://github.com/pypa/pipfile as a "good summary",
because it explicitly states that pipfile is intended to *replace*
requirements.txt, and I disagree strongly with that.

Also, pipfile is human-readable, but pipfile.lock isn't. As far as I
know, pipfile.lock is currently generated solely by pipfile - before
pip consumes pipfile.lock, I'd like to see that format discussed and
agreed as a formal interop standard that any tools wanting to pass
data to pip (for the use case the standard describes) can use. One
obvious thing I'd like to consider is changing the name to something
less tool-specific - requirements.lock maybe?

As far as the pipfile format is concerned, I see that more as pipenv's
human readable input file that is used to *generate* the lock file,
and I don't see it as something pip should consume directly, as that
would mean pip overlapping in functionality with pipenv.

If I'm misunderstanding the relationship between pip and pipenv, or
between pipenv and pipfile, I'm happy to be corrected. But can I
suggest that the best way to do so would be to amend the project pages
that are giving me the impressions I have above, and pointing me at
the corrected versions? That way, we can make sure that any
misinformation is corrected at source...

[Distutils] Re: setuptools configuration in pyproject.toml

2018-09-24 Thread Tzu-ping Chung
Not sure if you’re already aware of this, but there was a similar discussion
just a short while ago.

https://mail.python.org/mm3/archives/list/distutils-sig@python.org/thread/54QFJKANZOXY6NQZKEAPG346OL7MCQCV/#54QFJKANZOXY6NQZKEAPG346OL7MCQCV
 


Basically the answer is yeah, sure, if someone makes the effort to standardise 
it. No-one ever did, but you can be the first.
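
For illustration, a hypothetical sketch of what setuptools configuration in
pyproject.toml might look like. None of the key names under [tool.setuptools]
are standardised; they are assumptions modelled on today’s setup.cfg sections:

```toml
# Hypothetical sketch only -- setuptools does not support this as of this
# thread. Key names mirror setup.cfg [metadata]/[options] and are assumptions.
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

[tool.setuptools.metadata]
name = "example"
version = "1.0"

[tool.setuptools.options]
install_requires = ["requests>=2.0"]
```

This mirrors what flit and poetry already do with their own [tool.*] tables,
which is the precedent the proposal leans on.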

TP


> On 24/9, 2018, at 23:30, Bernat Gabor  wrote:
> 
> I'm aware this might be a controversial subject, so let's have the initial 
> discussion about it here first for full disclosure and see what people think 
> about it. Should setuptools support pyproject.toml as configuration source or 
> not (alternative to setup.cfg which it already does - 
> https://setuptools.readthedocs.io/en/latest/setuptools.html#configuring-setup-using-setup-cfg-files
>  
> )?
> 
> The main benefit of having this would be to decrease configuration files and 
> have build dependencies and other types of dependencies in one location. 
> Furthermore many other packaging projects (flit, poetry) already do define 
> their dependencies inside pyproject.toml; so would create one unified 
> location where to look for such in the future. 
> 
> The counter-argument is that "a big part of pyproject.toml was keeping that 
> file clean" and would furthermore increase the size of that file down the 
> line.
> 
> So what do people think? Should we encourage or discourage to have a single 
> python project file?
> 
> I'm personally supporting build/code quality tools supporting pyproject.toml 
> as their main configuration file.
> 
> Thanks
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/C3JEBOCQEILLPXK4FDQPADCFO6WWW6JT/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/GGGNAVCNM7X72IS4R7XXH2NZABW33IHZ/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-25 Thread Tzu-ping Chung
Pipenv wraps pip usages inside a virtual environment, so pip is always 
available via “pipenv run pip”,
so in a sense Pipenv “supports” everything pip does. But as far as things 
Pipenv actually has wrapper
commands for, it only tries to be pip’s functional superset in “install” and 
“uninstall”; everything else
is out of scope.

There are constantly people advocating adding more, or having confusion between 
similarly-named
commands (e.g. check), but that’s another issue…

TP


> On 25/9, 2018, at 08:22, Chris Jerdonek  wrote:
> 
> On Fri, Sep 21, 2018 at 5:14 AM, Paul Moore  > wrote:
>> If I'm misunderstanding the relationship between pip and pipenv, or
>> between pipenv and pipfile, I'm happy to be corrected. But can I
>> suggest that the best way to do so would be to amend the project pages
>> that are giving me the impressions I have above, and pointing me at
>> the corrected versions? That way, we can make sure that any
>> misinformation is corrected at source...
> 
> [This question isn't directed at Paul, even though it's in reply to his 
> email.]
> 
> Thanks for the good discussion, all. To clarify the overlap in
> functionality between pip and pipenv further, will pipenv be a strict
> superset in terms of what it can do?
> 
> Tzu-ping said earlier that users expect pipenv to behave the same as
> pip, so I'm wondering if there are any areas of functionality that pip
> covers that pipenv won't be able to (or that doesn't plan to).
> 
> --Chris
> 
> 
>> 
>> Paul
>> 
>> PS Full disclosure - I've tried to use pipenv in a couple of local
>> projects, because of the hype about it being the "great new thing" and
>> found it basically of no use for my requirements/workflow. So I may
>> have a biased view of either pipenv, or how it's being presented. I'm
>> trying to be objective in the above, but my bias may have slipped
>> through.
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org
>> To unsubscribe send an email to distutils-sig-le...@python.org
>> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
>> Message archived at 
>> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/SR3FAMLZM646GT3IPYFILD47KMRWOALD/
> --
> Distutils-SIG mailing list -- distutils-sig@python.org 
> 
> To unsubscribe send an email to distutils-sig-le...@python.org 
> 
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/ 
> 
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/Y75UQYWNCCPMM77UVQ7NTDMSIBRIO27W/
>  
> 
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/Q4IZBOMZLZZ7SWQHHHYQDDXNFVVRILEL/


[Distutils] Re: Distlib vs Packaging (Was: disable building wheel for a package)

2018-09-25 Thread Tzu-ping Chung
We are using pip internals for things pip wasn’t implemented for. Specifically,
Pipenv uses pip’s package-fetching functions to implement its platform-agnostic
resolver. pip does not have this, so there’s no functional overlap here. Those
utilities are used to build something that doesn’t exist in pip, so there’s no
duplicated efforts.

My recent focus on making sense of packaging implementations and splitting out
parts of pip is exactly to prevent potential duplicated efforts. If we can’t 
use pip
internals, let’s make things we want to use *not* internal!
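
To sketch the kind of component being discussed, here is a toy backtracking
resolver over an in-memory index. It is purely illustrative: pip and Pipenv
must additionally fetch and build artifacts just to learn a candidate’s
dependencies, which is exactly the package-fetching machinery mentioned above.
All names here are mine:

```python
def resolve(requirements, index, pinned=None):
    """Tiny backtracking resolver. `requirements` is a list of
    (name, allowed_versions) pairs; `index` maps name -> version ->
    list of further (name, allowed_versions) dependencies."""
    pinned = dict(pinned or {})  # copy so backtracking can't corrupt callers
    if not requirements:
        return pinned
    name, allowed = requirements[0]
    if name in pinned:
        # Already chosen: fine if the choice satisfies this requirement too.
        return resolve(requirements[1:], index, pinned) if pinned[name] in allowed else None
    for version in sorted(index.get(name, {}), reverse=True):  # prefer newest
        if version not in allowed:
            continue
        pinned[name] = version
        result = resolve(requirements[1:] + index[name][version], index, pinned)
        if result is not None:
            return result
        del pinned[name]  # backtrack and try an older version
    return None

index = {
    "a": {"2.0": [("b", {"1.0"})], "1.0": []},
    "b": {"1.0": [], "2.0": []},
}
print(resolve([("a", {"1.0", "2.0"})], index))  # {'a': '2.0', 'b': '1.0'}
```

The hard part pip’s internals solve, and which this sketch hides, is producing
`index` lazily from remote metadata, which is why a non-internal fetching layer
matters so much.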

TP


> On 25/9, 2018, at 18:38, Chris Jerdonek  wrote:
> 
> On Tue, Sep 25, 2018 at 3:21 AM, Nick Coghlan  wrote:
>> On Tue, 25 Sep 2018 at 19:48, Chris Jerdonek  
>> wrote:
>>> What I'm trying to gauge is, if the plan is for pipenv not to depend
>>> on pip, and pipenv has strictly greater functionality than pip, then
>>> what purpose will PyPA have in continuing to develop pip in addition
>>> to pipenv?
>> 
>> That's not the plan, as all of pip's features for actually
>> installing/uninstalling packages, and for introspecting the *as built*
>> package environment, aren't things where pipenv's needs diverge from
>> pip's.
> 
> That's not what Tzu-ping said though. In an earlier email, he said,
> "If we can’t use pip internals, then yes, the plan is to not depend on
> pip."
> 
> --Chris
> 
> 
>> 
>> Where their needs diverge is at the dependency resolver level, as
>> pipenv needs to be able to generate a lock file for an arbitrary
>> target environment that may not match the currently running Python
>> interpreter *without* necessarily installing those packages anywhere
>> (although it may need to build wheels to get the dependency listing),
>> whereas pip has the more concrete task of "get theses packages and
>> their dependencies installed into the currently active environment".
>> 
>> If it helps, think of pipenv relating to pip in much the same way as
>> pip-tools (pip-compile/pip-sync) relates to pip, just with Pipfile and
>> Pipfile.lock instead of requirements.in and requirements.txt.
>> 
>> Cheers,
>> Nick.
>> 
>> --
>> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/LYY4SED3GISTJPNURZKAM45FZMYAAVKF/


[Distutils] Re: Notes from python core sprint on workflow tooling

2018-09-30 Thread Tzu-ping Chung

> On 01/10, 2018, at 00:47, Dan Ryan  wrote:
> 
>> Can't install Python. (There's... really no reason we *couldn't* distribute 
>> pre-built Python interpreters on PyPI? between the python.org installers and 
>> the manylinux image, we're already building redistributable run-anywhere 
>> binaries for the most popular platforms on every Python release; we just 
>> aren't zipping them up and putting them on PyPI.)
> 
> Erm, well actually you can install python currently via pyenv on linux, and 
> Tzu-ping is the maintainer of a pyenv-clone on windows which we've just never 
> really got around to integrating fully.  I've spoken to some of the folks 
> over at Anaconda and I know they are interested in this as well especially 
> given that it's pretty straightforward.  It hasn't been a primary focus 
> lately, but the tooling does exist (I don't think I've ever used it 
> personally though)

Regarding this specifically, my project is not actually a pyenv clone, since it
is next to impossible to automate compilation of old Python versions on Windows.
My project only automates the download, installation, and configuration of
binary releases from python.org.

pyenv’s approach of always compiling from source makes it quite flexible, but
for an official tool it is likely better to only automate downloads from
python.org.
Official binary distributions are vastly underused, and this IMO has long
produced fragmentation in the community. Almost all platform-specific Python
distributors introduce their own quirks (Homebrew breaks all your virtual
environments every time you upgrade Python, and don’t get me started on
Debian). These distributions feel “broken” when people hit specific use cases,
and people then blame Python itself for not “fixing” them.

A standard (official?), automated runtime management tool à la rustup would
help greatly with this situation, so we don’t need to constantly respond to
questions with “how did you install Python”, followed by “oh, that’s broken,
but it’s not our fault, don’t use it”. This is probably out of scope for
distutils-sig though.


TP

> 
> Anyway, this is all a good discussion to have and I really appreciate you 
> kicking it off. I've been following the __pypackages__ conversation a bit 
> since pycon and I honestly don't have much opinion about where we want to put 
> stuff, but I'm not sure  that the impact of the folder is going to be as 
> great to the user as  people might imagine -- the tooling is already being 
> built, so maybe it's just a matter of agreeing on that as the place to put 
> stuff, which schema to follow, and honestly working with some new users.  I 
> do this quite a bit but I haven't done any formal information gathering.  
> Anecdotally I'll always tell you I'm right, but if we had some user data on 
> specific pain points  / usability issues I'd definitely be prepared to change 
> my mind.
> 
> Dan Ryan
> gh: @techalchemy // e: d...@danryan.co
> 
>> -Original Message-
>> From: Nathaniel Smith [mailto:n...@pobox.com]
>> Sent: Sunday, September 30, 2018 6:42 AM
>> To: distutils-sig
>> Subject: [Distutils] Notes from python core sprint on workflow tooling
>> 
>> Now that the basic wheels/pip/PyPI infrastructure is mostly
>> functional, there's been a lot of interest in improving higher-level
>> project workflow. We have a lot of powerful tools for this –
>> virtualenv, pyenv, conda, tox, pipenv, poetry, ... – and more in
>> development, like PEP 582 [1], which adds a support for project-local
>> packages directories (`__pypackages__/`) directly to the interpreter.
>> 
>> But to me it feels like right now, Python workflow tools are like the
>> blind men and the elephant [2]. Each group sees one part of the
>> problem, and so we end up with one set of people building legs,
>> another a trunk, a third some ears... and there's no overall plan for
>> how they can fit together.
>> 
>> For example, PEP 582 is trying to solve the problem that virtualenv is
>> really hard to use for beginners just starting out [3]. This is a
>> serious problem! But I don't want a solution that *only* works for
>> beginners starting out, so that once they get a little more
>> sophisticated they have to throw it out and learn something new from
>> scratch.
>> 
>> So I think now might be a time for a bit of top-down design. **I want
>> a picture of the elephant.** If we had that, maybe we could see how
>> all these different ideas could be put together into a coherent whole.
>> So at the Python core sprint a few weeks ago, I dragged some
>> interested parties [4] into a room with a whiteboard [5], and we made
>> a start at it. And now I'm writing it up to share with you all.
>> 
>> This is very much a draft, intended as a seed for discussion, not a 
>> conclusion.
>> 
>> [1] https://www.python.org/dev/peps/pep-0582/
>> [2] https://en.wikipedia.org/wiki/Blind_men_and_an_elephant
>> [3] https://www.python.

[Distutils] Re: Notes from python core sprint on workflow tooling

2018-09-30 Thread Tzu-ping Chung
I can’t speak for others (also not really sure what “we” should include here…),
but I have had a couple of interactions with the author on Twitter. I can’t
recall whether I invited him to join distutils-sig specifically, but I would
understand if he was reluctant to do so even if I did. The mailing list can be
a bit intimidating unless you have a good topic to join with, especially for
someone without an English-speaking background (I am speaking from experience
here).

Overall I could see it being a good idea to invite him to join the mailing
list, and/or to provide input on this particular discussion. Would you be
interested in doing this?

TP



> On 01/10, 2018, at 01:37, Nicholas Chammas  wrote:
> 
> On Sun, Sep 30, 2018 at 6:48 AM Nathaniel Smith  wrote:
> So I think now might be a time for a bit of top-down design. **I want
> a picture of the elephant.** If we had that, maybe we could see how
> all these different ideas could be put together into a coherent whole.
> So at the Python core sprint a few weeks ago, I dragged some
> interested parties [4] into a room with a whiteboard [5], and we made
> a start at it. And now I'm writing it up to share with you all.
> 
> Just curious: Have we directly engaged the author of Poetry 
>  to see if he is interested in 
> participating in these discussions?
> 
> I ask partly just as an interested observer, partly because I see that Pipenv 
> tends to dominate these discussions, and partly because I find Poetry more 
> appealing than Pipenv  
> and -- not being a packaging expert -- I want to see it discussed in more 
> depth by the experts here.
> 
> Nick
> 
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/NKFEQ3G2N5F745NZ6VNJIAJRXOWNYT5T/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/QUCZF2WXQEHSANMVSFUSFBI24J2E5YFJ/


[Distutils] Re: Notes from python core sprint on workflow tooling

2018-09-30 Thread Tzu-ping Chung

> On 01/10, 2018, at 00:47, Dan Ryan  wrote:
> 
>> Uses Pipfile as a project marker instead of pyproject.toml.
> 
> See above.  pyproject.toml wasn't standardized yet when pipenv was released 
> (and still isn't, beyond that it is a file that could exist and store 
> information).  Pipfile was intended to replace requirements.txt per some 
> previous thread on the topic, and pipenv was an experimental implementation 
> of the separation between the two different ways that people currently use 
> requirements.txt in the wild -- one as a kind of abstract, unpinned 
> dependency list (Pipfile),  and the other as a transitive closure 
> (Pipfile.lock).  Since neither is standardized _for applications_, I'm not 
> totally sure this is an actual sticking point.  
> 
> In either case, this seems super minor…

I feel this would need to be extensively discussed either way before the
community can jump into a decision. The discussion I’ve seen has been quite
split on whether we should use one file or the other, but with little
explanation of why beyond “one file is better than two”.

To me, there are two main things one would want to specify dependencies for: a 
thing
you want to import (as a Python library from another Python source file), and a 
thing
you want to run (as a command, a webapp, etc.). These setups require inherently 
different
ways to specify dependencies; sometimes they match, but not always, and a tool would
would
(eventually) need to provide a solution when they diverge (i.e. when a project 
needs to
be both importable and runnable, and has different dependency sets for them).

Of course, the solution to this may well be to always use pyproject.toml, and
to design it to fit both scenarios. In this case, however, the fields need to
be designed carefully to make sure all areas are taken care of. NPM, for
example, requires you to specify a (package) name for your project even if it
never needs one (e.g. a website backend), which to me is a sign of designing
too much toward package distribution rather than standalone runnables. Other
tools using a one-file-of-truth configuration have a similar problem to a
degree, as far as I can tell. Another example would be Rust’s Cargo, which
cannot specify a binary-only dependency. A dual-file configuration (e.g.
Pipfile and pyproject.toml) would essentially punt this design decision; it is
probably not the purest solution, but at least it keeps things open enough
that they don’t collide.
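As a concrete (hypothetical) illustration of that dual-file split, the same project could carry an application-style file and a library-style one side by side. The table name under pyproject.toml below is invented for illustration; no standard for it existed at the time:

```toml
# --- Pipfile: dependencies for the "thing you want to run" (an application) ---
[packages]
requests = ">=2.19"

[dev-packages]
pytest = "*"

# --- pyproject.toml: dependencies for the "thing you want to import" (a library) ---
# ("tool.example.dependencies" is a hypothetical table name)
[tool.example.dependencies]
requests = ">=2.19"
```

The point is that the two halves can diverge (e.g. the application pins, the library stays loose) without colliding in a single file.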

Maybe another “solution” would be to have multiple (two?) files, one for each
use case, but with a tool for syncing them automatically if desired. This is
how Bundler works if you use it to package a Ruby Gem, but I am not sure
whether that is by design or an artefact of circumstances (the Gem
specification predates Bundler, much like how Python packages predate the
all-in-one project management tools now appearing).

I don’t have much to provide at the current time regarding what the best design 
should
look like, but want to voice my concerns before it is too late.

TP


> 
>> Not shipped with Python. (Obviously not pipenv's fault, but nonetheless.)
> (Not sure it should be)
> 
>> Environments should be stored in project directory, not off in $HOME 
>> somewhere. (Not sure what this is about, but some of the folks present were 
>> quite insistent.)
> 
> They used to be by default stored in $PROJECT/.venv but user feedback led us 
> to use $WORKON_HOME by default.  This is configurable by environment variable 
> ($PIPENV_VENV_IN_PROJECT) or if you simply have a virtualenv in the .venv 
> folder in your project directory.
> 
>> Environments should be relocatable.
> 
> And that will be possible whenever we can use venv across platforms and 
> python versions. Currently that isn't possible, and we are forced to use 
> virtualenv for compatibility.
> 
>> Hardcoded to only support "default" and "dev" environments, which is 
>> insufficient.
> 
> Why? I mean, if you are planning to replace setuptools / flit / other build 
> systems with pipenv and use pipfile as your new specification for declaring 
> extras, I guess, but that's not how it's designed currently. Beyond that, I 
> think we need some actual information on this one -- adding more complexity 
> to any tool (including this kind of complexity) is going to ask more of the 
> user in terms of frontloaded knowledge.  This constraint limits the space a 
> bit and for applications, I've very rarely seen actual limitations of this 
> setup (but am interested, we have obviously had this feedback before but are 
> not eager to add knobs in this specific area).
> 
>> No mechanism for sharing prespecified commands like "run tests" or 
>> "reformat".
> 
> There is, but the documentation on the topic is not very thorough: 
> https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts 
> See also: https://github.com/sarugaku/requirementslib/blob/master/Pipfile#L26 
> 
> For an example for the specific cases you mentioned, the Pipfile 

[Distutils] Re: Notes from python core sprint on workflow tooling

2018-09-30 Thread Tzu-ping Chung
I didn’t intend my comments to be specific to Pipenv, but they were about
Pipfile being cited as a reason Pipenv is not suitable.

Whether different kinds of projects should share one configuration file is an
important but rarely addressed design decision, and that decision has not yet
been made. Treating Pipenv’s use of Pipfile as a project marker (instead of
pyproject.toml) as a complaint jumps into a particular decision, and IMO risks
skipping this discussion.


TP


> On 01/10, 2018, at 03:56, Paul Moore  wrote:
> 
> On Sun, 30 Sep 2018 at 20:50, Tzu-ping Chung  wrote:
>> 
>> 
>>> On 01/10, 2018, at 00:47, Dan Ryan  wrote:
>>> 
>>>> Uses Pipfile as a project marker instead of pyproject.toml.
>>> 
>>> See above.  pyproject.toml wasn't standardized yet when pipenv was released 
>>> (and still isn't, beyond that it is a file that could exist and store 
>>> information).  Pipfile was intended to replace requirements.txt per some 
>>> previous thread on the topic, and pipenv was an experimental implementation 
>>> of the separation between the two different ways that people currently use 
>>> requirements.txt in the wild -- one as a kind of abstract, unpinned 
>>> dependency list (Pipfile),  and the other as a transitive closure 
>>> (Pipfile.lock).  Since neither is standardized _for applications_, I'm not 
>>> totally sure this is an actual sticking point.
>>> 
>>> In either case, this seems super minor…
>> 
>> I feel this would need to be extensively discussed either way before the 
>> community can
>> jump into a decision. The discussion I’ve seen has been quite split on 
>> whether we should
>> use one file or the other, but nothing very explaining why outside of “one 
>> file is better
>> than two”.
> 
> This discussion seems to have diverted into being about pipenv. Can I
> ask that the pipenv-specific discussions be split out into a different
> thread? (For example, I'm not clear if Tzu-Ping's comment here is
> specific to pipenv or not).
> 
> My main reason is that (as I noted in my reply to Nathaniel's post) my
> use cases are, as far as I can tell, *not* suitable for pipenv as it's
> currently targeted (I'm willing to be informed otherwise, but please,
> can we do it on another thread or off-list if it's not generally
> useful). And I'd rather that we kept the central discussion
> tool-agnostic until we come to some view on what tools we'd expect to
> be suggesting to users in the various categories we end up
> identifying.
> 
> Thanks,
> Paul

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/OVTZR4J45GB5BVBPA22J754DTQLTNGQD/


[Distutils] Re: Idea: perennial manylinux tag

2018-11-30 Thread Tzu-ping Chung
Also betraying my lack of knowledge of how this works, I read this section in
PEP 513 (which defines manylinux1):

> To be eligible for the manylinux1 platform tag, a Python wheel must therefore 
> both (a) contain binary executables and compiled code that links only to 
> libraries with SONAMEs included in the following list:

…
libglib-2.0.so.0

Does this mean that only tags down to 2.0 need to be generated?


TP


Sent from Mail for Windows 10

From: Brett Cannon
Sent: 01 December 2018 02:12
To: Nathaniel Smith
Cc: distutils sig
Subject: [Distutils] Re: Idea: perennial manylinux tag

I think either approach works, but if we do go with a glibc-versioned tag, we
should make it explicit in the tag, e.g. `manylinux_glibc_{version}`. That way
if we ever choose to support musl (for Alpine) we can.

The one question I do have is how the compatibility tags will work for a tagged 
platform? E.g. if you say manylinux_glibc_2_12 for manylinux2010, then do we 
generate from 2.12 down to 1.0 (or whatever the floor is for manylinux1)? This 
would match how compatibility tags work on macOS where you go from your macOS 
version all the way down to the first version supporting your CPU architecture.

And just to double-check, I'm assuming we don't want to just jump straight to 
distro tags and say if you're centos_6 compatible then you're fine? I assume 
that would potentially over-reach on compatibility in terms of what might be 
dynamically-linked against, but I thought I would ask because otherwise the 
glibc-tagged platform will be a unique hybrid of macOS + not an actual OS 
restriction.

On Fri, 30 Nov 2018 at 00:10, Nathaniel Smith  wrote:
Hi all,

The manylinux1 -> manylinux2010 transition has turned out to be very difficult. 
Timeline so far:

March 2017: CentOS 5 went EOL
April 2018: PEP 571 accepted
May 2018: support for manylinux2010 lands in warehouse
November 2018: support lands in auditwheel, and pip master
December 2018: 21 months after CentOS 5 EOL, we still don't have an official 
build environment, or support in a pip release

We'll get through this, but it's been super painful and maybe we can change 
things somehow so it will suck less next time.

We don't have anything like this pain on Windows or macOS. We never have to 
update pip, warehouse, etc., after those OSes hit EOLs. Why not?

On Windows, we have just two tags: "win32" and "win_amd64". These are defined 
to mean something like "this wheel will run on any recent-ish Windows system". 
So the meaning of the tag actually changes over time: it used to be that if a 
wheel said it ran on win32, then that meant it would work on winxp, but since 
winxp hit EOL people started uploading "win32" wheels that don't work on winxp, 
and that's worked fine.

On macOS, the tags look like "macosx_10_9_x86_64". So here we have the OS 
version embedded in the tag. This means that we do occasionally switch which 
tags we're using, kind of like how manylinux1 -> manylinux2010 is intended to 
work. But, unlike for the manylinux tags, defining a new macosx tag is totally 
trivial: every time a new OS version is released, the tag springs into 
existence without any human intervention. Warehouse already accepts uploads 
with this tag; pip already knows which systems can install wheels with this 
tag, etc.
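The "springs into existence without any human intervention" property comes from the tag embedding an ordinary version number, so compatible tags can simply be enumerated. A rough sketch (the function name and the 10.6 floor are assumptions for illustration, not pip's actual implementation):

```python
def macos_platform_tags(major, minor, arch="x86_64", floor_minor=6):
    """List macosx platform tags from the given OS version down to a floor.

    Illustrative only: pip's real tag generation lives in its internals;
    the floor (10.6 here) is an assumption for this example.
    """
    return [
        f"macosx_{major}_{m}_{arch}"
        for m in range(minor, floor_minor - 1, -1)
    ]

print(macos_platform_tags(10, 9))
# ['macosx_10_9_x86_64', 'macosx_10_8_x86_64',
#  'macosx_10_7_x86_64', 'macosx_10_6_x86_64']
```

A new OS release just extends the range; no tool needs a new hard-coded tag.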

Can we take any inspiration from this for manylinux?

We could do the Windows thing, and have a plain "manylinux" tag that means "any 
recent-ish glibc-based Linux". Today it would be defined to be "any distro 
newer than CentOS 6". When CentOS 6 goes out of service, we could tweak the 
definition to be "any distro newer than CentOS 7". Most parts of the toolchain 
wouldn't need to be updated, though, because the tag wouldn't change, and by 
assumption, enforcement wouldn't really be needed, because the only people who 
could break would be ones running on unsupported platforms. Just like happens 
on Windows.

We could do the macOS thing, and have a "manylinux_${glibc version}" tag that 
means "this package works on any Linux using glibc newer than ${glibc 
version}". We're already using this as our heuristic to handle the current 
manylinux profiles, so e.g. manylinux1 is effectively equivalent to 
manylinux_2_5, and manylinux2010 will be equivalent to manylinux_2_12. That way 
we'd define the manylinux tags once, get support into pip and warehouse and 
auditwheel once, and then in the future the only thing that would have to 
change to support new distro releases or new architectures would be to set up a 
proper build environment.
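A minimal sketch of what the consumer-side check for a glibc-versioned tag could look like. This mirrors the heuristic pip already uses internally (querying `gnu_get_libc_version` via ctypes), but the function names and structure here are my own:

```python
import ctypes


def current_glibc_version():
    """Return the running glibc version as a string, or None when the
    platform's libc is not glibc (macOS, musl-based distros, Windows...)."""
    try:
        process_libc = ctypes.CDLL(None)  # the libc already loaded in-process
        get_version = process_libc.gnu_get_libc_version
    except (OSError, AttributeError):
        return None
    get_version.restype = ctypes.c_char_p
    return get_version().decode("ascii")


def manylinux_compatible(tag_glibc, system_glibc):
    """A hypothetical manylinux_X_Y wheel runs anywhere glibc >= X.Y."""
    def as_tuple(version):
        return tuple(int(part) for part in version.split("."))
    return as_tuple(system_glibc) >= as_tuple(tag_glibc)


# Per the PEPs: manylinux1 ~ glibc 2.5, manylinux2010 ~ glibc 2.12.
print(manylinux_compatible("2.12", "2.28"))  # True
print(manylinux_compatible("2.17", "2.12"))  # False
```

With this scheme, supporting a newer baseline is just a bigger number in the tag; no installer release is needed.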

What do y'all think?

-n
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/6AFS4HKX6PVAS76EQNI7JNTGZZRHQ6SQ/



[Distutils] Re: older pypi packages

2018-12-18 Thread Tzu-ping Chung
Paul has described the technical details behind this phenomenon, but
to be more explicit: it is not pip that breaks older packages, but the
new PyPI server (pypi.org instead of the old pypi.python.org) that does.

So no, there is not a legacy mode in pip. Furthermore, you won’t be
able to install the package now, even if you have the old pip version.

The only way to overcome this is to find the original package, and
either upload it to PyPI, or serve it yourself on your own server.


> On 18/12/2018, at 22:05, Paul Moore  wrote:
> 
> The PyPI index page for fcrypt (https://pypi.org/simple/fcrypt/) has
> no file links on it. I don't know why, but there's nothing there for
> pip to download.
> 
> The "Download" link points to a file not on PyPI - maybe that's the
> issue here, PEP 470 describes the process that was undertaken to
> remove external file hosting from PyPI (and the reasons behind doing
> so).
> 
> Paul
> 
> On Tue, 18 Dec 2018 at 13:55, Robin Becker  wrote:
>> 
>> I recently had to rebuild a server and find that pip 18.1 is apparently 
>> unable to install at least some older packages eg
>> 
>>> $ bin/pip install fcrypt
>>> Collecting fcrypt
>>>  Could not find a version that satisfies the requirement fcrypt (from 
>>> versions: )
>>> No matching distribution found for fcrypt
>> 
>> the version I needed is in fact the last released 1.3.1  (in 2004) and it 
>> was installed by an earlier pip. I tried being more explicit
>> 
>>> $ bin/pip install fcrypt==1.3.1
>>> Collecting fcrypt==1.3.1
>>>  Could not find a version that satisfies the requirement fcrypt==1.3.1 
>>> (from versions: )
>>> No matching distribution found for fcrypt==1.3.1
>> 
>> I assume that latest pip needs information from the package / pypi data that 
>> is not available. Luckily installing from a pypi
>> download works.
>> 
>> Is there any legacy mode in pip? It seems wrong to cause these older 
>> packages to become unusable.
>> --
>> Robin Becker
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org
>> To unsubscribe send an email to distutils-sig-le...@python.org
>> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
>> Message archived at 
>> https://mail.python.org/archives/list/distutils-sig@python.org/message/FT6JKV352GJTTYLHFNBTDCKYP77BHZH7/
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/UQD226NENS52YQ2VCJJ5F7OAXVAB7KP5/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/ZQBZF7HE2Q3OFYLW23G2WQOKGT6V6Q2D/


[Distutils] Re: PEP 517 - source dir and sys.path

2019-01-27 Thread Tzu-ping Chung
Yes, of course people can. I think the problem is that common Python 
installations default to adding the cwd to sys.path, so people expect this 
to “just work”. A PEP 517 build not doing it is unintuitive unless you follow 
Python packaging closely, and most would simply assume pip is broken.

(Incidentally, the Windows embedded distribution also does not include the cwd 
in sys.path by default, and I do see confused people on discussion boards or 
issue trackers from time to time complaining things don’t work. The embedded 
distribution’s user base is quite small, however, so this is not really a 
problem.)

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
Sent from my iPhone

> On 28 Jan 2019, at 04:17, Bernat Gabor  wrote:
> 
> It feels to me that importing from setup.py level is a code smell and should 
> be avoided (I mean what happens if you move the source into src layout, do 
> you now want tools to also append the src folder additionally if exists?). 
> Can't people just inject at the start of setup.py into sys path the cwd if 
> they really want to go with it? (feels superior over pinning pip to 18.0).
> 
>> On Sun, Jan 27, 2019 at 7:48 PM Paul Moore  wrote:
>> On Sun, 27 Jan 2019 at 14:39, Thomas Kluyver  wrote:
>> > I think the rule about the CWD not being on sys.path only applies to 
>> > loading a proper PEP 517 build backend when that's specified in 
>> > pyproject.toml.
>> 
>> I'm not entirely sure what you're intending when you refer to a
>> "proper PEP 517 build backend". The setuptools backend is a perfectly
>> acceptable one, the only difference here is that we default to it in
>> pip when the project hasn't specified a backend. The reason we do that
>> is simple - if we don't, we're unlikely to be able to remove the
>> legacy setuptools code any time in the foreseeable future, as there
>> will be essentially no incentive for existing projects to specify a
>> setuptools backend. Also, we'd get basically no testing of the new PEP
>> 517 code, for the same reason. (Yes, it would have been better to test
>> before release, but we've never had any success getting beta testers
>> for pip). We could have made PEP 517 entirely opt-in, but (a) that
>> would have drastically slowed down adoption, and (b) implicit in
>> everything we'd discussed was the assumption that there would be a
>> setuptools backend that was semantically equivalent to running
>> setup.py the way we currently did, so there was no need to. Clearly
>> that assumption turned out to be wrong, and that's where the issues
>> for our users arose.
>> 
>> Having said this, I don't have any problem with changing the default
>> backend used by pip. In fact, if the setuptools project don't view it
>> as a goal of their existing backend to replicate setup.py behaviour,
>> then I think it's pretty much essential that we change. Pip needs to
>> default to a PEP 517 backend that behaves the same as pip's legacy
>> code - that's essential for backward compatibility if we're to remove
>> or deprecate the legacy setup.py code path. The only problem is that
>> right now there doesn't appear to be a backend that suits our
>> requirements.
>> 
>> Paul Ganssle suggested that setuptools could provide an alternative
>> ("legacy") setuptools backend that preserved full setup.py
>> compatibility (including having the current directory on sys.path).
>> That seems to me to be a reasonable solution, and I appreciate the
>> offer. In practice, the legacy setuptools backend doesn't *have* to be
>> implemented by setuptools itself - it could be a separate project if
>> setuptools don't want to maintain it themselves. But IMO it would be
>> better if the setuptools team are OK with maintaining it.(At a pinch,
>> pip could even implement that backend internally, but adding pip to
>> every build environment seems a bit heavyweight).
>> 
>> So in terms of practical resolution of the problems people are hitting
>> since pip 19.0 was released, I don't think there's a significant issue
>> here. We need a new pip release, sure, and someone needs to write the
>> new backend, but I don't think either of those tasks are huge.
>> 
>> What is more difficult is the question of whether the PEP should
>> change. As Chris pointed out, the previous discussion ended up saying
>> that the build directory should not be on sys.path, but acknowledged
>> that mandating that might cause issues. So the question now is, are
>> the issues we've seen big enough that we want to change P

[Distutils] Re: PEP 517 - source dir and sys.path

2019-01-27 Thread Tzu-ping Chung
I guess this depends on how explicit you want to be. PEP 517 is not enabled 
unconditionally, only when the project root contains pyproject.toml. The 
problem is that other projects (unrelated to packaging) take advantage of the 
file format’s existence, resulting in people “buying in” to the situation 
unknowingly. Things are already explicit from pip’s point of view, but 
(unanticipated?) third-party usage of the same spec, unrelated to PEP 517, 
muddles the situation.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
Sent from my iPhone

> On 28 Jan 2019, at 04:37, Bernat Gabor  wrote:
> 
> This is more an argument to not use pep-517 unless people explicitly specify 
> the backend, at which point they acknowledge to buy in that the cwd is not on 
> sys path (and they need to alter their packaging code accordingly).
> 
> On Sun, 27 Jan 2019, 20:31 Tzu-ping Chung wrote:
>> Yes, of course people can. I think the problem is that common Python 
>> installations defaults to adding the cwd into sys.path, so people expect 
>> this to “just work”.  A PEP 517 not doing it is not intuitive unless you 
>> follow Python packaging closely, and most would simply assume pip is broken.
>> 
>> (Incidentally, the Windows embedded distribution also does not include the 
>> cwd in sys.path by default, and I do see confused people on discussion 
>> boards or issue trackers from time to time complaining things don’t work. 
>> The embedded distribution’s user base is quite small, however, so this is 
>> not really a problem.)
>> 
>> --
>> Tzu-ping Chung (@uranusjr)
>> uranu...@gmail.com
>> Sent from my iPhone
>> 
>>> On 28 Jan 2019, at 04:17, Bernat Gabor  wrote:
>>> 
>>> It feels to me that importing from setup.py level is a code smell and 
>>> should be avoided (I mean what happens if you move the source into src 
>>> layout, do you now want tools to also append the src folder additionally if 
>>> exists?). Can't people just inject at the start of setup.py into sys path 
>>> the cwd if they really want to go with it? (feels superior over pinning pip 
>>> to 18.0).
>>> 
>>>> On Sun, Jan 27, 2019 at 7:48 PM Paul Moore  wrote:
>>>> On Sun, 27 Jan 2019 at 14:39, Thomas Kluyver  wrote:
>>>> > I think the rule about the CWD not being on sys.path only applies to 
>>>> > loading a proper PEP 517 build backend when that's specified in 
>>>> > pyproject.toml.
>>>> 
>>>> I'm not entirely sure what you're intending when you refer to a
>>>> "proper PEP 517 build backend". The setuptools backend is a perfectly
>>>> acceptable one, the only difference here is that we default to it in
>>>> pip when the project hasn't specified a backend. The reason we do that
>>>> is simple - if we don't, we're unlikely to be able to remove the
>>>> legacy setuptools code any time in the foreseeable future, as there
>>>> will be essentially no incentive for existing projects to specify a
>>>> setuptools backend. Also, we'd get basically no testing of the new PEP
>>>> 517 code, for the same reason. (Yes, it would have been better to test
>>>> before release, but we've never had any success getting beta testers
>>>> for pip). We could have made PEP 517 entirely opt-in, but (a) that
>>>> would have drastically slowed down adoption, and (b) implicit in
>>>> everything we'd discussed was the assumption that there would be a
>>>> setuptools backend that was semantically equivalent to running
>>>> setup.py the way we currently did, so there was no need to. Clearly
>>>> that assumption turned out to be wrong, and that's where the issues
>>>> for our users arose.
>>>> 
>>>> Having said this, I don't have any problem with changing the default
>>>> backend used by pip. In fact, if the setuptools project don't view it
>>>> as a goal of their existing backend to replicate setup.py behaviour,
>>>> then I think it's pretty much essential that we change. Pip needs to
>>>> default to a PEP 517 backend that behaves the same as pip's legacy
>>>> code - that's essential for backward compatibility if we're to remove
>>>> or deprecate the legacy setup.py code path. The only problem is that
>>>> right now there doesn't appear to be a backend that suits our
>>>> requirements.
>>>> 
>>>> Paul Ganssle suggested that setuptools could provide an alternative
>>>> ("le

[Distutils] Re: Update PEP 508 to allow version specifiers

2019-01-29 Thread Tzu-ping Chung
I’m wondering, why is it needed to specify both a version and a link? I assume 
the version specifier would be redundant when a link is provided as the source, 
since the link can only point to one possible package version.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
Sent from my iPhone

> On 29 Jan 2019, at 17:07, Paul Moore  wrote:
> 
>> On Tue, 29 Jan 2019 at 08:50, Jan Musílek  wrote:
>> 
>> Nathaniel Smith wrote:
>>> What would this do? I don't think there's any way for pip (or any
>>> program) to look at a git repo and extract a list of which revisions
>>> correspond to which python package versions. I guess you could have
>>> some hack where you guess that certain tag strings correspond to
>>> certain versions, but it seems pretty messy to me...
>> 
>> Well, I'd like it to do the exact same thing as dependency_links did before 
>> they were removed from pip. AFAIK it was possible to specify `package >= 
>> 10.0` in `install_requires` and then 
>> `https://github.com/owner/package.git#egg=package-0` in `dependency_links`. 
>> I don't really see into the pip internals, so I'm not sure how pip did it in 
>> the past. But it's a real issue [1].
>> 
>> [1] https://github.com/pypa/pip/issues/5898
> 
> If this is to be an extension to the current PEP, then *someone* is
> going to need to specify the semantics precisely. That's part of the
> problem - existing behaviours are implementation defined, and not
> specified clearly anywhere. The first step in working out a PEP update
> is to define those semantics so they can be discussed without vague
> statements like "needs to work like dependency_links does".
> 
> I don't know how dependency_links worked, but URL links as defined in
> PEP 508 give "the URL to a specific artifact to install". So they link
> to one precise file, i.e. one precise version. So the only plausible
> semantics I can see for something like "package >= 10.0 @
> https://github.com/owner/package.git" would be "Get the package at
> that URL. If it doesn't satisfy the version restriction "package >=
> 10.0", discard it. If it does satisfy that restriction, install it."
> Which doesn't seem that useful, IMO.
> 
> Maybe the way to define useful semantics here would be to articulate
> the actual problem you're trying to solve (*without* referring to how
> dependency_links works) and propose a semantics that solves that
> problem?
> 
> Paul
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/B7MZO6AX7THV2RPUP6BA7VMUMCEUUXMC/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/VAVYUDUBTPFQXTHIR2PNGIW2CSCZJFXP/


[Distutils] Re: Update PEP 508 to allow version specifiers

2019-01-29 Thread Tzu-ping Chung

> On 29 Jan 2019, at 23:19, Donald Stufft  wrote:
> 
>> On Jan 29, 2019, at 10:15 AM, Xavier Fernandez  
>> wrote:
>> 
>> I agree that such specifier would make little sense but why add a new syntax 
>> "foo-1 @ url" when "foo==1 @ url" (where ==1 is a version specifier as 
>> defined in PEP 508) would perfectly fit the bill ?
> 
> 
> Well foo-1 wouldn’t work great because you can’t disambiguate between (“foo”, 
> “1”) and (“foo-1”, None). But the reason for using a different syntax would 
> be so people didn’t confuse the concept with the idea of version specifiers, 
> since >=1.0 doesn’t make sense, if we allow ==10, then people will assume 
>> they can add >=10.0 and will get confused when it doesn’t work. A different 
> syntax side steps that issue.

A >=10.0 specifier could still work, I think. Resolvers are implemented to 
calculate the intersection of specifiers, so any specifier would do. Of course it 
does not make perfect sense, but I guess it could be interpreted as “you can 
find the package here, and I promise it satisfies the given specifiers.”
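For illustration, the `packaging` library (which pip vendors for exactly this kind of version handling) combines specifier sets so that a candidate must satisfy every clause:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Combining two specifier sets keeps only versions allowed by both,
# i.e. the intersection of the constraints.
combined = SpecifierSet(">=10.0") & SpecifierSet("<11.0")

print(Version("10.2") in combined)  # True
print(Version("11.1") in combined)  # False
```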

TP


> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/CTQH6ZR3FVCODND3NXQC34U6O4J6AVTM/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/3LMJRZSRFC5ATDEDFZF7YM6J2PD77XTY/


[Distutils] Re: pip + safety

2019-02-11 Thread Tzu-ping Chung
One way to avoid disclosing user environments to a third party is to build this 
into PyPI instead. The API could generate the warning for pip to display. 

This only covers packages on PyPI, of course, but trying to audit local and 
self-hosted packages is a fool's errand anyway IMO, since there is no 
practical way for any tool to reliably know what *actually* is installed.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
Sent from my iPhone

> On 12 Feb 2019, at 11:34, Wes Turner  wrote:
> 
> Would something like this require:
> 
> - a pip extension/plugin/post-install hook API
> - a post-install hook that discloses all installed packages and versions 
> (from pypi.org, mirrors, local directory) in exchange for checking and online 
> security DB
> - a way to specify a key to e.g. pyup
> 
> GItHub and GitLab offer similar functionality:
> 
> https://github.blog/2018-07-12-security-vulnerability-alerts-for-python/
>   
> https://help.github.com/articles/about-security-alerts-for-vulnerable-dependencies/
> 
> https://docs.gitlab.com/ee/user/project/merge_requests/dependency_scanning.html
>   
> https://gitlab.com/gitlab-org/security-products/dependency-scanning#supported-languages-and-package-managers
> 
> https://pyup.io
> 
> https://github.com/pyupio/safety-db
> 
> > pipenv check relies on safety and Safety-DB to check for known 
> > vulnerabilities in locked components
> 
> 
>> On Monday, February 11, 2019, Julian Berman  wrote:
>> Hi.
>> 
>> I recently found myself installing a node.js package, and in the process 
>> noticed that (sometime recently?) it started automatically warning about 
>> known vulnerabilities during installation of package.jsons (see 
>> https://docs.npmjs.com/cli/audit).
>> 
>> At work, we run safety (https://pypi.org/project/safety/) on all our 
>> projects (which has both free and paid versions). It's great.
>> 
>> I know there's a ton of wonderful work happening at the minute to improve 
>> underlying scaffolding + specification to enable tools other than setuptools 
>> + pip to thrive, so maybe this is the wrong moment, but I figured I'd ask 
>> anyways :) -- what are opinions on running a similar thing during pip 
>> install?
>> 
>> -J
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/GSTL47B4CREYHKOS5I47WOPQURBKTOAY/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/UT77IM2YPV7BKHJX7N3QZJUE4TGRRP5E/


[Distutils] Re: pip + safety

2019-02-12 Thread Tzu-ping Chung
PyUp’s dataset is public, and the insecure_full document posted earlier in the 
thread is 344 KB, so yeah, it is totally possible.

https://github.com/pyupio/safety-db/blob/master/data/insecure_full.json 
<https://github.com/pyupio/safety-db/blob/master/data/insecure_full.json>
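For a rough idea of what a local check could look like, here is a sketch that matches a package version against a safety-db-style mapping. The schema assumed here (package name mapped to entries carrying a "specs" list of version specifiers) only loosely mirrors insecure_full.json, and the sample entry is entirely hypothetical:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

def find_vulnerabilities(db, name, version):
    """Return advisories whose version specifiers match the given
    package and version. Assumes a safety-db-like schema:
    name -> list of entries, each with a "specs" list."""
    hits = []
    for entry in db.get(name.lower(), []):
        if any(Version(version) in SpecifierSet(spec) for spec in entry["specs"]):
            hits.append(entry.get("advisory", ""))
    return hits

# A tiny, entirely hypothetical database entry for illustration.
db = {"example-pkg": [{"specs": ["<1.2.0"], "advisory": "Fixed in 1.2.0"}]}
print(find_vulnerabilities(db, "example-pkg", "1.1.0"))  # ['Fixed in 1.2.0']
print(find_vulnerabilities(db, "example-pkg", "1.2.0"))  # []
```

The real file would be loaded with json.load() and the loop run over every installed distribution.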


> On 12/2, 2019, at 17:05, Joni Orponen  wrote:
> 
> On Tue, Feb 12, 2019 at 5:24 AM Tzu-ping Chung  <mailto:uranu...@gmail.com>> wrote:
> One way to avoid disclosing user environments to a third party is to build 
> this into PyPI instead. The API could generate the warning for pip to 
> display. 
> 
> How large are these kinds of databases? Would it be a conceivable thought end 
> users and/or CI infrastructures of organisations keep and update their local 
> copies and thus only disclose the fact they're using such a database?
> 
> -- Joni Orponen
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/ERBNV6DJ5MTXF5KOHXZDABPQAEUJELMF/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/SQDHTUVE43XACR3AKT3VMGGFWW5JNV4B/


[Distutils] Re: API for SHA-256 fingerprints

2019-02-12 Thread Tzu-ping Chung
I believe you’re looking for the PEP 503 simple API. This is what pip uses to 
find the hashes (among other things) as well. The hash value is included as a 
fragment in the URL.
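For instance, here is a minimal parser for such a page. The sample anchor below is hypothetical, but follows the PEP 503 format of a sha256 fragment on each file link:

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class HashCollector(HTMLParser):
    """Collect {filename: sha256} pairs from a PEP 503 project page."""
    def __init__(self):
        super().__init__()
        self.hashes = {}
        self._href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href:
            fragment = urlsplit(self._href).fragment
            if fragment.startswith("sha256="):
                self.hashes[data.strip()] = fragment[len("sha256="):]
            self._href = None

# A minimal, hypothetical project page in the PEP 503 format.
page = ('<a href="/packages/tenacity-4.8.0-py2.py3-none-any.whl'
        '#sha256=abc123">tenacity-4.8.0-py2.py3-none-any.whl</a>')
collector = HashCollector()
collector.feed(page)
print(collector.hashes)  # {'tenacity-4.8.0-py2.py3-none-any.whl': 'abc123'}
```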

TP

> On 12/2/2019, at 23:03, Eric Peterson  
> wrote:
> 
> Hi all,
> 
> When in the "Download files" section of a project on PyPI, next to each 
> download there is a convenient "SHA256" link that will copy the SHA-256 
> fingerprint for that file to the clipboard. I am wondering if there is a 
> programmatic way to access the SHA-256 for a file (besides just scraping the 
> web page)? Ideally there would be some way to construct a URL based on the 
> name of the file that, when called, would return the fingerprint.
> 
> Thanks,
> Eric
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/FLNOENK2525RMHGL7SV2SBUXKSOJHSEZ/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/EY2SRYRV5NDKOSISK4JEX34YOOLUQ5ED/


[Distutils] Re: content clashes

2019-02-19 Thread Tzu-ping Chung
It is possible, but in practice there are some complications.

There is not an established way to map packages to the files they install, so it’d 
be very difficult to answer the generic question “is there another package that 
installs the same things as mine does” (unless you crawl the whole of PyPI). Here’s 
a similar request: https://github.com/pypa/warehouse/issues/5375

It is more plausible, on the other hand, when given two packages, to answer 
whether they would install conflicting files. This is still complicated by the 
fact that setup.py can do literally anything, and there is no theoretical way 
to be 100% sure until you actually install it, but it is possible to make a 
(very) educated guess via wheels. Each .whl archive contains a RECORD file 
that lists the files to copy, so you can compare them to know whether conflicts 
exist.
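As a minimal sketch, that comparison can read RECORD straight out of two wheel archives:

```python
import csv
import io
import zipfile

def wheel_record_paths(wheel_path):
    """Return the set of file paths listed in a wheel's RECORD."""
    with zipfile.ZipFile(wheel_path) as whl:
        record_name = next(
            name for name in whl.namelist()
            if name.endswith(".dist-info/RECORD")
        )
        with whl.open(record_name) as f:
            reader = csv.reader(io.TextIOWrapper(f, encoding="utf-8"))
            return {row[0] for row in reader if row}

def conflicting_files(wheel_a, wheel_b):
    """Paths both wheels would install. The per-package .dist-info
    directories are excluded, since those never collide."""
    shared = wheel_record_paths(wheel_a) & wheel_record_paths(wheel_b)
    return {path for path in shared if ".dist-info/" not in path}
```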

TP


> On 19/2/2019, at 20:07, Robin Becker  wrote:
> 
> Is there a way for a package to recognize that its content clashes with that 
> of another package? This can happen when a package becomes unmaintained and 
> another differently named package takes over with perhaps clashing 
> modules/__package__ paths.
> -- 
> Robin Becker
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/ZXAFK3SO4ZIPDLFWTLK75ABNTMX7TJXX/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/Z55OK46RFF3DSJBXVJ5JAXGERISOSVC7/


[Distutils] Re: PEP-582 concerns

2019-02-20 Thread Tzu-ping Chung

> On 20/2/2019, at 20:38, Alex Walters  wrote:
> 
> I have 2 main concerns about PEP 582 that might just be me misunderstanding
> the pep.
> 
> My first concern is the use of CWD, and prepending ./_pypackages_ for
> scripts.  For example, if you were in a directory with a _pypackages_
> subdirectory, and had installed the module "super.important.module".  My
> understanding is that any scripts you run will have "super.important.module"
> available to it before the system site-packages directory.  Say you also run
> "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module"
> (and there is no _pypackages_ subdirectory in /usr/bin).  You would be
> shadowing "super.important.module”.
> 
> In this case, this adds no more version isolation than "pip install --user",
> and adds to the confoundment factor for a new user.  If this is a
> misunderstanding of the pep (which it very well might be!), then ignore that
> concern.  If it's not a misunderstanding, I think that should be emphasized
> in the docs, and perhaps the pep.

It is my understanding that the PEP already covers this:

https://www.python.org/dev/peps/pep-0582/#security-considerations

> While executing a Python script, it will not consider the __pypackages__ in
> the current directory, instead if there is a __pypackages__ directory in the
> same path of the script, that will be used.

This is also mentioned in the Specification section:

> In case of Python scripts, Python will try to find __pypackages__ in the same
> directory as the script. If found (along with the current Python version
> directory inside), then it will be used, otherwise Python will behave as it
> does currently.
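That lookup rule can be sketched as follows. The `__pypackages__/<version>/lib` layout follows the PEP's examples; this is an illustrative sketch, not the proposed interpreter change itself:

```python
import sys
from pathlib import Path

def pypackages_dir(script_path=None):
    """Sketch of the PEP 582 lookup: for a script, look for
    __pypackages__ next to the script; with no script (interactive
    use), look in the current directory. Returns the directory that
    would be prepended to sys.path, or None if there isn't one."""
    version = "{}.{}".format(*sys.version_info[:2])
    base = Path(script_path).resolve().parent if script_path else Path.cwd()
    candidate = base / "__pypackages__" / version / "lib"
    return candidate if candidate.is_dir() else None
```

So a script in /usr/bin would only ever pick up /usr/bin/__pypackages__, never one from the caller's working directory.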


> My second concern is a little more... political.
> 
> This pep does not attempt to cover all the use-cases of virtualenvs - which
> is understandable.  However, this also means that we have to teach new users
> *both* right away in order to get them up and running, and teach them the
> complexities of both, and when to use one over the other.  Instead of making
> it easier for the new user, this pep makes it harder.  This also couldn't
> have come at a worse time with the growing use of pipenv which provides a
> fully third way of thinking about application dependencies (yes, pipenv uses
> virtualenvs under the hood, but it is a functionally different theory of
> operation from a user standpoint compared to traditional pip/virtualenv or
> this pep).
> 
> Is it really a good idea to do this pep at this time?

I am not in the position to comment on this in general, although I fully
understand the concern as a volunteer myself.

As one of the Pipenv maintainers, however, it is my personal opinion that this
PEP would not end up in the “yet another standard” situation, but could even be
beneficial to Pipenv, if done correctly.

I hope this can provide some confidence :)


> 
> In a vacuum, I like this pep.  Aside from the (possible) issue of unexpected
> shadowing, it's clean and straight forward.  It's easy to teach.  But it
> doesn't exist in a vacuum, and we have to teach the methods it is intended
> to simplify anyways, and it exists in competition with other solutions.
> 
> I am not a professional teacher; I don't run python training courses.  I do,
> however, volunteer quite a bit of time on the freenode channel.  I get that
> the audience there is self-selecting to those who want to donate their time,
> and those who are having a problem (sometimes, those are the same people).
> This is the kind of thing that generates a lot of confusion and frustration
> to the new users I interact with there.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/SFMFKTQVKTONCYNN7UEKLFAQ2VRKXEHK/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/M5M355RB3KL34VSTFTXH4ZUJYHMETVNK/


[Distutils] Re: PEP-582 concerns

2019-02-20 Thread Tzu-ping Chung

> On 20/2/2019, at 23:19, Steve Dower  wrote:
> 
> On 20Feb.2019 0533, Tzu-ping Chung wrote:
>> As one of the Pipenv maintainers, however, it is my personal opinion that 
>> this PEP would not end up in the “yet another standard” situation, but could 
>> even be beneficial to Pipenv, if done correctly.
>> 
>> I hope this can provide some confidence :)
> 
> I'd love to hear more about how Pipenv would make use of it. So far it's
> only really been designed with pip in mind (and their team in the
> discussion), but we've explicitly left it _very_ tool independent. So if
> you can describe how it would work with Pipenv, that would be helpful
> for finding things that need changing.

When you run `pipenv install` (roughly analogous to pip install -r), Pipenv
creates a virtual environment somewhere on the machine (depending on
various configurations), and installs packages into it. Afterward the user can
run commands like `pipenv run python`, and Pipenv activates the
virtual environment for the command.

With PEP 582, __pypackages__ can be used instead of virtual environments,
and since “activation” is done automatically by the interpreter and pip,
`pipenv run` can do less than it currently needs to.

There are still some ergonomics problems, e.g. how does Pipenv know what
Python version to install into, but I don’t think there’s anything in the PEP at
the moment that would make the adoption impossible. We’ll definitely try
to be heard if any blockers appear :)


> 
> Also, the `pythonloc` package is an implementation of this that anyone
> can try out today - https://pypi.org/project/pythonloc/ (the major
> difference is that when implemented, you won't have to use "pythonloc"
> and "piploc" to get the new behaviour).
> 
> Cheers,
> Steve
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/2IQOMDUV5F5DEGEKIEE2GPBTGBXAPNMH/


[Distutils] Re: Update PEP 508 to allow version specifiers

2019-03-13 Thread Tzu-ping Chung

> On 6 Mar 2019, at 03:53, Simon  wrote:
> 
> I hope it's not an issue that I'm replying to a month-old thread. I reviewed 
> the previous discussion to try to avoid duplicating any of it.
> 
> When using pip with PyPI, calling pip a second time is much quicker than the 
> first time, because it verifies that the requirements, including version 
> constraints, are satisfied in the target environment and doesn't needlessly 
> reinstall stuff.
> 
> Dependency links allowed the same behaviour to be implemented for private 
> packages with dependencies on other private repositories: given a requirement 
> B >= 3 and a dependency link that B was available from, pip could check if 
> the environment already includes a package B with a new enough version, and 
> only use the dependency link as a fallback if the requirement isn't already 
> satisfied.
> 
> URL specifiers aren't useful for providing a fallback location to get a 
> package from, because using one prevents the package from specifying a 
> version constraint in the same way that was possible with dependency links, 
> or with normal requirements available from PyPI. Curiously, discussion of 
> version constraints in this thread has focused on how nonsensical it would be 
> to compare them to the specifying URL, ignoring the possibility of comparing 
> the constraint with the target environment.
> 
> The loss of this functionality means that anyone who was previously using pip 
> to automatically install private packages with private dependencies now has 
> to either forgo automatic dependency management (a large part of why one 
> would use a package manager to begin with) in favour of recursively specified 
> requirements files, publish their private packages somewhere so that pip can 
> find them, or stick with pip 18.1 for now.

Wouldn’t you still need to “publish the private packages somewhere” for 
dependency links to work? `setup.py sdist` with `pip --find-links` can get you 
very far; the only difference IMO is you have to provide a proper package, and 
write a simple HTML file to point to it.
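As a sketch, such an index page can be generated in a few lines. The file names and layout here are illustrative; note that when --find-links points at a local directory rather than a URL, no HTML file is needed at all:

```python
import html
from pathlib import Path

def write_find_links_index(package_dir, out="index.html"):
    """Write a flat HTML page linking every archive in package_dir,
    suitable for serving over HTTP and passing to pip's --find-links."""
    links = [
        '<a href="{0}">{0}</a>'.format(html.escape(path.name))
        for path in sorted(Path(package_dir).iterdir())
        if path.suffix in {".whl", ".zip"} or path.name.endswith(".tar.gz")
    ]
    Path(package_dir, out).write_text(
        "<!DOCTYPE html>\n<html><body>\n" + "\n".join(links) + "\n</body></html>\n"
    )
```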


> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/BALD2PVKGHBBWIKNYTZGGF6LHEXI7O26/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/K3NCORKUWGPCPFXGTZGXUFWHVGODL6FC/


[Distutils] Re: Update PEP 508 to allow version specifiers

2019-03-15 Thread Tzu-ping Chung

> On 15 Mar 2019, at 21:47, Simon Ruggier  wrote:
> 
> The packages have to be available online for dependency links to work, yes, 
> but they're not public: one needs to authenticate with an SSH key to clone 
> each repository.

Both --find-links and --index-url (or --extra-index-url) provide the 
possibility to secure the download with authentication (HTTP Basic Auth).


> 
>> On March 13, 2019 2:09:03 PM UTC, Tzu-ping Chung  wrote:
>> 
>>> On 6 Mar 2019, at 03:53, Simon  wrote:
>>> 
>>> I hope it's not an issue that I'm replying to a month-old thread. I 
>>> reviewed the previous discussion to try to avoid duplicating any of it.
>>> 
>>> When using pip with PyPI, calling pip a second time is much quicker than 
>>> the first time, because it verifies that the requirements, including 
>>> version constraints, are satisfied in the target environment and doesn't 
>>> needlessly reinstall stuff.
>>> 
>>> Dependency links allowed the same behaviour to be implemented for private 
>>> packages with dependencies on other private repositories: given a 
>>> requirement B >= 3 and a dependency link that B was available from, pip 
>>> could check if the environment already includes a package B with a new 
>>> enough version, and only use the dependency link as a fallback if the 
>>> requirement isn't already satisfied.
>>> 
>>> URL specifiers aren't useful for providing a fallback location to get a 
>>> package from, because using one prevents the package from specifying a 
>>> version constraint in the same way that was possible with dependency links, 
>>> or with normal requirements available from PyPI. Curiously, discussion of 
>>> version constraints in this thread has focused on how nonsensical it would 
>>> be to compare them to the specifying URL, ignoring the possibility of 
>>> comparing the constraint with the target environment.
>>> 
>>> The loss of this functionality means that anyone who was previously using 
>>> pip to automatically install private packages with private dependencies now 
>>> has to either forgo automatic dependency management (a large part of why 
>>> one would use a package manager to begin with) in favour of recursively 
>>> specified requirements files, publish their private packages somewhere so 
>>> that pip can find them, or stick with pip 18.1 for now.
>> 
>> Wouldn’t you still need to “publish the private packages somewhere” for 
>> dependency links to work? `setup.py sdist` with `pip --find-links` can get 
>> you very far; the only differences IMO is you have to provide a proper 
>> package, and write a simple HTML file to point to it.
>> 
>> 
>>> --
>>> Distutils-SIG mailing list -- distutils-sig@python.org
>>> To unsubscribe send an email to distutils-sig-le...@python.org
>>> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
>>> Message archived at 
>>> https://mail.python.org/archives/list/distutils-sig@python.org/message/BALD2PVKGHBBWIKNYTZGGF6LHEXI7O26/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/WN2TO4LUQ7IOIWGBLLLEVM7HAC3KY45I/


[Distutils] Re: setuptools reading simple index takes 30sec

2019-08-01 Thread Tzu-ping Chung
What command are you using to produce this output? The root page lists all 
available packages on that index, and with 11MB of data there must be literally 
millions of entries to parse, which would definitely take a long time. But this 
doesn’t need to happen under normal circumstances, and neither pip nor 
setuptools does it under normal circumstances.


Sent from Mail for Windows 10

From: chris.cra...@rsa.com
Sent: 02 August 2019 05:06
To: distutils-sig@python.org
Subject: [Distutils] setuptools reading simple index takes 30sec

Hello,
I have a python dev team that I support in our local jenkins/artifactory 
environment.  Recently this team switched over from an old local pypi legacy 
repository to a current pypi remote/virtual of pypi.org.  So we could get them 
updated libraries and such. Anyway, I am banging my head to explain the time 
taken for reading the simple index.  This index is about 11MB, takes 1 second 
to download itself, but then it takes ~30 seconds for setuptools to scan this 
and find the package it's looking for.  Multiply this by the ~170 libraries 
they have in requirements, and the build has gone from 30minutes to almost 3 
hours.  And 90% comes from reading the simple index for each dependency.

Here's a sample output with timestamps
00:57:00.036 Searching for tenacity==4.8.0
00:57:00.037 Reading 
https://repo1.local/artifactory/api/pypi/py-pypi-virt/simple/
00:57:27.650 Reading 
https://repo1.local/artifactory/api/pypi/py-pypi-virt/simple/tenacity/
00:57:27.881 Downloading 
https://repo1.local/artifactory/api/pypi/py-pypi-virt/packages/packages/fc/e9/5499018e0d420f8d03a215c310ee7bc6e1a7e84adaa63f6ea208e864bdb6/tenacity-4.8.0-py2.py3-none-any.whl#sha256=efcf0672547f52fd49f96c2c1912e0f0e77d78a6630823aad54f99227a3c332d
00:57:28.037 Best match: tenacity 4.8.0
00:57:28.037 Processing tenacity-4.8.0-py2.py3-none-any.whl
00:57:28.038 Installing tenacity-4.8.0-py2.py3-none-any.whl to 
/home/build/workspace/workflows-pr/workflows-extension/.eggs
00:57:28.069 writing requirements to 
/home/build/workspace/workflows-pr/workflows-extension/.eggs/tenacity-4.8.0-py2.7.egg/EGG-INFO/requires.txt
00:57:28.114 Installed 
/home/build/workspace/workflows-pr/workflows-extension/.eggs/tenacity-4.8.0-py2.7.egg

Note the time on the first Reading.  I'm hoping there is some obvious simple 
solution to make this faster.
Any ideas or help would be greatly appreciated?

Thanks
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/O7M274ZYQMMU3WT52PY2YIVJMCNOQJIR/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/M3R6JQ5X6MN4TRYAM5JYVL2OLO52BJW7/


[Distutils] Re: Linux binary wheels?

2019-08-20 Thread Tzu-ping Chung

> On 20 Aug 2019, at 23:47, Nick Timkovich  wrote:
> 
>> On Tue, Aug 20, 2019, at 5:05 AM Matthew Brett  
>> wrote:
> 
>> ...  Unless you meant wheels for non-Intel platforms, in which case, please 
>> do say more about you need.
> 
> Minor tangent: I've seen some people use https://www.piwheels.org/ for 
> Raspberry Pi (ARM 6/7), but could the ARM binaries be uploaded to PyPI?
> 
> I think I'm conflating the wheel building spec (is manylinux amd64 specific, 
> or as long as the libraries are on any architecture?), toolchains, 
> environment (sounds like Piwheels provides a platform to build them on), and 
> package hosting (can PyPI host arbitrary archs?) in that one sentence.

This issue may be relevant: https://github.com/pypa/warehouse/issues/3668

And there are even more layers to this problem. Wheels on piwheels are 
currently maintained by RPi folks; if they are going into PyPI, either package 
maintainers need to take over uploading (and even building) them, or PyPI needs 
a way to allow (qualified) people to upload files for packages they don’t own. 
And maintainers might decide that ARM is not their supported platform anyway, 
and get us back to where we started.

> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/OXSUW73EO5DTUO34EFURN3KHCDAKNS4Z/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/HYGOQ6EGLZHNN6HR3KY6XUGI2ZZQX2SI/


[Distutils] Re: Python 3.7.4 MSI

2019-09-09 Thread Tzu-ping Chung
FWIW, individual msi files are still available on python.org, e.g. (for 3.7.4 
64-bit)
https://www.python.org/ftp/python/3.7.4/amd64/

This is essentially how the web-based installer works. Those msi files map 
quite nicely to individual options of the exe wrapper, so it shouldn’t be too 
difficult to figure out what you need (or you can just read the CPython source).

I assume existing versions are guaranteed to work as long as python.org works 
(otherwise the web-based installer would cease to work), but there are likely no 
compatibility guarantees for future releases.
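For example, a deployment script could fetch individual component MSIs and install them quietly with msiexec. The component names (core.msi, pip.msi, etc.) are an assumption based on the 3.7.4/amd64/ listing; check the directory for the release you actually deploy:

```python
import subprocess
import urllib.request

def msi_url(version, component, arch="amd64"):
    """URL of one component MSI under python.org's release tree,
    e.g. msi_url("3.7.4", "core.msi")."""
    return "https://www.python.org/ftp/python/{}/{}/{}".format(
        version, arch, component
    )

def fetch_and_install(version, component, target_dir):
    """Download one component MSI and install it quietly.
    Assumes a Windows host with msiexec on PATH."""
    urllib.request.urlretrieve(msi_url(version, component), component)
    subprocess.check_call(
        ["msiexec", "/i", component, "/qn", "TARGETDIR=" + target_dir]
    )
```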



From: Paul Moore
Sent: 09 September 2019 22:29
To: Renegad3 Kay
Cc: Distutils
Subject: [Distutils] Re: Python 3.7.4 MSI

Correct. The installer technology used for the python.org builds
changed some time ago (I think Python 3.4 was the last version to use
an MSI installer).

Paul

On Mon, 9 Sep 2019 at 15:25, Renegad3 Kay  wrote:
>
> Greetings!
>
> my organization uses Python across it's departments but the recent versions 
> of Python do NOT have an MSI download. We use SCCM for deployment of software 
> and because the downloads are all .exe based, the program is no longer in 
> compliance with my organization's security policies. I've looked and I've 
> looked but I cannot find an MSI for this program.
>
> IT Guy.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/MADS575JCLLNRV2Q4IOZAUON5SIN52NC/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/DQDLNAYW6HTLUCOMA4LUBVUXOSK6B2RU/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/4CEFAYNGCITLSC2XMYEX5OZTGQ5OFOBW/


[Distutils] Re: Python 3.8

2019-11-13 Thread Tzu-ping Chung
Hi,

The error happens when pip tries (and fails) to build pygame from source. It
does not happen on 3.7 because pygame published binary releases for it, so pip
does not need to build it for 3.7. Pygame published binaries for 3.8 just
yesterday, and the static file server may need some time to propagate before
the file is available to pip.

I hope this helps clear up the situation.

TP

Sent from Mail for Windows 10

From: Шехматов С.В.
Sent: 13 November 2019 21:46
To: distutils-sig@python.org
Subject: [Distutils] Python 3.8

Hello, Distutils-sig.

This is what I get after attempting to install the pygame module on Python 3.8:

c:\Users\AMD\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0>pip3.8 install pygame
Collecting pygame
  Using cached https://files.pythonhosted.org/packages/0f/9c/78626be04e193c0624842090feb3805c050dfaa81c8094d6441db2be/pygame-1.9.6.tar.gz
    ERROR: Command errored out with exit status 1:
     command: 'C:\Users\AMD\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\python.exe' -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\AMD\\AppData\\Local\\Temp\\pip-install-fl0na0dp\\pygame\\setup.py'"'"'; __file__='"'"'C:\\Users\\AMD\\AppData\\Local\\Temp\\pip-install-fl0na0dp\\pygame\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
     cwd: C:\Users\AMD\AppData\Local\Temp\pip-install-fl0na0dp\pygame\
    Complete output (17 lines):
    WARNING, No "Setup" File Exists, Running "buildconfig/config.py"
    Using WINDOWS configuration...
    Download prebuilts to "prebuilt_downloads" and copy to "./prebuilt-x64"? [Y/n]
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\AMD\AppData\Local\Temp\pip-install-fl0na0dp\pygame\setup.py", line 194, in <module>
        buildconfig.config.main(AUTO_CONFIG)
      File "C:\Users\AMD\AppData\Local\Temp\pip-install-fl0na0dp\pygame\buildconfig\config.py", line 210, in main
        deps = CFG.main(**kwds)
      File "C:\Users\AMD\AppData\Local\Temp\pip-install-fl0na0dp\pygame\buildconfig\config_win.py", line 576, in main
        and download_win_prebuilt.ask(**download_kwargs):
      File "C:\Users\AMD\AppData\Local\Temp\pip-install-fl0na0dp\pygame\buildconfig\download_win_prebuilt.py", line 302, in ask
        reply = raw_input(
    EOFError: EOF when reading a line
    ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
WARNING: You are using pip version 19.2.3, however version 19.3.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.

P.S. Python 3.7 did this operation without problem!?

--
Regards,
Шехматов  mailto:s...@yandex.ru
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/26MJ6XEPISLWP422YZOUZO4W7HXWSKOB/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/CYSD4Y3CEC5WOULU5UIVDOWM5AYZMBZH/


[Distutils] Re: pip resolver work chugging along

2020-03-24 Thread Tzu-ping Chung
To expand a little on the topic, there are multiple abstraction layers required 
to make the new resolver useful as a separate package.

ResolveLib (mentioned in Paul’s message) is a dependency resolver 
implementation in the abstract, entirely divorced from Python packaging. While 
it can be used by other tools, the implementer needs to provide the necessary 
parts to “teach” it how to understand Python packages.

The ongoing pip resolver work is exactly that: integrating ResolveLib into pip 
by wiring it onto pip’s internal representation of Python packages. This is 
unfortunately not really reusable by other tools, since pip’s current model of 
Python packages is too deeply entangled with various parts of the code base.

Some extra work is needed to make ResolveLib truly useful as a reusable Python 
dependency resolver for other projects. One way would be to implement a 
wrapper library that provides the same Python package semantics to ResolveLib, 
but built on more reusable models such as pypa/packaging, importlib-metadata, 
etc. The current pip resolver work can offer much knowledge about dealing with 
various hairy problems, but the code written for pip wouldn’t be directly 
reusable.
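As a rough illustration of that layering (the names and API below are invented for this sketch, not ResolveLib’s actual interface): a generic resolver knows nothing about Python packaging until a provider supplies the domain-specific parts.

```python
# A toy illustration of the layering described above: a generic resolver
# that knows nothing about any particular package ecosystem until a
# "provider" object teaches it. Hypothetical API, not ResolveLib's.

def resolve(provider, requirements):
    """Naively pin the highest candidate satisfying each requirement,
    walking dependencies breadth-first (no backtracking)."""
    pinned = {}
    queue = list(requirements)
    while queue:
        name, spec = queue.pop(0)
        if name in pinned:
            continue  # already satisfied in this toy model
        candidates = [v for v in provider.find_versions(name) if spec(v)]
        if not candidates:
            raise RuntimeError(f"no candidate for {name}")
        version = max(candidates)
        pinned[name] = version
        queue.extend(provider.dependencies(name, version))
    return pinned

class ToyProvider:
    """Teaches the resolver about a tiny in-memory package index."""
    index = {
        "a": {"1.0": [("b", lambda v: v >= "1.0")]},
        "b": {"1.0": [], "2.0": []},
    }
    def find_versions(self, name):
        return list(self.index[name])
    def dependencies(self, name, version):
        return self.index[name][version]

result = resolve(ToyProvider(), [("a", lambda v: True)])
print(result)  # {'a': '1.0', 'b': '2.0'}
```

The wrapper library discussed above would play the provider role for real Python packages, backed by real metadata models instead of an in-memory dict.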

TP


> On 25/3/2020, at 03:02, Paul Moore  wrote:
> 
> It's already available as a separate package:
> https://pypi.org/project/resolvelib/
> 
> Paul
> 
> On Tue, 24 Mar 2020 at 18:52, Brett Cannon  wrote:
>> 
>> I couldn't find this in the blog post but is the plan to make the resolver a 
>> separate package so other tools can use it? Or is the plan perhaps to get it 
>> working in pip first and to break it out later?
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org
>> To unsubscribe send an email to distutils-sig-le...@python.org
>> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
>> Message archived at 
>> https://mail.python.org/archives/list/distutils-sig@python.org/message/CLWGXEPNNMZ6YI5WWJ3ZKWWY5WVQOCAE/
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/43NVR5W7YKVFBDY5FSMBWQPLKKG2V2EP/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/VLSD5OXWR6YLAFXQZWCZSCLXPNUUBSU4/


[Distutils] Re: pip resolver work chugging along

2020-03-24 Thread Tzu-ping Chung

> On 25/3/2020, at 05:37, Wes Turner  wrote:
> 
> On Tue, Mar 24, 2020 at 5:24 PM Tzu-ping Chung  <mailto:uranu...@gmail.com>> wrote:
> To expand a little on the topic, there are multiple abstraction layers 
> required to make the new resolver useful as a separate package.
> 
> ResolveLib (mentioned in Paul’s message) is an dependency resolver 
> implementation in abstract, entirely divorced of Python packaging. While it 
> can be used by other tools, the implementer would need to provide the 
> necessary parts to “teach” it how understand Python packages.
> 
> The ongoing pip resolver work is exactly that, to integrate ResolveLib into 
> pip by wiring it onto pip’s internal representation of Python packages. This 
> is unfortunately not really reusable for other tools, since pip’s current 
> model on Python packages is too deeply entangled into various parts of the 
> code base.
> 
> Some extra work is needed to make ResolveLib really useful as a reusable 
> Python dependency resolver for other projects. One way would be to implement 
> a wrapper library to provide the same Python package semantics to ResolveLib, 
> but with more reusable models such as pypa/packaging, importlib-metadata, 
> etc. The current pip resolver work would be able to offer much knowledge on 
> dealing with various hairy problems, but the code written for pip wouldn’t be 
> directly reusable.
> 
> TP
> 
> "New Resolver: technical choices" compares various resolvers.
> https://github.com/pypa/pip/issues/7406 
> <https://github.com/pypa/pip/issues/7406>
> 
> FWICS, more test cases should be the immediate objective.
> 
> Factoring out to make a generic resolver would be redundant? Would existing 
> interfaces for e.g. libsolv (dnf) and pycosat (conda) be sufficient?

In theory yes, but the practical problem is that a few ideas in Python 
packaging can’t be mapped cleanly onto the general models. Extras (or 
conditional dependencies in general) are one problem a lot of package managers 
have to deal with outside of the generic resolution process. Another problem 
is identifying where incompatibilities are declared: version specifiers are 
the first to come to mind, but there are actually more, e.g. Requires-Python.

All of these are solvable problems, and indeed each tool could wire directly 
into the resolver interface. But they would then all need to solve the same 
problems, and each would likely miss something in the process. I believe it 
would be immensely beneficial to have a “canonical” implementation all parties 
use and maintain, if only to keep the domain knowledge shared.


>  
> 
> 
> > On 25/3/2020, at 03:02, Paul Moore  > <mailto:p.f.mo...@gmail.com>> wrote:
> > 
> > It's already available as a separate package:
> > https://pypi.org/project/resolvelib/ <https://pypi.org/project/resolvelib/>
> > 
> > Paul
> > 
> > On Tue, 24 Mar 2020 at 18:52, Brett Cannon  > <mailto:br...@python.org>> wrote:
> >> 
> >> I couldn't find this in the blog post but is the plan to make the resolver 
> >> a separate package so other tools can use it? Or is the plan perhaps to 
> >> get it working in pip first and to break it out later?
> >> --
> >> Distutils-SIG mailing list -- distutils-sig@python.org 
> >> <mailto:distutils-sig@python.org>
> >> To unsubscribe send an email to distutils-sig-le...@python.org 
> >> <mailto:distutils-sig-le...@python.org>
> >> https://mail.python.org/mailman3/lists/distutils-sig.python.org/ 
> >> <https://mail.python.org/mailman3/lists/distutils-sig.python.org/>
> >> Message archived at 
> >> https://mail.python.org/archives/list/distutils-sig@python.org/message/CLWGXEPNNMZ6YI5WWJ3ZKWWY5WVQOCAE/
> >>  
> >> <https://mail.python.org/archives/list/distutils-sig@python.org/message/CLWGXEPNNMZ6YI5WWJ3ZKWWY5WVQOCAE/>
> > --
> > Distutils-SIG mailing list -- distutils-sig@python.org 
> > <mailto:distutils-sig@python.org>
> > To unsubscribe send an email to distutils-sig-le...@python.org 
> > <mailto:distutils-sig-le...@python.org>
> > https://mail.python.org/mailman3/lists/distutils-sig.python.org/ 
> > <https://mail.python.org/mailman3/lists/distutils-sig.python.org/>
> > Message archived at 
> > https://mail.python.org/archives/list/distutils-sig@python.org/message/43NVR5W7YKVFBDY5FSMBWQPLKKG2V2EP/
> >  
> > <https://mail.python.org/archives/list/distutils-sig@python.org/message/43NVR5W7YKVFBDY5FSMBWQPLKKG2V2EP/>
> --
> Distutils-SIG mailing list -- distutils-sig@python.org 
> <mailto:distutils-sig@python.org>
> To unsub

[Distutils] Re: Pipenv - Use system python lib for one package

2020-03-27 Thread Tzu-ping Chung
Hi,

In short, no. Pipenv is designed to only manage Python packages, and cannot be 
used to access a software that’s not available as a Python package.

APT’s software package format (DEB) is significantly different from Python’s, 
and it is not possible to mix them together unless the tool is designed 
specifically to support such an operation; Pipenv does not try to set foot in 
this area.

TP


> On 28/3/2020, at 01:30, Samuel Mutel  wrote:
> 
> Hello,
> 
> I have a pipfile like this.
> I would like to install python-apt library but not from pypi but from the 
> system.
> Is-it possible?
> This python library is no longer supported on pypi but is supported in the 
> package distribution.
> 
> [[source]]
> name = "pypi"
> url = "https://pypi.org/simple"
> verify_ssl = true
>
> [dev-packages]
>
> [packages]
> ansible = "==2.8.4"
> ansible-lint = "*"
> molecule = "*"
> docker-py = "*"
>
> [requires]
> python_version = "3"
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/Y5ANHS5NPQE52RZ565EMC7ZNQDUWI5JE/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/ECZ3CWAI6QPXBYUKTYWRI762YVC4FUPC/


[Distutils] Re: Pipenv - Use system python lib for one package

2020-03-27 Thread Tzu-ping Chung
No, a Python package is more than the installed library you find in 
site-packages.

If you know exactly what you want to install, you can try to repackage the 
library yourself by supplying the required metadata, and tell Pipenv to install 
from that instead.

The simplest way would be to produce a wheel 
<https://www.python.org/dev/peps/pep-0427/> containing the files you want to 
install, and reference the file in your Pipfile like this:

python-apt = { file = "file://url/to/your/built/wheel.whl" }

Since the value is a URL, you can upload the wheel somewhere to install on 
another computer as well.
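A wheel like that is normally produced with standard tooling (e.g. “pip wheel”), but for a handful of already-built files it can even be assembled by hand, since a wheel is just a zip archive plus metadata (PEP 427). The project name, file names, and contents below are made up for illustration only:

```python
# A rough sketch of hand-building a one-file pure-Python wheel per the
# wheel spec (PEP 427). All names and contents here are invented.
import base64
import hashlib
import zipfile

def record_line(path, data):
    # RECORD rows are "path,sha256=<urlsafe-b64 digest>,<size>".
    digest = base64.urlsafe_b64encode(
        hashlib.sha256(data).digest()).rstrip(b"=").decode()
    return f"{path},sha256={digest},{len(data)}"

files = {
    "apt_shim.py": b"print('hello from apt_shim')\n",
    "apt_shim-1.0.dist-info/METADATA": (
        b"Metadata-Version: 2.1\nName: apt-shim\nVersion: 1.0\n"),
    "apt_shim-1.0.dist-info/WHEEL": (
        b"Wheel-Version: 1.0\nGenerator: hand\nRoot-Is-Purelib: true\n"
        b"Tag: py3-none-any\n"),
}
record = "\n".join(record_line(p, d) for p, d in files.items())
record += "\napt_shim-1.0.dist-info/RECORD,,\n"  # RECORD hashes itself as empty

with zipfile.ZipFile("apt_shim-1.0-py3-none-any.whl", "w") as zf:
    for path, data in files.items():
        zf.writestr(path, data)
    zf.writestr("apt_shim-1.0.dist-info/RECORD", record)
```

The resulting .whl can then be referenced from the Pipfile as shown; RECORD lists each installed file with its sha256 digest and size, which is what installers verify.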


> On 28/3/2020, at 03:36, Samuel Mutel  wrote:
> 
> I spoke about debian packages but it's a python library installed by a 
> package.
> 
> So could we install a python library from the system python library folder?
> 
> Le ven. 27 mars 2020 à 20:15, Tzu-ping Chung  <mailto:uranu...@gmail.com>> a écrit :
> Hi,
> 
> In short, no. Pipenv is designed to only manage Python packages, and cannot 
> be used to access a software that’s not available as a Python package.
> 
> APT’s software package format (DEB) is significantly different from Python’s, 
> and it is not possible to mix them together unless the tool is designed 
> specifically to support such an operation; Pipenv does not try to set foot in 
> this area.
> 
> TP
> 
> 
>> On 28/3/2020, at 01:30, Samuel Mutel > <mailto:samuel.mu...@gmail.com>> wrote:
>> 
>> Hello,
>> 
>> I have a pipfile like this.
>> I would like to install python-apt library but not from pypi but from the 
>> system.
>> Is-it possible?
>> This python library is no longer supported on pypi but is supported in the 
>> package distribution.
>> 
>> [[source]]
>> name = "pypi"
>> url = "https://pypi.org/simple"
>> verify_ssl = true
>>
>> [dev-packages]
>>
>> [packages]
>> ansible = "==2.8.4"
>> ansible-lint = "*"
>> molecule = "*"
>> docker-py = "*"
>>
>> [requires]
>> python_version = "3"
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org 
>> <mailto:distutils-sig@python.org>
>> To unsubscribe send an email to distutils-sig-le...@python.org 
>> <mailto:distutils-sig-le...@python.org>
>> https://mail.python.org/mailman3/lists/distutils-sig.python.org/ 
>> <https://mail.python.org/mailman3/lists/distutils-sig.python.org/>
>> Message archived at 
>> https://mail.python.org/archives/list/distutils-sig@python.org/message/Y5ANHS5NPQE52RZ565EMC7ZNQDUWI5JE/
>>  
>> <https://mail.python.org/archives/list/distutils-sig@python.org/message/Y5ANHS5NPQE52RZ565EMC7ZNQDUWI5JE/>
> 

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/HAT5IX4MDGT264DNCOUMRHBTMDEX5WVH/


[Distutils] Re: Install distutils with pip3

2020-03-28 Thread Tzu-ping Chung
Hi,

distutils is a part of the standard library. The module is not available from 
pip since it is intended to be installed with Python itself.

Some Linux distributions split parts of the standard library into separate 
system packages. On Ubuntu, for example, you will need to install 
python3-distutils to get it. After that, it should be available to both the 
system Python and virtual environments.

TP


> On 29/3/2020, at 01:33, Samuel Mutel  wrote:
> 
> Hello,
> 
> I would like to know if distutils could be installed in python3 environment 
> with pip3 ?
> 
> Thanks.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/N7VOW7MGNHS6MPUINHHZCIG7Y3XGBW3V/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/QEYCLULZ7P7IFJO5SOQVTBO6NDRBKOJN/


[Distutils] Re: setup.py with a different name

2020-06-29 Thread Tzu-ping Chung
I assume you’re building an sdist (e.g. a .tar.gz file, not a .whl file), since 
wheels don’t generally include the setup.py file. In that case you want to 
specify additional files to include with the MANIFEST.in file.

More details are available at 
https://packaging.python.org/guides/using-manifest-in/
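As an illustration, assuming the layout from your message (a pkg1_setup.py beside a mypackage/ directory), a MANIFEST.in along these lines would pull the renamed script into the sdist:

```
# MANIFEST.in, placed next to pkg1_setup.py (illustrative layout)
include pkg1_setup.py
recursive-include mypackage *
```

The patterns are relative to the directory containing the manifest; see the guide above for the full command list.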


> On 30/6, 2020, at 08:45, Gabriel Becedillas  
> wrote:
> 
> Hi,
> I'm trying to build a python package but my setup.py is actually named 
> pkg1_setup.py (this is because I have multiple packages in the same repo).
> The package built is *almost* fine, except that setup.py is not included in 
> the package built. I understand that the *normal* structure would be to have 
> something like this:
> * setup.py
> * mypackage\...
> 
> but before refactoring the repository structure to accommodate that I'd like 
> to validate that there is no other way of doing that. I tried using the 
> script_name parameter to setuptools.setup but had no luck.
> Any suggestions?
> Thanks a lot
> 
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/EXOGGI2YTZPJVFLK5LIVYKUGQZ4G4JQ2/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/HSCMIXOLV64CEXCJYNVIHEFZ3I4GK6MH/


[Distutils] Re: Development installation with pyproject.toml/poetry?

2020-07-01 Thread Tzu-ping Chung
To supplement what Thomas said: yes, there is currently some development 
effort to standardise development/editable installs. Much of the discussion 
happens here:

https://discuss.python.org/t/next-steps-for-editable-develop-proof-of-concept/4118


> On 01/7, 2020, at 17:15, Thomas Kluyver  wrote:
> 
> Hi Fredrik,
> 
> You can do an editable install using whichever development tool you've 
> chosen, e.g. Poetry. The poetry docs 
> (https://python-poetry.org/docs/basic-usage/) say it does an editable install 
> by default.
> 
> Currently, there isn't a standardised way for 'frontend' tools like pip to 
> ask for an editable install.
> 
> Thomas
> 
> On Wed, 1 Jul 2020, at 10:03, fred...@averpil.com wrote:
>> Hi,
>> 
>> I have converted my setup.py to pyproject.toml and use poetry to 
>> build/manage deps.
>> I maintain packages which also have dependency packages and I 
>> frequently use development installations locally ("pip install -e 
>> ../my-dep") so to develop dependencies at the same time as my main 
>> packages.
>> 
>> Q: With a pyproject.toml, can I enable some sort of development 
>> installation without also maintaining a setup.py - or is there a 
>> plan/PEP for enabling this without a setup.py?
>> 
>> Right now, I have a setup.py which needs to stipulate a number of 
>> things which is already in my pyproject.toml. In particular, I find it 
>> a little bothering that I need to specify the 
>> entry_points/console_scripts in both the pyproject.toml and in setup.py.
>> 
>> Kind regards,
>> Fredrik
>> --
>> Distutils-SIG mailing list -- distutils-sig@python.org
>> To unsubscribe send an email to distutils-sig-le...@python.org
>> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
>> Message archived at 
>> https://mail.python.org/archives/list/distutils-sig@python.org/message/UJ6XSFLQBNOXDCAWVST7V43C3H66PPYE/
>> 
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/4HFFSVVDCVVOUWNN5USXXHJDIEOPRXZW/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/TGRG35MALBPRLLMW7OWORONLIZLJQHQQ/


[Distutils] Re: Fwd: Re: Use of "python" shebang an installation error?

2020-07-22 Thread Tzu-ping Chung

> On 23/7, 2020, at 06:51, David Mathog  wrote:
> 
> On Wed, Jul 22, 2020 at 1:27 PM Paul Moore  wrote:
>> 
>> On Wed, 22 Jul 2020 at 19:31, David Mathog  wrote:
>>> but that shebang has to be corrected when the installation is moved to a 
>>> normal
>>> environment, which my code is doing now.)
>> 
>> Moving files that are installed by Python packaging tools isn't
>> supported. It might work, and you can probably make it work with some
>> effort, but it's very much a case of "don't do it unless you know what
>> you're doing". Correcting shebang lines is definitely something you
>> will need to do.
> 
> I understand that moving files is iffy.  However, given that I want
> only 1 copy of each installed python package on the system and I need
> to be able to install different versions of the same package (to
> resolve module version number conflicts between packages), moving the
> files around and replacing most copies with links to the single copy
> seemed like the only way to go.
> 
> Here:
> 
> https://www.python.org/dev/peps/pep-0394/#recommendation
> 
> It says:
> 
> When packaging third party Python scripts, distributors are encouraged
> to change less specific shebangs to more specific ones. This ensures
> software is used with the latest version of Python available, and it
> can remove a dependency on Python 2. The details on what specifics to
> set are left to the distributors; though. Example specifics could
> include:
> 
> Changing python shebangs to python3 when Python 3.x is supported.
> Changing python shebangs to python2 when Python 3.x is not yet supported.
> Changing python3 shebangs to python3.8 if the software is built with Python 
> 3.8.
> 
> and then immediately after it says:
> 
> When a virtual environment (created by the PEP 405 venv package or a
> similar tool such as virtualenv or conda) is active, the python
> command should refer to the virtual environment's interpreter and
> should always be available. The python3 or python2 command (according
> to the environment's interpreter version) should also be available.
> 
> Which seems to be exactly the opposite of the preceding stanza.  Ie,
> 
>  "always be as specific as possible"
> 
> then
> 
>  "be general, and also provide specific"

The first paragraph is saying it is recommended to rewrite shebangs such as

#!python3

to the actual Python interpreter the script is installed against, e.g. the 
interpreter in a virtual environment.

The second paragraph is describing which command the installer should choose 
to refer to an interpreter. For CPython 3.8, for example, up to three commands 
may be available in a given virtual environment:

{prefix}/bin/python
{prefix}/bin/python3
{prefix}/bin/python3.8

and the installer should choose the most generic one, i.e. {prefix}/bin/python, 
because this avoids dealing with interpreter-specific naming conventions, e.g. 
the Python version (3 or 3.8) or the implementation (pypy or jython).
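To make the first paragraph concrete, a naive shebang rewriter might look like the sketch below. Real installers (e.g. pip, via distlib) also handle quoting, Windows launchers, and over-long paths; this is illustration only, and the paths are made up.

```python
# A minimal sketch of installer-style shebang rewriting: replace a
# generic "#!python" line with the target environment's interpreter.
import sys

def rewrite_shebang(script_text, interpreter=sys.executable):
    first, sep, rest = script_text.partition("\n")
    if first.startswith("#!") and "python" in first:
        first = "#!" + interpreter  # pin to the actual interpreter path
    return first + sep + rest

script = "#!python\nprint('hi')\n"
print(rewrite_shebang(script, "/venv/bin/python"))
# #!/venv/bin/python
# print('hi')
```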


> 
> Personally I think the generic use of "python" both in shebangs and
> when invoking scripts as "python script" should be deprecated, with
> warnings from the installers to force developers to strip it out.  It
> only works now by chance.  Sure, there is a high probability it will
> work, but if one is on the wrong system it fails.  If python4
> (whenever it arrives) is not fully backwards compatible with python3
> the generic use of "python" is going to cause untold grief.  Whereas
> in that scenario all the code which uses "python3" should continue to
> function normally.

You are assuming “python3” or “python4” is a reliable command to refer to—which 
is an understandable misconception coming from a Linux background, but a 
misconception nonetheless. The wheel specification chose “python” for one 
reason: it is the only name that’s guaranteed to exist across operating systems 
and interpreter implementations.

Also, by your logic (“python” would break when Python 4 comes out), wheels 
really should break when a minor Python version is released, since a script 
written on Python 3.4 does not always work on 3.6 (as an example). So the 
installer really should use “python3.4” instead, should it not? But it does 
not, and nothing breaks right now, because wheels have other means to declare 
compatibility (wheel tags). An installer should only put a script under a 
version-specific interpreter name if the wheel tags allow it to. If the 
shebang needs to care about compatibility, something has already gone very 
wrong.

TP


> 
> Regards,
> 
> David Mathog
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/HAZUEGH7D7Y3PDMSYVNXHLYT6YMQLYUW/
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail

[Distutils] Re: Best way for a project to provide an optional C module

2020-08-02 Thread Tzu-ping Chung
Ideally this problem would be solved with the Provides-Dist metadata field.[1] 
The exact semantics of the field are unfortunately not much discussed at the 
moment, and no installer implementation that I know of (including pip) 
supports it at all.

This also recently came up in pip’s issue tracker,[2] and I guess this is as 
good a time as any to start the conversation if someone is willing to help 
drive that discussion forward.

[1]: 
https://packaging.python.org/specifications/core-metadata/#provides-dist-multiple-use
[2]: https://github.com/pypa/pip/issues/8669

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com



> On 03/8, 2020, at 05:59, Daniele Varrazzo  wrote:
> 
> On Sun, 2 Aug 2020 at 21:34, Bert JW Regeer  wrote:
>> 
>> By splitting it into two different packages you end up with the same 
>> situation that currently plagues psycopg2/psycopg2-binary whereby if you 
>> depend on psycopg2 you can't easily swap in psycopg2-binary and vice-versa 
>> as the two don't satisfy the same dependency.
> 
> The psycopg2 wheel or not wheel is the situation you describe, and
> it's really suboptimal from the point of declaring dependencies:
> beginners would like to use psycopg2-binary, but projects are advised
> to depend on psycopg2, so they either don't get the wheel benefit or
> they end up in a tangle of dependencies, two distributions installing
> the same files, bad stuff. Offering the C distribution as an opt-in
> extension would allow projects to depend only on the pure python
> psycopg3, which would be also the right choice for beginners, and
> allowing the grown-ups with a compiler to go faster by installing
> psycopg3-c too, which wouldn't conflict with the basic package.
> 
> 
>> Number 3 is kind of what sqlalchemy does, and then provide wheels for a huge 
>> variety of platforms to allow people to install the package without needing 
>> a compiler themselves.
> 
> The difference in performance of the C extension is important enough
> (15-20x - https://www.varrazzo.com/blog/2020/05/19/a-trip-into-optimisation/)
> to arguably make or break a deal. If someone wanted the C extension
> because they need the performance I wouldn't want its installation to
> fail silently.
> 
> 
> -- Daniele
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/MQNGT25XY3V7UDTKXVOEQ3XABLTEK4DS/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/3T4CHRSOQAYIYSFQ5OV7ZWBIW7BOKCCV/


[Distutils] Re: pip and missing shared system system library

2020-08-05 Thread Tzu-ping Chung
Exactly. Python actually specifies metadata for this (Requires-External 
<https://packaging.python.org/specifications/core-metadata/#requires-external-multiple-use>),
 but I don’t believe pip implements it at all, since there are almost no 
sensible rules available for how external libraries can be located in a 
cross-platform way.
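For reference, Requires-External entries live in a distribution’s core metadata, which uses an email-header format, so they are easy to read even though installers currently ignore them. The metadata text below is invented for illustration:

```python
# Core metadata (the METADATA file in a wheel's .dist-info directory)
# is RFC 822-style, so the stdlib email parser can read it directly.
from email.parser import Parser

metadata_text = """\
Metadata-Version: 2.1
Name: example-igraph-binding
Version: 0.1
Requires-External: libigraph (>=0.8)
Requires-External: pkg-config
"""

msg = Parser().parsestr(metadata_text)
externals = msg.get_all("Requires-External", [])
print(externals)  # ['libigraph (>=0.8)', 'pkg-config']
```

A tool that did want to act on the field could at least surface these names in its error message when a build fails, which would have saved the debugging session described below.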

Conda is probably the best bet when you need to deal with tight cross-language 
package integration like this, by punting the whole idea of system libraries 
and installing a separate copy of everything you need.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com

> On 06/8/2020, at 07:25, Jonathan DEKHTIAR  wrote:
> 
> I like the general idea, but I feel it's not going to be doable in practice.
> Many C libraries are not necessarily installed in the usual places like 
> `/usr/shared/lib` (drivers, for instance), or you can't be 100% sure 
> about it.
> 
> And that doesn't even account for Windows, which might behave quite 
> differently.
> How about Python packages that come with C libraries (compiled on install): 
> numpy / tensorflow / torch / cupy / etc.?
> 
> I'm not against the idea. I just don't see a good way of doing it. For 
> example, do you want to check only the system libraries, or also 
> `LD_LIBRARY_PATH` and `LIBRARY_PATH`?
> Do you want to check inside the user's .bashrc for modifications of env 
> vars (and what if they don't use bash)?
> 
> Honestly, it sounds difficult to design a feature like this.
> 
> Jonathan
> 
>  On Wed., 05 Aug 2020 16:03:40 -0700, David Mathog  
> wrote: 
> 
> pip install package 
> 
> often results in compiling (using gcc, g++, whatever) to produce a 
> binary. Usually that proceeds without issue. However, there seems to 
> be no checking that the libraries required to link that binary are 
> already on the system. Or at least the message which results when 
> they are not is not at all clear about what is missing. 
> 
> I discovered that today by wasting several hours figuring out why 
> scanpy-scripts was failing trying to build dependency "louvain", which 
> would not install into a venv with pip. It had something to do with 
> "igraph", but pip had downloaded python-igraph before it got to 
> louvain. When louvain tried to build, there was a mysterious message 
> about pkg-config and igraph: 
> 
> Cannot find the C core of igraph on this system using pkg-config. 
> 
> (Note that when python-igraph installs it places an igraph directory 
> in site-packages, so which it is referring to is fairly ambiguous.) 
> Then it tried to install a different version number of igraph, failed, 
> and the install failed. This was very confusing because the second 
> igraph install was not (it turned out) a different version of 
> python-igraph but a system level igraph library, which it could not 
> install either because the process was not privileged and could not 
> write to the target directories. Yet it tried to install anyway. 
> This is discussed in the louvain documentation here (it turns out): 
> 
> https://github.com/vtraag/louvain-igraph 
> <https://github.com/vtraag/louvain-igraph> 
> 
> but since I was actually trying to install a different package, of 
> course I had not read the louvain documentation. 
> 
> In short form the problem was "cannot build a binary because required 
> library libigraph.so is not present in the operating system" but that 
> was less than obvious in the barrage of warnings and error messages. 
> 
> Is it possible to tell pip or setup.py to fail immediately when a 
> required system library like this is not found, here presumably after 
> that "C core" message, rather than confusing the matter further with 
> a failed partial build and install of the same component? 
> 
> More generally, is there anything in the python installation methods 
> which could list system libraries as dependencies and give a more 
> informative error message when they are missing? 
> 
> Thanks, 
> 
> David Mathog 
> -- 
> Distutils-SIG mailing list -- distutils-sig@python.org 
> <mailto:distutils-sig@python.org> 
> To unsubscribe send an email to distutils-sig-le...@python.org 
> <mailto:distutils-sig-le...@python.org> 
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/ 
> <https://mail.python.org/mailman3/lists/distutils-sig.python.org/> 
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/MSS42UYQ7FJWHID54FXSW5M5KCMK7ZQI/
>  
> <https://mail.python.org/archives/list/distutils-sig@python.org/message/MSS42UYQ7FJWHID54FXSW5M5KCMK7ZQI/>
>  
> 
> 
> --
> Distutils-SIG mailing 

[Distutils] Re: Pip update fails

2020-09-29 Thread Tzu-ping Chung
There was some discussion about this a while ago on GitHub:
https://github.com/pypa/pip/issues/8450 
<https://github.com/pypa/pip/issues/8450>

The problem is likely caused by a corrupt pip installation, and can be resolved 
by re-initialising pip with the built-in ensurepip module. Once pip is 
recovered, the suggested upgrade command should work as expected.

The comment with a green tick in the thread lists more detailed commands.
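The recovery sequence described above can be sketched as follows. Here "python" stands for the interpreter from the error message (C:\Program Files\Python38\python.exe on the reporter's machine); ensurepip rewrites the bundled pip files, repairing the corrupt installation before the normal upgrade is retried:

```shell
# Re-initialise pip from the copy bundled with the standard library.
python -m ensurepip --upgrade
# Once pip is functional again, the suggested upgrade should work.
python -m pip install --upgrade pip
```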

TP

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com

> On 29/9, 2020, at 16:14, brenn...@bezeqint.net wrote:
> 
> Hello,
> 
> When I tried to install a package using pip, it informed me that there is a 
> new version available.
> Per the recommendation, I tried to update pip, but the update failed.
> 
> The following is the last few lines of the failure messages:
> 
>  File 
> "C:\Users\user\AppData\Roaming\Python\Python38\site-packages\pip\_vendor\distlib\scripts.py",
>  line 386, in _get_launcher
> 
>raise ValueError(msg)
> 
> ValueError: Unable to find resource t64.exe in package pip._vendor.distlib
> 
> WARNING: You are using pip version 20.2.2; however, version 20.2.3 is 
> available.
> 
> You should consider upgrading via the 'C:\Program Files\Python38\python.exe 
> -m pip install --upgrade pip' command.
> 
> 
> I can of course supply the full error traceback, if needed, not just these 
> last few lines.
> 
> The file:
> c:\Users\user\AppData\Roaming\Python\Python38\site-packages\pip\_vendor\distlib\t64.exe
> DOES exist.
> 
> I just installed the latest python version 3.8.6 – I would have thought it 
> would already include the latest version of pip…
> 
> Any help resolving this failure will be much appreciated.
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/XRTQTQQKWMGYCXAWQXCCEVNLU5HX5D7C/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/C2IHYFI3GNMR6GQYHNUZSMOXUEGKAQJ3/


[Distutils] Re: Critical problem in PyCharm caused by the removal of "--build-dir" in 2020.3

2020-12-01 Thread Tzu-ping Chung

> On 01/12/2020, at 02:24, Mikhail Golubev via Distutils-SIG 
>  wrote:
> 
> The second question is about the behavior of this option. It appears that we 
> initially started using it because in the past packages were not built in a 
> temporary directory by default. Could you please point me to the exact 
> version of pip where it changed? -- I couldn't find it in a changelog. It 
> would help us decide whether we need to keep some compatibility layer for 
> interpreters with an old version of pip installed.
> 

It seems this has been the case since 2014 (around pip 6.x); see 
https://github.com/pypa/pip/issues/906

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/PXM3675TOWUVPKMZJBCIB3ESUXRCHMWI/


[Distutils] Re: Making setup.py run external command to build

2021-03-23 Thread Tzu-ping Chung

> On 23 Mar 2021, at 19:13, Julian Smith  wrote:
> 
> Approach 1 is to pass callbacks to distutils.core.setup() or
> setuptools.setup(). However there doesn't appear to be documentation in
> either of these modules about what such callbacks should do or how/when
> they are called. The only way to figure things out seems to be to look
> at the distutils or setuptools source, which i'm finding pretty opaque.

[snipped]

> As far as i can tell, callbacks are only given information about the
> original command line arguments rather than more abstract information
> such as where to put output files, so distutils and setuptools don't
> seem to be giving any added value here.

[snipped]

> Given how fundamental the pip tool is to Python packaging, i was hoping
> that the command-line arguments that pip passes to setup.py would be
> standardised and documented, but this doesn't seem to be the case.

What you get from pip is standardised and documented in PEP 517. The
part that’s not is what happens *after* setuptools receives those
arguments, which is entirely setuptools internals.

[snipped]

> So as far as i can tell, there are two levels of abstraction at which
> one can implement customised Python packaging (setuptools.setup()'s
> callbacks or the setup.py command line), but neither one seems to be
> documented or standardised.
> 
> Is that right? Or am i missing something fundamental here?

You’re basically correct: there are two abstractions in play. The first
(between pip and setuptools) is documented. The second (between
setuptools and the external command) is not, because that’s setuptools
internals and setuptools doesn’t really expect people to mess with
them. You still can if you want to, of course, but you’re on your own.

So my advice would be to ditch setuptools entirely and work directly with
the PEP 517 interface. You don’t need to start from scratch, though; there
are several PEP 517 implementations other than setuptools, and some of
them already implement mechanisms to call external build commands,
such as enscons[1] (via SCons) and PDM[2] (via setuptools). Some other
projects have also expressed interest in supporting this, such as
Flit[3], and I think they’d be more than happy to discuss it if you
want to work on this with them.

[1]: https://github.com/dholth/enscons
[2]: https://github.com/frostming/pdm
[3]: https://github.com/takluyver/flit
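For a rough idea of what working directly with the PEP 517 interface looks like, here is a minimal backend sketch. The names ("mypkg", the version) are hypothetical, the external build step is replaced by a harmless stand-in, and a real backend must also implement build_sdist and emit complete metadata; this only shows the shape of the hook pip calls:

```python
import os
import subprocess
import sys
import zipfile

NAME, VERSION = "mypkg", "0.1.0"


def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    # Stand-in for the external build command (e.g. make, scons, cmake).
    subprocess.run(
        [sys.executable, "-c", "print('external build step')"], check=True
    )
    # Assemble a minimal wheel: a wheel is just a zip with a .dist-info.
    dist_info = "{0}-{1}.dist-info".format(NAME, VERSION)
    wheel_name = "{0}-{1}-py3-none-any.whl".format(NAME, VERSION)
    with zipfile.ZipFile(os.path.join(wheel_directory, wheel_name), "w") as whl:
        whl.writestr(NAME + "/__init__.py", "")
        whl.writestr(
            dist_info + "/METADATA",
            "Metadata-Version: 2.1\nName: {0}\nVersion: {1}\n".format(NAME, VERSION),
        )
        whl.writestr(
            dist_info + "/WHEEL",
            "Wheel-Version: 1.0\nGenerator: sketch 0.1\n"
            "Root-Is-Purelib: true\nTag: py3-none-any\n",
        )
        whl.writestr(dist_info + "/RECORD", "")
    # PEP 517: return the basename of the wheel that was written.
    return wheel_name
```

A pyproject.toml would then point `build-backend` at the module containing this function, and pip would call build_wheel for you.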

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/BDGNJTUU7A36XZIRKE64LA73GJ6LWXFT/


[Distutils] Re: Packaging optional, arch-dependent, pre-built libraries

2021-04-05 Thread Tzu-ping Chung
If it is not built or linked against during the build, a dll in your wheel is 
essentially a plain data file from Python packaging’s perspective, no different 
from e.g. a text file. So you’re looking in the wrong direction for solutions.

I believe the issue PyInstaller has with your package is that, because 
PyInstaller compiles a program into an executable, ctypes.util.find_library() 
won’t work (there is no actual dll on disk to find). If you know for sure the 
dll will be available, you can copy the binary to a temporary location (the 
“official” way to do this is through importlib.resources.path[1]) and use that 
path to load the dll directly instead.


[1]: https://importlib-resources.readthedocs.io/en/latest/
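A minimal sketch of that approach, assuming a hypothetical package "usb1" that ships "libusb-1.0.dll" alongside its __init__.py:

```python
import ctypes
from importlib import resources


def load_bundled_dll():
    # resources.path materialises a real filesystem path even when the
    # package lives inside a zip (as in a PyInstaller bundle). Load the
    # dll while still inside the context manager, since any temporary
    # copy may be cleaned up when the block exits.
    # (On Python 3.9+ the equivalent is
    # resources.as_file(resources.files("usb1") / "libusb-1.0.dll").)
    with resources.path("usb1", "libusb-1.0.dll") as dll_path:
        return ctypes.CDLL(str(dll_path))
```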

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com

> On 03/4/2021, at 06:57, Vincent Pelletier  wrote:
> 
> Hello,
> 
> I'm the author of python-libusb1, a pure-python ctypes wrapper for
> libusb1.
> 
> Until recently, I had been purely relying on OS-linker-provided libusb1
> (distro-installed on GNU/Linux and *BSD, fink/macports/... on OSX, ...).
> 
> Then, I've been requested to bundle the libusb1 dll on windows (x86 and
> x86_64 wheels) because otherwise distributions seems exceedingly
> painful for applications using my module. With some extra code to
> setup.py to fetch, unzip and copy[1] the dlls, plus a now even more
> multi-stage distribution process (sdist, both windows wheels, in
> addition to the existing sign and twine steps), and it ipso facto works.
> 
> Now, I'm asked to add pyinstaller compatibility, as it on its own
> overlooks the dll. Which makes me feel that I am maybe not using the
> best possible way to bundle these.
> 
> From my reading of distutils and setuptools, my understanding is that
> non-pure-python packages contain:
> - stuff they built themselves (build_ext & friends)
> - third-party libraries that the stuff they built themselves is linked
>  against
> Having nothing to build, I cannot seem to reach the library inclusion
> step.
> 
> What is the recommended way of bundling a third-party-built
> arch-dependent library in an otherwise pure-python package ?
> 
> [1] 
> https://github.com/vpelletier/python-libusb1/blob/49f7f846bdd3c3d0f2ec3a01c23ed69885cf63a4/setup.py#L48-L58
> 
> Regards,
> -- 
> Vincent Pelletier
> GPG fingerprint 983A E8B7 3B91 1598 7A92 3845 CAC9 3691 4257 B0C1
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/J6DDMSAC73TVHFMBIEIT6Z6ZB7HGSQGJ/

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/OBAR3NYBPXFG3G2LU6YWA2CCVQO2UFNA/


[Distutils] Re: Packaging optional, arch-dependent, pre-built libraries

2021-04-10 Thread Tzu-ping Chung
“setup.py install” is pretty ancient at this point and lacks support for 
most remotely modern packaging features.

I’d strongly advise ignoring it entirely. Use “pip install .” instead.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com


> On 11/4/2021, at 08:59, Vincent Pelletier  wrote:
> 
> Hello,
> 
> On Tue, 6 Apr 2021 00:17:32 +0800, Tzu-ping Chung  wrote:
>> If a file is not built or linked against, a dll in your wheel is essentially 
>> a plain data file from Python packaging’s perspective, no different from 
>> e.g. a text file.
> 
> Thanks, I somehow did not get this until I saw it spelled out.
> 
> There seems to be one catch, though: once I list the dll in
> package_data, it gets copied over to build/lib*, and the same dll gets
> used for all distributions, so if I build the win32 wheel first:
> - plat=win_amd64 gets a 32bits dll
> - plat=any gets a windows 32bits dll
> so I now have to clean --all between each.
> 
> For a text file it could make no difference (although there are the EOL
> shenanigans which would arguably be platform-specific). So while I now
> agree the dll should be treated by my setup.py as a "plain data file",
> there is this annoying extra complication layer.
> 
>> I believe the issue PyInstaller has with your package is that, since 
>> PyInstaller compiles a program into an executable, 
>> ctypes.util.find_library() won’t work (since there is no actual dll to 
>> find). If you know for sure the dll will be available, you can copy the 
>> binary to a temporary location (the “official” way to do this is through 
>> importlib.resources.path[1]), and use the path to load the dll directly 
>> instead.
> 
> Thanks for the pointer, I would love to use it. Unfortunately, this
> fails to install on 2.7:
> with
>install_requires=(
>"importlib_resources<=4.0.0;python_version<'3.0'",
>"importlib_resources;python_version>='3.0' and python_version<'3.7'",
>),
> I get
>  $ ./vpy2/bin/python setup.py install
>  [...]
>  Installed 
> /home/vincent/git/python-libusb1/vpy2/lib/python2.7/site-packages/libusb1-1.9.2+4.g5aeb636.dirty-py2.7.egg
>  Processing dependencies for libusb1==1.9.2+4.g5aeb636.dirty
>  Searching for zipp>=0.4
>  Reading https://pypi.org/simple/zipp/
>  Downloading 
> https://files.pythonhosted.org/packages/38/f9/4fa6df2753ded1bcc1ce2fdd8046f78bd240ff7647f5c9bcf547c0df77e3/zipp-3.4.1.tar.gz#sha256=3607921face881ba3e026887d8150cca609d517579abe052ac81fc5aeffdbd76
>  Best match: zipp 3.4.1
>  Processing zipp-3.4.1.tar.gz
>  Writing /tmp/easy_install-ZDtgKM/zipp-3.4.1/setup.cfg
>  Running zipp-3.4.1/setup.py -q bdist_egg --dist-dir 
> /tmp/easy_install-ZDtgKM/zipp-3.4.1/egg-dist-tmp-GmefKD
>  DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. 
> Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 
> will drop support for Python 2.7 in January 2021. More details about Python 2 
> support in pip can be found at 
> https://pip.pypa.io/en/latest/development/release-process/#python-2-support 
> pip 21.0 will remove support for this functionality.
>  DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. 
> Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 
> will drop support for Python 2.7 in January 2021. More details about Python 2 
> support in pip can be found at 
> https://pip.pypa.io/en/latest/development/release-process/#python-2-support 
> pip 21.0 will remove support for this functionality.
>  error: find_namespace: directive is unsupported on Python < 3.3
> 
> The contributor who requested pyinstaller support somehow got this to
> work with my archaic and unfortunately very zip-hostile
>  os.path.join(dirname(__file__), 'libusb-1.0.dll')
> so I will be continuing with this until I finally drop 2.7 support.
> 
> Regards,
> -- 
> Vincent Pelletier
> GPG fingerprint 983A E8B7 3B91 1598 7A92 3845 CAC9 3691 4257 B0C1

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/EQOOX4AULLP474MGD5ORCSZB7PW4CM7X/


[Distutils] Re: PEP 440

2021-11-16 Thread Tzu-ping Chung
Feel free to send a pull request to fix this. You can find the link to the PEP 
repository at the bottom of the page.

--
Tzu-ping Chung (@uranusjr)
uranu...@gmail.com
https://uranusjr.com
On Nov 4 2021, at 2:24 am, Johnathan Irvin  wrote:
> Noticed the link was broken for RFC 2119.
>
> I believe it should be https://datatracker.ietf.org/doc/html/rfc2119 and 
> currently points to http://tools.ietf.org/html/rfc2119.html
> Thanks,
> Johnathan Irvin
> https://twitter.com/_JohnnyIrvin
> https://www.linkedin.com/in/johnnyirvin/
>
> --
> Distutils-SIG mailing list -- distutils-sig@python.org
> To unsubscribe send an email to distutils-sig-le...@python.org
> https://mail.python.org/mailman3/lists/distutils-sig.python.org/
> Message archived at 
> https://mail.python.org/archives/list/distutils-sig@python.org/message/RXW2MPUWYUVHERA365JJSBFDQPXGP37M/

-- 
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/archives/list/distutils-sig@python.org/message/7IANT4R3Y6BJ72ESS2SNJ3IN5CIJM3GU/