PEP number yet?
On Sun, Nov 22, 2015 at 4:45 PM, Donald Stufft wrote:
> Okay. I’ve read over this, implemented enough of it, and I think it’s gone
> through enough nit picking. I’m going to go ahead and accept this PEP. It’s
> largely just standardizing what we are already doing so it’s pretty l
what's the sharp thing hanging from the "P"
a device that measures the packaging cubes? loads them?
: )
On Fri, Nov 20, 2015 at 8:22 AM, Donald Stufft wrote:
> As many of you may know, I’ve been working on Warehouse which is designed
> to replace the PyPI code base with something modern and ma
On Wed, Nov 18, 2015 at 11:42 AM, Donald Stufft wrote:
>
> On Nov 18, 2015, at 2:40 PM, Marcus Smith wrote:
>
>
>> > Will "direct references" ever be well-defined? or open to whatever any
>> tool
>> > decides can be an artifact reference?
>>
>
>
> > Will "direct references" ever be well-defined? or open to whatever any
> tool
> > decides can be an artifact reference?
>
> We can define the syntax without capturing all the tool support, which
> is what PEP-440 and thus this PEP does.
>
so, to be clear, what syntax for the URI portion do
as it is, this PEP defers the concept of a "Direct Reference URL" to PEP440.
but then PEP440 partially defers to PEP426's "source_url" concept, when it
says "a direct URL reference may be a valid source_url entry"
do we expect PEP440 to be updated to fully own what a "Direct Reference
URL" can b
>
>
> It's included in the complete grammar, otherwise it can't be tested.
> Note that the PEP body refers to the IETF document for the
> definition of URIs, e.g. exactly what you suggest.
>
doesn't this imply any possible URI can theoretically be a PEP440 direct
reference URI ?
Is that true?
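For what it's worth, the grammar under discussion was later implemented by the third-party `packaging` library, which accepts any RFC 3986 URI after the `@` in a direct reference. A small sketch (the URL is made up for illustration; `Requirement` is from `packaging.requirements`, not part of this thread):

```python
from packaging.requirements import Requirement

# A direct reference: "name @ uri". The URI portion is syntactically
# unrestricted beyond RFC 3986; tool support is a separate question.
req = Requirement("pip @ https://example.com/pip-8.0.0-py2.py3-none-any.whl")
print(req.name)  # pip
print(req.url)   # https://example.com/pip-8.0.0-py2.py3-none-any.whl
```

So yes, at the grammar level essentially any URI parses; whether a given tool can do anything with it is left to the tool.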
yea, I was thinking the same.
but we'll see how the initial reorg goes.
On Sun, Nov 15, 2015 at 9:32 PM, Nick Coghlan wrote:
> On 16 November 2015 at 03:49, Marcus Smith wrote:
> >>
> >> To have the most success, the writers will certainly need feedback from
> >
>
>
> To have the most success, the writers will certainly need feedback from
> subject matter experts, so the process will include 2 stages where we
> specifically ask for feedback from PyPA-Dev and Distutils-Sig: 1) To
> validate the initial proposal that covers the scope of the changes, and 2)
this reads ok to me...
On Sun, Nov 8, 2015 at 9:20 PM, Nathaniel Smith wrote:
> Hi all,
>
> Following the strategy of trying to break out the different
> controversial parts of the new build system interface, here's some
> proposed text defining the environment that a build frontend like pip
> p
btw, I'm very aware that recent discussions may be changing the roadmap...
: )
I'm holding fast for the smoke to clear...
On Wed, Nov 4, 2015 at 4:42 PM, Marcus Smith wrote:
> FYI, I went ahead and merged it.
>
> https://www.pypa.io/en/latest/roadmap/
>
> Again, hel
> Because even if we go with the entry-point-style Python
> hooks, the build frontends like pip will still want to spawn a child
> to do the actual calls -- this is important for isolating pip from the
> build backend and the build backend from pip, it's important because
> the build backend needs
>
>
> PEP-345 doesn't
> describe Provides-Extra, which pkg_resources uses when parsing
> .dist-info directories as well
fwiw, this provides a bit of history on the "Provides-Extra":
https://github.com/pypa/interoperability-peps/issues/44
___
Distutils-SIG maillist - Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
>
>
> I'm not sure this is the syntax that I would have come up with, but I
> guess it's too late to put the genie back in the bottle, so this PEP
> should have some way to cope with these things?
why would this PEP deal with this?
the higher level PEP that builds on top of this would bump the wh
>
>
> Maybe it would be clearer to drop the comment and newline handling
> stuff from the core requirement specifier syntax (declaring that
> newlines are simply a syntax error), and assume that there's some
> higher-level framing protocol taking care of that stuff?
that sounds right to me.
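A sketch of what such a higher-level framing layer could look like, assuming the core grammar only ever sees a single clean requirement per call (`Requirement` here is the modern `packaging` library's parser, used purely for illustration; `parse_framed` is a hypothetical name):

```python
from packaging.requirements import Requirement

def parse_framed(text):
    """Hypothetical framing layer: newlines, blank lines, and comments are
    handled here, so the core requirement grammar never sees them."""
    reqs = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line:                              # skip blank lines
            reqs.append(Requirement(line))
    return reqs

reqs = parse_framed("requests>=2.0  # comment\n\nnumpy\n")
print([r.name for r in reqs])  # ['requests', 'numpy']
```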
>
>
>
> So both the abstract build system PEP and donalds setup.py interface
> depend on having a bootstrap dependency list written into a file in
> the source tree.
your build PEP said stuff like this "Additional data *may* be included,
but the ``build_requires`` and ``metadata_version`` keys mus
>
>
>
> No - it specifies the serialisation format for
> names/specifiers/extras/markers that is in common use, but doesn't
> specify a programming API. It is intended as an interop building
> block,
I'm not talking about a programming API.
this PEP would set the format used in interop formats,
>
>
> The language defined is a compact line based format which is already in
> widespread use
this is the most critical thing for me, and the reason this approach seems
more attractive than the path of PEP426, although I'd certainly like to see
Nick's reaction.
PEP426 tries to cover how names/s
sorry, I feel like I have to confirm my translation of your intro paragraph : )
maybe it will help some others...
ended up with a hard dependency on this
my understanding is that you were depending on having PEP426 metadata, e.g.
for build_requires.
since this PEP, as you say doesn't handle the
>
>
>
> Basically: Historical reasons. The name “PyPA” was a joke by the
> pip/virtualenv developers and it was only pip and virtualenv so it was on
> Github.
here's an anecdote per the pypa.io history page, 'Other proposed names
were “ianb-ng”, “cabal”, “pack” and “Ministry of Installation”
FYI, I went ahead and merged it.
https://www.pypa.io/en/latest/roadmap/
Again, help appreciated from anyone to keep it accurate as things change
(and they surely will)
--Marcus
>
> answering basic questions can take time away from making important
> improvements?
>
to be fair, distutils-sig is mentioned as a user support list on the
"Python Packaging User Guide"
a few years back, there was a debate on splitting it between a user and
planning list, but no traction there.
>
>
> One question - would it be worth having a "design principles" section
> for notes about things like how we want to allow Warehouse/PyPI to
> publish metadata from the distribution files, so distribution formats
> should include static metadata for those values?
maybe. we can certainly beef
>
>
>
> Shouldn't Warehouse be mentioned there?
>
Indeed. I'll add it.
thanks
Marcus
Based on discussions in another thread [1], I've posted a PR to pypa.io for
a "PyPA Roadmap"
PR: https://github.com/pypa/pypa.io/pull/7
built version: http://pypaio.readthedocs.org/en/roadmap/roadmap/
To be clear, I'm not trying to dictate anything here, but rather just
trying to mirror what I t
>
>
> Whenever a new PEP is put forward on distutils-sig, any PyPA core
> reviewer that believes they are suitably experienced to make the final
> decision on that PEP may offer to serve as the BDFL's delegate (or
> "PEP czar") for that PEP. If their self-nominat
> If python-dev ends up adopting GitLab for the main PEPs repo, then we
> should be able to move the whole process there, rather than needing to
> maintain a separate copy.
>
will that be as open as pypa/interoperability-peps?
if it's closed off such that only python devs can log PRs against PEPs o
>
>
> > so, Robert, to be clear, you think 3rd party build tools can get the
> > dynamic-dependency support they need just by implementing something
> dynamic
> > via "setup.py dist-info"? (and it doesn't need to happen literally in the
> > wheel build step?)
>
> Sure -- 'dist-info' would be run in
>
>
> However, the big step you're proposing that I think is fundamentally
> unsound is that of requiring a wheel be built before dependencies can
> be queried.
so, Robert, to be clear, you think 3rd party build tools can get the
dynamic-dependency support they need just by implementing something
>
>
> > for example, going with your idea above, that we need to support 3rd
> party
> > tools being dynamic in the "setup.py bdist_wheel" step, not simply in the
> > "setup.py dist-info" step.
> >
> > and as it is, pip doesn't understand this possibility.
>
> So in the new path, we wouldn’t call ``
>
>
> since a wheel is created by executing setup.py, you’d just have your build
> tool dynamically output different wheels based on the system you’re
> building on (or whatever axis is causing the dynamic dependencies).
understood, but I guess I was under the [mistaken?] impression, that
dynamic
>
> > Nathaniel's grand plan requires dynamic run-time dependencies, so to be
> > clear, this plan doesn't help the case that instigated most of the recent
> > discussion, right?
>
> It should still solve that problem because it allows a project to
> dynamically decide what dependencies they have (
>
>
> pip doesn't necessarily have to "interact with many different versions of
> the same build tool during a single invocation" if for example it's
> subprocessing the interactions to some "pip-build" tool that handles the
> imports and use of the python API. I.e. pip calls some "pip-build" too
>
>
> * This essentially doesn't solve any of the dynamic vs static metadata
> issues
>
Nathaniel's grand plan requires dynamic run-time dependencies, so to be
clear, this plan doesn't help the case that instigated most of the recent
discussion, right?
>
>
> - Will allow for both static and dynamic specification of build
> dependencies
I think you need to fill in the story on dynamic dependencies, or otherwise
this PEP will be a mystery to most people.
I *think* I understand your motivation for this, based on hearing your plan
(in another thre
>
>
> > 4) Although using a process interface is not necessarily a problem, I
> don't
> > agree with your point on why a python interface would be unworkable.
> You're
> > assuming that pip would try to import all the build tools (for every
> > dependency it's processing) in the master process. An
On Tue, Oct 27, 2015 at 10:02 PM, Ben Finney wrote:
> Marcus Smith writes:
>
> > 1) *Please*, *please*, *please* let's start doing PEP conversations as
> > PRs to pypa/interoperability-peps : )
>
> Please keep the conversation on a mailing list where one can particip
>
>
> Current draft text in rendered form at:
> https://gist.github.com/rbtcollins/666c12aec869237f7cf7
>
>
Thanks for working on this.
Overall I like the idea, but have some comments/questions
1) *Please*, *please*, *please* let's start doing PEP conversations as PRs
to pypa/interoperability-pe
As some of you may already know, Nicole from the Warehouse team has
initiated an effort to improve our packaging tutorials, and have them
linked in the new Warehouse UI (https://github.com/pypa/warehouse/issues/729
).
Although originally the idea was to create additional tutorials (but still
maint
ok, I'll post a PR for a roadmap to pypa.io within a few days and post back
to the list for feedback...
On Thu, Oct 22, 2015 at 12:51 PM, Paul Moore wrote:
> On 22 October 2015 at 19:45, Marcus Smith wrote:
> > Mainly, I think it should consist of a set of links to Issue
>
>
>> Would it makes sense to start a roadmap doc/repo under the PyPA account
> so the current grand vision is written down in a very high-level overview
>
I think it makes sense.
we had such a page for a while at https://www.pypa.io
I recently dropped it because nobody was really maintaining it
>
>
> > I was happy to see this thread, because I thought maybe I'd learn what I
> > should teach my students - new to python.
> >
> >
> > Maybe we could come up with a decision tree for this -- some guidance for
> > knowing what to do, when?
>
> Exactly. I think it could even be fun :)
>
> How co
thanks for the summary!
> * Things that have reason to change (deps) are more reasonable to be
> dynamic (even with PEP-426 markers there are exceptions)
>
as we know, for *many* cases, run-time deps aren't dynamic.
is there a consensus for those cases? exist in the sdist metadata? or no?
or ma
>
>
> So instead, the current plan is that we're going
> to drop the libraries inside a wheel and upload it to PyPI:
>
aha... ok, now it's clearer where you're coming from.
but using what platform in the wheel tag?
linux wheels are blocked currently
regardless of the tagging though, I'm not so su
>
>
> it's a fact of life that the same
> source release may be configured in different ways that create
> different resulting dependencies. NumPy is one example of this, but
> it's hardly unusual
I've tried to respond to this point twice... but let me try again. : )
If I'm confused, please help
But to be clear, it's still not a "binary" dependency declaration... i.e.
the dependency is still declared by name and a version specifier alone.
On Sun, Oct 11, 2015 at 10:48 AM, Marcus Smith wrote:
>
>
>> 2) after unpacking this sdist it then calls 'setup.py egg_i
> 2) after unpacking this sdist it then calls 'setup.py egg_info' to get
> the full metadata for the wheel
I wouldn't say "get the full metadata for the wheel". It's not a wheel
yet.
`egg_info` is run so we can use the pkg_resources API to find the dependencies.
> Specifically what it does with
>
> But the different builds for the different configurations end up with
> different metadata. If I'm understanding right, the whole point of
> "source wheels" is that they have all the static metadata that pip
> needs in order to make decisions, and this has to match the resulting
> wheels -- rig
>
>
> If you're interested, I'm happy to directly collaborate on this PEP if
> it's in
> a github repository somewhere or something. There's an interoptability repo
>
btw, the repo he's talking about is here:
https://github.com/pypa/interoperability-peps
it has a convention about where to add pep
>
>> The first thing that immediately stood out to me, is that it's
>> recommending
>> that downstream redistributors like Debian, Fedora, etc utilize Wheels
>> instead
>> of the sdist to build their packages from. However, that is not really
>> going to
>> fly with most (all?) of the downstream re
>
>
> So downstream distributors can download an sdist - or even a tarball of a
> VCS tag, if they're being strict about it - build wheels from that using
> the config in this proposal, and then transform the wheels into their own
> package format.
>
this has wheel itself being the interoperabili
Can you clarify the relationship to PEP426 metadata?
There's no standard for metadata in here other than what's required to run
a build hook.
Does that imply you would have each build tool enforce their own convention
for where metadata is found?
On Thu, Oct 1, 2015 at 9:53 PM, Nathaniel Smith wr
although I can see the value of distinguishing a description vs readme
file, I can also see that it's confusing enough to make me want the sample
project to just have a readme for simplicity (and maybe just mention the
distinction as a possibility)
I opened an issue here https://github.com/pypa
I think we want to turn down
https://bitbucket.org/pypa/pypi-metadata-formats?
Since it's replaced by https://github.com/pypa/interoperability-peps
I'm thinking we should migrate issues (and close the old ones with links to
the new ones), and add a loud notification to the old readme. People are
> so the idea would be to:
>> 1) house current specs at packaging.python.org... basically a document
>> tree that's organized by topic, not numbers and it's free of proposal
>> rationales, historical discussion, and transition plans etc...
>>
>
> *
> https://packaging.python.org/en/latest/glossary.
pulling this idea out of the "Linux wheel support" thread, since it
deserves its own thread...
the idea being that we should better distinguish:
1) the current packaging "Specs" (for metadata, versions, etc...)
vs
2) Proposals to change them
currently, we just have PEPs that serve both roles.
s
er) PEPs.
>
> On Sep 7, 2015 10:36 AM, "Marcus Smith" wrote:
> >
> > I'm still unclear on whether you'd want A or B:
> >
> > A) Different major/minor versions of the spec are different documents
>
> From http://semver.org Semantic Versioni
index of links to the
unrendered old versions in vcs history
2) use a custom build/publishing worflow that pulls versions out of history
so they can be built as peers in the published version
On Sun, Sep 6, 2015 at 9:26 PM, Nick Coghlan wrote:
> On 7 September 2015 at 14:
> That way, the URL works as people expect, *and* the resulting
> > destination gives a URL that (when inevitably copy-and-pasted) will
> > retain its meaning over time.
>
> Yes, ReadTheDocs does let us do that.
well, it lets you do it for a whole project.
we'd have to have a project per spec for
On 5 September 2015 at 16:46, Nathaniel Smith wrote:
> > On Fri, Sep 4, 2015 at 9:24 PM, Marcus Smith wrote:
> >>> I don't have a specific problem with the specs living somewhere else
> >>> as well, I just don't think moving a lengthy document full of edge
>
ok, so this is PEP 474
where's the activity for the forge idea happening? python-dev list?
On Sat, Sep 5, 2015 at 10:47 PM, Nick Coghlan wrote:
>
> On 6 Sep 2015 10:39, "Marcus Smith" wrote:
> >
> > yea, I like the idea of our own authoritative Pypa p
>
>
> then bundles the system
> version back up with rewheel for installation into Python virtual
> environments.
this "bundles the system version back up" step happens when?
which Fedora version did this start in?
n (ncogh...@gmail.com)
> wrote:
> > On 6 Sep 2015 08:31, "Marcus Smith" wrote:
> > >
> > > is this a response to other thread about how/where to store specs and
> > PEPs?
> > > If not, what in this email are you responding to?
> >
> >
is this a response to other thread about how/where to store specs and PEPs?
If not, what in this email are you responding to?
On Sat, Sep 5, 2015 at 1:32 PM, Donald Stufft wrote:
> If it’s more useful we could also just use an RFC repository like Rust
> does instead of doing a mishmash between h
> I don't have a specific problem with the specs living somewhere else
> as well, I just don't think moving a lengthy document full of edge cases
> from one location to another is going to make things better
If I may, I don't think that really captures Nick's idea.
I think it's about clearly dist
Can anyone summarize the state of ensurepip for the major linux distros.
Do any currently include a version that leaves ensurepip intact?
If not, will any? Moreover, would any ever also bootstrap pip for you?
I'm not asking out of interest in wanting it, more to understand for the
sake of editing
Hello:
I'm looking for opinions on mentioning Copr and/or EPEL and/or IUS in the
pip install instructions.
Here's the PR with the actual docs changes:
https://github.com/pypa/pip/pull/3067
The goal is to give people a linux distro-friendly way (at least for
fedora/centos/rhel) to upgrade pip (in
I think Linux wheel support is almost useless unless the pypa stack
> provides _something_ to handle non-python dependencies.
>
I wouldn't say useless, but I tend to agree with this sentiment.
I'm thinking the only way to really "compete" with the ease of Conda (for
non-python dependencies) is to
Why not start with pip at least being a "simple" fail-on-conflict resolver
(vs the "1st found wins" resolver it is now)...
You'd "backtrack" for the sake of re-walking when new constraints are
found, but not for the purpose of solving conflicts.
I know you're motivated to solve Openstack build is
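To illustrate, a fail-on-conflict resolver doesn't try to *solve* anything; it just refuses to silently violate a collected constraint. A minimal sketch using the modern `packaging` library (the function name and the constraint data are hypothetical, chosen to mirror the resolver behavior described above):

```python
from packaging.specifiers import SpecifierSet

def check_no_conflict(candidate, constraints):
    """Fail if the chosen version violates any constraint collected so far."""
    violated = [who for who, spec in constraints.items()
                if candidate not in spec]
    if violated:
        raise RuntimeError(
            f"version {candidate} conflicts with constraints from {violated}")
    return candidate

# "first found wins" would pick 1.8 from the first requirement alone;
# fail-on-conflict re-checks against everything collected so far.
constraints = {"A": SpecifierSet(">=1.4"), "B": SpecifierSet(">=1.6,<1.7")}
try:
    check_no_conflict("1.8", constraints)
except RuntimeError as e:
    print(e)
```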
the PyPA site has a PEP reference that includes details on implementation:
https://www.pypa.io/en/latest/peps
I don't think we need another reference in the Packaging User Guide
(PUG). We could mention that the PyPA one exists in the PUG.
As for user-facing PEP docs, I think the docs for P
On Wed, Apr 15, 2015 at 8:56 AM, Robin Becker wrote:
> On 15/04/2015 16:49, Marcus Smith wrote:
> ..
>
>>
>> agreed on the warning, but there is a documented workaround for this, that
>> is to put the desired constraint for C at level 0 (i.e. in the in
> level 0 A
> A level 1 1.4<= C
>
>
> level 0 B
> B level 1 1.6<= C <1.7
>
> pip manages to download version 1.8 of C(Django) using A's requirement,
> but never even warns us that the B requirement of C was violated. Surely
> even in the absence of a resolution pip could raise a warning at th
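The violation in that example can be checked mechanically. A sketch with the modern `packaging` library (not something pip did at the time; the version numbers are the ones from the example above):

```python
from packaging.specifiers import SpecifierSet

# pip picked Django 1.8 via A's constraint, silently violating B's.
assert "1.8" in SpecifierSet(">=1.4")           # A: 1.4 <= C  -- satisfied
assert "1.8" not in SpecifierSet(">=1.6,<1.7")  # B: 1.6 <= C < 1.7 -- violated
print("B's constraint is violated; at minimum a warning seems warranted")
```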
> So you *can* import things inside of a setup.py today, you just have to
I think it's time for the Packaging User Guide to try to cover
"setup_requires"...
> For instance, if the problem is "when setuptools does the install, then
> things
> get installed differently, with different options, SSL certs, proxies, etc"
> then I think a better solution is that pip does terrible hacks in order to
> forcibly take control of setup_requires from setuptools and
> I don’t think that an artificial limit on the number of issues or pull
> requests is a good path forward.
>
I would say "reasonable" limit, not "artificial". : )
It's a simple way to balance the efforts towards project maintenance.
> I feel like if someone doesn’t want to close issues for
On Thu, Mar 5, 2015 at 10:21 AM, Randy Syring wrote:
>
>
> On 03/05/2015 12:07 PM, Paul Moore wrote:
>
> It seems to me that there is another point that delays progress on a
> certain proportion of PRs, specifically feature requests, namely that
> no-one really has that strong an opinion on whet
>
> That implies closing 183 issues and 65 PRs from where we are now. And
> when you say "adding features" presumably that means somehow
> forbidding people (core devs? we can't forbid anyone else...) from
> creating new PRs until we're below the limit.
>
> In general, I don't think it's practical
> Currently there are no labels at all for any issue or PR.
>
there are labels https://github.com/pypa/pip/labels
I put most of these last year.
So I guess my suggestions boil down to:
>
> - Add more humans
> - Add more money to make humans more efficient
> - Add more computer automation
>
maybe agree to always maintain < X open issues and < Y open PRs, before
adding features.
where X and Y can vary as needed, but for starters, X=250 and Y=25
In 90% of the cases I see, requirements.txt are used to define the
> requirements for the project to function which typically are the exact
> same requirements necessary when installing the project. People also
> will then write a test-requirements.txt (or dev-requirements.txt) file
> to have a co
> In general, requirements.txt seems to be an
> anti-pattern. You either have to use likely to break tooling or you'll
> have to reinvent that from scratch. You're better off putting it
> directly in setup.py and using setup.py to install dependencies in a
> virtualenv instead of requirements.txt
>
> As a possible interim approach to improving the situation, what do you
> think of my writing up a "Binary distribution for Linux" advanced
> topic? That could cover not only containers, but also the technique of
> "bundle a /opt virtualenv in a platform binary package" as well as
> actually creat
> I agree that from an implementation perspective, this could just be a
> new recommended URL in the project URLs metadata (e.g. "Reference
> Container Images"). If folks don't think the idea sounds horrible,
> I'll make that update to the PEP 459 draft.
>
wouldn't this just be a use case for a cu
+1 to a pypa-announce list.
I personally care more about the list than twitter.
at this point, probably need to post the idea to pypa-dev, and get a few
+1's there, and get someone to agree to execute on the idea.
there's a few people there that don't monitor distutils-sig
Marcus
On Sun, Jan 25,
yes, I see a similar bug on Android.
If I try to scroll up a *little*, it goes all the way back to the top.
On Fri, Jan 2, 2015 at 7:39 AM, Chris Jerdonek wrote:
> Sorry if this isn't the best list on which to bring this up, but it came
> up for me during the recent PEP 440 discussions.
>
> For a wh
how about just pypa/peps for the name?
On Sun, Dec 28, 2014 at 10:16 PM, Donald Stufft wrote:
>
> On Dec 29, 2014, at 12:01 AM, Nick Coghlan wrote:
>
> * Donald is going to put together a PR to update the interoperability spec
> to match the semantics of his proposed change to the packaging li
> >
> > * 1.7.1 matches >1.7 (previously it did not)
>
> This sounds like a straight up bug fix in the packaging module to me - the
> PEP 440 zero padding should apply to *all* checks, not just to equality
> checks, as you can't sensibly compare release segments with different
> numbers of elements
>
> It gives me a minor bit of pause. However any alternative that I can come
> up
> with bothers me more, especially since I don't believe many people actually
> even *use* a bare > and any alternative I can come up with has worse
> behavior
> for operators which get much more use.
>
what about m
>
>
> The equality operator pads zeros, >= has the equality operator as part of
> it.
>
right, but that's not really an answer.
Does the conceptual inconsistency faze you at all?
>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>
>
You mean "1.7.*" right? Because 1.70 would satisfy >1.7
>
yes
I'm starting a new thread to state cleanly what my current question/concern
is...
per PEP440 as I understand it:
- for ">1.7", "1.7" means roughly the "1.7 series" or "1.7.*"
- for ">=1.7", "1.7" means literally "1.7" (with zero-padding as needed)
While I understand the motivation for the "series"
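The distinction can be demonstrated with the modern `packaging` library, which implements PEP 440 semantics (including the ">1.7 matches 1.7.1" fix discussed upthread); this is a sketch of the behavior, not part of the original discussion:

```python
from packaging.version import Version
from packaging.specifiers import SpecifierSet

# Ordering: a dev release sorts before its release, which sorts before 1.7.1.
assert Version("1.7.dev1") < Version("1.7") < Version("1.7.1")

# So 1.7.dev1 can never satisfy >=1.7 ...
assert "1.7.dev1" not in SpecifierSet(">=1.7")
# ... while 1.7.1 does satisfy >1.7 under the rules as implemented.
assert "1.7.1" in SpecifierSet(">1.7")
```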
> > but look at this (using setuptools 8)
> >
> '1.7.dev1' in pkg_resources.Requirement.parse('foo>=1.7')
> > False
> '1.7.dev1' in pkg_resources.Requirement.parse('foo<=1.7')
> > True
>
> I believe the first example is a consequence of the following two
> excerpts from the PEP [...] In b
> > the problem with thinking of it this way is that you naturally want to
> > extend the concept to >=, but it doesn't work.
> > If the concept were consistent, 1.7.dev1 would satisfy >=1.7, but it
> > doesn't.
>
> I'm pretty sure it's consistent. For example, "1.7.2" doesn't satisfy
> ">1.7",
Thinking of ">1.7" as "greater than the 1.7 series" sort of helps me as
> well...
>
>
the problem with thinking of it this way is that you naturally want to
extend the concept to >=, but it doesn't work.
If the concept were consistent, 1.7.dev1 would satisfy >=1.7, but it
doesn't.
for >=, the co
>
> A note about terminology here (both in this email and The Packaging User
> Guide) -- it seems to me that install_requires is about requirements for a
> "package" not a "project",
>
well, read through the PyPUG glossary:
https://packaging.python.org/en/latest/glossary.html
a "project" is anyth
> It *should* be being kept in sync with the published versions, but Donald
> and I have just been working directly in the main PEP repo recently.
>
> Hence my suggestion of moving it to GitHub - it's more likely to be kept
> up to date there, and, unlike the master PEP repo, I'm happy to host the
>
>
> Heh, that’s actually a hold over from before we made specifiers mandatory.
> That needs updated.
>
you mean made operators mandatory?
so the bit about "version identifier without any comparison operator" needs
to be removed?
>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C
just post edit suggestions here?
https://bitbucket.org/pypa/pypi-metadata-formats is not up to date anymore
some other location?
On Tue, Dec 23, 2014 at 1:01 AM, Donald Stufft wrote:
>
> On Dec 23, 2014, at 1:23 AM, Marcus Smith wrote:
>
>
>> In particular, <, >
> git+https://url_to_the_repo.git#egg=name_of_package
>
> why isn't that "wheel=name_of_package"
>
the "egg" part here has nothing to do with eggs. just a vestige of another
time.
see https://github.com/pypa/pip/issues/1265
and will it work if setuptools was not used in the package's setup.py?
>
>
> In particular, <, >, ~=, and, when using a .*, the != and == use the
> number of dots in the given specifier to indicate the precision of the
> specifier.
>
the PEP text is pretty clear on the precision concept for ~= and when using
".*", but not so much for < and >.
how about an example rig
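In the meantime, here's roughly what such an example could look like, using the modern `packaging` library to show how the number of dots sets the precision (a sketch of the behavior, not text from the PEP):

```python
from packaging.specifiers import Specifier

# ==1.7.* pins to the 1.7 series; the dots in the specifier set the precision.
assert Specifier("==1.7.*").contains("1.7.3")
assert not Specifier("==1.7.*").contains("1.8.0")

# ~=1.7.3 means >=1.7.3, ==1.7.*: compatible releases within the 1.7 series.
assert Specifier("~=1.7.3").contains("1.7.9")
assert not Specifier("~=1.7.3").contains("1.8.0")
```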
oh, "A compatible release clause consists of either a version identifier
without any comparison operator or else the compatible release operator ~="
On Mon, Dec 22, 2014 at 9:57 PM, Marcus Smith wrote:
> the first 3 examples here are confusing.
> https://www.python.org/d