Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Antoine Pitrou
On Sat, 29 Nov 2014 01:59:06 +
Nathaniel Smith n...@pobox.com wrote:
 
 Option 1: Make it possible to change the type of a module object
 in-place, so that we can write something like
 
sys.modules[__name__].__class__ = MyModuleSubclass
 
 Option 1 downside: The invariants required to make __class__
 assignment safe are complicated, and only implemented for
 heap-allocated type objects. PyModule_Type is not heap-allocated, so
 making this work would require lots of delicate surgery to
 typeobject.c. I'd rather not go down that rabbit-hole.

Option 1b: have __class__ assignment delegate to a tp_classassign slot
on the old class, so that typeobject.c doesn't have to be cluttered with
many special cases.

 Option 3: Make it legal to assign to the __dict__ attribute of a
 module object, so that we can write something like
 
new_module = MyModuleSubclass(...)
new_module.__dict__ = sys.modules[__name__].__dict__
sys.modules[__name__].__dict__ = {} # ***
sys.modules[__name__] = new_module
 
[...]
 
 Option 4: Add a new function sys.swap_module_internals, which takes
 two module objects and swaps their __dict__ and other attributes. By
 making the operation a swap instead of an assignment, we avoid the
 lifecycle pitfalls from Option 3. By making it a builtin, we can make
 sure it always handles all the module fields that matter, not just
 __dict__. Usage:

How do these two options interact with the fact that module functions
store their globals dict, not the module itself?
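
Concretely, the fact being referred to (a minimal sketch, not code from the
thread):

    import types

    m = types.ModuleType("m")
    exec("def f(): pass", m.__dict__)
    # The function holds a reference to the dict, not to the module object:
    print(m.f.__globals__ is m.__dict__)   # True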

Regards

Antoine.




Re: [Python-Dev] Move selected documentation repos to PSF BitBucket account?

2014-11-29 Thread Nick Coghlan
On 29 November 2014 at 03:34, Demian Brecht demianbre...@gmail.com wrote:
 On Tue, Nov 25, 2014 at 6:52 AM, Brett Cannon br...@python.org wrote:

 I suspect if we make sure we add Bitbucket and GitHub login support to the 
 issue tracker then that would go a fair distance to helping with the pull of 
 GitHub's reach (and if we make it so people can simply paste their fork's 
 URL into the issue tracker and we grab a new patch for review, that would 
 go even farther).

 Chiming in horribly late, so hopefully this hasn't already been
 mentioned (I've only loosely been following this thread).

 In addition to the login support (I'm not sold on how much that would
 help the reach), I think it would be really beneficial to have some
 documentation on either emulating git-style workflow in hg or
 detailing a git fork workflow while working on multiple patches
 concurrently and keeping master in sync with hg default (or perhaps
 even both).

As far as I'm aware, the easiest way to do that is by using git-remote-hg
to treat the CPython Mercurial repository as a git remote (although
I've never tried it myself).
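
With git-remote-hg installed, that boils down to something like
``git clone hg::https://hg.python.org/cpython`` (untested here, as noted;
the ``hg::`` prefix is how git-remote-hg marks a Mercurial URL).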

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Nick Coghlan
On 29 November 2014 at 21:32, Antoine Pitrou solip...@pitrou.net wrote:
 On Sat, 29 Nov 2014 01:59:06 +
 Nathaniel Smith n...@pobox.com wrote:

 Option 1: Make it possible to change the type of a module object
 in-place, so that we can write something like

sys.modules[__name__].__class__ = MyModuleSubclass

 Option 1 downside: The invariants required to make __class__
 assignment safe are complicated, and only implemented for
 heap-allocated type objects. PyModule_Type is not heap-allocated, so
 making this work would require lots of delicate surgery to
 typeobject.c. I'd rather not go down that rabbit-hole.

 Option 1b: have __class__ assignment delegate to a tp_classassign slot
 on the old class, so that typeobject.c doesn't have to be cluttered with
 many special cases.

Aye, being able to hook class switching could potentially be useful
(including the ability to just disallow it entirely if you really
wanted to do that).

 Option 3: Make it legal to assign to the __dict__ attribute of a
 module object, so that we can write something like

new_module = MyModuleSubclass(...)
new_module.__dict__ = sys.modules[__name__].__dict__
sys.modules[__name__].__dict__ = {} # ***
sys.modules[__name__] = new_module

 [...]

 Option 4: Add a new function sys.swap_module_internals, which takes
 two module objects and swaps their __dict__ and other attributes. By
 making the operation a swap instead of an assignment, we avoid the
 lifecycle pitfalls from Option 3. By making it a builtin, we can make
 sure it always handles all the module fields that matter, not just
 __dict__. Usage:

 How do these two options interact with the fact that module functions
 store their globals dict, not the module itself?

Right, that's the part I consider the most challenging with
metamodules - the fact that there's a longstanding assumption that a
module is just a dictionary with some metadata, so the interpreter
is inclined to treat them that way.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 479 and asyncio

2014-11-29 Thread Olemis Lang
On 11/28/14, Guido van Rossum gu...@python.org wrote:
[...]

 @Olemis: You never showed examples of how your code would be used, so it's
 hard to understand what you're trying to do and how PEP 479 affects you.


The intention is not to restart the debate. The PEP is approved, it's
done ... but ...

<comment>
As a side-effect, beware of the consequences: it is a fact that
performance will be degraded (under certain circumstances) due to
either a chain of (SI = StopIteration)

raise SI => except SI: return => raise SI => ...

... or a few other similar cases which I will not describe, for the
sake of not repeating myself and of being brief.
</comment>
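
The shape being alluded to, in its simplest form (a sketch, not code from
the thread):

    def source():
        yield 1
        # falling off the end raises StopIteration

    def relay(it):
        while True:
            try:
                yield next(it)
            except StopIteration:
                return      # which re-raises StopIteration to *our* caller

    # Each added relay() layer adds one raise/except/return hop at shutdown:
    for x in relay(relay(source())):
        print(x)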

-- 
Regards,

Olemis - @olemislc

Apache(tm) Bloodhound contributor
http://issues.apache.org/bloodhound
http://blood-hound.net

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/



Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Mark Shannon

On 29/11/14 01:59, Nathaniel Smith wrote:

Hi all,


[snip]


Option 3: Make it legal to assign to the __dict__ attribute of a
module object, so that we can write something like

new_module = MyModuleSubclass(...)
new_module.__dict__ = sys.modules[__name__].__dict__
sys.modules[__name__].__dict__ = {} # ***
sys.modules[__name__] = new_module



Why does MyModuleSubclass need to subclass types.ModuleType?
Modules have no special behaviour, apart from the inability to write
to their __dict__ attribute, which is the very thing you don't want.

If it quacks like a module...

Cheers,
Mark.


Re: [Python-Dev] PEP 479 and asyncio

2014-11-29 Thread Nick Coghlan
On 30 November 2014 at 02:45, Olemis Lang ole...@gmail.com wrote:
 On 11/28/14, Guido van Rossum gu...@python.org wrote:
 [...]

 @Olemis: You never showed examples of how your code would be used, so it's
 hard to understand what you're trying to do and how PEP 479 affects you.


 The intention is not to restart the debate. The PEP is approved, it's
 done ... but ...

 <comment>
 As a side-effect, beware of the consequences: it is a fact that
 performance will be degraded (under certain circumstances) due to
 either a chain of (SI = StopIteration)

 raise SI => except SI: return => raise SI => ...

 ... or a few other similar cases which I will not describe, for the
 sake of not repeating myself and of being brief.
 </comment>

Guido wrote a specific micro-benchmark for that case in one of the
other threads. On his particular system, the overhead was around 150
ns per link in the chain at the point the data processing pipeline was
shut down. In most scenarios where a data processing pipeline is worth
setting up in the first place, the per-item handling costs (which
won't change) are likely to overwhelm the shutdown costs (which will
get marginally slower).
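
Something along these lines reproduces the effect (a sketch of the kind of
measurement involved, not Guido's actual script):

    import timeit

    def chain(n):
        def source():
            yield 1
        g = source()
        for _ in range(n):
            g = (x for x in g)  # each layer relays StopIteration at shutdown
        return g

    for n in (1, 10, 100):
        t = timeit.timeit("list(chain(%d))" % n,
                          setup="from __main__ import chain", number=10000)
        print(n, t)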

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 479 and asyncio

2014-11-29 Thread Guido van Rossum
On Sat, Nov 29, 2014 at 9:07 AM, Nick Coghlan ncogh...@gmail.com wrote:

 Guido wrote a specific micro-benchmark for that case in one of the
 other threads. On his particular system, the overhead was around 150
 ns per link in the chain at the point the data processing pipeline was
 shut down. In most scenarios where a data processing pipeline is worth
 setting up in the first place, the per-item handling costs (which
 won't change) are likely to overwhelm the shutdown costs (which will
 get marginally slower).


If I hadn't written that benchmark I wouldn't recognize what you're talking
about here. :-) This is entirely off-topic, but if I didn't know it was
about one generator calling next() to iterate over another generator, I
wouldn't have understood what pattern you refer to as a data processing
pipeline. And I still don't understand how the try/except *setup* cost
became *shut down* cost of the pipeline. But that doesn't matter, since the
number of setups equals the number of shut downs.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Nathaniel Smith
On Sat, Nov 29, 2014 at 4:21 AM, Guido van Rossum gu...@python.org wrote:
 Are these really all our options? All of them sound like hacks, none of them
 sound like anything the language (or even the CPython implementation) should
 sanction. Have I missed the discussion where the use cases and constraints
 were analyzed and all other approaches were rejected? (I might have some
 half-baked ideas, but I feel I should read up on the past discussion first,
 and they are probably more fit for python-ideas than for python-dev. Plus
 I'm just writing this email because I'm procrastinating on the type hinting
 PEP. :-)

The previous discussions I was referring to are here:
  http://thread.gmane.org/gmane.comp.python.ideas/29487/focus=29555
  http://thread.gmane.org/gmane.comp.python.ideas/29788

There might well be other options; these are just the best ones I
could think of :-). The constraints are pretty tight, though:
- The new module object (whatever it is) should have a __dict__ that
aliases the original module globals(). I can elaborate on this if my
original email wasn't enough, but hopefully it's obvious that making
two copies of the same namespace and then trying to keep them in sync
at the very least smells bad :-).
- The new module object has to be a subtype of ModuleType, b/c there
are lots of places that do isinstance(x, ModuleType) checks (notably
-- but not only -- reload()). Since a major goal here is to make it
possible to do cleaner deprecations, it would be really unfortunate if
switching an existing package to use the metamodule support itself
broke things :-).
- Lookups in the normal case should have no additional performance
overhead, because module lookups are extremely extremely common. (So
this rules out dict proxies and tricks like that -- we really need
'new_module.__dict__ is globals()' to be true.)

AFAICT there are three logically possible strategies for satisfying
that first constraint:
(a) convert the original module object into the type we want, in-place
(b) create a new module object that acts like the original module object
(c) somehow arrange for our special type to be used from the start

My options 1 and 2 are means of accomplishing (a), and my options 3
and 4 are means of accomplishing (b) while working around the
behavioural quirks of module objects (as required by the second
constraint).

The python-ideas thread did also consider several methods of
implementing strategy (c), but they're messy enough that I left them
out here. The problem is that somehow we have to execute code to
create the new subtype *before* we have an entry in sys.modules for
the package that contains the code for the subtype. So one option
would be to add a new rule, that if a file pkgname/__new__.py exists,
then this is executed first and is required to set up
sys.modules[pkgname] before we exec pkgname/__init__.py. So
pkgname/__new__.py might look like:

import sys
from pkgname._metamodule import MyModuleSubtype
sys.modules[__name__] = MyModuleSubtype(__name__, docstring)

This runs into a lot of problems though. To start with, the 'from
pkgname._metamodule ...' line is an infinite loop, b/c this is the
code used to create sys.modules[pkgname]. It's not clear where the
globals dict for executing __new__.py comes from (who defines
__name__? Currently that's done by ModuleType.__init__). It only works
for packages, not modules. The need to provide the docstring here,
before __init__.py is even read, is weird. It adds extra stat() calls
to every package lookup. And, the biggest showstopper IMHO: AFAICT
it's impossible to write a polyfill to support this code on old python
versions, so it's useless to any package which needs to keep
compatibility with 2.7 (or even 3.4). Sure, you can backport the whole
import system like importlib2, but telling everyone that they need to
replace every 'import numpy' with 'import importlib2; import numpy' is
a total non-starter.

So, yeah, those 4 options are really the only plausible ones I know of.

Option 1 and option 3 are pretty nice at the language level! Most
Python objects allow assignment to __class__ and __dict__, and both
PyPy and Jython at least do support __class__ assignment. Really the
only downside with Option 1 is that actually implementing it requires
attention from someone with deep knowledge of typeobject.c.

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Nathaniel Smith
On Sat, Nov 29, 2014 at 11:32 AM, Antoine Pitrou solip...@pitrou.net wrote:
 On Sat, 29 Nov 2014 01:59:06 +
 Nathaniel Smith n...@pobox.com wrote:

 Option 1: Make it possible to change the type of a module object
 in-place, so that we can write something like

sys.modules[__name__].__class__ = MyModuleSubclass

 Option 1 downside: The invariants required to make __class__
 assignment safe are complicated, and only implemented for
 heap-allocated type objects. PyModule_Type is not heap-allocated, so
 making this work would require lots of delicate surgery to
 typeobject.c. I'd rather not go down that rabbit-hole.

 Option 1b: have __class__ assignment delegate to a tp_classassign slot
 on the old class, so that typeobject.c doesn't have to be cluttered with
 many special cases.

I'm intrigued -- how would this help?

I have a vague impression that one could add another branch to
object_set_class that went something like

if at least one of the types is a subtype of the other type, and the
subtype is a heap type with tp_dealloc == subtype_dealloc, and the
subtype doesn't add any important slots, and ... then the __class__
assignment is legal.

(This is taking advantage of the fact that if you don't have any extra
slots added, then subtype_dealloc just basically defers to the base
type's tp_dealloc, so it doesn't really matter which one you end up
calling.)

And my vague impression is that there isn't really anything special
about the module type that would allow a tp_classassign function to
simplify this logic.

But these are just vague impressions :-)

 Option 3: Make it legal to assign to the __dict__ attribute of a
 module object, so that we can write something like

new_module = MyModuleSubclass(...)
new_module.__dict__ = sys.modules[__name__].__dict__
sys.modules[__name__].__dict__ = {} # ***
sys.modules[__name__] = new_module

 [...]

 Option 4: Add a new function sys.swap_module_internals, which takes
 two module objects and swaps their __dict__ and other attributes. By
 making the operation a swap instead of an assignment, we avoid the
 lifecycle pitfalls from Option 3. By making it a builtin, we can make
 sure it always handles all the module fields that matter, not just
 __dict__. Usage:

 How do these two options interact with the fact that module functions
 store their globals dict, not the module itself?

I think that's totally fine? The whole point of all these proposals is
to make sure that the final module object does in fact have the
correct globals dict.

~$ git clone g...@github.com:njsmith/metamodule.git
~$ cd metamodule
~/metamodule$ python3.4
>>> import examplepkg
>>> examplepkg
<FancyModule 'examplepkg' from '/home/njs/metamodule/examplepkg/__init__.py'>
>>> examplepkg.f.__globals__ is examplepkg.__dict__
True

If anything this is another argument for why we NEED something like this :-).

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Mark Shannon


On 29/11/14 19:37, Nathaniel Smith wrote:

[snip]


- The new module object has to be a subtype of ModuleType, b/c there
are lots of places that do isinstance(x, ModuleType) checks (notably


It has to be a *subtype*; it does not need to be a *subclass*:


>>> class M:
...     __class__ = ModuleType
...
>>> isinstance(M(), ModuleType)
True

Cheers,
Mark.


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Antoine Pitrou
On Sat, 29 Nov 2014 20:02:50 +
Nathaniel Smith n...@pobox.com wrote:
 
  Option 1b: have __class__ assignment delegate to a tp_classassign slot
  on the old class, so that typeobject.c doesn't have to be cluttered with
  many special cases.
 
 I'm intrigued -- how would this help?

It would allow ModuleType to override tp_classassign to decide whether
and how __class__ assignment on a module instance is allowed to work.
So typeobject.c needn't know about any specifics of ModuleType or any
other type.
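
In Python-level terms the dispatch might look roughly like this (purely
hypothetical; __classassign__ stands in for the C-level tp_classassign
slot, which does not exist today):

    def assign_class(obj, new_cls):
        hook = getattr(type(obj), "__classassign__", None)
        if hook is not None:
            return hook(obj, new_cls)   # the old class decides whether/how
        # otherwise fall back to the generic typeobject.c checks
        object.__setattr__(obj, "__class__", new_cls)

    class NoSwitch:
        @classmethod
        def __classassign__(cls, obj, new_cls):
            # the "just disallow it entirely" case Nick mentioned
            raise TypeError("class assignment disallowed")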

  How do these two options interact with the fact that module functions
  store their globals dict, not the module itself?
 
 I think that's totally fine? The whole point of all these proposals is
 to make sure that the final module object does in fact have the
 correct globals dict.
 
 ~$ git clone g...@github.com:njsmith/metamodule.git

Ok, I see. The code hacks up the new module to take ownership of the
old module's __dict__. That doesn't look very clean to me.

Regards

Antoine.


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Greg Ewing

Nathaniel Smith wrote:


Option 4: Add a new function sys.swap_module_internals, which takes
two module objects and swaps their __dict__ and other attributes. By
making the operation a swap instead of an assignment, we avoid the
lifecycle pitfalls from Option 3.


Didn't I see somewhere that module dicts are not being
cleared on shutdown any more? If so, then the lifetime
problem mentioned here no longer exists.

--
Greg


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Greg Ewing

Guido van Rossum wrote:
Are these really all our options? All of them sound like hacks, none of 
them sound like anything the language (or even the CPython 
implementation) should sanction.


If assignment to the __class__ of a module were permitted
(by whatever means) then you could put this in a module:

    class __class__(types.ModuleType):
        ...

which makes it look almost like a deliberate language
feature. :-)

Seriously, of the options presented, I think that allowing
__class__ assignment is the most elegant, since it solves
a lot of problems in one go without introducing any new
features -- just removing a restriction that prevents an
existing language mechanism from working in this case.
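
In the more explicit spelling, a module could opt in with nothing but
existing syntax (a sketch of what Option 1 would permit; today CPython
rejects the final line):

    import sys, types

    class _Module(types.ModuleType):
        @property
        def answer(self):     # e.g. a lazily computed attribute
            return 42

    sys.modules[__name__].__class__ = _Module  # the currently-forbidden step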

--
Greg


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Ionel Cristian Mărieș
What if we had metaclass semantics on module creation?

E.g., suppose the default:

  __metaclass__ = ModuleType

What if Python would support __prepare__ for modules?
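
For comparison, the existing class-level machinery (PEP 3115) looks like
this; the question is whether modules could grow an analogous hook:

    class Meta(type):
        @classmethod
        def __prepare__(mcls, name, bases, **kwargs):
            return dict()   # the namespace the class body will execute in

    class C(metaclass=Meta):
        pass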


Thanks,
-- Ionel M.

On Sat, Nov 29, 2014 at 11:36 PM, Greg Ewing greg.ew...@canterbury.ac.nz
wrote:

 Guido van Rossum wrote:

 Are these really all our options? All of them sound like hacks, none of
 them sound like anything the language (or even the CPython implementation)
 should sanction.


 If assignment to the __class__ of a module were permitted
 (by whatever means) then you could put this in a module:

     class __class__(types.ModuleType):
         ...

 which makes it look almost like a deliberate language
 feature. :-)

 Seriously, of the options presented, I think that allowing
 __class__ assignment is the most elegant, since it solves
 a lot of problems in one go without introducing any new
 features -- just removing a restriction that prevents an
 existing language mechanism from working in this case.

 --
 Greg


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Greg Ewing

Nathaniel Smith wrote:

So pkgname/__new__.py might look like:

import sys
from pkgname._metamodule import MyModuleSubtype
sys.modules[__name__] = MyModuleSubtype(__name__, docstring)

To start with, the 'from
pkgname._metamodule ...' line is an infinite loop,


Why does MyModuleSubtype have to be imported from pkgname?
It would make more sense for it to be defined directly in
__new__.py, wouldn't it? Isn't the purpose of separating
stuff out into __new__.py precisely to avoid circularities
like that?

--
Greg


Re: [Python-Dev] Move selected documentation repos to PSF BitBucket account?

2014-11-29 Thread Ethan Furman
On 11/28/2014 09:34 AM, Demian Brecht wrote:
 
 I primarily use git for development. Having little or no effort to
 context switch to work on CPython in any capacity (PEPs, code, etc)
 would be hugely beneficial for me. Having a well defined workflow in
 the docs (perhaps alongside "Lifecycle of a patch"?) would have
 significantly lowered the initial barrier of entry for me. Given the
 amount of time I put into that initially, I can only imagine how many
 people it's entirely turned away from contributing. I'd definitely be
 interested in contributing documentation around this (I've written up
 something similar here
 http://demianbrecht.github.io/vcs/2014/07/31/from-git-to-hg/) if
 others feel that it would be valuable.

That would be very valuable, thank you.

--
~Ethan~





Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Petr Viktorin
On Sat, Nov 29, 2014 at 8:37 PM, Nathaniel Smith n...@pobox.com wrote:
[...]
 The python-ideas thread did also consider several methods of
 implementing strategy (c), but they're messy enough that I left them
 out here. The problem is that somehow we have to execute code to
 create the new subtype *before* we have an entry in sys.modules for
 the package that contains the code for the subtype. So one option
 would be to add a new rule, that if a file pkgname/__new__.py exists,
 then this is executed first and is required to set up
 sys.modules[pkgname] before we exec pkgname/__init__.py. So
 pkgname/__new__.py might look like:

 import sys
 from pkgname._metamodule import MyModuleSubtype
 sys.modules[__name__] = MyModuleSubtype(__name__, docstring)

 This runs into a lot of problems though. To start with, the 'from
 pkgname._metamodule ...' line is an infinite loop, b/c this is the
 code used to create sys.modules[pkgname].

As Greg Ewing said – you don't want to import from the package whose
metamodule you're defining. You'd want to do as little work as
possible in __new__.py.

I'd use something like this:

    import types

    class __metamodule__(types.ModuleType):
        def __call__(self):
            return self.main()

where Python would get the attribute __metamodule__ from __new__.py,
and use `__metamodule__(name, doc)` as the thing to execute __init__.py
in.

 It's not clear where the
 globals dict for executing __new__.py comes from (who defines
 __name__? Currently that's done by ModuleType.__init__).

Well, it could still be in __metamodule__.__init__().

 It only works for packages, not modules.

I don't see a need for this treatment for modules in a package – if
you want `from mypkg import callme`, you can make callme a function
rather than a callable module. If you *also* want `from mypkg.callme
import something_else`, I say you should split callme into two
differently named things; names are cheap inside a package.
If really needed, modules in a package can use an import hook defined
in the package, or be converted to subpackages.
Single-module projects would be left out, yes – but those can be
simply converted to a package.

 The need to provide the docstring here,
 before __init__.py is even read, is weird.

Does it have to be before __init__.py is read? Can't __init__.py be
compiled beforehand, to get __doc__, and only *run* in the new
namespace?
(Or should __new__.py define import hooks that say how __init__.py
should be loaded/compiled? I don't see a case for that.)
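
That part is doable today; a sketch (the path is hypothetical):

    import ast

    with open("examplepkg/__init__.py") as f:
        tree = ast.parse(f.read())
    doc = ast.get_docstring(tree)   # None if there is no docstring
    # Compile now, run later in whatever namespace the metamodule provides:
    code = compile(tree, "examplepkg/__init__.py", "exec")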

 It adds extra stat() calls to every package lookup.

Fair.

 And, the biggest showstopper IMHO: AFAICT
 it's impossible to write a polyfill to support this code on old python
 versions, so it's useless to any package which needs to keep
 compatibility with 2.7 (or even 3.4). Sure, you can backport the whole
 import system like importlib2, but telling everyone that they need to
 replace every 'import numpy' with 'import importlib2; import numpy' is
 a total non-starter.

I'm probably missing something obvious, but where would this not work?
- As the first thing it does, __init__.py imports the polyfill and
calls polyfill(__name__)
- The polyfill, if running non-recursively* under old Python:
-- compiles __init__.py
-- imports __new__.py to get __metamodule__
-- instantiates metamodule with name, and docstring from compiled code
-- * remembers the instance, to check for recursion later
-- puts it in sys.modules
-- execs __init__ in it
- afterwards the original __init__.py execution continues, filling up
a now-unused module's namespace
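
Roughly, under those assumptions (all names hypothetical, error handling
omitted), the polyfill could look like:

    import os.path
    import sys

    _in_progress = set()

    def polyfill(name):
        if name in _in_progress:
            return                    # recursive call from our own exec() below
        orig = sys.modules[name]
        pkg_dir = os.path.dirname(orig.__file__)
        init_path = os.path.join(pkg_dir, "__init__.py")
        with open(init_path) as f:
            code = compile(f.read(), init_path, "exec")
        ns = {}
        with open(os.path.join(pkg_dir, "__new__.py")) as f:
            exec(f.read(), ns)
        meta = ns["__metamodule__"](name, orig.__doc__)
        _in_progress.add(name)
        try:
            sys.modules[name] = meta  # replace before re-running the body
            exec(code, meta.__dict__)
        finally:
            _in_progress.discard(name)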


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Steven D'Aprano
On Sun, Nov 30, 2014 at 11:07:57AM +1300, Greg Ewing wrote:
 Nathaniel Smith wrote:
 So pkgname/__new__.py might look like:
 
 import sys
 from pkgname._metamodule import MyModuleSubtype
 sys.modules[__name__] = MyModuleSubtype(__name__, docstring)
 
 To start with, the 'from
 pkgname._metamodule ...' line is an infinite loop,
 
 Why does MyModuleSubtype have to be imported from pkgname?
 It would make more sense for it to be defined directly in
 __new__.py, wouldn't it? Isn't the purpose of separating
 stuff out into __new__.py precisely to avoid circularities
 like that?

Perhaps I'm missing something, but won't that imply that every module 
which wants to use a special module type has to re-invent the wheel?

If this feature is going to be used, I would expect to be able to re-use 
pre-written module types. E.g. having written a "module with properties" 
(so to speak) once, I can just import it and use it in my next project.


-- 
Steven


[Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft
As promised in the "Move selected documentation repos to PSF BitBucket
account?" thread I've written up a PEP for moving selected repositories from
hg.python.org to Github.

You can see this PEP online at: https://www.python.org/dev/peps/pep-0481/

I've also reproduced the PEP below for inline discussion.

---

Abstract
========

This PEP proposes migrating to Git and Github for certain supporting
repositories (such as the repository for Python Enhancement Proposals) in a way
that is more accessible to new contributors, and easier to manage for core
developers. This is offered as an alternative to PEP 474, which aims to achieve
the same overall benefits while continuing to use the Mercurial DVCS and
without relying on a commercial entity.

In particular this PEP proposes changes to the following repositories:

* https://hg.python.org/devguide/
* https://hg.python.org/devinabox/
* https://hg.python.org/peps/


This PEP does not propose any changes to the core development workflow for
CPython itself.


Rationale
=========

As PEP 474 mentions, there are currently a number of repositories hosted on
hg.python.org which are not directly used for the development of CPython, but
instead are supporting or ancillary repositories. These supporting repositories
do not typically have complex workflows, and often have no branches at all other
than the primary integration branch. This simplicity makes them very good
targets for the Pull Request workflow that is commonly found on sites like
Github.

However, where PEP 474 wants to continue to use Mercurial and wishes to use an
OSS, self-hosted solution (and therefore restricts itself to only those
options), this PEP expands the scope to include migrating to Git and using
Github.

The existing method of contributing to these repositories generally involves
generating a patch and either uploading it to bugs.python.org or emailing
it to p...@python.org. This process is unfriendly towards non-committer
contributors, as well as making the process harder than it needs to be for
committers to accept the patches sent by users. In addition to the benefits
of the pull request workflow itself, this style of workflow also enables
non-technical contributors, especially those who do not know their way around
the DVCS of choice, to contribute using the web-based editor. On the committer
side, Pull Requests enable them to tell, before merging, whether or not
a particular Pull Request will break anything. It also enables them to do a
simple push-button merge which does not require them to check out the
changes locally. Another such feature that is useful in particular for docs
is the ability to view a prose diff. This Github-specific feature enables
a committer to view a diff of the rendered output, which will hide things like
reformatting a paragraph and show what the actual meat of the change is.


Why Git?
========

Looking at the variety of DVCSs which are available today, it becomes fairly
clear that git has gotten the vast mindshare of people who are currently using
them. The Open Hub (previously Ohloh) statistics [#openhub-stats]_ show that
currently 37% of the repositories Open Hub is indexing are using git, which is
second only to SVN (which has 48%), while Mercurial has just 2% of the indexed
repositories (beating only Bazaar, which has 1%). In addition to the Open Hub
statistics, a look at the top 100 projects on PyPI (ordered by total download
counts) shows us that within the Python space itself there is a majority of
projects using git:

=== ========= ========== ====== === ====
Git Mercurial Subversion Bazaar CVS None
=== ========= ========== ====== === ====
62  22        7          4      1   1
=== ========= ========== ====== === ====


Choosing a DVCS which has the larger mindshare will make it more likely that any
particular person who has experience with DVCSs at all will be able to
meaningfully use the DVCS that we have chosen without having to learn a new
tool.

In addition to simply making it more likely that any individual will already
know how to use git, the number of projects and people using it means that the
resources for learning the tool are likely to be more fully fleshed out, and
when you run into problems the likelihood that someone else has had that
problem, posted a question, and received an answer is also far higher.

Thirdly, by using a more popular tool you also increase your options for tooling
*around* the DVCS itself. Looking at the various options for hosting
repositories, it's extremely rare to find a hosting solution (whether OSS or
commercial) that supports Mercurial but does not support Git; on the flip side
there are a number of tools which support Git but do not support Mercurial.
Therefore the popularity of git increases the flexibility of our options going
into the future for what toolchain these projects use.

Also, by moving to the more popular DVCS we increase the likelihood that the
knowledge that the person has learned in 

Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Alex Gaynor
Donald Stufft donald at stufft.io writes:


 [words words words]


I strongly support this PEP. I'd like to share two pieces of information. Both
of these are personal anecdotes:

For the past several years, I've been a contributor on two major projects using
mercurial, CPython and PyPy. PyPy has a strong culture of in-repo branching,
basically all contributors regularly make branches in the main repo for their
work, and we're very free in giving people commit rights, so almost everyone
working on PyPy in any way has this level of access. This workflow works ok. I
don't love it as much as git, but it's fine, it's not an impediment to my work.

By contrast, CPython does not have this type of workflow, there are almost no
in-tree branches besides the 2.7, 3.4, etc. ones. Despite being a regular hg
user for years, I have no idea how to create a local-only branch, or a branch
which is pushed to a remote (to use the git term). I also don't generally
commit my own work to CPython, even though I have push privileges, because I
prefer to *always* get code review on my work. As a result, I use a git mirror
of CPython to do all my work, and generate patches from that.

The conclusion I draw from this is that hg's workflow is probably fine if
you're a committer on the project, or don't ever need to maintain multiple
patches concurrently (and thus can just leave everything uncommitted in the
repo). However, the hg workflow seems extremely deficient for non-committer
contributors.

The second experience I have is that of Django's migration to git and github.
For a long time we were on SVN, and we were very resistant to moving to DVCS in
general, and github in particular. Multiple times I said that I didn't see how
exporting a patch and uploading it to trac was more difficult than sending a
pull request. That was very wrong on my part.

My primary observation is not about new contributors though, it's actually
about the behavior of core developers. Before we were on github, it was fairly
rare for core developers to ask for reviews for anything besides *gigantic*
patches, we'd mostly just commit stuff to trunk. Since the switch to github,
I've seen that core developers are *far* more likely to ask for reviews of
their work before merging.

Big +1 from me, thanks for writing this up Donald,
Alex



Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 29, 2014, at 6:27 PM, Donald Stufft don...@stufft.io wrote:
 
 [lots of words]

Just FYI, I’ve pushed an update to the PEP. Nothing major, just some grammatical
fixes and such. The revision is here:
https://hg.python.org/peps/rev/6c6947dbd13f

For whatever it's worth, the person who submitted that patch used Github's
online editor to submit it and made a PR to my personal PEP repository where
I work on my PEPs (https://github.com/dstufft/peps/pull/3). If someone wanted
to see some of the features in action, that is a nice PEP to look at. In
particular, if you hit "Files Changed" then beside the view button is an icon
that looks like a piece of paper, and when you hover over it it'll say "Display
the Rich Diff". Clicking on that you'll see the diff of the rendered output,
which lets you ignore things which were just reflowing the paragraphs and
such.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Petr Viktorin
On Sun, Nov 30, 2014 at 12:05 AM, Steven D'Aprano st...@pearwood.info wrote:
 On Sun, Nov 30, 2014 at 11:07:57AM +1300, Greg Ewing wrote:
 Nathaniel Smith wrote:
 So pkgname/__new__.py might look like:
 
 import sys
 from pkgname._metamodule import MyModuleSubtype
 sys.modules[__name__] = MyModuleSubtype(__name__, docstring)
 
 To start with, the 'from
 pkgname._metamodule ...' line is an infinite loop,

 Why does MyModuleSubtype have to be imported from pkgname?
 It would make more sense for it to be defined directly in
 __new__.py, wouldn't it? Isn't the purpose of separating
 stuff out into __new__.py precisely to avoid circularities
 like that?

 Perhaps I'm missing something, but won't that imply that every module
 which wants to use a special module type has to re-invent the wheel?

 If this feature is going to be used, I would expect to be able to re-use
 pre-written module types. E.g. having written module with properties
 (so to speak) once, I can just import it and use it in my next project.

I expect you'd package the special metamodule class in a stand-alone
package, not directly in the ones that use it.
You could import other packages freely, just the one that you're
currently defining would be unavailable.


Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 29, 2014, at 7:15 PM, Alex Gaynor alex.gay...@gmail.com wrote:
 
 Donald Stufft donald at stufft.io writes:
 
 
 [words words words]
 
 
 I strongly support this PEP. I'd like to share two pieces of information. Both
 of these are personal anecdotes:
 
 For the past several years, I've been a contributor on two major projects 
 using
 mercurial, CPython and PyPy. PyPy has a strong culture of in-repo branching,
 basically all contributors regularly make branches in the main repo for their
 work, and we're very free in giving people commit rights, so almost everyone
 working on PyPy in any way has this level of access. This workflow works ok. I
 don't love it as much as git, but it's fine, it's not an impediment to my 
 work.
 
 By contrast, CPython does not have this type of workflow, there are almost no
 in-tree branches besides the 2.7, 3.4, etc. ones. Despite being a regular hg
 user for years, I have no idea how to create a local-only branch, or a branch
 which is pushed to a remote (to use the git term). I also don't generally
 commit my own work to CPython, even though I have push privileges, because I
 prefer to *always* get code review on my work. As a result, I use a git mirror
 of CPython to do all my work, and generate patches from that.
 
 The conclusion I draw from this is that hg's workflow is probably fine if
 you're a committer on the project, or don't ever need to maintain multiple
 patches concurrently (and thus can just leave everything uncommitted in the
 repo). However, the hg workflow seems extremely deficient for non-committer
 contributors.

I also don’t know how to do this. When I’m doing multiple things for CPython
my “branching” strategy is essentially using hg diff to create a patch file
with my “branch” name (``hg diff > my-branch.patch``), then revert all of my
changes (``hg revert --all --no-backup``), then either work on a new “branch”
or switch to an old “branch” by applying the corresponding patch
(``patch -p1 < other-branch.patch``).

 
 The second experience I have is that of Django's migration to git and github.
 For a long time we were on SVN, and we were very resistant to moving to 
 DVCS in
 general, and github in particular. Multiple times I said that I didn't see how
 exporting a patch and uploading it to trac was more difficult than sending a
 pull request. That was very wrong on my part.
 
 My primary observation is not about new contributors though, it's actually
 about the behavior of core developers. Before we were on github, it was fairly
 rare for core developers to ask for reviews for anything besides *gigantic*
 patches, we'd mostly just commit stuff to trunk. Since the switch to github,
 I've seen that core developers are *far* more likely to ask for reviews of
 their work before merging.

I’ve also seen this effect, not just in Django but that I also notice my own
behavior. Projects where I have commit access but which aren’t on Github I
find myself less likely to look for others to review them and find myself
just committing directly to master/default/trunk while I do tend to look for
reviews and create PRs to give others the chance to review on Github.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Antoine Pitrou
On Sun, 30 Nov 2014 00:15:55 + (UTC)
Alex Gaynor alex.gay...@gmail.com wrote:
 
 The second experience I have is that of Django's migration to git and github.
 For a long time we were on SVN, and we were very resistant to moving to 
 DVCS in
 general, and github in particular. Multiple times I said that I didn't see how
 exporting a patch and uploading it to trac was more difficult than sending a
 pull request. That was very wrong on my part.
 
 My primary observation is not about new contributors though, it's actually
 about the behavior of core developers. Before we were on github, it was fairly
 rare for core developers to ask for reviews for anything besides *gigantic*
 patches, we'd mostly just commit stuff to trunk. Since the switch to github,
 I've seen that core developers are *far* more likely to ask for reviews of
 their work before merging.

I don't know anything about Django's old SVN setup, but our code review
tool (Rietveld) is actually quite good at that, and IMHO slightly
better than github, since it will only send an e-mail at the final
submission - by contrast, each individual comment you leave on github
fires a separate notification, which can feel like spam.

Our main problem for reviews these days is the lack of core developer
time. For example, Serhiy often asks for reviews (he has many patches
pending), which I personally don't have a lot of time to provide.

Regards

Antoine.




Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Nick Coghlan
On 30 Nov 2014 09:28, Donald Stufft don...@stufft.io wrote:

 As promised in the Move selected documentation repos to PSF BitBucket
 account? thread I've written up a PEP for moving selected repositories
from
 hg.python.org to Github.

 You can see this PEP online at: https://www.python.org/dev/peps/pep-0481/

 I've also reproduced the PEP below for inline discussion.

Given that hg.python.org isn't going anywhere, you could also use hg-git to
maintain read-only mirrors at the existing URLs and minimise any breakage
(as well as ensuring a full historical copy remains available on PSF
infrastructure). Then the only change needed would be to set up appropriate
GitHub web hooks to replace anything previously based on a commit hook
rather than periodic polling.
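
(With the hg-git extension enabled, keeping such a mirror current can be as
simple as ``hg pull git+ssh://git@github.com/python/peps.git`` on the
Mercurial side; the URL is hypothetical, just to sketch the shape.)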

The PEP should also cover providing clear instructions for setting up
git-remote-hg with the remaining Mercurial repos (most notably CPython), as
well as documenting a supported workflow for generating patches based on
the existing CPython GitHub mirror.

Beyond that, GitHub is indeed the most expedient option. My two main
reasons for objecting to taking the expedient path are:

1. I strongly believe that the long term sustainability of the overall open
source community requires the availability and use of open source
infrastructure. While I admire the ingenuity of the free-as-in-beer model
for proprietary software companies fending off open source competition, I
still know a proprietary platform play when I see one (and so do venture
capitalists looking to extract monopoly rents from the industry in the
future). (So yes, I regret relenting on this principle in previously
suggesting the interim use of another proprietary hosted service)

2. I also feel that this proposal is far too cavalier in not even
discussing the possibility of helping out the Mercurial team to resolve
their documentation and usability issues rather than just yelling at them
your tool isn't popular enough for us, and we find certain aspects of it
too hard to use, so we're switching to something else rather than working
with you to address our concerns. We consider the Mercurial team a
significant enough part of the Python ecosystem that Matt was one of the
folks specifically invited to the 2014 language summit to discuss their
concerns around the Python 3 transition. Yet we'd prefer to switch to
something else entirely rather than organising a sprint with them at PyCon
to help ensure that our existing Mercurial based infrastructure is
approachable for git & GitHub users? (And yes, I consider some of the core
Mercurial devs to be friends, so this isn't an entirely abstract concern
for me)

Given my proposal to use BitBucket as a near term solution for enabling
pull request based workflows, it's clear I consider the second argument the
more significant of the two.

However, if others consider some short term convenience that may or may not
attract additional contributors more important than supporting the broader
Python and open source communities (an argument I'm more used to hearing in
the ruthlessly commercial environment of Red Hat, rather than in upstream
contexts that tend to be less worried about efficiency at any cost), I'm
not going to expend energy trying to prevent a change I disagree with on
principle, but will instead work to eliminate (or significantly reduce) the
current expedience argument in GitHub's favour.

As a result, I'm -0 on the PEP, rather than -1 (and will try to stay out of
further discussions).

Given this proposal, I'm planning to refocus PEPs 474 & 462 specifically on
resolving the CPython core workflow issues, since that will require
infrastructure customisation regardless, and heavy customisation of GitHub
based infrastructure requires opting in to the use of the GitHub specific
APIs that create platform lockin. (Note that the argument in PEP 481 about
saving overall infrastructure work is likely spurious - the vast majority
of that work will be in addressing the complex CPython workflow
requirements, and moving some support repos to GitHub does little to
alleviate that)

If folks decide they want to migrate the ancillary repos back from GitHub
after that other infrastructure work is done, so be it, but if they don't,
that's OK too. We're already running heterogeneous infrastructure across
multiple services (especially if you also take PyPA into account), so
having additional support repos externally hosted isn't that big a deal
from a purely practical perspective.

Regards,
Nick.

 ---

 Abstract
 ========

 This PEP proposes migrating to Git and Github for certain supporting
 repositories (such as the repository for Python Enhancement Proposals) in a way
 that is more accessible to new contributors, and easier to manage for core
 developers. This is offered as an alternative to PEP 474, which aims to achieve
 the same overall benefits while continuing to use the Mercurial DVCS and
 without relying on a commercial entity.

 In 

Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 29, 2014, at 7:43 PM, Antoine Pitrou solip...@pitrou.net wrote:
 
 On Sun, 30 Nov 2014 00:15:55 + (UTC)
 Alex Gaynor alex.gay...@gmail.com wrote:
 
  The second experience I have is that of Django's migration to git and github.
 For a long time we were on SVN, and we were very resistant to moving to 
 DVCS in
 general, and github in particular. Multiple times I said that I didn't see 
 how
 exporting a patch and uploading it to trac was more difficult than sending a
 pull request. That was very wrong on my part.
 
 My primary observation is not about new contributors though, it's actually
 about the behavior of core developers. Before we were on github, it was 
 fairly
 rare for core developers to ask for reviews for anything besides *gigantic*
 patches, we'd mostly just commit stuff to trunk. Since the switch to github,
 I've seen that core developers are *far* more likely to ask for reviews of
 their work before merging.
 
 I don't know anything about Django's old SVN setup, but our code review
 tool (Rietveld) is actually quite good at that, and IMHO slightly
 better than github, since it will only send an e-mail at the final
 submission - by contrast, each individual comment you leave on github
 fires a separate notification, which can feel like spam.
 
 Our main problem for reviews these days is the lack of core developer
 time. For example, Serhiy often asks for reviews (he has many patches
 pending), which I personally don't have a lot of time to provide.
 

I think one of the issues with Rietveld isn’t related to Rietveld itself at
all; it’s all the *other* stuff you have to do to get a patch into Rietveld
so that someone can review it. Generating a patch and uploading it to
Roundup is a pain, and it’s far easier to just commit things directly to
default.

As far as Rietveld vs Github itself goes for reviews, I don’t personally agree.
Sending a single notification vs a notification per comment is going to be a
very subjective point of view; I like more notifications for the simple fact
that if I'm actively working on a PR while someone is reviewing it I can fix
the issues as quickly as they are finding them.

However if you move past that one thing, the Github PR review has a number of
benefits over Rietveld itself. Since the repositories this PEP currently deals
with are largely documentation-esque repositories, the prose diff is
incredibly useful for viewing a content-based diff instead of a diff designed
more for software, where you don't normally reflow lines as much. In addition,
the Github diff viewer also makes it trivial to expand the context of the diff
so you can see more of the surrounding file, even allowing you to expand it
to the entire file. This is super useful when the automatic amount of context
can't really give you enough information for reviewing the change.

Another difference is in how the review comments are presented. With Rietveld
the inline comments go away for each patchset you upload, regardless of whether
you've addressed the concerns in that comment or not. They do get bubbled up
into the overall Messages view, however this has the opposite problem, where
it gives you all of the messages regardless of whether they are still a problem
with the latest code or not. In contrast, the Github PR will hide old comments
when the lines they addressed have changed, but still make it easy to see the
old comments and the context in which they were made.

There's also the UI itself; it's somewhat dated, and I think it's fairly
obvious that it suffers from something a lot of OSS projects suffer from, in
that it is a developer-made UI. I don't think it's a big secret that
developers tend to make UIs that are not as nice as proper UX folks make, but
OSS has historically had a problem attracting that kind of talent (and is
often openly hostile towards it).

Finally, there's the transferability of knowledge at play too. I likely would
not review someone else's patch on Rietveld unless I felt strongly about it,
largely because I don't know Rietveld very well and I'd rather spend the time
that I'd need to learn to use it effectively working on other projects where
I already know the entire toolchain well. This is one of the larger points in
this PEP: the benefit of using popular tooling is that not only does
people's existing knowledge transfer easily into your project, but the time
they spend learning your tooling also transfers easily outside of this project.
This makes it much more attractive to learn the tooling, since the hypothetical
person would be able to take that knowledge and apply it elsewhere.

It is my experience, and this is entirely anecdotal, that it's far easier to get
reviews from non-committers and committers alike on projects which are hosted
on Github than it is to get reviews on projects which are not hosted there.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA


Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Chris Angelico
On Sun, Nov 30, 2014 at 11:37 AM, Donald Stufft don...@stufft.io wrote:
 I also don’t know how to do this. When I’m doing multiple things for CPython
 my “branching” strategy is essentially using hg diff to create a patch file
 with my “branch” name (``hg diff > my-branch.patch``), then revert all of my
 changes (``hg revert --all --no-backup``), then either work on a new “branch”
 or switch to an old “branch” by applying the corresponding patch
 (``patch -p1 < other-branch.patch``).

IMO, this is missing out on part of the benefit of a DVCS. When your
patches are always done purely on the basis of files, and have to be
managed separately, everything will be manual; and your edits won't
(normally) contain commit messages, authorship headers, date/time
stamps, and all the other things that a commit will normally have.
Using GitHub automatically makes all that available; when someone
forks the project and adds a commit, that commit will exist and have
its full identity, metadata, etc, and if/when it gets merged into
trunk, all that will be carried through automatically.

I strongly support this PEP.

ChrisA


Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 29, 2014, at 8:12 PM, Nick Coghlan ncogh...@gmail.com wrote:
 
 
 On 30 Nov 2014 09:28, Donald Stufft don...@stufft.io wrote:
 
  As promised in the “Move selected documentation repos to PSF BitBucket
  account?” thread I've written up a PEP for moving selected repositories from
  hg.python.org to Github.
 
  You can see this PEP online at: https://www.python.org/dev/peps/pep-0481/
 
  I've also reproduced the PEP below for inline discussion.
 
 Given that hg.python.org isn't going anywhere, you could also use hg-git to
 maintain read-only mirrors at the existing URLs and minimise any breakage (as
 well as ensuring a full historical copy remains available on PSF
 infrastructure). Then the only change needed would be to set up appropriate
 GitHub web hooks to replace anything previously based on a commit hook rather
 than periodic polling.
 
 

Ah yes, I meant to include that and just forgot to do it when I went to test
hg-git to see how well it worked and whether I got different commit hashes on
different machines. I also thought about adding a git.python.org which just
acted as a read-only mirror of what was on Github, but I don’t know if that’s
actually generally useful or not.

 The PEP should also cover providing clear instructions for setting up 
 git-remote-hg with the remaining Mercurial repos (most notably CPython), as 
 well as documenting a supported workflow for generating patches based on the 
 existing CPython GitHub mirror.
 
 

I can add this. I’ve never actually tried using git-remote-hg with CPython
itself because I’ve made it segfault on other Mercurial repositories and I
never figured out why, so I just generally fight my way through using
Mercurial on projects that themselves use Mercurial. I will absolutely test
to see if git-remote-hg works with CPython, and I can document using that to
contribute to CPython. I’m not sure whether it needs to be part of the PEP;
it feels like something that would be better inside the devguide itself, but
I’m not opposed to putting it in both locations.

 Beyond that, GitHub is indeed the most expedient option. My two main reasons 
 for objecting to taking the expedient path are:
 

It's not entirely about expedience. I think a lot of the reason why we should
look towards outsourcing some of these items is that volunteer time is not
a fungible resource. Volunteers are generally only willing to work on things
which they personally care about. This is entirely unlike a business where you
have employees who will generally work on whatever you tell them to because
that's what you're paying them for. To this end I personally don't really have
an interest in trying to create a better code hosting platform than Github,
which is doing an amazing job in my opinion and satisfies my needs fine.
Given the *current* state of tooling it appears that there are not a lot of
people who both care about making that piece of software exist and are capable
of competing with Github in terms of quality.

 1. I strongly believe that the long term sustainability of the overall open 
 source community requires the availability and use of open source 
 infrastructure. While I admire the ingenuity of the free-as-in-beer model 
 for proprietary software companies fending off open source competition, I 
 still know a proprietary platform play when I see one (and so do venture 
 capitalists looking to extract monopoly rents from the industry in the 
 future). (So yes, I regret relenting on this principle in previously 
 suggesting the interim use of another proprietary hosted service)
 
 

I somewhat agree. However I’m less concerned specifically about where projects
are hosted exactly and more about the *ability* to move to a completely OSS
infrastructure. In particular, if at some point we need to move off of Github
we can totally do that; it’s not particularly difficult. Currently you lose the
higher quality polish of Github if you do that however if at some point in the
future Github either turns evil or an OSS software offers a truly compelling
alternative to Github then there is really nothing stopping a migration to
another platform. As I said in the PEP I view this as a “let’s cross that
bridge if/when we get to it”. The main thing we should look at is things that
would be difficult to migrate away from. For code hosting in particular most of
the truly valuable data is stored within the DVCS so migrating the bulk of the
data is as simple as pushing the repository to a new location. The other data
is within the issues; for these repositories I suggest moving the issues to
Github entirely because I suspect they’ll get few if any issues, so the
amount of data stored in issues will be low.

However I also think that long term sustainability of any particular project
depends on attracting and retaining contributors. 

[Python-Dev] PEP 479

2014-11-29 Thread Jim J. Jewett
I have a strong suspicion that I'm missing something; I have been
persuaded both directions too often to believe I have a grip on the
real issue.

So I'm putting out some assumptions; please tell me if I'm wrong, and
maybe make them more explicit in the PEP.

(1)  The change will only affect situations where StopIteration is
currently raised as an Exception -- i.e., it leaks past the bounds of
a loop.

(2)  This can happen because of an explicit raise StopIteration.  This
is currently a supported idiom, and that is changing with PEP 479.

(2a)  Generators in the unwind path will now need to catch and reraise.

(3)  It can also happen because of an explicit next() call (as
opposed to the implicit next of a loop).
This is currently supported; after PEP 479, the next() call should
be wrapped in a try statement, so that the intent will be explicit.

(4)  It can happen because of yield from yielding from an iterator,
rather than a generator?

(5)  There is no other case where this can happen?  (So the generator
comprehension case won't matter unless it also includes one of the
earlier cases.)

-jJ


Re: [Python-Dev] PEP 479

2014-11-29 Thread Chris Angelico
On Sun, Nov 30, 2014 at 1:04 PM, Jim J. Jewett jimjjew...@gmail.com wrote:
 I have a strong suspicion that I'm missing something; I have been
 persuaded both directions too often to believe I have a grip on the
 real issue.

 So I'm putting out some assumptions; please tell me if I'm wrong, and
 maybe make them more explicit in the PEP.

 (1)  The change will only affect situations where StopIteration is
 currently raised as an Exception -- i.e., it leaks past the bounds of
 a loop.

Where a StopIteration would come up out of the generator. Inside the
generator function, it's exactly the same as it is in any other
function; you can raise it, you can catch it, everything's normal.

 (2)  This can happen because of an explicit raise StopIteration.  This
 is currently a supported idiom, and that is changing with PEP 479.

Correct. There is nothing that explicitly-raised StopIteration can do
in 3.0-3.4 that a return statement can't do in 3.0-3.7. There is the
downside that raise StopIteration(value) works on 2.7, where return
value inside a generator is a syntax error; the PEP currently has no
solution for this.
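
To make the trade-off concrete, here's a minimal sketch (assuming Python
3.3+, where StopIteration grew a ``value`` attribute; the function names
are illustrative):

    def with_return():
        yield 1
        return 42                  # becomes StopIteration(42) under the hood

    def with_raise():
        yield 1
        raise StopIteration(42)    # legal today, rejected once PEP 479 lands

    g = with_return()
    next(g)                        # -> 1
    try:
        next(g)
    except StopIteration as e:
        print(e.value)             # -> 42; with_raise() behaves the same today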

 (2a)  Generators in the unwind path will now need to catch and reraise.

More likely, catch and return; if your code was allowing next(iter)
to have the effect of potentially terminating the function, then you
now have to spell that out as "try: next(iter); except StopIteration:
return", which makes it clear that there's control flow here.
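
Spelled out, the transformation looks something like this (an illustrative
sketch; ``source`` is a stand-in for whatever iterator you're consuming):

    # Pre-PEP-479: a bare next() silently terminates the generator
    # when source runs dry, because the StopIteration leaks out.
    def pairs(source):
        while True:
            first = next(source)
            yield first, next(source, None)

    # Post-PEP-479: the same early exit has to be written explicitly.
    def pairs(source):
        while True:
            try:
                first = next(source)
            except StopIteration:
                return             # visible control flow, same behaviour
            yield first, next(source, None)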

 (3)  It can also happen because of an explicit next() call (as
 opposed to the implicit next of a loop).
 This is currently supported; after PEP 479, the next() call should
 be wrapped in a try statement, so that the intent will be explicit.

Correct, as per previous point. As you say, the intent will be
explicit: take a value, and if there aren't any more, stop processing.

 (4)  It can happen because of yield from yielding from an iterator,
 rather than a generator?

No; as I understand it (though maybe I'm wrong too), yield from will
yield every value the other iterator yields, and the yield from
expression will then quietly evaluate to the value carried by the
StopIteration the iterator raises, while allowing any other exception
to propagate. The StopIteration coming from the iterator is absorbed
by the yield from construct. To completely propagate it out, return
(yield from iter) should cover all three results (yielded value,
returned value, raised exception).
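
A quick sketch showing all three results flowing through (the names here
are illustrative):

    def inner():
        yield 1
        return "done"                  # absorbed by the yield from below

    def outer():
        result = yield from inner()    # result == "done", not raised
        return result                  # re-attach it to outer's StopIteration

    g = outer()
    next(g)                        # -> 1, yielded straight through
    try:
        next(g)
    except StopIteration as e:
        print(e.value)             # -> "done"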

 (5)  There is no other case where this can happen?  (So the generator
 comprehension case won't matter unless it also includes one of the
 earlier cases.)

Correct. In a generator expression (I assume that's what you mean?),
the most likely way to leak a StopIteration is the "or stop()" hack,
which has always been at least slightly dubious, and is now going to
be actively rejected. Control flow in a generator expression is now
the same as in a comprehension, with no early-abort option; if you
want that, the best way is to break the expression into an out-of-line
generator function. This is now very similar to the restrictions on
lambda; you can't (eg) raise exceptions in a lambda function, and if
anyone comes to python-list asking how to do so, the best response is
"use def instead of lambda".
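
For reference, here's the hack and its sanctioned replacement (a sketch;
``stop`` and ``limited`` are illustrative names):

    def stop():
        raise StopIteration        # the dubious early-abort trick

    values = [1, 2, 95, 3]
    gen = (x for x in values if x < 90 or stop())
    # list(gen) == [1, 2] today; RuntimeError once PEP 479 is in force.

    def limited(values, limit=90):
        for x in values:
            if x >= limit:
                return             # explicit early abort
            yield x
    # list(limited(values)) == [1, 2] both before and after PEP 479.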

ChrisA


Re: [Python-Dev] advice needed: best approach to enabling metamodules?

2014-11-29 Thread Guido van Rossum
All the use cases seem to be about adding some kind of getattr hook to
modules. They all seem to involve modifying the CPython C code anyway. So
why not tackle that problem head-on and modify module_getattro() to look
for a global named __getattr__ and if it exists, call that instead of
raising AttributeError?
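
To illustrate, here is a sketch of what a module author could then write.
This is only the proposed semantics, nothing like it exists in CPython
today, and the hook firing only when normal lookup fails is assumed from
the description above:

    # mymodule.py
    import warnings

    new_name = 42

    def __getattr__(name):
        # Called only when normal module attribute lookup fails.
        if name == "old_name":
            warnings.warn("old_name is deprecated; use new_name",
                          DeprecationWarning, stacklevel=2)
            return new_name
        raise AttributeError("module %r has no attribute %r"
                             % (__name__, name))

Then ``import mymodule; mymodule.old_name`` would warn and return 42,
while ``mymodule.new_name`` would never hit the hook.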

On Sat, Nov 29, 2014 at 11:37 AM, Nathaniel Smith n...@pobox.com wrote:

 On Sat, Nov 29, 2014 at 4:21 AM, Guido van Rossum gu...@python.org
 wrote:
  Are these really all our options? All of them sound like hacks, none of
 them
  sound like anything the language (or even the CPython implementation)
 should
  sanction. Have I missed the discussion where the use cases and
 constraints
  were analyzed and all other approaches were rejected? (I might have some
  half-baked ideas, but I feel I should read up on the past discussion
 first,
  and they are probably more fit for python-ideas than for python-dev. Plus
  I'm just writing this email because I'm procrastinating on the type
 hinting
  PEP. :-)

 The previous discussions I was referring to are here:
   http://thread.gmane.org/gmane.comp.python.ideas/29487/focus=29555
   http://thread.gmane.org/gmane.comp.python.ideas/29788

 There might well be other options; these are just the best ones I
 could think of :-). The constraints are pretty tight, though:
 - The new module object (whatever it is) should have a __dict__ that
 aliases the original module globals(). I can elaborate on this if my
 original email wasn't enough, but hopefully it's obvious that making
 two copies of the same namespace and then trying to keep them in sync
 at the very least smells bad :-).
 - The new module object has to be a subtype of ModuleType, b/c there
 are lots of places that do isinstance(x, ModuleType) checks (notably
 -- but not only -- reload()). Since a major goal here is to make it
 possible to do cleaner deprecations, it would be really unfortunate if
 switching an existing package to use the metamodule support itself
 broke things :-).
 - Lookups in the normal case should have no additional performance
 overhead, because module lookups are extremely extremely common. (So
 this rules out dict proxies and tricks like that -- we really need
 'new_module.__dict__ is globals()' to be true.)

 AFAICT there are three logically possible strategies for satisfying
 that first constraint:
 (a) convert the original module object into the type we want, in-place
 (b) create a new module object that acts like the original module object
 (c) somehow arrange for our special type to be used from the start

 My options 1 and 2 are means of accomplishing (a), and my options 3
 and 4 are means of accomplishing (b) while working around the
 behavioural quirks of module objects (as required by the second
 constraint).

 The python-ideas thread did also consider several methods of
 implementing strategy (c), but they're messy enough that I left them
 out here. The problem is that somehow we have to execute code to
 create the new subtype *before* we have an entry in sys.modules for
 the package that contains the code for the subtype. So one option
 would be to add a new rule, that if a file pkgname/__new__.py exists,
 then this is executed first and is required to set up
 sys.modules[pkgname] before we exec pkgname/__init__.py. So
 pkgname/__new__.py might look like:

 import sys
 from pkgname._metamodule import MyModuleSubtype
 sys.modules[__name__] = MyModuleSubtype(__name__, docstring)

 This runs into a lot of problems though. To start with, the 'from
 pkgname._metamodule ...' line is an infinite loop, b/c this is the
 code used to create sys.modules[pkgname]. It's not clear where the
 globals dict for executing __new__.py comes from (who defines
 __name__? Currently that's done by ModuleType.__init__). It only works
 for packages, not modules. The need to provide the docstring here,
 before __init__.py is even read, is weird. It adds extra stat() calls
 to every package lookup. And, the biggest showstopper IMHO: AFAICT
 it's impossible to write a polyfill to support this code on old python
 versions, so it's useless to any package which needs to keep
 compatibility with 2.7 (or even 3.4). Sure, you can backport the whole
 import system like importlib2, but telling everyone that they need to
 replace every 'import numpy' with 'import importlib2; import numpy' is
 a total non-starter.

 So, yeah, those 4 options are really the only plausible ones I know of.

 Option 1 and option 3 are pretty nice at the language level! Most
 Python objects allow assignment to __class__ and __dict__, and both
 PyPy and Jython at least do support __class__ assignment. Really the
 only downside with Option 1 is that actually implementing it requires
 attention from someone with deep knowledge of typeobject.c.

 -n

 --
 Nathaniel J. Smith
 Postdoctoral researcher - Informatics - University of Edinburgh
 http://vorpus.org

Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 29, 2014, at 9:01 PM, Donald Stufft don...@stufft.io wrote:
 
 
 The PEP should also cover providing clear instructions for setting up 
 git-remote-hg with the remaining Mercurial repos (most notably CPython), as 
 well as documenting a supported workflow for generating patches based on the 
 existing CPython GitHub mirror.
 
 
 
 I can add this. I’ve never actually tried using git-remote-hg with CPython
 itself because I’ve made it segfault on other Mercurial repositories and I
 never figured out why, so I just generally fight my way through using
 Mercurial on projects that themselves use Mercurial. I will absolutely test
 to see if git-remote-hg works with CPython, and I can document using that to
 contribute to CPython. I’m not sure whether it needs to be part of the PEP;
 it feels like something that would be better inside the devguide itself, but
 I’m not opposed to putting it in both locations.


Never mind. I’m not going to add this because git-remote-hg is busted with
the latest version of Mercurial, and I don’t think it’s really needed for
this PEP (though it would be good for the devguide at some point).

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Ben Finney
Nick Coghlan ncogh...@gmail.com writes:

 1. I strongly believe that the long term sustainability of the overall
 open source community requires the availability and use of open source
 infrastructure.

I concur. This article, http://mako.cc/writing/hill-free_tools.html,
makes the arguments well, IMO.

 2. I also feel that this proposal is far too cavalier in not even
 discussing the possibility of helping out the Mercurial team […] we'd
 prefer to switch to something else entirely rather than organising a
 sprint with them at PyCon to help ensure that our existing Mercurial
 based infrastructure is approachable for git & GitHub users?

Exactly. For such a core tool, instead of pushing proprietary platforms
at the expense of software freedom, the sensible strategy for a project
(Python) that hopes to be around in the long term is to use and improve
the free software platforms.

 As a result, I'm -0 on the PEP, rather than -1 (and will try to stay
 out of further discussions).

I don't get a vote. So I'm glad there are some within the Python core
development team that can see the mistakes inherent in depending on
non-free tools for developing free software.

-- 
 \ “The cost of a thing is the amount of what I call life which is |
  `\   required to be exchanged for it, immediately or in the long |
_o__)   run.” —Henry David Thoreau |
Ben Finney



Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Chris Angelico
On Sun, Nov 30, 2014 at 4:06 PM, Ben Finney ben+pyt...@benfinney.id.au wrote:
 I don't get a vote. So I'm glad there are some within the Python core
 development team that can see the mistakes inherent in depending on
 non-free tools for developing free software.

While this is a laudable view, this kind of extreme stance is contrary
to any semblance of practicality. Compare:

http://www.gnu.org/distros/free-distros.html
http://www.gnu.org/distros/common-distros.html#Debian

Debian is not considered sufficiently free because people can readily
learn about these nonfree packages by browsing Debian's online package
database, even though you have to be quite explicit about getting them
(you have to go and enable the non-free repos).

Yes, GitHub is proprietary. But all of your actual code is stored in
git, which is free, and it's easy to push that to a new host somewhere
else, or create your own host. This proposal is for repositories that
don't need much in the way of issue trackers etc, so shifting away
from GitHub shouldn't demand anything beyond moving the repos
themselves. How bad is it, really? Is it worth fighting a
philosophical battle for the sake of no real gain, sacrificing real
benefits for the intangible "but it's not free" debate?

Python is already using quite a bit of non-free software in its
ecosystem. The Windows builds of CPython are made with Microsoft's
compiler, and the recent discussion about shifting to Cygwin or MinGW
basically boiled down to "but it ought to be free software", and that
was considered not a sufficiently strong argument. In each case, the
decision has impact on other people (using MSVC for the official
python.org installers means extension writers need to use MSVC too;
and using GitHub means that contributors are strongly encouraged,
possibly required, to use GitHub); so why is it acceptable to use a
non-free compiler, but not acceptable to use a non-free host?

I admire and respect the people who, for their own personal use,
absolutely and utterly refuse to use any non-free systems or software.
It's great that they do it, because that helps encourage free software
to be created. But for myself? I'll use whatever makes the most sense.
Proprietary systems have inherent issues (the best-maintained non-free
programs seem to have about the same bugginess as a poorly-maintained
free program, or at least that's how it feels), but if the available
free alternatives have even more issues, I'll not hobble myself for
the purity of freedom. Practicality wins.

ChrisA


Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Donald Stufft

 On Nov 30, 2014, at 12:06 AM, Ben Finney ben+pyt...@benfinney.id.au wrote:
 
 Nick Coghlan ncogh...@gmail.com writes:
 
 1. I strongly believe that the long term sustainability of the overall
 open source community requires the availability and use of open source
 infrastructure.
 
 I concur. This article, http://mako.cc/writing/hill-free_tools.html,
 makes the arguments well, IMO.
 
 2. I also feel that this proposal is far too cavalier in not even
 discussing the possibility of helping out the Mercurial team […] we'd
 prefer to switch to something else entirely rather than organising a
 sprint with them at PyCon to help ensure that our existing Mercurial
  based infrastructure is approachable for git & GitHub users?
 
 Exactly. For such a core tool, instead of pushing proprietary platforms
 at the expense of software freedom, the sensible strategy for a project
 (Python) that hopes to be around in the long term is to use and improve
 the free software platforms.

I think there is a big difference here between using a closed source VCS
or compiler and using a closed source code host: the protocol is defined
by git, so switching from one host to another is easy.

It’s akin to saying that if we chose to run the PyPI services on a Windows
machine, that would somehow make PyPI less free, even though we could
have chosen to run it on a “free” OS and we weren’t doing much, if anything,
to tie ourselves to that particular OS.

If it makes people feel better we can continue to support the existing
mechanisms of contribution; then people can choose between interacting
with a “non free” host and “free” tooling. I suspect most people will choose
the “non-free” tooling.


Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Wes Turner
Specifically, which features are most important here?

- [ ] Userbase
- [ ] TTW editing only over SSL (see: Zope 2)
- [ ] Pull Requests (see also: BitBucket, Torvalds rant)
- [ ] Simple Issue Tagging
- [ ] Pingbacks
- [ ] CI Integration


On Sat, Nov 29, 2014 at 11:27 PM, Donald Stufft don...@stufft.io wrote:


  On Nov 30, 2014, at 12:06 AM, Ben Finney ben+pyt...@benfinney.id.au
 wrote:
 
  Nick Coghlan ncogh...@gmail.com writes:
 
  1. I strongly believe that the long term sustainability of the overall
  open source community requires the availability and use of open source
  infrastructure.
 
  I concur. This article, http://mako.cc/writing/hill-free_tools.html,
  makes the arguments well, IMO.
 
  2. I also feel that this proposal is far too cavalier in not even
  discussing the possibility of helping out the Mercurial team […] we'd
  prefer to switch to something else entirely rather than organising a
  sprint with them at PyCon to help ensure that our existing Mercurial
   based infrastructure is approachable for git & GitHub users?
 
  Exactly. For such a core tool, instead of pushing proprietary platforms
  at the expense of software freedom, the sensible strategy for a project
  (Python) that hopes to be around in the long term is to use and improve
  the free software platforms.

 I think there is a big difference here between using a closed source VCS
 or compiler and using a closed source code host: the protocol is defined
 by git, so switching from one host to another is easy.

 It’s akin to saying that if we chose to run the PyPI services on a Windows
 machine, that would somehow make PyPI less free, even though we could
 have chosen to run it on a “free” OS and we weren’t doing much, if
 anything, to tie ourselves to that particular OS.

 If it makes people feel better we can continue to support the existing
 mechanisms of contribution, then people can choose between interacting
 with a “non free” host and “free” tooling. I suspect most people will
 choose
 the “non-free” tooling.



Re: [Python-Dev] PEP 481 - Migrate Some Supporting Repositories to Git and Github

2014-11-29 Thread Demian Brecht
On Sat, Nov 29, 2014 at 3:27 PM, Donald Stufft don...@stufft.io wrote:
 As promised in the “Move selected documentation repos to PSF BitBucket
 account?” thread I've written up a PEP for moving selected repositories from
 hg.python.org to Github.


FWIW, I'm a pretty solid -1 on this PEP.

Don't get me wrong: I'm much more accustomed to git than hg, much prefer
git's lightweight branching model, and would love to see CPython and all
ancillary repos migrated to git & Github. If that were what this PEP was
after, I'd be +1. What I don't like about it is the introduction of multiple
tools that directly impact the barrier to entry for contributing. I think
that splitting out ancillary repos such as PEPs and docs might be a little
short-sighted at the overall project level.

In my mind, there are three major categories of contributors (and
prospective contributors):

1. Those that use git on a daily basis
2. Those that use hg on a daily basis
3. Those who use neither and are more accustomed to Perforce, SVN and the
like

Let's say this PEP is approved and the suggested repos are moved to Github.

For git users, life is suddenly made much easier when contributing to those
projects, for obvious reasons. However, they still have the same barrier to
entry when contributing to CPython (I would imagine that this would be the
goal for most users, but maybe I'm wrong about that). I would imagine that
contributing to the ancillary repos would be great grounds to ramp up on
using hg before hitting CPython with its multiple long-lived branches and
such. Making the switch as suggested by this PEP removes that.

For hg users, you now add a barrier to entry for contributing to the repos
now living on Github.

In both cases, you've introduced the need to context switch when
contributing to CPython and any of the other repos: two tools that require
quite different workflows.

Then, for the third class of users, you've now introduced the requirement
of learning two different sets of tools (if they want to do anything
outside of using the Edit button through Github's UI). Now you're looking
at conflated contributor documentation or project-specific documentation.
IMHO, suboptimal either way you look at it.

Personally, I don't think that there are any silver bullets to this
problem. In no case is everyone going to be satisfied. In cases like that,
I tend to think that no matter what the solution eventually agreed upon is,
consistency is of the utmost importance. Moving a number of repos to Github
breaks that consistency.

What *would* be nice, if CPython were to stay on Mercurial, is moving those
repos to Bitbucket. That way you get the Edit feature, which by all accounts
removes the initial barrier to entry, while still remaining consistent with
the rest of the project.