Re: [Python-Dev] Benchmarking Python and micro-optimizations

2016-11-01 Thread Ludovic Gasc
Hi,

Thanks first for that, it's very interesting.
About enriching the benchmark suite, I have a suggestion: we could add
REST/JSON scenarios, because a lot of people use Python for that.
These certainly aren't the best REST/JSON scenarios, because they have a
small payload, but they're better than nothing:
https://www.techempower.com/benchmarks/#section=code=peak=fortune
Moreover, we already have several implementations for the most popular Web
frameworks:
https://github.com/TechEmpower/FrameworkBenchmarks/tree/master/frameworks/Python
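For illustration, the TechEmpower "JSON serialization" scenario boils down to returning a tiny JSON payload; a hypothetical minimal WSGI sketch (not code from the benchmark repository) looks like this:

```python
import json

def app(environ, start_response):
    # TechEmpower-style JSON test: serialize a small payload per request.
    body = json.dumps({"message": "Hello, World!"}).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/json"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Minimal in-process invocation, bypassing a real HTTP server:
captured = {}
def start_response(status, headers):
    captured["status"] = status

body = b"".join(app({}, start_response))
print(captured["status"], body)  # 200 OK b'{"message": "Hello, World!"}'
```

Any WSGI server (or an aiohttp equivalent) could serve this; the point is only how small the payload is.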

The drawback is that a lot of tests need a database.
I can help if you're interested.

Have a nice week.
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Request for CPython 3.5.3 release

2016-07-03 Thread Ludovic Gasc
Hi Nick,

First, thanks a lot for your detailed answer, it was very instructive to me.
My answers below.

2016-07-03 6:09 GMT+02:00 Nick Coghlan <ncogh...@gmail.com>:

> On 2 July 2016 at 16:17, Ludovic Gasc <gml...@gmail.com> wrote:
> > Hi everybody,
> >
> > I fully understand that AsyncIO is a drop in the ocean of CPython, and
> > that you're working to prepare the entire 3.5.3 release for December,
> > which isn't ready yet.
> > However, could you create a 3.5.2.1 release with only this AsyncIO fix?
>
> That would be more work than just doing a 3.5.3 release, though - the
> problem isn't with the version number bump, it's with asking the
> release team to do additional work without clearly explaining the
> rationale for the request (more on that below). While some parts of
> the release process are automated, there's still a lot of steps to run
> through by a number of different people:
> https://www.python.org/dev/peps/pep-0101/.
>

Thanks for the link; I didn't know about this PEP, it was interesting to read.


>
> The first key question to answer in this kind of situation is: "Is
> there code that will run correctly on 3.5.1 that will now fail on
> 3.5.2?" (i.e. it's a regression introduced by the asyncio and
> coroutine changes in the point release rather than something that was
> already broken in 3.5.0 and 3.5.1).
>
> If the answer is "No", then it doesn't inhibit the 3.5.2 rollout in
> any way, and folks can wait until 3.5.3 for the fix.
>
> However, if the answer is "Yes, it's a new regression in 3.5.2" (as in
> this case), then the next question becomes "Is there an agreed
> resolution for the regression?"
>
> The answer to that is currently "No" - Yury's PR against the asyncio
> repo is still being discussed.
>
> Once the answer to that question is "Yes", *then* the question of
> releasing a high priority fix in a Python 3.5.3 release can be
> properly considered by answering the question "Of the folks using
> asyncio, what proportion of them are likely to encounter problems in
> upgrading to Python 3.5.2, and is there a workaround they can apply or
> alternate approach they can use to avoid the problem?".
>
> At the moment, Yury's explanation of the fix in the PR is
> (understandably) addressed at getting the problem resolved within the
> context of asyncio, and hence just describes the particular APIs
> affected, and the details of the incorrect behaviour. While that's an
> important step in the process, it doesn't provide a clear assessment
> of the *consequences* of the bug aimed at folks that aren't themselves
> deeply immersed in using asyncio, so we can't tell if the problem is
> "Some idiomatic code frequently recommended in user facing examples
> and used in third party asyncio based libraries may hang client
> processes" (which would weigh in favour of an early 3.5.3 release
> before people start encountering the regression in practice) or "Some
> low level API's not recommended for general use may hang if used in a
> particular non-idiomatic combination only likely to be encountered by
> event loop implementors" (which would suggest it may be OK to stick
> with the normal maintenance release cadence).
>

To my basic understanding, it seems there are race conditions when opening
sockets. If my understanding is correct, it's a little bit the heart of
AsyncIO that's affected ;-)

If you search for loop.sock_connect on GitHub, you find a lot of results:
https://github.com/search?l=python=loop.sock_connect=searchresults=Code=%E2%9C%93

Moreover, if Yury, one of the contributors to AsyncIO:
https://github.com/python/asyncio/graphs/contributors and the uvloop creator,
has sent an e-mail about that, I'm tempted to believe him.
That's why I'm a little bit scared by it, even if we don't have a lot of
AsyncIO users, especially with the latest release.
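For context, loop.sock_connect is the low-level non-blocking connect API that those search results use; a minimal self-contained sketch of typical usage (the helper name `connect` is mine):

```python
import asyncio
import socket

async def connect(loop, address):
    # loop.sock_connect drives a non-blocking connect through the event loop.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setblocking(False)
    await loop.sock_connect(sock, address)
    return sock

# Connect to a throwaway local listener so the example is self-contained.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
loop = asyncio.new_event_loop()
client = loop.run_until_complete(connect(loop, server.getsockname()))
connected = client.getpeername() == server.getsockname()
print(connected)  # True
client.close()
server.close()
loop.close()
```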

However, Google Trends might give us a good overview of the relative number of
users we have, compared to Twisted, Gevent and Tornado:
https://www.google.com/trends/explore#q=asyncio%2C%20%2Fm%2F02xknvd%2C%20gevent%2C%20%2Fm%2F07s58h4=1%2F2016%2012m=q=Etc%2FGMT-2


>
> > If 3.5.2.1 or 3.5.3 are impossible to release before December,
>
> Early maintenance releases are definitely possible, but the
> consequences of particular regressions need to be put into terms that
> make sense to the release team, which generally means stepping up from
> "APIs X, Y, and Z broke in this way" to "Users doing A, B, and C will
> be affected in this way".
>
> As an example of a case where an early maintenance release took place:
> several years ago, Python 2.6.3 happened to break both "from logging
> import *" (due to a missing entry in test___all__ letting an error in
> lo

Re: [Python-Dev] Request for CPython 3.5.3 release

2016-07-02 Thread Ludovic Gasc
Hi everybody,

I fully understand that AsyncIO is a drop in the ocean of CPython, and that
you're working to prepare the entire 3.5.3 release for December, which isn't
ready yet. However, could you create a 3.5.2.1 release with only this AsyncIO fix?

PEP 440 doesn't seem to forbid that: even though I see only three-digit
examples in the PEP, I did find an example with four digits:
https://www.python.org/dev/peps/pep-0440/#version-specifiers
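Indeed, PEP 440 places no limit on the number of release segments; under its release-segment ordering (shorter versions compare as if padded with zeros), a hypothetical "3.5.2.1" would sort between 3.5.2 and 3.5.3, as this small sketch shows:

```python
def release_tuple(version):
    # PEP 440 release segments are dot-separated integers; Python's tuple
    # comparison models their ordering (a shorter equal prefix sorts first).
    return tuple(int(part) for part in version.split("."))

assert release_tuple("3.5.2") < release_tuple("3.5.2.1") < release_tuple("3.5.3")
print("3.5.2 < 3.5.2.1 < 3.5.3")
```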

If 3.5.2.1 or 3.5.3 are impossible to release before December, what are the
alternative solutions for AsyncIO users ?
1. Use 3.5.1 and hope that Linux distributions won't use 3.5.2 ?
2. Patch by hand asyncio source code ?
3. Remove asyncio folder in CPython, and install asyncio via github
repository ?
4. Anything else ?

To be honest, I'm migrating an AsyncIO application of more than 10,000 lines
of code from 3.4.3 to 3.5.1; I'm really interested to know whether it's
better to keep 3.4.3 for now, or whether the 3.5 branch is stable enough.

Have a nice week-end.
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2016-06-30 9:41 GMT+02:00 Larry Hastings <la...@hastings.org>:

> On 06/28/2016 02:51 PM, Larry Hastings wrote:
>
>
> On 06/28/2016 02:05 PM, Yury Selivanov wrote:
>
> Larry and the release team: would it be possible to make an
> "emergency" 3.5.3 release?
>
>
> I'd like to hear from the other asyncio reviewers: is this bug bad enough
> to merit such an "emergency" release?
>
>
> Thanks,
>
>
> */arry*
>
>
> There has been a distinct lack of "dear god yes Larry" emails so far.
> This absence suggests that, no, it is not a bad enough bug to merit such a
> release.
>
> If we stay to our usual schedule, I expect 3.5.3 to ship December-ish.
>
>
> */arry*
>


Re: [Python-Dev] FAT Python (lack of) performance

2016-01-25 Thread Ludovic Gasc
Hi,

Just thanks for this big contribution.
And maybe this project could give new ideas to optimize Python, who knows?

At least, you've won a beer for the FOSDEM event this weekend ;-)

Have a nice week.


--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2016-01-25 19:16 GMT+01:00 Victor Stinner <victor.stin...@gmail.com>:

> Hi,
>
> Summary: FAT Python is not faster, but it will be ;-)
>
> --
>
> When I started the FAT Python as a fork of CPython 3.6, I put
> everything in the same repository. Last weeks, I focused on splitting
> my giant patch (10k lines) into small reviewable patches. I wrote 3
> PEPs (509 dict version, 510 function specialization, 511 code
> transformers) and I enhanced the API to make it usable for more use
> cases than just FAT Python. I also created fatoptimizer (the AST
> optimizer) and fat (runtime dependency of the optimizer) projects on
> GitHub to separate clearly what should be outside Python core. For all
> links, see:
>
>http://faster-cpython.readthedocs.org/fat_python.html
>
> For the fatoptimizer project, my constraint is to be able to run the
> full Python test suite unmodified. In practice, I have to disable some
> optimizations by putting a "__fatoptimizer__= {...}" configuration to
> some test files. For example, I have to disable constant folding on
> test_bool because it tests that False+2 gives 2 at runtime, whereas
> the optimizer replaces directly False+2 with 2 during the compilation.
> Well, test_bool.py is not the best example because all tests pass with
> the constant folding optimization (if I comment my
> "__fatoptimizer__={...}" change).
>
> This constraint ensures that the optimizer "works" and doesn't break
> (too much ;-)) the Python semantics, but it's more difficult to
> implement powerful optimizations.
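The constant-folding optimization described above (replacing False+2 with 2 at compile time) can be sketched as an AST transformer; this is a minimal illustration of the technique, not fatoptimizer's actual code:

```python
import ast

class ConstantFolder(ast.NodeTransformer):
    """Fold addition of two literal constants, e.g. False + 2 -> 2."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold children first
        if (isinstance(node.op, ast.Add)
                and isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)):
            return ast.copy_location(
                ast.Constant(node.left.value + node.right.value), node)
        return node

tree = ast.parse("x = False + 2")
tree = ast.fix_missing_locations(ConstantFolder().visit(tree))
ns = {}
exec(compile(tree, "<folded>", "exec"), ns)
print(ns["x"])  # 2 (bool arithmetic: False == 0)
```

This also shows why test_bool needed the optimization disabled: the addition happens during compilation rather than at runtime.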
>
> I also found and fixed various kinds of bugs. In my code obviously,
> but also in the Python core, in various places. Some bugs only concern
> AST transformers which is a new feature, but I had to fix them. For
> example, Python didn't support negative line number deltas in
> co_lnotab of code objects, and so line numbers were all wrong on
> optimized code. I merged my enhancement in the default branch of
> CPython (issue #26107).
>
> In short, I focused on having something working (respecting the Python
> semantics), rather than spending time on writing optimizations.
>
> --
>
> When I asked explicitly "Is someone opposed to this PEP 509 [dict
> version]?", Barry Warsaw answered that a performance analysis is
> required. Extract of his mail:
>
>"I still think this is maintenance and potential performance
> overhead we don't want to commit to long term unless it enables
> significant optimization.  Since you probably can't prove that without
> some experimentation, this API should be provisional."
>
> Last week, I ran some benchmarks and I have to admit that I was
> disappointed. Not only does fatoptimizer not make Python faster, it
> makes it much slower on some tests!
>
>http://fatoptimizer.readthedocs.org/en/latest/benchmarks.html
>
> Quickly, I identified a major performance issue when nested functions
> are specialized, especially in Lib/json/encoder.py (tested by
> bm_json_v2.py benchmark). I fixed my optimizer to not specialize
> nested functions anymore. This simple change fixed the main
> performance issue. Reminder: in performance critical code, don't use
> nested functions! I will maybe propose patches for Lib/json/encoder.py
> to stop using nested functions.
>
> I only ran benchmarks with the optimizer enabled. I now have to
> measure the overhead of my patches (PEP 509, 510 and 511) adding the
> API for AST optimizers. The overhead must be negligible. For me, it's
> a requirement of the whole project. Changes must not make Python
> slower when the optimizer is not used.
>
> fatoptimizer is faster on microbenchmarks, but I had to write manually
> some optimizations:
>
>http://fatoptimizer.readthedocs.org/en/latest/microbenchmarks.html
>
> IMHO fatoptimizer is not faster on macro benchmarks because it is not
> smart enough (yet) to generate the most interesting optimizations,
> like function inlining and specialization for argument types. You can
> estimate the speedup if you specialize manually your functions.
>
> --
>
> Barry also wrote: "Did you address my suggestion on python-ideas to
> make the new C API optionally compiled in?"
>
> Well, it is an option, but I would prefer to have the API for AST
> optimizer directly built in Python.
>
> The first beta version of Python 3.6 is scheduled in September 2016
> (deadline for new features in Python 3.6), so I still ha

Re: [Python-Dev] Python 3.4.3 on RedHat 6.6 s390x make fails with: make: *** [sharedmods] Error 139

2015-07-22 Thread Ludovic Gasc
Hi Jo,

Terry is right; however, if you run this command before the compilation, it
should work:
https://github.com/saghul/pythonz/blob/master/README.rst#rpm-family-centos-rhel

In case it doesn't work, please contact me privately to avoid polluting
this mailing list.

Have a nice day.

Ludovic Gasc (GMLudo)
http://www.gmludo.eu/
On 22 Jul 2015 08:23, Terry Reedy tjre...@udel.edu wrote:

 On 7/21/2015 6:20 PM, Vitale, Joseph wrote:

 Hello,

 Trying to install Python 3.4.3  on Red Hat  6.6 zLinux(s390x)  but
 “make” fails and core dumps.  Not using OpenSSL and did not configure
 for it.


 Questions about installing current Python should be directed to
 python-list.  pydev is for developing future releases and versions of
 Python.

 --
 Terry Jan Reedy




Re: [Python-Dev] Adding c-api async protocol support

2015-06-25 Thread Ludovic Gasc
For once, while we are in a congratulations tunnel, thank you very much,
AsyncIO core devs:

For several months now, we've pushed to production an average of 2 daemons
based on AsyncIO in my company, with several protocols.
Most of the time they are small daemons; however, some are complex.
For now, the AsyncIO toolbox is pretty stable and predictable, especially
for debugging.
The maturity is better than I expected, especially when you look at the
AsyncIO ecosystem: for now, we are only a few developers using it for
concrete applications.

At least to me, it's the best proof that the foundations are good.

We should now try to publish more tutorials/examples to attract more
newcomers, but I'm the first guilty party: I completely lack the time to do that.
BTW, I hope that EuroPython will be a good event to propagate some good
vibes around AsyncIO.

--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-06-25 22:24 GMT+02:00 Victor Stinner victor.stin...@gmail.com:

 Hi,

 2015-06-25 19:25 GMT+02:00 Andrew Svetlov andrew.svet...@gmail.com:
  P.S.
  Thank you Victor so much for your work on asyncio.
  Your changes on keeping source tracebacks and raising warnings for
  unclosed resources are very helpful.

 Ah! It's good to know. You're welcome.

 We can still enhance the source traceback by building it using the
 traceback of the current task.

 Victor


Re: [Python-Dev] Python 3 migration status update across some key subcommunities (was Re: 2.7 is here until 2020, please don't call it a waste.)

2015-06-01 Thread Ludovic Gasc
2015-05-31 16:15 GMT+02:00 Nick Coghlan ncogh...@gmail.com:

 On 31 May 2015 at 19:07, Ludovic Gasc gml...@gmail.com wrote:
  About the Python 3 migration, I think that one of our best levers is
  newcomers, and by extension, Python trainers/teachers.
  If newcomers learn Python 3 first, when they start to work
  professionally, they should help to rationalize the Python 3 migration
  inside existing dev teams, especially because they don't have a conflict
  of interest based on the fact that they haven't written plenty of code
  with Python 2.
  2020 is around the corner; whether 5 years will be enough to change the
  community's mind, I don't know.

 The education community started switching a while back - if you watch
 Carrie-Anne Philbin's PyCon UK 2014 keynote, one of her requests for
 the broader Python community was for everyone else to just catch up
 already in order to reduce student's confusion (she phrased it more
 politely than that, though). Educators need to tweak examples and
 exercises to account for a version switch, but that's substantially
 easier than migrating hundreds of thousands or even millions of lines
 of production code.


The French article about Python 3 from a teacher working in the production
field is available again:
https://translate.google.com/translate?hl=frsl=frtl=enu=http%3A%2F%2Fsametmax.com%2Fpython-3-est-fait-pour-les-nouveaux-venus%2F




 And yes, if you learn Python 3 first, subsequently encountering Python
 2's quirks and cruft is likely to encourage folks that know both
 versions of the language to start advocating for a version upgrade :)


Exactly ;-)


 After accounting for the "Wow, the existing Python 2 install base is
 even larger than we realised" factor, the migration is actually in a
 pretty good place overall these days. The enterprise crowd really
 are likely to be the only ones that might need the full remaining 5
 years of migration time (and they may potentially have even more time,
 if they're relying on a commercial redistributor).


Beyond the full-monty toolbox for migrating to Python 3 that's now available,
I have good hope when I see numbers like this:
http://blog.frite-camembert.net/python-survey-2014.html

Even if we have a statistical bias, because only Pythonists who keep an eye
on Python news answered this survey (by definition, people who are more aware
of the Python 3 migration than the average Python user), I don't remember
exactly the theorem, but I know that to diffuse a piece of information inside
a community and have it accepted, most of the work is with the first 10%.
After 10%, you have enough people to start to invert the network/group
effects that maintain the previous status quo.

BTW, during PyCon, Guido gave a keynote where he noted that a lot of
libraries don't support Python 3 yet, although a lot of famous Python
packages have already been ported.
While I agreed with him about the absolute values (more or less 5,000
Python 3 packages out of the 55,000 on PyPI, if I remember correctly), we
should maybe do some data mining with PyPI data to get a more precise view
of the situation, for example:

1. Requests, with 4,195,790 downloads, supports Python 3; hikvision, with
9 downloads, doesn't. It isn't fair to count both products with the same
weight.
In terms of probability, your project is more likely to use Requests than
hikvision. If we weight each package by its number of downloads and
calculate the percentage, we should get more or less the probability that
an average project has all its dependencies available for Python 3.

2. The acceleration of Python 3 adoption on PyPI: this is more complicated
to calculate, because we need to know when the Python 3 trove classifier
appeared in each library's metadata. However, I'm pretty sure that we
would see an acceleration, and we should be able to predict a date range
when a majority of Python packages will be available for Python 3.

3. Maybe some other hidden data; the Python scientific community may have
better ideas.
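The download-weighted calculation from point 1 can be sketched in a few lines; only the Requests and hikvision download counts come from this message, and "somelib" is a made-up package for illustration:

```python
# (name, downloads, supports Python 3) -- hypothetical sample data.
packages = [
    ("requests", 4195790, True),
    ("hikvision", 9, False),
    ("somelib", 120000, True),
]

# Naive share: each package counts equally.
unweighted = sum(1 for _, _, py3 in packages if py3) / len(packages)

# Weighted share: each package counts proportionally to its downloads,
# approximating the chance that a random dependency supports Python 3.
total_downloads = sum(dl for _, dl, _ in packages)
weighted = sum(dl for _, dl, py3 in packages if py3) / total_downloads

print(f"unweighted: {unweighted:.1%}")          # 66.7%
print(f"download-weighted: {weighted:.1%}")
```

Run over a real PyPI dump, the gap between the two numbers would show how misleading the raw package count is.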

Web frameworks have allowed Python 3 development for a while now, and
 with Django switching their tutorial to Python 3 by default, Django
 downloads via pip show one of the highest proportions of Python 3
 adoption on PyPI. www.python.org itself is now a production Python 3
 Django web service, and the next generation of pypi.python.org will be
 a Pyramid application that's also running on Python 3.


Pretty cool :-)


 The dedicated async/await syntax in 3.5 represents a decent carrot to
 encourage migration for anyone currently using yield (or yield from)
 based coroutines, since the distinct syntax not only allows for easier
 local reasoning about whether something is an iterator or a coroutine,
 it also provides a much improved user experience for asynchronous
 iterators and context managers (including finally handling the
 asynchronous database transaction as a context manager case, which
 previous versions of Python couldn't really

Re: [Python-Dev] 2.7 is here until 2020, please don't call it a waste.

2015-05-31 Thread Ludovic Gasc
2015-05-31 0:26 GMT+02:00 Nick Coghlan ncogh...@gmail.com:


 On 31 May 2015 04:20, Ludovic Gasc gml...@gmail.com wrote:
 
  For now, I'm following the mailing-lists through a spyglass: I don't read
 most of the e-mails.
  However, this thread seems to be infected: I can smell from here the
 emotions behind your words.
 
  Why push so much emotion into a technical discussion?
  What nerves have been hit by this discussion?

 I think you answered your own question fairly well

Thanks.

 - there's a longstanding, but rarely articulated, culture clash between
 the folks that are primarily interested in the innovators and early
 adopters side of things, and those of us that are most interested in
 bridging the gap to the early majority, late majority and laggards.

 Add in the perfectly reasonable wariness a lot of folks have regarding the
 potential for commercial interests to unfairly exploit open source
 contributors without an adequate return contribution of development effort,
 gratis software, gratis services,

Based on my professional experience, the more a client pays for your skills,
the more likely he is to respect you, because he knows your value.
Conversely, the less a client pays, the more he will try to manipulate you
into doing more than was planned in the contract.

Now, for open source software, there is no money cost, but you still
have the knowledge cost.
If you replace money with knowledge in my two previous sentences, those
sentences are also true.

However, things aren't binary: apart from the contribution level [1] of each
member, good and bad ideas for the future of Python can come from
everybody.
The only thing I'm sure of: I'm incompetent at predicting the future; I have
no idea how each member of our community will react, I can only list some
possible scenarios.
But with the Internet, you know as well as I do that with only a few people
you can change a lot of things; look at Edward Snowden, for example.

About the Python 3 migration, I think that one of our best levers is
newcomers, and by extension, Python trainers/teachers.
If newcomers learn Python 3 first, when they start to work
professionally, they should help to rationalize the Python 3 migration
inside existing dev teams, especially because they don't have a conflict
of interest based on the fact that they haven't written plenty of code with
Python 2.
2020 is around the corner; whether 5 years will be enough to change the
community's mind, I don't know.

[1] Don't forget that contributions aren't only the source code ;-)

 or interesting employment opportunities, and you're going to see the
 occasional flare-ups as we find those rough edges where differences in
 motivation  background lead to differences of opinion  behaviour.

 Cheers,
 Nick.



Re: [Python-Dev] 2.7 is here until 2020, please don't call it a waste.

2015-05-30 Thread Ludovic Gasc
For now, I'm following the mailing-lists through a spyglass: I don't read
most of the e-mails.
However, this thread seems to be infected: I can smell from here the
emotions behind your words.

Why push so much emotion into a technical discussion?
What nerves have been hit by this discussion?

If you know me a little bit, you know I'm always interested in
efficiency improvements, especially around Python.

However, I see two parts of this discussion:

1. Python 3 must continue to be the first class citizen for the features,
bugs-killing and performance improvements, as Barry explained.
Programming in Python isn't only a language, it's also a spirit and a
community with forces and weaknesses.

The main issue for Python 3 adoption by the community is that the Python
community is mainly composed of the Late Majority and Laggards [1], contrary
to some fancy programming languages like Ruby, Go, Rust, <insert your fancy
language here>, where you have a majority of Early Adopters. For example,
the migration from Ruby 1.8 to 1.9 took time because they changed some
critical parts, but finally, now, almost nobody uses Ruby 1.8 in production.
FYI, Ruby 1.9 was released only one year after Python 3.0, and the Ruby
community finished their migration a long time ago, whereas we continue
to support Python 2.7. Maybe the change was less important between Ruby 1.8
and 1.9 than between Python 2 and Python 3; however, I personally think the
majority of Early Adopters in the Ruby community helped a lot with that.

Nevertheless, at least to my eyes, it's proof that, despite the fact that
from time to time somebody announces that Python is dying and that nobody
will use it in production for new projects, Python is clearly a mainstream
programming language; the Python 3 migration time is the best proof, as you
don't see that with the fancy languages.
But it also means that to accelerate Python 3 adoption, we need more
incentives: having a clean migration path, having almost all important
libraries ported, and the fact that Python 3 is more newcomer-friendly [2]
aren't enough; new features and performance are a better incentive, at
least to me. Without AsyncIO, I would still be coding for Python 2.

2. From a strategic point of view, even if it might reduce the adoption
speed of Python 3, it could be a good move to support this for Python 2, to
reduce the risk of a fork of Python: it's better for the Python community
to use Python 2 than no Python at all.
See the NodeJS community: even if the reasons seem to be more political
than technical, forking a language isn't a myth.
If we push Python 2 users too hard to migrate to Python 3, they may reject
the language completely, and everybody will lose in this story.
Moreover, if we start to have a critical mass of Laggards on Python 2 who
have enough money/time to maintain a patch like that, and we reject it, we
will lose the discussion link and mutual enrichment: everybody is concerned
by performance improvements. Personally, only final results matter; I don't
care about the personal motivations: economic, ecological, or simply
publishing a blog post about the fact that the Python community has a
bigger one than some others ;-)

And don't forget: almost nobody cares about our internal discussions and
our drama; people are only interested in the source code we produce, even
the Python developers who use CPython.
Even if we have different motivations, I'm sure that everybody on this
mailing-list, or at least in this thread, believes in Python: you don't
give up personal time during a weekend if Python isn't something important
to you, because during the time you take to write e-mails/source code, you
aren't watching series or taking care of your family.

[1] http://en.wikipedia.org/wiki/Early_adopter#History
[2] It's in French (Google Translate is your friend), but an interesting
point of view from a Python trainer who has switched to Python 3:
http://sametmax.com/python-3-est-fait-pour-les-nouveaux-venus/ (The website
is down for now)
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-05-30 17:42 GMT+02:00 Barry Warsaw ba...@python.org:

 On May 30, 2015, at 06:55 PM, Nick Coghlan wrote:

 Intel are looking to get involved in CPython core development
 *specifically* to work on performance improvements, so it's important
 to offer folks in the community good reasons for why we're OK with
 seeing at least some of that work applied to Python 2, rather than
 restricting their contributions to Python 3.

 I think that's fine, for all the reasons you, Toshio, and others mention.
 For better or worse, Python 2.7 *is* our LTS release so I think we can
 make life easier for the folks stuck on it wink.
 
 However, I want us to be very careful not to accept performance
 improvements in Python 2.7 that haven't also been applied to Python 3,
 unless of course they aren't relevant.  Python 3 also has a need for
 performance improvements, perhaps more so for various reasons, so

[Python-Dev] An yocto change proposal in logging module to simplify structured logs support

2015-05-24 Thread Ludovic Gasc
, that when the LogRecord is sent to the Handler, you can't
retrieve the dict from the extra parameter of the logger.
The only way to do that without patching Python's logging is to rebuild the
dict yourself from a list of the official attributes of LogRecord, as is
done in python-logstash:
https://github.com/vklochan/python-logstash/blob/master/logstash/formatter.py#L23
At least to me, it's a little bit dirty.
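For reference, the python-logstash trick (diffing the record's attributes against the known standard LogRecord fields) can be sketched like this; the class name ExtraFormatter is mine, not from any library:

```python
import logging

# Attributes every LogRecord carries by default; anything beyond these must
# have arrived through the `extra` parameter. Building the set from a dummy
# record keeps it accurate across Python versions.
_DEFAULT_ATTRS = set(
    logging.LogRecord("", 0, "", 0, "", (), None).__dict__
) | {"message", "asctime"}

class ExtraFormatter(logging.Formatter):
    """Append whatever came in via `extra` to the formatted message."""
    def format(self, record):
        extras = {k: v for k, v in record.__dict__.items()
                  if k not in _DEFAULT_ATTRS}
        return "%s %s" % (record.getMessage(), extras)

log = logging.getLogger("demo")
record = log.makeRecord("demo", logging.WARNING, __file__, 1,
                        "user login", (), None, extra={"user_id": 42})
print(ExtraFormatter().format(record))  # user login {'user_id': 42}
```

This works today without any patch, but it recovers the extra keys indirectly instead of reading them as a dict, which is exactly the dirtiness described above.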

My quick'n'dirty patch, which I use for now on our CPython in production:

diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py
index 104b0be..30fa6ef 100644
--- a/Lib/logging/__init__.py
+++ b/Lib/logging/__init__.py
@@ -1382,6 +1382,7 @@ class Logger(Filterer):
 
         rv = _logRecordFactory(name, level, fn, lno, msg, args, exc_info,
                                func, sinfo)
+        rv.extra = extra
         if extra is not None:
             for key in extra:
                 if (key in ["message", "asctime"]) or (key in rv.__dict__):

At least to me, it would be cleaner to add extra as a parameter
of _logRecordFactory, but I have no idea of the side effects; I understand
that the logging module is critical, because it's used everywhere.
However, apart from python-logstash, to my knowledge, the extra parameter
isn't massively used.
The only backward incompatibility I see with a new extra attribute on
LogRecord is that if you have a log call like this:
LOG.debug('message', extra={'extra': 'example'})
it will raise a KeyError("Attempt to overwrite 'extra' in LogRecord")
exception, but, at least to me, the probability of this use case is near 0.

Instead of maintaining this yocto patch, even though it's very small, I
would prefer to have a clean solution in Python directly.

Thanks for your remarks.

Regards.
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/


Re: [Python-Dev] Accepting PEP 492 (async/await)

2015-05-07 Thread Ludovic Gasc
Thank you, Yury, for the whole process (PEP + code + handling the debate).

It's the first time I've followed the genesis of a PEP, from the idea to
acceptance; it was very instructive to me.

--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-05-06 1:58 GMT+02:00 Guido van Rossum gu...@python.org:

 I totally forgot to publicly congratulate Yury on this PEP. He's put a
 huge effort into writing the PEP and the implementation and managing the
 discussion, first on python-ideas, later on python-dev. Congrats, Yury! And
 thanks for your efforts. Godspeed.

 On Tue, May 5, 2015 at 4:53 PM, Guido van Rossum gu...@python.org wrote:

 Everybody,

 In order to save myself a major headache I'm hereby accepting PEP 492.

 I've been following Yury's efforts carefully and I am fully confident
 that we're doing the right thing here. There is only so much effort we can
 put into clarifying terminology and explaining coroutines. Somebody should
 write a tutorial. (I started to write one, but I ran out of time after just
 describing basic yield.)

 I've given Yury clear instructions to focus on how to proceed -- he's to
 work with another core dev on getting the implementation ready in time for
 beta 1 (scheduled for May 24, but I think the target date should be May 19).

 The acceptance is provisional in the PEP 411 sense (stretching its
 meaning to apply to language changes). That is, we reserve the right to
 change the specification (or even withdraw it, in a worst-case scenario)
 until 3.6, although I expect we won't need to do this except for some
 peripheral issues (e.g. the backward compatibility flags).

 I now plan to go back to PEP 484 (type hints). Fortunately in that case
 there's not much *implementation* that will land (just the typing.py
 module), but there's still a lot of language in the PEP that needs updating
 (check the PEP 484 tracker https://github.com/ambv/typehinting/issues).

 --
 --Guido van Rossum (python.org/~guido)




 --
 --Guido van Rossum (python.org/~guido)





Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Ludovic Gasc
2015-04-22 22:46 GMT+02:00 Victor Stinner victor.stin...@gmail.com:

 Kind (A):

 - yield-from coroutines or coroutines based on yield-from
 - maybe asyncio coroutines
 - legacy coroutines?


The name legacy coroutines has the advantage of making it immediately clear
that it isn't a good idea to write new source code with them.


 Kind (B):

 - awaitable coroutines or coroutines based on await
 - asynchronous coroutine to remember the async keyword even if it
 sounds
 wrong to repeat that a coroutine can be interrupted (it's almost the
 definition of a coroutine, no?)
 - or just asynchronous function (coroutine function)  asynchronous
 object (coroutine object)


Personally, if I have a right to vote, async coroutine is just enough, even if
it's a repetition. Or just coroutine?
I'm not a fan of names like new-style coroutines.
By the way, I hope you won't change how async code is written in Python a
third time, because it will be harder to come up with yet another name.

Not related, but one of my coworkers asked me whether, with the new syntax, it
will be possible to write an async decorator for coroutines.
If I understand the new grammar in the PEP correctly, the answer seems to be
yes, but could you confirm?
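For what it's worth, a minimal sketch of such a decorator under the PEP 492
syntax (it uses `asyncio.run()`, which only arrived later in Python 3.7, and
the names `log_calls`/`add` are invented for the example):

```python
import asyncio
import functools

def log_calls(func):
    """Hypothetical decorator: wraps any ``async def`` function."""
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        # The wrapper is itself a coroutine function, so it can await
        # the wrapped coroutine before returning its result.
        result = await func(*args, **kwargs)
        print(f"{func.__name__} -> {result!r}")
        return result
    return wrapper

@log_calls
async def add(a, b):
    return a + b

print(asyncio.run(add(2, 3)))  # prints "add -> 5", then 5
```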


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Ludovic Gasc
+1 on Andrew Svetlov's proposal: please help us migrate as smoothly as
possible to async/await.

--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-04-22 20:32 GMT+02:00 Andrew Svetlov andrew.svet...@gmail.com:

 For now I can mix asyncio.coroutines and `async def` functions, I
 mean I can write `await f()` inside an async def to call an
 asyncio.coroutine `f`, and vice versa: I can use `yield from g()`
 inside an asyncio.coroutine to call `async def g(): ...`.

 If we forbid calling `async def` from regular code, how should asyncio
 work? I'd like to push `async def` everywhere in the asyncio API where
 asyncio.coroutine is currently required.
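The two-way interoperability described above can be sketched with
`types.coroutine`, the mechanism underneath `asyncio.coroutine`
(`asyncio.coroutine` itself was removed in Python 3.11, and `asyncio.run()`
used here for brevity only appeared in 3.7):

```python
import asyncio
import types

@types.coroutine
def old_style():
    # Generator-based coroutine: yields to the event loop once,
    # then returns a value via StopIteration.
    yield
    return 42

async def new_style():
    # An ``async def`` coroutine can await a generator-based one...
    value = await old_style()
    return value + 1

@types.coroutine
def old_calls_new():
    # ...and vice versa: a generator-based coroutine can
    # ``yield from`` an ``async def`` coroutine.
    value = yield from new_style()
    return value * 2

async def main():
    return (await new_style(), await old_calls_new())

print(asyncio.run(main()))  # -> (43, 86)
```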

 On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov yselivanov...@gmail.com
 wrote:
  Hi Rajiv,
 
  On 2015-04-22 12:53 PM, Rajiv Kumar wrote:
 
  I'd like to suggest another way around some of the issues here, with
  apologies if this has already been discussed sometime in the past.
 
   From the viewpoint of a Python programmer, there are two distinct
 reasons
  for wanting to suspend execution in a block of code:
 
  1. To yield a value from an iterator, as Python generators do today.
 
  2. To cede control to the event loop while waiting for an asynchronous
  task
  to make progress in a coroutine.
 
  As of today both of these reasons to suspend are supported by the same
  underlying mechanism, i.e. a yield at the end of the chain of yield
  froms. PEPs 492 and 3152 introduce await and cocall, but at the
  bottom
  of it all there's effectively still a yield as I understand it.
 
  I think that the fact that these two concepts use the same mechanism is
  what leads to the issues with coroutine-generators that Greg and Yury
 have
  raised.
 
  With that in mind, would it be possible to introduce a second form of
  suspension to Python to specifically handle the case of ceding to the
  event
  loop? I don't know what the implementation complexity of this would be,
 or
  if it's even feasible. But roughly speaking, the syntax for this could
 use
  await, and code would look just like it does in the PEP. The semantics
  of
  await Task would be analogous to yield from Task today, with the
  difference that the Task would go up the chain of awaits to the
  outermost
  caller, which would typically be asyncio, with some modifications from
 its
  form today. Progress would be made via __anext__ instead of __next__.
 
 
  I think that what you propose is a great idea. However, its
  implementation will be far more invasive than what PEP 492
  proposes.  I doubt that we'll be able to make it in 3.5 if
  we choose this route.
 
  BUT: With my latest proposal to disallow for..in loops and
  iter()/list()-like builtins, the fact that coroutines are
  based internally on generators is just an implementation
  detail.
 
  There is no way users can exploit the underlying generator
  object.  Coroutine-objects only provide 'send()' and 'throw()'
  methods, which they would also have with your implementation
  idea.
 
  This gives us freedom to consider your approach in 3.6 if
  we decide to add coroutine-generators.  To make this work
   we might want to patch inspect.py to make the isgenerator() family
   of functions return False for coroutines/coroutine-objects.
 
  Thanks a lot for the feedback!
 
  Yury



 --
 Thanks,
 Andrew Svetlov
