Asyncio -- delayed calculation

2016-11-28 Thread Steve D'Aprano
I'm a complete and utter newbie when it comes to asynchronous programming,
so I may have the entire concept backwards here. But treat this as a
learning exercise rather than something I'd really do.

Suppose I have a bunch of calculations to do: count down from 10. So I have
a bunch of objects:

import time

class Counter:
    def __init__(self):
        self.count = 10

    def count_down(self):
        print(self, "starting")
        while self.count > 0:
            # simulate a computation
            time.sleep(0.5)
            self.count -= 1
        print(self, "completed")

pool = [Counter() for i in range(5)]

for obj in pool:
    obj.count_down()



Since that's all blocking, I wait for the first Counter to count down to
zero before I move on to the second.

Let's pretend that the computation can be performed asynchronously, so that
I can have all five Counter objects counting down in parallel. I have this:


import asyncio

class Counter:
    def __init__(self):
        self.count = 10

    async def count_down(self):
        print(self, "starting")
        while self.count > 0:
            # simulate a computation
            await asyncio.sleep(0.5)
            self.count -= 1
        print(self, "completed")

async def main():
    pool = [Counter() for i in range(5)]
    for obj in pool:
        obj.count_down()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())




When I try running that, I get no output. No error, no exception, the
run_until_complete simply returns instantly.

What am I doing wrong?

What I expected is that the pool of Counter objects would be created, each
one would have their count_down() method called without blocking, so I'd
have something like:

# IDs are simulated for ease of comprehension
<__main__.Counter object at 0x0123> starting
<__main__.Counter object at 0x0246> starting
<__main__.Counter object at 0x048c> starting
<__main__.Counter object at 0x0918> starting
<__main__.Counter object at 0x1230> starting
<__main__.Counter object at 0x0123> completed
<__main__.Counter object at 0x0246> completed
<__main__.Counter object at 0x048c> completed
<__main__.Counter object at 0x0918> completed
<__main__.Counter object at 0x1230> completed



-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Mon, Nov 28, 2016 at 11:48 PM, Steve D'Aprano
 wrote:
> When I try running that, I get no output. No error, no exception, the
> run_until_complete simply returns instantly.

When I do, I get this warning:

asynctest.py:17: RuntimeWarning: coroutine 'Counter.count_down' was
never awaited
  obj.count_down()

Putting an 'await' in front of that call causes the tasks to be run
consecutively, of course. The most similar code for running tasks
concurrently seems to be this:

async def main():
    pool = [Counter() for i in range(5)]
    await asyncio.gather(*(obj.count_down() for obj in pool))

Taken from:
https://docs.python.org/3/library/asyncio-task.html#example-parallel-execution-of-tasks

There may be other ways, but that's the best I could find. It seems to
do what you want.
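
For reference, folding that into your original program gives something like
this (untested sketch, Python 3.5+, reusing your Counter class unchanged):

import asyncio

class Counter:
    def __init__(self):
        self.count = 10

    async def count_down(self):
        print(self, "starting")
        while self.count > 0:
            # simulate a computation
            await asyncio.sleep(0.5)
            self.count -= 1
        print(self, "completed")

async def main():
    pool = [Counter() for i in range(5)]
    # gather() wraps each coroutine in a Task and waits for all of them,
    # so the event loop interleaves the five countdowns
    await asyncio.gather(*(obj.count_down() for obj in pool))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())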

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Simple Python equivalent for the shell command

2016-11-28 Thread Ganesh Pal
I was trying to write a function that will return the unique number
associated with each employee id. The command's output has the pattern
below:

Linux-Box-1# employee_details ls
List of names:
100910bd9 s7018
100d60003 s7019
110610bd3 s7020
100d60002 s7021


Linux-Box-1# employee_details ls | grep "s7020" | awk '{print $1}'
100d60003

It's a one-liner in shell :)


I tried converting the same thing into Python style. Any better suggestions,
or loopholes in the program below?

def get_unique_number(emp_id):
    """ Return the unique number associated with each employee id """
    out, err, rc = run("employee_details ls", timeout=600)
    emp_unum = ""
    if rc != 0:
        return False

    for line in out.split('\n'):
        if emp_id in line:
            emp_unum = line.split()[0]
    return emp_unum

I am on Python 2.7 and Linux OS

Regards,
Ganesh
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: correct way to catch exception with Python 'with' statement

2016-11-28 Thread Ganesh Pal
On Mon, Nov 28, 2016 at 1:16 PM, Steven D'Aprano <
steve+comp.lang.pyt...@pearwood.info> wrote:

>
>
> There is no need to return True. The function either succeeds, or it
> raises an
> exception, so there is no need to return any value at all.
>
>
I returned True here because, based on the result of this function, I
would want to perform the next steps.

Example:

    if create_files_append():
        do_something()
    else:
        do_next_thing()



>
> Your comment says "append the files", but you're not appending to the
> files,
> you are overwriting them. So your code is better written like this:
>
>
Yes, correct, and apologies, there was a typo: it should have been 'a'
instead of 'w'. Thanks for the comments.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Simple Python equivalent for the shell command

2016-11-28 Thread Ganesh Pal
I remembered that I might need to add an else condition for when the
emp_num does not exist, so I am re-sending the updated code.

def get_unique_number(emp_id):
    """ Return the unique number associated with each employee id """
    out, err, rc = run("employee_details ls", timeout=600)
    emp_unum = ""
    if rc != 0:
        return False

    for line in out.split('\n'):
        if emp_id in line:
            emp_unum = line.split()[0]
        else:
            print("emp_unum does not exist")
            return False
    return emp_unum

PS :  [Edited the above code with else condition]

Regards,
Ganesh


On Mon, Nov 28, 2016 at 8:38 PM, Ganesh Pal  wrote:

>
>
> I  was trying  to write a function that will return me the unique number
> associated with each employee id.The command has the output in the below
> pattern
>
> Linux-Box-1# employee_details ls
> List of names:
> 100910bd9 s7018
> 100d60003 s7019
> 110610bd3 s7020
> 100d60002 s7021
>
>
> Linux-Box-1# employee_details ls | grep "s7020" | awk '{print $1}'
> 100d60003
>
> It's a one liner in Shell  :)
>
>
> I tried converting  the same  in the python style , Any better suggestion
> and loop holes in the below program
>
> def get_unique_number(emp_id):
>     """ Return the unique number associated with each employee id """
>     out, err, rc = run("employee_details ls", timeout=600)
>     emp_unum = ""
>     if rc != 0:
>         return False
>
>     for line in out.split('\n'):
>         if emp_id in line:
>             emp_unum = line.split()[0]
>     return emp_unum
>
> I am on Python 2.7 and Linux OS
>
> Regards,
> Ganesh
>
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Ian Kelly
On Mon, Nov 28, 2016 at 5:48 AM, Steve D'Aprano
 wrote:
> Let's pretend that the computation can be performed asynchronously, so that
> I can have all five Counter objects counting down in parallel. I have this:
>
>
> import asyncio
>
> class Counter:
>     def __init__(self):
>         self.count = 10
>
>     async def count_down(self):
>         print(self, "starting")
>         while self.count > 0:
>             # simulate a computation
>             await asyncio.sleep(0.5)
>             self.count -= 1
>         print(self, "completed")
>
> async def main():
>     pool = [Counter() for i in range(5)]
>     for obj in pool:
>         obj.count_down()
>
> loop = asyncio.get_event_loop()
> loop.run_until_complete(main())
>
>
>
>
> When I try running that, I get no output. No error, no exception, the
> run_until_complete simply returns instantly.
>
> What am I doing wrong?

Remember that coroutines are basically generators. Native "async def"
coroutines are dressed up as something different, but they were still
designed as a drop-in replacement for generator coroutines. If
count_down were a generator function then simply calling it wouldn't
really do anything and as a native coroutine it still doesn't (other
than return a "coroutine object").

In order for the coroutines to actually do anything, you need to
schedule them in some way with the event loop. That could take the
form of awaiting them from some other coroutine, or passing them
directly to loop.run_until_complete or event_loop.create_task, or as
Chris suggested awaiting them as an aggregate.
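
For example, a rough sketch of the create_task route (assuming the Counter
class from your post is already defined) would be:

async def main():
    loop = asyncio.get_event_loop()
    pool = [Counter() for i in range(5)]
    # wrap each coroutine object in a Task so the loop actually schedules it
    tasks = [loop.create_task(obj.count_down()) for obj in pool]
    # now suspend main() until every task has finished
    await asyncio.wait(tasks)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())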
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Simple Python equivalent for the shell command

2016-11-28 Thread Michael Torrie
On 11/28/2016 08:08 AM, Ganesh Pal wrote:
> I  was trying  to write a function that will return me the unique number
> associated with each employee id.The command has the output in the below
> pattern
> 
> Linux-Box-1# employee_details ls
> List of names:
> 100910bd9 s7018
> 100d60003 s7019
> 110610bd3 s7020
> 100d60002 s7021
> 
> 
> Linux-Box-1# employee_details ls | grep "s7020" | awk '{print $1}'
> 100d60003
> 
> It's a one liner in Shell  :)
> 
> 
> I tried converting  the same  in the python style , Any better suggestion
> and loop holes in the below program

Well Bash is really good at some things.  Piping commands together is
one of those things.  Python can do such things but not in as compact a
way.  For one Python has no quick way of interfacing with subprograms as
if they were language-level constructs like Bash does.  But it still can
do a lot of the same tasks that shell scripting does.  Read on.

In many respects, generators are the Python equivalent of pipes. See
this: http://www.dabeaz.com/generators/, specifically
http://www.dabeaz.com/generators/Generators.pdf

Here's a simple grep-like filter that searches lines:

def simple_grep(search_in, look_for):
    for line in search_in:
        if look_for in line:
            yield line

import sys
for result in simple_grep(sys.stdin, "s7020"):
    print result.split()[0]

This obviously does not run a subprogram, but could feed the filter
input from any file-like object that returns lines.  The beauty of
generators is that you can string them together just like you would do
in Bash with pipes.  And they can potentially be fairly fast, especially
if you can reduce the number of lines the generators have to process by
putting the faster filters earlier on in the chain (much like you would
optimize a database query by pushing selections ahead of joins).  Though
this is not shorter than your code, nor is it substantially different,
it does have the advantage that the simple_grep() generator can be used
in multiple places and could even be placed in a utility library.
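
For instance, here is a second tiny generator playing the role of
awk '{print $1}', chained onto simple_grep just like a shell pipeline
("names.txt" is a made-up stand-in for your command's output):

def first_field(lines):
    # emulate awk '{print $1}': yield the first whitespace-separated field
    for line in lines:
        yield line.split()[0]

# roughly: cat names.txt | grep s7020 | awk '{print $1}'
with open("names.txt") as f:
    for field in first_field(simple_grep(f, "s7020")):
        print field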

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: correct way to catch exception with Python 'with' statement

2016-11-28 Thread Michael Torrie
On 11/28/2016 08:18 AM, Ganesh Pal wrote:
> On Mon, Nov 28, 2016 at 1:16 PM, Steven D'Aprano <
> steve+comp.lang.pyt...@pearwood.info> wrote:
> 
>>
>>
>> There is no need to return True. The function either succeeds, or it
>> raises an
>> exception, so there is no need to return any value at all.
>>
>>
>  I returned True here ,because based on the result of this function , I
> would want to perform next steps

Except that you didn't read what Steven wrote, and you may not be
understanding your own code.  If the function fails it never returns
anyway, since an exception is raised!  So you don't need to worry about
a return type at all. You just do:

create_files_append()
do_something()

If you want to check for the exception, you would do:
try:
    create_files_append()
    do_something()
except OSError:
    do_next_thing()


-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Immutability of Floats, Ints and Strings in Python

2016-11-28 Thread Random832
On Fri, Nov 25, 2016, at 06:33, Ned Batchelder wrote:
> A Python implementation can choose when to reuse immutable objects and
> when not to.  Reusing a value has a cost, because the values have to
> be kept, and then found again. So the cost is only paid when there's
> a reasonable chance that the values will actually be needed again.
> And that cost has to be weighed against the opposite cost of simply
> making a new object instead.

Of course, there are more complicated costs to consider. For an
implementation where objects do not have a naturally persistent object
identity (such as an immovable address in memory as in cpython) they may
consider it easier to have an "object identity" that consists of the
whole value for immutable types rather than pay whatever costs are
associated with having a unique and unchanging "id" value. It's also not
hard to imagine an implementation that has "references" consisting of a
tagged union and incorporating the value in the "reference" itself (and
therefore the id) for floats and small integer/string values, though I
can't name any implementation (of Python, anyway - it's not uncommon for
Lisp) that does so.
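
As a concrete illustration of the reuse Ned describes, CPython (as an
implementation detail, not a language guarantee) keeps a cache of small
integers, so equal small values can turn out to be the very same object
while larger ones usually are not:

a = int("100")
b = int("100")
print(a is b)   # True in CPython: both names refer to the cached int 100

c = int("1000")
d = int("1000")
print(c is d)   # usually False in CPython: two distinct objects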
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Simple Python equivalent for the shell command

2016-11-28 Thread Michael Torrie
On 11/28/2016 08:55 AM, Michael Torrie wrote:
> Well Bash is really good at some things.  Piping commands together is
> one of those things.  Python can do such things but not in as compact a
> way.  For one Python has no quick way of interfacing with subprograms as
> if they were language-level constructs like Bash does.  But it still can
> do a lot of the same tasks that shell scripting does.  Read on.
> 
> In many respects, generators are the Python equivalent of pipes. See
> this: http://www.dabeaz.com/generators/, specifically
> http://www.dabeaz.com/generators/Generators.pdf

I should mention that the one thing that Python has no support for in
its generator methodology is the idea of having one execution unit that
can have both standard out and standard error pipes and communicate with
both at the same time. Bash can support arbitrary pipes I believe.
However even Bash's syntax is awkward, and I can't imagine a decent way
to do it in Python anyway.  Perhaps yielding a tuple.
-- 
https://mail.python.org/mailman/listinfo/python-list


plot band structure

2016-11-28 Thread badr
Hello everybody
I have an XML file for a band structure that I would like to plot like this:
https://pypi.python.org/pypi/pydass_vasp/0.1.
I have downloaded Python and I want to know what code to use and how to run
the program. I am a beginner, so please, I need some help. Thank you in advance.

Regards

Rachid
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: correct way to catch exception with Python 'with' statement

2016-11-28 Thread Thomas 'PointedEars' Lahn
Ganesh Pal wrote:

> I am using Python 2.7 and Linux

As a rule of thumb¹, use at least Python 3.3 for new programs.

> What will be the best  way to catch the exception in the above program  ?
> Can we replace both the  with statement in the above program with
> something like below
> 
> try:
>     for i in range(1000):
>         with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
>             f.write("hello")
> except IOError as e:
>     raise

If you do it like this, you will have a hard time figuring out what caused
the exception, and it becomes harder the more you do in the loop, because
*several* statements could throw an exception of the *same* type.

Therefore, as a rule of thumb¹, if you catch exceptions, catch them closest 
to where they could occur:

for i in range(1000):
    try:
        with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
            try:
                f.write("hello")
            except … as e:
                raise e
    except … as e:
        raise e

This is just an example; you do not have to just re-raise the exception; 
“raise” is just where your custom exception handling, if any, should be.  
For example, if f.write() fails with an IOError, you could decide to ignore 
that and “continue” with the next file, whereas you may decide that failing 
to open a file is so grave a problem that you want to abort the program at 
this point.
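
A sketch of that second policy (keep going on a failed write, abort on a
failed open), assuming os is imported and QA_TEST_DIR is defined as in your
program:

for i in range(1000):
    # an IOError from open() is left to propagate and abort the program
    with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
        try:
            f.write("hello")
        except IOError:
            # a failed write on one file is not fatal; go on to the next one
            continue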

Finally, use uniform indentation and spacing in your code; the former is
even more important in Python than in other programming languages because
indentation delimits block statements in Python.  Use a Python-aware editor
that makes it easy to create uniform indentation with the Tab key and that
offers a reformatting feature for existing code.  I can recommend Atom [0]
with the packages “autocomplete-python”, “linter-pylint”, “python-isort”,
and “python-tools” (“language-python” is a built-in); and Vim [1] with

if has("syntax")
…
syntax on
…
endif
…
filetype plugin indent on

in ~/.vimrc and a ~/.vim/after/ftplugin/python.vim containing

setlocal expandtab
setlocal shiftwidth=4
setlocal softtabstop=4

___
¹  YMMV

[0] <https://atom.io/>
[1] <http://www.vim.org/>
-- 
PointedEars

Twitter: @PointedEars2
Please do not cc me. / Bitte keine Kopien per E-Mail.
-- 
https://mail.python.org/mailman/listinfo/python-list


correct way to catch exception with Python 'with' statement

2016-11-28 Thread g thakuri
Dear Python friends,

Any suggestions on how to add exception handling and make the program below
look better? I am using Python 2.7 and Linux.



def create_files_append():
    """  """
    try:
        os.makedirs(QA_TEST_DIR)
    except:
        raise OSError("Can't create directory (%s)!" % (QA_TEST_DIR))

    # Create few files and write something
    for i in range(1000):
        with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
            f.write("hello")

    # Append the files
    for i in range(1000):
        with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
            f.write("hello")

    return True

---

What would be the best way to catch the exception in the above program?
Can we replace both of the with statements in the above program with
something like the below?

try:
    for i in range(1000):
        with open(os.path.join(QA_TEST_DIR, "filename%d" % i), 'w') as f:
            f.write("hello")
except IOError as e:
    raise

Regards,
PT
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: ANN: JavaScrypthon 0.5, now with embedded evaluation of transpiled code

2016-11-28 Thread Amirouche Boubekki
On Sat, Nov 26, 2016 at 7:21 PM Alberto Berti <
azazel+python-annou...@arstecnica.it> wrote:

> Hi all,
>

Héllo!


> i'm pleased to announce that JavaScripthon 0.5 has been released to
> PyPI. JavaScrypthon can translate a subset of Python 3.5 code to  ES6
> JavaScript producing beautiful and lean  code, while supporting some of
> the latest Python features.
>
> Changelog
> (https://github.com/azazel75/metapensiero.pj/blob/master/CHANGES.rst)


This is the best of ES6 in a Python syntax.

Impressive work.

I have a few questions:

a) Why did you choose not to use your own JavaScript class to represent
Python dicts?

b) Do you do some type tracing or something to be able to convert
``foo.update(bar)`` to ``Object.assign(foo, bar)``?

c) IIUC you don't implement the Python type system and rely on the ES6 (or
ES7) class system? Is that correct? If so, how do you implement class
decorators and method decorators?

d) What about Python <-> JavaScript interop? It's a tremendous topic not to
be forgotten.

A few remarks:

i) I had a remark about arguments, but JavaScripthon doesn't support
**kwargs at the call site. This is implemented in Pythonium; have a look,
it has some js <-> python interop.

ii) Too bad I can't run the vanilla pystone benchmark.

I created a similar project targeting ES5 [0].

Also, FWIW users are looking for a Javascript replacement that is real
Python, not another coffeescript.

Best wishes!

[0] https://github.com/amirouche/pythonium/blob/master/dodge.py
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: ANN: JavaScrypthon 0.5, now with embedded evaluation of transpiled code

2016-11-28 Thread Eric S. Johansson


On 11/28/2016 2:02 PM, Amirouche Boubekki wrote:
> Also, FWIW users are looking for a Javascript replacement that is real
> Python, not another coffeescript.
does this count? http://brython.info/

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: The Case Against Python 3

2016-11-28 Thread Gregory Ewing

Steve D'Aprano wrote:

I daresay you are right that a sufficiently clever adversary may have found
an exploit. But there's no sign that anyone actually did find an exploit,
until f-strings made exploiting this trivial.


The person who wrote the bug report found at least one
way of exploiting it that doesn't require f-strings.

I agree that f-strings are not to blame here. If we really
want to avoid breaking anyone's ill-conceived attempts at
sandboxing eval, we'd better not add anything more to the
language, ever, because nobody can foresee all the possible
consequences.

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: The Case Against Python 3

2016-11-28 Thread Paul Rubin
Gregory Ewing  writes:
> I agree that f-strings are not to blame here. If we really want to
> avoid breaking anyone's ill-conceived attempts at sandboxing eval,
> we'd better not add anything more to the language, ever, because
> nobody can foresee all the possible consequences.

I'm surprised eval was used like that.  It seems ill-advised.  Something
similar happened with pickles some time back.  Oh my, now I'm reminded
of how old we've all gotten:

"Using eval this way is like storing a vat of cyanide in your
child's bedroom.  Sure, maybe if you check the seals and locks on
the vat carefully enough, you can convince yourself that your child
won't be able to get to the cyanide.  But wouldn't you feel safer
just not having the vat there at all?  That's basic
safety-consciousness.  Security consciousness works the same way.
Try to keep dangerous ingredients and attackers as far away from
each other as possible."  ( http://bugs.python.org/msg6972 )
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: The Case Against Python 3

2016-11-28 Thread Steve D'Aprano
On Tue, 29 Nov 2016 09:35 am, Gregory Ewing wrote:

> Steve D'Aprano wrote:
>> I daresay you are right that a sufficiently clever adversary may have
>> found an exploit. But there's no sign that anyone actually did find an
>> exploit, until f-strings made exploiting this trivial.
> 
> The person who wrote the bug report found at least one
> way of exploiting it that doesn't require f-strings.

Oops, of course you are right. The bug report says:

My first discovery was that nothing prevents an input plural 
string that resembles a function call:

   gettext.c2py("n()")(lambda: os.system("sh"))

This is of course a low risk bug, since it requires control
of both the plural function string and the argument.

http://bugs.python.org/issue28563


And later on:

Instead of passing a string to eval, we can build a string
from characters in the docstrings available in the context
of the gettext module:
[...]
This will successfully spawn a shell in Python 2.7.11.

Which I had forgotten about. (And by the way: the exploit could have been
avoided by refusing any string which contains an underscore. If you can't
avoid vulnerable code, at least make attackers work hard to exploit it.)

The point I was making was this comment:

Bonus: With the new string interpolation in Python 3.7, 
exploiting gettext.c2py becomes trivial:

   gettext.c2py('f"{os.system(\'sh\')}"')(0)

The tokenizer will recognize the entire format-string as 
JUST A STRING [emphasis added]

f-strings are *not* "just a string", as this shows -- they're actually
equivalent to a call to eval. And *that* is my point: we've picked a syntax
which looks like a static literal string for something which is not only a
function call but equivalent to the second most powerful (and hence
dangerous) built-in function call available in the language.

Normally I think Guido's instinct for syntax and language features is pretty
good, but here I think he's made a blunder. The original proposal on the
Python-Ideas mailing list was for syntax to automatically perform *only*
name lookups so that 

f"{x}"

was syntactic sugar for

"{x}".format(x=x)


and from that simple request it has grown in scope to the point that we can
now write:

f"{os.system('sh')}"

as syntactic sugar for:

str(eval("os.system('sh')"))

Yay for progress!
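
A harmless demonstration of the same point (Python 3.6 or later): the braces
hold an arbitrary expression which is evaluated at runtime, function calls
included.

import os

x = 6
print(f"{x * 7}")        # prints 42 -- the expression is evaluated
print(f"{os.getcwd()}")  # any function call works inside the braces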



> I agree that f-strings are not to blame here.

I've already said that. Certainly the *vulnerability* comes from the use of
eval by gettext.c2py, but the TRIVIAL *exploit* comes from f-strings.


> If we really 
> want to avoid breaking anyone's ill-conceived attempts at
> sandboxing eval, we'd better not add anything more to the
> language, ever, because nobody can foresee all the possible
> consequences.

Now you're just being silly, this isn't "anything", it is a specific design
decision: something which looks like, and is treated by the tokeniser, as a
string but is actually a hidden call to eval.

And in fact I did foresee the consequences. I never predicted this *exact*
vulnerability, of course, but I did say that disguising a call to eval
inside something which looks like a string would lead to trouble. I was
poo-pooed for that idea, so please excuse me if I gloat a little when the
first "trouble" is discovered before the feature is even available in a
production release.

I don't say this as if it were an amazing feat of prediction. I'm astonished
that apparently others can't, or won't, see it. It's like when people store
weed killer or some other poison in a soft-drink bottle, complete with the
label still on, and then are surprised when someone gets poisoned.

http://www.dailymail.co.uk/news/article-1337101/Father-Phillip-Ward-dies-accidentally-drinking-weedkiller-Lucozade-bottle.html

(Not the first, or last, case of accidental poisoning from herbicide or
pesticide stored inappropriately.)



-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Steve D'Aprano
On Tue, 29 Nov 2016 02:53 am, Ian Kelly wrote:

> In order for the coroutines to actually do anything, you need to
> schedule them in some way with the event loop. That could take the
> form of awaiting them from some other coroutine, or passing them
> directly to loop.run_until_complete or event_loop.create_task, or as
> Chris suggested awaiting them as an aggregate.

I thought that's what I had done, by calling

loop.run_until_complete(main())



-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: The Case Against Python 3

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 10:54 AM, Steve D'Aprano
 wrote:
> Now you're just being silly, this isn't "anything", it is a specific design
> decision: something which looks like, and is treated by the tokeniser, as a
> string but is actually a hidden call to eval.
>

This, I think, is the crux. A "hidden eval" is a fundamentally bad
thing. Python 2's input() function is bad for this reason - not
because eval is necessarily evil (there are times when that's the
exact behaviour you want), but because the simple "get text from the
keyboard" function shouldn't conceal an eval of user text.

The solution, IMO, is to treat f-strings as expressions and NOT as
strings. They are no more strings than function calls are:

"###".join(os.system("sh"))

Everything that tokenizes Python code needs to be aware of the
different string prefixes already (eg a raw string literal doesn't end
at the same point as other string literals do), and this should be no
different. The stdlib ast.parse (really a wrapper around compile)
already gets this right:

>>> ast.dump(ast.parse('lambda x,y: f"{x} + {y} = {x+y}"'))
"Module(body=[Expr(value=Lambda(args=arguments(args=[arg(arg='x',
annotation=None), arg(arg='y', annotation=None)], vararg=None,
kwonlyargs=[], kw_defaults=[], kwarg=None, defaults=[]),
body=JoinedStr(values=[FormattedValue(value=Name(id='x', ctx=Load()),
conversion=-1, format_spec=None), Str(s=' + '),
FormattedValue(value=Name(id='y', ctx=Load()), conversion=-1,
format_spec=None), Str(s=' = '),
FormattedValue(value=BinOp(left=Name(id='x', ctx=Load()), op=Add(),
right=Name(id='y', ctx=Load())), conversion=-1,
format_spec=None)])))])"

So what is it that's trying to read something and is calling an
f-string a mere string?

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 11:20 AM, Steve D'Aprano
 wrote:
> On Tue, 29 Nov 2016 02:53 am, Ian Kelly wrote:
>
>> In order for the coroutines to actually do anything, you need to
>> schedule them in some way with the event loop. That could take the
>> form of awaiting them from some other coroutine, or passing them
>> directly to loop.run_until_complete or event_loop.create_task, or as
>> Chris suggested awaiting them as an aggregate.
>
> I thought that's what I had done, by calling
>
> loop.run_until_complete(main())

That's invoking a single task, main(). If you were to write this code
using threads, main would need to be thread-aware so that it can fork
appropriately - it needs to start new threads for the subtasks. Tasks
in asyncio are like cooperatively-switched threads or threadlets, and
if you want them to run in parallel, you have to instruct them to run
as separate tasks.
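
To make the parallel concrete, here is roughly what a thread-based version
of the example would look like (untested sketch): main() has to start and
join the workers explicitly, just as the asyncio version has to create
tasks or gather the coroutines.

import threading
import time

class Counter:
    def __init__(self):
        self.count = 10

    def count_down(self):
        print(self, "starting")
        while self.count > 0:
            # simulate a computation
            time.sleep(0.5)
            self.count -= 1
        print(self, "completed")

def main():
    pool = [Counter() for i in range(5)]
    threads = [threading.Thread(target=obj.count_down) for obj in pool]
    for t in threads:
        t.start()   # fork: each counter runs in its own thread
    for t in threads:
        t.join()    # wait for all of them to finish

main()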

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Steve D'Aprano
On Tue, 29 Nov 2016 12:03 am, Chris Angelico wrote:

> On Mon, Nov 28, 2016 at 11:48 PM, Steve D'Aprano
>  wrote:
>> When I try running that, I get no output. No error, no exception, the
>> run_until_complete simply returns instantly.
> 
> When I do, I get this warning:
> 
> asynctest.py:17: RuntimeWarning: coroutine 'Counter.count_down' was
> never awaited
>   obj.count_down()

Ah yes, I saw that earlier, from an earlier version of the code that failed
with an exception. But it was enough to raise the warning, which then was
suppressed.


> Putting an 'await' in front of that call causes the tasks to be run
> consecutively, of course. The most similar code for running tasks
> concurrently seems to be this:
> 
> async def main():
>     pool = [Counter() for i in range(5)]
>     await asyncio.gather(*(obj.count_down() for obj in pool))
> 
> Taken from:
>
https://docs.python.org/3/library/asyncio-task.html#example-parallel-execution-of-tasks

This is confusing: why is this awaiting something inside an async function?
Doesn't that mean that the await asyncio.gather(...) call is turned
blocking?



-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Zentrader
Take a look at Doug Hellmann's example using multiprocessing at 
https://pymotw.com/2/multiprocessing/basics.html  You should be able to 
substitute the count down example directly into the first example.
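
Roughly, that substitution might look something like this (untested sketch
following the pattern from that page, with the Counter collapsed into a
plain function so it pickles cleanly across processes):

import multiprocessing
import time

def count_down(name, count=10):
    # simplified stand-in for Counter.count_down from the original post
    print(name, "starting")
    while count > 0:
        # simulate a computation
        time.sleep(0.5)
        count -= 1
    print(name, "completed")

if __name__ == '__main__':
    workers = [multiprocessing.Process(target=count_down,
                                       args=("counter %d" % i,))
               for i in range(5)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()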
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 1:23 PM, Steve D'Aprano
 wrote:
> This is confusing: why is this awaiting something inside an async function?
> Doesn't that mean that the await asyncio.gather(...) call is turned
> blocking?

"await" means "don't continue this function until that's done". It
blocks the function until a non-blocking operation is done.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Zachary Ware
On Mon, Nov 28, 2016 at 6:48 AM, Steve D'Aprano
 wrote:
> What am I doing wrong?

Give yourself a bit more to debug with, since you're going to want to
do something with the result your expensive calculation anyway:

import asyncio

class Counter:
    def __init__(self, i):
        self.count = 10
        self.i = i

    async def count_down(self):
        print(self, self.i, "starting")
        while self.count > 0:
            # simulate a computation
            await asyncio.sleep(0.5)
            self.count -= 1
        print(self, self.i, "completed")
        return self.i + self.count

async def main():
    pool = [Counter(i) for i in range(5)]
    results = []
    for obj in pool:
        results.append(obj.count_down())
    return results

loop = asyncio.get_event_loop()
print(loop.run_until_complete(main()))


This gives you:

[<coroutine object Counter.count_down at 0x...>, <coroutine object
Counter.count_down at 0x...>, ...]
asynctest.py:25: RuntimeWarning: coroutine 'Counter.count_down' was
never awaited
  print(loop.run_until_complete(main()))

Ok, so let's fix that by adding an 'await' on line 21 (it's reported
at line 25 because that's when the unawaited coroutines are gc'd):

results.append(await obj.count_down())

Running that gives:

<__main__.Counter object at 0x10203f978> 0 starting
<__main__.Counter object at 0x10203f978> 0 completed
<__main__.Counter object at 0x1025af710> 1 starting
<__main__.Counter object at 0x1025af710> 1 completed
<__main__.Counter object at 0x1025b60b8> 2 starting
<__main__.Counter object at 0x1025b60b8> 2 completed
<__main__.Counter object at 0x1025b60f0> 3 starting
<__main__.Counter object at 0x1025b60f0> 3 completed
<__main__.Counter object at 0x1025b6128> 4 starting
<__main__.Counter object at 0x1025b6128> 4 completed
[0, 1, 2, 3, 4]

Still not right, only one count_down is run at a time.  But that's
because we're using a synchronous for loop to await our results and
populate the results list.  Naively, I tried an 'async for', but
that's trying to be asynchronous in the wrong place:

Traceback (most recent call last):
  File "asynctest.py", line 25, in 
print(loop.run_until_complete(main()))
  File "/usr/lib/python3.5/asyncio/base_events.py", line 387, in
run_until_complete
return future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
raise self._exception
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
result = coro.send(None)
  File "asynctest.py", line 20, in main
async for obj in pool:
TypeError: 'async for' requires an object with __aiter__ method, got list

So instead, checking the docs suggests using asyncio.gather for
parallel execution of tasks, which takes a variable number of
'coros_or_futures'.  On the first attempt, we saw that we had
accidentally created a list of "coroutine objects", so lets go back
and use that:

--- asynctest.py.orig 2016-11-28 21:03:04.0 -0600
+++ asynctest.py 2016-11-28 21:03:35.0 -0600
@@ -16,9 +16,10 @@
 
 async def main():
     pool = [Counter(i) for i in range(5)]
-    results = []
+    coros = []
     for obj in pool:
-        results.append(await obj.count_down())
+        coros.append(obj.count_down())
+    results = asyncio.gather(*coros)
     return results
 
 loop = asyncio.get_event_loop()

Output:

<__main__.Counter object at 0x1026b6160> 4 starting
<__main__.Counter object at 0x10213f978> 0 starting
<__main__.Counter object at 0x1026b6128> 3 starting
<__main__.Counter object at 0x1026af748> 1 starting
<__main__.Counter object at 0x1026b60f0> 2 starting
<_GatheringFuture pending>

Now we've started everything asynchronously, but it exited way too
fast and never completed anything.  But instead of a list of results,
we got a _GatheringFuture at the end of everything.  So let's await
that:

results = await asyncio.gather(*coros)

And now we get:

<__main__.Counter object at 0x101eb6160> 4 starting
<__main__.Counter object at 0x10063f978> 0 starting
<__main__.Counter object at 0x101eb6128> 3 starting
<__main__.Counter object at 0x101eaf748> 1 starting
<__main__.Counter object at 0x101eb60f0> 2 starting
<__main__.Counter object at 0x101eb6160> 4 completed
<__main__.Counter object at 0x10063f978> 0 completed
<__main__.Counter object at 0x101eb6128> 3 completed
<__main__.Counter object at 0x101eaf748> 1 completed
<__main__.Counter object at 0x101eb60f0> 2 completed
[0, 1, 2, 3, 4]


And there we have it.  Our final script is:

import asyncio

class Counter:
    def __init__(self, i):
        self.count = 10
        self.i = i

    async def count_down(self):
        print(self, self.i, "starting")
        while self.count > 0:
            # simulate a computation
            await asyncio.sleep(0.5)
            self.count -= 1
        print(self, self.i, "completed")
        return self.i + self.count

async def main():
    pool = [Counter(i) for i in range(5)]
    coros = []
    for obj in pool:
        coros.append(obj.count_down())
    results = await asyncio.gather(*coros)
    return results

loop = asyncio.get_event_loop()
print(loop.run_until_complete(main()))

Re: Asyncio -- delayed calculation

2016-11-28 Thread Gregory Ewing

Chris Angelico wrote:

"await" means "don't continue this function until that's done". It
blocks the function until a non-blocking operation is done.


However, *not* using 'await' doesn't mean the operation
will be done without blocking. Rather, it won't be done
at all (and is usually an error, but there's no way for
the implementation to detect it at the time).

I don't blame Steve for being confused. All the
terminology around async/await is inherently confusing
and counterintuitive, IMO. I'm disappointed that we've
ended up here.

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 3:16 PM, Gregory Ewing
 wrote:
> Chris Angelico wrote:
>>
>> "await" means "don't continue this function until that's done". It
>> blocks the function until a non-blocking operation is done.
>
>
> However, *not* using 'await' doesn't mean the operation
> will be done without blocking. Rather, it won't be done
> at all (and is usually an error, but there's no way for
> the implementation to detect it at the time).
>
> I don't blame Steve for being confused. All the
> terminology around async/await is inherently confusing
> and counterintuitive, IMO. I'm disappointed that we've
> ended up here.

Asynchronous I/O is something to get your head around. As someone who
teaches both JavaScript and Python, I've run into this quite a bit,
and frankly, callbacks may be easy to explain at a concrete level, but
I'd much rather work with generator-based async functions (which JS is
getting soon too). The best way to think about it IMO is a "process"
that does blocking calls, and which the "system" can multitask away
from any time it's blocked. Your function awaits something, but Python
doesn't.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Nathan Ernst
To be fair, in other languages, such as C# or C++ with similar mechanisms,
if you don't ask for the result from an async or future task, there's no
guarantee the async task will be executed at all unless (or until) you ask
for the result. C++'s futures even give an explicit flag indicating you
want the task to be executed on the current thread, but deferred until the
result is asked for. I'm sure the people on the standards committee had a
rationale for this, but I see only marginal utility for that use case
(maybe such as a cancellable expensive calc?). Seems like "I might need the
result of this calc down the road, but I don't need it currently, so I'll
just kick the can down the road.". I'm not aware of a mechanism to make C#
behave that way, but I know it's possible that tasks will not execute until
they're asked for their result, but I don't know that it is 100%
deterministic.

To be honest, I think async/await style programming is so new that there's
not a lot of good material about how to properly use it (as far as both
when and how). The limit of my experience has been in C# dealing with web
requests, where the usage is basically: I'm going to need this data, so
I'll fire off the async request for it, and I've got a lot of work I can do
before I need the result, so let's do that, until I need to wait (await) for
the result of that request before I can proceed any further. It can be
useful, but it is a new, completely different paradigm and methodology to
wrap one's head around. I've used it for about 2 years in C# land, and I
still don't know that I'm doing it right. I've not done any async/await in
Python space, but I'm eager to learn and try, when it may be appropriate.
But most of my current use cases don't require it.

On Mon, Nov 28, 2016 at 10:37 PM, Chris Angelico  wrote:

> On Tue, Nov 29, 2016 at 3:16 PM, Gregory Ewing
>  wrote:
> > Chris Angelico wrote:
> >>
> >> "await" means "don't continue this function until that's done". It
> >> blocks the function until a non-blocking operation is done.
> >
> >
> > However, *not* using 'await' doesn't mean the operation
> > will be done without blocking. Rather, it won't be done
> > at all (and is usually an error, but there's no way for
> > the implementation to detect it at the time).
> >
> > I don't blame Steve for being confused. All the
> > terminology around async/await is inherently confusing
> > and counterintuitive, IMO. I'm disappointed that we've
> > ended up here.
>
> Asynchronous I/O is something to get your head around. As someone who
> teaches both JavaScript and Python, I've run into this quite a bit,
> and frankly, callbacks may be easy to explain at a concrete level, but
> I'd much rather work with generator-based async functions (which JS is
> getting soon too). The best way to think about it IMO is a "process"
> that does blocking calls, and which the "system" can multitask away
> from any time it's blocked. Your function awaits something, but Python
> doesn't.
>
> ChrisA
> --
> https://mail.python.org/mailman/listinfo/python-list
>
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Paul Rubin
Chris Angelico  writes:
> Asynchronous I/O is something to get your head around  I'd much
> rather work with generator-based async functions...

I haven't gotten my head around Python asyncio and have been wanting
to read this:

   http://lucumr.pocoo.org/2016/10/30/i-dont-understand-asyncio/

Is that picture too bleak?  I've used traditional cooperative
multitasking systems in the past, and the main hazard with them is to be
careful to not run for too long without yielding.  Python's stuff seems
much more complicated, with weirder hazards, some stemming from the
hidden mutable state in every generator.  I wonder whether a
stackless-based green thread approach might have been simpler.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: correct way to catch exception with Python 'with' statement

2016-11-28 Thread Steven D'Aprano
On Tuesday 29 November 2016 02:18, Ganesh Pal wrote:

> On Mon, Nov 28, 2016 at 1:16 PM, Steven D'Aprano <
> steve+comp.lang.pyt...@pearwood.info> wrote:
> 
>>
>>
>> There is no need to return True. The function either succeeds, or it
>> raises an
>> exception, so there is no need to return any value at all.
>>
>>
>  I returned True here ,because based on the result of this function , 

But the function *always* returns True, or it doesn't return at all: it raises.

Unless you have something like:

def func():
    do some work
    if condition:
        return False
    do more work
    return True

or similar, there's no point. When you write the documentation for the 
function, if it can only ever return True, then don't worry about returning 
True. Take the built-in methods as an example: dict.update either succeeds, or 
it raises an exception. It doesn't return True:

# this is unnecessary
flag = mydict.update(another_dict)
if flag:
    print "update succeeded"
else:
    print "update failed"


That cannot happen, because if the update fails, an exception is raised.

The bottom line is, since your function *only* has "return True" and doesn't 
have "return False" anywhere, there is no point to the "return True."


> I would want to perform the next steps.
>
> Example:
>     if create_files_append():
>         do_something()
>     else:
>         do_next_thing()

That cannot happen. It either returns True, or it raises an exception, so the 
"else" clause will not be executed.




-- 
Steven
"Ever since I learned about confirmation bias, I've been seeing 
it everywhere." - Jon Ronson

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Steven D'Aprano
On Tuesday 29 November 2016 14:21, Chris Angelico wrote:

> On Tue, Nov 29, 2016 at 1:23 PM, Steve D'Aprano
>  wrote:
>> This is confusing: why is this awaiting something inside an async function?
>> Doesn't that mean that the await asyncio.gather(...) call is turned
>> blocking?
> 
> "await" means "don't continue this function until that's done". It
> blocks the function until a non-blocking operation is done.

So that would be... "yes"?





-- 
Steven
"Ever since I learned about confirmation bias, I've been seeing 
it everywhere." - Jon Ronson

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 4:13 PM, Paul Rubin  wrote:
>
> I haven't gotten my head around Python asyncio and have been wanting
> to read this:
>
>http://lucumr.pocoo.org/2016/10/30/i-dont-understand-asyncio/

It's talking a lot about how we got here, which isn't all necessary if
you just want to give asyncio a whirl. The conclusion at the end says
that you should just use 'async def' and not bother with all the older
forms, which I agree with (subject to the usual caveat that this
implies no support for older Pythons).

There's one thing that I really struggle with, though, and that's that
there's no easy and obvious way to demonstrate the lowest level of
operation. If "await x()" is like "yield from x()", how do you do the
innermost "yield" that actually does something? I have the same
confusion with Node.js, too. It's as if async primitives can't be
implemented in application code at all, they just have to be given to
you. Certainly it's not something made clear anywhere in the docs that
I've found.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Chris Angelico
On Tue, Nov 29, 2016 at 4:32 PM, Steven D'Aprano
 wrote:
> On Tuesday 29 November 2016 14:21, Chris Angelico wrote:
>
>> On Tue, Nov 29, 2016 at 1:23 PM, Steve D'Aprano
>>  wrote:
>>> This is confusing: why is this awaiting something inside an async function?
>>> Doesn't that mean that the await asyncio.gather(...) call is turned
>>> blocking?
>>
>> "await" means "don't continue this function until that's done". It
>> blocks the function until a non-blocking operation is done.
>
> So that would be... "yes"?

From the point of view of the function, yes. From the point of view of
the rest of Python, no. It's a sign saying "Okay, Python, you can
alt-tab away from me now".

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: correct way to catch exception with Python 'with' statement

2016-11-28 Thread Ganesh Pal
Thanks Steve, I got what you were trying to explain; nice learning from
this conversation. What I was really doing wrong: I had broken down my huge
code into a simple program and had missed out the return False.

On Tue, Nov 29, 2016 at 11:01 AM, Steven D'Aprano <
steve+comp.lang.pyt...@pearwood.info> wrote:

> On Tuesday 29 November 2016 02:18, Ganesh Pal wrote:
>
> > On Mon, Nov 28, 2016 at 1:16 PM, Steven D'Aprano <
> > steve+comp.lang.pyt...@pearwood.info> wrote:
> >
> >>
> >>
> >> There is no need to return True. The function either succeeds, or it
> >> raises an
> >> exception, so there is no need to return any value at all.
> >>
> >>
> >  I returned True here ,because based on the result of this function ,
>
> But the function *always* returns True, or it doesn't return at all: it
> raises.
>
> Unless you have something like:
>
> def func():
>     do some work
>     if condition:
>         return False
>     do more work
>     return True
>
> or similar, there's no point. When you write the documentation for the
> function, if it can only ever return True, then don't worry about returning
> True. Take the built-in methods as an example: dict.update either
> succeeds, or
> it raises an exception. It doesn't return True:
>
> # this is unnecessary
> flag = mydict.update(another_dict)
> if flag:
>     print "update succeeded"
> else:
>     print "update failed"
>
>
> That cannot happen, because if the update fails, an exception is raised.
>
> The bottom line is, since your function *only* has "return True" and
> doesn't
> have "return False" anywhere, there is no point to the "return True."
>
>
> >  I would want to perform next steps
> >
> > Example:
> >     if create_files_append():
> >         do_something()
> >     else:
> >         do_next_thing()
>
> That cannot happen. It either returns True, or it raises an exception, so
> the
> "else" clause will not be executed.
>
>
>
>
> --
> Steven
> "Ever since I learned about confirmation bias, I've been seeing
> it everywhere." - Jon Ronson
>
> --
> https://mail.python.org/mailman/listinfo/python-list
>
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Asyncio -- delayed calculation

2016-11-28 Thread Marko Rauhamaa
Gregory Ewing :
> All the terminology around async/await is inherently confusing and
> counterintuitive, IMO. I'm disappointed that we've ended up here.

I think the conceptual mess can be clarified over time. Coroutines are
essentially threads. Why Python needs two threading implementations is
questionable. At least, with threads, you don't have to remember to tag
your function definitions with "async" and your functions calls with
"await".

Both threads and now coroutines were introduced as fig leaves to cover
the underlying finite state machines. Personally, I think it is a
mistake to hide the state machines. Instead, people should learn to
think in terms of state machines, which is the classic network protocol
model.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list