Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread John J Lee
On Thu, 28 Apr 2005, Shane Hathaway wrote:
[...]
> I think this concept can be explained clearly.  I'd like to try
> explaining PEP 340 to someone new to Python but not new to programming.
[...snip explanation...]
> Is it understandable so far?

Yes, excellent.  Speaking as somebody who scanned the PEP and this thread
and only half-understood either, that was quite painless to read.

Still not sure whether thunks or PEP 340 are better, but I'm at least
confused on a higher level now.


John


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Shane Hathaway
Luis P Caamano wrote:
> I've been skipping most of the anonymous block discussion and thus,
> I only had a very vague idea of what it was about until I read this
> explanation.
> 
> Yes, it is understandable -- assuming it's correct :-)

To my surprise, the explanation is now in the PEP.  (Thanks, Guido!)

Shane


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Luis Bruno
Hello,

Shane Hathaway wrote:
> Is it understandable so far?

Definitely yes! I had the structure upside-down; your explanation is
right on target.

Thanks!
-- 
Luis Bruno


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Luis P Caamano
On 4/29/05, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> 
> Message: 2
> Date: Thu, 28 Apr 2005 21:56:42 -0600
> From: Shane Hathaway <[EMAIL PROTECTED]>
> Subject: Re: [Python-Dev] Re: anonymous blocks
> To: [EMAIL PROTECTED]
> Cc: Ka-Ping Yee <[EMAIL PROTECTED]>,  Python Developers List
>
> Message-ID: <[EMAIL PROTECTED]>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> 
> I think this concept can be explained clearly.  I'd like to try
> explaining PEP 340 to someone new to Python but not new to programming.
> I'll use the term "block iterator" to refer to the new type of
> iterator.  This is according to my limited understanding.
> 
> "Good programmers move commonly used code into reusable functions.
> Sometimes, however, patterns arise in the structure of the functions
> rather than the actual sequence of statements.  For example, many
> functions acquire a lock, execute some code specific to that function,
> and unconditionally release the lock.  Repeating the locking code in
> every function that uses it is error prone and makes refactoring difficult.
> 
> "Block statements provide a mechanism for encapsulating patterns of
> structure.  Code inside the block statement runs under the control of an
> object called a block iterator.  Simple block iterators execute code
> before and after the code inside the block statement.  Block iterators
> also have the opportunity to execute the controlled code more than once
> (or not at all), catch exceptions, or receive data from the body of the
> block statement.
> 
> "A convenient way to write block iterators is to write a generator.  A
> generator looks a lot like a Python function, but instead of returning a
> value immediately, generators pause their execution at "yield"
> statements.  When a generator is used as a block iterator, the yield
> statement tells the Python interpreter to suspend the block iterator,
> execute the block statement body, and resume the block iterator when the
> body has executed.
> 
> "The Python interpreter behaves as follows when it encounters a block
> statement based on a generator.  First, the interpreter instantiates the
> generator and begins executing it.  The generator does setup work
> appropriate to the pattern it encapsulates, such as acquiring a lock,
> opening a file, starting a database transaction, or starting a loop.
> Then the generator yields execution to the body of the block statement
> using a yield statement.  When the block statement body completes,
> raises an uncaught exception, or sends data back to the generator using
> a continue statement, the generator resumes.  At this point, the
> generator can either clean up and stop or yield again, causing the block
> statement body to execute again.  When the generator finishes, the
> interpreter leaves the block statement."
> 
> Is it understandable so far?
> 

I've been skipping most of the anonymous block discussion and thus,
I only had a very vague idea of what it was about until I read this
explanation.

Yes, it is understandable -- assuming it's correct :-)

Mind you though, I'm not new to python and I've been writing system
software for 20+ years.

-- 
Luis P Caamano
Atlanta, GA USA


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Paul Moore
On 4/29/05, Shane Hathaway <[EMAIL PROTECTED]> wrote:
> I think this concept can be explained clearly.  I'd like to try
> explaining PEP 340 to someone new to Python but not new to programming.
> I'll use the term "block iterator" to refer to the new type of
> iterator.  This is according to my limited understanding.
[...]
> Is it understandable so far?

I like it.
Paul.


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Josiah Carlson

Greg Ewing <[EMAIL PROTECTED]> wrote:
> That actually looks pretty reasonable.
> 
> Hmmm. "Patterns of structure." Maybe we could call it a
> "struct" statement.
> 
> struct opening(foo) as f:
>...
> 
> Then we could confuse both C *and* Ruby programmers at
> the same time! :-)

And Python programmers who already use the struct module!

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Guido van Rossum
> If the use of block-statements becomes common for certain
> tasks such as opening files, it seems to me that people are
> going to encounter their use around about the same time
> they encounter for-statements. We need *something* to
> tell these people to enable them to understand the code
> they're reading.
> 
> Maybe it would be sufficient just to explain the meanings
> of those particular uses, and leave the full general
> explanation as an advanced topic.

Right. The block statement is a bit like a chameleon: it adapts its
meaning to the generator you supply. (Or maybe it's like a sewer: what
you get out of it depends on what you put into it. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Shane Hathaway wrote:
"Block statements provide a mechanism for encapsulating patterns of
structure.  Code inside the block statement runs under the control of an
object called a block iterator.  Simple block iterators execute code
before and after the code inside the block statement.  Block iterators
also have the opportunity to execute the controlled code more than once
(or not at all), catch exceptions, or receive data from the body of the
block statement.
That actually looks pretty reasonable.
Hmmm. "Patterns of structure." Maybe we could call it a
"struct" statement.
   struct opening(foo) as f:
  ...
Then we could confuse both C *and* Ruby programmers at
the same time! :-)
[No, I don't really mean this. I actually prefer "block"
to this.]
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Steven Bethard wrote:
"""
A block-statement is much like a for-loop, and is also used to iterate
over the elements of an iterable object.
No, no, no. Similarity to a for-loop is the *last* thing
we want to emphasise, because the intended use is very
different from the intended use of a for-loop. This is
going to give people the wrong idea altogether.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Guido van Rossum wrote:
I don't know. What exactly is the audience supposed to be of this
high-level statement? It would be pretty darn impossible to explain
even the for-statement to people who are new to programming, let alone
generators.
If the use of block-statements becomes common for certain
tasks such as opening files, it seems to me that people are
going to encounter their use around about the same time
they encounter for-statements. We need *something* to
tell these people to enable them to understand the code
they're reading.
Maybe it would be sufficient just to explain the meanings
of those particular uses, and leave the full general
explanation as an advanced topic.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Shane Hathaway
Guido van Rossum wrote:
> I don't know. What exactly is the audience supposed to be of this
> high-level statement? It would be pretty darn impossible to explain
> even the for-statement to people who are new to programming, let alone
> generators. And yet explaining the block-statement *must* involve a
> reference to generators. I'm guessing most introductions to Python,
> even for experienced programmers, put generators off until the
> "advanced" section, because this is pretty wild if you're not used to
> a language that has something similar. (I wonder how you'd explain
> Python generators to an experienced Ruby programmer -- their mind has
> been manipulated to the point where they'd be unable to understand
> Python's yield no matter how hard they tried. :-)

I think this concept can be explained clearly.  I'd like to try
explaining PEP 340 to someone new to Python but not new to programming.
 I'll use the term "block iterator" to refer to the new type of
iterator.  This is according to my limited understanding.

"Good programmers move commonly used code into reusable functions.
Sometimes, however, patterns arise in the structure of the functions
rather than the actual sequence of statements.  For example, many
functions acquire a lock, execute some code specific to that function,
and unconditionally release the lock.  Repeating the locking code in
every function that uses it is error prone and makes refactoring difficult.

"Block statements provide a mechanism for encapsulating patterns of
structure.  Code inside the block statement runs under the control of an
object called a block iterator.  Simple block iterators execute code
before and after the code inside the block statement.  Block iterators
also have the opportunity to execute the controlled code more than once
(or not at all), catch exceptions, or receive data from the body of the
block statement.

"A convenient way to write block iterators is to write a generator.  A
generator looks a lot like a Python function, but instead of returning a
value immediately, generators pause their execution at "yield"
statements.  When a generator is used as a block iterator, the yield
statement tells the Python interpreter to suspend the block iterator,
execute the block statement body, and resume the block iterator when the
body has executed.

"The Python interpreter behaves as follows when it encounters a block
statement based on a generator.  First, the interpreter instantiates the
generator and begins executing it.  The generator does setup work
appropriate to the pattern it encapsulates, such as acquiring a lock,
opening a file, starting a database transaction, or starting a loop.
Then the generator yields execution to the body of the block statement
using a yield statement.  When the block statement body completes,
raises an uncaught exception, or sends data back to the generator using
a continue statement, the generator resumes.  At this point, the
generator can either clean up and stop or yield again, causing the block
statement body to execute again.  When the generator finishes, the
interpreter leaves the block statement."
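
For instance, the locking pattern might look roughly like this (a sketch
only: locking() and mylock are made-up names, the "block" lines show the
syntax proposed in PEP 340, and the PEP is also what would allow the yield
inside try/finally):

    import threading

    def locking(lock):
        lock.acquire()            # setup work done by the block iterator
        try:
            yield None            # suspend; the block statement body runs now
        finally:
            lock.release()        # cleanup once the body is done

    # Proposed usage (PEP 340 syntax, not valid today):
    #
    #     block locking(mylock):
    #         ...code that must hold the lock...
    #
    # Roughly what the interpreter would do, driven by hand:
    mylock = threading.Lock()
    itr = locking(mylock)
    itr.next()                    # run the generator up to its yield
    try:
        pass                      # ...the block statement body runs here...
    finally:
        try:
            itr.next()            # resume; the generator's finally releases the lock
        except StopIteration:
            pass                  # the generator is finished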

Is it understandable so far?

Shane


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Guido van Rossum
[Greg Ewing]
> I think perhaps I'm not expressing myself very well.
> What I'm after is a high-level explanation that actually
> tells people something useful, and *doesn't* cop out by
> just saying "you're not experienced enough to understand
> this yet".
> 
> If such an explanation can't be found, I strongly suspect
> that this doesn't correspond to a cohesive enough concept
> to be made into a built-in language feature. If you can't
> give a short, understandable explanation of it, then it's
> probably a bad idea.

[Ping]
> In general, i agree with the sentiment of this -- though it's
> also okay if there is a way to break the concept down into
> concepts that *are* simple enough to have short, understandable
> explanations.

I don't know. What exactly is the audience supposed to be of this
high-level statement? It would be pretty darn impossible to explain
even the for-statement to people who are new to programming, let alone
generators. And yet explaining the block-statement *must* involve a
reference to generators. I'm guessing most introductions to Python,
even for experienced programmers, put generators off until the
"advanced" section, because this is pretty wild if you're not used to
a language that has something similar. (I wonder how you'd explain
Python generators to an experienced Ruby programmer -- their mind has
been manipulated to the point where they'd be unable to understand
Python's yield no matter how hard they tried. :-)

If I weren't limited to newbies (either to Python or to programming in
general) but simply had to explain it to Python programmers
pre-Python-2.5, I would probably start with a typical example of the
try/finally idiom for acquiring and releasing a lock, then explain how
for software engineering reasons you'd want to templatize that, and
show the solution with a generator and block-statement.
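
Concretely, such a walk-through might go from the hand-written idiom to the
templatized form along these lines (a sketch; do_work and lock are
placeholder names, and the final "block" spelling is only the PEP 340
proposal):

    # Today: the pattern is repeated in every function that needs the lock.
    def do_work_locked(lock, do_work):
        lock.acquire()
        try:
            do_work()
        finally:
            lock.release()

    # With PEP 340, the acquire/try/finally/release skeleton is written once,
    # as a generator template...
    def locking(lock):
        lock.acquire()
        try:
            yield None
        finally:
            lock.release()

    # ...and each use site would shrink to (proposed syntax only):
    #
    #     block locking(lock):
    #         do_work()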

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Nick Coghlan
Brett C. wrote:
>> I'm surprisingly close to agreeing with you, actually. I've worked out
>> that it isn't the looping that I object to, it's the inability to get
>> out of the loop without exhausting the entire iterator.
>
> 'break' isn't enough for you as laid out by the proposal?  The raising of
> StopIteration, which is what 'break' does according to the standard, should be
> enough to stop the loop without exhausting things.  Same way you stop a 'for'
> loop from executing entirely.

The StopIteration exception effectively exhausts the generator, though.
However, I've figured out how to deal with that, and my reservations about PEP
340 are basically gone.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Steven Bethard
On 4/28/05, Greg Ewing <[EMAIL PROTECTED]> wrote:
> Neil Schemenauer wrote:
> 
> > The translation of a block-statement could become:
> >
> > itr = EXPR1
> > arg = None
> > while True:
> > try:
> > VAR1 = next(itr, arg)
> > except StopIteration:
> > break
> > try:
> > arg = None
> > BLOCK1
> > except Exception, exc:
> > err = getattr(itr, '__error__', None)
> > if err is None:
> > raise exc
> > err(exc)
> 
> That can't be right. When __error__ is called, if the iterator
> catches the exception and goes on to do another yield, the
> yielded value needs to be assigned to VAR1 and the block
> executed again. It looks like your version will ignore the
> value from the second yield and only execute the block again
> on the third yield.

Could you do something like:
itr = EXPR1
arg = None
next_func = next
while True:
    try:
        VAR1 = next_func(itr, arg)
    except StopIteration:
        break
    try:
        arg = None
        next_func = next
        BLOCK1
    except Exception, arg:
        try:
            next_func = type(itr).__error__
        except AttributeError:
            raise arg


?

STeVe

-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Nick Coghlan
Brett C. wrote:
> Guido van Rossum wrote:
>>> Yet another alternative would be for the default behaviour to be to raise
>>> Exceptions, and continue with anything else, and have the third argument be
>>> "raise_exc=True" and set it to False to pass an exception in without raising it.
>>
>> You've lost me there. If you care about this, can you write it up in
>> more detail (with code samples or whatever)? Or we can agree on a 2nd
>> arg to __next__() (and a 3rd one to next()).
>
> Channeling Nick, I think he is saying that the raising argument should be made
> True by default and be named 'raise_exc'.

Pretty close, although I'd say 'could' rather than 'should', as it was an idle
thought, rather than something I actually consider a good idea.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Steven Bethard
On 4/28/05, Greg Ewing <[EMAIL PROTECTED]> wrote:
> Guido van Rossum wrote:
> > And surely you exaggerate.  How about this then:
> >
> > The with-statement is similar to the for-loop.  Until you've
> > learned about the differences in detail, the only time you should
> > write a with-statement is when the documentation for the function
> > you are calling says you should.
> 
> I think perhaps I'm not expressing myself very well.
> What I'm after is a high-level explanation that actually
> tells people something useful, and *doesn't* cop out by
> just saying "you're not experienced enough to understand
> this yet".

How about:

"""
A block-statement is much like a for-loop, and is also used to iterate
over the elements of an iterable object.  In a block-statement
however, the iterable object is notified whenever a 'continue',
'break', or 'return' statement is executed inside the block-statement.
 Most iterable objects do not need to be notified of such statement
executions, so for most iteration over iterable objects, you should
use a for-loop.  Functions that return iterable objects that should be
used in a block-statement will be documented as such.
"""

If you need more information, you could also include something like:

"""
When generator objects are used in a block-statement, they are
guaranteed to be "exhausted" at the end of the block-statement.  That
is, any additional call to next() with the generator object will
produce a StopIteration.
"""

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Steven Bethard
On 4/28/05, Steven Bethard <[EMAIL PROTECTED]> wrote:
> however, the iterable object is notified whenever a 'continue',
> 'break', or 'return' statement is executed inside the block-statement.

This should read:

however, the iterable object is notified whenever a 'continue',
'break' or 'return' statement is executed *or an exception is raised*
inside the block-statement.

Sorry!

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Samuele Pedroni
Greg Ewing wrote:
> Guido van Rossum wrote:
>> And surely you exaggerate.  How about this then:
>>
>> The with-statement is similar to the for-loop.  Until you've
>> learned about the differences in detail, the only time you should
>> write a with-statement is when the documentation for the function
>> you are calling says you should.
>
> I think perhaps I'm not expressing myself very well.
> What I'm after is a high-level explanation that actually
> tells people something useful, and *doesn't* cop out by
> just saying "you're not experienced enough to understand
> this yet".

This makes sense to me, also because a new control statement
will not usually be as hidden as metaclasses and some other possibly
obscure corners can be. OTOH I have the impression that the new toy is
too shiny to have a lucid discussion about whether it could have sharp edges
or produce dizziness for the inexperienced.


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Ka-Ping Yee
On Thu, 28 Apr 2005, Greg Ewing wrote:
> If such an explanation can't be found, I strongly suspect
> that this doesn't correspond to a cohesive enough concept
> to be made into a built-in language feature. If you can't
> give a short, understandable explanation of it, then it's
> probably a bad idea.

In general, i agree with the sentiment of this -- though it's
also okay if there is a way to break the concept down into
concepts that *are* simple enough to have short, understandable
explanations.


-- ?!ng


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Greg Ewing
Guido van Rossum wrote:
> And surely you exaggerate.  How about this then:
>
> The with-statement is similar to the for-loop.  Until you've
> learned about the differences in detail, the only time you should
> write a with-statement is when the documentation for the function
> you are calling says you should.

I think perhaps I'm not expressing myself very well.
What I'm after is a high-level explanation that actually
tells people something useful, and *doesn't* cop out by
just saying "you're not experienced enough to understand
this yet".
If such an explanation can't be found, I strongly suspect
that this doesn't correspond to a cohesive enough concept
to be made into a built-in language feature. If you can't
give a short, understandable explanation of it, then it's
probably a bad idea.
Greg


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Greg Ewing
Neil Schemenauer wrote:
> The translation of a block-statement could become:
>
>     itr = EXPR1
>     arg = None
>     while True:
>         try:
>             VAR1 = next(itr, arg)
>         except StopIteration:
>             break
>         try:
>             arg = None
>             BLOCK1
>         except Exception, exc:
>             err = getattr(itr, '__error__', None)
>             if err is None:
>                 raise exc
>             err(exc)

That can't be right. When __error__ is called, if the iterator
catches the exception and goes on to do another yield, the
yielded value needs to be assigned to VAR1 and the block
executed again. It looks like your version will ignore the
value from the second yield and only execute the block again
on the third yield.
So something like Guido's safe_loop() would miss every other
yield.
I think Guido was right in the first place, and __error__
really is just a minor variation on __next__ that shouldn't
have a separate entry point.
Greg



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
Phillip J. Eby wrote:
> At 05:19 PM 4/27/05 -0700, Guido van Rossum wrote:
> >I'm not convinced of that, especially since all *generators* will
> >automatically be usable as templates, whether or not they were
> >intended as such. And why *shouldn't* you be allowed to use a block
> >for looping, if you like the exit behavior (guaranteeing that the
> >iterator is exhausted when you leave the block in any way)?
> 
> It doesn't guarantee that, does it?  (Re-reads PEP.)  Aha, for *generators*
> it does, because it says passing StopIteration in, stops execution of the
> generator.  But it doesn't say anything about whether iterators in general
> are allowed to be resumed afterward, just that they should not yield a
> value in response to the __next__, IIUC.  As currently written, it sounds
> like existing non-generator iterators would not be forced to an exhausted
> state.

I wonder if something can be done like what was done for (dare I say
it?) "old-style" iterators:

"The intention of the protocol is that once an iterator's next()
method raises StopIteration, it will continue to do so on subsequent
calls. Implementations that do not obey this property are deemed
broken. (This constraint was added in Python 2.3; in Python 2.2,
various iterators are broken according to this rule.)"[1]

This would mean that if next(itr, ...) raised StopIteration, then
next(itr, ...) should continue to raise StopIteration on subsequent
calls.  I don't know how this is done in the current implementation. 
Would it be hard to do so for the proposed block-statements?

If nothing else, we might at least clearly document what well-behaved
iterators should do...
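
Today's well-behaved iterators already act that way; for example:

    it = iter([1])
    it.next()                # -> 1
    for attempt in range(2):
        try:
            it.next()        # exhausted: raises StopIteration...
        except StopIteration:
            pass             # ...and keeps doing so on every later call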

STeVe

[1] http://docs.python.org/lib/typeiter.html
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
Neil Schemenauer wrote:
> For generators, calling __error__ with a StopIteration instance
> would execute any 'finally' block.  Any other argument to __error__
> would get re-raised by the generator instance.

This is only one case right?  Any exception (including StopIteration)
passed to a generator's __error__ method will just be re-raised at the
point of the last yield, right?  Or is there a need to special-case
StopIteration?

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 05:43 PM 4/27/05 -0700, Guido van Rossum wrote:
Well, perhaps block *should* call iter()? I'd like to hear votes about
this. In most cases that would make a block-statement entirely
equivalent to a for-loop, the exception being only when there's an
exception or when breaking out of an iterator with resource
management.
I initially decided it should not call iter() so as to emphasize that
this isn't supposed to be used for looping over sequences -- EXPR1 is
really expected to be a resource management generator (or iterator).
Which is why I vote for not calling iter(), and further, that blocks not 
use the iteration protocol, but rather use a new "block template" 
protocol.  And finally, that a decorator be used to convert a generator 
function to a "template function" (i.e., a function that returns a block 
template).

I think it's less confusing to have two completely distinct concepts, than 
to have two things that are very similar, yet different in a blurry kind of 
way.  If you want to use a block on an iterator, you can always explicitly 
do something like this:

    @blocktemplate
    def iterate(iterable):
        for value in iterable:
            yield value

    block iterate([1,2,3]) as x:
        print x

> I wonder if generators that contain a yield-expression should
> properly be called coroutines.  Practically, I suspect it would just
> cause confusion.
I have to admit that I haven't looked carefully for use cases for
this!
Anything that wants to do co-operative multitasking, basically.


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Neil Schemenauer wrote:
> On Wed, Apr 27, 2005 at 03:58:14PM -0700, Guido van Rossum wrote:
> 
>>Time to update the PEP; I'm pretty much settled on these semantics
>>now...
> 
> 
> [I'm trying to do a bit of Guido channeling here.  I fear I may not
> be entirely successful.]
> 
> The __error__ method seems to simplify things a lot.  The
> purpose of the __error__ method is to notify the iterator that the
> loop has been exited in some unusual way (i.e. not via a
> StopIteration raised by the iterator itself).
> 
> The translation of a block-statement could become:
> 
> itr = EXPR1
> arg = None
> while True:
> try:
> VAR1 = next(itr, arg)
> except StopIteration:
> break
> try:
> arg = None
> BLOCK1
> except Exception, exc:
> err = getattr(itr, '__error__', None)
> if err is None:
> raise exc
> err(exc)
> 
> 
> The translation of "continue EXPR2" would become:
> 
> arg = EXPR2
> continue
> 
> The translation of "break" inside a block-statement would
> become:
> 
> err = getattr(itr, '__error__', None)
> if err is not None:
> err(StopIteration())
> break
> 
> The translation of "return EXPR3" inside a block-statement would
> become:
> 
> err = getattr(itr, '__error__', None)
> if err is not None:
> err(StopIteration())
> return EXPR3
> 
> For generators, calling __error__ with a StopIteration instance
> would execute any 'finally' block.  Any other argument to __error__
> would get re-raised by the generator instance.
> 
> You could then write:
> 
> def opened(filename):
> fp = open(filename)
> try:
> yield fp
> finally:
> fp.close()
> 
> and use it like this:
> 
> block opened(filename) as fp:
> 
> 

Seems great to me.  Clean separation of when the block wants things to keep
going if it can and when it wants to let the generator know it's all done.

> The main difference between 'for' and 'block' is that more iteration
> may happen after breaking or returning out of a 'for' loop.  An
> iterator used in a block statement is always used up before the
> block is exited.
> 

This constant use of the phrase "used up" for these blocks is bugging me
slightly.  It isn't like the passed-in generator is having next() called on it
until it stops, it is just finishing up (or cleaning up, choose your favorite
term).  It may have had more iterations to go, but the block signaled it was
done and thus the generator got its chance to finish up and pick up after
itself.

> Maybe __error__ should be called __break__ instead.

I like that.

> StopIteration
> is not really an error.  If it is called something like __break__,
> does it really need to accept an argument?  Offhand I can't think of
> what an iterator might do with an exception.
> 

Could just make the default value be StopIteration.  Is there really a perk to
__break__ only raising StopIteration and not accepting an argument?

The real question is whether people would use the ability to raise other
exceptions passed in from the block.  If you view yield expressions as method
calls, then being able to call __break__ with other exceptions makes sense,
since you might code up try/except statements within the generator that
care about what kind of exception gets raised.
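
A sketch of the kind of generator that would care (the db object and its
begin/abort/commit methods are made up purely for illustration): an
exception passed in via __break__/__error__ surfaces at the yield, so the
generator can react differently depending on its type.

    def transaction(db):
        txn = db.begin()
        try:
            yield txn             # the block body runs while suspended here
        except KeyboardInterrupt:
            txn.abort()
            raise                 # pass the interrupt on untouched
        except Exception:
            txn.abort()           # any other error: abort quietly
        else:
            txn.commit()          # clean exit: commit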

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 05:19 PM 4/27/05 -0700, Guido van Rossum wrote:
[Phillip]
> This also has the benefit of making the delineation between template blocks
> and for loops more concrete.  For example, this:
>
>  block open("filename") as f:
>  ...
>
> could be an immediate TypeError (due to the lack of a __resume__) instead
> of biting you later on in the block when you try to do something with f, or
> because the block is repeating for each line of the file, etc.
I'm not convinced of that, especially since all *generators* will
automatically be usable as templates, whether or not they were
intended as such. And why *shouldn't* you be allowed to use a block
for looping, if you like the exit behavior (guaranteeing that the
iterator is exhausted when you leave the block in any way)?
It doesn't guarantee that, does it?  (Re-reads PEP.)  Aha, for *generators* 
it does, because it says passing StopIteration in, stops execution of the 
generator.  But it doesn't say anything about whether iterators in general 
are allowed to be resumed afterward, just that they should not yield a 
value in response to the __next__, IIUC.  As currently written, it sounds 
like existing non-generator iterators would not be forced to an exhausted 
state.

As for the generator-vs-template distinction, I'd almost say that argues in 
favor of requiring some small extra distinction to make a generator 
template-safe, rather than in favor of making all iterators 
template-promiscuous, as it were.  Perhaps a '@block_template' decorator on 
the generator?  This would have the advantage of documenting the fact that 
the generator was written with that purpose in mind.
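
A minimal sketch of such a decorator (the attribute name is invented; the
point is just that the block machinery could check for the marker and
reject unmarked iterables up front):

    def block_template(genfunc):
        genfunc.__block_template__ = True    # mark it as template-safe
        return genfunc

    @block_template
    def opened(filename):
        fp = open(filename)
        try:
            yield fp
        finally:
            fp.close()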

It seems to me that using a template block to loop over a normal iterator 
is a TOOWTDI violation, but perhaps you're seeing something deeper here...?



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
[SNIP]
>>It's interesting that there is such similarity between 'for' and
>>'block'.  Why is it that block does not call iter() on EXPR1?  I
>>guess that fact that 'break' and 'return' work differently is a more
>>significant difference.
> 
> 
> Well, perhaps block *should* call iter()? I'd like to hear votes about
> this. In most cases that would make a block-statement entirely
> equivalent to a for-loop, the exception being only when there's an
> exception or when breaking out of an iterator with resource
> management.
> 

I am -0 on changing it to call iter().  I do like the distinction from a 'for'
loop and leaving an emphasis for template blocks (or blocks, or whatever hip
term you crazy kids are using for these things at the moment) to use
generators.  As I said before, I am viewing these blocks as a construct for
external control of generators, not as a snazzy 'for' loop.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Neil Schemenauer
On Wed, Apr 27, 2005 at 03:58:14PM -0700, Guido van Rossum wrote:
> Time to update the PEP; I'm pretty much settled on these semantics
> now...

[I'm trying to do a bit of Guido channeling here.  I fear I may not
be entirely successful.]

The __error__ method seems to simplify things a lot.  The
purpose of the __error__ method is to notify the iterator that the
loop has been exited in some unusual way (i.e. not via a
StopIteration raised by the iterator itself).

The translation of a block-statement could become:

    itr = EXPR1
    arg = None
    while True:
        try:
            VAR1 = next(itr, arg)
        except StopIteration:
            break
        try:
            arg = None
            BLOCK1
        except Exception, exc:
            err = getattr(itr, '__error__', None)
            if err is None:
                raise exc
            err(exc)


The translation of "continue EXPR2" would become:

    arg = EXPR2
    continue

The translation of "break" inside a block-statement would
become:

    err = getattr(itr, '__error__', None)
    if err is not None:
        err(StopIteration())
    break

The translation of "return EXPR3" inside a block-statement would
become:

    err = getattr(itr, '__error__', None)
    if err is not None:
        err(StopIteration())
    return EXPR3

For generators, calling __error__ with a StopIteration instance
would execute any 'finally' block.  Any other argument to __error__
would get re-raised by the generator instance.

You could then write:

    def opened(filename):
        fp = open(filename)
        try:
            yield fp
        finally:
            fp.close()

and use it like this:

block opened(filename) as fp:


The main difference between 'for' and 'block' is that more iteration
may happen after breaking or returning out of a 'for' loop.  An
iterator used in a block statement is always used up before the
block is exited.
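
The difference in a nutshell (the for-loop half is runnable today; the
block behaviour described in the comment is the proposal):

    def numbers():
        for i in range(3):
            yield i

    it = numbers()
    for x in it:
        break            # 'for' leaves the iterator merely suspended...
    print it.next()      # -> 1: more iteration can happen afterwards

    # Under the proposal, leaving a block statement -- even via 'break' or
    # 'return' -- first finishes the iterator off, so a later next() call
    # would raise StopIteration instead.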

Maybe __error__ should be called __break__ instead.  StopIteration
is not really an error.  If it is called something like __break__,
does it really need to accept an argument?  Offhand I can't think of
what an iterator might do with an exception.

  Neil


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> It seems like what you are proposing is a limited form of
> coroutines.

Well, I thought that's already what generators were -- IMO there isn't
much news there. We're providing a more convenient way to pass a value
back, but that's always been possible (see Fredrik's examples).

> Allowing 'continue' to have an optional value is elegant syntax.
> I'm a little bit concerned about what happens if the iterator does
> not expect a value.  If I understand the PEP, it is silently
> ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
> any worse then a caller not expecting a return value.

Exactly.

> It's interesting that there is such similarity between 'for' and
> 'block'.  Why is it that block does not call iter() on EXPR1?  I
> guess that fact that 'break' and 'return' work differently is a more
> significant difference.

Well, perhaps block *should* call iter()? I'd like to hear votes about
this. In most cases that would make a block-statement entirely
equivalent to a for-loop, the exception being only when there's an
exception or when breaking out of an iterator with resource
management.

I initially decided it should not call iter() so as to emphasize that
this isn't supposed to be used for looping over sequences -- EXPR1 is
really expected to be a resource management generator (or iterator).

> After thinking about this more, I wonder if iterators meant for
> 'for' loops and iterators meant for 'block' statements are really
> very different things.  It seems like a block-iterator really needs
> to handle yield-expressions.

But who knows, they might be useful for for-loops as well. After all,
passing values back to the generator has been on some people's wish
list for a long time.

> I wonder if generators that contain a yield-expression should
> properly be called coroutines.  Practically, I suspect it would just
> cause confusion.

I have to admit that I haven't looked carefully for use cases for
this! I just looked at a few Ruby examples and realized that it would
be a fairly simple extension of generators.

You can call such generators coroutines, but they are still generators.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip]
> It's not unlike David Mertz' articles on implementing coroutines and
> multitasking using generators, except that I'm adding more "debugging
> sugar", if you will, by making the tracebacks look normal.  It's just that
> the *how* requires me to pass the traceback into the generator.  At the
> moment, I accomplish that by doing a 3-argument raise inside of
> 'events.resume()', but it would be really nice to be able to get rid of
> 'events.resume()' in a future version of Python.

I'm not familiar with Mertz' articles and frankly I still fear it's
head-explosive material. ;-)

> I think maybe I misspoke.  I mean adding to the traceback *so* that when
> the same error is reraised, the intervening frames are included, rather
> than lost.
> 
> In other words, IIRC, the traceback chain is normally increased by one
> entry for each frame the exception escapes.  However, if you start hiding
> that inside of the exception instance, you'll have to modify it instead of
> just modifying the threadstate.  Does that make sense, or am I missing
> something?

Adding to the traceback chain already in the exception object is
totally kosher, if that's where the traceback is kept.

> My point was mainly that we can err on the side of caller convenience
> rather than callee convenience, if there are fewer implementations.  So,
> e.g. multiple methods aren't a big deal if it makes the 'block'
> implementation simpler, if only generators and a handful of special
> template objects are going need to implement the block API.

Well, the way my translation is currently written, writing next(itr,
arg, exc) is a lot more convenient for the caller than having to write

    # if exc is True, arg is an exception; otherwise arg is a value
    if exc:
        err = getattr(itr, "__error__", None)
        if err is not None:
            VAR1 = err(arg)
        else:
            raise arg
    else:
        VAR1 = next(itr, arg)

but since this will actually be code generated by the bytecode
compiler, I think callee convenience is more important. And the
ability to default __error__ to raise the exception makes a lot of
sense. And we could wrap all this inside the next() built-in -- even
if the actual object should have separate __next__() and __error__()
methods, the user-facing built-in next() function might take an extra
flag to indicate that the argument is an exception, and to handle it
appropriately (as shown above).
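
Something along these lines, perhaps (a sketch only: __next__() taking an
argument and __error__() are the methods proposed in the PEP, not anything
today's iterators actually have):

    def next(itr, arg=None, exc=False):
        if exc:
            # arg is an exception to report to the iterator
            err = getattr(itr, "__error__", None)
            if err is not None:
                return err(arg)
            raise arg
        # arg is an ordinary value to pass back in
        return itr.__next__(arg)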

> > > So, I guess I'm thinking you'd have something like tp_block_resume and
> > > tp_block_error type slots, and generators' tp_iter_next would just be the
> > > same as tp_block_resume(None).
> >
> >I hadn't thought much about the C-level slots yet, but this is a
> >reasonable proposal.
> 
> Note that it also doesn't require a 'next()' builtin, or a next vs.
> __next__ distinction, if you don't try to overload iteration and
> templating.  The fact that a generator can be used for templating, doesn't
> have to imply that any iterator should be usable as a template, or that the
> iteration protocol is involved in any way.  You could just have
> __resume__/__error__ matching the tp_block_* slots.
> 
> This also has the benefit of making the delineation between template blocks
> and for loops more concrete.  For example, this:
> 
>  block open("filename") as f:
>  ...
> 
> could be an immediate TypeError (due to the lack of a __resume__) instead
> of biting you later on in the block when you try to do something with f, or
> because the block is repeating for each line of the file, etc.

I'm not convinced of that, especially since all *generators* will
automatically be usable as templates, whether or not they were
intended as such. And why *shouldn't* you be allowed to use a block
for looping, if you like the exit behavior (guaranteeing that the
iterator is exhausted when you leave the block in any way)?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
> [Guido]
> 
>>>An alternative that solves this would be to give __next__() a second
>>>argument, which is a bool that should be true when the first argument
>>>is an exception that should be raised. What do people think?
>>>
>>>I'll add this to the PEP as an alternative for now.
> 
> 
> [Nick]
> 
>>An optional third argument (raise=False) seems a lot friendlier (and more
>>flexible) than a typecheck.
> 
> 
> I think I agree, especially since Phillip's alternative (a different
> method) is even worse IMO.
> 

The extra argument works for me as well.

> 
>>Yet another alternative would be for the default behaviour to be to raise
>>Exceptions, and continue with anything else, and have the third argument be
>>"raise_exc=True" and set it to False to pass an exception in without raising 
>>it.
> 
> 
> You've lost me there. If you care about this, can you write it up in
> more detail (with code samples or whatever)? Or we can agree on a 2nd
> arg to __next__() (and a 3rd one to next()).
> 

Channeling Nick, I think he is saying that the raising argument should be made
True by default and be named 'raise_exc'.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Neil Schemenauer
On Wed, Apr 27, 2005 at 12:30:22AM -0700, Guido van Rossum wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).

[Note: most of these comments are based on version 1.2 of the PEP]

It seems like what you are proposing is a limited form of
coroutines.  Just as Python's generators are limited (yield can only
jump up one stack frame), these coroutines have a similar
limitation.  Someone mentioned that we are edging closer to
continuations.  I think that may be a good thing.  One big
difference between what you propose and general continuations is in
finalization semantics.  I don't think anyone has figured out a way
for try/finally to work with continuations.  The fact that
try/finally can be used inside generators is a significant feature
of this PEP, IMO.

Regarding the syntax, I actually quite like the 'block' keyword.  It
doesn't seem so surprising that the block may be a loop.

Allowing 'continue' to have an optional value is elegant syntax.
I'm a little bit concerned about what happens if the iterator does
not expect a value.  If I understand the PEP, it is silently
ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
any worse then a caller not expecting a return value.

It's interesting that there is such similarity between 'for' and
'block'.  Why is it that block does not call iter() on EXPR1?  I
guess that fact that 'break' and 'return' work differently is a more
significant difference.

After thinking about this more, I wonder if iterators meant for
'for' loops and iterators meant for 'block' statements are really
very different things.  It seems like a block-iterator really needs
to handle yield-expressions.
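
That is, the block-iterator would be written around a "yield expression"
(not legal in today's Python; the sketch below only makes the idea
concrete, with made-up names):

    def accumulate():
        total = 0
        while True:
            value = yield total   # the value supplied via 'continue EXPR'
            total += value        # (i.e. next(itr, arg)) shows up here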

I wonder if generators that contain a yield-expression should
properly be called coroutines.  Practically, I suspect it would just
cause confusion.

Perhaps passing an Iteration instance to next() should not be
treated the same as passing None.  It seems like that would make
implementing the iterator easier.  Why not treat Iteration like any
normal value?  Then only None, StopIteration, and ContinueIteration
would be special.

Argh, it took me so long to write this that you are already up to
version 1.6 of the PEP.  Time to start a new message. :-)

  Neil


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 03:58 PM 4/27/05 -0700, Guido van Rossum wrote:
OK, I sort of get it, at a very high-level, although I still feel this
is wildly out of my league.
I guess I should try it first. ;-)
It's not unlike David Mertz' articles on implementing coroutines and 
multitasking using generators, except that I'm adding more "debugging 
sugar", if you will, by making the tracebacks look normal.  It's just that 
the *how* requires me to pass the traceback into the generator.  At the 
moment, I accomplish that by doing a 3-argument raise inside of 
'events.resume()', but it would be really nice to be able to get rid of 
'events.resume()' in a future version of Python.
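
The underlying mechanism is the ordinary 3-argument raise (Python 2
syntax); a stripped-down sketch of the idea, independent of the
events.resume() machinery:

    import sys

    try:
        1 / 0
    except ZeroDivisionError:
        saved = sys.exc_info()          # (type, value, traceback)

    # ...later, when the waiting generator is resumed, the saved exception
    # can be re-raised with its original traceback intact:
    try:
        raise saved[0], saved[1], saved[2]
    except ZeroDivisionError:
        pass                            # traceback still points at "1 / 0"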


> Of course, it seems to me that you also have the problem of adding to the
> traceback when the same error is reraised...
I think when it is re-raised, no traceback entry should be added; the
place that re-raises it should not show up in the traceback, only the
place that raised it in the first place. To me that's the essence of
re-raising (and I think that's how it works when you use raise without
arguments).
I think maybe I misspoke.  I mean adding to the traceback *so* that when 
the same error is reraised, the intervening frames are included, rather 
than lost.

In other words, IIRC, the traceback chain is normally increased by one 
entry for each frame the exception escapes.  However, if you start hiding 
that inside of the exception instance, you'll have to modify it instead of 
just modifying the threadstate.  Does that make sense, or am I missing 
something?


> For that matter, I don't see a lot of value in
> hand-writing new objects with resume/error, instead of just using a 
generator.

Not a lot, but I expect that there may be a few, like an optimized
version of lock synchronization.
My point was mainly that we can err on the side of caller convenience 
rather than callee convenience, if there are fewer implementations.  So, 
e.g. multiple methods aren't a big deal if it makes the 'block' 
implementation simpler, if only generators and a handful of special 
template objects are going need to implement the block API.


> So, I guess I'm thinking you'd have something like tp_block_resume and
> tp_block_error type slots, and generators' tp_iter_next would just be the
> same as tp_block_resume(None).
I hadn't thought much about the C-level slots yet, but this is a
reasonable proposal.
Note that it also doesn't require a 'next()' builtin, or a next vs. 
__next__ distinction, if you don't try to overload iteration and 
templating.  The fact that a generator can be used for templating, doesn't 
have to imply that any iterator should be usable as a template, or that the 
iteration protocol is involved in any way.  You could just have 
__resume__/__error__ matching the tp_block_* slots.

This also has the benefit of making the delineation between template blocks 
and for loops more concrete.  For example, this:

block open("filename") as f:
...
could be an immediate TypeError (due to the lack of a __resume__) instead 
of biting you later on in the block when you try to do something with f, or 
because the block is repeating for each line of the file, etc.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Nick Coghlan wrote:
> Brett C. wrote:
> 
>> And while the thought is in my head, I think block statements should
>> be viewed
>> less as a tweaked version of a 'for' loop and more as an extension to
>> generators that happens to be very handy for resource management (while
>> allowing iterators to come over and play on the new swing set as
>> well).  I
>> think if you take that view then the argument that they are too
>> similar to
>> 'for' loops loses some luster (although I doubt Nick is going to
>> buy this =)."
> 
> 
> I'm surprisingly close to agreeing with you, actually. I've worked out
> that it isn't the looping that I object to, it's the inability to get
> out of the loop without exhausting the entire iterator.
> 

'break' isn't enough for you as laid out by the proposal?  The raising of
StopIteration, which is what 'break' does according to the standard, should be
enough to stop the loop without exhausting things.  Same way you stop a 'for'
loop from executing entirely.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> OK - so what is the point of the sentence::
> 
> The generator should re-raise this exception; it should not yield
> another value.
> 
> when discussing StopIteration?

It forbids returning a value, since that would mean the generator
could "refuse" a break or return statement, which is a little bit too
weird (returning a value instead would turn these into continue
statements).

I'll change this to clarify that I don't care about the identity of
the StopIteration instance.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
>> A minor sticking point - I don't like that the generator has to
>> re-raise any ``StopIteration`` passed in. Would it be possible to
>> have the semantics be:
>>
>> If a generator is resumed with ``StopIteration``, the exception
>> is raised at the resumption point (and stored for later use).
>> When the generator exits normally (i.e. ``return`` or falls off
>> the end) it re-raises the stored exception (if any) or raises a
>> new ``StopIteration`` exception.
>
> I don't like the idea of storing exceptions. Let's just say that we
> don't care whether it re-raises the very same StopIteration exception
> that was passed in or a different one -- it's all moot anyway because
> the StopIteration instance is thrown away by the caller of next().

OK - so what is the point of the sentence::
   The generator should re-raise this exception; it should not yield
   another value.  

when discussing StopIteration?
Tim Delaney
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Tim Delaney wrote:
Also, within a for-loop or block-statement, we could have ``raise <exception>``
be equivalent to::

    arg = <exception>
    continue
For this to work, builtin next() would need to be a bit smarter ... 
specifically, for an old-style iterator, any non-Iteration exception would 
need to be re-raised there.
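
A sketch of what that smarter builtin might look like (purely hypothetical: it assumes PEP 340's proposed two-argument ``next()``, with ``StopIteration`` standing in for the whole Iteration family):

    def next(itr, arg=None):
        # Hypothetical two-argument next() from PEP 340, extended as above.
        if hasattr(itr, "__next__"):
            return itr.__next__(arg)    # new-style block iterator decides itself
        if isinstance(arg, BaseException) and not isinstance(arg, StopIteration):
            raise arg                   # old-style iterator can't accept it: re-raise
        return itr.next()               # plain old iteration protocol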

Tim Delaney 

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> A minor sticking point - I don't like that the generator has to re-raise any
> ``StopIteration`` passed in. Would it be possible to have the semantics be:
> 
> If a generator is resumed with ``StopIteration``, the exception is raised
> at the resumption point (and stored for later use). When the generator
> exits normally (i.e. ``return`` or falls off the end) it re-raises the
> stored exception (if any) or raises a new ``StopIteration`` exception.

I don't like the idea of storing exceptions. Let's just say that we
don't care whether it re-raises the very same StopIteration exception
that was passed in or a different one -- it's all moot anyway because
the StopIteration instance is thrown away by the caller of next().

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip]
> Probably my attempt at a *brief* explanation backfired.  No, they're not
> continuations or anything nearly that complicated.  I'm "just" simulating
> threads using generators that yield a nested generator when they need to do
> something that might block waiting for I/O.  The pseudothread object pushes
> the yielded generator-iterator and resumes it.  If that generator-iterator
> raises an error, the pseudothread catches it, pops the previous
> generator-iterator, and passes the error into it, traceback and all.
> 
> The net result is that as long as you use a "yield expression" for any
> function/method call that might do blocking I/O, and those functions or
> methods are written as generators, you get the benefits of Twisted (async
> I/O without threading headaches) without having to "twist" your code into
> the callback-registration patterns of Twisted.  And, by passing in errors
> with tracebacks, the normal process of exception call-stack unwinding
> combined with pseudothread stack popping results in a traceback that looks
> just as if you had called the functions or methods normally, rather than
> via the pseudothreading mechanism.  Without that, you would only get the
> error context of 'async_readline()', because the traceback wouldn't be able
> to show who *called* async_readline.

OK, I sort of get it, at a very high-level, although I still feel this
is wildly out of my league.

I guess I should try it first. ;-)

> >In Python 3000 I want to make the traceback a standard attribute of
> >Exception instances; would that suffice?
> 
> If you're planning to make 'raise' reraise it, such that 'raise exc' is
> equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you
> mean?  (i.e., just making it easier to pass the darn things around)
> 
> If so, then I could probably do what I need as long as there exist no error
> types whose instances disallow setting a 'traceback' attribute on them
> after the fact.  Of course, if Exception provides a slot (or dictionary)
> for this, then it shouldn't be a problem.

Right, this would be a standard part of the Exception base class, just
like in Java.

> Of course, it seems to me that you also have the problem of adding to the
> traceback when the same error is reraised...

I think when it is re-raised, no traceback entry should be added; the
place that re-raises it should not show up in the traceback, only the
place that raised it in the first place. To me that's the essence of
re-raising (and I think that's how it works when you use raise without
arguments).

> All in all it seems more complex than just allowing an exception and a
> traceback to be passed.

Making the traceback a standard attribute of the exception sounds
simpler; having to keep track of two separate arguments that are as
closely related as an exception and the corresponding traceback is
more complex IMO.

The only reason why it isn't done that way in current Python is that
it couldn't be done that way back when exceptions were strings.

> >I really don't want to pass
> >the whole (type, value, traceback) triple that currently represents an
> >exception through __next__().
> 
> The point of passing it in is so that the traceback can be preserved
> without special action in the body of generators the exception is passing
> through.
> 
> I could be wrong, but it seems to me you need this even for PEP 340, if
> you're going to support error management templates, and want tracebacks to
> include the line in the block where the error originated.  Just reraising
> the error inside the generator doesn't seem like it would be enough.

*** I have to think about this more... ***

> > > I think it'd be simpler just to have two methods, conceptually
> > > "resume(value=None)" and "error(value,tb=None)", whatever the actual 
> > > method
> > > names are.
> >
> >Part of me likes this suggestion, but part of me worries that it
> >complicates the iterator API too much.
> 
> I was thinking that maybe these would be a "coroutine API" or "generator
> API" instead.  That is, something not usable except with
> generator-iterators and with *new* objects written to conform to it.  I
> don't really see a lot of value in making template blocks work with
> existing iterators.

(You mean existing non-generator iterators, right? Existing
*generators* will work just fine -- the exception will pass right
through them, and that's exactly the right default semantics.)

Existing non-generator iterators are indeed a different case, and this
is actually an argument for having a separate API: if the __error__()
method doesn't exist, the exception is just re-raised rather than
bothering the iterator.

OK, I think I'm sold.

> For that matter, I don't see a lot of value in
> hand-writing new objects with resume/error, instead of just using a generator.

Not a lot, but I expect that there may be a few, like an optimized
version of lock synchronization.

> So, I guess I'm thinking you'd have something like tp_bl

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
- temporarily sidestepping the syntax by proposing 'block' instead of
'with'
- __next__() argument simplified to StopIteration or
ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
+1
A minor sticking point - I don't like that the generator has to re-raise any 
``StopIteration`` passed in. Would it be possible to have the semantics be:

    If a generator is resumed with ``StopIteration``, the exception is raised
    at the resumption point (and stored for later use). When the generator
    exits normally (i.e. ``return`` or falls off the end) it re-raises the
    stored exception (if any) or raises a new ``StopIteration`` exception.

So a generator would become effectively::

    try:
        stopexc = None
        exc = None
        BLOCK1
    finally:
        if exc is not None:
            raise exc
        if stopexc is not None:
            raise stopexc
        raise StopIteration

where within BLOCK1:

    ``raise <exception>`` is equivalent to::

        exc = <exception>
        return

    The start of an ``except`` clause sets ``exc`` to None (if the clause is
    executed, of course).

    Calling ``__next__(exception)`` with ``StopIteration`` is equivalent to::

        stopexc = exception
        (raise exception at resumption point)

    Calling ``__next__(exception)`` with ``ContinueIteration`` is equivalent to::

        (resume execution with exception.value)

    Calling ``__next__(exception)`` with any other value just raises that value
    at the resumption point - this allows for calling with arbitrary exceptions.

Also, within a for-loop or block-statement, we could have ``raise <exception>``
be equivalent to::

    arg = <exception>
    continue
This also takes care of Brett's concern about distinguishing between 
exceptions and values passed to the generator. Anything except StopIteration 
or ContinueIteration will be presumed to be an exception and will be raised. 
Anything passed via ContinueIteration is a value.

Tim Delaney 

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 02:50 PM 4/27/05 -0700, Guido van Rossum wrote:
[Guido]
> >I'm not sure what the relevance of including a stack trace would be,
> >and why that feature would be necessary to call them coroutines.
[Phillip]
> Well, you need that feature in order to retain traceback information when
> you're simulating threads with a stack of generators.  Although you can't
> return from a generator inside a nested generator, you can simulate this by
> keeping a stack of generators and having a wrapper that passes control
> between generators, such that:
>
>  def somegen():
>  result = yield othergen()
>
> causes the wrapper to push othergen() on the generator stack and execute
> it.  If othergen() raises an error, the wrapper resumes somegen() and
> passes in the error.  If you can only specify the value but not the
> traceback, you lose the information about where the error occurred in
> othergen().
>
> So, the feature is necessary for anything other than "simple" (i.e.
> single-frame) coroutines, at least if you want to retain any possibility of
> debugging.  :)
OK. I think you must be describing continuations there, because my
brain just exploded. :-)
Probably my attempt at a *brief* explanation backfired.  No, they're not 
continuations or anything nearly that complicated.  I'm "just" simulating 
threads using generators that yield a nested generator when they need to do 
something that might block waiting for I/O.  The pseudothread object pushes 
the yielded generator-iterator and resumes it.  If that generator-iterator 
raises an error, the pseudothread catches it, pops the previous 
generator-iterator, and passes the error into it, traceback and all.

The net result is that as long as you use a "yield expression" for any 
function/method call that might do blocking I/O, and those functions or 
methods are written as generators, you get the benefits of Twisted (async 
I/O without threading headaches) without having to "twist" your code into 
the callback-registration patterns of Twisted.  And, by passing in errors 
with tracebacks, the normal process of exception call-stack unwinding 
combined with pseudothread stack popping results in a traceback that looks 
just as if you had called the functions or methods normally, rather than 
via the pseudothreading mechanism.  Without that, you would only get the 
error context of 'async_readline()', because the traceback wouldn't be able 
to show who *called* async_readline.
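
For concreteness, here is a minimal sketch of that pseudothread pattern. It is written with the send()/throw() methods and generator return values that later Pythons grew (PEP 342/380), which PEP 340 spells differently, and the names (run, fetch_line, main) are illustrative only:

    def run(task):
        stack = [task]          # stack of suspended generator frames
        value = None            # value to send into the top frame
        exc = None              # exception to throw into the top frame
        while stack:
            gen = stack[-1]
            try:
                if exc is not None:
                    result = gen.throw(exc)   # deliver error, traceback intact
                    exc = None
                else:
                    result = gen.send(value)
                    value = None
            except StopIteration as stop:
                stack.pop()                   # this frame "returned"
                value = getattr(stop, "value", None)
            except BaseException as err:
                stack.pop()                   # unhandled error: unwind to caller
                exc = err
            else:
                if hasattr(result, "send"):   # a nested generator was yielded
                    stack.append(result)
                    value = None
                else:
                    value = result            # echo plain values straight back
        if exc is not None:
            raise exc
        return value

    def fetch_line():
        # Pretend this frame waited on non-blocking I/O before producing a line.
        return "a line that arrived without blocking\n"
        yield                   # never reached; just makes this a generator

    def main():
        line = yield fetch_line()   # "call" another pseudothread frame
        print(line.strip())

    run(main())

Because errors are delivered into the suspended frames with throw(), an exception raised deep inside a nested frame unwinds through each pseudothread frame in turn, so the final traceback reads as if the generators had called each other normally.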


In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice?
If you're planning to make 'raise' reraise it, such that 'raise exc' is 
equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you 
mean?  (i.e., just making it easier to pass the darn things around)

If so, then I could probably do what I need as long as there exist no error 
types whose instances disallow setting a 'traceback' attribute on them 
after the fact.  Of course, if Exception provides a slot (or dictionary) 
for this, then it shouldn't be a problem.

Of course, it seems to me that you also have the problem of adding to the 
traceback when the same error is reraised...

All in all it seems more complex than just allowing an exception and a 
traceback to be passed.


I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().
The point of passing it in is so that the traceback can be preserved 
without special action in the body of generators the exception is passing 
through.

I could be wrong, but it seems to me you need this even for PEP 340, if 
you're going to support error management templates, and want tracebacks to 
include the line in the block where the error originated.  Just reraising 
the error inside the generator doesn't seem like it would be enough.


> >An alternative that solves this would be to give __next__() a second
> >argument, which is a bool that should be true when the first argument
> >is an exception that should be raised. What do people think?
>
> I think it'd be simpler just to have two methods, conceptually
> "resume(value=None)" and "error(value,tb=None)", whatever the actual method
> names are.
Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much.
I was thinking that maybe these would be a "coroutine API" or "generator 
API" instead.  That is, something not usable except with 
generator-iterators and with *new* objects written to conform to it.  I 
don't really see a lot of value in making template blocks work with 
existing iterators.  For that matter, I don't see a lot of value in 
hand-writing new objects with resume/error, instead of just using a generator.

So, I guess I'm thinking you'd have something like tp_block_resume and 
tp_block_error type slots, and generators' tp_iter_next would just be the 
same as tp_block_resume(None).

But maybe this is the part you're thinking is complicated.  :)
___

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
> > An alternative that solves this would be to give __next__() a second
> > argument, which is a bool that should be true when the first argument
> > is an exception that should be raised. What do people think?
> >
> > I'll add this to the PEP as an alternative for now.

[Nick]
> An optional third argument (raise=False) seems a lot friendlier (and more
> flexible) than a typecheck.

I think I agree, especially since Phillip's alternative (a different
method) is even worse IMO.

> Yet another alternative would be for the default behaviour to be to raise
> Exceptions, and continue with anything else, and have the third argument be
> "raise_exc=True" and set it to False to pass an exception in without raising 
> it.

You've lost me there. If you care about this, can you write it up in
more detail (with code samples or whatever)? Or we can agree on a 2nd
arg to __next__() (and a 3rd one to next()).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Guido van Rossum wrote:
An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I'll add this to the PEP as an alternative for now.
An optional third argument (raise=False) seems a lot friendlier (and more 
flexible) than a typecheck.

Yet another alternative would be for the default behaviour to be to raise 
Exceptions, and continue with anything else, and have the third argument be 
"raise_exc=True" and set it to False to pass an exception in without raising it.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Brett C. wrote:
And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to buy this
=) .
I'm surprisingly close to agreeing with you, actually. I've worked out that it 
isn't the looping that I object to, it's the inability to get out of the loop 
without exhausting the entire iterator.

I need to think about some ideas involving iterator factories, then my 
objections may disappear.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Jim Fulton]

> 2. I assume it would be a hack to try to use block statements to implement
> something like interfaces or classes, because doing so would require
> significant local-variable manipulation.  I'm guessing that
> either implementing interfaces (or implementing a class statement
> in which the class was created before execution of a suite)
> is not a use case for this PEP.

I would like to get back to the discussion about interfaces and
signature type declarations at some point, and a syntax dedicated to
declaring interfaces is high on my wish list.

In the mean time, if you need interfaces today, I think using
metaclasses would be easier than using a block-statement (if it were
even possible using the latter without passing locals() to the
generator).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> If the iterator fails to re-raise the StopIteration exception (the spec
> only says it should, not that it must) I think the return would be ignored
> but a subsequent exception would then get converted into a return value. I
> think the flag needs to be reset to avoid this case.

Good catch. I've fixed this in the PEP.

> Also, I wonder whether other exceptions from next() shouldn't be handled a
> bit differently. If BLOCK1 throws an exception, and this causes the
> iterator to also throw an exception then one exception will be lost. I
> think it would be better to propagate the original exception rather than
> the second exception.

I don't think so. It's similar to this case:

    try:
        raise Foo
    except:
        raise Bar

Here, Foo is also lost.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
> >I'm not sure what the relevance of including a stack trace would be,
> >and why that feature would be necessary to call them coroutines.

[Phillip]
> Well, you need that feature in order to retain traceback information when
> you're simulating threads with a stack of generators.  Although you can't
> return from a generator inside a nested generator, you can simulate this by
> keeping a stack of generators and having a wrapper that passes control
> between generators, such that:
> 
>  def somegen():
>  result = yield othergen()
> 
> causes the wrapper to push othergen() on the generator stack and execute
> it.  If othergen() raises an error, the wrapper resumes somegen() and
> passes in the error.  If you can only specify the value but not the
> traceback, you lose the information about where the error occurred in
> othergen().
> 
> So, the feature is necessary for anything other than "simple" (i.e.
> single-frame) coroutines, at least if you want to retain any possibility of
> debugging.  :)

OK. I think you must be describing continuations there, because my
brain just exploded. :-)

In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice? I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().

> Yes, it would be nice.  Also, you may have just come up with an even better
> word for what these things should be called... patterns.  Perhaps they
> could be called "pattern blocks" or "patterned blocks".  Pattern sounds so
> much more hip and politically correct than "macro" or even "code block".  :)

Yes, but the word has a much loftier meaning. I could get used to
template blocks though (template being a specific pattern, and this
whole thing being a non-OO version of the Template Method Pattern from
the GoF book).

> >An alternative that solves this would be to give __next__() a second
> >argument, which is a bool that should be true when the first argument
> >is an exception that should be raised. What do people think?
> 
> I think it'd be simpler just to have two methods, conceptually
> "resume(value=None)" and "error(value,tb=None)", whatever the actual method
> names are.

Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much. Your resume() would be
__next__(), but that means your error() would become __error__(). This
is more along the lines of PEP 288 and PEP 325 (and even PEP 310), but
we have a twist here in that it is totally acceptable (see my example)
for __error__() to return the next value or raise StopIteration. IOW
the return behavior of __error__() is the same as that of __next__().
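
For concreteness, a hand-written block iterator under that two-method scheme might look something like this (entirely hypothetical API, sketched only to show the shape; the lock example follows the "optimized lock synchronization" idea mentioned elsewhere in this thread):

    class locking(object):
        def __init__(self, lock):
            self.lock = lock
            self.entered = False
        def __next__(self, arg=None):
            # First call enters the block; the second call ends it.
            if self.entered:
                self.lock.release()
                raise StopIteration
            self.lock.acquire()
            self.entered = True
        def __error__(self, exc):
            # Same return behaviour as __next__(): may return a value,
            # raise StopIteration, or (as here) let the error propagate.
            self.lock.release()
            raise exc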

Fredrik, what does your intuition tell you?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 01:27 PM 4/27/05 -0700, Guido van Rossum wrote:
[Phillip Eby]
> Very nice.  It's not clear from the text, btw, if normal exceptions can be
> passed into __next__, and if so, whether they can include a traceback.  If
> they *can*, then generators can also be considered co-routines now, in
> which case it might make sense to call blocks "coroutine blocks", because
> they're basically a way to interleave a block of code with the execution of
> a specified coroutine.
The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)
I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.
Well, you need that feature in order to retain traceback information when 
you're simulating threads with a stack of generators.  Although you can't 
return from a generator inside a nested generator, you can simulate this by 
keeping a stack of generators and having a wrapper that passes control 
between generators, such that:

def somegen():
result = yield othergen()
causes the wrapper to push othergen() on the generator stack and execute 
it.  If othergen() raises an error, the wrapper resumes somegen() and 
passes in the error.  If you can only specify the value but not the 
traceback, you lose the information about where the error occurred in 
othergen().

So, the feature is necessary for anything other than "simple" (i.e. 
single-frame) coroutines, at least if you want to retain any possibility of 
debugging.  :)


But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:
def safeLoop(seq):
    for var in seq:
        try:
            yield var
        except Exception, err:
            print "ignored", var, ":", err.__class__.__name__

block safeLoop([10, 5, 0, 20]) as x:
    print 1.0/x
Yes, it would be nice.  Also, you may have just come up with an even better 
word for what these things should be called... patterns.  Perhaps they 
could be called "pattern blocks" or "patterned blocks".  Pattern sounds so 
much more hip and politically correct than "macro" or even "code block".  :)


An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I think it'd be simpler just to have two methods, conceptually 
"resume(value=None)" and "error(value,tb=None)", whatever the actual method 
names are.

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread David Ascher
On 4/27/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:

> As long as I am BDFL Python is unlikely to get continuations -- my
> head explodes each time someone tries to explain them to me.

You just need a safety valve installed. It's outpatient surgery, don't worry.

--david
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> I feel like we're quietly, delicately tiptoeing toward continuations...

No way. We're not really adding anything to the existing
generator machinery (the exception/value passing is a trivial
modification) and that is only capable of 80% of coroutines (but it's
the 80% you need most :-).

As long as I am BDFL Python is unlikely to get continuations -- my
head explodes each time someone tries to explain them to me.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip Eby]
> Very nice.  It's not clear from the text, btw, if normal exceptions can be
> passed into __next__, and if so, whether they can include a traceback.  If
> they *can*, then generators can also be considered co-routines now, in
> which case it might make sense to call blocks "coroutine blocks", because
> they're basically a way to interleave a block of code with the execution of
> a specified coroutine.

The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)

I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.

But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:

def safeLoop(seq):
    for var in seq:
        try:
            yield var
        except Exception, err:
            print "ignored", var, ":", err.__class__.__name__

block safeLoop([10, 5, 0, 20]) as x:
    print 1.0/x

This should print

    0.1
    0.2
    ignored 0 : ZeroDivisionError
    0.05
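
For what it's worth, this behaviour can be approximated by driving the generator by hand, using the throw() method generators later acquired via PEP 342 (not part of PEP 340) and spelled here in later-Python syntax; the block statement above would hide all of this plumbing:

    def safeLoop(seq):
        for var in seq:
            try:
                yield var
            except Exception as err:
                print("ignored", var, ":", err.__class__.__name__)

    gen = safeLoop([10, 5, 0, 20])
    x = next(gen)
    while True:
        try:
            try:
                print(1.0 / x)
            except Exception as err:
                x = gen.throw(err)   # hand the body's exception to the generator
            else:
                x = next(gen)
        except StopIteration:
            break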

I've been thinking of alternative signatures for the __next__() method
to handle this. We have the following use cases:

1. plain old next()
2. passing a value from continue EXPR
3. forcing a break due to a break statement
4. forcing a break due to a return statement
5. passing an exception EXC

Cases 3 and 4 are really the same; I don't think the generator needs
to know the difference between a break and a return statement. And
these can be mapped to case 5 with EXC being StopIteration().

Now the simplest API would be this: if the argument to __next__() is
an exception instance (let's say we're talking Python 3000, where all
exceptions are subclasses of Exception), it is raised when yield
resumes; otherwise it is the return value from yield (may be None).

This is somewhat unsatisfactory because it means that you can't pass
an exception instance as a value. I don't know how much of a problem
this will be in practice; I could see it causing unpleasant surprises
when someone designs an API around this that takes an arbitrary
object, when someone tries to pass an exception instance. Fixing such
a thing could be expensive (you'd have to change the API to pass the
object wrapped in a list or something).

An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?

I'll add this to the PEP as an alternative for now.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).
> 
> Some highlights:
> 
> - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
> - __next__() argument simplified to StopIteration or ContinueIteration 
> instance
> - use "continue EXPR" to pass a value to the generator
> - generator exception handling explained
> 

I am at least +0 on all of this now, with a slow warming up to +1 (but then it
might just be the cold talking  =).

I still prefer the idea of arguments to __next__() be raised if they are
exceptions and otherwise just be returned through the yield expression.  But I
do realize this is easily solved with a helper function now::

    def raise_or_yield(val):
        """Return the argument if not an exception, otherwise raise it.

        Meant to have a yield expression as an argument.  Worries about
        Iteration subclasses are invalid since they will have been handled by
        the __next__() method on the generator already.
        """
        if isinstance(val, Exception):
            raise val
        else:
            return val
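
Presumably it would wrap a yield expression, something along these lines (illustrative only, using the PEP's proposed yield-expression spelling)::

    arg = raise_or_yield((yield some_value))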

My objections that I had earlier to 'continue' and 'break' being somewhat
magical in block statements has subsided.  It all seems reasonable now within
the context of a block statement.

And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to buy this
=) .

Basically block statements are providing a simplified, syntactically supported
way to control a generator externally from itself (or at least this is the
impression I am getting).  I briefly worried about how this would work when
code containing a block statement is refactored into a function, but then I
realized you just push more code into the generator and let the block statement
drive it.  Seems like this
might provide that last key piece for generators to finally provide cool flow
control that we all know they are capable of but just required extra work
beforehand.

-Brett
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum <[EMAIL PROTECTED]> wrote:
> Ouch. Another bug in the PEP. It was late. ;-)
> 
> The "finally:" should have been "except StopIteration:" I've updated
> the PEP online.
> 
> > Unless it is too early for me, I believe what you wanted is...
> > 
> > itr = iter(EXPR1)
> > arg = None
> > while True:
> >     VAR1 = next(itr, arg)
> >     arg = None
> >     BLOCK1
> > else:
> >     BLOCK2
> 
> No, this would just propagate the StopIteration when next() raises it.
> StopIteration is not caught implicitly except around the next() call
> made by the for-loop control code.

Still no good.  On break, the else isn't executed.

How about...

itr = iter(EXPR1)
arg = None
while True:
    try:
        VAR1 = next(itr, arg)
    except StopIteration:
        BLOCK2
        break
    arg = None
    BLOCK1

 - Josiah

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


RE: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Andrew Koenig

> that we are having this discussion at all seems a signal that the
> semantics are likely too subtle.

I feel like we're quietly, delicately tiptoeing toward continuations...


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> Your code for the translation of a standard for loop is flawed.  From
> the PEP:
> 
> for VAR1 in EXPR1:
>     BLOCK1
> else:
>     BLOCK2
> 
> will be translated as follows:
> 
> itr = iter(EXPR1)
> arg = None
> while True:
>     try:
>         VAR1 = next(itr, arg)
>     finally:
>         break
>     arg = None
>     BLOCK1
> else:
>     BLOCK2
> 
> Note that in the translated version, BLOCK2 can only ever execute if
> next raises a StopIteration in the call, and BLOCK1 will never be
> executed because of the 'break' in the finally clause.

Ouch. Another bug in the PEP. It was late. ;-)

The "finally:" should have been "except StopIteration:" I've updated
the PEP online.

> Unless it is too early for me, I believe what you wanted is...
> 
> itr = iter(EXPR1)
> arg = None
> while True:
>     VAR1 = next(itr, arg)
>     arg = None
>     BLOCK1
> else:
>     BLOCK2

No, this would just propagate the StopIteration when next() raises it.
StopIteration is not caught implicitly except around the next() call
made by the for-loop control code.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
On 4/27/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).

So block-statements would be very much like for-loops, except:

(1) iter() is not called on the expression
(2) the fact that break, continue, return or a raised Exception
occurred can all be intercepted by the block-iterator/generator,
though break, return and a raised Exception all look the same to the
block-iterator/generator (they are signaled with a StopIteration)
(3) the while loop can only be broken out of by next() raising a
StopIteration, so all well-behaved iterators will be exhausted when
the block-statement is exited

Hope I got that mostly right.

I know this is looking a little far ahead, but is the intention that
even in Python 3.0 for-loops and block-statements will still be
separate statements?  It seems like there's a pretty large section of
overlap.  Playing with for-loop semantics right now isn't possible due
to backwards compatibility, but when that limitation is removed in
Python 3.0, are we hoping that these two similar structures will be
expressed in a single statement?

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
> I would think that the relevant psuedo-code should look more like:
> 
> except StopIteration:
>     if ret:
>         return exc
>     if exc is not None:
>         raise exc   # XXX See below
>     break

Thanks! This was a bug in the PEP due to a last-minute change in how I
wanted to handle return; I've fixed it as you show (also renaming
'exc' to 'var' since it doesn't always hold an exception).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum <[EMAIL PROTECTED]> wrote:
> 
> I've written a PEP about this topic. It's PEP 340: Anonymous Block
> Statements (http://python.org/peps/pep-0340.html).
> 
> Some highlights:
> 
> - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
> - __next__() argument simplified to StopIteration or ContinueIteration 
> instance
> - use "continue EXPR" to pass a value to the generator
> - generator exception handling explained

Your code for the translation of a standard for loop is flawed.  From
the PEP:

for VAR1 in EXPR1:
    BLOCK1
else:
    BLOCK2

will be translated as follows:

itr = iter(EXPR1)
arg = None
while True:
    try:
        VAR1 = next(itr, arg)
    finally:
        break
    arg = None
    BLOCK1
else:
    BLOCK2


Note that in the translated version, BLOCK2 can only ever execute if
next raises a StopIteration in the call, and BLOCK1 will never be
executed because of the 'break' in the finally clause.

Unless it is too early for me, I believe what you wanted is...

itr = iter(EXPR1)
arg = None
while True:
    VAR1 = next(itr, arg)
    arg = None
    BLOCK1
else:
    BLOCK2

 - Josiah

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 04:37 AM 4/26/05 -0700, Guido van Rossum wrote:
*Fourth*, and this is what makes Greg and me uncomfortable at the same
time as making Phillip and other event-handling folks drool: from the
previous three points it follows that an iterator may *intercept* any
or all of ReturnFlow, BreakFlow and ContinueFlow, and use them to
implement whatever cool or confusing magic they want.
Actually, this isn't my interest at all.  It's the part where you can pass 
values or exceptions *in* to a generator with *less* magic than is 
currently required.

This interest is unrelated to anonymous blocks in any case; it's about 
being able to simulate lightweight pseudo-threads ala Stackless, for use 
with Twisted.  I can do this now of course, but "yield expressions" as 
described in PEP 340 would eliminate the need for the awkward syntax and 
frame hackery I currently use.

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 12:30 AM 4/27/05 -0700, Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration 
instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
Very nice.  It's not clear from the text, btw, if normal exceptions can be 
passed into __next__, and if so, whether they can include a traceback.  If 
they *can*, then generators can also be considered co-routines now, in 
which case it might make sense to call blocks "coroutine blocks", because 
they're basically a way to interleave a block of code with the execution of 
a specified coroutine.

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

>> No, the return sets a flag and raises StopIteration which should make
>> the iterator also raise StopIteration at which point the real return
>> happens. 
> 
> Only if exc is not None
> 
> The only return in the pseudocode is inside "if exc is not None".
> Is there another return that's not shown? ;)
> 

Ah yes, I see now what you mean. 

I would think that the relevant psuedo-code should look more like:

except StopIteration:
    if ret:
        return exc
    if exc is not None:
        raise exc   # XXX See below
    break
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Samuele Pedroni
Jim Fulton wrote:
Duncan Booth wrote:
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that
happens to evaluate to None inside a block simply exits the 
block, rather
   than exiting a surrounding function. Did I miss something, or is
   this a bug?


No, the return sets a flag and raises StopIteration which should make 
the iterator also raise StopIteration at which point the real return 
happens.

Only if exc is not None
The only return in the pseudocode is inside "if exc is not None".
Is there another return that's not shown? ;)
I agree that we leave the block, but it doesn't look like we
leave the surrounding scope.
that we are having this discussion at all seems a signal that the 
semantics are likely too subtle.

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Jim Fulton
Duncan Booth wrote:
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that
happens 
   to evaluate to None inside a block simply exits the block, rather
   than exiting a surrounding function. Did I miss something, or is
   this a bug?


No, the return sets a flag and raises StopIteration which should make the 
iterator also raise StopIteration at which point the real return happens.
Only if exc is not None
The only return in the pseudocode is inside "if exc is not None".
Is there another return that's not shown? ;)
I agree that we leave the block, but it doesn't look like we
leave the surrounding scope.
Jim
--
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714    http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:

> Guido van Rossum wrote:
>> I've written a PEP about this topic. It's PEP 340: Anonymous Block
>> Statements (http://python.org/peps/pep-0340.html).
>> 
> Some observations:
> 
> 1. It looks to me like a bare return or a return with an EXPR3 that
> happens 
> to evaluate to None inside a block simply exits the block, rather
> than exiting a surrounding function. Did I miss something, or is
> this a bug?
> 

No, the return sets a flag and raises StopIteration which should make the 
iterator also raise StopIteration at which point the real return happens.

If the iterator fails to re-raise the StopIteration exception (the spec 
only says it should, not that it must) I think the return would be ignored 
but a subsequent exception would then get converted into a return value. I 
think the flag needs to be reset to avoid this case.

Also, I wonder whether other exceptions from next() shouldn't be handled a 
bit differently. If BLOCK1 throws an exception, and this causes the 
iterator to also throw an exception then one exception will be lost. I 
think it would be better to propagate the original exception rather than 
the second exception.

So something like (added lines to handle both of the above):

itr = EXPR1
exc = arg = None
ret = False
while True:
    try:
        VAR1 = next(itr, arg)
    except StopIteration:
        if exc is not None:
            if ret:
                return exc
            else:
                raise exc   # XXX See below
        break
+   except:
+       if ret or exc is None:
+           raise
+       raise exc # XXX See below
+   ret = False
    try:
        exc = arg = None
        BLOCK1
    except Exception, exc:
        arg = StopIteration()
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Greg Ewing wrote:
Nick Coghlan wrote:
def template():
  # pre_part_1
  yield None
  # post_part_1
  yield None
  # pre_part_2
  yield None
  # post_part_2
  yield None
  # pre_part_3
  yield None
  # post_part_3
def user():
  block = template()
  with block:
    # do_part_1
  with block:
    # do_part_2
  with block:
    # do_part_3

That's an interesting idea, but do you have any use cases
in mind?
I was trying to address a use case which looked something like:
    do_begin()
    # code
    if some_condition:
        do_pre()
        # more code
        do_post()
    do_end()
It's actually doable with a non-looping block statement, but I have yet to come 
up with a version which isn't as ugly as hell.

I worry that it will be too restrictive to be really useful.
Without the ability for the iterator to control which blocks
get executed and when, you wouldn't be able to implement
something like a case statement, for example.
We can't write a case statement with a looping block statement either, since 
we're restricted to executing the same suite whenever we encounter a yield 
expression. At least the non-looping version offers some hope, since each yield 
can result in the execution of different code.

For me, the main sticking point is that we *already* have a looping construct to 
drain an iterator - a 'for' loop. The more different the block statement's 
semantics are from a regular loop, the more powerful I think the combination 
will be. Whereas if the block statement is just a for loop with slightly tweaked 
exception handling semantics, then the potential combinations will be far less 
interesting.

My current thinking is that we would be better served by a block construct that 
guaranteed it would call __next__() on entry and on exit, but did not drain the 
generator (e.g. by supplying appropriate __enter__() and __exit__() methods on 
generators for a PEP 310 style block statement, or __enter__(), __except__() and 
__no_except__() for the enhanced version posted elsewhere in this rambling 
discussion).

However, I'm currently scattering my thoughts across half-a-dozen different 
conversation threads. So I'm going to stop doing that, and try to put it all 
into one coherent post :)

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Jim Fulton
Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
This looks pretty cool.
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that happens
   to evaluate to None inside a block simply exits the block, rather
   than exiting a surrounding function. Did I miss something, or is this
   a bug?
2. I assume it would be a hack to try to use block statements to implement
   something like interfaces or classes, because doing so would require
   significant local-variable manipulation.  I'm guessing that
   either implementing interfaces (or implementing a class statement
   in which the class was created before execution of a suite)
   is not a use case for this PEP.
Jim
--
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714    http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained
I'm still trying to build a case for a non-looping block statement, but the 
proposed enhancements to generators look great. Any further suggestions I make 
regarding a PEP 310 style block statement will account for those generator changes.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).

Some highlights:

- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use "continue EXPR" to pass a value to the generator
- generator exception handling explained

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Guido van Rossum
> > [Greg Ewing]
> >>* It seems to me that this same exception-handling mechanism
> >>would be just as useful in a regular for-loop, and that, once
> >>it becomes possible to put 'yield' in a try-statement, people
> >>are going to *expect* it to work in for-loops as well.

[Guido]
> > (You can already put a yield inside a try-except, just not inside a
> > try-finally.)

[Greg]
> Well, my point still stands. People are going to write
> try-finally around their yields and expect the natural
> thing to happen when their generator is used in a
> for-loop.

Well, the new finalization semantics should take care of that when
their generator is finalized -- its __next__() will be called with
some exception.  But as long as you hang on to the generator, it will not
be finalized, which is distinctly different from the desired
with-statement semantics.

> > There would still be the difference that a for-loop invokes iter()
> > and a with-block doesn't.
>  >
>  > Also, for-loops that don't exhaust the iterator leave it
>  > available for later use.
> 
> Hmmm. But are these big enough differences to justify
> having a whole new control structure? Whither TOOWTDI?

Indeed, but apart from declaring that henceforth the with-statement
(by whatever name) is the recommended looping construct and a
for-statement is just a backwards compatibility macro, I just don't
see how we can implement the necessary immediate cleanup semantics of
a with-statement.  In order to serve as a resource cleanup statement
it *must* have stronger cleanup guarantees than the for-statement can
give (if only for backwards compatibility reasons).

> > """
> > The statement:
> >
> > for VAR in EXPR:
> >     BLOCK
> >
> > does the same thing as:
> >
> > with iter(EXPR) as VAR:    # Note the iter() call
> >     BLOCK
> >
> > except that:
> >
> > - you can leave out the "as VAR" part from the with-statement;
> > - they work differently when an exception happens inside BLOCK;
> > - break and continue don't always work the same way.
> >
> > The only time you should write a with-statement is when the
> > documentation for the function you are calling says you should.
> > """
> 
> Surely you jest. Any newbie reading this is going to think
> he hasn't a hope in hell of ever understanding what is going
> on here, and give up on Python in disgust.

And surely you exaggerate.  How about this then:

The with-statement is similar to the for-loop.  Until you've
learned about the differences in detail, the only time you should
write a with-statement is when the documentation for the function
you are calling says you should.

> >>I'm seriously worried by the
> >>possibility that a return statement could do something other
> >>than return from the function it's written in.
> 
> > Let me explain the use cases that led me to throwing that in
> 
> Yes, I can see that it's going to be necessary to treat
> return as an exception, and accept the possibility that
> it will be abused. I'd still much prefer people refrain
> from abusing it that way, though. Using "return" to spell
> "send value back to yield statement" would be extremely
> obfuscatory.

That depends on where you're coming from.  To Ruby users it will look
completely natural because that's what Ruby uses.  (In fact it'll be a
while before they appreciate the deep differences between yield in
Python and in Ruby.)

But I accept that in Python we might want to use a different keyword
to pass a value to the generator.  I think using 'continue' should
work; continue with a value has no precedent in Python, and continue
without a value happens to have exactly the right semantics anyway.

> > (BTW ReturnFlow etc. aren't great
> > names.  Suggestions?)
> 
> I'd suggest just calling them Break, Continue and Return.

Too close to break, continue and return IMO.

> > One last thing: if we need a special name for iterators and
> > generators designed for use in a with-statement, how about calling
> > them with-iterators and with-generators.
> 
> Except that if it's no longer a "with" statement, this
> doesn't make so much sense...

Then of course we'll call it after whatever the new statement is going
to be called.  If we end up calling it the foible-statement, they will
be foible-iterators and foible-generators.

Anyway, I think I'll need to start writing a PEP.  I'll ask the PEP
editor for a number.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Greg Ewing
I don't think this proposal has any chance as long as
it's dynamically scoped.
It mightn't be so bad if it were lexically scoped,
i.e. a special way of defining a function so that
it shares the lexically enclosing scope. This
would be implementable, since the compiler has
all the necessary information about both scopes
available.
Although it might be better to have some sort of
"outer" declaration for rebinding in the enclosing
scope, instead of doing it on a whole-function basis.
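
(For illustration only: an "outer" declaration of the kind described here is
roughly what eventually became the nonlocal statement of PEP 3104, so in
Python 3 syntax the idea looks like this:)

def counter():
    count = 0
    def bump():
        nonlocal count    # rebind 'count' in the lexically enclosing scope
        count += 1
        return count
    return bump

bump = counter()
print(bump())    # 1
print(bump())    # 2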
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Greg Ewing
Nick Coghlan wrote:
def template():
  # pre_part_1
  yield None
  # post_part_1
  yield None
  # pre_part_2
  yield None
  # post_part_2
  yield None
  # pre_part_3
  yield None
  # post_part_3
def user():
  block = template()
  with block:
    # do_part_1
  with block:
    # do_part_2
  with block:
    # do_part_3
That's an interesting idea, but do you have any use cases
in mind?
I worry that it will be too restrictive to be really useful.
Without the ability for the iterator to control which blocks
get executed and when, you wouldn't be able to implement
something like a case statement, for example.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Brian Sabbey
Nick Coghlan wrote:
Accordingly, I would like to suggest that 'with' revert to something 
resembling the PEP 310 definition:

    resource = EXPR
    if hasattr(resource, "__enter__"):
        VAR = resource.__enter__()
    else:
        VAR = None
    try:
        try:
            BODY
        except:
            raise # Force realisation of sys.exc_info() for use in __exit__()
    finally:
        if hasattr(resource, "__exit__"):
            VAR = resource.__exit__()
        else:
            VAR = None
Generator objects could implement this protocol, with the following 
behaviour:

    def __enter__():
        try:
            return self.next()
        except StopIteration:
            raise RuntimeError("Generator exhausted, unable to enter with block")

    def __exit__():
        try:
            return self.next()
        except StopIteration:
            return None

    def __except__(*exc_info):
        pass

    def __no_except__():
        pass
One peculiarity of this is that every other 'yield' would not be allowed 
in the 'try' block of a try/finally statement (TBOATFS).  Specifically, a 
'yield' reached through the call to __exit__ would not be allowed in the 
TBOATFS.

It gets even more complicated when one considers that 'next' may be called 
inside BODY.  In such a case, it would not be sufficient to just disallow 
every other 'yield' in the TBOATFS.  It seems like 'next' would need some 
hidden parameter that indicates whether 'yield' should be allowed in the 
TBOATFS.

(I assume that if a TBOATFS contains an invalid 'yield', then an exception 
will be raised immediately before its 'try' block is executed.  Or would 
the exception be raised upon reaching the 'yield'?)


These are also possible by combining a normal for loop with a non-looping
with (but otherwise using Guido's exception injection semantics):
def auto_retry(attempts):
    success = [False]
    failures = [0]
    last_exc = [None]    # renamed: 'except' is a reserved word
    def block():
        try:
            yield None
        except:
            failures[0] += 1
        else:
            success[0] = True
    while not success[0] and failures[0] < attempts:
        yield block()
    if not success[0]:
        raise Exception # You'd actually propagate the last inner failure

for attempt in auto_retry(3):
    with attempt:
        do_something_that_might_fail()
I think your example above is a good reason to *allow* 'with' to loop. 
Writing 'auto_retry' with a looping 'with' would be pretty straightforward 
and intuitive.  But the above, non-looping 'with' example requires two 
fairly advanced techniques (inner functions, variables-as-arrays trick) 
that would probably be lost on some python users (and make life more 
difficult for the rest).

But I do see the appeal to having a non-looping 'with'.  In many (most?) 
uses of generators, 'for' and looping 'with' could be used 
interchangeably.  This seems ugly-- more than one way to do it and all 
that.
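
(For comparison, and making no claim about which spelling reads better: the
retry behaviour itself needs no new syntax when written as an ordinary
higher-order function; 'flaky' below is just a made-up stand-in for
do_something_that_might_fail:)

import random

def auto_retry(attempts, func, *args, **kwds):
    # plain function: no inner generator, no variables-as-arrays trick
    for attempt in range(attempts):
        try:
            return func(*args, **kwds)
        except Exception:
            if attempt == attempts - 1:
                raise              # out of retries: propagate the last failure

def flaky():
    if random.random() < 0.5:
        raise IOError("transient failure")
    return "ok"

print auto_retry(3, flaky)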

-Brian
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Josiah Carlson

[Guido]
> OK, now you *must* look at the Boo solution.
> http://boo.codehaus.org/Syntactic+Macros

That is an interesting solution, requiring macro writers to actually
write an AST modifier seems pretty reasonable to me.  Whether we want
macros or not... 

 - Josiah

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Guido van Rossum
[Paul Moore]
> *YUK* I spent a long time staring at this and wondering "where did b come 
> from?"
> 
> You'd have to come up with a very compelling use case to get me to like this.

I couldn't have said it better.

I said it longer though. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Guido van Rossum
[Jim Jewett]
> >> (2)  Add a way to say "Make this function I'm calling use *my* locals
> >> and globals."  This seems to meet all the agreed-upon-as-good use
> >> cases, but there is disagreement over how to sensibly write it.  The
> >> calling function is the place that could get surprised, but people
> >> who want thunks seem to want the specialness in the called function.

[Guido]
> > I think there are several problems with this. First, it looks
> > difficult to provide semantics that cover all the corners for the
> > blending of two namespaces. What happens to names that have a
> > different meaning in each scope?

[Jim]
> Programming error.  Same name ==> same object.

Sounds like a recipe for bugs to me. At the very least it is a total
breach of abstraction, which is the fundamental basis of the
relationship between caller and callee in normal circumstances. The
more I understand your proposal the less I like it.

> If a function is using one of _your_ names for something incompatible,
> then don't call that function with collapsed scope.  The same "problem"
> happens with globals today.  Code in module X can break if module Y
> replaces (not shadows, replaces) a builtin with an incompatible object.
> 
> Except ...
> > (E.g. 'self' when calling a method of
> > another object; or any other name clash.)
> 
> The first argument of a method *might* be a special case.  It seems
> wrong to unbind a bound method.  On the other hand, resource
> managers may well want to use unbound methods for the called
> code.

Well, what would you pass in as the first argument then?

> > Are the globals also blended?  How?
> 
> Yes.  The callee does not even get to see its normal namespace.
> Therefore, the callee does not get to use its normal name resolution.

Another breach of abstraction: if a callee wants to use an imported
module, the import should be present in the caller, not in the callee.

This seems to me to repeat all the mistakes of the dynamic scoping of
early Lisps (including GNU Emacs Lisp I believe).

It really strikes me as an endless source of errors that these
blended-scope callees (in your proposal) are ordinary
functions/methods, which means that they can *also* be called without
blending scopes. Having special syntax to define a callee intended for
scope-blending seems much more appropriate (even if there's also
special syntax at the call site).

> If the name normally resolves in locals (often inlined to a tuple, today),
> it looks in the shared scope, which is "owned" by the caller.  This is
> different from a free variable only because the callee can write to this
> dictionary.

Aha! This suggests that a blend-callee needs to use different bytecode
to avoid doing lookups in the tuple of optimized locals, since the
indices assigned to locals in the callee and the caller won't match up
except by miracle.

> If the name is free in that shared scope, (which implies that the
> callee does not bind it, else it would be added to the shared scope)
> then the callee looks up the caller's nested stack and then to the
> caller's globals, and then the caller's builtins.
> 
> > Second, this construct only makes sense for all callables;

(I meant this to read "does not make sense for all callables".)

> Agreed.

(And I presume you read it that way. :-)

> But using it on a non-function may cause surprising results
> especially if bound methods are not special-cased.
> 
> The same is true of decorators, which is why we have (at least
> initially) "function decorators" instead of "callable decorators".

Not true. It is possible today to write decorators that accept things
other than functions -- in fact, this is often necessary if you want
to write decorators that combine properly with other decorators that
don't return function objects (such as staticmethod and classmethod).

> > it makes no sense when the callable is implemented as
> > a C function,
> 
> Or rather, it can't be implemented, as the compiler may well
> have optimized the variables names right out.  Stack frame
> transitions between C and python are already special.

Understatement of the year. There just is no similarity between C and
Python stack frames. How much do you really know about Python's
internals???

> > or is a class, or an object with a __call__ method.
> 
> These are just calls to __init__ (or __new__) or __call__.

No they're not. Calling a class *first* creates an instance (calling
__new__ if it exists) and *then* calls __init__ (if it exists).

> These may be foolish things to call (particularly if the first
> argument to a method isn't special-cased), but ... it isn't
> a problem if the class is written appropriately.  If the class
> is not written appropriately, then don't call it with collapsed
> scope.

That's easy for you to say. Since the failure behavior is so messy I'd
rather not get started.

> > Third, I expect that if we solve the first two
> > problems, we'll still find that for an efficient implementation we
> need to modify the bytecode of the called function.

Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Paul Moore
On 4/26/05, Jim Jewett <[EMAIL PROTECTED]> wrote:
> I'm not sure I understand this.  The preferred way would be
> to just stick the keyword before the call.  Using 'collapse', it
> would look like:
> 
> def foo(b):
>     c=a
> def bar():
>     a="a1"
>     collapse foo("b1")
>     print b, c    # prints "b1", "a1"
>     a="a2"
>     foo("b2")     # Not collapsed this time
>     print b, c    # still prints "b1", "a1"

*YUK* I spent a long time staring at this and wondering "where did b come from?"

You'd have to come up with a very compelling use case to get me to like this.

Paul.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Jim Jewett
>> (2)  Add a way to say "Make this function I'm calling use *my* locals
>> and globals."  This seems to meet all the agreed-upon-as-good use
>> cases, but there is disagreement over how to sensibly write it.  The
>> calling function is the place that could get surprised, but people
>> who want thunks seem to want the specialness in the called function.

> I think there are several problems with this. First, it looks
> difficult to provide semantics that cover all the corners for the
> blending of two namespaces. What happens to names that have a
> different meaning in each scope? 

Programming error.  Same name ==> same object.  

If a function is using one of _your_ names for something incompatible,
then don't call that function with collapsed scope.  The same "problem"
happens with globals today.  Code in module X can break if module Y
replaces (not shadows, replaces) a builtin with an incompatible object.

Except ...
> (E.g. 'self' when calling a method of
> another object; or any other name clash.) 

The first argument of a method *might* be a special case.  It seems
wrong to unbind a bound method.  On the other hand, resource
managers may well want to use unbound methods for the called
code.

> Are the globals also blended?  How?

Yes.  The callee does not even get to see its normal namespace.
Therefore, the callee does not get to use its normal name resolution.

If the name normally resolves in locals (often inlined to a tuple, today), 
it looks in the shared scope, which is "owned" by the caller.  This is 
different from a free variable only because the callee can write to this 
dictionary.

If the name is free in that shared scope, (which implies that the 
callee does not bind it, else it would be added to the shared scope) 
then the callee looks up the caller's nested stack and then to the 
caller's globals, and then the caller's builtins.
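
(A very rough way to make the shared-namespace idea concrete with today's
Python, using exec with an explicit dict standing in for the caller's scope --
illustration only, since the real proposal would need compiler support for
optimized locals, as discussed further down:)

caller_namespace = {"a": "a1"}

callee_source = """
c = a         # 'a' resolves in the caller's (shared) namespace
a = "a2"      # rebinding here is visible to the caller afterwards
"""

exec callee_source in caller_namespace

print caller_namespace["c"]    # prints a1
print caller_namespace["a"]    # prints a2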

> Second, this construct only makes sense for all callables; 

Agreed.  

But using it on a non-function may cause surprising results
especially if bound methods are not special-cased.

The same is true of decorators, which is why we have (at least 
initially) "function decorators" instead of "callable decorators".

> it makes no sense when the callable is implemented as
> a C function, 

Or rather, it can't be implemented, as the compiler may well
have optimized the variables names right out.  Stack frame
transitions between C and python are already special.

> or is a class, or an object with a __call__ method. 

These are just calls to __init__ (or __new__) or __call__.
These may be foolish things to call (particularly if the first
argument to a method isn't special-cased), but ... it isn't
a problem if the class is written appropriately.  If the class
is not written appropriately, then don't call it with collapsed 
scope.

> Third, I expect that if we solve the first two
> problems, we'll still find that for an efficient implementation we
> need to modify the bytecode of the called function.

Absolutely.  Even giving up the XXX_FAST optimizations would 
still require new bytecode to not assume them.  (Deoptimizing 
*all* functions, in *all* contexts, is not a sensible tradeoff.)

Eventually, an optimizing compiler could do the right thing, but ... 
that isn't the point.  

For a given simple algorithm, interpreted python is generally slower 
than compiled C, but we write in python anyhow -- it is fast enough, 
and has other advantages.  The same is true of anything that lets 
me not cut-and-paste.  

> Try to make sure that it can be used in a "statement context" 
> as well as in an "expression context". 

I'm not sure I understand this.  The preferred way would be
to just stick the keyword before the call.  Using 'collapse', it
would look like:

def foo(b):
    c=a
def bar():
    a="a1"
    collapse foo("b1")
    print b, c    # prints "b1", "a1"
    a="a2"
    foo("b2")     # Not collapsed this time
    print b, c    # still prints "b1", "a1"

but I suppose you could treat it like the 'global' keyword

def bar():
    a="a1"
    collapse foo   # forces foo to always collapse when called within bar
    foo("b1")
    print b, c     # prints "b1", "a1"
    a="a2"
    foo("b2")      # still collapsed
    print b, c     # now prints "b2", "a2"

>> [Alternative 3 ... bigger that merely collapsing scope]
>> (3)  Add macros.  We still have to figure out how to limit their obfuscation.
>> Attempts to detail that goal seem to get sidetracked.

> No, the problem is not how to limit the obfuscation. The problem is
> the same as for (2), only more so: nobody has given even a *remotely*
> plausible mechanism for how exactly you would get code executed at
> compile time.

macros can (and *possibly* should) be evaluated at run-time.  

Compile time should be possible (there is an interpreter running) and 
faster, but ... is certainly not required.

Even if the macros just rerun 

Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Phillip J. Eby wrote:
At 09:12 PM 4/24/05 -0600, Steven Bethard wrote:
I guess it would be helpful to see example where the looping
with-block is useful.

Automatically retry an operation a set number of times before hard failure:
with auto_retry(times=3):
    do_something_that_might_fail()

Process each row of a database query, skipping and logging those that 
cause a processing error:

with x,y,z = log_errors(db_query()):
    do_something(x,y,z)
You'll notice, by the way, that some of these "runtime macros" may be 
stackable in the expression.
These are also possible by combining a normal for loop with a non-looping with 
(but otherwise using Guido's exception injection semantics):

def auto_retry(attempts):
    success = [False]
    failures = [0]
    last_exc = [None]    # renamed: 'except' is a reserved word
    def block():
        try:
            yield None
        except:
            failures[0] += 1
        else:
            success[0] = True
    while not success[0] and failures[0] < attempts:
        yield block()
    if not success[0]:
        raise Exception # You'd actually propagate the last inner failure

for attempt in auto_retry(3):
    with attempt:
        do_something_that_might_fail()
The non-looping version of with seems to give the best of both worlds - 
multipart operation can be handled by multiple with statements, and repeated use 
of the same suite can be handled by nesting the with block inside iteration over 
an appropriate generator.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Aahz
On Tue, Apr 26, 2005, Guido van Rossum wrote:
>
> Now there's one more twist, which you may or may not like.  Presumably
> (barring obfuscations or bugs) the handling of BreakFlow and
> ContinueFlow by an iterator (or generator) is consistent for all uses
> of that particular iterator.  For example synchronized(lock) and
> transactional(db) do not behave as loops, and forever() does.  Ditto
> for handling ReturnFlow.  This is why I've been thinking of leaving
> out the 'with' keyword: in your mind, these calls would become new
> statement types, even though the compiler sees them all the same:
> 
> synchronized(lock):
>     BLOCK
> 
> transactional(db):
>     BLOCK
> 
> forever():
>     BLOCK
> 
> opening(filename) as f:
>     BLOCK

That's precisely why I think we should keep the ``with``: the point of
Python is to have a restricted syntax and requiring a prefix for these
constructs makes it easier to read the code.  You'll soon start to gloss
over the ``with`` but it will be there as a marker for your subconscious.
-- 
Aahz ([EMAIL PROTECTED])   <*> http://www.pythoncraft.com/

"It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses."  "Hit it."
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread ron adam
Hi, this is my first post here and I've been following this very 
interesting discussion as it has developed. 

A really short intro about me,  I was trained as a computer tech in the 
early 80's... ie. learned transistors, gates, logic etc...  And so my 
focus tends to be from that of a troubleshooter.  I'm medically retired 
now (not a subject for here) and am looking for something meaningful and 
rewarding that I can contribute to with my free time.

I will not post often at first as I am still getting up to speed with 
CVS and how Pythons core works.  Hopefully I'm not lagging this 
discussion too far or adding unneeded noise to it.  :-)

So maybe the 'with' keyword should be dropped (again!) in
favour of
  with_opened(pathname) as f:
...
But that doesn't look so great for the case where there's no variable
to be assigned to -- I wasn't totally clear about it, but I meant the
syntax to be
   with [VAR =] EXPR: BLOCK
where VAR would have the same syntax as the left hand side of an
assignment (or the variable in a for-statement).
I keep wanting to read it as:
  with OBJECT [from EXPR]: BLOCK
2) I'm not sure about the '='. It makes it look rather deceptively
like an ordinary assignment, and I'm sure many people are going
to wonder what the difference is between
  with f = opened(pathname):
do_stuff_to(f)
and simply
  f = opened(pathname)
  do_stuff_to(f)
or even just unconsciously read the first as the second without
noticing that anything special is going on. Especially if they're
coming from a language like Pascal which has a much less magical
form of with-statement.
Below is what gives me the clearest picture so far.  To me there is 
nothing 'anonymous' going on here.  Which is good I think. :-)

After playing around with Guido's example a bit, it looks to me the role 
of a 'with' block is to define the life of a resource object.  so "with 
OBJECT: BLOCK" seems to me to be the simplest and most natural way to 
express this.

def with_file(filename, mode):
    """ Create a file resource """
    f = open(filename, mode)
    try:
        yield f              # use yield here
    finally:
        # Do at exit of 'with : '
        f.close()

# Get a resource/generator object and use it.
f_resource = with_file('resource.py', 'r')
with f_resource:
    f = f_resource.next()   # get values from yields
    for line in f:
        print line,

# Generator resource with yield loop.
def with_file(filename):
    """ Create a file line resource """
    f = open(filename, 'r')
    try:
        for line in f:
            yield line
    finally:
        f.close()

# print lines in this file.
f_resource = with_file('resource.py')
with f_resource:
    while 1:
        line = f_resource.next()
        if line == "":
            break
        print line,
The life of an object used with a 'with' block is shorter than that of 
the function it is called from, but if the function is short, the life 
could be the same as the function's. Then the 'with' block could be 
optional if the resource object's __exit__ method is called when the 
function exits, but that may require some way to tag a resource as being 
different from other classes and generators to keep from evaluating 
the __exit__ methods of other objects.
As far as looping behaviors go, I prefer the loop to be explicitly 
defined in the resource  or the body of the 'with', because it looks to 
be more flexible.

Ron_Adam
# "The right question is a good start to finding the correct answer."
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Michael Hudson
Samuele Pedroni <[EMAIL PROTECTED]> writes:

> Michael Hudson wrote:
>
>> The history of iterators and generators could be summarized by
>> saying that an API was invented, then it turned out that in practice
>> one way of implementing them -- generators -- was almost universally
>> useful.
>>
>> This proposal seems a bit like an effort to make generators good at
>> doing something that they aren't really intended -- or dare I say
>> suited? -- for.  The tail wagging the dog so to speak.
>>
> it is fun because the two of us sort of already had this discussion in
> compressed form a lot of time ago:

Oh yes.  That was the discussion that led to PEP 310 being written.

> http://groups-beta.google.com/groups?q=with+generators+pedronis&hl=en

At least I'm consistent :)

> not that I was really convinced about my idea at the time, which was
> very embryonic, and in fact I'm a bit skeptical right now about how
> much bending or not of generators makes sense, especially from a
> learnability point of view.

As am I, obviously.

Cheers,
mwh

-- 
  Agh, the braindamage!  It's not unlike the massively
  non-brilliant decision to use the period in abbreviations
  as well as a sentence terminator.  Had these people no
  imagination at _all_? -- Erik Naggum, comp.lang.lisp
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Samuele Pedroni
Michael Hudson wrote:
The history of iterators and generators could be summarized by saying 
that an API was invented, then it turned out that in practice one way 
of implementing them -- generators -- was almost universally useful.

This proposal seems a bit like an effort to make generators good at 
doing something that they aren't really intended -- or dare I say 
suited? -- for.  The tail wagging the dog so to speak.

it is fun because the two of us sort of already had this discussion in 
compressed form a lot of time ago:

http://groups-beta.google.com/groups?q=with+generators+pedronis&hl=en
not that I was really convinced about my idea at the time, which was very 
embryonic, and in fact I'm a bit skeptical right now about how much 
bending or not of generators makes sense, especially from a learnability 
point of view.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Guido van Rossum
> (2)  Add a way to say "Make this function I'm calling use *my* locals
> and globals."  This seems to meet all the agreed-upon-as-good use
> cases, but there is disagreement over how to sensibly write it.  The
> calling function is the place that could get surprised, but people
> who want thunks seem to want the specialness in the called function.

I think there are several problems with this. First, it looks
difficult to provide semantics that cover all the corners for the
blending of two namespaces. What happens to names that have a
different meaning in each scope? (E.g. 'self' when calling a method of
another object; or any other name clash.) Are the globals also
blended? How? Second, this construct only makes sense for all
callables; you seem to want to apply it for function (and I suppose
methods, whether bound or not), but it makes no sense when the
callable is implemented as a C function, or is a class, or an object
with a __call__ method. Third, I expect that if we solve the first two
problems, we'll still find that for an efficient implementation we
need to modify the bytecode of the called function.

If you really want to pursue this idea beyond complaining "nobody
listens to me" (which isn't true BTW), I suggest that you try to
define *exactly* how you think it should work. Try to make sure that
it can be used in a "statement context" as well as in an "expression
context". You don't need to come up with a working implementation, but
you should be able to convince me (or Raymond H :-) that it *can* be
implemented, and that the performance will be reasonable, and that it
won't affect performance when not used, etc.

If you think that's beyond you, then perhaps you should accept "no" as
the only answer you're gonna get. Because I personally strongly
suspect that it won't work, so the burden of "proof", so to speak, is
on you.

> (3)  Add macros.  We still have to figure out how to limit their obfuscation.
> Attempts to detail that goal seem to get sidetracked.

No, the problem is not how to limit the obfuscation. The problem is
the same as for (2), only more so: nobody has given even a *remotely*
plausible mechanism for how exactly you would get code executed at
compile time. You might want to look at Boo, a Python-inspired
language that translates to C#. They have something they call
syntactic macros: http://boo.codehaus.org/Syntactic+Macros .

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Jim Jewett
Michael Hudson:

> This proposal seems a bit like an effort to make generators good at 
> doing something that they aren't really intended -- or dare I say 
> suited? -- for. 

I think it is more an effort to use the right keyword, which has 
unfortunately already been claimed by generators (and linked
to iterators).

yield

is a sensible way for code to say "your turn, but come back later".

But at the moment, it means "I am producing an intermediate value",
and the way to call that function is to treat it as an iterator (which
seems to imply looping over a closed set, so don't send in more
information after the initial setup).

Should we accept that "yield" is already used up, or should we
shoehorn the concepts until they're "close enough"?

> So, here's a counterproposal!

> with expr as var:
> ... code ...

> is roughly:

> def _(var):
>  ... code ...
> __private = expr
> __private(_)

...

> The need for approximation in the above translation is necessary 
> because you'd want to make assignments in '...code...' affect the scope 
> their written in, 

To me, this seems like the core requirement.  I see three sensible paths:

(1)  Do nothing.

(2)  Add a way to say "Make this function I'm calling use *my* locals 
and globals."  This seems to meet all the agreed-upon-as-good use 
cases, but there is disagreement over how to sensibly write it.  The 
calling function is the place that could get surprised, but people
who want thunks seem to want the specialness in the called function.

(3)  Add macros.  We still have to figure out how to limit their obfuscation.
Attempts to detail that goal seem to get sidetracked.

-jJ
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Michael Hudson wrote:
This is a non-starter, I hope.  I really meant what I said in PEP 310 
about loops being loops.
The more I play with this, the more I want the 'with' construct to NOT be a loop 
construct.

The main reason is that it would be really nice to be able to write and use a 
multipart code template as:

def template():
  # pre_part_1
  yield None
  # post_part_1
  yield None
  # pre_part_2
  yield None
  # post_part_2
  yield None
  # pre_part_3
  yield None
  # post_part_3
def user():
  block = template()
  with block:
    # do_part_1
  with block:
    # do_part_2
  with block:
    # do_part_3
If 'with' is a looping construct, the above won't work, since the first usage 
will drain the template.
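
(To make the intended non-looping behaviour concrete, the same shape can be
driven by hand with next() in today's Python -- each with-block above would
amount to one next() on entry and one on exit:)

def template():
    print "pre part 1"
    yield None
    print "post part 1"
    yield None
    print "pre part 2"
    yield None
    print "post part 2"
    yield None            # final yield so the last "exit" doesn't StopIteration

block = template()

block.next()              # enter part 1: prints "pre part 1"
print "do part 1"
block.next()              # exit part 1:  prints "post part 1"

block.next()              # enter part 2: prints "pre part 2"
print "do part 2"
block.next()              # exit part 2:  prints "post part 2"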

Accordingly, I would like to suggest that 'with' revert to something resembling 
the PEP 310 definition:

    resource = EXPR
    if hasattr(resource, "__enter__"):
        VAR = resource.__enter__()
    else:
        VAR = None
    try:
        try:
            BODY
        except:
            raise # Force realisation of sys.exc_info() for use in __exit__()
    finally:
        if hasattr(resource, "__exit__"):
            VAR = resource.__exit__()
        else:
            VAR = None
Generator objects could implement this protocol, with the following behaviour:
    def __enter__():
        try:
            return self.next()
        except StopIteration:
            raise RuntimeError("Generator exhausted, unable to enter with block")

    def __exit__():
        try:
            return self.next()
        except StopIteration:
            return None

    def __except__(*exc_info):
        pass

    def __no_except__():
        pass
Note that the code template can deal with exceptions quite happily by utilising 
sys.exc_info(), and that the result of the call to __enter__ is available 
*inside* the with block, while the result of the call to __exit__ is available 
*after* the block (useful for multi-part blocks).
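
(A class-based resource under this PEP 310 style protocol would be a minimal
sketch like the following; note the argument-less __exit__ of this draft, which
relies on sys.exc_info() rather than being passed the exception. The name
'opened' is just borrowed from the naming discussion elsewhere in the thread.)

class opened(object):
    def __init__(self, path):
        self.path = path
    def __enter__(self):
        self.f = open(self.path)
        return self.f
    def __exit__(self):
        self.f.close()

Usage under the proposed syntax would then be "with opened(path) as f: BODY".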

If I want to drain the template, then I can use a 'for' loop (albeit without the 
cleanup guarantees).

Taking this route would mean that:
  * PEP 310 and the question of passing values or exceptions into iterators 
would again become orthogonal
  * Resources written using generator syntax aren't cluttered with the 
repetitive try/finally code PEP 310 is trying to eliminate
  * 'for' remains TOOW to write an iterative loop
  * it is possible to execute _different_ suites between each yield in the 
template block, rather than being constrained to a single suite as in the 
looping case.
  * no implications for the semantics of 'return', 'break', 'continue'
  * 'yield' would not be usable inside a with block, unless the AbortIteration 
concept was adopting for forcible generator termination.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Michael Hudson
On 26 Apr 2005, at 15:13, Michael Hudson wrote:
So, here's a counterproposal!
And a correction!
with expr as var:
   ... code ...
is roughly:
def _(var):
    ... code ...
try:
    expr(_)
except Return, e:
    return e.value
Cheers,
mwh
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Michael Hudson
Whew!  This is a bit long...
On 25 Apr 2005, at 00:57, Guido van Rossum wrote:
After reading a lot of contributions (though perhaps not all -- this
thread seems to bifurcate every time someone has a new idea :-)
I haven't read all the posts around the subject, I'll have to admit.  
I've read the one I'm replying and its followups to pretty carefully, 
though.

I'm back to liking yield for the PEP 310 use case. I think maybe it was
Doug Landauer's post mentioning Beta, plus scanning some more examples
of using yield in Ruby. Jim Jewett's post on defmacro also helped, as
did Nick Coghlan's post explaining why he prefers 'with' for PEP 310
and a bare expression for the 'with' feature from Pascal (and other
languages :-).
The history of iterators and generators could be summarized by saying 
that an API was invented, then it turned out that in practice one way 
of implementing them -- generators -- was almost universally useful.

This proposal seems a bit like an effort to make generators good at 
doing something that they aren't really intended -- or dare I say 
suited? -- for.  The tail wagging the dog so to speak.

It seems that the same argument that explains why generators are so
good for defining iterators, also applies to the PEP 310 use case:
it's just much more natural to write
def with_file(filename):
    f = open(filename)
    try:
        yield f
    finally:
        f.close()
This is a syntax error today, of course.  When does the finally: clause 
execute  with your proposal? [I work this one out below :)]

than having to write a class with __entry__ and __exit__ and
__except__ methods (I've lost track of the exact proposal at this
point).

At the same time, having to use it as follows:
for f in with_file(filename):
    for line in f:
        print process(line)
is really ugly,
This is a non-starter, I hope.  I really meant what I said in PEP 310 
about loops being loops.

so we need new syntax, which also helps with keeping
'for' semantically backwards compatible. So let's use 'with', and then
the using code becomes again this:
with f = with_file(filename):
    for line in f:
        print process(line)
Now let me propose a strawman for the translation of the latter into
existing semantics. Let's take the generic case:
with VAR = EXPR:
    BODY
This would translate to the following code:
it = EXPR
err = None
while True:
    try:
        if err is None:
            VAR = it.next()
        else:
            VAR = it.next_ex(err)
    except StopIteration:
        break
    try:
        err = None
        BODY
    except Exception, err:  # Pretend "except Exception:" == "except:"
        if not hasattr(it, "next_ex"):
            raise

(The variables 'it' and 'err' are not user-visible variables, they are
internal to the translation.)

(The variables 'it' and 'err' are not user-visible variables, they are
internal to the translation.)
This looks slightly awkward because of backward compatibility; what I
really want is just this:
it = EXPR
err = None
while True:
    try:
        VAR = it.next(err)
    except StopIteration:
        break
    try:
        err = None
        BODY
    except Exception, err:  # Pretend "except Exception:" == "except:"
        pass

but for backwards compatibility with the existing argument-less next()
API
More than that: if I'm implementing an iterator for, uh, iterating, why 
would one dream of needing to handle an 'err' argument in the next() 
method?

I'm introducing a new iterator API next_ex() which takes an
exception argument.  If that argument is None, it should behave just
like next().  Otherwise, if the iterator is a generator, this will
raised that exception in the generator's frame (at the point of the
suspended yield).  If the iterator is something else, the something
else is free to do whatever it likes; if it doesn't want to do
anything, it can just re-raise the exception.
Ah, this answers my 'when does finally' execute question above.
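
(What "raising the exception at the point of the suspended yield" looks like
can be approximated today with the generator throw() method that later arrived
with PEP 342 / Python 2.5, which also made yield legal inside try/finally:)

import os, tempfile

def with_file(filename):
    f = open(filename)
    try:
        yield f
    finally:
        f.close()

fd, path = tempfile.mkstemp()    # a real file so the sketch actually runs
os.close(fd)

g = with_file(path)
f = g.next()                 # run up to the yield: the file is now open
try:
    g.throw(RuntimeError)    # pretend BODY raised; the finally closes the file
except RuntimeError:
    pass
print f.closed               # prints True
os.remove(path)
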
Finally, I think it would be cool if the generator could trap
occurrences of break, continue and return occurring in BODY.  We could
introduce a new class of exceptions for these, named ControlFlow, and
(only in the body of a with statement), break would raise BreakFlow,
continue would raise ContinueFlow, and return EXPR would raise
ReturnFlow(EXPR) (EXPR defaulting to None of course).
Well, this is quite a big thing.
So a block could return a value to the generator using a return
statement; the generator can catch this by catching ReturnFlow.
(Syntactic sugar could be "VAR = yield ..." like in Ruby.)
With a little extra magic we could also get the behavior that if the
generator doesn't handle ControlFlow exceptions but re-raises them,
they would affect the code containing the with statement; this means
that the generator can decide whether return, break and continue are
handled locally or passed through to the containing block.
Note that EXPR doesn't have to re

Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Reinhold Birkenfeld wrote:
Nick Coghlan wrote:
Guido van Rossum wrote:
[snip]
- I think there's a better word than Flow, but I'll keep using it
 until we find something better.
How about simply reusing Iteration (ala StopIteration)?
  Pass in 'ContinueIteration' for 'continue'
  Pass in 'BreakIteration' for 'break'
  Pass in 'AbortIteration' for 'return' and finalisation.
And advise strongly *against* intercepting AbortIteration with anything other 
than a finally block.

Hmmm... another idea: If break and continue return keep exactly the current
semantics (break or continue the innermost for/while-loop), do we need
different exceptions at all? AFAICS AbortIteration (+1 on the name) would be
sufficient for all three interrupting statements, and this would prevent
misuse too, I think.
No, the iterator should be able to keep state around in the case of 
BreakIteration and ContinueIteration, whereas AbortIteration should shut the 
whole thing down.

In particular "VAR = yield None" is likely to become syntactic sugar for:
  try:
      yield None
  except ContinueIteration, exc:
      VAR = exc.value
We definitely don't want that construct swallowing AbortIteration.
Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Re: anonymous blocks

2005-04-26 Thread Reinhold Birkenfeld
Nick Coghlan wrote:
> Guido van Rossum wrote:
> [snip]
>> - I think there's a better word than Flow, but I'll keep using it
>>   until we find something better.
> 
> How about simply reusing Iteration (ala StopIteration)?
> 
>Pass in 'ContinueIteration' for 'continue'
>Pass in 'BreakIteration' for 'break'
>Pass in 'AbortIteration' for 'return' and finalisation.
> 
> And advise strongly *against* intercepting AbortIteration with anything other 
> than a finally block.

Hmmm... another idea: If break and continue return keep exactly the current
semantics (break or continue the innermost for/while-loop), do we need
different exceptions at all? AFAICS AbortIteration (+1 on the name) would be
sufficient for all three interrupting statements, and this would prevent
misuse too, I think.

yours,
Reinhold

-- 
Mail address is perfectly valid!

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Greg Ewing
Guido van Rossum wrote:
[Greg Ewing]
* It seems to me that this same exception-handling mechanism
would be just as useful in a regular for-loop, and that, once
it becomes possible to put 'yield' in a try-statement, people
are going to *expect* it to work in for-loops as well.
(You can already put a yield inside a try-except, just not inside a
try-finally.)
Well, my point still stands. People are going to write
try-finally around their yields and expect the natural
thing to happen when their generator is used in a
for-loop.
There would still be the difference that a for-loop invokes iter() and
a with-block doesn't.
>
> Also, for-loops that don't exhaust the iterator leave it available for
> later use.
Hmmm. But are these big enough differences to justify
having a whole new control structure? Whither TOOWTDI?
"""
The statement:
for VAR in EXPR:
    BLOCK

does the same thing as:

with iter(EXPR) as VAR:    # Note the iter() call
    BLOCK
except that:
- you can leave out the "as VAR" part from the with-statement;
- they work differently when an exception happens inside BLOCK;
- break and continue don't always work the same way.
The only time you should write a with-statement is when the
documentation for the function you are calling says you should.
"""
Surely you jest. Any newbie reading this is going to think
he hasn't a hope in hell of ever understanding what is going
on here, and give up on Python in disgust.

I'm seriously worried by the
possibility that a return statement could do something other
than return from the function it's written in.

Let me explain the use cases that led me to throwing that in
Yes, I can see that it's going to be necessary to treat
return as an exception, and accept the possibility that
it will be abused. I'd still much prefer people refrain
from abusing it that way, though. Using "return" to spell
"send value back to yield statement" would be extremely
obfuscatory.
(BTW ReturnFlow etc. aren't great
names.  Suggestions?)
I'd suggest just calling them Break, Continue and Return.
synchronized(lock):
    BLOCK

transactional(db):
    BLOCK

forever():
    BLOCK

opening(filename) as f:
    BLOCK
Hey, I like that last one! Well done!
One last thing: if we need a special name for iterators and generators
designed for use in a with-statement, how about calling them
with-iterators and with-generators.
Except that if it's no longer a "with" statement, this
doesn't make so much sense...
Greg


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Guido van Rossum wrote:
[snip]
- I think there's a better word than Flow, but I'll keep using it
  until we find something better.
How about simply reusing Iteration (ala StopIteration)?
  Pass in 'ContinueIteration' for 'continue'
  Pass in 'BreakIteration' for 'break'
  Pass in 'AbortIteration' for 'return' and finalisation.
And advise strongly *against* intercepting AbortIteration with anything other 
than a finally block.

- The new __next__() API can also (nay, *must*, to make all this work
  reliably) be used to define exception and cleanup semantics for
  generators, thereby rendering obsolete PEP 325 and the second half
  of PEP 288.  When a generator is GC'ed (whether by reference
  counting or by the cyclical garbage collector), its __next__()
  method is called with a BreakFlow exception instance as argument (or
  perhaps some other special exception created for the purpose).  If
  the generator catches the exception and yields another value, too
  bad -- I consider that broken behavior.  (The alternative would be
  to keep calling __next__(BreakFlow()) until it doesn't return a
  value, but that feels uncomfortable in a finalization context.)
As suggested above, perhaps the exception used here should be the exception that 
is raised when a 'return' statement is encountered inside the block, rather than 
the more-likely-to-be-messed-with 'break' statement.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Re: anonymous blocks

2005-04-26 Thread Reinhold Birkenfeld
Guido van Rossum wrote:
> [Greg Ewing]
>> I like the general shape of this, but I have one or two
>> reservations about the details.
> 
> That summarizes the feedback so far pretty well. I think we're on to
> something. And I'm not too proud to say that Ruby has led the way here
> to some extent (even if Python's implementation would be fundamentally
> different, since it's based on generators, which has some different
> possibilities and precludes some Ruby patterns).

Five random thoughts:

1. So if break and continue are allowed in with statements only when there
   is an enclosing loop, it would be a inconsistency; consider

     for item in seq:
         with gen():
             continue

   when the generator gen catches the ContinueFlow and does with it what it 
wants.
   It is then slightly unfair not to allow

     with x:
         continue

   Anyway, I would consider both counterintuitive. So what about making 
ReturnFlow,
   BreakFlow and ContinueFlow "private" exceptions that cannot be caught in 
user code
   and instead introducing a new statement that allows passing data to the 
generator?

2. In process of handling this, would it be reasonable to (re)introduce a 
combined
   try-except-finally statement with defined syntax (all except before finally) 
and
   behavior (finally is always executed)?

5. What about the intended usage of 'with' as in Visual B.. NO, NO, NOT THE 
WHIP!

   (not that you couldn't emulate this with a clever "generator":

      def short(x):
          yield x

      with short(my.long["object"].reference()) as _:
          _.spam = _.ham = _.eggs()

yours,
Reinhold

-- 
Mail address is perfectly valid!

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Greg Ewing
Brett C. wrote:
> It executes the body, calling next() on the argument
> name on each time through until the iteration stops.
But that's no good, because (1) it mentions next(),
which should be an implementation detail, and (2)
it talks about iteration, when most of the time
the high-level intent has nothing to do with iteration.
In other words, this is too low a level of explanation.
Greg

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Guido van Rossum
[Greg Ewing]
> I like the general shape of this, but I have one or two
> reservations about the details.

That summarizes the feedback so far pretty well. I think we're on to
something. And I'm not too proud to say that Ruby has led the way here
to some extent (even if Python's implementation would be fundamentally
different, since it's based on generators, which has some different
possibilities and precludes some Ruby patterns).

> 1) We're going to have to think carefully about the naming of
> functions designed for use with this statement. If 'with'
> is going to be in there as a keyword, then it really shouldn't
> be part of the function name as well.

Of course. I only used 'with_opened' because it's been the running
example in this thread.

> I would rather see something like
> 
>with f = opened(pathname):
>  ...
> 
> This sort of convention (using a past participle as a function
> name) would work for some other cases as well:
> 
>with some_data.locked():
>  ...
> 
>with some_resource.allocated():
>  ...


Or how about

with synchronized(some_resource):
    ...

> On the negative side, not having anything like 'with' in the
> function name means that the fact the function is designed for
> use in a with-statement could be somewhat non-obvious. Since
> there's not going to be much other use for such a function,
> this is a bad thing.

This seems a pretty mild problem; one could argue that every function
is only useful in a context where its return type makes sense, and we
seem to be getting along just fine with naming conventions (or just
plain clear naming).

> It could also lead people into subtle usage traps such as
> 
>with f = open(pathname):
>  ...
> 
> which would fail in a somewhat obscure way.

Ouch. That one hurts. (I was going to say "but f doesn't have a next()
method" when I realized it *does*. :-) It is *almost* equivalent to

for f in open(pathname):
    ...

except if the "..." block raises an exception.  Fortunately your
proposal to use 'as' makes this mistake less likely.

> So maybe the 'with' keyword should be dropped (again!) in
> favour of
> 
>with_opened(pathname) as f:
>  ...

But that doesn't look so great for the case where there's no variable
to be assigned to -- I wasn't totally clear about it, but I meant the
syntax to be

with [VAR =] EXPR: BLOCK

where VAR would have the same syntax as the left hand side of an
assignment (or the variable in a for-statement).

> 2) I'm not sure about the '='. It makes it look rather deceptively
> like an ordinary assignment, and I'm sure many people are going
> to wonder what the difference is between
> 
>with f = opened(pathname):
>  do_stuff_to(f)
> 
> and simply
> 
>f = opened(pathname)
>do_stuff_to(f)
> 
> or even just unconsciously read the first as the second without
> noticing that anything special is going on. Especially if they're
> coming from a language like Pascal which has a much less magical
> form of with-statement.

Right.

> So maybe it would be better to make it look more different:
> 
>with opened(pathname) as f:
>  ...

Fredrik said this too, and as long as we're going to add 'with' as a
new keyword, we might as well promote 'as' to become a real
keyword. So then the syntax would become

with EXPR [as VAR]: BLOCK

I don't see a particular need for assignment to multiple VARs (but VAR
can of course be a tuple of identifiers).

> * It seems to me that this same exception-handling mechanism
> would be just as useful in a regular for-loop, and that, once
> it becomes possible to put 'yield' in a try-statement, people
> are going to *expect* it to work in for-loops as well.

(You can already put a yield inside a try-except, just not inside a
try-finally.)

> Guido has expressed concern about imposing extra overhead on
> all for-loops. But would the extra overhead really be all that
> noticeable? For-loops already put a block on the block stack,
> so the necessary processing could be incorporated into the
> code for unwinding a for-block during an exception, and little
> if anything would need to change in the absence of an exception.

Probably.

> However, if for-loops also gain this functionality, we end up
> with the rather embarrassing situation that there is *no difference*
> in semantics between a for-loop and a with-statement!

There would still be the difference that a for-loop invokes iter() and
a with-block doesn't.

Also, for-loops that don't exhaust the iterator leave it available for
later use. I believe there are even examples of this pattern, where
one for-loop searches the iterable for some kind of marker value and
the next for-loop iterates over the remaining items. For example:

f = open(messagefile)
# Process message headers
for line in f:
    if not line.strip():
        break
    if line[0].isspace():
        addcontinuation(line)
    else:
        addheader(line)
# Process mes

Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Ka-Ping Yee
On Mon, 25 Apr 2005, Brett C. wrote:
> It executes the body, calling next() on the argument name on each
> time through until the iteration stops.

There's a little more to it than that.  But on the whole I do support
the goal of finding a simple, short description of what this construct
is intended to do.  If it can be described accurately in a sentence
or two, that's a good sign that the semantics are sufficiently clear
and simple.

> I like "managers" since they are basically managing resources
> most of the time for the user.

No, please let's not call them that.  "Manager" is a very common word
to describe all kinds of classes in object-oriented designs, and it is
so generic as to hardly mean anything.  (Sorry, i don't have a better
alternative at the moment.)


-- ?!ng
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Brett C.
Greg Ewing wrote:
> Brett C. wrote:
> 
>> And before anyone decries the fact that this might confuse a newbie
>> (which
>> seems to happen with every advanced feature ever dreamed up), remember
>> this
>> will not be meant for a newbie but for someone who has experience in
>> Python and
>> iterators at the minimum, and hopefully with generators.
> 
> 
> This is dangerously close to the "you don't need to know about
> it if you're not going to use it" argument, which is widely
> recognised as false. Newbies might not need to know all the
> details of the implementation, but they will need to know
> enough about the semantics of with-statements to understand
> what they're doing when they come across them in other people's
> code.
> 

I am not saying it is totally to be ignored by people staring at Python code,
but we don't need to necessarily spell out the intricacies.

> Which leads me to another concern. How are we going to explain
> the externally visible semantics of a with-statement in a way
> that's easy to grok, without mentioning any details of the
> implementation?
> 
> You can explain a for-loop pretty well by saying something like
> "It executes the body once for each item from the sequence",
> without having to mention anything about iterators, generators,
> next() methods, etc. etc. How the items are produced is completely
> irrelevant to the concept of the for-loop.
> 
> But what is the equivalent level of description of the
> with-statement going to say?
> 
> "It executes the body with... ???"
> 

It executes the body, calling next() on the argument name on each time through
until the iteration stops.

> And a related question: What are we going to call the functions
> designed for with-statements, and the objects they return?
> Calling them generators and iterators (even though they are)
> doesn't seem right, because they're being used for a purpose
> very different from generating and iterating.
> 

I like "managers" since they are basically managing resources most of the time
for the user.

-Brett
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Brett C. wrote:
And before anyone decries the fact that this might confuse a newbie (which
seems to happen with every advanced feature ever dreamed up), remember this
will not be meant for a newbie but for someone who has experience in Python and
iterators at the minimum, and hopefully with generators.
This is dangerously close to the "you don't need to know about
it if you're not going to use it" argument, which is widely
recognised as false. Newbies might not need to know all the
details of the implementation, but they will need to know
enough about the semantics of with-statements to understand
what they're doing when they come across them in other people's
code.
Which leads me to another concern. How are we going to explain
the externally visible semantics of a with-statement in a way
that's easy to grok, without mentioning any details of the
implementation?
You can explain a for-loop pretty well by saying something like
"It executes the body once for each item from the sequence",
without having to mention anything about iterators, generators,
next() methods, etc. etc. How the items are produced is completely
irrelevant to the concept of the for-loop.
But what is the equivalent level of description of the
with-statement going to say?
"It executes the body with... ???"
And a related question: What are we going to call the functions
designed for with-statements, and the objects they return?
Calling them generators and iterators (even though they are)
doesn't seem right, because they're being used for a purpose
very different from generating and iterating.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Guido van Rossum wrote:
with VAR = EXPR:
    BODY
This would translate to the following code:
it = EXPR
err = None
while True:
    try:
        if err is None:
            VAR = it.next()
        else:
            VAR = it.next_ex(err)
    except StopIteration:
        break
    try:
        err = None
        BODY
    except Exception, err:   # Pretend "except Exception:" == "except:"
        if not hasattr(it, "next_ex"):
            raise
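
For concreteness, a minimal sketch (not taken from the proposal) of a
block iterator written as a plain class against the next()/next_ex()
protocol the expansion above assumes; the locking() name and the exact
protocol handling are illustrative assumptions only:

import threading

class locking(object):
    # Illustrative one-shot block iterator: acquire a lock before the
    # block body runs, release it on both normal and exceptional exit.
    def __init__(self, lock):
        self.lock = lock
        self.started = False
    def next(self):
        if not self.started:
            self.started = True
            self.lock.acquire()
            return self.lock        # bound to VAR in the expansion
        self.lock.release()         # body finished normally
        raise StopIteration
    def next_ex(self, err):
        self.lock.release()         # body raised; clean up first,
        raise err                   # then let the exception propagate

lock = threading.Lock()
it = locking(lock)                  # plays the role of EXPR above

Traced through the expansion by hand: the first next() call acquires
the lock and runs the body once; the second call (or next_ex(), if the
body raised) releases the lock and ends the loop.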
I like the general shape of this, but I have one or two
reservations about the details.
1) We're going to have to think carefully about the naming of
functions designed for use with this statement. If 'with'
is going to be in there as a keyword, then it really shouldn't
be part of the function name as well. Instead of
  with f = with_file(pathname):
      ...
I would rather see something like
  with f = opened(pathname):
      ...
This sort of convention (using a past participle as a function
name) would work for some other cases as well:
  with some_data.locked():
      ...
  with some_resource.allocated():
      ...
On the negative side, not having anything like 'with' in the
function name means that the fact the function is designed for
use in a with-statement could be somewhat non-obvious. Since
there's not going to be much other use for such a function,
this is a bad thing.
It could also lead people into subtle usage traps such as
  with f = open(pathname):
      ...
which would fail in a somewhat obscure way.
So maybe the 'with' keyword should be dropped (again!) in
favour of
  with_opened(pathname) as f:
      ...
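
For reference, a function like opened() above might be written as a
block generator roughly as follows (assuming the proposal's lifting of
the current ban on yield inside try/finally; the body is an
illustration, not code from the PEP):

def opened(pathname):
    # Illustrative block generator: open the file, hand it to the
    # block body, and guarantee it is closed afterwards.
    f = open(pathname)
    try:
        yield f
    finally:
        f.close()

Under the proposed semantics, the finally clause would run however the
block is exited.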
2) I'm not sure about the '='. It makes it look rather deceptively
like an ordinary assignment, and I'm sure many people are going
to wonder what the difference is between
  with f = opened(pathname):
      do_stuff_to(f)
and simply
  f = opened(pathname)
  do_stuff_to(f)
or even just unconsciously read the first as the second without
noticing that anything special is going on. Especially if they're
coming from a language like Pascal which has a much less magical
form of with-statement.
So maybe it would be better to make it look more different:
  with opened(pathname) as f:
      ...
* It seems to me that this same exception-handling mechanism
would be just as useful in a regular for-loop, and that, once
it becomes possible to put 'yield' in a try-statement, people
are going to *expect* it to work in for-loops as well.
Guido has expressed concern about imposing extra overhead on
all for-loops. But would the extra overhead really be all that
noticeable? For-loops already put a block on the block stack,
so the necessary processing could be incorporated into the
code for unwinding a for-block during an exception, and little
if anything would need to change in the absence of an exception.
However, if for-loops also gain this functionality, we end up
with the rather embarrassing situation that there is *no difference*
in semantics between a for-loop and a with-statement!
This could be "fixed" by making the with-statement not loop,
as has been suggested. That was my initial thought as well,
but having thought more deeply, I'm starting to think that
Guido was right in the first place, and that a with-statement
should be capable of looping. I'll elaborate in another post.
> So a block could return a value to the generator using a return
> statement; the generator can catch this by catching ReturnFlow.
> (Syntactic sugar could be "VAR = yield ..." like in Ruby.)
This is a very elegant idea, but I'm seriously worried by the
possibility that a return statement could do something other
than return from the function it's written in, especially if
for-loops also gain this functionality. Intercepting break
and continue isn't so bad, since they're already associated
with the loop they're in, but return has always been an
unconditional get-me-out-of-this-function. I'd feel uncomfortable
if this were no longer true.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Tim Delaney wrote:
There aren't many builtins that have magic names, and I don't think this 
should be one of them - it has obvious uses other than as an 
implementation detail.
I think there's some confusion here. As I understood the
suggestion, __next__ would be the Python name of the method
corresponding to the tp_next typeslot, analogously with
__len__, __iter__, etc.
There would be a builtin function next(obj) which would
invoke obj.__next__(), for use by Python code. For loops
wouldn't use it, though; they would continue to call the
tp_next typeslot directly.
Paul Moore wrote: 
PS The first person to replace builtin __next__ in order to implement
a "next hook" of some sort, gets shot :-)
I think he meant next(), not __next__. And it wouldn't
work anyway, since as I mentioned above, C code would
bypass next() and call the typeslot directly.
I'm +1 on moving towards __next__, BTW. IMO, that's the
WISHBDITFP. :-)
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Tim Delaney
Paul Moore wrote:
Hmm, it took me a while to get this, but what you're saying is that
if you modify Guido's "what I really want" solution to use
   VAR = next(it, exc)
then this builtin next makes "API v2" stuff using __next__ work while
remaining backward compatible with old-style "API v1" stuff using
0-arg next() (as long as old-style stuff isn't used in a context where
an exception gets passed back in).
Yes, but it could also be used (almost) anywhere an explicit obj.next() is 
used.

it = iter(seq)
while True:
   print next(it)
for loops would also change to use builtin next() rather than calling 
it.next() directly.
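
A rough sketch of how such a builtin might behave (an assumption about
the proposal, not code from it): prefer the new __next__ method when
present, fall back to the old zero-argument next(), and refuse to pass
an exception into an iterator that cannot accept one:

def next(it, exc=None):
    # Hypothetical builtin: dispatch to "API v2" __next__ if available,
    # otherwise fall back to old-style "API v1" next().
    if hasattr(it, "__next__"):
        if exc is not None:
            return it.__next__(exc)
        return it.__next__()
    if exc is not None:
        raise TypeError("iterator cannot accept a passed-in exception")
    return it.next()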

I'd suggest that the new builtin have a "magic" name (__next__ being
the obvious one :-)) to make it clear that it's an internal
implementation detail.
There aren't many builtins that have magic names, and I don't think this 
should be one of them - it has obvious uses other than as an implementation 
detail.

PS The first person to replace builtin __next__ in order to implement
a "next hook" of some sort, gets shot :-)
Damn! There goes the use case ;)
Tim Delaney 

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


  1   2   >