Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Nick Coghlan
James Y Knight wrote:
 On May 3, 2005, at 12:53 PM, Guido van Rossum wrote:
 
def saving_stdout(f):
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout
 
 
 I hope you aren't going to be using that in any threaded program. 

sys.stdout is a global - threading issues are inherent in monkeying with it.
At least this approach allows all code that redirects stdout to be easily
serialised:

def redirect_stdout(f, the_lock=Lock()):
    locking(the_lock):
        save_stdout = sys.stdout
        try:
            sys.stdout = f
            yield
        finally:
            sys.stdout = save_stdout
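For comparison, here is a hedged sketch of the same serialised redirection in present-day Python, where the `with` statement and `contextlib.contextmanager` are the descendants of this proposal. The module-level lock and the function name are illustrative choices, not part of the PEP (the name also happens to collide with the stdlib's much later `contextlib.redirect_stdout`, which grew out of exactly this idea):

```python
import sys
import threading
from contextlib import contextmanager

_stdout_lock = threading.Lock()  # serialises all stdout redirections

@contextmanager
def redirect_stdout(f):
    # Hold the module lock for the whole redirection, so concurrent
    # redirections cannot interleave; restore stdout even on error.
    with _stdout_lock:
        save_stdout = sys.stdout
        try:
            sys.stdout = f
            yield
        finally:
            sys.stdout = save_stdout
```

Usage: `with redirect_stdout(some_file): print("captured")`.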

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://boredomandlaziness.skystorm.net
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Nick Coghlan
Tim Peters wrote:
 I don't think anyone has mentioned this yet, so I will:  library
 writers using Decimal (or more generally HW 754 gimmicks) have a need
 to fiddle lots of thread-local state (numeric context), and must
 restore it no matter how the routine exits.  Like boost precision to
 twice the user's value over the next 12 computations, then restore,
 and no matter what happens here, restore the incoming value of the
 overflow-happened flag.  It's just another instance of temporarily
 taking over a shared resource, but I think it's worth mentioning that
 there are a lot of things like that in the world, and to which
 decorators don't really sanely apply.

To turn this example into PEP 340 based code:

    # A template to be provided by the decimal module
    # Context is thread-local, so there is no threading problem
    def in_context(context):
        old_context = getcontext()
        try:
            setcontext(context)
            yield
        finally:
            setcontext(old_context)

Used as follows:

    block decimal.in_context(Context(prec=12)):
        # Perform higher precision operations here
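For reference, today's decimal module ships this exact template as `localcontext()`, usable with the `with` statement that eventually came out of this discussion. A hedged sketch of the modern equivalent; the `high_precision_inverse` helper is invented for illustration:

```python
from decimal import Decimal, localcontext

def high_precision_inverse(x):
    # Temporarily boost precision to 12 significant digits; the saved
    # thread-local context is restored on exit, even if the body raises.
    with localcontext() as ctx:
        ctx.prec = 12
        return Decimal(1) / Decimal(x)
```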

Cheers,
Nick.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Nick Coghlan
Phillip J. Eby wrote:
 This and other examples from the PEP still have a certain awkwardness of 
 phrasing in their names.  A lot of them seem to cry out for a "with" 
 prefix, although maybe that's part of the heritage of PEP 310.  But Lisp 
 has functions like 'with-open-file', so I don't think that it's *all* a PEP 
 310 influence on the examples.

I've written up a few examples in the course of the discussion, and the more
of them I have written, the more the keywordless syntax has grown on me.

No meaningful name like 'with' or 'in' is appropriate for all possible block
iterators, which leaves only keyword-for-the-sake-of-a-keyword options like
'block' or 'suite'. With block statements viewed as user-defined blocks,
leaving the keyword out lets the block iterator be named whatever is
appropriate to making the block statement read well. If a leading 'with' is
needed, just include it in the name.

That is, instead of a 'block statement with the locking block iterator', you
write a 'locking statement'. Instead of a 'block statement with the opening
block iterator', you write an 'opening statement'.

The benefit didn't stand out for me until writing examples with real code
around the start of the block statement. Unlike existing statements, the
keyword is essentially irrelevant in understanding the implications of the
statement - the important thing is the block iterator being used. That is
hard to see when the keyword is the only thing dedented from the contained
suite.

Consider some of the use cases from the PEP, but put inside function
definitions to make it harder to pick out the name of the block iterator:

    def my_func():
        block locking(the_lock):
            do_some_operation_while_holding_the_lock()

Versus:

    def my_func():
        locking(the_lock):
            do_some_operation_while_holding_the_lock()

And:

    def my_func(filename):
        block opening(filename) as f:
            for line in f:
                print line

Versus:

    def my_func(filename):
        opening(filename) as f:
            for line in f:
                print line


And a few more without the contrast:

    def my_func():
        do_transaction():
            db.delete_everything()


    def my_func():
        auto_retry(3, IOError):
            f = urllib.urlopen("http://python.org/peps/pep-0340.html")
            print f.read()

    def my_func():
        opening(filename, "w") as f:
            with_stdout(f):
                print "Hello world"


When Guido last suggested this, the main concern seemed to be that the
documentation for every block iterator would need to explain the semantics
of block statements, since the block iterator name is the only name to be
looked up in the documentation. But they don't need to explain the general
semantics, they only need to explain _their_ semantics, and possibly
provide a pointer to the general block statement documentation. That is,
explain _what_ the construct does (which is generally straightforward), not
_how_ it does it (which is potentially confusing).

E.g.

    def locking(the_lock):
        """Executes the following nested block while holding the supplied lock

        Ensures the lock is acquired before entering the block and
        released when the block is exited (including via exceptions
        or return statements).
        If None is supplied as the argument, no locking occurs.
        """
        if the_lock is None:
            yield
        else:
            the_lock.acquire()
            try:
                yield
            finally:
                the_lock.release()
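A runnable version of this template is possible in present-day Python with `contextlib.contextmanager` (the decorator-based descendant of this proposal); the body below mirrors the docstring's contract, including the None case:

```python
import threading
from contextlib import contextmanager

@contextmanager
def locking(the_lock):
    """Execute the nested block while holding the supplied lock.

    If None is supplied as the argument, no locking occurs.
    """
    if the_lock is None:
        yield
    else:
        the_lock.acquire()
        try:
            yield
        finally:
            the_lock.release()
```

Usage: `with locking(my_lock): do_some_operation_while_holding_the_lock()`.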

Cheers,
Nick.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Nick Coghlan
 At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote:
But I kind of doubt that it's an issue; you'd have to have a
try/except catching StopIteration around a yield statement that
resumes the generator before this becomes an issue, and that sounds
extremely improbable.

The specific offending construct is:

   yield itr.next()

Wrapping that in a try/except StopIteration can be quite convenient, and is
probably too common to ignore - Phillip's examples reminded me that some of
my _own_ code uses this trick.
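For the record, later Python resolved this exact trap via PEP 479: a StopIteration that escapes a generator body is converted into RuntimeError instead of silently terminating the enclosing generator. A small demonstration in modern Python; the helper names are invented for illustration:

```python
def exhausted():
    # An immediately-exhausted generator: the unreachable yield merely
    # makes this function a generator.
    return
    yield

def reyield_one(itr):
    # The offending construct: re-yield another iterator's next item.
    # If itr is exhausted, next(itr) raises StopIteration inside this
    # generator's frame.
    yield next(itr)

g = reyield_one(exhausted())
```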

Cheers,
Nick.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Nick Coghlan
Delaney, Timothy C (Timothy) wrote:
 Guido van Rossum wrote:
 
 
I'd like the block statement to be defined exclusively in terms of
__exit__() though.
 
 
 This does actually suggest something to me (note - just a thought - no
 real idea if it's got any merit).
 
 Are there any use cases proposed for the block-statement (excluding the
 for-loop) that do *not* involve resource cleanup (i.e. need an
 __exit__)?
 
 This could be the distinguishing feature between for-loops and
 block-statements:
 
 1. If an iterator declares __exit__, it cannot be used in a for-loop.
For-loops do not guarantee resource cleanup.
 
 2. If an iterator does not declare __exit__, it cannot be used in a
 block-statement.
Block-statements guarantee resource cleanup.
 
 This gives separation of API (and thus purpose) whilst maintaining the
 simplicity of the concept. Unfortunately, generators then become a pain
 :( We would need additional syntax to declare that a generator was a
 block generator.

Ah, someone else did post this idea first :)

To deal with the generator issue, one option would be to follow up on Phillip's 
idea of a decorator to convert a generator (or perhaps any standard iterator) 
into a block iterator.

I think this would also do wonders for emphasising the difference between for 
loops and block statements.

Cheers,
Nick.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-04 Thread Delaney, Timothy C (Timothy)
Nick Coghlan wrote:

 Ah, someone else did post this idea first :)

I knew I was standing on the shoulders of others :)

 To deal with the generator issue, one option would be to follow up on
 Phillip's idea of a decorator to convert a generator (or perhaps any
 standard iterator) into a block iterator.
 
 I think this would also do wonders for emphasising the difference
 between for loops and block statements.

I think if we are going to emphasise the difference, a decorator does
not go far enough. To use a decorator, this *must* be valid syntax::

def gen():
    try:
        yield
    finally:
        print 'Done!'

However, that generator cannot be properly used in a for-loop. So it's
only realistically valid with the decorator, and used in a block
statement (resource suite ;)

My feeling is that the above should be a SyntaxError, as it currently
is, and that a new keyword is needed which explicitly allows the above,
and creates an object conforming to the resource protocol (as I called
it).
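As it happened, Python eventually took essentially this route without a new keyword: PEP 343's `contextlib.contextmanager` decorator converts a one-yield generator into a context manager, and try/finally around a yield became legal once generators gained close()/throw() in Python 2.5. A sketch in modern Python; the `managed` name and the `log` list are illustrative (a list is used instead of print so the finalisation is observable):

```python
from contextlib import contextmanager

log = []

@contextmanager
def managed():
    # try/finally around a yield: a SyntaxError in the Python of 2005,
    # legal since Python 2.5.
    try:
        yield "resource"
    finally:
        log.append("Done!")
```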

Tim Delaney


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Raymond Hettinger
I just made a first reading of the PEP and want to clarify my
understanding of how it fits with existing concepts.

Is it correct to say that continue parallels its current meaning and
returns control upwards (?outwards) to the block iterator that called
it?

Likewise, is it correct that yield is anti-parallel to the current
meaning?  Inside a generator, it returns control upwards to the caller.
But inside a block-iterator, it pushes control downwards (?inwards) to
the block it controls.

Is the distinction between block iterators and generators similar to the
Gang-of-Four's distinction between external and internal iterators?

Are there some good use cases that do not involve resource locking?
IIRC, that same use case was listed as a prime motivating example for
decorators (i.e. @synchronized).  TOOWTDI suggests that a single use case
should not be used to justify multiple, orthogonal control structures.  

It would be great if we could point to some code in the standard library
or in a major Python application that would be better (cleaner, faster,
or clearer) if re-written using blocks and block-iterators.  I've
scanned through the code base looking for some places to apply the idea
and have come up empty handed.  This could mean that I've not yet
grasped the essence of what makes the idea useful or it may have other
implications such as apps needing to be designed from the ground-up with
block iterators in mind.


Raymond


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread LD "Gus" Landis
Hi,

  Sounds like a useful requirement to have for new features in 2.x,
  IMO.  that is... demonstrated need.

  If the feature implies that the app needs to be designed from the
  ground up to *really* take advantage of the feature, then, maybe
  leave it for Guido's sabbatical (e.g. Python 3000).

On 5/3/05, Raymond Hettinger [EMAIL PROTECTED] wrote: 
  It would be great if we could point to some code in the standard library
 or in a major Python application that would be better (cleaner, faster,
 or clearer) if re-written using blocks and block-iterators.  I've
 scanned through the code base looking for some places to apply the idea
 and have come up empty handed.  This could mean that I've not yet
 grasped the essence of what makes the idea useful or it may have other
 implications such as apps needing to be designed from the ground-up with
 block iterators in mind.
 
 Raymond

-- 
LD Landis - N0YRQ - from the St Paul side of Minneapolis


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
[Raymond Hettinger]
 I just made a first reading of the PEP and want to clarify my
 understanding of how it fits with existing concepts.

Thanks! Now is about the right time -- all the loose ends are being
solidified (in my mind any way).

 Is it correct to say that continue parallels its current meaning and
 returns control upwards (?outwards) to the block iterator that called
 it?

I have a hard time using directions as metaphors (maybe because on
some hardware, stacks grow down) unless you mean "up" in the source
code, which doesn't make a lot of sense either in this context.

But yes, continue does what you expect it to do in a loop.

Of course, in a resource allocation block, continue and break are
pretty much the same (just as they are in any loop that you know has
only one iteration).

 Likewise, is it correct that yield is anti-parallel to the current
 meaning?  Inside a generator, it returns control upwards to the caller.
 But inside a block-iterator, it pushes control downwards (?inwards) to
 the block it controls.

I have a hard time visualizing the difference. They feel the same to
me, and the implementation (from the generator's POV) is identical:
yield suspends the current frame, returning to the previous frame from
the call to next() or __next__(), and the suspended frame can be
resumed by calling next() / __next__() again.
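That suspend/resume symmetry is easy to see by driving a generator by hand; a minimal illustration (the `count_to` generator is invented for the example, not from the thread):

```python
def count_to(n):
    # Each yield suspends this frame, returning to whoever called
    # next(); calling next() again resumes the frame where it left off.
    i = 0
    while i < n:
        yield i
        i += 1

g = count_to(2)
```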

 Is the distinction between block iterators and generators similar to the
 Gang-of-Four's distinction between external and internal iterators?

I looked it up in the book (p. 260), and I think generators have a
duality to them that makes the distinction useless, or at least
relative to your POV. With a classic for-loop driven by a generator,
the author of the for-loop thinks of it as an external iterator -- you
ask for the next item using the (implicit) call to next(). But the
author of the generator thinks of it as an internal iterator -- the
for loop resumes only when the generator feels like it.

 Are there some good use cases that do not involve resource locking?
 IIRC, that same use case was listed as a prime motivating example for
 decorators (i.e. @synchronized).  TOOWTDI suggests that a single use case
 should not be used to justify multiple, orthogonal control structures.

Decorators don't need @synchronized as a motivating use case; there
are plenty of other use cases.

Anyway, @synchronized was mostly a demonstration toy; whole method
calls are rarely the right granularity of locking. (BTW in the latest
version of PEP 340 I've renamed synchronized to locking; many people
complained about the strange Javaesque term.)

Look at the examples in the PEP (version 1.16) for more use cases.

 It would be great if we could point to some code in the standard library
 or in a major Python application that would be better (cleaner, faster,
 or clearer) if re-written using blocks and block-iterators.  I've
 scanned through the code base looking for some places to apply the idea
 and have come up empty handed.  This could mean that I've not yet
 grasped the essence of what makes the idea useful or it may have other
 implications such as apps needing to be designed from the ground-up with
 block iterators in mind.

I presume you mentally discarded the resource allocation use cases
where the try/finally statement was the outermost statement in the
function body, since those would be helped by @synchronized; but look
more closely at Queue, and you'll find that the two such methods use
different locks!

Also the use case for closing a file upon leaving a block, while
clearly a resource allocation use case, doesn't work well with a
decorator.

I just came across another use case that is fairly common in the
standard library: redirecting sys.stdout. This is just a beauty (in
fact I'll add it to the PEP):

def saving_stdout(f):
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:
I just came across another use case that is fairly common in the
standard library: redirecting sys.stdout. This is just a beauty (in
fact I'll add it to the PEP):

def saving_stdout(f):

Very nice; may I suggest 'redirecting_stdout' as the name instead?

This and other examples from the PEP still have a certain awkwardness of 
phrasing in their names.  A lot of them seem to cry out for a "with" 
prefix, although maybe that's part of the heritage of PEP 310.  But Lisp 
has functions like 'with-open-file', so I don't think that it's *all* a PEP 
310 influence on the examples.

It also seems to me that it would be nice if locks, files, sockets and 
similar resources would implement the block-template protocol; then one 
could simply say:

   block self.__lock:
       ...

or:

   open("foo") as f:
       ...

And not need any special wrappers.  Of course, this could only work for 
files if the block-template protocol were distinct from the normal 
iteration protocol.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Aahz
On Tue, May 03, 2005, Phillip J. Eby wrote:
 At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:

I just came across another use case that is fairly common in the
standard library: redirecting sys.stdout. This is just a beauty (in
fact I'll add it to the PEP):

def saving_stdout(f):
 
 Very nice; may I suggest 'redirecting_stdout' as the name instead?

You may; I'd nitpick that to either redirect_stdout or
redirected_stdout.  redirecting_stdout is slightly longer and doesn't
have quite the right flavor to my eye.  I might even go for make_stdout
or using_stdout; that relies on people understanding that a block means
temporary usage.

  This and other examples from the PEP still have a certain awkwardness
  of phrasing in their names.  A lot of them seem to cry out for a
  "with" prefix, although maybe that's part of the heritage of PEP 310.
  But Lisp has functions like 'with-open-file', so I don't think that
  it's *all* a PEP 310 influence on the examples.

Yes, that's why I've been pushing for "with".
-- 
Aahz ([EMAIL PROTECTED])   * http://www.pythoncraft.com/

It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses.  Hit it.


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Raymond Hettinger
[Raymond]
  Likewise, is it correct that yield is anti-parallel to the current
  meaning?  Inside a generator, it returns control upwards to the caller.
  But inside a block-iterator, it pushes control downwards (?inwards) to
  the block it controls.

[Guido van Rossum]
 I have a hard time visualizing the difference. They feel the same to
 me, and the implementation (from the generator's POV) is identical:
 yield suspends the current frame, returning to the previous frame from
 the call to next() or __next__(), and the suspended frame can be
 resumed by calling next() / __next__() again.

This concept ought to be highlighted in the PEP because it explains
clearly what yield does and it may help transition from a non-Dutch
mental model.  I expect that many folks (me included) think in terms of
caller vs callee with a parallel spatial concept of enclosing vs
enclosed.  In that model, the keywords continue, break, yield, and
return all imply a control transfer from the enclosed back to the
encloser.  

In contrast, the new use of yield differs in that the suspended frame
transfers control from the encloser to the enclosed. 



  Are there some good use cases that do not involve resource locking?
  IIRC, that same use case was listed as a prime motivating example for
  decorators (i.e. @synchronized).  TOOWTDI suggests that a single use
  case should not be used to justify multiple, orthogonal control
  structures.
 
 Decorators don't need @synchronized as a motivating use case; there
 are plenty of other use cases.

No doubt about that.



 Anyway, @synchronized was mostly a demonstration toy; whole method
 calls are rarely the right granularity of locking. 

Agreed.  Since that is the case, there should be some effort to shift
some of the examples towards real use cases where a block-iterator is
the appropriate solution.  It need not hold-up releasing the PEP to
comp.lang.python, but it would go a long way towards improving the
quality of the subsequent discussion.



 (BTW in the latest
 version of PEP 340 I've renamed synchronized to locking; many people
 complained about the strange Javaesque term.)

That was diplomatic.  Personally, I find it amusing when there is an
early focus on naming rather than on functionality, implementation
issues, use cases, usability, and goodness-of-fit within the language.



  It would be great if we could point to some code in the standard
  library or in a major Python application that would be better
  (cleaner, faster, or clearer) if re-written using blocks and
  block-iterators

 look
 more closely at Queue, and you'll find that the two such methods use
 different locks!

I don't follow this one.   Tim's uses of not_empty and not_full are
orthogonal (pertaining to pending gets at one end of the queue and to
pending puts at the other end).  The other use of the mutex is
independent of either pending puts or gets; instead, it is a weak
attempt to minimize what can happen to the queue during a size query.

While the try/finallys could get factored-out into separate blocks, I do
not see how the code could be considered better off.  There is a slight
worsening of all metrics of merit: line counts, total number of
function defs, number of calls, or number of steps executed outside the
lock (important given that the value of a query result declines rapidly
once the lock is released).


 
 Also the use case for closing a file upon leaving a block, while
 clearly a resource allocation use case, doesn't work well with a
 decorator.

Right. 



 I just came across another use case that is fairly common in the
 standard library: redirecting sys.stdout. This is just a beauty (in
 fact I'll add it to the PEP):
 
  def saving_stdout(f):
      save_stdout = sys.stdout
      try:
          sys.stdout = f
          yield
      finally:
          sys.stdout = save_stdout

This is the strongest example so far.  When adding it to the PEP, it
would be useful to contrast the code with simpler alternatives like PEP
288's g.throw() or PEP 325's g.close().  On the plus side, the
block-iterator approach factors out code common to multiple callers.  On
the minus side, the other PEPs involve simpler mechanisms and their
learning curve would be nearly zero.  These pluses and minuses are
important because they apply equally to all examples using blocks for
initialization/finalization.



Raymond


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
 [Raymond Hettinger]
  Likewise, is it correct that yield is anti-parallel to the current
  meaning?  Inside a generator, it returns control upwards to the caller.
  But inside a block-iterator, it pushes control downwards (?inwards) to
  the block it controls.
 
[Guido]
  I have a hard time visualizing the difference.

[Jim Jewett]
 In a normal generator, someone makes a call to establish the
 generator, which then becomes a little island -- anyone can call
 the generator, and it returns control back to whoever made the last call.
 
 With the block, every yield returns to a single designated callback.
 This callback had to be established at the same time the block was
 created, and must be textually inside it.  (An indented suite to the
 block XXX: line.)

Doesn't convince me. The common use for a regular generator is in a
for-loop, where every yield also returns to a single designated place
(calling it callback is really deceptive!).

And with a block, you're free to put the generator call ahead of the
block so you can call next() on it manually:

it = EXPR1
block it:
    BLOCK1

is totally equivalent to

block EXPR1:
    BLOCK1

but the first form lets you call next() on it as you please (until the
block is exited, for sure).

 But are there plenty of other use cases for PEP 340?

Yes. Patterns like "do this little dance in a try/finally block" and
"perform this tune when you catch an XYZ exception" are pretty common
in larger systems and are effectively abstracted away using the
block-statement and an appropriate iterator. The try/finally use case
often also has some setup that needs to go right before the try (and
sometimes some more setup that needs to go *inside* the try). Being
able to write this once makes it a lot easier when the little dance
has to be changed everywhere it is performed.

 If not, then why do we need PEP 340?  Are decorators not strong
 enough, or is it just that people aren't comfortable yet?  If it is a
 matter of comfort or recipies, then the new construct might have
 just as much trouble.  (So this one is not a loop, and you can tell
 the difference because ... uh, just skip that advanced stuff.)

PEP 340 and decorators are totally different things, and the only
vaguely common use case would be @synchronized, which is *not* a
proper use case for decorators, but safe locking is definitely a use
case for PEP 340.

  Anyway, @synchronized was mostly a demonstration toy; whole
  method calls are rarely the right granularity of locking.
 
 That is an important difference -- though I'm not sure that the critical
 part *shouldn't* be broken out into a separate method.

I'll be the judge of that. I have plenty of examples where breaking it
out would create an entirely artificial helper method that takes
several arguments just because it needs to use stuff that its caller
has set up for it.

  I presume you mentally discarded the resource allocation use
  cases where the try/finally statement was the outermost statement
  in the function body, since those would be helped by @synchronized;
  but look more closely at Queue, and you'll find that the two such
  methods use different locks!
 
 qsize, empty, and full could be done with a lockself decorator.
 Effectively, they *are* lockself decorators for the _xxx functions
 that subclasses are told to override.

Actually you're pointing out a bug in the Queue module: these *should*
be using a try/finally clause to ensure the mutex is released even if
the inner call raises an exception. I hadn't noticed these before
because I was scanning only for finally.

If a locking primitive had been available, I'm sure it would have been
used here.

 If you're talking about put and get, decorators don't help as much,
 but I'm not sure blocks are much better.
 
 You can't replace the outermost try ... finally with a common decorator
 because the locks are self variables.  A block, by being inside a method,
 would be delayed until self exists -- but that outer lock is only a
 tiny fraction of the boilerplate.  It doesn't help with
 [...example deleted...]
 I wouldn't object to a helper method, but using a block just to get rid of
 four lines (two of which are the literals try: and finally:) seems barely
 worth doing, let alone with special new syntax.

Well, to me it does; people have been requesting new syntax for this
specific case for a long time (that's where PEP 310 is coming from).

  Also the use case for closing a file upon leaving a block, while
  clearly a resource allocation use case, doesn't work well with a
  decorator.
 
 def autoclose(fn):
     def outer(filename, *args, **kwargs):
         f = open(filename)
         try:
             return fn(f, *args, **kwargs)
         finally:
             f.close()
     return outer

 @autoclose
 def f1(f):
     for line in f:
         print line

But the auto-closing file, even more than the self-releasing lock,
most often occurs in the middle of some code that would be unnatural
to turn into a 

Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Tim Peters
[Tim]
 Because Queue does use condvars now instead of plain locks, I wouldn't
 approve of any gimmick purporting to hide the acquire/release's in
 put() or get():  that those are visible is necessary to seeing that
 the _condvar_ protocol is being followed (must acquire() before
 wait(); must be acquire()'ed during notify(); no path should leave the
 condvar acquire()d 'for a long time' before a wait() or release()).

[Guido]
 So you think that this would be obscure? A generic condition variable
 use could look like this:
 
    block locking(self.condvar):
        while not self.items:
            self.condvar.wait()
        self.process(self.items)
        self.items = []
 
  instead of this:
 
    self.condvar.acquire()
    try:
        while not self.items:
            self.condvar.wait()
        self.process(self.items)
        self.items = []
    finally:
        self.condvar.release()

 I find that the block locking version looks just fine; it makes the
 scope of the condition variable quite clear despite not having any
 explicit acquire() or release() calls (there are some abstracted away
 in the wait() call too!).

Actually typing it all out like that makes it hard to dislike <wink>.
Yup, that reads fine to me too.
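In present-day Python, Condition objects are context managers themselves, so the `block locking(self.condvar)` version needs no wrapper at all. A hedged sketch; the `Batcher` class and its item-handling logic are invented for illustration:

```python
import threading

class Batcher:
    def __init__(self):
        self.condvar = threading.Condition()
        self.items = []
        self.processed = []

    def put(self, item):
        with self.condvar:          # acquire/release implied
            self.items.append(item)
            self.condvar.notify()

    def drain(self):
        # Modern equivalent of "block locking(self.condvar)": the with
        # statement scopes the condvar; wait() still releases/reacquires
        # it internally, exactly as in the spelled-out version.
        with self.condvar:
            while not self.items:
                self.condvar.wait()
            self.processed.extend(self.items)
            self.items = []
```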

I don't think anyone has mentioned this yet, so I will:  library
writers using Decimal (or more generally HW 754 gimmicks) have a need
to fiddle lots of thread-local state (numeric context), and must
restore it no matter how the routine exits.  Like boost precision to
twice the user's value over the next 12 computations, then restore,
and no matter what happens here, restore the incoming value of the
overflow-happened flag.  It's just another instance of temporarily
taking over a shared resource, but I think it's worth mentioning that
there are a lot of things like that in the world, and to which
decorators don't really sanely apply.


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Raymond Hettinger
  In contrast, the new use of yield differs in that the suspended frame
  transfers control from the encloser to the enclosed.
 
 Why does your notion of who encloses whom suddenly reverse when you go
 from a for-loop to a block-statement? This all feels very strange to
 me.

After another reading of the PEP, it seems fine.

On the earlier readings, the yield felt disorienting because the body
of the block is subordinate to the block-iterator yet its code is
co-located with the caller (albeit set-off with a colon and
indentation).



 I meant to use this as an example of the unsuitability of the
 @synchronized decorator, since it implies that all synchronization is
 on the same mutex, thereby providing a use case for the locking
 block-statement.
 
 I suspect we're violently in agreement though.

Right :-)



  This is the strongest example so far.  When adding it to the PEP, it
  would be useful to contrast the code with simpler alternatives like
  PEP 288's g.throw() or PEP 325's g.close().  On the plus side, the
  block-iterator approach factors out code common to multiple callers.
  On the minus side, the other PEPs involve simpler mechanisms and their
  learning curve would be nearly zero.  These pluses and minuses are
  important because they apply equally to all examples using blocks for
  initialization/finalization.
 
 Where do you see a learning curve for blocks?

Altering the meaning of a for-loop; introducing a new keyword; extending
the semantics of break and continue; allowing try/finally inside a
generator; introducing new control flow; adding new magic methods
__next__ and __exit__; adding a new context for as; and transforming
yield from statement semantics to expression semantics.  This isn't a
lightweight proposal and not one where we get transference of knowledge
from other languages (except for a few users of Ruby, Smalltalk, etc).

By comparison, g.throw() or g.close() are trivially simple approaches
to generator/iterator finalization.
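For what it's worth, generators did eventually grow exactly this kind of finalization; a minimal sketch of how a close()-style cleanup behaves (the names are invented, and this is not the PEP's machinery):

```python
def managed(log):
    # A generator holding a "resource"; the try/finally guards the cleanup.
    log.append("acquired")
    try:
        yield "resource"
    finally:
        log.append("released")

log = []
gen = managed(log)
value = next(gen)   # advance to the yield: the resource is live
gen.close()         # throw an exception in at the yield; the finally runs
```

The learning-curve point stands: close() needs no new statement, keyword, or loop semantics.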



In section on new for-loop specification, what is the purpose of arg?
Can it be replaced with the constant None?

itr = iter(EXPR1)
brk = False
while True:
    try:
        VAR1 = next(itr, arg)
    except StopIteration:
        brk = True
        break
    BLOCK1
if brk:
    BLOCK2
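
Setting aside the arg machinery, the expansion above is observably a for/else loop; a runnable approximation with the blocks passed as callables (expanded_for is an invented name, not the PEP's spelling):

```python
def expanded_for(iterable, block1, block2):
    # BLOCK1 runs once per item; BLOCK2 runs only when the iterator
    # is exhausted normally -- i.e. the familiar for/else semantics.
    itr = iter(iterable)
    brk = False
    while True:
        try:
            var1 = next(itr)
        except StopIteration:
            brk = True
            break
        block1(var1)
    if brk:
        block2()

seen = []
expanded_for([1, 2, 3], seen.append, lambda: seen.append("done"))
```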



In "block expr as var", can var be any lvalue?
  
block context() as inputfil, outputfil, errorfil:
    for i, line in enumerate(inputfil):
        if not checkformat(line):
            print >> errorfil, line
        else:
            print >> outputfil, secret_recipe(line)


In re-reading the examples, it occurred to me that the word "block"
already has meaning in the context of threading.Lock.acquire(), which has
an optional blocking argument defaulting to 1.


In example 4, consider adding a comment that the continue has its
normal (non-extending) meaning.


The examples should demonstrate the operation of the extended form of
continue, break, and return in the body of the block.



Raymond


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 05:30 PM 5/3/05 -0400, Raymond Hettinger wrote:
By comparision, g.throw() or g.close() are trivially simple approaches
to generator/iterator finalization.

That reminds me of something; in PEP 333 I proposed use of a 'close()' 
attribute in anticipation of PEP 325, so that web applications implemented 
as generators could take advantage of resource cleanup.  Is there any 
chance that as part of PEP 340, 'close()' could translate to the same as 
'__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__' 
is going to be a bit of a pain, especially since there's code in the field 
now with that assumption.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
[Phillip]
 That reminds me of something; in PEP 333 I proposed use of a 'close()'
 attribute in anticipation of PEP 325, so that web applications implemented
 as generators could take advantage of resource cleanup.  Is there any
 chance that as part of PEP 340, 'close()' could translate to the same as
 '__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__'
 is going to be a bit of a pain, especially since there's code in the field
 now with that assumption.

Maybe if you drop support for the separate protocol alternative... :-)

I had never heard of that PEP. How much code is there in the field?
Written by whom?

I suppose you can always write a decorator that takes care of the
mapping. I suppose it should catch and ignore the StopIteration that
__exit__(StopIteration) is likely to throw.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Jim Jewett
Summary:  

Resource Managers are a good idea.
First Class Suites may be a good idea.

Block Iterators try to split the difference.  They're not as powerful
as First Class Suites, and not as straightforward as Resource 
Managers.  This particular middle ground didn't work out so well.

On 5/3/05, Guido van Rossum [EMAIL PROTECTED] wrote:
 [Jim Jewett]
...
  With the block, every yield returns to a single designated callback.
  This callback had to be established at the same time the block was
  created, and must be textually inside it.  (An indented suite to the
  block XXX: line.)
 
 Doesn't convince me. The common use for a regular generator is in a
 for-loop, where every yield also returns to a single designated place
 (calling it a "callback" is really deceptive!).

I do not consider the body of a for-loop to be a callback; the generator
has no knowledge of that body.

But with a Block Iterator, the generator (or rather, its unrolled version) 
does need to textually contain the to-be-included suite -- which is why 
that suite smells like a callback function that just doesn't happen to be 
named.
 
 And with a block, you're free to put the generator call ahead of the
 block so you can call next() on it manually:
 
 it = EXPR1
 block it:
     BLOCK1

 ... lets you call next() on it as you please (until the
 block is exited, for sure).

For a Resource Manager, the only thing this could do is effectively
discard the BLOCK1, because the yields would have been used
up (and the resource deallocated).  

I suppose this is another spelling of "resources are not loops".

  But are there plenty of other use cases for PEP 340?
 
 Yes. Patterns like "do this little dance in a try/finally block" and
 "perform this tune when you catch an XYZ exception" are pretty common

...

Let me rephrase ...

The Block Iterator syntax gets awkward if it needs to yield more than
once (and the exits are not interchangeable).  You have said that is OK 
because most Resource Managers only yield once.

But if you're willing to accept that, then why not just limit it to a Resource
Manager instead of an Iterator?  Resource Managers could look similar 
to the current proposal, but would be less ambitious.  They should have 
absolutely no connection to loops/iterators/generators.  There should be
no internal secret loop.  If they use the yield keyword, it should be 
described as "yielding control" rather than "yielding the next value".  There 
would be only one yielding of control per Resource Manager.

If limiting the concept to Resource Managers is not acceptable, then
I still don't think Block Iterators are the right answer -- though First Class
Suites might be.  (And so might No Changes at all.)

Reasoning:

If there is only one yield, then you're really just wrapping the call to 
the (unnamed) suite.

(Q)    Why are decorators not appropriate?

(A1)   In some cases, the wrapper needs to capture an 
instance-variable, which isn't available at definition-time.
(A2)   Decorators can be ugly.  This is often because the
need to return a complete replacement callable leads to too 
many nested functions.

These are both problems with decorators.  They do argue for
improving the decorator syntax, but not for throwing out the
concept.  I don't think that Block Iterators will really clear things 
up -- to me, they just look like a different variety of fog.

If decoration doesn't work, why not use a regular function
that takes a callback?  Pass the callback instead of defining an
anonymous suite.  Call the callback instead of writing the single
yield.
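Jim's callback alternative can be made concrete in a few lines; with_resource and the lambdas below are invented for illustration of the pattern, not a proposed API:

```python
def with_resource(acquire, release, body):
    # The "anonymous suite" becomes an explicit callable (body); the
    # wrapper calls it where the resource generator would have yielded.
    resource = acquire()
    try:
        return body(resource)
    finally:
        release(resource)

events = []
result = with_resource(
    lambda: events.append("open") or "handle",   # acquire
    lambda r: events.append("close"),            # release
    lambda r: r.upper(),                         # the would-be suite
)
```

The cost Guido objects to is visible here: for any non-trivial suite, the third argument becomes a named function plus boilerplate.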

...

 ... you are proposing to solve all its use cases by defining an
 explicit function or method representing the body of the block. 

Yes.

 The latter solution leads to way too much ugly code -- all that
 function-definition boilerplate is worse than the try/finally 
 boilerplate we're trying to hide!

In the cases I've actually seen, the ugly function definition portions
are in the decorator, rather than the regular function.  It trades a
little ugliness that gets repeated all over the place for a lot of ugliness
that happens only once (in the decorator).

That said, I'm willing to believe that breaking out a method might 
sometimes be a bad idea.  In which case you probably want 
First Class (and decorable) Suites.

If First Class Suites are not acceptable in general, then let's figure
out where they are acceptable.  For me, Resource Manager is a good
use case, but Block Iterator is not.

-jJ


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
Sorry Jim, but I just don't think you & I were intended to be on the
same language design committee. Nothing you say seems to be making any
sense to me these days. Maybe someone else can channel you
effectively, but I'm not going to try to do a line-by-line response to
your email quoted below.

On 5/3/05, Jim Jewett [EMAIL PROTECTED] wrote:
 Summary:
 
 Resource Managers are a good idea.
 First Class Suites may be a good idea.
 
 Block Iterators try to split the difference.  They're not as powerful
 as First Class Suites, and not as straightforward as Resource
 Managers.  This particular middle ground didn't work out so well.
 
 On 5/3/05, Guido van Rossum [EMAIL PROTECTED] wrote:
  [Jim Jewett]
 ...
   With the block, every yield returns to a single designated callback.
   This callback had to be established at the same time the block was
   created, and must be textually inside it.  (An indented suite to the
   block XXX: line.)
 
  Doesn't convince me. The common use for a regular generator is in a
  for-loop, where every yield also returns to a single designated place
  (calling it callback is really deceptive!).
 
 I do not consider the body of a for-loop to be a callback; the generator
 has no knowledge of that body.
 
 But with a Block Iterator, the generator (or rather, its unrolled version)
 does need to textually contain the to-be-included suite -- which is why
 that suite smells like a callback function that just doesn't happen to be
 named.
 
  And with a block, you're free to put the generator call ahead of the
  block so you can call next() on it manually:
 
  it = EXPR1
  block it:
  BLOCK1
 
  ... lets you call next() on it as you please (until the
  block is exited, for sure).
 
 For a Resource Manager, the only thing this could do is effectively
 discard the BLOCK1, because the yields would have been used
 up (and the resource deallocated).
 
 I suppose this is another spelling of resources are not loops.
 
   But are there plenty of other use cases for PEP 340?
 
  Yes. Patterns like do this little dance in a try/finally block and
  perform this tune when you catch an XYZ exception are pretty common
 
 ...
 
 Let me rephrase ...
 
 The Block Iterator syntax gets awkward if it needs to yield more than
 once (and the exits are not interchangeable).  You have said that is OK
 because most Resource Managers only yield once.
 
 But if you're willing to accept that, then why not just limit it to a Resource
 Manager instead of an Iterator?  Resource Managers could look similar
 to the current proposal, but would be less ambitious.  They should have
 absolutely no connection to loops/iterators/generators.  There should be
 no internal secret loop.  if they use the yield keyword, it should be
 described as yielding control rather than yielding the next value.  There
 would be only one yielding of control per Resource Manager.
 
 If limiting the concept to Resource Managers is not acceptable, then
 I still don't think Block Iterators are the right answer -- though First Class
 Suites might be.  (And so might No Changes at all.)
 
 Reasoning:
 
 If there is only one yield, then you're really just wrapping the call to
 the (unnamed) suite.
 
 (Q)Why are decorators not appropriate?
 
 (A1)   In some cases, the wrapper needs to capture an
 instance-variable, which isn't available at definition-time.
 (A2)   Decorators can be ugly.  This is often because the
 need to return a complete replacement callable leads to too
 many nested functions.
 
 These are both problems with decorators.  They do argue for
 improving the decorator syntax, but not for throwing out the
 concept.  I don't think that Block Iterators will really clear things
 up -- to me, they just look like a different variety of fog.
 
 If decoration doesn't work, why not use a regular function
 that takes a callback?  Pass the callback instead of defining an
 anonymous suite.  Call the callback instead of writing the single
 yield.
 
 ...
 
  ... you are proposing to solve all its use cases by defining an
  explicit function or method representing the body of the block.
 
 Yes.
 
  The latter solution leads to way too much ugly code -- all that
  function-definition boilerplate is worse than the try/finally
  boilerplate we're trying to hide!
 
 In the cases I've actually seen, the ugly function definition portions
 are in the decorator, rather than the regular function.  It trades a
 little ugliness that gets repeated all over the place for a lot of ugliness
 that happens only once (in the decorator).
 
 That said, I'm willing to believe that breaking out a method might
 sometimes be a bad idea.  In which case you probably want
 First Class (and decorable) Suites.
 
 If First Class Suites are not acceptable in general, then let's figure
 out where they are acceptable.  For me, Resource Manager is a good
 use case, but Block Iterator is not.
 
 -jJ
 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 03:33 PM 5/3/05 -0700, Guido van Rossum wrote:
[Phillip]
  That reminds me of something; in PEP 333 I proposed use of a 'close()'
  attribute in anticipation of PEP 325, so that web applications implemented
  as generators could take advantage of resource cleanup.  Is there any
  chance that as part of PEP 340, 'close()' could translate to the same as
  '__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__'
  is going to be a bit of a pain, especially since there's code in the field
  now with that assumption.

Maybe if you drop support for the separate protocol alternative... :-)

I don't understand you.  Are you suggesting a horse trade, or...?


I had never heard of that PEP. How much code is there in the field?

Maybe a dozen or so web applications and frameworks (including Zope, 
Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and 
mod_python).  A lot of the servers are based on my wsgiref library, though, 
so it probably wouldn't be too horrible a job to make everybody add 
support; I might even be able to fudge wsgiref so that wsgiref-based 
servers don't even see an issue.

Modifying the spec is potentially more controversial, however; it'll have 
to go past the Web-SIG, and I assume the first thing that'll be asked is, 
"Why aren't generators getting a close() method then?", so I figured I 
should ask that question first.

I'd completely forgotten about this being an issue until Raymond mentioned 
g.close(); I'd previously gotten the impression that PEP 325 was expected 
to be approved, otherwise I wouldn't have written support for it into PEP 333.


Written by whom?

I used to know who all had written implementations, but there are now too 
many to keep track of.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 07:27 PM 5/3/05 -0400, Phillip J. Eby wrote:
Modifying the spec is potentially more controversial, however; it'll have
to go past the Web-SIG, and I assume the first thing that'll be asked is,
"Why aren't generators getting a close() method then?", so I figured I
should ask that question first.

You know what, never mind.  I'm still going to write the Web-SIG so they 
know the change is coming, but this is really a very minor thing; just a 
feature we won't get for free as a side effect of PEP 325.

Your decorator idea is a trivial solution, but it would also be trivial to 
allow WSGI server implementations to call __exit__ on generators.  None of 
this affects existing code in the field, because today you can't write a 
try/finally in a generator anyway.  Therefore, nobody is relying on this 
feature, therefore it's basically moot.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
 Maybe if you drop support for the separate protocol alternative... :-)
 
 I don't understand you.  Are you suggesting a horse trade, or...?

Only tongue-in-cheek. :-)

 I had never heard of that PEP. How much code is there in the field?
 
 Maybe a dozen or so web applications and frameworks (including Zope,
 Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and
 mod_python).  A lot of the servers are based on my wsgiref library, though,
 so it probably wouldn't be too horrible a job to make everybody add
 support; I might even be able to fudge wsgiref so that wsgiref-based
 servers don't even see an issue.
 
 Modifying the spec is potentially more controversial, however; it'll have
 to go past the Web-SIG, and I assume the first thing that'll be asked is,
 "Why aren't generators getting a close() method then?", so I figured I
 should ask that question first.
 
 I'd completely forgotten about this being an issue until Raymond mentioned
 g.close(); I'd previously gotten the impression that PEP 325 was expected
 to be approved, otherwise I wouldn't have written support for it into PEP 333.
 
 Written by whom?
 
 I used to know who all had written implementations, but there are now too
 many to keep track of.

Given all that, it's not infeasible to add a close() method to
generators as a shortcut for this:

def close(self):
    try:
        self.__exit__(StopIteration)
    except StopIteration:
        pass
    else:
        # __exit__() didn't raise StopIteration
        raise RuntimeError("or some other exception")

I'd like the block statement to be defined exclusively in terms of
__exit__() though.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Raymond Hettinger
 it's not infeasible to add a close() method to
 generators as a shortcut for this:
 
 def close(self):
     try:
         self.__exit__(StopIteration)
     except StopIteration:
         pass
     else:
         # __exit__() didn't raise StopIteration
         raise RuntimeError("or some other exception")
 
 I'd like the block statement to be defined exclusively in terms of
 __exit__() though.

That sounds like a winner.



Raymond


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
Given all that, it's not infeasible to add a close() method to
generators as a shortcut for this:

 def close(self):
     try:
         self.__exit__(StopIteration)
     except StopIteration:
         pass
     else:
         # __exit__() didn't raise StopIteration
         raise RuntimeError("or some other exception")

I'd like the block statement to be defined exclusively in terms of
__exit__() though.

Sure.  PEP 325 proposes a CloseGenerator exception in place of 
StopIteration, however, because:

 
 Issues: should StopIteration be reused for this purpose?  Probably
 not.  We would like close to be a harmless operation for legacy
 generators, which could contain code catching StopIteration to
 deal with other generators/iterators.
 

I don't know enough about the issue to offer either support or opposition 
for this idea, though.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Guido van Rossum
On 5/3/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
 Given all that, it's not infeasible to add a close() method to
 generators as a shortcut for this:
 
  def close(self):
      try:
          self.__exit__(StopIteration)
      except StopIteration:
          pass
      else:
          # __exit__() didn't raise StopIteration
          raise RuntimeError("or some other exception")
 
 I'd like the block statement to be defined exclusively in terms of
 __exit__() though.

(So do you want this feature now or not? Earlier you said it was no big deal.)

 Sure.  PEP 325 proposes a CloseGenerator exception in place of
 StopIteration, however, because:
 
  
  Issues: should StopIteration be reused for this purpose?  Probably
  not.  We would like close to be a harmless operation for legacy
  generators, which could contain code catching StopIteration to
  deal with other generators/iterators.
  
 
 I don't know enough about the issue to offer either support or opposition
 for this idea, though.

That would be an issue for the generator finalization proposed by the
PEP as well.

But I kind of doubt that it's an issue; you'd have to have a
try/except catching StopIteration around a yield statement that
resumes the generator before this becomes an issue, and that sounds
extremely improbable. If at all possible I'd rather not have to define
a new exception for this purpose.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Delaney, Timothy C (Timothy)
Guido van Rossum wrote:

 I'd like the block statement to be defined exclusively in terms of
 __exit__() though.

This does actually suggest something to me (note - just a thought - no
real idea if it's got any merit).

Are there any use cases proposed for the block-statement (excluding the
for-loop) that do *not* involve resource cleanup (i.e. need an
__exit__)?

This could be the distinguishing feature between for-loops and
block-statements:

1. If an iterator declares __exit__, it cannot be used in a for-loop.
   For-loops do not guarantee resource cleanup.

2. If an iterator does not declare __exit__, it cannot be used in a
   block-statement.
   Block-statements guarantee resource cleanup.

This gives separation of API (and thus purpose) whilst maintaining the
simplicity of the concept. Unfortunately, generators then become a pain
:( We would need additional syntax to declare that a generator was a
block generator.

OTOH, this may not be such a problem. Any generator that contains a
finally: around a yield automatically gets an __exit__, and any that
doesn't, doesn't. Although that feels *way* too magical to me (esp. in
light of my example below, which *doesn't* use finally). I'd prefer a
separate keyword for block generators. In that case, having finally:
around a yield would be a syntax error in a normal generator.

::

resource locking(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

block locking(myLock):
    # Code here executes with myLock held.  The lock is
    # guaranteed to be released when the block is left (even
    # if via return or by an uncaught exception).
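Without the proposed keywords, the resource generator can already be driven by hand; run_block below is an invented helper that stands in for the 'block'/'use' statement, and locking is Tim's resource written as a plain generator:

```python
import threading

def locking(lock):
    # Tim's "resource locking(lock)" as an ordinary generator function.
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

def run_block(resource_gen, body):
    # Invented driver: advance to the yield (acquire), run the body,
    # then close the generator so its finally clause (release) runs.
    next(resource_gen)
    try:
        body()
    finally:
        resource_gen.close()

my_lock = threading.Lock()
held = []
run_block(locking(my_lock), lambda: held.append(my_lock.locked()))
```

The proposal's contribution is purely syntactic: letting the body be an indented suite instead of a callable.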

To use a (modified) example from another email::

class TestCase:

    resource assertRaises (self, excClass):
        try:
            yield
        except excClass:
            return
        else:
            if hasattr(excClass, '__name__'):
                excName = excClass.__name__
            else:
                excName = str(excClass)
            raise self.failureException, "%s is not raised" % excName

    block self.assertRaises(TypeError):
        raise TypeError

Note that this *does* require cleanup, but without using a finally:
clause - the except: and else: are the cleanup code.

Tim Delaney


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Raymond Hettinger
  I think that realization is important.  It would be great to have a
  section of the PEP that focuses on separability and matching features
  to benefits.  Start with above observation that the proposed examples
  can be achieved with generators driving the block statement.
 
 Good idea. I'm kind of stuck for time (have used up most of my Python
 time for the next few weeks) -- if you or someone else could volunteer
 some text I'd appreciate it.

I'll take a crack at it in the morning (we all seem to be on borrowed
time this week).



  When the discussion hits comp.lang.python, a separability section will
  help focus the conversation (there's a flaw/issue/dislike about
  feature x; however, features y/z and related benefits do not depend on x).
 
 Right. The PEP started with me not worrying too much about motivation
 or use cases but instead focusing on precise specification of the
 mechanisms, since there was a lot of confusion over that. Now that's
 out of the way, motivation (you might call it spin :-) becomes more
 important.

Perhaps the cover announcement should impart the initial spin as a
request for the community to create, explore, and learn from use cases.
That will help make the discussion more constructive, less abstract, and
more grounded in reality (wishful thinking).

That probably beats, "Here's 3500 words of proposal; do you like it?".



Raymond


Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote:
(So do you want this feature now or not? Earlier you said it was no big deal.)

It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer to 
do the actual implementation of the 'close()' method myself, because it's 
about the same amount of work as updating PEP 333 and sorting out any 
political issues that might arise therefrom.  :)


But I kind of doubt that it's an issue; you'd have to have a
try/except catching StopIteration around a yield statement that
resumes the generator before this becomes an issue, and that sounds
extremely improbable.

But it does exist, alas; see the 'itergroup()' and 'xmap()' functions of 
this cookbook recipe:

 http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66448/

Or more pointedly, the 'roundrobin()' example in the Python 2.4 documentation:

 http://www.python.org/doc/lib/deque-recipes.html

And there are other examples as well:

 http://www.faqts.com/knowledge_base/view.phtml/aid/13516
 http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934
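The hazardous pattern in those recipes looks roughly like this (a sketch of the roundrobin idea, not the recipes' exact code): an except StopIteration wrapped around a yield, which is exactly where a StopIteration thrown in via __exit__ would be swallowed:

```python
def roundrobin(*iterables):
    # Interleave items from several iterables, dropping each iterator
    # as it runs dry.
    iters = [iter(it) for it in iterables]
    while iters:
        for it in list(iters):
            try:
                # An exception thrown *into* this yield -- such as the
                # StopIteration from __exit__(StopIteration) -- would
                # also land in the except clause below.
                yield next(it)
            except StopIteration:
                iters.remove(it)

flat = list(roundrobin("ab", "c"))
```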




Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Phillip J. Eby
At 08:47 PM 5/3/05 -0400, Phillip J. Eby wrote:
  http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934

Oops; that one's not really a valid example; the "except StopIteration" just 
has a harmless "pass", and it's not in a loop.



Re: [Python-Dev] PEP 340 -- concept clarification

2005-05-03 Thread Delaney, Timothy C (Timothy)
Delaney, Timothy C (Timothy) wrote:

 Guido van Rossum wrote:
 
 I'd like the block statement to be defined exclusively in terms of
 __exit__() though.
 
 1. If an iterator declares __exit__, it cannot be used in a for-loop.
    For-loops do not guarantee resource cleanup.
 
 2. If an iterator does not declare __exit__, it cannot be used in a
    block-statement.
    Block-statements guarantee resource cleanup.

Now some thoughts have solidified in my mind ... I'd like to define some
terminology that may be useful.

resource protocol:
    __next__
    __exit__

Note: __iter__ is explicitly *not* required.

resource:
    An object that conforms to the resource protocol.

resource generator:
    A generator function that produces a resource.

resource usage statement/suite:
    A suite that uses a resource.

With this conceptual framework, I think the following makes sense:

- Keyword 'resource' for defining a resource generator.
- Keyword 'use' for using a resource.

e.g.

::

resource locker (lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

use locker(lock):
    # do stuff

Tim Delaney