Re: [Python-Dev] Is PEP 237 final -- Unifying Long Integers and Integers

2005-06-19 Thread Keith Dart
On Sat, 18 Jun 2005, Michael Hudson wrote:


 The shortest way I know of going from 2149871625L to -2145095671 is
 the still-fairly-gross:

 >>> v = 2149871625L
 >>> ~int(~v & 0xFFFFFFFF)
 -2145095671
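
 (A sketch added for illustration, not part of the original mail: the same
 coercion written as small helpers, assuming 32-bit values.)

 def to_signed32(v):
     # Reinterpret an unsigned 32-bit value as a signed C int.
     v &= 0xFFFFFFFF
     if v >= 0x80000000:
         v -= 0x100000000
     return v

 def to_unsigned32(v):
     # Reinterpret a (possibly negative) int as an unsigned 32-bit value.
     return v & 0xFFFFFFFF

 assert to_signed32(2149871625L) == -2145095671
 assert to_unsigned32(-2145095671) == 2149871625L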

 I suppose the best thing is to introduce an unsignedint type for this
 purpose.

 Or some kind of bitfield type, maybe.

 C uses integers both as bitfields and to count things, and at least in
 my opinion the default assumption in Python should be that this is
 what an integer is being used for, but when you need a bitfield it can
 all get a bit horrible.

 That said, I think in this case we can just make fcntl_ioctl use the
 (new-ish) 'I' format argument to PyArg_ParseTuple and then you'll just
 be able to use 2149871625L and be happy (I think, haven't tried this).

Thanks for the reply. I think I will go ahead and add some extension types 
to Python. Thankfully, Python is extensible with new objects.

It is also useful (to me, anyway) to be able to map, one to one,
external primitives from other systems to Python primitives. For
example, CORBA and SNMP have a set of types (signed ints, unsigned ints,
etc.) defined that I would like to interface to Python (actually I have
already done this to some degree). But Python makes it a bit more
difficult without that one-to-one mapping of basic types.  Having an
unsigned int type, for example, would make it easier to interface Python
to SNMP or even some C libraries.
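
(For illustration only -- a Python-level sketch of such a type, not the actual
extension module under discussion; a real one would be written in C.)

class unsigned32(long):
    # A toy unsigned 32-bit integer: values wrap modulo 2**32 on construction.
    def __new__(cls, value=0):
        return long.__new__(cls, long(value) & 0xFFFFFFFFL)
    def __add__(self, other):
        return unsigned32(long(self) + long(other))

print unsigned32(-2145095671)        # prints 2149871625
print unsigned32(0xFFFFFFFFL) + 1    # wraps around to 0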

In other words, since the Real World has these types that I must
sometimes interface to, it is useful to have these same (predictable)
types in Python.

So, it is worth extending the basic set of data types, and I will add it
to my existing collection of Python extensions.

Therefore, I would like to ask here if anyone has already started
something like this? If not, I will go ahead and do it (if I have time).


-- 
Keith Dart [EMAIL PROTECTED]
public key ID: F3D288E4


Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Josiah Carlson

Donovan Baarda [EMAIL PROTECTED] wrote:
 Nick Coghlan wrote:
  Donovan Baarda wrote:
  
 As I see it, a lambda is an anonymous function. An anonymous function is 
 a function without a name.
  
  
  And here we see why I'm such a fan of the term 'deferred expression' 
  instead of 'anonymous function'.
 
 But isn't a function just a deferred expression with a name :-)

A function in Python is actually a deferred sequence of statements and
expressions. An anonymous function in Python (a lambda) is a deferred
expression.
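
A small illustration of that distinction (a sketch added here, not from the
original message):

double = lambda x: x * 2      # a deferred expression: the body is one expression

def double_verbose(x):        # a deferred sequence of statements
    result = x * 2
    return result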


  Python's lambda expressions *are* the former, but they are 
  emphatically *not* the latter.
 
 Isn't that because lambdas have the limitation of not allowing 
 statements, only expressions? I know this limitation avoids side-effects 
 and has significance in some formal (functional?) languages... but is 
 that what Python is? In the Python I use, lambdas are always used where 
 you are too lazy to define a function to do its job.

I've generally seen people use lambdas for things that don't require
names in the current context; i.e. callbacks with simple executions.
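
For example (a sketch of that kind of throwaway callback, added here and not
part of the original message):

names = ['guido', 'Tim', 'barry']
# Python 2 style comparison callback; the lambda needs no name of its own.
names.sort(lambda a, b: cmp(a.lower(), b.lower()))
print names     # ['barry', 'guido', 'Tim']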


 To me, anonymous procedures/functions would be a superset of deferred 
 expressions, and if the one stone fits perfectly in the slingshot we 
 have and can kill multiple birds... why hunt for another stone?

Are deferred expressions perhaps another way of spelling function
closure?


 Oh yeah Raymond: on the "def defines some variable name"... are you 
 joking? You forgot the smiley :-)

'def' happens to bind the name that follows the def to the function with
the arguments and body following the name.


 I don't get what the problem is with mixing statement and expression 
 semantics... from a practical point of view, statements just offer a 
 superset of expression functionality.

Statements don't have a return value.  To be more precise, what is the
value of "for i in xrange(10): z.append(...)"?  Examine the selection of
statements available to Python, and ask that question.  The only one
that MAY have a return value is 'return' itself, which really requires
an expression to the right (which passes the expression to the right to
the caller's frame).  When you have statements that ultimately need a
'return' for a return value, you may as well use a standard function
definition.
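
To make the contrast concrete (a sketch added here; the list-comprehension
variant is an illustration, not something from the original message):

z = []
# Expression form: the whole thing evaluates to a value (a list of Nones here).
result = [z.append(i) for i in xrange(10)]

# Statement form: there is nothing to assign; the loop itself has no value.
# result = (for i in xrange(10): z.append(i))    # SyntaxError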


 If there really is a serious practical reason why they must be limited 
 to expressions, why not just raise an exception or something if the 
 anonymous function is too complicated...

Define "too complicated"?


 I did some fiddling and it seems lambdas can call methods and stuff 
 that can have side effects, which kinda defeats what I thought was the 
 point of statements vs expressions... I guess I just don't 
 understand... maybe I'm just thick :-)

There is nothing stopping anyone from modifying anything in a lambda...

import random

a = list(...)  # some mutable list
pop = lambda: a.pop()                                    # drop the last element
lcycle = lambda: a and a.append(a.pop(0))                # rotate the list left
rcycle = lambda: a and a.insert(0, a.pop())              # rotate the list right
popany = lambda: a and a.pop(random.randrange(len(a)))   # drop a random element


 - Josiah



Re: [Python-Dev] Is PEP 237 final -- Unifying Long Integers and Integers

2005-06-19 Thread Josiah Carlson

Keith Dart [EMAIL PROTECTED] wrote:

 Therefore, I would like to ask here if anyone has already started
 something like this? If not, I will go ahead and do it (if I have time).

If all you need to do is read or write C-like types to or from memory,
you should spend some time looking through the 'struct' module if you
haven't already.
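
For instance, a quick sketch of the kind of conversion struct already handles
(the '!' byte order here is an arbitrary choice for the example):

import struct

# Pack a value as an unsigned 32-bit int, then reread the same bytes as signed.
packed = struct.pack('!I', 2149871625)
signed, = struct.unpack('!i', packed)                         # -2145095671
unsigned, = struct.unpack('!I', struct.pack('!i', signed))    # 2149871625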

 - Josiah



Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Kay Schluehr
Donovan Baarda wrote:

 I don't get what the problem is with mixing statement and expression 
 semantics... from a practical point of view, statements just offer a 
 superset of expression functionality.
 
 If there really is a serious practical reason why they must be limited 
 to expressions, why not just raise an exception or something if the 
 anonymous function is too complicated...
 
 I did some fiddling and it seems lambdas can call methods and stuff 
 that can have side effects, which kinda defeats what I thought was the 
 point of statements vs expressions... I guess I just don't 
 understand... maybe I'm just thick :-)

The whole point is that you are able to do all the basic control flow 
operations like IF, FOR and WHILE using simple expressions (like Python's 
lambda), the lambda construct itself, and recursion. Therefore lambda 
expressions constitute a Turing-complete language, and they do so in Python 
as well. Unlike in many FP languages, lambda plays no central role in Python, 
because statements are not reduced to lambda expressions (or some variant 
thereof). They are merely an add-on.

Reduction often has the advantage of making expressions/statements 
scriptable, which they are not in Python. Python is strong at scripting 
classes/objects (a big plus of the language), but you can't simply use 
the language to prove that

lambda x,y: x+y*y
lambda x,y: y**2+x

are essentially the same function with different implementations [1]. I 
think this is a severe lack of expressiveness and has nothing to do with 
the silly objection that one has to write one more line for a simple 
callback - o.k., I admit that I'm lazy too ;)

Regards,
Kay


[1] Not without hacking the parse tree. Doing so one might finally end 
up accessing the expression in a simple, modifiable manner:

  >>> (lambda x,y: x+y*y).expr
  ('+',(x,'*',(y,y)))

  >>> (lambda x,y: y**2+x).expr
  ('+',(('**',(y,2)),x))
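
(For comparison -- a sketch of what is possible today without that
hypothetical .expr attribute, using the stdlib parser module in CPython 2.x;
the output format differs, but the structural mismatch is the point:)

import parser

tree1 = parser.expr('x+y*y').tolist()
tree2 = parser.expr('y**2+x').tolist()
# The two parse trees differ structurally, so a purely syntactic comparison
# cannot establish that the two functions are equivalent.
print tree1 == tree2     # False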






Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Reinhold Birkenfeld
Kay Schluehr wrote:

 Reduction often has the advantage of making expressions/statements 
 scriptable, which they are not in Python. Python is strong at scripting 
 classes/objects (a big plus of the language), but you can't simply use 
 the language to prove that
 
 lambda x,y: x+y*y
 lambda x,y: y**2+x
 
 are essentially the same function with different implementations [1].

Except that they are not. Think of __pow__, think of __add__ and __radd__.

Reinhold


-- 
Mail address is perfectly valid!



Re: [Python-Dev] PEP for RFE 46738 (first draft)

2005-06-19 Thread Skip Montanaro

Simon> I hacked things a bit, and instead of sending XML, sent pickles
Simon> inside the XML response.

I've done the same thing (I think I may have used marshal).  It works fine
as long as you know both ends are Python.
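
Something along these lines, presumably (a sketch with a made-up payload,
not Simon's or Skip's actual code):

import pickle, xmlrpclib

# Server side: carry a pickle as an opaque binary blob in the XML-RPC response.
payload = xmlrpclib.Binary(pickle.dumps({'answer': 42}))

# Client side: unwrap and unpickle -- only sensible when both ends are Python
# and the peer is trusted.
data = pickle.loads(payload.data)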

Skip


Re: [Python-Dev] Propose to reject PEP 294 -- Type Names in the types Module

2005-06-19 Thread Skip Montanaro
Raymond> Suggest rejecting this PEP and making a note for Py3.0 to
Raymond> either sync-up the type names or abandon the types module
Raymond> entirely.

I thought the types module was already deprecated, at least verbally if not
officially.

Skip



Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Skip Montanaro

 As I see it, a lambda is an anonymous function. An anonymous function
 is a function without a name. We already have a syntax for a
 function...  why not use it. ie:
 
 f = filter(def (a): return a > 1, [1,2,3])

Kay> You mix expressions with statements.

You could remove the return and restrict the body of the def to an
expression: 

f = filter(def (a): a > 1, [1,2,3])

That looks almost exactly like a lambda, but uses def and parenthesizes
the argument list. It seems to me that would remind people this is a
function.

Skip


Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Kay Schluehr
Reinhold Birkenfeld wrote:


lambda x,y: x+y*y
lambda x,y: y**2+x

 are essentially the same function with different implementations [1].
 
 
 Except that they are not. Think of __pow__, think of __add__ and __radd__.

You know the difference between the concept of a function and its 
infinitely many representations? That's why formal definitions exist.

 
 Reinhold
 
 

Just as a refresher:

Formally, a function f from a set X of input values to a set Y of 
possible output values (written as f : X -> Y) is a relation between X 
and Y which satisfies:

1. f is total, or entire: for all x in X, there exists a y in Y such 
that x f y (x is f-related to y), i.e. for each input value, there is at 
least one output value in Y.

2. f is many-to-one, or functional: if x f y and x f z, then y = z. 
i.e., many input values can be related to one output value, but one 
input value cannot be related to many output values.

A more concise expression of the above definition is the following: a 
function from X to Y is a subset f of the Cartesian product X × Y, such 
that for each x in X, there is a unique y in Y such that the ordered 
pair (x, y) is in f.

http://en.wikipedia.org/wiki/Function_%28mathematics%29

Kay





Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Kay Schluehr
Skip Montanaro wrote:

  As I see it, a lambda is an anonymous function. An anonymous function
  is a function without a name. We already have a syntax for a
  function...  why not use it. ie:
  
  f = filter(def (a): return a > 1, [1,2,3])
 
 Kay> You mix expressions with statements.
 
 You could remove the return and restrict the body of the def to an
 expression: 
 
 f = filter(def (a): a > 1, [1,2,3])
 
 That looks almost exactly like a lambda, but uses def and parenthesizes
 the argument list. It seems to me that would remind people this is a
 function.

Yes, but skipping the name of a function (anonymizing it) is not a 
strong reason to disallow statements in the anonymous function body. The 
crucial issue is the notation of callable expressions that are not 
statements but can be defined inside other expressions (e.g. inside 
a filter() call as in the example). That's why I prefer notations that 
emphasize the expression character. Using the arrow notation
((args) -> expr) would be fine for me, but I would not reject Python 
in favor of Java if (expr from (args)) were used instead. To me it's a 
Judean Popular Front vs. Popular Front of Judea kind of thing.

Kay




Re: [Python-Dev] Recommend accepting PEP 312 -- Simple Implicit Lambda

2005-06-19 Thread Reinhold Birkenfeld
Kay Schluehr wrote:
 Reinhold Birkenfeld wrote:
 

lambda x,y: x+y*y
lambda x,y: y**2+x

 are essentially the same function with different implementations [1].
 
 
 Except that they are not. Think of __pow__, think of __add__ and __radd__.
 
 You know the difference between the concept of a function and its 
 infinitely many representations? That's why formal definitions exist.

I must say that I don't understand what you're after.

Reinhold

-- 
Mail address is perfectly valid!



Re: [Python-Dev] Propose to reject PEP 276 -- Simple iterator for ints

2005-06-19 Thread Facundo Batista
On 6/17/05, Raymond Hettinger [EMAIL PROTECTED] wrote:

 The principal use case was largely met by enumerate().  From PEP 276's

+1 for rejecting it.

.Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/


Re: [Python-Dev] gcmodule issue w/adding __del__ to generator objects

2005-06-19 Thread Phillip J. Eby
At 10:15 PM 6/18/2005 -0400, Phillip J. Eby wrote:
Okay, I think I see why you can't do it.  You could guarantee that all
relevant __del__ methods get called, but it's bloody difficult to end up
with only unreachable items in gc.garbage afterwards.   I think gc would
have to keep a new list for items reachable from finalizers, that don't
themselves have finalizers.  Then, before creating gc.garbage, you walk the
finalizers and call their finalization (__del__) methods.  Then, you put
any remaining items that are in either the finalizer list or the
reachable-from-finalizers list into gc.garbage.

This approach might need a new type slot, but it seems like it would let us
guarantee that finalizers get called, even if the object ends up in garbage
as a result.  In the case of generators, however, close() guarantees that
the generator releases all its references, and so can no longer be part of
a cycle.  Thus, it would guarantee eventual cleanup of all
generators.  And, it would lift the general limitation on __del__ methods.

Hm.  Sounds too good to be true.  Surely if this were possible, Uncle Timmy
would've thought of it already, no?  Guess we'll have to wait and see what
he thinks.

Or maybe not.  After sleeping on it, I realized that the problems are all 
in when and how often __del__ is called.  The idea I had above would end up 
calling __del__ twice on non-generator objects.  For generators it's not a 
problem because the first call ends up ensuring that the second call is a 
no-op.

However, the *order* of __del__ calls makes a difference, even for 
generators.  What good is a finally: clause if all the objects reachable 
from it have been finalized already, anyway?

Ultimately, I'm thinking that maybe we were right not to allow try-finally 
to cross yield boundaries in generators.  It doesn't seem like you can 
guarantee anything about the behavior in the presence of cycles, so what's 
the point?

For a while I played around with the idea that maybe we could still support 
'with:' in generators, though, because to implement that we could make 
frames call __exit__ on any pending 'with' blocks as part of their tp_clear 
operation.  This would only work, however, if the objects with __exit__ 
methods don't have any references back to the frame.  In essence, you'd 
need a way to put the __exit__ objects on a GC-managed list that wouldn't 
run until after all the tp_clear calls had finished.

But even that is tough to make guarantees about.  For example, can you 
guarantee in that case that a generator's 'with:' blocks are __exit__-ed in 
the proper order?

Really, if we do allow 'with' and 'try-finally' to surround yield, I think 
we're going to have to tell people that it only works if you use a with or 
try-finally in some non-generator code to ensure that the generator.close() 
gets called, and that if you end up creating a garbage cycle, we either 
have to let it end up in gc.garbage, or just not execute its finally clause 
or __exit__ methods.

Of course, this sort of happens right now for other things with __del__; if 
it's part of a cycle the __del__ method never gets called.  The only 
difference is that it hangs around in gc.garbage, doing nothing useful.  If 
it's garbage, it's not reachable from anywhere else, so it does nobody any 
good to have it around.  So, maybe we should just say, "sucks to be you," 
and tp_clear anything that we'd otherwise have put in gc.garbage.  :)
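
(A minimal sketch of that existing behavior, added for illustration; the
class and attribute names are made up:)

import gc

class Leaky(object):
    def __del__(self):
        pass   # the mere presence of __del__ keeps a cyclic instance uncollectable

a, b = Leaky(), Leaky()
a.other, b.other = b, a     # build a reference cycle
del a, b
gc.collect()
print gc.garbage            # the cycle lands here; __del__ is never called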

In other words, since we're not going to call those __del__ methods anyway, 
maybe it just needs to be part of the language semantics that __del__ isn't 
guaranteed to be called, and a garbage collector that can't find a safe way 
to call it, doesn't have to.  tp_dealloc for classic classes and heap types 
could then just skip calling __del__ if they've already been cleared... oh 
wait, how do you know you've been cleared?  Argh.  Another nice idea runs 
up on the rocks of reality.

On the other hand, if you go ahead and run __del__ after tp_clear, the 
__del__ method will quickly run afoul of an AttributeError and die with 
only a minor spew to sys.stderr, thus encouraging people to get rid of 
their silly useless __del__ methods on objects that normally end up in 
cycles.  :)



Re: [Python-Dev] gcmodule issue w/adding __del__ to generator objects

2005-06-19 Thread Phillip J. Eby
Sigh.  Looks like Guido already used the time machine to bring up these 
ideas five years ago:

http://mail.python.org/pipermail/python-dev/2000-March/002514.html

And apparently you went back with him:

http://mail.python.org/pipermail/python-dev/2000-March/002478.html

So I give up, 'cause there's no way I can compete with you time travellers.  :)

Although I do wonder -- why was __cleanup__ never implemented?  The only 
clue seems to be Guido's comment that he "[finds] having a separate 
__cleanup__ protocol cumbersome."  It certainly seems to me that having a 
__cleanup__ that allows an object to handle itself being garbage would be 
handy, although it's only meaningful to have a __cleanup__ if you also have 
a __del__; otherwise, there would never be a reason to call it.  Maybe 
that's the reason it was considered cumbersome.



[Python-Dev] misplaced PEP

2005-06-19 Thread Nick Jacobson
At the www.python.org/peps page, PEP 281 is
erroneously listed in the "Finished PEPs (done,
implemented in CVS)" section.



 


[Python-Dev] Problem with embedded python

2005-06-19 Thread Luisa






Re: [Python-Dev] misplaced PEP

2005-06-19 Thread Nick Jacobson
Well, it's fixed now.  Thanks to whoever took care of
it.

--- Nick Jacobson [EMAIL PROTECTED] wrote:

 At the www.python.org/peps page, PEP 281 is
 erroneously listed in the Finished PEPs (done,
 implemented in CVS) section.
 
 
