Re: Correctly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread Chirayu Krishnappa
I do agree that it is a crazy format - and am amazed that it works at
the prompt.

For the first case - you have a mismatched double quote for test2 at
the end of the string. test2 should be r'c:\A"\B;C"\D;c:\program
files\xyz' instead. For the 2nd case - my code swallowed the ';' it
split on - so I need an acc.append(';') just before the acc.append(p)
in Accumulate. The code then works. It needs to be fixed to take care
of extra double quotes and also a missing one (cmd.exe appears to
assume one at the end if it did not find one).

The itertools.splitby idea sounds really cool. I did not feel like
writing a state machine as the state was so simple to maintain here -
but I'd like to write a splitby so that it makes it easier to do such
crazy splitting in general.
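For reference, here is a minimal sketch (mine, not the code posted in this
thread) of the kind of quote-aware split being discussed; treating a missing
closing quote as implied at the end matches the cmd.exe behaviour described
above:

def split_path(s):
    parts, acc, in_quotes = [], [], False
    for ch in s:
        if ch == '"':
            in_quotes = not in_quotes
        elif ch == ';' and not in_quotes:
            parts.append(''.join(acc))
            acc = []
        else:
            acc.append(ch)
    parts.append(''.join(acc))
    return [p for p in parts if p]

print split_path(r'c:\X;"c:\a; b"')   # ['c:\\X', 'c:\\a; b']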

Chirayu.

-- 
http://mail.python.org/mailman/listinfo/python-list


Making Toplevel Modal?

2005-04-03 Thread Pete Moscatt
Hi all,

I want to make a dialog (using a Tk Toplevel) but need it to be modal.  Is
this possible using Tk?

Shown below is an example of how I am calling the custom dialog:

class main:
    def __init__(self, parent):
        top = self.top = Toplevel(parent)
        top.title("Server Settings")
        top.minsize(width=230, height=270)
        top.maxsize(width=230, height=270)
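
(A common Tk pattern for making such a Toplevel modal - a general sketch, not
taken from a reply in this thread - is to make it transient, grab input, and
wait for the window:)

from Tkinter import Tk, Toplevel, Button

def show_modal(parent):
    top = Toplevel(parent)
    top.title("Server Settings")
    top.transient(parent)          # keep the dialog on top of its parent
    Button(top, text="OK", command=top.destroy).pack()
    top.grab_set()                 # route all input to the dialog
    parent.wait_window(top)        # block until the dialog is destroyed

root = Tk()
Button(root, text="Settings", command=lambda: show_modal(root)).pack()
root.mainloop()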





Pete

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: re module non-greedy matches broken

2005-04-03 Thread André Malo
* "lothar" <[EMAIL PROTECTED]> wrote:

> this response is nothing but a description of the behavior i reported.

Then you have not read my response carefully enough.

> as to whether this behaviour was intended, one would have to ask the module
> writer about that.

No, I've responded with a view on regexes, not on the module. That is the way
_regexes_ work. Non-greedy regexes do not match the minimal-length at all, they
are just ... non-greedy (technically the backtracking just stacks the longest
instead of the shortest). They *may* match the shortest match, but it's a
special case. Therefore I've stated that the documentation is incomplete.

Actually your expectations go a bit beyond the documentation. From a certain
point of view (matches always start most left) the matches you're seeing
*are* the minimal-length matches.
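
A small illustration of that point (my example, not from the original
exchange) - the engine reports the first match it finds starting from the
left, and the '?' only makes the quantifier prefer fewer characters at that
position:

import re
print re.search('a.*?b', 'axxxb ab').group(0)   # 'axxxb', not the shorter 'ab'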

> because of the statement in the documentation, which places no qualification
  
  that's the point.

> on how the scan for the shortest possible match is to be done, my guess is
> that this problem was overlooked.

In the docs, yes. But buy yourself a regex book and learn for yourself ;-)
The first thing you should learn about regexes is that the source of pain
of most regex implementations is the documentation, which is very likely
to be wrong.

Finally let me ask a question:

import re
x = re.compile('<.*?>')
print x.search('..').group(0)

What would you expect to be printed out?  or ? Why?

nd
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: the bugs that try men's souls

2005-04-03 Thread Sean McIlroy

Wow again. I had a real "V8 moment" when I looked at your solution
(smacking my forehead, groaning ruefully, etc.). You were right: my
intention was simply to hide the trivial cases from view; I completely
missed the fact that I was now testing for membership in a different
set. I should have remembered that Python "plays fair", and looked a
little harder to find my mistake.

Thanks again,
Sean
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: the bugs that try men's souls

2005-04-03 Thread Sean McIlroy
"Jordan Rastrick" <[EMAIL PROTECTED]> wrote in message news:<[EMAIL 
PROTECTED]>...



Wow. I'd resigned myself to the task of reformulating my question in
an intelligent way, I stopped by just to leave a little note to the
effect that the thread wasn't dead, and I find out the question's been
answered. Thanks very much. I'll let you know how it turns out.

Peace,
Sean
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: (win32) speedfan api control

2005-04-03 Thread Cappy2112
Nice idea - getting the handle to a control.
But how do you know what to pass for wparam, lparam, and flags?

BTW - I don't see anything unique to ActivePython here.
You can do all of this with the Python Windows extensions, which can be
installed without ActivePython.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: help with python-devel!!!

2005-04-03 Thread Michele Simionato
Just give (as root)

# urpmi python-devel

(assuming you have configured urpmi properly, Google
for "easy urpmi").

 Michele Simionato

-- 
http://mail.python.org/mailman/listinfo/python-list


checkbook manager

2005-04-03 Thread David Isaac
I'd like to try personal financial management using Python.
I just found PyCheckbook, but it does not support check printing.
Is there a Python check printing application kicking around?

Thanks,
Alan Isaac


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: On slashdot

2005-04-03 Thread Isle Of The Dead

<[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
> There is a discussion about "Python Moving into the Enterprise" on
> Slashdot:
>
> http://it.slashdot.org/it/05/04/03/0715209.shtml?tid=156&tid=8


Using dejanews as a proxy to measure the meme propagation of "python"
versus other scripting languages -

http://www.realmeme.com/miner/java.php?startup=/miner/java/scriptinglanguagesDejanews.png


Python is the only one that shows a clear increase in rate of growth,
which supports the "Python in Enterprise" article. 


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: what's the use of __repr__?when shall I use it?

2005-04-03 Thread Greg Ewing
Vikram wrote:
__repr__ should return something that when eval'ed yields an identical
object (if possible).
That's strictly possible in so few cases that it's not
really a very helpful guideline, in my opinion.
I find the following view more helpful:
* str() is for producing the normal output of a program,
  to be seen by the user.
* repr() is for debugging output, and should indicate
  reasonably unambiguously the *type* of the object.
  When debugging, it's often at least as important to
  know what type of object you have as what value it
  has.
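A small sketch of that distinction (my example):

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __str__(self):
        # normal program output, for the user
        return "(%g, %g)" % (self.x, self.y)
    def __repr__(self):
        # debugging output: make the type unambiguous
        return "Point(x=%r, y=%r)" % (self.x, self.y)

p = Point(1, 2)
print p          # (1, 2)
print repr(p)    # Point(x=1, y=2)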
--
Greg Ewing, Computer Science Dept,
University of Canterbury,   
Christchurch, New Zealand
http://www.cosc.canterbury.ac.nz/~greg
--
http://mail.python.org/mailman/listinfo/python-list


Re: Silly question re: 'for i in sys.stdin'?

2005-04-03 Thread Steven Bethard
Jeff Epler wrote:
The iterator for files is a little bit like this generator function:
def lines(f):
    while 1:
        chunk = f.readlines(sizehint)
        for line in chunk: yield line
Inside file.readlines, the read from the tty will block until sizehint
bytes have been read or EOF is seen.
I'm not the OP, but thanks for putting 2 and 2 together for me anyway. 
:)  I just tested it on my Windows XP box, and discovered that
for line in sys.stdin:
...
actually does read a line at a time, as long as the lines are at least 
8192 characters long. ;)

def lines(f):  # untested
    """lines(f)
    If f is a terminal, then return an iterator that gives a value after
    each line is entered.  Otherwise, return the efficient iterator for
    files."""
    if hasattr(f, "fileno") and os.isatty(f.fileno()):
        return iter(f.readline, '')
    return iter(f)
Slick.  Thanks!
STeVe
--
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread Greg Ewing
Dan Bishop wrote:
John J. Lee wrote:
Doesn't work with unicode, IIRC.

u" ".join(["What's", "the", "problem?"])
u"What's the problem?"
str.join(x, y) isn't quite a drop-in replacement for
string.join(y, x), since it's not polymorphic on the
joining string:
>>> str.join(u" ", ["a", "b"])
Traceback (most recent call last):
  File "", line 1, in ?
TypeError: descriptor 'join' requires a 'str' object but received a 'unicode'
The strings being joined can be unicode, though:
>>> str.join(" ", [u"a", u"b"])
u'a b'
So it's probably not a serious problem, since in most
cases you'll know whether the joining string is unicode
or not when you write the code. If not, you'll just
have to do it the "new" way.
--
Greg Ewing, Computer Science Dept,
University of Canterbury,   
Christchurch, New Zealand
http://www.cosc.canterbury.ac.nz/~greg
--
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Ron_Adam
On 3 Apr 2005 16:21:10 -0700, "Brendan" <[EMAIL PROTECTED]> wrote:

>Thanks for the tips.  Making FW a callable class (choice 5) seems to be
>a good (if verbose) solution.  I might just wrap my temporary values in
>a list [lastX, lastA, lastB] and mutate them as Michael suggests.
>Thanks to Michael especially for the explanation of the name-binding
>process that's at the heart of the issue.
>
>The other choicess are not as helpful to me for the following reasons:
>
>choice 1: I don't want the temporary values of lastA and lastB to be
>global variables in my case as they are great big numeric arrays, and
>I'd like their memory to be reclaimed after FW is done.

Generally global variables should be avoided in python if you are
doing a large application.  For smaller ones, they are ok, but they
are just a little slower than local variables.

You could use a classic class which is a good way to store a single
group of data.  The 'del' will unbind a name from an object so the
objects can be garbage collected.

class data:
    A = []
    B = []

def countupdown():
    for n in xrange(11):
        data.A.append(n)
        data.B.append(10-n)
    print data.A
    print data.B

countupdown()

# store data  # Check out pickle module for this.

del data



>choice 2:  I tried this without success.  Using Micheal's example, I
>would assume you mean something like this:


def outer():
    def inner():
        outer.b += 1
        print outer.b
    inner()

outer.b = 1   # <-- initialize here, after the function of the same name
outer()

# save data method here

del outer   # delete outer and its attributes


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Silly question re: 'for i in sys.stdin'?

2005-04-03 Thread Jeff Epler
The iterator for files is a little bit like this generator function:
def lines(f):
    while 1:
        chunk = f.readlines(sizehint)
        for line in chunk: yield line
Inside file.readlines, the read from the tty will block until sizehint
bytes have been read or EOF is seen.

If you want this kind of line-at-a-time functionality, then you could
use the iter(callable, sentinel) form, and switch between it and the
readlines method based on a commandline flag or whether the file
satisfies 'os.isatty()':
def lines(f):  # untested
    """lines(f)
    If f is a terminal, then return an iterator that gives a value after
    each line is entered.  Otherwise, return the efficient iterator for
    files."""
    if hasattr(f, "fileno") and os.isatty(f.fileno()):
        return iter(f.readline, '')
    return iter(f)

for line in lines(sys.stdin):
    doSomethingWith(line)

Jeff


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: Silly question re: 'for i in sys.stdin'?

2005-04-03 Thread David Trudgett

I'm not a Python expert by any means, but you're describing the
classic symptoms of buffering. There is a '-u' command line switch for
python to turn off buffering but that does not affect file iterators. 
See http://www.hmug.org/man/1/python.html for instance.

Tom Eastman <[EMAIL PROTECTED]> writes:

> I'm not new to Python, but I didn't realise that sys.stdin could be called
> as an iterator, very cool!
>
> However, when I use the following idiom:
>
>for line in sys.stdin:
>doSomethingWith(line)

Guess what the suggested work-around on the man page was? Use
sys.stdin.readline() in a "while 1:" loop, as you have below:

>
>while True:
>   line = sys.stdin.readline()
>   if line == '': break
>   doSomethingWith(line)

David

-- 

David Trudgett
http://www.zeta.org.au/~wpower/

Reality is 20% real and 80% made up stuff in your head.
But I'm not sure about the 20%.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: property and virtuality

2005-04-03 Thread Greg Ewing
Laszlo Zsolt Nagy wrote:
My problem is about properties and the virtuality of the methods. I 
would like to create a property whose get and set methods
are virtual.
You might find the following function useful, which I
developed for use in PyGUI.
def overridable_property(name, doc=None):
    """Creates a property which calls methods get_xxx and set_xxx of
    the underlying object to get and set the property value, so that
    the property's behaviour may be easily overridden by subclasses."""
    getter_name = intern('get_' + name)
    setter_name = intern('set_' + name)
    return property(
        lambda self: getattr(self, getter_name)(),
        lambda self, value: getattr(self, setter_name)(value),
        None,
        doc)

Usage example:

class MyClass(object):
    ...
    spam = overridable_property('spam', "Favourite processed meat product")
    ...
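
A self-contained usage sketch (the class and attribute names below are my own
invention, not from the post), assuming the function above is in scope:

class Window(object):
    title = overridable_property('title', "Window title")
    def __init__(self):
        self._title = ""
    def get_title(self):
        return self._title
    def set_title(self, value):
        self._title = value

class ShoutingWindow(Window):
    # Overriding the accessor in a subclass changes the property's behaviour.
    def get_title(self):
        return Window.get_title(self).upper()

w = ShoutingWindow()
w.title = "hello"
print w.title    # HELLO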
--
Greg Ewing, Computer Science Dept,
University of Canterbury,   
Christchurch, New Zealand
http://www.cosc.canterbury.ac.nz/~greg
--
http://mail.python.org/mailman/listinfo/python-list


Re: Making a DLL with python?

2005-04-03 Thread Greg Ewing
[EMAIL PROTECTED] wrote:
I'd love to do the whole thing in Python, but I don't know how to make
a DLL purely from Python.
I don't think you can do it *purely* in Python. You'll at
least need a C or Pyrex wrapper which dispatches to Python
code.
--
Greg Ewing, Computer Science Dept,
University of Canterbury,   
Christchurch, New Zealand
http://www.cosc.canterbury.ac.nz/~greg
--
http://mail.python.org/mailman/listinfo/python-list


Silly question re: 'for i in sys.stdin'?

2005-04-03 Thread Tom Eastman
I'm not new to Python, but I didn't realise that sys.stdin could be called
as an iterator, very cool!

However, when I use the following idiom:

   for line in sys.stdin:
       doSomethingWith(line)

and then type stuff into the program interactively, nothing actually happens
until I hit CTRL-D.  I expected that 'doSomethingWith(line)' would execute
after every line I input into the program, just like what used to happen
with:

   while True:
       line = sys.stdin.readline()
       if line == '': break
       doSomethingWith(line)

What is the difference?

Thanks for your help!

  Tom



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Docorator Disected

2005-04-03 Thread Ron_Adam
On Sun, 03 Apr 2005 23:59:51 +0200, "Martin v. Löwis"
<[EMAIL PROTECTED]> wrote:

>Ron_Adam wrote:
>> This would be the same without the nesting:
>> 
>> def foo(xx):
>> global x
>> x = xx
>> return fee
>> 
>> def fee(y):
>> global x
>> return y*x
>> 
>> z = foo(2)(6)
>
>Actually, it wouldn't.

Ok, yes, besides the globals, but I figured that part is obvious so I
didn't feel I needed to mention it.  The function call works the same
even though they are not nested functions. 

>> 
>> It's not entirely a misconception. Lets see where this goes...
>> 
>> 
>>dis.dis(compiler.compile('foo(2)(6)','','eval'))
>>>
>>>  1   0 LOAD_NAME0 (foo)
>>>  3 LOAD_CONST   1 (2)
>>>  6 CALL_FUNCTION1
>>>  9 LOAD_CONST   2 (6)
>>> 12 CALL_FUNCTION1
>>> 15 RETURN_VALUE
>
>Hmm. If you think that this proves that (2)(6) is being *passed*, you
>still might have a misconception. What this really does is:

I didn't say they were passed at the same time by the stack.  It just
shows my reference to *stacks* was correct, and that there is an
underlying mechanism for calling functions and passing arguments and
functions that uses the stack. I was not yet aware (yesterday
afternoon) of just how the stack worked in this case.  This was very
much a figure-it-out-as-you-go exercise.

Yesterday, I had made the incorrect judgement that since the functions
are all nested inside a defined function, I should treat them as
a group instead of individual functions.  But that wasn't the correct
way of viewing it.  They are in a group in that they share a name space,
so I figured (incorrectly) that they shared an argument list somehow,
and those were passed to the group.  The silent passing of the function
and its arguments was a big reason for me jumping to this
conclusion.

So my reference to:

>>The interesting thing about this is the 'return fee' statement gets
>>the (6) apparently appended to it. So it becomes 'return fee(6)'.

Which is not correct, as the order of events is wrong and they do not
share a common argument list.

The correct order is:

return fee
fee(6) 
   
with the fee(6) being evaluated after the return statement is
executed.
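
As a side note, here is the nested (closure) version of the same call
sequence - a sketch of my own to show the order of operations:

def foo(x):
    def fee(y):
        return y * x     # x is captured when foo is called
    return fee           # 'return fee' hands back the inner function itself

f = foo(2)     # first call: binds x to 2 and returns fee
z = f(6)       # second call: evaluates fee(6) afterwards, giving 12
# foo(2)(6) is just these two steps written on one line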

Another contributing factor is two days of really poor sleep, which
probably is a bigger factor than I would like to admit. I really feel
I should have gotten it much sooner.  But I did get it, a little bit
at a time, and had a lot of terrific help along the way. :-)



>> Or it could be said equally the functions (objects) are passed with
>> the stack. So both view are correct depending on the view point that
>> is chosen.
>
>Maybe I don't understand your view, when you said
>
># No, I did not know that you could pass multiple sets of arguments to
># nested defined functions in that manner.

My views have changed as I added the missing pieces to the puzzle
yesterday.

At first I didn't see how they were passed at all, in a group or
otherwise. There wasn't any one-to-one way to match the arguments up
visually like there are in a normal function call.

My next thought was they are passed as a group, to the group of
defined functions that shared the same name space. (Everyone seems to
think I'm stuck on this one.) 

My next view, yesterday afternoon, was that they were passed on a stack
somehow, one at a time. This last one is not necessarily incorrect from
a byte code viewpoint, but it's not the best way to view the problem.

Today I believe I have the correct view, as I've said this morning. I
could be wrong yet again. I hope not, or I might have to give up
programming. :/

It's interesting that I have had several others tell me they had
trouble with this too.

So it is my opinion that decorators are a little too implicit.  I
think there should be a way to make them easier to use while achieving
the same objective.


Thanks again for the reply,  :)

Cheers,
Ron


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: SimpleRPCServer

2005-04-03 Thread robin
Skip Montanaro <[EMAIL PROTECTED]> wrote:

>First, from my reading of SimpleXMLRPCServer, I don't think _dispatch()
>belongs at that level.  It belongs in the request handler class or in a
>separate dispatcher class, depending on what version of Python you're using.

Quite so. As a variant I just use verify_request() to persist the
client IP address, and then wait until _dispatch() to do everything
else.

class RPCServer(SimpleXMLRPCServer):
    def verify_request(self, handler, address):
        self.client_ip, self.client_port = address
        return True

    def _dispatch(self, method, args):
        do_something(self.client_ip)

Though using a firewall would not be remiss. :-)
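
For completeness, a minimal sketch of wiring this up (the do_something stub
and the address are my own placeholders, and the SimpleXMLRPCServer import is
assumed to have been done before the class definition above):

def do_something(ip):
    print "request from", ip      # stand-in for the real per-client logic

server = RPCServer(("localhost", 8000))
server.serve_forever()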

-- robin
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: re module non-greedy matches broken

2005-04-03 Thread lothar
this response is nothing but a description of the behavior i reported.

as to whether this behaviour was intended, one would have to ask the module
writer about that.
because of the statement in the documentation, which places no qualification
on how the scan for the shortest possible match is to be done, my guess is
that this problem was overlooked.

to produce a non-greedy (minimal length) match it is required that the start
of the non-greedy part of the match repeatedly be moved right with the last
match of the left-hand part of the pattern (preceding the .*?).

why would someone want a non-greedy (minimal length) match that was not
always non-greedy (minimal length)?



"André Malo" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
* lothar wrote:

> re:
> 4.2.1 Regular Expression Syntax
> http://docs.python.org/lib/re-syntax.html
>
>   *?, +?, ??
>   Adding "?" after the qualifier makes it perform the match in non-greedy
>   or
> minimal fashion; as few characters as possible will be matched.
>
> the regular expression module fails to perform non-greedy matches as
> described in the documentation: more than "as few characters as possible"
> are matched.
>
> this is a bug and it needs to be fixed.

The documentation is just incomplete. Non-greedy regexps still start
matching the leftmost. So instead the longest of the leftmost you get the
shortest of the leftmost. One may consider this as a documentation bug,
yes.

nd
--
# André Malo,  #



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Scott David Daniels
Paul Rubin wrote:
"Terry Reedy" <[EMAIL PROTECTED]> writes:
But assert statements vanish when you turn on the optimizer.  If
you're going to run your application with the optimizer turned on, I
certainly hope you run your regression tests with the optimizer on.
I don't see why you think so.  Assertion statements in the test code make 
it harder, not easier for the test to pass.  Ditto, I believe, for any in 
the run code, if indeed there are any.

If the unit tests are expressed as assert statements, and the assert
statements get optimized away, then running the unit tests on the
optimized code can obviously never find any test failures.
Any code depending upon __debug__ being 0 won't be tested.  Sometimes
test structures update values as a side-effect of tracking the debugging
state.  Not massively likely, but it makes for a scary environment when
your tests cannot be run on a non-debug version.
--Scott David Daniels
[EMAIL PROTECTED]
--
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Roy Smith
Scott David Daniels <[EMAIL PROTECTED]> wrote:
> Any code depending upon __debug__ being 0 won't be tested.  Sometimes
> test structures update values as a side-effect of tracking the debugging
> state.  Not massively likely, but it makes for a scary environment when
> your tests cannot be run on a non-debug version.
> 
> --Scott David Daniels
> [EMAIL PROTECTED]

What would happen if you defined

def verify(value):
    if not value:
        raise AssertionError

and then everyplace in your py.test suite where you would normally have 
done "assert foo", you now do "verify (foo)"?  A quick test shows that it 
appears to do the right thing.  I made a little test file:

--
#!/usr/bin/env python

def verify(value):
    if not value:
        raise AssertionError

class Test_foo:
    def test_one(self):
        assert 0

    def test_two(self):
        verify(0)
--

when I run that with "python py.test", I get two failures.  When I run it 
with "python -O py.test", I get one pass and one fail, which is what I 
expected to get if the assert gets optimized away.

The output is a little more verbose, since it shows the exception raised in 
verify(), but it gives you a stack dump, so it's not that hard to look one 
frame up and see where verify() was called from.

It's interesting that, given the penchant for light-weight-ness in py.test,
the default output is so verbose (and, to my mind, confusing) compared
to unittest.  I guess one could write one's own output formatter and cut
down on the verbosity?
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: threading.Event file descriptor

2005-04-03 Thread elbertlev
//And there's no handle at all?

There is one (check thread_nt.h); you have to "propagate" the HANDLE to
the Python level. That's why you have to change the interpreter. Do not
forget that thread is a built-in module.

//I wouldn't want to derive from Event since my goal would be to submit
a
patch to make subprocess.Popen.wait take an optional threading.Event to
kill the process.

And that's it? Right now acquire_lock is non-interruptible, so as a result
your Popen.wait is also non-interruptible, but if you pass a derived
event you will be able to handle more generic cases.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: threading.Event file descriptor

2005-04-03 Thread Nicolas Fleury
[EMAIL PROTECTED] wrote:
//And there's no handle at all?
There is one (check thread_nt.h); you have to "propagate" the HANDLE to
the Python level. That's why you have to change the interpreter. Do not
forget that thread is a built-in module.
Sounds fine with me.  A fileno (or whatever) function can be added to 
threading.Event on all platforms, giving access to internal file 
descriptor/handle.

//I wouldn't want to derive from Event since my goal would be to submit
a
patch to make subprocess.Popen.wait take an optional threading.Event to
kill the process.
And that's it? Right now acquire_lock is non-interruptible, so as a result
your Popen.wait is also non-interruptible, but if you pass a derived
event you will be able to handle more generic cases.
I'm not 100% sure I understand what you say.  Supporting killing the
process with any handle, not only an event, would be a good thing.  But
it doesn't change the fact that, IMHO, the usefulness of threading.Event
is just too limited if it doesn't support select or
WaitForMultipleObjects.  I think also that threading.Thread should give
access to its internal handle (at least the thread module does).

Regards,
Nicolas
--
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Paul Rubin
"Terry Reedy" <[EMAIL PROTECTED]> writes:
> > But assert statements vanish when you turn on the optimizer.  If
> > you're going to run your application with the optimizer turned on, I
> > certainly hope you run your regression tests with the optimizer on.
> 
> I don't see why you think so.  Assertion statements in the test code make 
> it harder, not easier for the test to pass.  Ditto, I believe, for any in 
> the run code, if indeed there are any.

If the unit tests are expressed as assert statements, and the assert
statements get optimized away, then running the unit tests on the
optimized code can obviously never find any test failures.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Brendan

>James Stroud  Apr 3, 3:18 pm:
>I think you might want to look at "python generators".

I've seen discussion of generators before, but haven't invested the
time to understand them yet.  This might be a good excuse.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Brendan
F -is- in fact an iterative optimizer that minimizes A on x (B is the
derivative of A).  So yes, F will call A and B on multiple 'x's.  In
that case, it seems the mutable object trick is the way to go.  Thanks.

I didn't follow your last sentence.  What about the Python Cookbook?

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Maurice LING
Terry Reedy wrote:
"Maurice LING" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]

Say I code my  stuffs in Jython (importing java libraries) in a file 
"text.py"

Just to be clear, Jython is not a separate language that you code *in*, but 
a separate implementation that you may slightly differently code *for*.

Yes, I do get this point. Jython is just an implementation of
the Python virtual machine using Java. I do note that there are some
differences, such as that Jython can only handle pure Python modules.
However, I'm not enough of a language expert to differentiate the
differences between these 2 implementations of Python, Jython and CPython. If
someone cares to enlighten, it will be my pleasure. TIA.


... Will there be any issues when I try to import text.py into CPython?

If text.py is written in an appropriate version of Python, it itself will
cause no problem.  However, when it imports Java code files, as opposed to
CPython bytecode files, CPython will choke.

In my example, the file "text.py" is coded in Jython, importing Java
libraries. I do get that I cannot import Java jar files directly into
CPython. What I do not get is what is so special about Jython that
it can "fool" CPython into using Java libraries... or will there
always be a need for both a Java virtual machine and a Python virtual machine
when I use Java libraries in Jython and import Jython-coded files
into CPython?

Cheers
Maurice
--
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Brendan
Thanks for the tips.  Making FW a callable class (choice 5) seems to be
a good (if verbose) solution.  I might just wrap my temporary values in
a list [lastX, lastA, lastB] and mutate them as Michael suggests.
Thanks to Michael especially for the explanation of the name-binding
process that's at the heart of the issue.

The other choices are not as helpful to me for the following reasons:

choice 1: I don't want the temporary values of lastA and lastB to be
global variables in my case as they are great big numeric arrays, and
I'd like their memory to be reclaimed after FW is done.

choice 2:  I tried this without success.  Using Michael's example, I
would assume you mean something like this:

def outer():
    b = 1
    def inner():
        outer.b += 1
        print outer.b
    inner()
outer()

Which gives me:
AttributeError: 'function' object has no attribute 'b'

Perhaps I misapplied this method?

choice 3:  I know that Python can return multiple values in one line,
but I don't think that applies here.  My library function F is looking
for two separate function arguments.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Terry Reedy

"Brendan" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
> I have a function, call it F(x), which asks for two
> other functions as arguments, say A(x) and B(x).  ...

If I understand this and the rest, a third party library whose code you
cannot modify (easily) has a function F with (at least) three parameters:
A, B, and x.  During its operation, F calls A(x) and B(x).  Because of code
commonality between the particular A and B arg funcs you want to feed to F, you
want to avoid duplication by having the first call to either calculate both
return values.

If F calls each of A and B exactly once and always in the same order and
only for the value x you supply, the solution is pretty easy.  A calls AB,
stashes the B value away where it can be retrieved, and returns the A value.
B retrieves the B value and returns it.  But your problem is the stash and
retrieve part.  Solutions:
1. global variable (easiest)  - use global declaration in A;
2. closure variable - use mutable such as 1 element list (see below);
3. instance attribute - with A and B as methods.

2 is what you tried to do, but without knowing the mutable (list or dict) 
trick:

def ABwrapper():
    bsave = [None]
    def A(x):
        aval, bval = AB(x)
        bsave[0] = bval
        return aval
    def B(x):
        return bsave[0]
    return A, B

This works because A does not try to *rebind* bsave to a new object.  It 
only mutates the existing object.

If the order of calling changes, you need more logic.  If F calls A and B 
on multiple 'x's, as with, for instance, a derivative approximizer, then I 
would memoize A and/or B using the recipe posted here more than once and on 
the cookbook site and included in the new Python Cookbook v2 (and maybe v1, 
don't have it).
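
A rough sketch of the memoizing approach mentioned above (my own minimal
version, not the cookbook recipe itself):

def memoize(func):
    cache = {}
    def wrapper(x):
        if x not in cache:
            cache[x] = func(x)
        return cache[x]
    return wrapper

AB = memoize(AB)        # repeated x values are looked up, not recomputed
def A(x):
    return AB(x)[0]
def B(x):
    return AB(x)[1]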

Terry J. Reedy



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Ron_Adam
On 3 Apr 2005 14:12:48 -0700, "Brendan" <[EMAIL PROTECTED]> wrote:

>Hi everyone
>
>I'm new to Python, so forgive me if the solution to my question should
>have been obvious.  I have a function, call it F(x), which asks for two
>other functions as arguments, say A(x) and B(x).  A and B are most
>efficiently evaluated at once, since they share much of the same math,
>ie, A, B = AB(x), but F wants to call them independantly (it's part of
>a third party library, so I can't change this behaviour easily).   My
>solution is to define a wrapper function FW(x), with two nested
>functions,  AW(x) and BW(x), which only call AB(x) if x has changed.

You have several easy choices, that would not require you modifying
your program much.

1.  Use the 'global' keyword to declare lastX, aLastX, and bLastX as
globals, then all functions will have access to them.

def FW(x):
   global lastX, aLastX, bLastX

2.  Use function attributes, which are just names attached to the
function using a '.'.

def FW(x):
    #
    #  Function body here
    #
    return F(AW, BW)

FW.lastX = None
FW.aLastX = None
FW.bLastX = None

result = FW(x)


You will need to always include the FW. in front of those names. 


3. Something else that may help: you can return more than one value
at a time. Python has this neat feature that you can have multiple
items on either side of the '=' sign.

a,b,c = 1,2,3
 
same as:
a=1
b=2
c=3

And it also works with return statements, so you can return multiple
values.

def abc(n):
    return n+1, n+2, n+3

a, b, c = abc(0)


5.  The last choice is to rewrite your function as a class.  Names
in classes retain their values between calls, and you can access those
values the same way as function attributes.


Hope this helped.

Cheers,
Ron


>To make this all clear, here is my (failed) attempt:
>
>#--begin code -
>
>from ThirdPartyLibrary import F
>from MyOtherModule import AB
>
>def FW(x):
>lastX = None
>aLastX = None
>bLastX = None
>
>def AW(x):
>if x != lastX:
>lastX = x
># ^ Here's the problem.  this doesn't actually
># change FW's lastX, but creates a new, local lastX
>
>aLastX, bLastX = AB(x)
>return aLastX
>
>def BW(x):
>if x != lastX:
>lastX = x
># ^ Same problem
>
>aLastX, bLastX = AB(x)
>return bLastX
>
>#finally, call the third party function and return its result
>return F(AW, BW)
>
># end code -
>
>OK, here's my problem:  How do I best store and change lastX, A(lastX)
>and B(lastX) in FW's scope?  This seems like it should be easy, but I'm
>stuck.  Any help would be appreciated!
>
>  -Brendan

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread Dan Bishop
John J. Lee wrote:
> Duncan Booth <[EMAIL PROTECTED]> writes:
> [...]
> >str.join(sep, list_of_str)
> [...]
>
> Doesn't work with unicode, IIRC.

>>> u" ".join(["What's", "the", "problem?"])
u"What's the problem?"

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Terry Hancock
On Sunday 03 April 2005 04:12 pm, Brendan wrote:
> from ThirdPartyLibrary import F
> from MyOtherModule import AB
> 
> def FW(x):
> lastX = None
> aLastX = None
> bLastX = None

I'm pretty sure your method will work if you just specify
that these are global (note that a global statement only names the
variables; the assignments stay separate):

def FW(x):
    global lastX, aLastX, bLastX
    lastX = None
    aLastX = None
    bLastX = None

OTOH, I'm biased against using module-level variables
for this kind of purpose and I think things that retain
state really ought to be class instances, so I'd probably replace
AB(x)  with a callable object, and define two wrappers to access it
(untested):

class AB:
    def __init__(self):
        self._last_x = None
        self._last_a = None
        self._last_b = None

    def __call__(self, x):
        if x == self._last_x:
            return self._last_a, self._last_b
        else:
            self._last_a, self._last_b = self.AB(x)
            self._last_x = x
            return self._last_a, self._last_b

    def A(self, x):
        return self(x)[0]

    def B(self, x):
        return self(x)[1]

    def AB(self, x):
        """
        This is where you compute your new values when needed.
        """
        # something that computes a and b
        return a, b

ab = AB()

Then you actually pass the methods ab.A and ab.B to your
library routine.  This will usually work, though if it somehow
insists on an actual function instead of a callable, you can always
use wrapper functions.

This also has the advantage that you *can* process more than
one case at a time: if you have two different places where you
need this function to be called and you aren't sure what order
they'll be processed in (or don't want to think about it), you can
give them different instances of AB to work with, and they'll
remember their previous calls separately.

Cheers,
Terry


-- 
--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks  http://www.anansispaceworks.com

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Question about the Python Cookbook: How much of this is new?

2005-04-03 Thread Robert Kern
RickMuller wrote:
I had a question about the second edition of the Python Cookbook. I own
and have thoroughly enjoyed the first edition of the Python Cookbook.
How much of the second edition is new? Is this "essential reading" if I
already have the first edition? I realize that there are new sections
that describe language features through Python 2.4, but is this, say,
10% of the book (in which case I won't buy a new copy) or is it >25% of
the book (in which case I will). The Chapter (#3) on Time/Money that's
posted on the O'Reilly website is great (and entirely new, if memory
serves).
Quoting Trent Mick, who answered this question in another thread:
"""Here is an excerpt from the preface (typing errors are mine):
If you already own the first edition, you may be wondering whether
you need this second edition, too. We think the answer is "yes." The
first edition had 245 recipes; we kep 146 of those (with lots of
editing in almost all cases), and added 192 new ones, for a total of
338 recipes in this second e4dition. So, over half of the recipes in
this edition are complete,ly new, and all the recipes are updated
to apply to today's Python -- releases 2.3 and 2.4. Indeed, this
update is the main factor which lets us have almost 100 more recipes
in a book of about the same size. The first edition covered all
versions from 1.5.2 (one sometimes earlier) to 2.2; this one focuses
fimly on 2.3 and 2.4. Thianks to the greater port of today's Python,
and, even more4, thanks to the fact that this edition avoids the
"historical" treatises about how you had to do things in Python
versions releases 5 or more years ago, we were able to provide
substantially more currently relevant recipes and information in
roughtly the same amount of space.
Trent
"""
So yeah, buy it.
--
Robert Kern
[EMAIL PROTECTED]
"In the fields of hell where the grass grows high
 Are the graves of dreams allowed to die."
  -- Richard Harter
--
http://mail.python.org/mailman/listinfo/python-list


Re: Lambda: the Ultimate Design Flaw

2005-04-03 Thread Sunnan
Artie Gold wrote:
Torsten Bronger wrote:
The whole text seems to be a variant of
.
Tschö,
Torsten.
Ya think? ;-)
Heh. I was glad that Torsten pointed it out; I didn't get what was funny 
about the joke until then.
--
http://mail.python.org/mailman/listinfo/python-list


Re: boring the reader to death (wasRe: Lambda: the Ultimate Design Flaw

2005-04-03 Thread Sunnan
Aahz wrote:
Note very, VERY, *VERY* carefully that the quote says nothing about
"boring code".  The quote explicitly refers to "reams of trivial code"
as boring -- and that's quite true.  Consider this distinction:
Thank you for this important clarification.
if foo == 'red':
    print 'foo is red'
elif foo == 'blue':
    print 'foo is blue'

versus

print "foo is", foo
Is the space added automatically? (Like awk does, if you add a comma.)
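For reference (not part of the original exchange): yes, the comma in a Python
print statement inserts a single space, much as awk's comma does:

foo = 'red'
print "foo is", foo    # prints: foo is red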
I'm sure you can think of many other examples -- real examples -- if you
put your mind to work; Guido's point is about the essential necessity of
refactoring and rewriting code for conciseness and clarity.
Which is a good point to make in almost any language, for code that is 
to be maintained.

Sunnan
--
http://mail.python.org/mailman/listinfo/python-list


Re: boring the reader to death (wasRe: Lambda: the Ultimate DesignFlaw

2005-04-03 Thread Sunnan
Scott David Daniels wrote:
No, poetry is to be read slowly and carefully, appreciating the nuance
at every point.  You should be able to read "past" python, while poetry
is at least as much about the form of the expression as it is about
what is being expressed.
Right, I agree with these descriptions of python vs "the poetry 
languages". I'm not sure whether I'd consider python particularly terse, 
though, but I don't know enough about it yet. (I've read a couple of 
programs but never started a project of my own in it, mainly because I 
love poetry. I can see the appeal of a "prose language", though.)
--
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Terry Reedy

"Paul Rubin" <"http://phr.cx"@NOSPAM.invalid> wrote in message 
news:[EMAIL PROTECTED]
> "Raymond Hettinger" <[EMAIL PROTECTED]> writes:
>> When writing a large suite, you quick come to appreciate being able
>> to use assert statements with regular comparision operators, debugging
>> with normal print statements, and not writing self.assertEqual over and
>> over again.  The generative tests are especially nice.
>
> But assert statements vanish when you turn on the optimizer.  If
> you're going to run your application with the optimizer turned on, I
> certainly hope you run your regression tests with the optimizer on.

I don't see why you think so.  Assertion statements in the test code make 
it harder, not easier for the test to pass.  Ditto, I believe, for any in 
the run code, if indeed there are any.

Terry J. Reedy



-- 
http://mail.python.org/mailman/listinfo/python-list


Question about the Python Cookbook: How much of this is new?

2005-04-03 Thread RickMuller
I had a question about the second edition of the Python Cookbook. I own
and have thoroughly enjoyed the first edition of the Python Cookbook.
How much of the second edition is new? Is this "essential reading" if I
already have the first edition? I realize that there are new sections
that describe language features through Python 2.4, but is this, say,
10% of the book (in which case I won't buy a new copy) or is it >25% of
the book (in which case I will). The Chapter (#3) on Time/Money that's
posted on the O'Reilly website is great (and entirely new, if memory
serves).

Thanks in advance...

Rick

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: StopIteration in the if clause of a generator expression

2005-04-03 Thread [EMAIL PROTECTED]
This is all just making everything far too complicated. What you really
want to do is quite simple:

import itertools
def condition(x): return x < 5

list(itertools.takewhile(condition, (i for i in range(10))))  # -> [0, 1, 2, 3, 4]

The 'Stop Iteration In Generator Expression' problem was solved in the
language that List Comprehensions came from, Haskell. Haskell's basic
library, prelude, had a series of functions that have found their way
into the itertools toolbox. I highly recommend having a read of the
itertools docs if you want to continue hacking around with generators.

Regards,
Stephen Thorne

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Steven Bethard
Maurice Ling wrote:
In the Java world, there is GATE (general architecture for text 
engineering) and it seems very impressive. Are there something like that 
for Python?
I worked with GATE this last summer and really hated it.  Can't decide 
whether that was just my growing distaste for Java or actually the GATE 
API.  Anyway, if you're looking for something like GATE that (in my 
experience) runs significantly faster, you should look at Ellogon 
(www.ellogon.org).  It's written in C and TCL, with C++, Java, Perl, and 
Python bindings.  And I believe, if you have any software already 
written for GATE, Ellogon can run those modules directly.  I've 
personally never done so -- all my modules are written in Python (often 
simple wrappers for things like MXPOST, MXTerminator, Charniak's parser, 
etc.)  I find the Python interface simple and easy to use, and they've 
added a number of my suggestions to the API in the last release.

STeVe
--
http://mail.python.org/mailman/listinfo/python-list


Re: threading.Event file descriptor

2005-04-03 Thread Nicolas Fleury
[EMAIL PROTECTED] wrote:
There is no event handle used in Event object (for NT at least). Do not
know about Linux...
And there's no handle at all?  It's not important if it's not an event 
handle as long as it is an handle usable with WaitForMultipleObjects.

Also, I don't understand how it will be possible to implement 
threading.Event without using finally, at the lower level, a handle, 
since as far as I know this is the mechanisms the OS offers.

Unless you want to rewrite the interpreter (namelly
PyThread_allocate_lock.c) for platforms you are talking about, you
would be better of, if you create your own class (derived from Event,
and ovewritte aquire, release and wait methods).
I wouldn't want to derive from Event since my goal would be to submit a 
patch to make subprocess.Popen.wait take an optional threading.Event to 
kill the process.

Regards,
Nicolas
--
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread James Stroud
I wish I had time to dig into your specific problem because it looks 
interesting. But I think you might want to look at "python generators". I 
believe there is no reason that they can't yield a function (see the small 
sketch after the links below).

http://www.python.org/peps/pep-0255.html
http://docs.python.org/ref/yield.html
http://linuxgazette.net/100/pramode.html
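
A tiny sketch (mine, not James's) showing a generator yielding functions that
share state through their enclosing scope:

def make_wrappers(AB):
    cache = {}
    def AW(x):
        if x not in cache:
            cache[x] = AB(x)
        return cache[x][0]
    def BW(x):
        if x not in cache:
            cache[x] = AB(x)
        return cache[x][1]
    yield AW
    yield BW

AW, BW = make_wrappers(lambda x: (x + 1, x * 2))
print AW(3), BW(3)    # 4 6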

James

On Sunday 03 April 2005 02:12 pm, Brendan wrote:
> Hi everyone
>
> I'm new to Python, so forgive me if the solution to my question should
> have been obvious.  I have a function, call it F(x), which asks for two
> other functions as arguments, say A(x) and B(x).  A and B are most
> efficiently evaluated at once, since they share much of the same math,
> ie, A, B = AB(x), but F wants to call them independantly (it's part of
> a third party library, so I can't change this behaviour easily).   My
> solution is to define a wrapper function FW(x), with two nested
> functions,  AW(x) and BW(x), which only call AB(x) if x has changed.
>
> To make this all clear, here is my (failed) attempt:
>
> #--begin code -
>
> from ThirdPartyLibrary import F
> from MyOtherModule import AB
>
> def FW(x):
> lastX = None
> aLastX = None
> bLastX = None
>
> def AW(x):
> if x != lastX:
> lastX = x
> # ^ Here's the problem.  this doesn't actually
> # change FW's lastX, but creates a new, local lastX
>
> aLastX, bLastX = AB(x)
> return aLastX
>
> def BW(x):
> if x != lastX:
> lastX = x
> # ^ Same problem
>
> aLastX, bLastX = AB(x)
> return bLastX
>
> #finally, call the third party function and return its result
> return F(AW, BW)
>
> # end code -
>
> OK, here's my problem:  How do I best store and change lastX, A(lastX)
> and B(lastX) in FW's scope?  This seems like it should be easy, but I'm
> stuck.  Any help would be appreciated!
>
>   -Brendan
> --
> Brendan Simons

-- 
James Stroud, Ph.D.
UCLA-DOE Institute for Genomics and Proteomics
Box 951570
Los Angeles, CA 90095

http://www.jamesstroud.com/
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: threading.Event file descriptor

2005-04-03 Thread elbertlev

Nicolas Fleury wrote:
> Hi,
> Is there any way to get the file descriptor on Unix or handle on
Windows
> associated internally with a threading.Event object?  So that it can
be
> used in a call to select or WaitForMultipleObjects.
> Thx and regards,
> Nicolas

Good idea! But...

There is no event handle used in Event object (for NT at least). Do not
know about Linux...

Unless you want to rewrite the interpreter (namelly
PyThread_allocate_lock.c) for platforms you are talking about, you
would be better of, if you create your own class (derived from Event,
and ovewritte aquire, release and wait methods).

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Terry Reedy

"Maurice LING" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
>Say I code my  stuffs in Jython (importing java libraries) in a file 
>"text.py"

Just to be clear, Jython is not a separate language that you code *in*, but 
a separate implementation that you may slightly differently code *for*.

>... Will there be any issues when I try to import text.py into CPython?

If text.py is written in an appropriate version of Python, it itself will
cause no problem.  However, when it imports Java code files, as opposed to
CPython bytecode files, CPython will choke.

Terry J. Reedy



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: StopIteration in the if clause of a generator expression

2005-04-03 Thread Steven Bethard
Raymond Hettinger wrote:
[Peter Otten]
Do you see any chance that list comprehensions will be redefined as an
alternative spelling for list()?
Not likely.  It is possible that the latter spelling would make it possible 
for
Py3.0. eliminate list comps entirely.  However, they are very popular and
practical, so my bet is that they will live on.
I suspect you're right, but I certainly wouldn't complain if list comps 
disappeared. TOOWTDI and all, and I often find myself alternating 
between the two when I can't decide which one seems more Pythonic. 
(These days I generally write a listcomp, but I wouldn't put any money 
on my code being entirely consistent about this...)

STeVe
--
http://mail.python.org/mailman/listinfo/python-list


Re: Help me dig my way out of nested scoping

2005-04-03 Thread Michael Spencer
Brendan wrote:
Hi everyone
I'm new to Python, so forgive me if the solution to my question should
have been obvious. 
...
Good question.  For a thorough explanation see: 
http://www.python.org/dev/doc/devel/ref/naming.html

Simple version follows:
OK, here's my problem:  How do I best store and change lastX, A(lastX)
and B(lastX) in FW's scope?  This seems like it should be easy, but I'm
stuck.  Any help would be appreciated!
Assignments (i.e., binding names to objects) are always made in the local scope 
(unless you've used the 'global' declaration, which I don't think can help you 
here).  So, for an even simpler demonstration of the problem see:

 >>> def outer():
 ...     b = 1
 ...     def inner():
 ...         b += 1
 ...         print b
 ...     inner()
 ...
 >>> outer()
 Traceback (most recent call last):
   File "", line 1, in ?
   File "", line 6, in outer
   File "", line 4, in inner
 UnboundLocalError: local variable 'b' referenced before assignment
The solution is not to re-bind the identifier from the enclosing scope, but 
rather to mutate the object that it references.  This requires a mutable object, 
such as a list:

 >>> def outer():
 ...     b = [1]    # bind b to a mutable object
 ...     def inner():
 ...         b[0] += 1
 ...         print b[0]
 ...     inner()
 ...
 >>> outer()
 2
 >>>

HTH
Michael
--
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Maurice LING
Mark Winrock wrote:

You might try http://web.media.mit.edu/~hugo/montylingua/
"Liu, Hugo (2004). MontyLingua: An end-to-end natural
language processor with common sense. Available
at: web.media.mit.edu/~hugo/montylingua."

Thanks Mark. I've downloaded MontyLingua and it looks pretty cool. To 
me, it seems like pretty much geared to people like myself who needs 
something to process written text but do not need the hardcore bolts and 
nuts of a computational linguistist. NLTK is more of the bolts and nuts 
toolkit. GATE still seems more advanced than MontyLingua but to a 
different end.

Is there anyone in this forum that is using or had used MontyLingua and 
is happy to comment more on it? I'm happy to get more opinions.

Thanks and cheers
Maurice
--
http://mail.python.org/mailman/listinfo/python-list


Re: Docorator Disected

2005-04-03 Thread "Martin v. Löwis"
Ron_Adam wrote:
This would be the same without the nesting:
def foo(xx):
    global x
    x = xx
    return fee

def fee(y):
    global x
    return y*x

z = foo(2)(6)
Actually, it wouldn't.
>>> def foo(xx):
...   global x
...   x = xx
...   return fee
...
>>> def fee(y):
...   global x
...   return y*x
...
>>> z=foo(2)
>>> x=8
>>> z(6)
48
So the global variable can be changed between the time foo returns
and the time fee is invoked. This is not the same in the nested function
case: the value of x would be bound at the time foo is called. It can
be modified inside foo, but freezes once foo returns.
So if you are seeing (2)(6) as something to pass, as opposed to a sequence of 
operations, I think there's
a misconception involved. Perhaps I am taking your words askew ;-)

It's not entirely a misconception. Lets see where this goes...

dis.dis(compiler.compile('foo(2)(6)','','eval'))
 1           0 LOAD_NAME                0 (foo)
             3 LOAD_CONST               1 (2)
             6 CALL_FUNCTION            1
             9 LOAD_CONST               2 (6)
            12 CALL_FUNCTION            1
            15 RETURN_VALUE
Hmm. If you think that this proves that (2)(6) is being *passed*, you
still might have a misconception. What this really does is:
0. Put foo on the stack. Stack is [value of foo]
3. Put 2 on the stack -> [value of foo, 2]
6. Call a function with one arg; invoking foo(2)
   Put the result of this call back on the stack ->
   [result of foo(2)]
9. Put 6 on the stack -> [result of foo(2), 6]
12. Call it, computing (result of foo(2))(6)
    Put the result on the stack ->
    [result of (result of foo(2))(6)]
15. Return top-of-stack, yielding foo(2)(6)
So at no point in time, (2)(6) actually exists. Instead,
when the 6 is being put onto the stack, the 2 is already gone.
It computes it one by one, instead of passing multiple sets
of arguments.
While all of this isn't relevant, it's knowledge in my mind, and
effects my view of programming sometimes.
There is nothing wrong with that. However, you really should try
to see what the interpreter actually does, instead of speculation
(of course, asking in a newsgroup is fine).
The calling routine, puts (passes) the second set of arguments onto
the stack before calling the function returned on the stack by the
previous call.
Sure - you need the arguments to a function before being able to
call the function. So there is always a set of arguments on the
stack, which internally indeed gets converted into a tuple right
before calling the function. However, at no point in time, there
are *two* sets of arguments.
Or it could be said equally the functions (objects) are passed with
the stack. So both view are correct depending on the view point that
is chosen.
Maybe I don't understand your view, when you said
# No, I did not know that you could pass multiple sets of arguments to
# nested defined functions in that manner.
However, the way I understood it, it seemed incorrect - there are
no multiple sets of arguments being passed, at least not simultaneously.
It is, of course, possible to pass multiple sets of arguments
sequentially to multiple functions, eg.
a = len(x)
b = len(y)
Regards,
Martin
--
http://mail.python.org/mailman/listinfo/python-list


Re: "specialdict" module

2005-04-03 Thread Michael Spencer
Georg Brandl wrote:
I think I like Jeff's approach more (defaultvalues are just special
cases of default factories); there aren't many "hoops" required.
Apart from that, the names just get longer ;)
Yes Jeff's approach does simplify the implementation and more-or-less eliminates 
my complexity objection

But why do you write:
def __getitem__(self, key):
    try:
        return super(defaultdict, self).__getitem__(key)
    except KeyError, err:
        try:
            return self.setdefault(key,
                                   self._default[0](*self._default[1],
                                                    **self._default[2]))
        except KeyError:
            raise err

rather than:

def __getitem__(self, key):
    return self.setdefault(key,
                           self._default[0](*self._default[1],
                                            **self._default[2]))
(which could catch AttributeError in the case of _default not set)
I'm sure there's a reason, but I can't see it.

2. I would really prefer to have the default value specified in the constructor
...
Too much specialcased for my liking.
It does set up some gotchas I concede ;-)

...

Alternatively, you could provide factory functions to construct the defaultdict. 
 Someone (Michele?) recently posted an implementation of this

Yes, I think this could be reasonable.

...though this would more naturally complement a fixed-default dictionary IMO
Your design permits - even encourages (by providing convenient setters) the 
default to change over the lifetime of the dictionary.  I'm not sure whether 
that's good or bad, but it's a feature worth discussing.

3. Can you work in the tally and listappend methods that started this whole 
thread off?

They aren't necessary any longer.
Use defaultdict.setdefaultvalue(0) instead of the tally approach and
defaultdict.setdefaultfactory(list) instead of listappend.
Oops, I see what you mean... then use += or append as required.  I still prefer 
the clarity of tally for its specific use-case, but it does suffer from lack of 
generality.
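
For readers without the patch at hand, a rough sketch of the kind of interface
being discussed (the method names follow the thread; the implementation
details here are my guess, not Georg's actual code):

class defaultdict(dict):
    def setdefaultvalue(self, value):
        self._factory = lambda: value                  # constant default
    def setdefaultfactory(self, factory, *args, **kw):
        self._factory = lambda: factory(*args, **kw)   # fresh object per key
    def __getitem__(self, key):
        try:
            return dict.__getitem__(self, key)
        except KeyError:
            if not hasattr(self, '_factory'):
                raise
            return self.setdefault(key, self._factory())

tally = defaultdict()
tally.setdefaultvalue(0)
tally['spam'] += 1                  # no KeyError; starts from 0

groups = defaultdict()
groups.setdefaultfactory(list)
groups['a'].append(1)               # a fresh list for each missing key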

Michael
--
http://mail.python.org/mailman/listinfo/python-list


Help me dig my way out of nested scoping

2005-04-03 Thread Brendan
Hi everyone

I'm new to Python, so forgive me if the solution to my question should
have been obvious.  I have a function, call it F(x), which asks for two
other functions as arguments, say A(x) and B(x).  A and B are most
efficiently evaluated at once, since they share much of the same math,
ie, A, B = AB(x), but F wants to call them independantly (it's part of
a third party library, so I can't change this behaviour easily).   My
solution is to define a wrapper function FW(x), with two nested
functions,  AW(x) and BW(x), which only call AB(x) if x has changed.

To make this all clear, here is my (failed) attempt:

#--begin code -

from ThirdPartyLibrary import F
from MyOtherModule import AB

def FW(x):
    lastX = None
    aLastX = None
    bLastX = None

    def AW(x):
        if x != lastX:
            lastX = x
            # ^ Here's the problem.  this doesn't actually
            # change FW's lastX, but creates a new, local lastX

            aLastX, bLastX = AB(x)
        return aLastX

    def BW(x):
        if x != lastX:
            lastX = x
            # ^ Same problem

            aLastX, bLastX = AB(x)
        return bLastX

    # finally, call the third party function and return its result
    return F(AW, BW)

# end code -

OK, here's my problem:  How do I best store and change lastX, A(lastX)
and B(lastX) in FW's scope?  This seems like it should be easy, but I'm
stuck.  Any help would be appreciated!

  -Brendan
--
Brendan Simons

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: "specialdict" module

2005-04-03 Thread Georg Brandl
Michael Spencer wrote:

> 1. Given that these are specializations, why not have:
> 
> class defaultvaluedict(dict):
>  ...
> 
> class defaultfactorydict(dict):
>  ...
> 
> rather than having to jump through hoops to make one implementation satisfy 
> both 
> cases

I think I like Jeff's approach more (defaultvalues are just special
cases of default factories); there aren't many "hoops" required.
Apart from that, the names just get longer ;)

> 2. I would really prefer to have the default value specified in the 
> constructor
> 
> I realize that this is tricky due to the kw arguments of dict.__init__, but I 
> would favor either breaking compatibility with that interface, or adopting 
> some 
> workaround to make something like d= defaultvaluedict(__default__ = 0) 
> possible.

Too much specialcased for my liking.

> One worksaround would be to store the default in the dict, not as an 
> attribute 
> of the dict.  By default the default value would be associated with the key 
> "__default__", but that keyname could be changed for the (I guess very few) 
> cases where that key conflicted with non-default content of the dict.  Then 
> dict.__init__ would simply take __default__ = value as a keyword argument, as 
> it 
> does today, and __getitem__ for a missing key would return 
> dict.__getitem__(self, "__default__")

I thought about this too (providing a singleton instance named Default,
just like None is, and using it as a key), but you would have to
special-case the (iter)keys,values,items methods to exclude the default
- definitely too much work, and too much magic.

> Alternatively, you could provide factory functions to construct the 
> defaultdict. 
>   Someone (Michele?) recently posted an implementation of this

Yes, I think this could be reasonable.

> 3. Can you work in the tally and listappend methods that started this whole 
> thread off?

They aren't necessary any longer.

Use defaultdict.setdefaultvalue(0) instead of the tally approach and
defaultdict.setdefaultfactory(list) instead of listappend.

> 4. On super, no I don't think it's necessary or particularly desirable.  
> These 
> specializations have a close association with dict.  dict.method(self,...) 
> feels 
> more appropriate in this case.

Any other opinions on this?

Thanks for the comments,

mfg
Georg
-- 
http://mail.python.org/mailman/listinfo/python-list


can't link Python 2.4.1 against external libxml2 ...

2005-04-03 Thread OpenMacNews
hi all,
i've successfully built Python-2.4.1 from src on OSX 10.3.8 as a framework 
install with:

   ./configure \
   --enable-framework \
   --with-threads \
   --with-cxx=g++ \
   --enable-ipv6 \
   --enable-toolbox-glue
   make frameworkinstall
i'm next attempting to build same but linking against an external instance of 
libxml2 (v2.6.18) installed in /usr/local ...

i've tried the usual combinations of setting LDFLAGS and/or assigning 
--with-libs (="/usr/local/lib/libxml2.2.6.18.dylib"), to no avail.

no matter what i do, the build links against the 'native' /usr/lib/libxml* ... 
successful w/ no error, just the 'wrong' lib.

wondering whether i had unusual path probs, i also tried mv'ing the native libs 
out of the way, so that the only instance of libxml* is found in 
/usr/local/lib, but then the make fails, complaining that /usr/lib/libxml* is 
not found. sigh.

i'm fairly sure i'm missing something trivial here ...
any suggestions/pointers as to how to link *my* libxml into the framework build?
thanks!
richard

--
http://mail.python.org/mailman/listinfo/python-list


Re: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread Jeff Epler
The C code that Python uses to find the initial value of sys.path based
on PYTHONPATH seems to be simple splitting on the equivalent of
os.pathsep.  See the source file Python/sysmodule.c, function
makepathobject().
for (i = 0; ; i++) {
p = strchr(path, delim); // ";" on windows, ":" on unix
if (p == NULL) ...
w = PyString_FromStringAndSize(path, (int) (p - path));
if (w == NULL) ...
PyList_SetItem(v, i, w);
if (*p == '\0')
break;
path = p+1;
}
No special handling of quote characters happens here.
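
At the Python level that amounts to nothing more than the following (a rough
sketch, not the interpreter's actual code path; the function name is made up
and the sample result assumes the Windows ';' separator):

def split_path_naively(path, delim=';'):
    # mirrors makepathobject(): a plain split on the separator,
    # with no special handling of quote characters
    return path.split(delim)

print split_path_naively(r'c:\X;"c:\a; b"')
# -> ['c:\\X', '"c:\\a', ' b"']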

> I think I will stick to a simple splitting at the ;. Luckily all the
> directories I am dealing with have nice names.

If you do this, you'll match the behavior of python itself, and you'll match
the behavior of wine.

> I have not even tried to see what quirks there exist with unix.

None.  There's no way to quote anything in paths, so while you can't place a
directory with a colon in its name on your path, nobody loses any sleep over
it either.  Here's what the Open Group has to say about PATH:
PATH
This variable shall represent the sequence of path prefixes that
certain functions and utilities apply in searching for an executable
file known only by a filename. The prefixes shall be separated by a
colon ( ':' ). When a non-zero-length prefix is applied to this
filename, a slash shall be inserted between the prefix and the
filename. A zero-length prefix is a legacy feature that indicates the
current working directory. It appears as two adjacent colons ( "::" ),
as an initial colon preceding the rest of the list, or as a trailing
colon following the rest of the list. A strictly conforming application
shall use an actual pathname (such as .) to represent the current
working directory in PATH . The list shall be searched from beginning
to end, applying the filename to each prefix, until an executable file
with the specified name and appropriate execution permissions is found.
If the pathname being sought contains a slash, the search through the
path prefixes shall not be performed. If the pathname begins with a
slash, the specified path is resolved (see Pathname Resolution). If
PATH is unset or is set to null, the path search is
implementation-defined.
ah, if only windows was so well-defined!

Jeff


-- 
http://mail.python.org/mailman/listinfo/python-list

mini_httpd (ACME Labs) & Python 2.4.1 integration

2005-04-03 Thread Venkat B
Hi folks,

I have a webserver based on mini_httpd v1.19
(http://www.acme.com/software/mini_httpd/). I'd like to run some
python-based CGI scripts via this webserver on an RH9 system. In theory,
with the right env settings, I should be able to launch mini_httpd like so:

    mini_httpd -c *.py

and be able to run scripts like so:

    http://fqdn/simple.py

Using info from (python) sys.path, multiple get_config_var(..)s and
sys.executable, I was able to build and use the env settings (CGI_PATH,
CGI_LD_LIBRARY_PATH) in the mini_httpd sources. However, when I tried
accessing the simple.py script as in the url above, I get an error
(errno of 8 - Exec format error) when the code attempts to invoke the
execve() function. This same script works ok from a CGIHTTPServer.py
based test webserver. I was wondering if you knew what the right env
settings should be to get the py script working... I tried to google
around to no avail.

Thanks a lot.
Regards,
/venkat


-- 
http://mail.python.org/mailman/listinfo/python-list


help with python-devel!!!

2005-04-03 Thread gferreri
I am trying to install Numeric python which fails because I do not have


"/usr/lib/python-2.3/config/Makefile"

I've done some research and figured out I don't have the "python-devel"
package ... how do I get this?? I'm running Mandrake 10.1 and typing
"urpmi python-devel" does not work (it can't find a package by that
name) .. I've tried building and installing python from source, but it
does not install the config directory...

Any ideas of what I have to do to get python-devel installed on my
system?  Thanks!

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorater inside a function? Is there a way?

2005-04-03 Thread Ron_Adam
On 3 Apr 2005 11:17:35 -0700, "George Sakkis" <[EMAIL PROTECTED]>
wrote:

>>def define(func):
>>if not ENABLE_TYPECHECKING:
>>return lambda func: func
>># else decorate func
>
>A small correction: The argument of the decorator is not 'func' but the
>parameter checks you want to enforce. A template for define would be:
>
>def define(inputTypes, outputType):
>if not ENABLE_TYPECHECKING:
>return lambda func: func
>def decorate(func):
>def typecheckedFunc(*args,**kwds):
># TYPECHECK *args, **kwds HERE #
>r = func(*args,**kwds)
># TYPECHECK r HERE #
>return r
>return typecheckedFunc
>return decorate

This is the same pattern I used except without the enable/disable at
the top.

The inline type check function also checks TYPECHECK == True and
TYPESTRICT == False as the defaults that determine how strict the
type checking should be.  When TYPESTRICT == True, it gives an
error if the arguments are not the correct type, even if they have the
exact value. When TYPESTRICT == False, it tries to convert the object,
then checks the conversion by converting it back to the original type. If it's
still equal, it returns the converted object in the specified type.

>Depending on how much flexibility you allow in inputTypes, filling in
>the typechecking logic can be from easy to challenging. For example,
>does typechecking have to be applied in all arguments or you allow
>non-typechecked aruments ? Can it handle *varargs and **kwdargs in the
>original function ? An orthogonal extension is to support 'templated
>types' (ala C++), so that you can check if something is 'a dict with
>string keys and lists of integers for values'. I would post my module
>here or the cookbook but at 560 (commented) lines it's a bit long to
>qualify for a recipe :-)
>
>George

Sounds like your version does quite a bit more than my little test
functions. :)  

I question how far type checking should go before you are better off
with a confirmtypes() function that can do a deep type check. And then
how much flexibility should that have? 

My viewpoint is that type checking should be available to the
singleton types, with conversions only if data integrity can be
ensured, i.e. the conversion is reversible with an "identical" result
returned.

def type_convert(a, t):
    # convert a to type t only if the conversion is reversible
    b = t(a)
    aa = type(a)(b)      # convert back to the original type
    if a == aa:          # no information was lost
        return b
    else:
        raise TypeError

In cases where a conversion is wanted, but type checking gives an
error, an explicit conversion function or method should be used.  

In containers, and more complex objects, deep type checking should be
available through a general function which can compare an object to a
template of types, specific to that object. It's important to use a
template instead of a sample, because a sample could have been
changed. 
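
To make the template idea concrete, something along these lines is what I
have in mind (a rough sketch only; the name confirmtypes and its behaviour
are made up for illustration):

def confirmtypes(obj, template):
    # Deep type check: compare an object against a template built from
    # types, e.g. {str: [int]} describes a dict of str -> list of ints.
    if isinstance(template, type):
        return isinstance(obj, template)
    if isinstance(template, dict):
        kt, vt = template.items()[0]
        if not isinstance(obj, dict):
            return False
        for k, v in obj.items():
            if not (confirmtypes(k, kt) and confirmtypes(v, vt)):
                return False
        return True
    if isinstance(template, (list, tuple)):
        it = template[0]
        if not isinstance(obj, type(template)):
            return False
        for item in obj:
            if not confirmtypes(item, it):
                return False
        return True
    return False

print confirmtypes({'a': [1, 2], 'b': [3]}, {str: [int]})   # True
print confirmtypes({'a': [1, 'x']}, {str: [int]})           # False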

It's all about protecting the data content with a high degree of
confidence.  In general, 98% of the time the current python way would
be adequate, but those remaining 2% are important enough to warrant
the additional effort that type checking takes.

On another note, there's the possibility that type checking in python
source code could make writing a compiler easier.

Another idea is that of assigning a name a type preference, and then
overloading assignment to check for that first before changing
a name to point to a new object.  It could probably be done with a
second name dictionary in the namespace with {name:type} pairs. With that
approach you only need to give key variables a type, then they keep
that type preference until it's assigned a new type, or removed from
the list. The down side to this is that it could slow things down.
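
A toy sketch of the idea, using a dict subclass as the 'namespace' (purely
illustrative; real name binding obviously doesn't go through __setitem__
like this):

class TypedNamespace(dict):
    def __init__(self):
        dict.__init__(self)
        self._prefs = {}                 # name -> preferred type

    def set_type(self, name, t):
        self._prefs[name] = t

    def __setitem__(self, name, value):
        t = self._prefs.get(name)
        if t is not None and not isinstance(value, t):
            raise TypeError("%r prefers %s, got %s"
                            % (name, t.__name__, type(value).__name__))
        dict.__setitem__(self, name, value)

ns = TypedNamespace()
ns.set_type('count', int)
ns['count'] = 3                          # fine
try:
    ns['count'] = 'three'                # rejected
except TypeError, e:
    print e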

Cheers,
Ron

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: "specialdict" module

2005-04-03 Thread Michael Spencer
Georg Brandl wrote:
Hello,
in follow-up to the recent "dictionary accumulator" thread, I wrote a
little module with several subclassed dicts.
Comments (e.g. makes it sense to use super), corrections, etc.? Is this
PEP material?
Docstrings, Documentation and test cases are to be provided later.
mfg
Georg
Georg:
A few reactions:
1. Given that these are specializations, why not have:
class defaultvaluedict(dict):
...
class defaultfactorydict(dict):
...
rather than having to jump through hoops to make one implementation satisfy both 
cases

2. I would really prefer to have the default value specified in the constructor
I realize that this is tricky due to the kw arguments of dict.__init__, but I 
would favor either breaking compatibility with that interface, or adopting some 
workaround to make something like d= defaultvaluedict(__default__ = 0) possible.

One worksaround would be to store the default in the dict, not as an attribute 
of the dict.  By default the default value would be associated with the key 
"__default__", but that keyname could be changed for the (I guess very few) 
cases where that key conflicted with non-default content of the dict.  Then 
dict.__init__ would simply take __default__ = value as a keyword argument, as it 
does today, and __getitem__ for a missing key would return 
dict.__getitem__(self, "__default__")

Alternatively, you could provide factory functions to construct the defaultdict. 
 Someone (Michele?) recently posted an implementation of this

3. Can you work in the tally and listappend methods that started this whole 
thread off?

4. On super, no I don't think it's necessary or particularly desirable.  These 
specializations have a close association with dict.  dict.method(self,...) feels 
more appropriate in this case.

Michael
--
http://mail.python.org/mailman/listinfo/python-list


RE: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread Chirayu Krishnappa
My goal is to check for certain paths appearing in the current PATH (set by
a bunch of scripts run in some random order) and (1) rearrange some of them
so that they are in the "correct" order and (2) replace some for which I
have preferred alternatives.
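
In case it helps to see what I'm after, here is roughly the kind of
rearrangement I mean (a sketch with made-up directory names, assuming a
plain split on ';' is good enough):

def fix_path(path, preferred, replacements):
    # preferred: directories to move to the front, in order
    # replacements: {unwanted_dir: better_dir}
    parts = [p for p in path.split(';') if p]
    parts = [replacements.get(p, p) for p in parts]
    front = [p for p in preferred if p in parts]
    rest = [p for p in parts if p not in front]
    return ';'.join(front + rest)

print fix_path(r'c:\old\tools;c:\X;c:\Y',
               [r'c:\X'],
               {r'c:\old\tools': r'c:\new\tools'})
# -> c:\X;c:\new\tools;c:\Y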

The quote processing I saw cmd.exe do was discovered when I tried
C:\temp>dir \Pro
C:\temp>dir "\Program Files"
C:\temp>dir "\Program Files"\vim

...and it worked. That's when I placed something in c:\temp\a;b\c and set
path to begin with c:\temp"\a;b"\c;%path% and realized that cmd.exe picks it
up from there. (You can have more than 2 quotes in the path too.)

However, I noticed that

INCLUDE="c:\temp\a;b\c";%INCLUDE%

does not help the vc++ compiler find include files there. (PYTHONSTARTUP
could not find a python file there either.)

So in the end - this behavior is quirky. Google searches (and peeks into
code) suggest that each app has its own logic to process such paths. It
might make more sense to avoid using ;'s in paths.

I have not even tried to see what quirks there exist with unix.

I think I will stick to a simple splitting at the ;. Luckily all the
directories I am dealing with have nice names.

I was thinking that I was doing the wrong thing with the split(';') when I
noticed cmd.exe's behavior and was looking for something which most apps did
(mainly I just wanted to call whatever function python used to process
PATH). For now - it's not a big deal.

Thanks for your response,
Chirayu.

-Original Message-
From: Jeff Epler [mailto:[EMAIL PROTECTED] 
Sent: Sunday, April 03, 2005 8:02 AM
To: chirayuk
Cc: python-list@python.org
Subject: Re: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X',
'c:\\a; b']

if your goal is to search for files on a windows-style path environment
variable, maybe you don't want to take this approach, but instead wrap
and use the _wsearchenv or _searchenv C library functions
 
http://msdn.microsoft.com/library/en-us/vclib/html/_crt__searchenv.2c_._wsea
rchenv.asp

Incidentally, I peeked at the implementation of _searchenv in wine (an
implementation of the win32 API for Unix), and it doesn't do the
quote-processing that you say Windows does.  The msdn page doesn't give
the syntax for the variable either, which is pretty typical.  Do you
have an "official" page that discusses the syntax?

Jeff

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorater inside a function? Is there a way?

2005-04-03 Thread George Sakkis
>def define(func):
>if not ENABLE_TYPECHECKING:
>return lambda func: func
># else decorate func

A small correction: The argument of the decorator is not 'func' but the
parameter checks you want to enforce. A template for define would be:

def define(inputTypes, outputType):
if not ENABLE_TYPECHECKING:
return lambda func: func
def decorate(func):
def typecheckedFunc(*args,**kwds):
# TYPECHECK *args, **kwds HERE #
r = func(*args,**kwds)
# TYPECHECK r HERE #
return r
return typecheckedFunc
return decorate
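
For what it's worth, a minimal filling-in of the template and its usage
could look like this (purely illustrative, not the code from my module; it
only checks positional arguments with isinstance):

ENABLE_TYPECHECKING = True

def define(inputTypes, outputType):
    if not ENABLE_TYPECHECKING:
        return lambda func: func
    def decorate(func):
        def typecheckedFunc(*args, **kwds):
            for a, t in zip(args, inputTypes):
                if not isinstance(a, t):
                    raise TypeError("expected %s, got %r" % (t.__name__, a))
            r = func(*args, **kwds)
            if not isinstance(r, outputType):
                raise TypeError("bad return type: %r" % (r,))
            return r
        return typecheckedFunc
    return decorate

@define((int, int), int)
def add(x, y):
    return x + y

print add(1, 2)       # 3
add(1, 'two')         # raises TypeError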


Depending on how much flexibility you allow in inputTypes, filling in
the typechecking logic can be from easy to challenging. For example,
does typechecking have to be applied in all arguments or you allow
non-typechecked aruments ? Can it handle *varargs and **kwdargs in the
original function ? An orthogonal extension is to support 'templated
types' (ala C++), so that you can check if something is 'a dict with
string keys and lists of integers for values'. I would post my module
here or the cookbook but at 560 (commented) lines it's a bit long to
qualify for a recipe :-)

George

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Queue.Queue-like class without the busy-wait

2005-04-03 Thread Nick Craig-Wood
Paul Rubin  wrote:
>  Nick Craig-Wood <[EMAIL PROTECTED]> writes:
> > I believe futex is the thing you want for a modern linux.  Not
> > very portable though.
> 
>  That's really cool, but I don't see how it can be a pure userspace
>  operation if the futex has a timeout.  The kernel must need to keep
>  track of the timeouts.  However, since futexes can be woken by any
>  thread, the whole thing can be done with just one futex.  In fact the
>  doc mentions something about using a file descriptor to support
>  asynchronous wakeups, but it's confusing whether that applies here.

No it isn't pure user space, only for the non-contended case which for
most locks is the most frequent operation.

   Futex operation is entirely userspace for the non-contended
   case.  The kernel is only involved to arbitrate the contended
   case. As any sane design will strive for non-contension,
   futexes are also optimised for this situation.

-- 
Nick Craig-Wood <[EMAIL PROTECTED]> -- http://www.craig-wood.com/nick
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread Michael Spencer
chirayuk wrote:
However, I just realized that the following is also a valid PATH in
windows.

PATH=c:\A"\B;C"\D;c:\program files\xyz"
(The quotes do not need to cover the entire path)
Too bad!  What a crazy format!
So here is my handcrafted solution.
def WinPathList_to_PyList (pathList):
pIter = iter(pathList.split(';'))
OddNumOfQuotes = lambda x: x.count('"') % 2 == 1
def Accumulate (p):
bAcc, acc = OddNumOfQuotes(p), [p]
while bAcc:
p = pIter.next ()
acc.append (p)
bAcc = not OddNumOfQuotes (p)
return "".join (acc).replace('"','')
return [q for q in [Accumulate (p) for p in pIter] if q]
Does it work?
I get:
 >>> test2 = r'c:\A"\B;C"\D;c:\program files\xyz"'
 >>> WinPathList_to_PyList(test2)
 Traceback (most recent call last):
   File "", line 1, in ?
   File "pathsplit", line 31, in WinPathList_to_PyList
   File "pathsplit", line 27, in Accumulate
 StopIteration
 >>>
Also, on the old test case, I get:
 >>> WinPathList_to_PyList("""\"c:\\A;B";c:\\D;""")
 ['c:\\AB', 'c:\\D']
 >>>
Should the ';' within the quotes be removed?
So now I need to check if the os is windows.
Wishful thinking: It would be nice if something like this (taking care
of the cases for other OS's) made it into the standard library - the
interpreter must already be doing it.
This sort of 'stateful' splitting is a somewhat common task.  If you're feeling 
creative, you could write itertools.splitby(iterable, separator_func)

This would be a sister function to itertools.groupby (and possible derive from 
its implementation).  separator_func is a callable that returns True if the item 
is a separator, False otherwise.

splitby would return an iterator of sub-iterators (like groupby) defined by the 
items between split points

You could then implement parsing of crazy source like your PATH variable by 
implementing a stateful separator_func
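
A quick sketch of what such a splitby might look like, built on top of
groupby (separator items are simply dropped; only lightly tested):

from itertools import groupby

def splitby(iterable, separator_func):
    # yield sub-iterators of consecutive non-separator items, using
    # separator_func(item) -> bool to mark the split points
    for is_sep, group in groupby(iterable, separator_func):
        if not is_sep:
            yield group

print [list(g) for g in splitby("a,bb,,c", lambda ch: ch == ',')]
# -> [['a'], ['b', 'b'], ['c']]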

Michael
--
http://mail.python.org/mailman/listinfo/python-list


Re: "specialdict" module

2005-04-03 Thread Georg Brandl
Jeff Epler wrote:
> The software you used to post this message wrapped some of the lines of
> code.  For example:
>> def __delitem__(self, key):
>> super(keytransformdict, self).__delitem__(self,
>> self._transformer(key))

Somehow I feared that this would happen.

> In defaultdict, I wonder whether everything should be viewed as a
> factory:
> def setdefaultvalue(self, value):
> def factory(): return value
> self.setdefaultfactory(factory)

That's a reasonable approach. __init__ must be changed too, but this
shouldn't hurt too badly.

> and the "no-default" mode would either cease to exist, or 
> def cleardefault(self):
> def factory(): 
> raise KeyError, "key does not exist and no default defined"
> self.setdefaultfactory(factory)
> (too bad that the key isn't available in the factory, this degrades the
> quality of the error message)

That can be worked around with a solution in __getitem__, see below.

> if so, __getitem__ becomes simpler:
> __slots__ = ['_default']
> def __getitem__(self, key):
>   try:
>   return super(defaultdict, self).__getitem__(key)
>   except KeyError:
>   return self.setdefault(key, apply(*self._default))

You are peculating the kwargs. Also, apply() is on the verge of being
deprecated, so better not use it.

def __getitem__(self, key):
try:
return super(defaultdict, self).__getitem__(key)
except KeyError, err:
try:
return self.setdefault(key,
  self._default[0](*self._default[1],
   **self._default[2]))
except KeyError:
raise err

Although I'm not sure whether KeyError would be the right one to raise
(perhaps a custom error?).

> I don't ever have an itch for sorted dictionaries, as far as I can
> remember, and I don't immediately understand the use of
> keytransformdict.  Can you give an example of it?

See the thread "Case-insensitive dict, non-destructive, fast, anyone?",
starting at 04/01/05 12:38.

mfg
Georg
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorator Dissection

2005-04-03 Thread Ron_Adam
On Sun, 03 Apr 2005 08:32:09 +0200, "Martin v. Löwis"
<[EMAIL PROTECTED]> wrote:

>Ron_Adam wrote:
>> I wasn't aware that the form:
>> 
>>  result = function(args)(args)
>> 
>> Was a legal python statement.
>> 
>> So python has a built in mechanism for passing multiple argument sets
>> to nested defined functions! (click)  Which means this is a decorator
>> without the decorator syntax.
>
>No. There is no mechanism for passing multiple argument sets to
>nested functions. Instead, functions are objects, which can be
>assigned to variables, passed as arguments to other functions,
>and returned:

Yes there is, it's the stack python uses to interpret the byte code.
But it's the same mechanism that is used for passing arguments to
sequential function calls (objects) also.  The only difference is the
next function (object) is returned on the stack in the nested case.
Then the next argument is then put on to the stack (passed), before
the next function is called.  

How you view this depends on the frame of reference you use, I was
using a different frame of reference, which I wasn't sure was correct
at the time, but turns out is also valid.  So both view points are
valid. 

In any case, I now have a complete picture of how it works. Inside,
and out. Which was my goal.  :)

>> So this isn't a decorator question any more.  Each argument gets
>> passed to the next inner defined function,  via... a stack(?)  ;)
>
>No, functions are objects. Notice that in step 1, the object returned
>doesn't have to be a function - other things are callable, too, like
>types, classes, and objects implementing __call__.

They are objects: data structures containing program code and
data, which reside in memory and get executed by, in this case, a
byte code interpreter.  The interpreter executes the byte code in a
sequential manner, using a *stack* to call functions (objects), along
with their arguments.

For the record, I never had any trouble understanding the concept of
objects. I think I first started programming OOP in the mid '90's with
c++.

It was the sequence of events in the objects of the nested def
functions that I was trying to understand along with where the objects
get their arguments, which isn't obvious because of the levels of
indirect calling.

Thanks for the help Martin, it's always appreciated.  :)

Cheers,
Ron


>Regards,
>Martin

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Mark Winrock
Maurice Ling wrote:
Hi,
I'm a postgraduate and my project deals with a fair bit of text 
analysis. I'm looking for some libraries and tools that is geared 
towards text analysis (and text engineering). So far, the most 
comprehensive toolkit in python for my purpose is NLTK (natural language 
tool kit) by Edward Loper and Steven Bird, followed by mxTextTools. Are 
there any OSS tools out there that is more comprehensive than NLTK?

In the Java world, there is GATE (general architecture for text 
engineering) and it seems very impressive. Are there something like that 
for Python?

Thanks in advance.
Cheers
Maurice

You might try http://web.media.mit.edu/~hugo/montylingua/
"Liu, Hugo (2004). MontyLingua: An end-to-end natural
language processor with common sense. Available
at: web.media.mit.edu/~hugo/montylingua."
--
http://mail.python.org/mailman/listinfo/python-list


Re: "specialdict" module

2005-04-03 Thread Jeff Epler
The software you used to post this message wrapped some of the lines of
code.  For example:
> def __delitem__(self, key):
> super(keytransformdict, self).__delitem__(self,
> self._transformer(key))

In defaultdict, I wonder whether everything should be viewed as a
factory:
def setdefaultvalue(self, value):
def factory(): return value
self.setdefaultfactory(factory)

and the "no-default" mode would either cease to exist, or 
def cleardefault(self):
def factory(): 
raise KeyError, "key does not exist and no default defined"
self.setdefaultfactory(factory)
(too bad that the key isn't available in the factory, this degrades the
quality of the error message)

if so, __getitem__ becomes simpler:
__slots__ = ['_default']
def __getitem__(self, key):
  try:
  return super(defaultdict, self).__getitem__(key)
  except KeyError:
  return self.setdefault(key, apply(*self._default))

I don't ever have an itch for sorted dictionaries, as far as I can
remember, and I don't immediately understand the use of
keytransformdict.  Can you give an example of it?

Jeff


-- 
http://mail.python.org/mailman/listinfo/python-list

"specialdict" module

2005-04-03 Thread Georg Brandl
Hello,

in follow-up to the recent "dictionary accumulator" thread, I wrote a
little module with several subclassed dicts.

Comments (e.g. makes it sense to use super), corrections, etc.? Is this
PEP material?

Docstrings, Documentation and test cases are to be provided later.

mfg
Georg

--

class defaultdict(dict):
# _defaulttype: 0=no default, 1=defaultvalue, 2=defaultfactory
__slots__ = ['_defaulttype', '_default']

def __init__(self, *args, **kwargs):
self._defaulttype = 0

        super(defaultdict, self).__init__(*args, **kwargs)

def setdefaultvalue(self, value):
self._defaulttype = 1
self._default = value

def setdefaultfactory(self, factory, *args, **kwargs):
if not callable(factory):
raise TypeError, 'default factory must be a callable'
self._defaulttype = 2
self._default = (factory, args, kwargs)

def cleardefault(self):
self._defaulttype = 0

def __getitem__(self, key):
try:
return super(defaultdict, self).__getitem__(key)
except KeyError:
if self._defaulttype == 0:
raise
elif self._defaulttype == 1:
return self.setdefault(key, self._default)
else:
return self.setdefault(key,
self._default[0](*self._default[1], **self._default[2]))

class keytransformdict(dict):
__slots__ = ['_transformer']

def __init__(self, *args, **kwargs):
self._transformer = lambda x: x

        super(keytransformdict, self).__init__(*args, **kwargs)

def settransformer(self, transformer):
if not callable(transformer):
raise TypeError, 'transformer must be a callable'
self._transformer = transformer

def __setitem__(self, key, value):
        super(keytransformdict, self).__setitem__(self._transformer(key), value)

def __getitem__(self, key):
        return super(keytransformdict, self).__getitem__(self._transformer(key))

def __delitem__(self, key):
        super(keytransformdict, self).__delitem__(self._transformer(key))

class sorteddict(dict):
def __iter__(self):
        for key in sorted(super(sorteddict, self).__iter__()):
yield key

def keys(self):
return list(self.iterkeys())

def items(self):
return list(self.iteritems())

def values(self):
return list(self.itervalues())

def iterkeys(self):
return iter(self)

def iteritems(self):
return ((key, self[key]) for key in self)

def itervalues(self):
return (self[key] for key in self)

if __name__ == '__main__':
x = sorteddict(a=1, b=3, c=2)

print x.keys()
print x.values()
print x.items()
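
For reference, the accumulator cases from the original thread would then
look roughly like this (assuming the defaultdict class above):

# tallying
d = defaultdict()
d.setdefaultvalue(0)
for word in 'a b a c a b'.split():
    d[word] += 1
# now d == {'a': 3, 'b': 2, 'c': 1}

# collecting into lists
e = defaultdict()
e.setdefaultfactory(list)
for key, value in [('x', 1), ('y', 2), ('x', 3)]:
    e[key].append(value)
# now e == {'x': [1, 3], 'y': [2]}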
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Docorator Disected

2005-04-03 Thread Ron_Adam
On 3 Apr 2005 00:11:22 -0800, "El Pitonero" <[EMAIL PROTECTED]>
wrote:

>Martin v. Löwis wrote:

>Perhaps this will make you think a bit more:

Now my problem is convincing the group I do know it. LOL


>Another example:
>
>def f():
>   return f
>
>g = f()()()()()()()()()()()
>
>is perfectly valid.

Good example!  Yes, I realize it. As I said before I just haven't come
across this particular variation before using decorators so it wasn't
clear to me at first, it is now. :)

Read my reply to Bengt Richter.

Thanks, this has been a very interesting discussion.

Ron

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Docorator Disected

2005-04-03 Thread Ron_Adam
On Sun, 03 Apr 2005 07:53:07 GMT, [EMAIL PROTECTED] (Bengt Richter) wrote:


>>No, I did not know that you could pass multiple sets of arguments to

>That phraseology doesn't sound to me like your concept space is quite 
>isomorphic
>with reality yet, sorry ;-) 

You'll be happy to know, my conceptual conceptions are conclusively
isomorphic this morning.  :-)

>It sounds like you are thinking of "multiple sets of arguments"
>as an aggregate that is passed as such, and that isn't happening, as I believe 
>El Pitonero
>is trying to indicate with his parenthesized visualization below.

Well there are multiple sets of arguments, and there are multiple
functions involved. It's just a matter of how they get matched up.
Depending on what level you look at it, it could be both ways. But the
correct way to view it is in the context of the language itself, and
not the underlying byte code, c++ or assembly code.

>What is happening is that an expression "foo(2)(6)" is being evaluated left to 
>right.
>First foo as a name evaluates to whatever it is bound to, which is the foo 
>function.
>Then () is the calling operator, which says evaluate the list inside the 
>parens left to right
>and call the thing you had so far, which was foo here. The arg list was just 
>2, so foo is called
>with 2, and foo returns something, with which we will do the next operation if 
>there is one.

Like this of course:

def foo(x):
    def fee(y):
        return y*x
    return fee

statement:  z = foo(2)(6)
becomes:    z = fee(6)
becomes:    z = 12

The position of the 'def fee' inside of 'def foo' isn't relevant, it's
only needed there so it can have access to foo's name space. It could
be at the top or bottom of the function it is in, and it wouldn't make
a difference.

This would be the same without the nesting:

def foo(xx):
    global x
    x = xx
    return fee

def fee(y):
    global x
    return y*x

z = foo(2)(6)


>So if you are seeing (2)(6) as something to pass, as opposed to a sequence of 
>operations, I think there's
>a misconception involved. Perhaps I am taking your words askew ;-)

It's not entirely a misconception. Lets see where this goes...

> >>> dis.dis(compiler.compile('foo(2)(6)','','eval'))
>   1           0 LOAD_NAME                0 (foo)
>               3 LOAD_CONST               1 (2)
>               6 CALL_FUNCTION            1
>               9 LOAD_CONST               2 (6)
>              12 CALL_FUNCTION            1
>              15 RETURN_VALUE

In this example, you have byte code that was compiled from source
code, and then an interpreter running the byte code, which in itself
is a program written in another language to execute the byte code,
C++; which gets translated into yet another language, assembly; which
at one time would have corresponded to specific hardwired registers
and circuits,(I could go further...ie... translators... PNP...
holes...), but with modern processors, it may yet get translated still
further.  

While all of this isn't relevant, it's knowledge in my mind, and
affects my view of programming sometimes.

Now take a look at the following descriptions of the above byte codes
from http://docs.python.org/lib/bytecodes.html


LOAD_NAME    namei
Pushes the value associated with "co_names[namei]" onto the stack.

LOAD_CONST    consti
Pushes "co_consts[consti]" onto the stack. 

CALL_FUNCTION    argc
Calls a function. The low byte of argc indicates the number of
positional parameters, the high byte the number of keyword parameters.
On the stack, the opcode finds the keyword parameters first. For each
keyword argument, the value is on top of the key. Below the keyword
parameters, the positional parameters are on the stack, with the
right-most parameter on top. Below the parameters, the function object
to call is on the stack. 

RETURN_VALUE
Returns with TOS to the caller of the function. 

*TOS = Top Of Stack.

The calling routine, puts (passes) the second set of arguments onto
the stack before calling the function returned on the stack by the
previous call.

Which is exactly how I viewed it when I referred to coming full circle
and the second set of arguments being passed with a "stack(?)".

Or it could be said equally the functions (objects) are passed with
the stack. So both views are correct depending on the viewpoint that
is chosen.

Cheers,
Ron


>HTH
>
>Regards,
>Bengt Richter

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: re module non-greedy matches broken

2005-04-03 Thread André Malo
* lothar wrote:

> re:
> 4.2.1 Regular Expression Syntax
> http://docs.python.org/lib/re-syntax.html
> 
>   *?, +?, ??
>   Adding "?" after the qualifier makes it perform the match in non-greedy
>   or
> minimal fashion; as few characters as possible will be matched.
> 
> the regular expression module fails to perform non-greedy matches as
> described in the documentation: more than "as few characters as possible"
> are matched.
> 
> this is a bug and it needs to be fixed.

The documentation is just incomplete. Non-greedy regexps still start
matching the leftmost. So instead the longest of the leftmost you get the
shortest of the leftmost. One may consider this as a documentation bug,
yes.
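
A quick demonstration with the pattern from the report:

import re
# the match must start at the leftmost possible 'V'; only from that fixed
# starting point does the engine then take as few characters as possible
print re.findall("V.*?W", "V1WVVV2WWW")
# -> ['V1W', 'VVV2W'], not ['V1W', 'V2W']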

nd
-- 
# André Malo,  #
--
http://mail.python.org/mailman/listinfo/python-list


re module non-greedy matches broken

2005-04-03 Thread lothar
re:
4.2.1 Regular Expression Syntax
http://docs.python.org/lib/re-syntax.html

  *?, +?, ??
  Adding "?" after the qualifier makes it perform the match in non-greedy or
minimal fashion; as few characters as possible will be matched.

the regular expression module fails to perform non-greedy matches as
described in the documentation: more than "as few characters as possible"
are matched.

this is a bug and it needs to be fixed.

examples follow.

[EMAIL PROTECTED] /ntd/vl
$ cat vwre.py
#! /usr/bin/env python

import re

vwre = re.compile("V.*?W")
vwlre = re.compile("V.*?WL")

if __name__ == "__main__":

  newdoc = "V1WVVV2WWW"
  vwli = re.findall(vwre, newdoc)
  print "vwli[], expect", ['V1W', 'V2W']
  print "vwli[], return", vwli

  newdoc = "V1WLV2WV3WV4WLV5WV6WL"
  vwlli = re.findall(vwlre, newdoc)
  print "vwlli[], expect", ['V1WL', 'V4WL', 'V6WL']
  print "vwlli[], return", vwlli

[EMAIL PROTECTED] /ntd/vl
$ python vwre.py
vwli[], expect ['V1W', 'V2W']
vwli[], return ['V1W', 'VVV2W']
vwlli[], expect ['V1WL', 'V4WL', 'V6WL']
vwlli[], return ['V1WL', 'V2WV3WV4WL', 'V5WV6WL']

[EMAIL PROTECTED] /ntd/vl
$ python -V
Python 2.3.3


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread Jeff Epler
if your goal is to search for files on a windows-style path environment
variable, maybe you don't want to take this approach, but instead wrap
and use the _wsearchenv or _searchenv C library functions

http://msdn.microsoft.com/library/en-us/vclib/html/_crt__searchenv.2c_._wsearchenv.asp

Incidentally, I peeked at the implementation of _searchenv in wine (an
implementation of the win32 API for Unix), and it doesn't do the
quote-processing that you say Windows does.  The msdn page doesn't give
the syntax for the variable either, which is pretty typical.  Do you
have an "official" page that discusses the syntax?

Jeff


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: Python Cookbook

2005-04-03 Thread rdsteph
I want to just second this comment by Heike. I received my copy of the
2nd Edition from O'Reilly on Friday. I am still working my way slowly
through the first chapter on Text, and  I am nearing the end of that
chapter.

I intend to work my way through sequentially, because I can't think of
a better way to improve my understanding of intelligent usage of
Python. So far, even the recipes and discussions that are a stretch and
difficult for me are beginning to make sense after I read carefully and
sometimes re-read the discussions. I couldn't ask for better
instruction, and the effort I make to learn is richly rewarded.

I know that the material will become considerably more complex as I
move deeper into the book, but the work of the authors and editors in
chapter one gives me faith that I can make progress as I go, as long as
I make the effort to really try to understand each recipe's discussion.


This will no doubt be a valued reference work, but I am finding it to
be a uniquely rewarding educational experience.

Ron Stephens
www.awaretek.com/plf.html

-- 
http://mail.python.org/mailman/listinfo/python-list


On slashdot

2005-04-03 Thread bearophileHUGS
There is a discussion about "Python Moving into the Enterprise" on
Slashdot:

http://it.slashdot.org/it/05/04/03/0715209.shtml?tid=156&tid=8

Bearophile

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread Paul Rubin
"Martin v. Löwis" <[EMAIL PROTECTED]> writes:
> Out of curiosity: when thinking about Python 3.0, what is the timespan
> in which you expect that to appear? Before 2010? After 2010? After 2020?

I'm not terribly worried about Python 3.0 incompatibilities, whenever
those are.  There are already three incompatible Python versions
(CPython, Jython, IronPython) with PyPy coming right along.  If any of
those newer ones take off in popularity, there's going to be much more
interoperability hassle than 3.0 will cause.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread "Martin v. Löwis"
Andreas Beyer wrote:
If I am getting the docs etc. correctly, the string-module is depricated 
and is supposed to be removed with the release of Python 3.0.
I still use the module a lot and there are situations in which I don't 
know what to do without it. Maybe you can give me some help.
Out of curiosity: when thinking about Python 3.0, what is the timespan
in which you expect that to appear? Before 2010? After 2010? After 2020?
Regards,
Martin
--
http://mail.python.org/mailman/listinfo/python-list


Re: Docorator Disected

2005-04-03 Thread Ron_Adam
On Sun, 03 Apr 2005 08:37:02 +0200, "Martin v. Löwis"
<[EMAIL PROTECTED]> wrote:

>Ron_Adam wrote:
>>>Ah, so you did not know functions are objects just like numbers,
>>>strings or dictionaries. I think you may have been influenced by other
>>>languages where there is a concept of static declaration of functions.
>> 
>> 
>> No, I did not know that you could pass multiple sets of arguments to
>> nested defined functions in that manner.  
>
>Please read the statements carefully, and try to understand the mental
>model behind them. He did not say that you can pass around multiple
>sets of arguments. He said that functions (not function calls, but
>the functions themselves) are objects just like numbers. There is
>a way of "truly" understanding this notion, and I would encourage
>you to try doing so.

Hello Martin,

It is interesting how sometimes what we already know, and a new
situation presented in an indirect way, can lead us to viewing an
isolated situation in a biased way.

That's pretty much the situation I've experienced here with this one
point. I already knew that functions are objects, and objects can be
passed around.  My mind just wasn't clicking on this particular set of
conditions for some reason, probably because I was looking too closely
at the problem. 

(Starting off as a tech, with knowledge of how microchips work, can
sometimes be an obstacle when programming in high level languages.)

I'm sure I'm not the only one who's had difficulties with this.  But
I'm somewhat disappointed in myself for not grasping the concept as it
is, in this particular context, a bit sooner. 

Cheers,
Ron


>Regards,
>Martin

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread "Martin v. Löwis"
Andreas Beyer wrote:
Yeeh, I was expecting something like that. The only reason to use map() 
at all is for improving the performance.
That is lost when using list comprehensions (as far as I know). So, this 
is *no* option for larger jobs.
Don't believe anything you hear right away, especially not when it comes
to performance, especially not wrt. Python.
[EMAIL PROTECTED]:~$ python -m timeit -s "items=['']*1;import string" 
"map(string.upper, items)"
100 loops, best of 3: 6.32 msec per loop
[EMAIL PROTECTED]:~$ python -m timeit -s "items=['']*1;import string" 
"[s.upper for s in items]"
100 loops, best of 3: 2.22 msec per loop

So using map is *no* option for larger jobs.
Of course that statement is also false. Performance prediction is very
difficult, and you cannot imply much from this benchmark. In other
cases, list comprehension may be slower than map. More likely, for real
(i.e. non-empty) strings, the cost of .upper will make the precise
implementation of the loop irrelevant for performance reasons.
Regards,
Martin
--
http://mail.python.org/mailman/listinfo/python-list


Re: string goes away

2005-04-03 Thread John J. Lee
Duncan Booth <[EMAIL PROTECTED]> writes:
[...]
>str.join(sep, list_of_str)
[...]

Doesn't work with unicode, IIRC.



John
-- 
http://mail.python.org/mailman/listinfo/python-list


Python Cookbook

2005-04-03 Thread Heiko Wundram
Hi all!

I've received my copy of the Python Cookbook two days ago, and just thought 
that I might independently commend all you editors and recipe designers out 
there to an excellent book! I've thoroughly enjoyed reading the introductions 
in each chapter, and although I've been programming in Python for four years 
now, I've seen quite a few idioms that are new for me as well.

The book is pretty much just like this newsgroup: surprises me each and every 
time again how many smart people's comments are contained within. :-)

Keep up the good work! And thanks Alex for making the second edition possible!

-- 
--- Heiko.
listening to: Wir Sind Helden - Ruessel an Schwanz
  see you at: http://www.stud.mh-hannover.de/~hwundram/wordpress/


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: text analysis in python

2005-04-03 Thread Maurice LING
		.
I don't know if you're aware that, in a fairly strong sense,
anything "[i]n the Java world" *is* "for Python".  If you
program with Jython (for example--there are other ways to
achieve much the same end), your source code can be in
Python, but you have full access to any library coded in Java.
Yes, I do know the presence of Jython but had not used it in any 
productive ways. So I might need some assistance here... Say I code my 
stuff in Jython (importing java libraries) in a file "text.py"... Will 
there be any issues when I try to import text.py into CPython?

My impression is that NLTK is more of a teaching tool rather than for 
production use. Please correct me if I'm wrong... The main reason I'm 
looking at NLTK is that it is pure python and is about the comprehensive 
text analysis toolkit in python. Are there any projects that uses NLTK?

Thanks and Cheers
Maurice
--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorater inside a function? Is there a way?

2005-04-03 Thread Ron_Adam
On 3 Apr 2005 00:20:32 -0800, "George Sakkis" <[EMAIL PROTECTED]>
wrote:

>Yes, it is possible to turn off type checking at runtime; just add this
>in the beginning of your define:
>
>def define(func):
>if not ENABLE_TYPECHECKING:
>return lambda func: func
># else decorate func
>
>where ENABLE_TYPECHECKING is a module level variable that can be
>exposed to the module's clients. In my module, the default is
>ENABLE_TYPECHECKING = __debug__.
>
>
>George

Cool, I'll try that. 

Thanks,
Ron

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: the bugs that try men's souls

2005-04-03 Thread Jordan Rastrick
I think I found your bug, although it took a bit of time, a fair bit of
thought, and a fair bit of extra test-framework code - your program is
very concise, reasonably complex, and very unreadable. It's perfect for
testing maths theorems of your own interest, but you probably should
have polished it up a little, and explained a little more precisely
what the actual lines of code were supposed to be doing (as distinct
what mathematical ideas they were testing) before posting it to a forum
with a request for others to debug it.

I sympathise though - I'm a Maths undergraduate student myself, and I
often write dodgy, buggy code like this in high level languages like
Python, Haskell etc to test ideas :)

Anyway, from what I can tell the problem is with the line

print '%s %5s %3s' %(str([a,b]),int([p[a],p[b]] in s),int([p[t[a]],p[t[b]]] in s))

I've assumed/deduced you mean for the expression:
([p[a],p[b]] in s)
to test whether (a,b) is an inversion of pt, i.e. is [pt(a),pt(b)] in
the collection s of stepup pairs (that's according to my understanding
of the terms you've used). But s is not the complete list of stepup pairs
if you apply the 'xor filter' in the line you've labeled #MYSTERIOUSLY
BROKEN; it's the list of stepup pairs sharing a co-ordinate with [a,b].
So you correctly identified the source of the bug as being at that
line.

I think (guess) what you're really trying to do is filter at the
printing stage, so you're just printing those cases where {a,b} and
{x,y} share one and only one element, and ignoring the other trivial
cases - I'm inferring this from the fact you don't think the 'answers'
should change, just, presumably, the amount of output. This works:

def feedback(p,t):
    ## GIVEN A PERMUTATION f AND A STEPUP PAIR s WITH COORDINATES IN THE RANGE OF f,
    ## SHOW THE INTEGER CASE OF THE PROPOSITION << f(s) IS A STEPUP PAIR >>
    k = 18
    moved = [i for i in range(len(t)) if t[i]<>i]
    g,h = min(moved), max(moved)
    n = len(p) - 1
    s = stepups(n)
    print '-'*k
    print 'p: ' + str(range(n+1)) + '\n   ' + str(p)
    print '-'*k
    print 't = ' + str((g,h))
    print '-'*k
    print '%s %7s %3s' %('pair','p','pt') + '\n' + '-'*k
    for [a,b] in s:
        if xor(g in [a,b], h in [a,b]):
            print ([p[a],p[b]]), ([p[t[a]],p[t[b]]])
            print '%s %5s %3s' %(str([a,b]),int([p[a],p[b]] in s),int([p[t[a]],p[t[b]]] in s))
    print '-'*k

You can replace g and h if you want, they were pretty arbitrary
letters, but whatever you do don't use 'a' and 'b' twice like your
original code did; it's awful programming practice. Even in Maths you
wouldn't let one pronumeral stand for two different quantities in the
same proof, it's a recipe for disaster.

If this was wrong, and you really did want to test for inversion only
against the reduced set of pairs, a more complete explanation of what
kind of 'wrong answers' you are getting and what kind of 'right
answers' you were expecting might help. As far as I can tell though,
it's quite natural the answers will be different when testing against a
subset of the stepup pairs when compared to testing against the whole
set.

Cheers,
Jordan

P.S. Oh, and if you come up with a proof of the proposition, let me
know, I'd like to see it :)

Sean McIlroy wrote:
> This needs some background so bear with me.
>
> The problem: Suppose p is a permutation on {0...n} and t is the
> transposition that switches x and y [x,y in {0...n}]. A "stepup pair"
> (just a term I invented) for p is a pair (a,b) of integers in {0...n}
> with a<b. Call (a,b) an "inversion" of p iff (p(a),p(b)) is NOT a stepup
> pair. Now, if {a,b}={x,y}, then
> clearly (a,b) is an inversion of p iff it is NOT an inversion of pt
> (functional composition). Also, if {a,b} and {x,y} are disjoint, then
> (a,b) is an inversion of p iff it is an inversion of pt. The remaining
> cases are the ones where {a,b} and {x,y} have a single element in
> common, and of these, there are exactly as many inversions of p as
> there are of pt, though in general it is not the same set of stepup
> pairs for each function.
>
> The code below represents my attempt to apply python toward getting
> insight into why the number of inversions, with exactly one coordinate
> in {x,y}, is the same for p and pt. The problem with the code is that
> if, at the relevant line ("MYSTERIOUSLY BROKEN"), I use the
> commented-out expression instead of the expression that's actually
> there, then in some cases the script gives a DIFFERENT ANSWER to the
> question whether a given pair is or is not an inversion of p
> respectively pt.
>
> I'd sure like to know what's going wrong with this. Here's the code:
>
>
> ## STEPUP PAIR: AN ORDERED PAIR (x,y) OF INTEGERS WITH x<y
> stepups = lambda n: n and stepups(n-1) + [[x,n] for x in range(n)] or []
> xor = lambda x,y: (x and not y) or (y and not x)
>
> def feedback(p,t):
> ## GIVEN A PERMUTATION f AND A STEPUP PAIR s WITH COORDINATES IN THE RANGE OF f,
> ## SHOW THE INTEGER CASE OF THE PROPOSITION << f(s) IS A STEPUP PAIR >>

Re: text analysis in python

2005-04-03 Thread beliavsky
The book "Text Processing in Python" by David Mertz, available online
at http://gnosis.cx/TPiP/ , may be helpful.

-- 
http://mail.python.org/mailman/listinfo/python-list


Newsgroup Programming

2005-04-03 Thread Chuck
I've found and used the nntplib module for newsgroup programming.  Can anyone 
suggest a library, technique or reference on how to combine multiple 
messages with attachments such as mp3's, .wmv, *.avi, etc.? 


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Roy Smith
In article <[EMAIL PROTECTED]>,
 Paul Rubin  wrote:

> "Raymond Hettinger" <[EMAIL PROTECTED]> writes:
> > When writing a large suite, you quickly come to appreciate being able
> > to use assert statements with regular comparison operators, debugging
> > with normal print statements, and not writing self.assertEqual over and
> > over again.  The generative tests are especially nice.
> 
> But assert statements vanish when you turn on the optimizer.  If
> you're going to run your application with the optimizer turned on, I
> certainly hope you run your regression tests with the optimizer on.

That's an interesting thought.  In something like C++, I would never think 
of shipping anything other than the exact binary I had tested ("test what 
you ship, ship what you test").  It's relatively common for turning on 
optimization to break something in mysterious ways in C or C++.  This is 
both because many compilers have buggy optimizers, and because many 
programmers are sloppy about depending on uninitialized values.

But, with something like Python (i.e. high-level interpreter), I've always 
assumed that turning optimization on or off would be a much safer 
operation.  It never would have occurred to me that I would need to test 
with optimization turned on and off.  Is my faith in optimization misguided?

Of course, all of the Python I write is for internal use; I haven't yet 
been able to convince an employer that we should be shipping Python to 
customers.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: A ClientForm Question

2005-04-03 Thread Francesco
Il Fri, 01 Apr 2005 02:36:24 -0800, narke ha scritto:

> Does anyone here use ClientForm to handle a HTML form on client side?
> 
> I got a form, within which there is a image control, it direct me to
> another page if i use mouse click on it.  the code of the form as
> below:
> 
>  action="CDocZ_MAG.aspx?Stat=DocZoom_DocZoom&&E=29YL53ZJBIEZ&DT=ALB&Pass=&Total=104&Pic=1&o="
> id="ZoomControl1_Form1" onkeydown="JavaScript:Navigation_ie();">
> 
> ...
> 
>  language="javascript" id="ZoomControl1_Imagebutton2"
> src="../Images/Btn_GoImage.gif" border="0" /> 
> 
> ...
> 
> 
> 
> So write below code to 'click' the image button,
> 
> forms = ParseResponse(urlopen(url))
> 
> form = forms[0]
> urlopen(form.click("ZoomControl1:Imagebutton2"))
> 
> unfortunatly, however, when the code run, it just got a page which is
> not the one i desired ( i actually wish to get the same page as i
> 'click' the button).  I guess that is "onclick=" statement cause
> something weird, but I do not understand it.  And, in the source
> containing the form, i found nowhere the Page_ClientValidate() resides.
> 
> What's wrong?
> 
> -
> narke

Similar problem for me.
In the form, i have

and i don't know how to click this.
urlopen(form.click()) does nothing.
UserForm is the name of the form.

Francesco
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: StopIteration in the if clause of a generator expression

2005-04-03 Thread Raymond Hettinger
[Peter Otten]
> Do you see any chance that list comprehensions will be redefined as an
> alternative spelling for list()?

Not likely.  It is possible that the latter spelling would make it possible for
Py3.0 to eliminate list comps entirely.  However, they are very popular and
practical, so my bet is that they will live on.

The more likely change is that in Py3.0 list comps will no longer expose the
loop variable outside the loop.


Raymond Hettinger



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: text analysis in python

2005-04-03 Thread Cameron Laird
In article <[EMAIL PROTECTED]>,
Maurice Ling  <[EMAIL PROTECTED]> wrote:
.
.
.
>In the Java world, there is GATE (general architecture for text 
>engineering) and it seems very impressive. Are there something like that 
>for Python?
.
.
.
I don't know if you're aware that, in a fairly strong sense,
anything "[i]n the Java world" *is* "for Python".  If you
program with Jython (for example--there are other ways to
achieve much the same end), your source code can be in
Python, but you have full access to any library coded in Java.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Paul Rubin
"Raymond Hettinger" <[EMAIL PROTECTED]> writes:
> When writing a large suite, you quickly come to appreciate being able
> to use assert statements with regular comparison operators, debugging
> with normal print statements, and not writing self.assertEqual over and
> over again.  The generative tests are especially nice.

But assert statements vanish when you turn on the optimizer.  If
you're going to run your application with the optimizer turned on, I
certainly hope you run your regression tests with the optimizer on.
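
For example (standard CPython behaviour, shown as a shell sketch):

$ python -c "assert 1 == 2, 'broken'"
Traceback (most recent call last):
  File "<string>", line 1, in ?
AssertionError: broken
$ python -O -c "assert 1 == 2, 'broken'"
$                                  # -O compiles the assert away, so no error
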
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: unittest vs py.test?

2005-04-03 Thread Raymond Hettinger
[Peter Hansen]
> (I'm not dissing py.test, and intend to check it
> out.

Not to be disrespectful, but objections raised by someone
who hasn't worked with both tools equate to hot air.


> I'm just objecting to claims that unittest
> somehow is "heavy", when those claiming that it
> is seem to think you have to use TestSuites and
> TestRunner objects directly... I think they've
> overlooked the relatively lightweight approach
> that has worked so well for me for four years...)

Claiming?  Overlooked?  You do know that I wrote the
example in the unittest docs, the tutorial example, and hundreds
of the test cases in the standard library. It is not an
uninformed opinion that the exposed object model for
unittest is more complex.

As for "heaviness", it is similar to comparing alkaline AA
batteries to lithium AA batteries.  The former isn't especially heavy,
but it does weigh twice as much as the latter.  It only becomes a
big deal when you have to carry a dozen battery packs on a hiking
trip.  My guess is that until you've written a full test suite with
py.test, you won't get it.  There is a distinct weight difference between
the packages -- that was their whole point in writing a new testing tool
when we already had two.

When writing a large suite, you quickly come to appreciate being able
to use assert statements with regular comparison operators, debugging
with normal print statements, and not writing self.assertEqual over and
over again.  The generative tests are especially nice.
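
A rough side-by-side sketch (the test content here is invented; the yield
form is py.test's generative-test convention):

# unittest: subclass TestCase, use the camelCase assert methods
import unittest

class RoundTripTest(unittest.TestCase):
    def test_int(self):
        self.assertEqual(int("42"), 42)

# py.test: a bare function, a plain assert, ordinary prints for debugging
def test_int():
    assert int("42") == 42

# a generative test: yield one check per input value
def test_many():
    for s in "0", "7", "42":
        yield check_roundtrip, s

def check_roundtrip(s):
    assert str(int(s)) == s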

Until you've exercised both packages, you haven't helped the OP
whose original request was:  "Is there anybody out there who has
used both packages and can give a comparative review?"


Raymond


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Help with splitting

2005-04-03 Thread Reinhold Birkenfeld
George Sakkis wrote:

> If you don't want any null strings at the beginning or the end, an
> equivalent regexp is:
> 
> >>> whitespaceSplitter_2 = re.compile("\w+|\s+")
> >>> whitespaceSplitter_2.findall("1 2  3   \t\n5")
> ['1', ' ', '2', '  ', '3', '   \t\n', '5']
> >>> whitespaceSplitter_2.findall(" 1 2  3   \t\n5 ")
> [' ', '1', ' ', '2', '  ', '3', '   \t\n', '5', ' ']

Perhaps you may want to use "\s+|\S+" if you have non-alphanumeric
characters in the string.
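
For instance (a quick interactive check on a made-up string):

>>> import re
>>> re.findall(r"\w+|\s+", "foo-bar baz")
['foo', 'bar', ' ', 'baz']
>>> re.findall(r"\s+|\S+", "foo-bar baz")
['foo-bar', ' ', 'baz']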

Reinhold
-- 
http://mail.python.org/mailman/listinfo/python-list


text analysis in python

2005-04-03 Thread Maurice Ling
Hi,
I'm a postgraduate and my project deals with a fair bit of text 
analysis. I'm looking for some libraries and tools that are geared 
towards text analysis (and text engineering). So far, the most 
comprehensive toolkit in Python for my purpose is NLTK (the natural language 
toolkit) by Edward Loper and Steven Bird, followed by mxTextTools. Are 
there any OSS tools out there that are more comprehensive than NLTK?

In the Java world, there is GATE (general architecture for text 
engineering) and it seems very impressive. Is there something like that 
for Python?

Thanks in advance.
Cheers
Maurice


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: redundant imports

2005-04-03 Thread Serge Orlov
Mike Meyer wrote:

> The semantic behavior of "include" in C is the same as "from module
> import *" in python. Both cases add all the names in the included
> namespace directly to the including namespace. This usage is
> depreciated in Python ...

 Did you mean discouraged? Or is it really slated for deprecation?

  Serge.


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: the bugs that try men's souls

2005-04-03 Thread Serge Orlov
Sean McIlroy wrote:
> This needs some background so bear with me.
>
> The problem: Suppose p is a permutation on {0...n} and t is the
> transposition that switches x and y [x,y in {0...n}]. A "stepup pair"
> (just a term I invented) for p is a pair (a,b) of integers in {0...n}
> with a < b. Call (a,b) an "inversion" of p iff (p(a),p(b)) is NOT a stepup
> pair. Now, if {a,b}={x,y}, then
> clearly (a,b) is an inversion of p iff it is NOT an inversion of pt
> (functional composition). Also, if {a,b} and {x,y} are disjoint, then
> (a,b) is an inversion of p iff it is an inversion of pt. The remaining
> cases are the ones where {a,b} and {x,y} have a single element in
> common, and of these, there are exactly as many inversions of p as
> there are of pt, though in general it is not the same set of stepup
> pairs for each function.
>
> The code below represents my attempt to apply python toward getting
> insight into why the number of inversions, with exactly one coordinate
> in {x,y}, is the same for p and pt. The problem with the code is that
> if, at the relevant line ("MYSTERIOUSLY BROKEN"), I use the
> commented-out expression instead of the expression that's actually
> there, then in some cases the script gives a DIFFERENT ANSWER to the
> question whether a given pair is or is not an inversion of p
> respectively pt.

[snip the code]

Can you post a unit test that fails? Otherwise it's not clear what you mean
by saying "mysteriously broken" and "different answer".

  Serge.


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Corectly convert from %PATH%=c:\\X; "c:\\a; b" TO ['c:\\X', 'c:\\a; b']

2005-04-03 Thread chirayuk
Michael Spencer wrote:
> chirayuk wrote:
> > Hi,
> >
> > I am trying to treat an environment variable as a python list - and I'm
> > sure there must be a standard and simple way to do so. I know that the
> > interpreter itself must use it (to process $PATH / %PATH%, etc) but I
> > am not able to find a simple function to do so.
> >
> > os.environ['PATH'].split(os.pathsep) is wrong on Windows for the case when
> > PATH="c:\\A;B";c:\\D;
> > where there is a ';' embedded in the quoted path.
> >
> > Does anyone know of a simple way (addons ok) which would do it in a
> > cross platform way? If not - I will roll my own. My search has shown
> > that generally people just use the simple split method as above and
> > leave it there but it seemed like such a common operation that I
> > believe there must be a way out for it which I am not seeing.
> >
> > Thanks,
> > Chirayu.
> >
> You may be able to bend the csv module to your purpose:
>
>
>   >>> test = """\"c:\\A;B";c:\\D;"""
>   >>> test1 = os.environ['PATH']
>   >>> import csv
>   >>> class path(csv.excel):
>   ... delimiter = ';'
>   ... quotechar = '"'
>   ...
>   >>> csv.reader([test],path).next()
>   ['c:\\A;B', 'c:\\D', '']
>   >>> csv.reader([test1],path).next()
>   ['C:\\WINDOWS\\system32', 'C:\\WINDOWS', 'C:\\WINDOWS\\System32\\Wbem',
>   'C:\\Program Files\\ATI Technologies\\ATI Control Panel',
>   'C:\\PROGRA~1\\ATT\\Graphviz\\bin', 'C:\\PROGRA~1\\ATT\\Graphviz\\bin\\tools',
>   'C:\\WINDOWS\\system32', 'C:\\WINDOWS', 'C:\\WINDOWS\\System32\\Wbem',
>   'C:\\Program Files\\ATI Technologies\\ATI Control Panel',
>   'C:\\PROGRA~1\\ATT\\Graphviz\\bin', 'C:\\PROGRA~1\\ATT\\Graphviz\\bin\\tools',
>   'c:\\python24', 'c:\\python24\\scripts', 'G:\\cabs\\python\\pypy\\py\\bin']
>   >>>
>
> HTH
> Michael

That is a cool use of the csv module.

However, I just realized that the following is also a valid PATH in
Windows.

PATH=c:\A"\B;C"\D;c:\program files\xyz"
(The quotes do not need to cover the entire path)

So here is my handcrafted solution.

def WinPathList_to_PyList (pathList):
    # Split on ';' and re-join any pieces that fall inside a quoted section
    # (detected by an odd number of '"' characters in a piece).
    pIter = iter(pathList.split(';'))
    OddNumOfQuotes = lambda x: x.count('"') % 2 == 1
    def Accumulate (p):
        # Keep pulling pieces until the quotes balance again.
        bAcc, acc = OddNumOfQuotes(p), [p]
        while bAcc:
            p = pIter.next ()
            acc.append (p)
            bAcc = not OddNumOfQuotes (p)
        # Join the accumulated pieces and strip the quote characters.
        return "".join (acc).replace('"','')
    return [q for q in [Accumulate (p) for p in pIter] if q]


So now I need to check if the OS is Windows.

Wishful thinking: It would be nice if something like this (taking care
of the cases for other OS's) made it into the standard library - the
interpreter must already be doing it.

Thanks,
Chirayu.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Name of IDLE on Linux

2005-04-03 Thread Peter Otten
Joal Heagney wrote:

> If you're using KDE, you can set a bookmark in konqueror to the
> documentation and it'll bring it up in the bookmark toolbar. Only hassle
> is when you update python and the docs, you have to edit the bookmark.

Or you can create a symlink to the documentation and bookmark that.
Another goodie is Konqueror's web shortcuts. I added one with the
URI file:/path_to_python_docs/lib/module-\{@}.html
and the shortcut pym, and now typing e. g.

pym os.path

in the address bar immediately brings up that module's documentation.

Peter

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorater inside a function? Is there a way?

2005-04-03 Thread George Sakkis
Yes, it is possible to turn off type checking at runtime; just add this
in the beginning of your define:

def define(func):
if not ENABLE_TYPECHECKING:
        return func   # hand the function back undecorated
# else decorate func

where ENABLE_TYPECHECKING is a module level variable that can be
exposed to the module's clients. In my module, the default is
ENABLE_TYPECHECKING = __debug__.
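
Here is a minimal sketch of the whole pattern (the decorator name and the
trivial "check" are just for illustration; the real module does proper
signature checking):

ENABLE_TYPECHECKING = __debug__        # module-level switch, settable by clients

def typechecked(func):
    if not ENABLE_TYPECHECKING:
        return func                    # checking disabled: return func untouched
    def wrapper(*args, **kwargs):
        for a in args:                 # stand-in check: reject None arguments
            if a is None:
                raise TypeError("%s: None argument" % func.__name__)
        return func(*args, **kwargs)
    return wrapper

@typechecked
def add(a, b):
    return a + b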


George

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: StopIteration in the if clause of a generator expression

2005-04-03 Thread Peter Otten
jfj wrote:

>> To make it a bit clearer, a StopIteration raised in a generator
>> expression silently terminates that generator:
> 
> *any* exception raised from a generator, terminates the generator

Yeah, but StopIteration is the only expected exception and therefore the
only one that client code (nearly) always knows to deal with:

>>> def choke(): raise ValueError
...
>>> list(i for i in range(10) if i < 3 or choke())
Traceback (most recent call last):
  File "", line 1, in ?
  File "", line 1, in 
  File "", line 1, in choke
ValueError
>>> [i for i in range(10) if i < 3 or choke()]
Traceback (most recent call last):
  File "", line 1, in ?
  File "", line 1, in choke
ValueError

Here you can *not* tell apart list(genexp) and listcomp.

(Of course, as has since been pointed out, the StopIteration is actually
caught in the list constructor, so there is nothing magic about the example
in my initial post.)

Peter

-- 
http://mail.python.org/mailman/listinfo/python-list

