Terry Reedy wrote:

It seems to me that generators are already 'channels' that connect the calling code to the __next__ method, a semi-coroutine based on the body of the generator function. At present, the next method waits until an object is requested. Then it goes into action, yields an object, and rests again. For parallel operations, we need eager, anticipatory evaluation that produces things that *will* be needed rather than lazy evaluation of things that *are* needed and whose absence is holding up everything else.

Yes, generators look very much like channels. The obvious thing, from where I'm sitting, is to have a function called "channel" that takes an iterator, runs it in a different thread/process/goroutine, and returns an iterator that reads from the channel. A single-threaded version would look very much like "iter", so let's use "iter" to get a working example:

#!/usr/bin/python2 -u

channel = iter # placeholder for the missing concurrent feature

def generate():
    # Yield the integers 2, 3, 4, ... forever.
    i = 2
    while True:
        yield i
        i += 1

def filter(input, prime):
    # Pass along only the values not divisible by prime.
    for i in input:
        if i % prime != 0:
            yield i

ch = channel(generate())
try:
    while True:
        # The first value to emerge from the chain is always prime.
        prime = ch.next()
        print prime
        # Chain on another filter that strips multiples of that prime.
        ch = channel(filter(ch, prime))
except IOError:
    # With -u, printing to a closed pipe raises IOError immediately,
    # which is how this otherwise endless loop terminates.
    pass

That works fine in a single thread. It's close to the original Go example, hence the evil shadowing of the "filter" builtin. I don't think the "channel" function would present any problems given an appropriate library to wrap. I got something like this working with Jython and the E language but, as I recall, had an accident and lost the code. If somebody wants to implement it using multiprocessing, go to it!
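
For the record, here is a minimal sketch of what a thread-based "channel" might look like, using a bounded Queue to ferry values between threads. The names "channel", "buffer_size" and the "_DONE" sentinel are my own invention, not from any existing library, so treat it as a starting point rather than a finished implementation:

import threading
from Queue import Queue # renamed "queue" in Python 3

_DONE = object() # sentinel marking the end of the wrapped iterator

def channel(iterable, buffer_size=16):
    # Iterate the wrapped iterable in a background thread; the bounded
    # queue gives eager, read-ahead evaluation without letting the
    # producer run arbitrarily far ahead of the consumer.
    q = Queue(buffer_size)
    def produce():
        for item in iterable:
            q.put(item)
        q.put(_DONE)
    t = threading.Thread(target=produce)
    t.setDaemon(True) # don't keep the interpreter alive for it
    t.start() # production begins as soon as channel() is called
    def consume():
        while True:
            item = q.get()
            if item is _DONE:
                return
            yield item
    return consume()

Dropping that in over the "channel = iter" placeholder makes the sieve spawn one thread per prime found, which gets expensive quickly, and under CPython the GIL keeps the filters from running in true parallel anyway; a multiprocessing version would look much the same with multiprocessing.Queue, at the cost of pickling every value between processes.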


                    Graham
