Hi,

OK, I've revisited this, and come to the conclusion that this is possible. I 
hadn't been thinking quite as clearly as I could've done :-)

To cut to the chase, I've written some decorators that, when they wrap a 
generator, turn it into something we can use as a component.

The sticking point in the past when discussing this was that it created 
something like this:

@GeneratorComponent(Inboxes = ["inbox", "control"],
                    Outboxes = ["outbox", "signal"])
def Tracer(self):
    while 1:
        for msg in self.Inbox("inbox"):
            print self.tag, str(msg)
            self.send(msg, "outbox")
        self.pause()
        yield 1

The biggest, and best, objections to this were:

    "Michael, I can appreciate the goal of trying to make basic things
     simple.  Your suggestion however feels really ... magical to me.  I
     like decorators, really.  I use them a lot.  But I think decorators
     are best used to apply one flavor change to the decorated function /
     class without significantly changing the decorated item's signature."
    "To my brain, that is easier to process one line at a time than trying
     to grok the mind-bending:"
       -- Steve, 

And this:
    "Though as I've said, I am not really fond of his example either. IMO it
     is coming at it from the wrong direction. It seems like it is starting
     with the class-based component and just abstracting out some bits.
     Instead I think it would work better to come at it from the starting
     point of a standard python generator containing your logic that you want
     to fit into the framework."
       -- John,

Both from this thread:
http://groups.google.com/group/kamaelia/browse_thread/thread/e6adc81dff852429

This was partly prompted by this discussion on IRC:

http://www.kamaelia.org/logs/kamaelia2009-06-16_log.html
...
[16:59] < eikenberry> Does Kamaelia have decorators for wrapping generators.
[16:59] < eikenberry> ?
[16:59] < eikenberry> Making them into components?
[17:09] < eikenberry> cool. I agree. IMO the best system would be one that 
would look almost just like a system based on generators/coroutines, that you 
would just decorate into a more capable system. Something akin to beazley's 
coroutine talk... taking the basic blocks then hooking them into a scheduler 
and allowing for more easily setting up complex pipelines.
[17:09] < MS-> That's nice, until you realise how the control flow of 
beazley's coroutines work.
[17:10] < MS-> Specifically they seem to be limited to pipelines.
[17:10] *** eikenberry nods. They are more an example than a working system. I 
just really liked how they built on idiomatic python rather than a 
framework. 
[17:10] < MS-> The reason is because the last coroutine *pulls* rather than 
pushes values.
[17:11] < MS-> Mixing push and pull - which kamaelia does - seems to be 
somewhat harder to do that way
[17:12] *** eikenberry nods. 
[17:13] < MS-> I've just posted to the list a piece of code that's a syntax 
I'm toying with.
[17:14] < MS-> But more ideas as to what you'd like to type would be v useful

It took me a while to realise it, but this CAN be done. You do need to use 
python generators in a slightly different way from the way David Beazley does; 
ironically, it's slightly more like the way a unix program uses a filename.

If I do this:
    * tail -f /var/log/messages
    * grep hello /var/log/messages

Then tail & grep both open the files provided, and use the contents in a well 
defined way.

Conversely, the way David Beazley was using generators was not quite 
equivalent to this. The way he was using them was more like passing an open 
file handle.

Specifically, grep would look something like this:

def grep(lines, pattern):
    regex = re.compile(pattern)
    for l in lines:
        if regex.search(l):
            yield l
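
For instance (a throwaway sketch, not taken from Beazley's talk), chaining 
that version together is driven entirely by the consumer at the end pulling 
values through:

source = ["hello world\n", "game over\n", "hello again\n"]

# The for loop at the end pulls each value through the chain; this is
# why the pull style naturally fits straight pipelines and little else.
for line in grep(source, "hello"):
    print line,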

In that scenario, grep works given either a list or a generator as an 
argument. By changing it slightly, to this:

def grep(lines, pattern):
    """To stop this generator, you need to call it's .stop() method. The
      wrapper could do this"""
    regex = re.compile(pattern)
    while 1:
        for l in lines(): # foo - lines() is called afresh each pass
            if regex.search(l):
                yield l
        yield

We gain something which is only slightly more complex, but IS something we can 
decorate as a component. Specifically, we can pass in "self.Inbox" as the 
callable which is invoked, and read from, at the line marked "foo".
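
To make that concrete, here's a throwaway illustration of driving the modified 
grep by hand (fake_inbox is a made-up stand-in for this example, not part of 
any API):

def fake_inbox():
    # Stand-in for self.Inbox: yields whatever has "arrived" so far, then
    # finishes, at which point grep falls through to its bare yield.
    for item in ["hello world\n", "goodbye\n"]:
        yield item

g = grep(fake_inbox, "hello")
print repr(g.next())   # 'hello world\n' - a match, passed through
print repr(g.next())   # None - source drained, so the bare yield fires

In the wrapped component, the decorator hands in self.Inbox instead, so each 
pass round the while loop drains whatever has arrived on the inbox since the 
last pass.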

Not only that, this is actually closer to the way that grep works when you do 
this:
    * tail -f /var/log/messages |grep -

Specifically, grep will indeed read normally until its source is exhausted, 
and then wait until there's more work to do, or until it's told to stop. 
This particular issue was really my sticking point in writing the decorators, 
so without further ado, here's a simple decorator/generator based pipeline 
written using the new decorators.

They're currently highly experimental, and located here:
http://code.google.com/p/kamaelia/source/browse/trunk/Sketches/MPS/decorators.py

so I'm skipping those imports :-)

Anyway, code:

import sys
import time
import re
import Axon
from Kamaelia.Chassis.Pipeline import Pipeline
from Kamaelia.Util.DataSource import DataSource
from Kamaelia.Util.Console import ConsoleEchoer

from decorators import blockingProducer, TransformerGenComponent

@blockingProducer
def follow(fname):
    """To stop this generator, you need to call it's .stop() method.
    The wrapper could do this"""
    f = file(fname)
    f.seek(0,2) # go to the end
    while True:
        l = f.readline()
        if not l: # no data
            time.sleep(.1)
        else:
            yield l

@TransformerGenComponent
def grep(lines, pattern):
    """To stop this generator, you need to call it's .stop() method.
    The wrapper could do this"""
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield

@TransformerGenComponent
def printer(lines):
    """To stop this generator, you need to call it's .stop() method.
    The wrapper could do this"""
    while 1:
        for line in lines():
            sys.stdout.write(line)
            sys.stdout.flush()
        yield

Pipeline(
    follow("somefile.txt"),
    grep(None, "o"),
    printer(None)
).run()

In case it's not obvious, what I'm doing here is providing a way of passing in 
a callable that creates a generator reading from a component inbox. Values 
yielded by the wrapped generator are sent out of the component's "outbox".
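
For anyone wondering roughly what the wrapper does, here's a highly simplified 
sketch of the idea. It is NOT the code in decorators.py, just an illustration 
of how a generator taking a "lines" callable could be wired into an Axon 
component (shutdown handling and the .stop() plumbing are left out):

import Axon

def SimpleTransformerGen(gen_func):
    """Illustrative only: wraps a generator taking (lines, *args)."""
    class _Wrapped(Axon.Component.component):
        def __init__(self, source, *args):
            super(_Wrapped, self).__init__()
            self.source = source
            self.args = args
        def Inbox(self, boxname="inbox"):
            # A fresh generator draining whatever is waiting on the named inbox
            while self.dataReady(boxname):
                yield self.recv(boxname)
        def main(self):
            # No source generator given? Then read from our own inbox.
            lines = self.source if self.source is not None else self.Inbox
            for result in gen_func(lines, *self.args):
                if result is not None:
                    self.send(result, "outbox")  # yielded values go out "outbox"
                elif not self.dataReady("inbox"):
                    self.pause()                 # nothing to do until data arrives
                yield 1
    return _Wrapped

The real decorators will differ in detail, but the shape is the point: the 
author of the generator never touches inboxes or outboxes directly.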

Something slightly more subtle: why "None"? This is again directly akin to 
what happens with "grep" and "cat" on the command line.

If I write:
   grep "foo" somefile

It takes somefile, and greps it for foo, outputting to stdout.

If I write:
   bla | grep "foo"

It takes the output of bla and greps it for foo, outputting to stdout.

Likewise, if I write:
Pipeline(
    follow("somefile.txt"),
    grep(None, "o"),
    printer(None)
).run()

grep being called with "None" means it looks at the stream of data coming in 
on the inbox "inbox". If it is called with a generator function instead - e.g.:

def source():
    for i in ["hello", "world", "game", "over"]:
        yield i

Pipeline(
    grep(source, "l"),
    printer(None)
).run()

Then that will print out "hello" and "world".

I'd welcome feedback on this, as to what people think about it :-)

Initially, I'm thinking that the import line would need to be something more 
like this:

from Axon.Decorators import blockingProducer, TransformerGenComponent

However, better names for both decorators would be equally welcome.

Regards,


Michael.
-- 
http://yeoldeclue.com/blog
http://twitter.com/kamaelian
http://www.kamaelia.org/Home

