On Oct 2, 2009, at 12:30 PM, Dave Angel wrote:

(you responded off-list, which isn't the way these mailing lists work. So I'm pasting your message back to the list, with my response at the end)


Sorry about that - a slip of the "reply" button.


Actually, I was thinking of the subprocess module (introduced in 2.4). But the multiprocessing module would be useful if you were porting threading code to a process model.

Previously, I had been using threads, so I just tried moving to multiprocessing because it required few changes.

There are tons of ways to communicate between processes, though you can't do the simple variable sharing that threads can (sometimes) get away with. I would normally point you to queues, but there are a number of possibilities. And since the one process is running a GUI event loop, you might want to piggyback on the OS capability to post events between processes. The code might end up OS-dependent, but I'd bet the overhead will be minimal. What is your target operating system?

My target OS is Mac OS X Leopard. I've decided just to try to trigger wx events in the GUI based on data put in one of the queues by the serial-processing process.
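Roughly, the plan looks like this (just a sketch, not working code yet; I'm assuming a small background thread inside the GUI process that drains the queue and posts a custom wx event, and all the class and variable names are placeholders):

import multiprocessing
import threading
import wx
import wx.lib.newevent

# Custom event type carrying one payload sample (names are placeholders)
DataEvent, EVT_DATA = wx.lib.newevent.NewEvent()

def drain_queue(window, rx_queue):
    """Background thread in the GUI process: pull payloads off the
    queue and post wx events to the main window."""
    while True:
        payload = rx_queue.get()   # blocks until the serial process sends something
        if payload is None:        # sentinel value means shut down
            break
        # wx.PostEvent is safe to call from outside the GUI thread
        wx.PostEvent(window, DataEvent(payload=payload))

class PlotFrame(wx.Frame):
    def __init__(self, rx_queue):
        wx.Frame.__init__(self, None, title="Serial data")
        self.Bind(EVT_DATA, self.on_data)
        worker = threading.Thread(target=drain_queue, args=(self, rx_queue))
        worker.daemon = True
        worker.start()

    def on_data(self, event):
        # Update the plot here, at whatever frame rate makes sense
        print(event.payload)

if __name__ == "__main__":
    rx_queue = multiprocessing.Queue()
    app = wx.App(False)
    PlotFrame(rx_queue).Show()
    app.MainLoop()

In practice I'd probably batch the samples (say, one event per N payloads or per timer tick) so the GUI isn't hit with 2000 events a second.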


Your numbers in the original message make me nervous; sending an event between processes (or even threads) every 0.5 millisecond is impossible. But I think you might have been confusing bytes and packets.

My external hardware is actually sending 2000 packets per second right now (though that can also be changed). Each packet currently contains 6 bytes of data and 6 bytes of overhead, so 12 bytes per packet * 2000 packets per second is 24,000 bytes per second. However, the serial-processing process should just be extracting the packet payloads and putting them on a queue (ideally at the full 2 kHz); the GUI then pulls from that queue and plots at a much lower frame rate.
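The serial-reading side would be roughly this shape (again just a sketch; the fixed 12-byte framing with the payload in the last 6 bytes, and the port name and baud rate, are all placeholder assumptions, and real code would have to resynchronize on the packet header rather than assume the stream is already aligned):

import serial  # pyserial

PACKET_SIZE = 12    # 6 bytes of overhead + 6 bytes of payload (illustrative)
PAYLOAD_SIZE = 6

def read_packets(rx_queue, port="/dev/tty.usbserial", baud=460800):
    """Read fixed-size packets from the serial port and push just the
    payload bytes onto the queue for the GUI process to plot."""
    ser = serial.Serial(port, baud, timeout=1)
    try:
        while True:
            packet = ser.read(PACKET_SIZE)
            if len(packet) < PACKET_SIZE:
                continue    # read timed out mid-packet; try again
            payload = packet[-PAYLOAD_SIZE:]   # assume payload is the last 6 bytes
            rx_queue.put(payload)
    finally:
        ser.close()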


There could very well be multiprocess support in wxPython. I'd check there first, before re-inventing the wheel. Presumably you know of the wxPython news group, hosted on Google groups?

I actually tried using multiprocessing to create a separate process for reading the serial data. The process gets passed a TX queue and an RX queue in its constructor; it's then supposed to enter an infinite loop where it waits for commands on the TX queue, reads data, and puts it in the RX queue. However, I've found that, for some reason, the infinite loop terminates after the first call I make to either of the queues. Perhaps this behavior should be expected, but this is my first go-round with processes, so the result surprised me a little. Maybe someone with more multiprocessing experience can give me some pointers.
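For reference, the structure is roughly this (a stripped-down sketch rather than my actual code; the names are placeholders, and the try/except is only there on the theory that an unhandled exception inside the child process would look exactly like the loop "terminating"):

import multiprocessing
import time
import traceback

class SerialWorker(multiprocessing.Process):
    def __init__(self, tx_queue, rx_queue):
        multiprocessing.Process.__init__(self)
        self.tx_queue = tx_queue   # commands from the GUI to this process
        self.rx_queue = rx_queue   # data from this process back to the GUI

    def run(self):
        try:
            while True:
                # Poll for a command without blocking the read loop
                if not self.tx_queue.empty():
                    if self.tx_queue.get() == "stop":
                        break
                data = self.read_serial()
                self.rx_queue.put(data)
        except Exception:
            # Without this, the child just dies and the loop appears to stop
            traceback.print_exc()

    def read_serial(self):
        # Stand-in for the real pyserial read; returns a dummy 6-byte payload
        time.sleep(0.0005)   # roughly 2 kHz, just to mimic the data rate
        return b"\x00" * 6

if __name__ == "__main__":
    tx_queue = multiprocessing.Queue()
    rx_queue = multiprocessing.Queue()
    worker = SerialWorker(tx_queue, rx_queue)
    worker.start()
    print(rx_queue.get())   # the GUI side would do this from its drain thread
    tx_queue.put("stop")
    worker.join()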

Maybe I am barking up the wrong tree with Python from a speed perspective, but it's just so much faster and more fun to write than anything else...

Aaron

--
http://mail.python.org/mailman/listinfo/python-list
