Speed is not a huge concern, although I must complete processing in less than 90 seconds. The longer the delay, however, the greater the number of processes that must run in parallel to keep throughput up. It's the usual trade-off we have all come to know and love.
The dictionary does not need to persist beyond the life of the parent process, although I have another project coming up in which persistence would be a good idea.
At this point, I know there will be some kind souls suggesting various SQL solutions. While I appreciate the idea, unfortunately I do not have time to puzzle out yet another component. Someday I will figure it out, because I really like what I see with SQLite, but unfortunately today is not that day (unless those kind souls will give me their work, home, and cell phone numbers so I can call when I am stuck ;-)
So the solutions that come to mind are either some form of dictionary in shared memory with a locking semaphore scoreboard, or a multithreaded process containing a single database (Python native dictionary, Metakit, gdbm?) that all of my processes talk to via XML-RPC. The latter leaves me with the question of how to make a multithreaded server using the stock xmlrpc libraries.
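For what it's worth, the stock library can be made multithreaded by mixing `ThreadingMixIn` into `SimpleXMLRPCServer`. Here is a minimal sketch of that second approach: a lock-guarded native dictionary served over XML-RPC. The class names, port, and put/get method names are my own illustrative choices, not anything from the standard library (module paths shown are the modern Python 3 ones):

```python
import threading
from socketserver import ThreadingMixIn
from xmlrpc.server import SimpleXMLRPCServer


class ThreadedXMLRPCServer(ThreadingMixIn, SimpleXMLRPCServer):
    """Stock SimpleXMLRPCServer made multithreaded via ThreadingMixIn:
    each incoming request is handled in its own thread."""
    daemon_threads = True  # don't block interpreter exit on live requests


class SharedDict:
    """A plain Python dict guarded by a lock so concurrent requests
    from the worker processes stay consistent."""

    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def put(self, key, value):
        # Exposed as an XML-RPC method by register_instance below.
        with self._lock:
            self._data[key] = value
        return True

    def get(self, key):
        with self._lock:
            return self._data.get(key, "")


if __name__ == "__main__":
    # Port 8000 is arbitrary; pick whatever suits your setup.
    server = ThreadedXMLRPCServer(("localhost", 8000), allow_none=True)
    server.register_instance(SharedDict())
    server.serve_forever()
```

The worker processes would then each create an `xmlrpc.client.ServerProxy` pointing at the server and call `put`/`get` on it. Since requests run in separate threads, the lock around the dict is what keeps updates safe.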
So feedback and pointers to information would be most welcome. I'm still exploring the idea, so I am open to any and all suggestions (except maybe SQL :-)
---eric
-- http://mail.python.org/mailman/listinfo/python-list