Dear Developers,

I would really like to use a Python multiprocessing queue in web2py,
and I'm asking for advice. I searched the forum but could not find an
answer.

I have a worker process (a CAN bus message processing daemon) that is
actually a web2py application sharing a database table with the main
application.

Unfortunately I'm on an embedded ARM board (400 MHz, with a slow uSD
card), and going through the database to share messages is really
killing my app.

Using multiprocessing queues would solve my problem, but I have
several questions:

* I have to keep a queue handle in the main process, but web2py
doesn't support writing a singleton shared by all controllers. Is a
cache.ram entry that never expires the only way to keep that handle in
memory for the application's lifetime? (See the sketch after this list.)

* The worker process should be spawned by web2py, but again, I should
keep a handle in cache.ram and check on each request whether the
process was already spawned, right?

* This is not the most elegant solution; do you know of a better one?
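
To make the question concrete, here is a minimal sketch of what I have
in mind, assuming a model file such as models/0_canbus.py and a worker
function _canbus_worker (the file and function names are just
placeholders). Note that cache.ram only lives inside one web2py
process, so with multiple server processes each one would get its own
queue and its own worker:

    import multiprocessing

    def _canbus_worker(q):
        # Hypothetical daemon loop: consume CAN bus messages from the queue.
        while True:
            msg = q.get()        # blocks until a message is available
            if msg is None:      # sentinel to stop the worker
                break
            # ... process the message here ...

    def _start_canbus():
        # Create the queue and spawn the worker exactly once.
        q = multiprocessing.Queue()
        p = multiprocessing.Process(target=_canbus_worker, args=(q,))
        p.daemon = True
        p.start()
        return {'queue': q, 'process': p}

    # time_expire=None means "never expire": _start_canbus runs only on
    # the first request handled by this process, and later requests get
    # the cached dict back.
    canbus = cache.ram('canbus_handler', _start_canbus, time_expire=None)

    # In a controller I would then do something like:
    #     canbus['queue'].put(some_message)
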

Thank you for your advice!

-- 
Profile: http://it.linkedin.com/in/compagnucciangelo
