Thanks Ross, that sounds like a good approach. I do something
analogous in my app. All my scripts live in the app/modules/
directory. My app is a remote monitoring system running on a Foxconn
NetTop under Linux Mint 10. There is a web2py HTTP interface used
for installation and configuration, but it's not used in day-to-day
operation.
The long-running scripts are children of a master script similar to
the one I appended to my post, except that it has additional code to
fork the various child processes, which conveniently inherit the
web2py environment, including the model. The child processes write
incoming data to the SQLite db, read from it to compute averaged data
that gets sent to a server in the cloud, monitor for anomalous
conditions and send email alerts, etc. The master script does little
more than launch the children, monitor them, and relaunch any that
crash.
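For anyone following along, that launch/monitor/relaunch loop can be
sketched with the stdlib multiprocessing module. The worker functions
here are placeholders, not my actual code:

```python
# Sketch of a master process that launches worker children, polls them,
# and relaunches any that die. Worker bodies are placeholders.
import time
import multiprocessing as mp

def log_incoming():
    # placeholder: would write incoming data to the db
    while True:
        time.sleep(1)

def upload_averages():
    # placeholder: would compute averages and push them to the cloud server
    while True:
        time.sleep(1)

WORKERS = [log_incoming, upload_averages]

def spawn(target):
    """Start one child process running `target` and return its handle."""
    p = mp.Process(target=target, name=target.__name__)
    p.daemon = True  # children die with the master
    p.start()
    return p

def master(poll_interval=5.0):
    """Launch all workers, then poll and relaunch any that have crashed."""
    children = {fn: spawn(fn) for fn in WORKERS}
    while True:
        time.sleep(poll_interval)
        for fn, proc in children.items():
            if not proc.is_alive():
                children[fn] = spawn(fn)
```

Calling master() from the script's entry point (under an
`if __name__ == "__main__":` guard) keeps it running until killed.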
What arguments do you use in your calls to DAL() to read in your
model? I just pass 'storage.sqlite', and DAL() knows which app's
databases directory to use since the app name is passed in the
web2py -S invocation. Do you have to do anything special when
importing the DAL to make it work without the web2py shell?
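In case it's useful, here is roughly what I imagine the standalone
pattern looks like. This is a sketch, not tested code; 'myapp' and the
folder path are placeholders for your own application:

```python
# Sketch of reading the model from a plain script, outside "web2py -S".
# 'myapp' is a placeholder -- substitute your own application name.
import os

# Outside the shell, DAL() can't infer the app, so point it at the
# applications/<app>/databases folder explicitly.
db_folder = os.path.join("applications", "myapp", "databases")

try:
    from gluon.dal import DAL  # needs the web2py root on sys.path
    db = DAL("sqlite://storage.sqlite", folder=db_folder, auto_import=True)
    # auto_import=True rebuilds the table definitions from the *.table
    # migration files, so the script doesn't repeat define_table() calls.
except ImportError:
    db = None  # web2py not importable; add its directory to PYTHONPATH
```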
Cheers,
Mike

On Jan 2, 1:32 pm, Ross Peoples <ross.peop...@gmail.com> wrote:
> Mike,
>
> I haven't done any profiling on it, but I've had a lot of success using the
> multiprocessing library, importing the DAL, and using it directly without
> using the rest of the web2py environment. I usually accomplish this by
> creating a module (not model) and spinning off the other processes from the
> module, as the module will stay running after the initial request finishes.
>
> Ross
