>  If you need to, you could build a persistent daemon that occasionally
>  checked for new data to import (say, once every 30 seconds) and would
>  spawn a separate thread for each bit of data to be imported.

I think this is what I'll do.  Since I can't quickly check whether
there is new data to import, I'll have the daemon pickle the data into
a temporary file (and, when done, atomically replace the existing
file).  My view can then unpickle from that file.  (I'll probably have
it run every 10 minutes or so, unless it gets a signal to start ASAP.)
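Roughly what I have in mind for the write side (the path is just a
placeholder; writing to a temp file in the same directory and then
renaming means a reader can never see a half-written pickle):

```python
import os
import pickle
import tempfile

# Placeholder location; anything both the daemon and Django can read works.
DATA_PATH = "/tmp/report_data.pkl"

def write_data(data):
    # Write to a temp file in the same directory, then atomically swap it
    # in place of the old file, so readers always get a complete pickle.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(DATA_PATH))
    try:
        with os.fdopen(fd, "wb") as f:
            pickle.dump(data, f)
        os.replace(tmp_path, DATA_PATH)  # atomic on POSIX
    except Exception:
        os.remove(tmp_path)  # don't leave a stray temp file behind
        raise

def read_data():
    # What the view does: just unpickle the most recent snapshot.
    with open(DATA_PATH, "rb") as f:
        return pickle.load(f)
```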

The daemon has to read from the Django project's database; is it
possible (and safe) to have it use Django's ORM to do so?
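From what I've read, the trick is to point the standalone script at the
project's settings before importing any models — a sketch, where the
settings module, app, and model names are all placeholders for whatever
the project actually uses:

```python
# daemon.py -- a standalone script that borrows the Django project's ORM.
import os

# "myproject.settings" is a placeholder for the real settings module.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

import django
django.setup()  # must run before any models are imported

from myapp.models import Report  # hypothetical app/model

def gather():
    # Plain ORM reads work as in a view; each process gets its own
    # database connection, so reading alongside the web processes is fine.
    return list(Report.objects.values())
```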

> ...or, the easy way out, just provide a page that provides a status,
> where your user can refresh as they like.

I'll have the daemon include a datetime value in the pickled data
which can be displayed on the page, and add a refresh button that will
trigger the signal I described above.
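Concretely, something like this for the daemon's loop (SIGUSR1 and the
10-minute interval are just my working assumptions; `gather` and
`write` stand in for the slow import step and the pickle step):

```python
import datetime
import signal
import threading

# The daemon sleeps between runs but can be woken early by SIGUSR1
# (the refresh button in the view would send this signal).
wake = threading.Event()
signal.signal(signal.SIGUSR1, lambda signum, frame: wake.set())

def build_payload(data):
    # Attach a timestamp so the page can show when the data was generated.
    return {"generated_at": datetime.datetime.now(), "data": data}

def run_forever(gather, write, interval=600):
    # gather() is the slow import; write() saves the pickled payload.
    while True:
        write(build_payload(gather()))
        wake.wait(timeout=interval)  # ~10 minutes, or until signalled
        wake.clear()
```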

Thanks Christopher.

--
Daryl


On Tue, Apr 15, 2008 at 10:17 AM, Christopher Allan Webber
<[EMAIL PROTECTED]> wrote:
>
>  We developed something similar here.  The import
>  process spawned a daemon which imported the data.  However, in our
>  system only one import could be done at a time (that was actually part
>  of the spec, not bad programming!), so spawning one daemon per import
>  wasn't such a bad idea.
>
>  If you need to, you could build a persistent daemon that occasionally
>  checked for new data to import (say, once every 30 seconds) and would
>  spawn a separate thread for each bit of data to be imported.
>
>  Of course, you'll have to develop a good workflow so that either your
>  user is notified on that page when the import is completed (possibly
>  through some ajax-y interface either by polling or using a Comet
>  system) or, the easy way out, just provide a page that provides a
>  status, where your user can refresh as they like.  All depends on your
>  situation.
>
>  Hope that helps,
>  Christopher Allan Webber
>
>
>
>  "Daryl Spitzer" <[EMAIL PROTECTED]> writes:
>
>  > I have a view that takes approximately 5 minutes (plus or minus 4 or
>  > so) to put together data (culled from other sources, such as our SCM
>  > and a web service that wraps our bug database) to be passed on to a
>  > template. Some form of caching may (mostly) solve this problem, but
>  > I'm worried about two things:
>  >
>  > 1) The unlucky user who is the first to go to this page after the
>  > cached data has expired will have a long wait.
>  >
>  > 2) What happens when other users go to other pages, or this page,
>  > while the view is gathering data?
>  >
>  > I don't have any experience deploying Django projects (yet) nor do I
>  > have any with Apache.  But from what I've read so far, I'm under the
>  > impression that Apache can be configured to run multiple processes,
>  > each with its own invocation of Django...right?  (These processes
>  > will, of course, be unaware of each other unless I take steps to
>  > synchronize them.)
>  >
>  > I think the proper solution is to have a separate process periodically
>  > put together the data that my view does now, and serialize (pickle) it
>  > or put it in the database. Then my view will just read and display the
>  > most-recently generated data.
>  >
>  > Do you agree?  Do you have any advice?  Should I run this separate
>  > process from within Django (somehow), or as a cronjob?
>  >
>  > --
>  > Daryl Spitzer
>  >
>  >

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---