A method similar to what Anssi describes is what we use for local
caching of data that is too large to comfortably fetch from the cache or
parse from file every time we need it. Doing a lazy fetch with a
maximum age like that also helps with load times, since the data is only
fetched when accessed. Of course, the fetch needs to be reasonably
possible within the time frame of a single request in order to not
frustrate users.

# Put this class somewhere and import it into your settings.py
import datetime
import urllib2

class LazyFetch(object):
    def __init__(self, url, max_age=60 * 60):
        self.url = url
        self.reload_delta = datetime.timedelta(seconds=max_age)

    def _get_content(self):
        # Refetch if we've never fetched, or the cached copy is too old.
        last_updated = getattr(self, '_last_updated', None)
        if not last_updated or datetime.datetime.now() - last_updated > self.reload_delta:
            self._content = urllib2.urlopen(self.url).read()
            self._last_updated = datetime.datetime.now()
        return self._content
    content = property(_get_content)

somepage=LazyFetch("http://www.example.com",30)

Then, elsewhere in your code, you can import settings and access the
page content:

current_content=settings.somepage.content

Accessing the property "content" calls _get_content(), which checks
whether the content hasn't been loaded yet, or has expired, and reloads
it on demand. This is just a barebones example though; you'd want to add
some exception handling around the urlopen call at the very least.
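As a minimal sketch of what that exception handling might look like: the variant below (SafeLazyFetch is a hypothetical name, not part of the original class) keeps serving the last good copy when a refresh fails, and only raises if there is nothing cached at all. The fetch callable is injected here purely so the sketch is self-contained and testable without a network; in the real class it would just be urllib2.urlopen(self.url).read().

```python
import datetime

class SafeLazyFetch(object):
    """Like LazyFetch, but a failed refresh falls back to stale content.

    'fetch' is a zero-argument callable standing in for the urlopen
    call, so this sketch needs no network access.
    """
    def __init__(self, fetch, max_age=60 * 60):
        self.fetch = fetch
        self.reload_delta = datetime.timedelta(seconds=max_age)
        self._content = None
        self._last_updated = None

    @property
    def content(self):
        now = datetime.datetime.now()
        expired = (self._last_updated is None
                   or now - self._last_updated > self.reload_delta)
        if expired:
            try:
                self._content = self.fetch()
                self._last_updated = now
            except Exception:
                # Refresh failed: keep serving the stale copy if we
                # have one; otherwise let the caller see the error.
                if self._content is None:
                    raise
        return self._content
```

Whether to swallow the error and serve stale data or to propagate it depends on how critical freshness is for your app; for settings-level data, stale-but-present is usually the friendlier choice.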

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com.