On Wednesday, August 25, 2010 11:02:19 AM UTC-4, Michael Bayer wrote:
>
>
> On Aug 25, 2010, at 10:47 AM, Michael Bayer wrote:
>
> > 
> > On Aug 25, 2010, at 9:37 AM, Nikolaj wrote:
> > 
> >> Hello,
> >> 
> >> I'm struggling to use the Beaker caching example in my project.
> >> Accessing any attribute on an instance from cache triggers a load from
> >> database of the instance - even trying to read the primary key!
> > 
> > that means the instances that you're putting in the cache are expired, 
> or at least those attributes you are attempting to read.    The beaker 
> example takes the objects from a result and sends them straight to the 
> cache, but does not detach them from the current session.  Therefore, if 
> your backend is the "memory" backend, the objects in the cache are the same 
> as those in your current session and they'll get expired when the session 
> is committed or rolled back.   
> > 
> > There's not a really spectacular way to get around that except to not 
> use the "memory" backend, use one that pickles like memcached (well, pretty 
> much memcached is the only backend I'd ever use, but the file/dbm file 
> backends would work here too).
>
> well, this probably would work, this diff is against tip:
>
> diff -r 568ef214c1ac examples/beaker_caching/caching_query.py
> --- a/examples/beaker_caching/caching_query.py        Mon Aug 23 18:17:31 2010 -0400
> +++ b/examples/beaker_caching/caching_query.py        Wed Aug 25 10:55:53 2010 -0400
> @@ -63,8 +63,15 @@
>             if particular attributes have been configured.
>             
>          """
> +        
> +        def createfunc():
> +            items = list(Query.__iter__(self))
> +            for item in items:
> +                self.session.expunge(item)
> +            return items
> +            
>          if hasattr(self, '_cache_parameters'):
> -            return self.get_value(createfunc=lambda: list(Query.__iter__(self)))
> +            return self.get_value(createfunc=createfunc)
>          else:
>              return Query.__iter__(self)
>  
>
> I've added a comment in the example regarding this.  
>

I want to cache mapped objects from rows that will not change during the 
process lifetime of a web application, so I used the memory backend. Based 
on the comment in the example from SQLAlchemy 0.7.9, I have also 
implemented this approach of caching detached objects. It mostly works 
fine. However, sometimes objects lazily loaded from a collection 
relationship and cached via RelationshipCache are still detached by the 
time the application code gets them. It looks like 
CachingQuery.get_value() should ensure that any objects loaded from the 
cache are merged, but somehow that's not happening.
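
For reference, this is roughly how I read get_value() in the 0.7.9 example 
(paraphrased from memory rather than copied verbatim, so details may be 
off):

    def get_value(self, merge=True, createfunc=None):
        cache, cache_key = _get_cache_parameters(self)
        ret = cache.get_value(cache_key, createfunc=createfunc)
        if merge:
            # objects coming out of the cache detached should get
            # re-attached to self.session here
            ret = self.merge_result(ret, load=False)
        return ret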

The web application is a single-threaded, long-lived process which 
maintains one Session object for its entire lifetime, committing or 
rolling back at the end of each HTTP request. I'm only seeing the detached 
objects on the first request to a given process, so it seems that objects 
loaded from the database for insertion into the cache are staying 
detached, while those fetched from the cache on subsequent requests are 
properly merged. Do you have any idea why this would be, or where I should 
investigate?
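
To confirm what I'm seeing, I've been checking attachment state in the 
request handler along these lines (object_session is from sqlalchemy.orm; 
user.addresses is just an illustrative relationship name from my own code, 
not from the example):

    from sqlalchemy.orm import object_session

    for obj in user.addresses:
        # on the first request in a process (cache miss) object_session()
        # returns None here, i.e. the object is detached; on later
        # requests (cache hit) it returns the request Session as expected
        print obj.id, object_session(obj)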
