I've been developing with Rails for a few years now, but one part I haven't 
used is view fragment caching.

I'm using it now, and I'm seeing what I think is undesirable behavior, but 
I need someone with more experience to tell me whether it's supposed to 
work this way.

I'm caching a collection. Rails runs the query that generates the cache 
key (count and max updated_at) as expected, but then, regardless of 
whether there is a cache hit, it still runs the main query, and if there 
are any includes, it runs the eager-load queries too.  If the view 
fragment is being served from the cache, those extra queries are just 
wasted time.  Has it always been like this?
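
Here's a stripped-down version of what I'm doing (the model and 
association names are made up, but the structure is the same):

    # controller
    @widgets = Widget.includes(:parts).order(:name)

    # view
    <% cache @widgets do %>
      <%= render @widgets %>
    <% end %>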

I can work around it by using "<% cache @collection.cache_key do %>" 
instead of "<% cache @collection do %>", but it seems like the main query 
shouldn't be run at all when the fragment is already in the cache.
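
That is, the workaround looks like this (same made-up names as above):

    <% cache @widgets.cache_key do %>
      <%= render @widgets %>
    <% end %>

With the string key, the count/max-updated_at query still runs, but in my 
testing the main query and the eager loads only run on a cache miss.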
