On May 11, 2007, at 1:48 AM, Dag-Erling Smørgrav wrote:

> Per Andreas Buer <[EMAIL PROTECTED]> writes:
>> Dag-Erling Smørgrav wrote:
>>> Within reason - there is no point in using a cache file that is many
>>> times the size of your data set. I'm not sure how the allocator works
>>> in detail, but somewhere between 1x and 2x the size of your data set
>>> should be sufficient.
>> Is an "out of memory" situation handled properly now?
>
> No, but it won't occur if your cache is larger than your data set (and
> I mean "data set", not "working set" - thanks to the likes of Google,
> most of your data set will be in your working set at some point or
> other).

Yeah, it doesn't really apply to images (which are all we are caching right now), but once we start looking at caching actual HTML pages, robots will kill the cache (going on the assumption that we don't have space to cache everything). I think keeping track of the number of hits within a given time period and only caching "hot" content should resolve this issue.
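
Roughly the idea, as a throwaway sketch rather than anything in Varnish today (the HotTracker name, the 300-second window and the 5-hit threshold are all made up for illustration): count hits per URL over a sliding window and only admit an object to the cache once it has been requested often enough, so a robot that touches every URL exactly once never displaces genuinely popular content.

    import time
    from collections import defaultdict, deque

    class HotTracker:
        """Sliding-window hit counter for cache admission (illustrative only)."""

        def __init__(self, window=300.0, threshold=5):
            self.window = window              # seconds of history to keep
            self.threshold = threshold        # hits needed before we cache
            self.hits = defaultdict(deque)    # url -> timestamps of recent hits

        def record_hit(self, url, now=None):
            """Record one request; return True once the object counts as 'hot'."""
            now = time.time() if now is None else now
            times = self.hits[url]
            times.append(now)
            # Expire hits that have fallen outside the window.
            while times and now - times[0] > self.window:
                times.popleft()
            return len(times) >= self.threshold

    tracker = HotTracker(window=300.0, threshold=5)

    # A popular page crosses the threshold quickly ...
    for t in range(6):
        popular_is_hot = tracker.record_hit("/frontpage.html", now=float(t))
    print(popular_is_hot)                                           # True

    # ... while a page a robot crawls exactly once never does.
    print(tracker.record_hit("/some/deep/archive/page", now=10.0))  # False

The interesting part will be picking the window and threshold so genuinely hot pages qualify quickly without a single burst from one crawler counting as "hot", but that's tuning we can sort out later.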

Barry


