hi,

a site of mine serves about 3,000 pages daily to its users, and about
10,000 to crawlers. users mostly surf the active pages, which have been
updated recently, but crawlers go deeper into hundreds of thousands of
pages that have not been modified and probably won't be for months.

so i figured i need a large cache that holds many pages (or the
expensive fragments of pages) for a long time and only drops them when
they are updated. adding new content usually means removing multiple
pages from the cache, because the related pages that reference it also
have to reflect the new content.
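
to make the idea concrete, the invalidation is roughly this sketch (the
Entry model, the "related" relation and the key scheme are only
illustrative, not my real code):

from django.core.cache import cache
from django.db.models.signals import post_save

from myapp.models import Entry  # hypothetical model

def invalidate_entry_pages(sender, instance, **kwargs):
    # drop the cached page itself plus the pages that list it,
    # since their "related items" sections are now stale
    keys = ['entry:%s' % instance.pk]
    keys += ['entry:%s' % pk
             for pk in instance.related.values_list('pk', flat=True)]
    cache.delete_many(keys)

post_save.connect(invalidate_entry_pages, sender=Entry)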

as i am short on memory (a 512 MB slice) and want to reduce db load, i
ruled out memcached and the db cache backend, and set up file-system
caching with max_entries = 30000 and cull_frequency = 10. it quickly
filled up about 300 MB of disk, which was expected.
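
for reference, a sketch of the config in the current dict-style CACHES
setting (the cache directory and timeout are just example values):

# settings.py
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',  # example path
        'TIMEOUT': 60 * 60 * 24 * 30,  # long-lived, pages rarely change
        'OPTIONS': {
            'MAX_ENTRIES': 30000,
            # when MAX_ENTRIES is hit, 1/CULL_FREQUENCY of the entries
            # are culled, so 10 drops a tenth of them at once
            'CULL_FREQUENCY': 10,
        },
    }
}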

the results were terrible. i ran into many issues, mainly from running
out of memory, and in the end the whole system ground to a halt. using
no caching at all performs much better than this setup.


Any suggestions for optimizing the caching are greatly appreciated.


--
omat
