>> AFAIK, the bottleneck tends to be the database, so a cache
>> solution like memcached should do the trick. Think of a
>> 256 MB cache with a 10-second timeout, maybe.
> 
> Yes, the database is generally the main bottleneck these days.
> memcached is definitely on my radar.
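For what it's worth, wiring memcached into Django is roughly
this much configuration.  This is only a sketch--the backend
class name varies by Django version, and the 256 MB / 10
second numbers are just the values suggested above:

    # settings.py -- minimal sketch; PyMemcacheCache is the
    # memcached backend on recent Django releases, older
    # versions ship a different backend class.
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
            'LOCATION': '127.0.0.1:11211',
            'TIMEOUT': 10,  # seconds a cached entry lives
        }
    }

    # memcached itself is sized when the daemon starts, e.g.:
    #   memcached -m 256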

It would also help to know more about your usage 
patterns--particularly the read-to-write ratio and how 
current the data needs to be.

If most of your hits are reads and slightly stale data is 
okay (pages only need refreshing every N minutes), you're a 
prime candidate for the caching solutions Django supports, 
or even a front-end cache that keeps the request from ever 
reaching Django.
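A rough sketch of that option (the view name and timings are 
made up): cache_page keeps a rendered copy in Django's own 
cache, and cache_control lets an upstream cache such as 
Squid, Varnish, or a CDN answer without Django seeing the 
request at all.

    from django.http import HttpResponse
    from django.views.decorators.cache import cache_control, cache_page

    @cache_control(max_age=300)  # front-end caches may reuse for 5 min
    @cache_page(60 * 5)          # Django's cache serves the rendered page
    def article_list(request):
        # hypothetical read-mostly view; a few minutes of
        # staleness is acceptable here
        return HttpResponse("...expensive, read-mostly page...")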

If most of your hits are reads but they need the most 
up-to-date info, you can use a replicated DB cluster where 
writes are synchronized in lockstep across the nodes and a 
load-balancer farms the reads out across them.  Or you can 
make your write path (perhaps an overridden save() method) 
call your cache controller to invalidate the affected pages.
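The invalidate-on-write idea looks roughly like this.  The 
model and the cache keys are invented for illustration; a 
real setup might purge a front-end cache instead of, or in 
addition to, Django's cache:

    from django.core.cache import cache
    from django.db import models

    class Article(models.Model):
        title = models.CharField(max_length=200)
        body = models.TextField()

        def save(self, *args, **kwargs):
            super().save(*args, **kwargs)
            # drop the cached pages/fragments this row feeds
            cache.delete('article:%d' % self.pk)
            cache.delete('article_list')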

If you're write-heavy, best wishes! ;-)  Data partitioning may 
help, as might a multi-master DB configuration.
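If it does come to partitioning, Django's database routers 
are one way to express it.  A very rough sketch--the shard 
aliases and the parity rule are invented, and real schemes 
usually hash on something like a user id:

    # 'shard_a' and 'shard_b' would be defined in DATABASES,
    # and this class listed in DATABASE_ROUTERS.
    class ShardRouter:
        def _shard_for(self, hints):
            obj = hints.get('instance')
            # hypothetical rule: split rows on user_id parity
            if obj is not None and getattr(obj, 'user_id', 0) % 2:
                return 'shard_b'
            return 'shard_a'

        def db_for_read(self, model, **hints):
            return self._shard_for(hints)

        def db_for_write(self, model, **hints):
            return self._shard_for(hints)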

-tim
