Another thing to keep an eye out for is referencing one model from
another - when loading the choices for a field that has > 1000
possible related objects, the system will take a while to pull all of
those into place. Setting "raw_id_admin" on the ForeignKey in the
model will help alleviate this, but you lose some functionality as a
consequence.
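
For reference, it looks roughly like this on the field (the model and
field names are just placeholders; in newer Django versions the
equivalent knob is raw_id_fields on the admin class instead):

    from django.db import models
    from django.contrib.auth.models import User

    class Ticket(models.Model):
        # Show a plain ID input box in the admin instead of building a
        # <select> with thousands of <option>s.
        owner = models.ForeignKey(User, raw_id_admin=True)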

Fundamentally, look at your queries to see what's going on and what's
taking the time. Maybe just a few indexes are needed, or it might help
you identify a coding mistake.

I frequently use the middleware snippet at
http://www.djangosnippets.org/snippets/264/ to assist here (I had a
problem with our sitemap setup over the weekend that I diagnosed with
it). I didn't create that middleware originally - I found it on this
list, so sorry I can't tell you who wrote it.
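
If that link ever goes away, the general idea is easy to recreate -
something along these lines (an untested sketch, not the actual
snippet):

    from django.db import connection

    class SQLLogMiddleware:
        # Dump the SQL run for each request so the slow spots stand out.
        # Only useful while DEBUG = True, since that's when Django
        # records queries on connection.queries.
        def process_response(self, request, response):
            for q in connection.queries:
                print q['time'], q['sql']
            return response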

-joe

On 6/4/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> On Jun 4, 3:48 pm, "and_ltsk" <[EMAIL PROTECTED]> wrote:
> > 1. 100MB per request - is it normal?
>
> Not likely.  I have Apache processes serving a couple of different
> sites within separate Python interpreters and they are all somewhere
> in the 10-30MB memory range.
>
> Common gotcha: make sure you have DEBUG set to False, otherwise Django
> stores all executed SQL queries in memory.
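>
> A quick way to see that effect from an interactive shell (while DEBUG
> is still True):
>
>     >>> from django.db import connection
>     >>> len(connection.queries)   # keeps growing with every query run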
>
> > 2. I can't use the cache because one user receives a cached response
> > meant for another user. Do you know of any solution for caching
> > authenticated pages?
>
> You can set up caching per-session; look for "vary_on_headers" and
> "Cookie" in the cache documentation.  But that greatly increases the
> cache size, of course.
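>
> Roughly like this on a view (the view itself is made up; the two
> decorators are the ones from the cache docs):
>
>     from django.views.decorators.cache import cache_page
>     from django.views.decorators.vary import vary_on_headers
>
>     @cache_page(60 * 5)          # cache the page for five minutes
>     @vary_on_headers('Cookie')   # separate cache entry per cookie/session
>     def dashboard(request):
>         ...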
>
> I prefer to cache template fragments, see
> http://code.djangoproject.com/ticket/1065
> -- this way the parts that don't depend on the logged-in user are
> cached just once for the whole site, and I can still cache other
> fragments per-session.
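>
> With the {% cache %} tag from that ticket applied, template usage
> looks roughly like this (the fragment names and the 300-second
> timeout are just examples):
>
>     {% load cache %}
>
>     {% cache 300 sidebar %}
>         ... expensive markup that is the same for everyone ...
>     {% endcache %}
>
>     {% cache 300 greeting request.session.session_key %}
>         ... fragment cached separately per session ...
>     {% endcache %}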
>
> > 3. As any table grows, the responses become slower, up to 2-10 minutes.
> > How do I avoid the big dicts, or is there another solution?
>
> Make sure you have indexes in the database.  Also, check your
> templates and code for "if big_queryset" instead of "if
> big_queryset.count" -- the former pulls all the data from the
> queryset, the latter does not.
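>
> In code, the difference is roughly this (the Article model is made up):
>
>     # Pulls every matching row into memory just to test for emptiness:
>     if Article.objects.filter(published=True):
>         ...
>
>     # Issues a SELECT COUNT(*) instead and keeps memory flat:
>     if Article.objects.filter(published=True).count():
>         ...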
>
> > Thanks in advance for any performance tips.
>
> Point 3 looks like a hint that there is something wrong with the way
> you pull data from the database. Look into the queries; perhaps there
> are a large number of them, or some of them fetch too much data.
>
> -mk
>
