On 8/27/07, Tomas Kopecek <[EMAIL PROTECTED]> wrote:
> I thought that QuerySets that are iterated do not allocate memory
> for all objects.

QuerySet.iterator does what you want.

QuerySet.__iter__ (the Python method called by the for loop)
returns an iterator over the results of QuerySet._get_data, which does
store a results cache.

The design fits the expectation that you'll more often be iterating a
small result set for the same queryset, and that shouldn't hit the
database twice, so the results are stored.  But that obviously isn't
what you want in this case.
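The difference between the two paths can be sketched in plain Python.
This is a toy model, not Django's actual code; the class and names are
hypothetical:

```python
class CachingQuery:
    """Toy model of QuerySet's two iteration paths (hypothetical,
    not Django's actual implementation)."""
    def __init__(self, fetch):
        self._fetch = fetch          # callable standing in for a DB query
        self._result_cache = None

    def __iter__(self):
        # Like QuerySet.__iter__: fill a results cache once, then
        # reuse it, so iterating twice hits the database only once.
        if self._result_cache is None:
            self._result_cache = list(self._fetch())
        return iter(self._result_cache)

    def iterator(self):
        # Like QuerySet.iterator: stream results, no cache kept.
        return self._fetch()

queries_run = []
def fetch():
    queries_run.append(1)            # count round trips to the "database"
    yield from range(3)

qs = CachingQuery(fetch)
list(qs); list(qs)                   # cached path: one query serves both passes
streamed = list(qs.iterator())       # streaming path: a fresh query, no cache
```

The cached path pays with memory (the whole result set lives in
`_result_cache`); the streaming path pays with a query per iteration.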

One other point-- the DB-API provides cursor.fetchmany, and Django's
iterator uses this correctly.  However, some database libraries
default to a client-side cursor, meaning that even though the API
provides chunking semantics, the library still brings back the entire
resultset in one go.
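The fetchmany pattern in question looks roughly like this (a hypothetical
helper, but it matches any DB-API 2.0 cursor; shown here against a fake
cursor since no database is at hand):

```python
def fetch_in_chunks(cursor, size=100):
    """Pull rows in fixed-size chunks via the DB-API's
    cursor.fetchmany, the pattern Django's iterator uses.
    Hypothetical helper name."""
    while True:
        rows = cursor.fetchmany(size)
        if not rows:                 # empty list means the result set is done
            break
        yield from rows

class FakeCursor:
    """Stand-in DB-API cursor over an in-memory row list."""
    def __init__(self, rows):
        self.rows = rows
        self.calls = 0
    def fetchmany(self, size):
        self.calls += 1
        batch, self.rows = self.rows[:size], self.rows[size:]
        return batch

cur = FakeCursor(list(range(7)))
result = list(fetch_in_chunks(cur, size=3))   # chunks of 3, 3, 1
```

Note this only limits client memory if the library's cursor is genuinely
server-side; with a client-side cursor the chunking is cosmetic, which is
exactly the caveat above.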

I can't remember what psycopg1 does, but psycopg2 defaults to a
client-side cursor.  It is possible to use named server-side cursors,
but Django doesn't do this.  Fixing that has been (low) on my
to-do list for a long time.
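For reference, the psycopg2 detail is that passing name= to
conn.cursor() creates a server-side cursor, and itersize controls rows
per round trip. A sketch, with the function name, cursor name, and
default chunk size all illustrative (verified here against a fake
connection, since it needs a live PostgreSQL connection to run for real):

```python
def stream_rows(conn, sql, chunk_size=2000):
    """Stream a query through a named (server-side) cursor so the
    client never materializes the full result set. Hypothetical
    helper; conn is a psycopg2-style connection."""
    cur = conn.cursor(name="stream_cursor")  # a name makes psycopg2 go server-side
    cur.itersize = chunk_size                # rows fetched per round trip
    cur.execute(sql)
    yield from cur

class FakeCursor:
    """Minimal stand-in for a psycopg2 named cursor."""
    def __init__(self):
        self.itersize = None
    def execute(self, sql):
        self.sql = sql
    def __iter__(self):
        return iter([("a",), ("b",)])

class FakeConn:
    def cursor(self, name=None):
        self.name = name
        self.cur = FakeCursor()
        return self.cur

conn = FakeConn()
rows = list(stream_rows(conn, "SELECT 1", chunk_size=500))
```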

In the common case of small result sets, client-side cursors are
generally a win since they do only one hop to the DB and all of the
results fit in memory easily.

Anyway, either do as Doug B suggests, iterating over slices, or do as
I suggest, directly calling QuerySet.iterator.
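The slicing approach can be sketched like so (hypothetical helper; in
Django each qs[start:stop] slice runs as its own LIMIT/OFFSET query, so
only one chunk of objects is in memory at a time — shown here on a plain
list, which slices the same way):

```python
def iterate_in_slices(qs, chunk=1000):
    """Walk a sliceable query in fixed-size slices. With a Django
    QuerySet, each slice is a separate LIMIT/OFFSET query.
    Hypothetical helper."""
    start = 0
    while True:
        batch = list(qs[start:start + chunk])  # evaluate just this slice
        if not batch:
            break
        yield from batch
        start += chunk

collected = list(iterate_in_slices(list(range(10)), chunk=4))
```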

Doug B is wrong that there isn't much difference, though.  There
certainly is when you have an expensive query whose results don't fit
into memory.

Five Worlds of Software, recommended reading:
http://www.joelonsoftware.com/articles/FiveWorlds.html

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---
