Hi,

I'm using Django 1.0.2.  I have a model with a large number of records.  I
was hoping that the built-in iteration over queryset objects would keep
memory usage low, but it is using up all my memory (and swap), so I'm
forced to break up the work manually using queryset slices.  Iterating
over slices of 10000 entries at a time, my memory usage stays around 40MB,
but without breaking it up manually (i.e., iterating over the entire,
unsliced queryset) it had used over 1GB before I killed the process.

Is there anything I might be missing?

Here are the examples:

# without manual slicing
qs = MyModel.objects.all()
for obj in qs:
    # do something
    # ...
    pass

# with manual slicing
index = 0
qs = MyModel.objects.all()[index:index + 10000]
while qs:
    for obj in qs:
        # do something
        # ...
        pass
    index += 10000
    # each slice clones the base queryset, so only this
    # slice's rows are loaded
    qs = MyModel.objects.all()[index:index + 10000]
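
For what it's worth, here is roughly how I'd wrap that slicing workaround
into a reusable generator (just a sketch; the name iterate_in_chunks and
the default chunk size are my own choices):

def iterate_in_chunks(queryset, chunk_size=10000):
    # Yield objects from an unsliced queryset one slice at a time,
    # so only chunk_size rows are held in memory at once.
    start = 0
    while True:
        chunk = list(queryset[start:start + chunk_size])
        if not chunk:
            break
        for obj in chunk:
            yield obj
        start += chunk_size

for obj in iterate_in_chunks(MyModel.objects.all()):
    # do something
    pass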

Regards,
Casey
