Hello, I'd like to get some hints on memory usage by a Python/Django program.

I have a standalone script that populates a Solr index with data from a
Django application. The queryset I have to process contains over 160000
objects. No matter whether I use iterator() or not, the script eats more
and more memory, finally causing the whole system to crawl due to heavy
swapping. I have tried slicing the queryset into batches of 2000 objects,
explicitly calling "del" on retrieved objects, and assigning None to the
names I use. None of that helps: after retrieving around 130000 objects
the whole system is unusable. It looks like the memory that should be
freed is never reused. The DEBUG setting seems to have no impact on the
overall memory usage. Using values() is not an option because I have to
follow a few foreign keys.
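
Roughly, the batching loop looks like the sketch below (the Article model
and the add_to_solr() helper are only placeholders here; the real schema
and indexing code are more involved):

import gc

from myapp.models import Article  # placeholder for the real model

BATCH_SIZE = 2000


def add_to_solr(obj):
    """Placeholder for the code that turns one object into a Solr document."""
    ...


def index_all():
    total = Article.objects.count()
    for start in range(0, total, BATCH_SIZE):
        # Slice the queryset so only one batch is fetched per query;
        # select_related() follows the foreign keys the index needs.
        batch = Article.objects.select_related().all()[start:start + BATCH_SIZE]
        for obj in batch:
            add_to_solr(obj)
            del obj
        del batch
        gc.collect()  # try to force the freed objects to be reclaimed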

Can you share any hints on how to reduce the memory usage in such a
situation? The underlying database structure is rather complicated and I
would rather not write all the queries by hand.

-- 
Jarek Zgoda
Skype: jzgoda | GTalk: [EMAIL PROTECTED] | voice: +48228430101

"We read Knuth so you don't have to." (Tim Peters)

