Having hauled myself a few feet up out of the abyss of ignorance, I can 
answer my own question (which might be of benefit to others getting started 
with Django). To clarify the problem: I have a standalone script that 
imports the Django settings and uses the ORM's pleasantness to populate one 
of the model tables with a history of 2 million entries.

My initial version was really slow and eventually ran out of memory. After a 
little searching I found two useful things:
- batch the inserts into transactions - the default behaviour is to commit 
after each save(), and bundling several hundred saves per transaction sped 
things up for me by a factor of about 7
- switch DEBUG to False in the settings (something I read in the tutorial 
that wasn't relevant at the time...). This one is covered in the FAQ:
http://docs.djangoproject.com/en/dev/faq/models/#why-is-django-leaking-memory
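To illustrate the batching idea, here is a minimal sketch. The batching helper itself is pure Python; the commented-out usage shows how I wrap each batch in a transaction (the model name HistoryEntry and the shape of the rows are hypothetical, and `transaction.atomic()` assumes a reasonably recent Django - older versions used `commit_manually`/`commit_on_success` instead):

```python
def batched(iterable, size):
    """Yield lists of at most `size` items from `iterable`."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) >= size:
            yield batch
            batch = []
    if batch:  # don't drop the final partial batch
        yield batch

# Usage sketch in the loading script (model name is made up):
#
# from django.db import transaction
#
# for chunk in batched(rows, 500):  # several hundred per transaction
#     with transaction.atomic():    # one commit per chunk, not per save()
#         for row in chunk:
#             HistoryEntry(**row).save()
```

The point is simply that the database commits once per chunk rather than once per save(), which is where the speed-up comes from.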
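On the DEBUG point: as the FAQ entry explains, with DEBUG = True Django records every SQL statement it runs in django.db.connection.queries, so a long-running script accumulates memory without bound. A settings sketch:

```python
# settings.py
# With DEBUG = True, Django keeps every executed SQL statement in
# django.db.connection.queries, which grows without bound in a
# long-running import script. Turning it off stops that accumulation.
DEBUG = False

# Alternatively, if DEBUG has to stay on, the query log can be
# cleared periodically (e.g. once per batch):
#
# from django import db
# db.reset_queries()
```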

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.