manage.py dumpdata and manage.py loaddata are handy tools in many situations. 
In particular, they can be used to transfer data from one database to another 
(say, from SQLite to PostgreSQL).
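For context, the workflow I have in mind is roughly the following (file names 
and the exact setup steps are just illustrative):

    python manage.py dumpdata > dump.json    # with the SQLite settings active
    # point settings.py at PostgreSQL and create the schema, then:
    python manage.py loaddata dump.json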

Unfortunately, dumpdata/loaddata don't work on large databases, because the 
entire data set is pulled into RAM: before serialization on the dump side, and 
after deserialization on the load side.

Is there any way to dump and load data from/to a large DB? Or, if not, is it 
possible to make the serializers/deserializers more memory-efficient (e.g. by 
using iterators instead of lists)?
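
To make the question more concrete, below is roughly the kind of chunked, 
iterator-style dump I'm imagining. It is untested, the app/model name and 
batch size are made-up placeholders, and I'm not sure the stock JSON 
serializer doesn't still buffer everything internally even when given a 
stream:

    from django.core import serializers
    from myapp.models import BigModel  # hypothetical app/model, just for illustration

    BATCH_SIZE = 1000  # arbitrary chunk size

    def dump_in_batches(prefix="bigmodel"):
        qs = BigModel.objects.order_by("pk")
        last_pk = 0
        part = 0
        while True:
            # Fetch the next slice of rows by primary key, so only one
            # batch is ever materialized in memory at a time.
            batch = list(qs.filter(pk__gt=last_pk)[:BATCH_SIZE])
            if not batch:
                break
            # Write each batch to its own fixture file.
            with open("%s_%04d.json" % (prefix, part), "w") as out:
                serializers.serialize("json", batch, stream=out)
            last_pk = batch[-1].pk
            part += 1

The resulting chunk files could then be fed to manage.py loaddata one at a 
time, so no single fixture would ever have to hold the whole table. Is 
something along these lines feasible, or is there a better-supported way?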
