Eric,

The typical mode of operation will be to incrementally load new data
each night - about 10 files, each containing dozens to hundreds of
rows. Maybe only a couple will run into the thousands. I've also
structured my scripts to work in two steps: 1) download the data from
various sources (FTP, HTTP, web scrape, custom API) and save the files
on a server, 2) parse any new files and load them into the database.
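
Roughly, the skeleton looks like the sketch below (the paths, URLs,
and the parse_and_save hook are placeholders, not my real code):

    import os
    import urllib.request

    CACHE_DIR = "/srv/data/cache"                  # assumed to exist
    LOADED_LOG = os.path.join(CACHE_DIR, ".loaded")

    SOURCES = {                                    # illustrative only
        "prices.csv": "http://example.com/prices.csv",
    }

    def download():
        """Step 1: fetch each source and cache the raw file on disk."""
        for fname, url in SOURCES.items():
            data = urllib.request.urlopen(url).read()
            with open(os.path.join(CACHE_DIR, fname), "wb") as f:
                f.write(data)

    def load_new(parse_and_save):
        """Step 2: parse and load any cached file not yet seen."""
        seen = set()
        if os.path.exists(LOADED_LOG):
            seen = set(open(LOADED_LOG).read().split())
        for fname in sorted(os.listdir(CACHE_DIR)):
            if fname in seen or fname.startswith("."):
                continue
            parse_and_save(os.path.join(CACHE_DIR, fname))
            with open(LOADED_LOG, "a") as log:
                log.write(fname + "\n")

Keeping the raw files around is the point of step 1: step 2 can be
rerun against the cache at any time without touching the sources.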

Occasionally I will modify my schema, drop the old tables, and reload
all the data from the "cached" files. Over time that could easily grow
to millions of rows, and some optimization of my technique will be in
order. But for now, the simple approach (parse a row, create a Django
model object, stuff it and save it) works fine.
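
The per-row load is nothing fancier than this (Stock and its fields
are made-up stand-ins for my actual models):

    import csv

    from myapp.models import Stock   # placeholder model, for illustration

    def parse_and_save(path):
        """Parse one cached CSV file, saving one model object per row."""
        with open(path) as f:
            for row in csv.DictReader(f):
                obj = Stock(symbol=row["symbol"], close=row["close"])
                obj.save()           # one INSERT per row

One save() per row means one INSERT per row, which is fine at dozens
to hundreds of rows a night - and is exactly why the full-reload case
will eventually want some batching.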

-Dave

