On 3/23/06, DavidA <[EMAIL PROTECTED]> wrote:
> Occasionally I will modify my schema, drop the old tables and reload
> all the data from the "cached" files. Over time that could easily be
> millions of rows and some optimization of my technique will be in
> order. But for now, the simple approach (parse a row, create a django
> model object, stuff it and save it) works fine.
For bulk-loading stuff into a database, I would definitely recommend bypassing the Django layer entirely. There's no need to use an ORM for that; use whatever bulk-loading capability your database provides. Or, if you absolutely want to do it in Python, use the cursor.executemany() functionality of the Python DB-API layer.

Adrian

--
Adrian Holovaty
holovaty.com | djangoproject.com

--
You received this message because you are subscribed to the Google Groups "Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/django-users
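[Editor's note: the executemany() suggestion above can be sketched as follows. This is a minimal illustration using the stdlib sqlite3 module and a hypothetical "quotes" table as stand-ins; swap in your own DB-API adapter and schema.]

```python
import sqlite3

# Stand-in database and schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (symbol TEXT, price REAL)")

# Parsed rows from the cached files, as a list of tuples.
rows = [("GOOG", 341.22), ("AAPL", 59.96), ("MSFT", 27.25)]

# One executemany() call replaces a per-row loop of ORM object
# creation and save(); the adapter binds each tuple's parameters.
conn.executemany("INSERT INTO quotes (symbol, price) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0]
print(count)  # → 3
```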