On Wed, Jun 3, 2009 at 11:38 PM, Kegan Gan <ke...@kegan.info> wrote:
>
> Hi Russell,
>
> On the first issue: Good point. I have not had the opportunity to work
> with such a huge database.
>
> On the second issue: Yes, what I am doing now is really about writing
> conversion code to fit the old JSON to the new schema. I find
> this to be quite straightforward for my use cases; it's only a matter
> of modifying a list of hash objects. But I probably haven't encountered
> the more complicated use cases out there.
>
> I feel the fact that the Django ORM works, and dumpdata+loaddata
> works, makes it compelling as the foundation for a migration tool.
> Developers can work with 100% Python code, without needing to learn a
> migration tool's specific APIs, and it may be easier to add some logic
> to the migration process (for example, calculating default values for
> new fields based on some existing database data).
>
> Also, independent Django app (as in INSTALLED_APPS) developers can
> all rely on the serialized JSON format and provide their own migration
> code when they release a new version of their app. (How is this
> currently done anyway? Let's say django-tagging changes a model
> class in the next release.)

They Don't (tm) :-)

Or, to be more precise: they try everything possible to avoid
changing the model, and if they do need to, they publish the series of
SQL ALTER commands needed to update the tables in situ.
Another approach is to change the name of the table, and provide
commands to read from the old table and insert into the new. The
changes made to Django's comments during the 0.96-1.0 transition give
you one example of how this can be done [1].

[1] http://docs.djangoproject.com/en/dev/ref/contrib/comments/upgrade/
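For concreteness, here is a minimal sketch of those two approaches using Python's sqlite3 module. The "tag" table, its columns, and the v2 table name are all invented for illustration; they are not taken from django-tagging or any real app's schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The old schema, with some existing data in place.
cur.execute("CREATE TABLE tag (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO tag (name) VALUES ('django'), ('python')")

# Approach 1: publish ALTER commands that update the table in situ.
cur.execute("ALTER TABLE tag ADD COLUMN slug TEXT")

# Approach 2: create a table under a new name, then read from the old
# table and insert into the new, computing values for new fields as
# you go.
cur.execute(
    "CREATE TABLE tag_v2 (id INTEGER PRIMARY KEY, name TEXT, slug TEXT)"
)
cur.execute(
    "INSERT INTO tag_v2 (id, name, slug) "
    "SELECT id, name, lower(name) FROM tag"
)
conn.commit()

rows = cur.execute("SELECT name, slug FROM tag_v2 ORDER BY id").fetchall()
print(rows)  # [('django', 'django'), ('python', 'python')]
```

Either way, the work happens inside the database, so the data is never round-tripped through a serialized copy on disk.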

> PS: As for this three points ...
>
>>>> * If you add a new non-null field, the fixture won't load because it won't 
>>>> provide data for that field.
>>>> * If you change the name of a field, the fixture will contain data for the 
>>>> old name, but no data for the new name. Any existing data in that field 
>>>> will be lost.
>>>> * If you change the type of a field, there is no guarantee that the old 
>>>> data will be format-compatible with the new field.
>
> ... the old fixtures are loaded with old model class. The serialized
> json is modified to fit the new model class, and then loaded using the
> new model class with "manage loaddata". This is done on a per app
> basis (only for the app that needs the migration).

As I said in my last email, this can work. However, at the very least,
you double your disk space requirements. Depending on the nature of
the conversion, there may also be considerable processing time and
temporary memory requirements.
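To make the trade-off concrete, the fixture rewrite being described might look something like this sketch. It rewrites a dumpdata-style JSON fixture to match a new schema; the app, model, and field names ("blog.entry", "name" renamed to "title", a new "slug" field) are hypothetical:

```python
import json

# A tiny dumpdata-style fixture under the old schema. In practice this
# would be read from the file produced by "manage.py dumpdata".
old_fixture = json.loads(
    '[{"model": "blog.entry", "pk": 1,'
    ' "fields": {"name": "Hello World", "body": "..."}}]'
)

new_fixture = []
for obj in old_fixture:
    fields = dict(obj["fields"])
    # Renamed field: carry the old data over under the new name.
    fields["title"] = fields.pop("name")
    # New non-null field: compute a default from the existing data.
    fields["slug"] = fields["title"].lower().replace(" ", "-")
    new_fixture.append(
        {"model": obj["model"], "pk": obj["pk"], "fields": fields}
    )

# The converted fixture would then be written out and loaded with
# "manage.py loaddata".
print(json.dumps(new_fixture))
```

Note that the entire fixture is held in memory here alongside its converted copy, and both the old and new JSON files exist on disk at once, which is exactly the overhead mentioned above.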

Ultimately, I suppose my point is this: SQL databases provide an
extensive and reliable infrastructure for managing schema
conversions. While you _can_ avoid that infrastructure, there comes a
point at which you are reinventing wheels just to avoid using a
particular brand of tyre.

Yours
Russ Magee %-)

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---