You may want to create a Jira issue on migration and link it as related to, or incorporate it into, this one:
https://issues.apache.org/jira/browse/OFBIZ-635

My short answer: I have been toying with the idea of using multi-tenant support to create a new DB, with a script that moves all the data from the default DB to the new one. Then, using a different location for the new code, create a new default DB, connect it to the multi-tenant DB created before, and use tools to compare the two for data and structure differences. Your testing can be done on both DBs until you are satisfied, without affecting production. Eventually this will produce the migration data and the steps that need to be done. You can repeat this with more multi-tenant DBs to migrate and check progress until the migration is complete.
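For the data-difference part of that comparison, a rough first pass could compare per-table row counts between the two databases. This is only a sketch: it assumes PostgreSQL, and the database names ofbiz_default and ofbiz_tenant1 are hypothetical.

```shell
#!/bin/sh
# Compare per-table row counts between two databases (names are placeholders).
# n_live_tup is an estimate; run ANALYZE on each DB first for better numbers.
for db in ofbiz_default ofbiz_tenant1; do
  psql -At -d "$db" -c \
    "SELECT relname || ' ' || n_live_tup FROM pg_stat_user_tables ORDER BY relname" \
    > "$db.counts"
done
diff -u ofbiz_default.counts ofbiz_tenant1.counts
```

Tables whose counts differ are the first places to look for data the migration script missed.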


Matt Warnock sent the following on 9/16/2010 6:01 PM:

Hi all:

Sorry for the length of this post, but I'm trying to get a handle on
best practices for code updates, and want to make sure I understand the
process.  I envision that the upgrade process for a production database
goes something like this:

1) Back up everything (code and DB), in case you have to roll back.
2) Do "svn info" and note the OLD revision number (is it in the DB at all?).
3) Do "svn up" and note the NEW revision number.
4) Check the wiki page BJ cites below to see what DB changes are needed.
5) Build the new code, integrating local modifications.
6) Test like crazy, especially the DB migration services/changes.
7) Automate & test the migration process with a copy of the production DB.
8) Roll out the new DB and server, ready with a fast "rollback" if needed.
9) Make sure that the business stays up and running.

Whether this process is done weekly, monthly, or "as feasible" with
trunk, or only every year or two between OOTB stable releases, the
upgrade path is about the same, is it not?  The only real difference is
the number of database changes that might be needed, and the time frame
for advance testing on the selected new code revision.  Could there be
data migration issues from one revision to another within release 9.04,
for example?  I'm assuming that would be unlikely, at least.

It seems like running one consistent production DB, under two different
versions of OFBiz, is likely to be problematic. So the actual conversion
of the production DB probably needs to be done overnight, if possible.
As the newly-modified production database is used in practice, and
diverges from the old-version backup, "rolling back" soon becomes
increasingly painful (days of lost work), eventually becoming a
non-option.  Am I envisioning this correctly?

David warned last week or so that the data migration page BJ cites below
is manually updated, therefore not guaranteed to be 100% correct. It
also seems to only contain changes to table structure, not to seed data.
I assume that new code may depend not only on new table structure, but
also on data types and distinctions defined in new, additional seed
data.

Thinking about that, I conclude that it would be really important to
know, in a fully automated way, what differences exist in the entity
definitions, or seed data content, between arbitrary revisions X and Y.
Does such a tool exist?  Or does the entity-check code run on startup
somehow fully answer that need?

If not, it occurs to me that if I do a pg_dump (or the equivalent in
Derby) immediately after running "./ant run-install-seed" for two
arbitrary versions of OFBiz, I should be able to "diff" the two
resulting files and see *ALL* the changes to both table structure, and
seed data. A little more munging of those results could yield a more
meaningful result, like "In table XYZ, change field ABC from char(25)
to char(30)" or "add (or subtract) a row of seed data as follows:".
Are there problems you see with such an approach (other than that it is
totally DB-specific, of course)?

Am I worrying too much?  I'm not betting the farm on OFBiz today, but if
all goes according to plan, I soon will be.
