Yes - I agree with all of that.

Using database compare tools to produce a production database migration just
seems like an act of desperation, though it does make sense as a safety check.

The most successful approach I've used is plain database scripts whose file
names start with a timestamp (YYYY-MM-DD HH-MM-SS) followed by a description of
what the script does.  The automated build starts by loading the production
schema, reference tables and perhaps a small sample of data and then applies
the scripts in collation sequence.  The automated tests can then verify that
whatever the scripts were supposed to do worked.  This was on a team of
about 5 developers but I can't see any reason why that wouldn't scale up to
a large team.

Each developer adds scripts whenever necessary and you just use your version
control system to share database changes within the team - the database is
generated from them each time you run a build.
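
In the repository that's nothing more than an ordinary folder of scripts -
these names are invented, but they show the convention:

    migrations/
        2010-04-28 09-15-00 add email column to customer.sql
        2010-04-30 16-42-10 backfill customer email from legacy contacts.sql
        2010-05-06 11-03-55 add invoice status reference rows.sql

Because the timestamp sorts lexically, the scripts need no extra manifest to
keep them in order.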

The Rails approach of using a DSL to abstract away the nasty DDL seems like
overkill - especially since many of your scripts are going to need to modify
data as well as the schema.  It's also much, much easier to restore a database
backup than to try to undo database schema changes.
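
Just as an illustration of what I mean by mixing the two - the table and
column names here are invented, and again sqlite3 only stands in so the
snippet runs as-is; on a real project this would just be one of the
timestamped .sql files:

    import sqlite3

    # A made-up migration that changes the schema AND fixes up existing data
    # in one script - the sort of thing a DDL-only abstraction makes awkward.
    migration = """
    ALTER TABLE Customer ADD COLUMN EmailConfirmed INTEGER NOT NULL DEFAULT 0;
    UPDATE Customer SET EmailConfirmed = 1 WHERE Email IS NOT NULL;
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript("CREATE TABLE Customer (Id INTEGER PRIMARY KEY, Email TEXT);")
    conn.executescript(migration)
    conn.commit()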

Maybe this should have been a thread in itself - it is possibly a
contentious topic.  I certainly find it interesting.

The Rails-style migration approach was quite a revelation for me - I wonder
if people use these database projects only because they aren't aware that it
doesn't need to be that complicated.

I was really quite shocked at the effort that the VS 2010 developers had put
into elaborating the database projects - I'd always thought they were a
ridiculous waste of time.  What those projects do seems quite cool and
complicated but I'd never want a production deployment to rely on it.

Anyone strongly in favour?  I'd be very interested to understand why.

On Fri, May 7, 2010 at 2:17 PM, Greg Keogh <g...@mira.net> wrote:

>  Good timing on this subject, as I was at a meeting this morning where we
> discussed how to manage multiple databases that are slowly diverging from
> each other and we have to stop this happening in a reliable way. We don’t
> have a dedicated DBA, just 3 developers all doing mixed tasks.
>
>
>
> We traditionally sent each other scripts to modify different DBs, but no
> matter how hard you try, they still diverge and things may happen in the
> wrong order and waste time. So we used RedGate to compare DBs and tell us
> how we had diverged and how to get the schemas back in sync again. I was
> never happy with this and I tried to convince the others that we need a
> master script to create the real DB and standard enum rows and a similar
> tool to migrate data from a known base DB version.
>
>
>
> One of our blokes has finally done the right thing, and after long hours of
> concentration has made master scripts to create and load DBs. We still find
> it’s a nuisance that you can’t manage DB scripts in version control as
> smoothly as you can with source code.
>
>
>
> Where is the “source of truth” for a database? It’s not one of the DBs
> themselves, it’s the script that makes the DB. DB compare tools should be
> for sanity/safety checks, not for primary guidance in DB development.
>
>
>
> Greg
>
