Could you please elaborate on 

 * All database updates (tables, SPs, population scripts, etc.) are to be
checked into the source control system

 

That is, do you have one .txt file per stored proc and table definition? If
yes, do you simply copy the stored proc contents from VS into a text file?

 

Wal

 

 

 

From: ozdotnet-boun...@ozdotnet.com [mailto:ozdotnet-boun...@ozdotnet.com]
On Behalf Of Jano Petras
Sent: Tuesday, 1 June 2010 8:15 AM
To: ozDotNet
Subject: Re: Source control of DB scripts

 

Greg,

An approach that worked well in the past for me was:

 * Developers are given no permission to execute DDL statements on the
staging / dev databases. They still have full access to their local copies.
 * All database updates (tables, SPs, population scripts, etc.) are to be
checked into the source control system
 * All of the above scripts are to be re-runnable (for example, when a new
field is to be added to a table, the ALTER statement is wrapped in an IF that
checks whether the field already exists; see the sketch after this list)
 * The continuous integration server periodically (or on request) executes
all current SQL scripts found in the repository, in whatever way that set is
defined (could be a TXT file, a DB table, a version numbering system, etc.),
recreating the stored procedures, views, tables and so on
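
A minimal sketch of what such a re-runnable script can look like on SQL
Server (the table, column and procedure names are only illustrative):

-- Re-runnable: only adds the column if it is not already there, so the
-- CI server can execute this script any number of times.
IF NOT EXISTS (SELECT 1 FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'Email')
BEGIN
    ALTER TABLE dbo.Customer ADD Email nvarchar(255) NULL;
END
GO

-- Stored procedures are simply dropped and recreated, so the latest
-- version checked into the repository always wins.
IF OBJECT_ID(N'dbo.GetCustomer', N'P') IS NOT NULL
    DROP PROCEDURE dbo.GetCustomer;
GO

CREATE PROCEDURE dbo.GetCustomer
    @CustomerId int
AS
    SELECT CustomerId, Email
    FROM dbo.Customer
    WHERE CustomerId = @CustomerId;
GO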


This way all updates are kept in the repository and are 'scripted' in a way
that makes the DB deployment testable and manageable.


j.



On 31 May 2010 12:56, Alan Heywood <a...@heywood.id.au> wrote:

Hi Greg,

 

I use MigratorDotNet <http://code.google.com/p/migratordotnet/>  to address
this issue.  Migrator allows me to define incremental changes to the
database using code, which is then checked into source control along with
everything else.  We have a Continuous Integration setup, and when a build
is triggered the following occurs:

1.      Code is compiled.
2.      MigratorDotNet is called, passing in a reference to the xxx.data.dll
from our project.
3.      Migrator uses reflection to find the list of 'migrations'.  Each
migration's version number is compared to the current version stored in a
table in the database.  If there are new migrations to be run then they are
executed against the database.

Using this approach you can create tables, define foreign key constraints,
add new columns, etc.  You can also execute arbitrary SQL statements.  This
means that you can also add or modify data as part of a migration.
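
For anyone who hasn't used a migration tool, the version check in step 3 can
be illustrated with plain T-SQL. This is only a sketch of the idea; the
bookkeeping table and names below are illustrative, not MigratorDotNet's
actual schema:

-- Hypothetical bookkeeping table holding the schema version that has
-- already been applied to this database (illustrative only).
IF OBJECT_ID(N'dbo.SchemaVersion', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.SchemaVersion (Version bigint NOT NULL);
    INSERT INTO dbo.SchemaVersion (Version) VALUES (0);
END
GO

-- A migration numbered 20100601 only runs if the database is behind it,
-- and it records that it has been applied.
IF (SELECT Version FROM dbo.SchemaVersion) < 20100601
BEGIN
    ALTER TABLE dbo.Orders ADD CONSTRAINT FK_Orders_Customer
        FOREIGN KEY (CustomerId) REFERENCES dbo.Customer (CustomerId);
    UPDATE dbo.SchemaVersion SET Version = 20100601;
END
GO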

 

Works really well for me, particularly since I would otherwise be manually
attempting to keep > 10 databases in sync.


Alan

On 31 May 2010 11:14, Greg Keogh <g...@mira.net> wrote:

Folks, I'm sure we've all had problems where multiple developers change SQL
Server scripts and they get out of whack and waste time with stupid errors.

 

I'm starting a fresh app and I thought I'd experiment with keeping scripts
in SVN. It just means that we have to remember to always save a script to
the source-controlled file whenever it's changed.

 

Because scripts aren't compiled source code, there is still the risk of
human error in not pushing an updated script file into the DB. I was thinking
of concocting a utility which automatically pushes changed scripts into the
DB, but before I start fiddling I thought I'd ask about this subject in
general first. Are there others out there who source control their DB scripts
and have techniques for reducing human error? Or perhaps there are better
techniques that I've completely overlooked.

 

Greg

 

 
