Testing with legacy data

2013-08-08 Thread Dan Gentry
I'm not a fan of testing with actual data, except perhaps as a final run to make
sure no existing data breaks anything, or for stress testing with large volumes
of data. Your legacy DB will not cover all of the cases your code needs to
handle.

Instead, write quality unit tests and functional tests that exercise all
scenarios using factory_boy or a similar data generator.
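
The factory approach can be sketched with a tiny hand-rolled factory in plain Python (shown here in the spirit of factory_boy rather than using the library itself; the `Customer` class and its fields are invented for illustration):

```python
import itertools
from dataclasses import dataclass

# Hypothetical model, used for illustration only -- substitute your
# own Django model or plain class.
@dataclass
class Customer:
    id: int
    name: str
    is_active: bool

class CustomerFactory:
    """A minimal hand-rolled factory: each call produces a fresh
    object with sensible defaults that an individual test can
    override, instead of relying on rows from a legacy dump."""
    _seq = itertools.count(1)

    @classmethod
    def create(cls, **overrides):
        n = next(cls._seq)
        defaults = {"id": n, "name": f"customer-{n}", "is_active": True}
        defaults.update(overrides)
        return Customer(**defaults)

# A test can then build exactly the scenario it needs:
inactive = CustomerFactory.create(is_active=False)
```

factory_boy offers the same idea with declarative `Factory` classes, sequences, and Django ORM integration.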

My 2¢

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to django-users+unsubscr...@googlegroups.com.
To post to this group, send email to django-users@googlegroups.com.
Visit this group at http://groups.google.com/group/django-users.
For more options, visit https://groups.google.com/groups/opt_out.




Re: Testing with legacy data

2013-08-06 Thread Tom Evans
On Tue, Aug 6, 2013 at 5:59 AM, Jani Tiainen  wrote:
> Hi,
>
> I have a legacy database that is rather large (around 300MB) and contains a lot
> more than just data (triggers, stored procedures and so on).
>
> How can I test with such data? Preferably I would like to load the data into the
> database, run a test, roll back the changes, and run the next test.
>
> But I really wouldn't like to recreate the database from scratch every time, or
> re-import the data every time.
>
> Any suggestions on how I could proceed?
>

You want to snapshot the pristine copy of the database, perform your
tests against it, and then roll it back to the snapshot (or copy the
snapshot, run the tests on the copy, drop the altered copy and
re-copy; it's the same effective process).

Theoretically, with MySQL (but not MySQL+InnoDB) you can lock and
flush tables with "FLUSH TABLES WITH READ LOCK", which stops anything
from writing to the data files in MySQL's data directory. You can then
hot-copy the database files, creating your snapshot.

If your file system is clever, like ZFS (other copy-on-write file
systems are available), your MySQL data directory can live on a
separate filesystem that you snapshot and roll back with ease; of
course, you must make sure the on-disk files are consistent at the
point the snapshot is taken.

Good luck integrating either of those two options into setUp/tearDown.

Cheers

Tom





Testing with legacy data

2013-08-05 Thread Jani Tiainen
Hi,

I have a legacy database that is rather large (around 300MB) and contains a lot
more than just data (triggers, stored procedures and so on).

How can I test with such data? Preferably I would like to load the data into the
database, run a test, roll back the changes, and run the next test.

But I really wouldn't like to recreate the database from scratch every time, or
re-import the data every time.

Any suggestions on how I could proceed?

-- 

Jani Tiainen
