To all who replied:

Thank you.  I made a typo: I meant "transactional" instead of "truncate".

I had not seriously considered pg_dump / pg_restore because I assumed it would 
be fairly slow, but I will experiment with pg_restore and the template 
techniques this weekend and see which ones prove viable.
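
In case it helps anyone searching the archives later, the rough shape of what 
I plan to try is below.  The database and file names are made up, and I have 
not timed any of this yet:

  # One time: build a template database seeded from a trimmed production dump.
  $ pg_dump -Fc -f prod_subset.dump myapp_production
  $ createdb myapp_template
  $ pg_restore -d myapp_template prod_subset.dump

  # Per test run: clone the template.  CREATE DATABASE ... TEMPLATE does a
  # file-level copy, so it should be much faster than re-running pg_restore.
  # (It does require that nothing else is connected to the template.)
  $ dropdb --if-exists myapp_test
  $ createdb -T myapp_template myapp_test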

I know about and use database_cleaner and intend to offer it to them.  I 
posted a comment on the bug report you provided.

Loading a separate SQL file sounds like it gets you 99% of the way there.  How 
is the performance?  I would expect pg_restore to be faster.

The other advantage of pg_restore or loading an SQL file is that you can have 
different files for different tests, along these lines:
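
  # Hypothetical per-test seed files, loaded into a clean test database:
  $ psql -d myapp_test -v ON_ERROR_STOP=1 -f test/seeds/orders_case.sql
  $ psql -d myapp_test -v ON_ERROR_STOP=1 -f test/seeds/billing_case.sql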


On Mar 12, 2013, at 10:09 PM, Joe Van Dyk wrote:

> On Mar 12, 2013, at 8:42 AM, Perry Smith <pedz...@gmail.com> wrote:
> 
>> I tried posting this from Google Groups but I did not see it come through 
>> after an hour so this may be a duplicate message for some.
>> 
>> The current testing technique for things like Ruby on Rails offers three 
>> choices, but none of them will work in my case. 
>> 
>> The first choice is "truncate", which starts a transaction before each test.
> 
> I think you mean "transactional tests" here, not "truncate".  While the test 
> database is truncated once at the start of all the tests, each test runs 
> inside its own transaction, which is never committed. 
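> 
> Schematically, each test amounts to something like this (a sketch of the 
> idea, not Rails' actual implementation):
> 
> $ psql -d myapp_test <<'SQL'
> BEGIN;
> -- per-test setup, exercise, and assertions happen here
> ROLLBACK;  -- never committed, so the database is left unchanged
> SQL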
> 
>>  If the testing is within the same process, this works and a rollback 
>> restores the DB.  But if the testing involves two processes, then the test 
>> data entered by the test rig cannot be seen by the "system under test" 
>> (SUT).  With Rails, there are times when this is needed: the test rig 
>> drives a browser which calls into a Rails application.  There are 
>> "dangerous" ways to still use this method, but each has various downsides 
>> or risks. 
> 
> IMO, you don't want to use transactional tests.  When you do, ActiveRecord 
> will use savepoints to mimic transactions.  This means now() will never change 
> during the tests, deferred constraints/triggers won't work, other processes 
> can't see the data, etc.  The system behaves differently in the test 
> environment than in the real one, which is bad. 
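> 
> For example, within a single transaction now() is frozen at the 
> transaction's start time:
> 
> $ psql -d myapp_test <<'SQL'
> BEGIN;
> SELECT now();              -- transaction start time
> SELECT pg_sleep(2);
> SELECT now();              -- same value as above; now() does not advance
> SELECT clock_timestamp();  -- this one does advance
> ROLLBACK;
> SQL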
> 
> If you are treating the database as a really dumb store of data, then you may 
> want to use transactional tests. But be aware of the caveats. 
> 
>> 
>> The other two choices are delete and truncate, both of which end up with an 
>> empty database just after each test.  This prevents any test data that is 
>> already in the database from being used after the first test.  Note that a 
>> "test run" will run through a sequence of tests (usually quite a few). 
>> 
>> All of these are fairly fast with each one being "faster" under different 
>> conditions (according to users). 
>> 
>> Generally, this pushes the Rails community to have either "fixtures" or 
>> "factories".  Both are OK solutions, but both also have problems.  In my 
>> case, I have a dozen or so tables, all with very tight constraints, and 
>> creating either fixtures or factories is very troublesome.  Also, I have a 
>> real database with real data in production, and it seems foolish not to take 
>> advantage of the knowledge contained within that database.  By "knowledge" I 
>> mean the particular values and weirdness within the data that a factory or a 
>> fixture might not capture. 
>> 
>> One choice would be to create the database, use it, and then drop it for 
>> each test.  I would create the database from a template that already has 
>> data taken from the production database (and probably trimmed down to a 
>> small subset of it).  This requires some crafty dancing in the Rails setup, 
>> since it likes to just attach to a database and run, but it could be done.  
>> At first blush, this sounds like it would be really slow, but maybe not. 
>> 
>> The other choice would be to somehow copy the data to temporary tables 
>> before the test run and then copy it back afterward.  The advantage of this 
>> is that it is not very PostgreSQL-specific.  Indeed, if the template 
>> database is already set up, then only one copy would be needed at the start 
>> of the test. 
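>> 
>> For a single table, I imagine something along these lines (a sketch only; 
>> the table name is made up, and with foreign keys the TRUNCATE may need 
>> CASCADE or ordered deletes):
>> 
>> $ psql -d myapp_test <<'SQL'
>> -- before the run: save a copy
>> CREATE TABLE users_backup AS TABLE users;
>> -- ... the test run mutates users ...
>> -- after the run: restore and clean up
>> TRUNCATE users;
>> INSERT INTO users TABLE users_backup;
>> DROP TABLE users_backup;
>> SQL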
>> 
>> The other thought I had is whether there is some type of "leaky" 
>> transaction: a transaction where another process can see the data but a 
>> rollback would still work and be effective.  Essentially I'm asking if all 
>> the protections a database offers could be dropped... it sounds unlikely, 
>> but I thought I'd see if it was possible. 
>> 
>> The other thought is perhaps there is a "snapshot" type concept.  I don't 
>> see one in the list of SQL commands.  A "snapshot" would do exactly what it 
>> sounds like: it would take a snapshot of the database and save it somehow.  
>> Then a "restore to snapshot" would restore the DB back to that state. 
> 
> This would be super super super awesome, but it doesn't exist as far as I 
> know. This would be a "permanent snapshot" that could be easily and quickly 
> restored. 
> 
> I wonder if it would be possible to make an extension that made this easy to 
> do. 
> 
>> 
>> I thought this group might suggest other ideas and either nuke the really 
>> bad ideas or promote the plausible ideas I've mentioned above. 
> 
> Make sure to look at database_cleaner if you haven't yet. Also this may be 
> interesting: https://github.com/bmabey/database_cleaner/issues/80
> 
> Personally, I use database_cleaner's delete method, plus I load a SQL file at 
> the beginning of each test. 
> 
> 
>> 
>> Sorry for the long post.  I appreciate your thoughts. 
>> 
>> Perry 
>> 
>> 
>> -- 
>> Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
>> To make changes to your subscription:
>> http://www.postgresql.org/mailpref/pgsql-general
