On May 27, 2010, at 10:00 AM, Chris Withers wrote:

> Hi All,
> 
> We currently run unit tests against sqlite in memory but deploy against MySQL.
> 
> http://stackoverflow.com/questions/2716847/sqlalchemy-sqlite-for-testing-and-postgresql-for-development-how-to-port
> 
> ...suggests this is a bad idea. I'm inclined to agree, but...
> 
> ...running our unit tests against MySQL is an order of magnitude or so slower 
> than against sqlite. This is probably due to the tables being dropped and 
> created for each and every test.
> 
> How do people get around this? What's best practice in this area?

Your test suite ideally wouldn't be tearing down and building up tables many 
times. For an application where the testing is against a fixed set of tables 
(i.e. not at all like SQLA's own unit tests), you would run all your tests in 
transactions that get rolled back when the test is complete.

I use setup/teardown functions like this for this purpose (this assumes a 
scoped_session, which yes, you should probably use all the time anyway so that 
the session is always accessed through a single reference):

# assumes "Session" is the application's scoped_session
transaction = None

def setup_for_rollback():
    # give each test a fresh session on a dedicated connection
    Session.remove()
    sess = Session()
    c = sess.bind.connect()
    global transaction
    transaction = c.begin()
    # rebind the session to the connection so all of its work
    # happens inside the outer transaction
    sess.bind = c

def teardown_for_rollback():
    # discard everything the test did, including its "commits"
    transaction.rollback()
    Session.remove()
above, "transaction" is the "real" transaction.  All begin/commits inside don't 
actually commit anything.
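
As a rough sketch of how that plugs into a test suite - module and class names 
here (myapp.model, User, RollbackTestCase) are just placeholders, not anything 
from your code - something like:

import unittest

# hypothetical application imports; substitute your own module / mapped class.
# setup_for_rollback / teardown_for_rollback are the functions defined above.
from myapp.model import Session, User

class RollbackTestCase(unittest.TestCase):
    """Base class: every test runs inside the outer transaction."""

    def setUp(self):
        setup_for_rollback()

    def tearDown(self):
        teardown_for_rollback()

class UserTest(RollbackTestCase):
    def test_add_user(self):
        sess = Session()
        sess.add(User(name='ed'))
        # per the above, this commit participates in the enclosing
        # transaction and is thrown away by the teardown's rollback
        sess.commit()
        assert sess.query(User).filter_by(name='ed').count() == 1

With that in place you create the tables once at the start of the run (or 
they just already exist in the test database), instead of dropping and 
recreating them for every test.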
