Hello,

I'm working on a larger project which uses SQLAlchemy. We started by
writing automated tests that use an in-memory SQLite database,
inserting test data in setUp() and emptying the tables in tearDown().
Even though in-memory SQLite is pretty fast, inserting the test data
(sometimes around a hundred records) takes a significant amount of
time. As a result, a test which should take 30ms ends up taking 300ms.
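
To give an idea of the pattern, a stripped-down version of our tests
looks roughly like this (the model here is made up; the real mappings
are bigger):

import unittest
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

# Stand-in model; the real project has many more mappings.
class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

class UserQueryTest(unittest.TestCase):
    def setUp(self):
        self.engine = create_engine('sqlite:///:memory:')
        Base.metadata.create_all(self.engine)
        Session = sessionmaker(bind=self.engine)
        self.session = Session()
        # This is where most of the time goes: inserting fixture rows.
        self.session.add_all([User(name='user%d' % i) for i in range(100)])
        self.session.commit()

    def tearDown(self):
        # Empty every table so the next test starts from a clean slate.
        for table in reversed(Base.metadata.sorted_tables):
            self.session.execute(table.delete())
        self.session.commit()
        self.session.close()

    def test_count(self):
        self.assertEqual(self.session.query(User).count(), 100)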

One solution would probably be to create a DAO layer with find_by..()
methods on top of SQLAlchemy and then write a DAO mock which stores
records in an array. Another would be to create a fake session which
stores records in an array. The latter would probably be better,
because it avoids adding another unnecessary layer, but it is harder
to implement because of the complex query API. Rough sketches of both
ideas follow.
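
For the first option, I am thinking of something roughly like this
(UserDao and find_by_name() are made-up names):

# The real DAO just delegates to SQLAlchemy...
class UserDao(object):
    def __init__(self, session):
        self.session = session

    def add(self, user):
        self.session.add(user)

    def find_by_name(self, name):
        return self.session.query(User).filter_by(name=name).all()

# ...while the test double keeps everything in a plain list, with no
# database involved at all.
class InMemoryUserDao(object):
    def __init__(self):
        self._users = []

    def add(self, user):
        self._users.append(user)

    def find_by_name(self, name):
        return [u for u in self._users if u.name == name]

The fake session looks trivial on the add() side; it is query() that
worries me (FakeSession is a made-up name):

class FakeSession(object):
    def __init__(self):
        self.store = {}

    def add(self, obj):
        self.store.setdefault(type(obj), []).append(obj)

    def query(self, cls):
        # This would have to return something behaving like Query:
        # filter(), filter_by(), order_by(), join(), all(), one(), ...
        raise NotImplementedError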

How do you cope with this situation?

Best regards,
Adam