Thank you for the extensive reply. It makes things a lot clearer;
still, I am not sure how best to continue.

Conceptually, I would like to create two sets of tables/classes in a
database (as part of a prototype):

1) one set of tables/classes holding the parameters from which to
generate the other classes/tables,
2) one set of tables/classes that is automatically generated from the
parameters in the first set; this set will use joined-table inheritance
with a single root base table/class.

The only database-level link between these two sets is the
'polymorphic_on' column in the root base table of set 2, which is a
foreign key to a Type table in set 1.
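
To make this a bit more concrete, roughly the layout I have in mind (a
minimal sketch only; the class and column names below are placeholders,
not my actual code):

    from sqlalchemy import Column, Integer, String, ForeignKey
    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()

    # set 1: parameter tables describing the classes/tables to generate
    class TypeDef(Base):
        __tablename__ = 'type_def'
        id = Column(Integer, primary_key=True)
        name = Column(String(50), unique=True)

    class FieldDef(Base):
        __tablename__ = 'field_def'
        id = Column(Integer, primary_key=True)
        type_id = Column(Integer, ForeignKey('type_def.id'))
        name = Column(String(50))
        datatype = Column(String(20))

    # set 2: the single root base table of the generated joined-inheritance
    # hierarchy; its polymorphic_on column is the only link to set 1
    class GeneratedRoot(Base):
        __tablename__ = 'generated_root'
        id = Column(Integer, primary_key=True)
        type_id = Column(Integer, ForeignKey('type_def.id'))
        __mapper_args__ = {'polymorphic_on': type_id}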

For a typical test, I would like to:

1) create records in the set-1 tables (representing classes/tables with
their attributes/foreign keys and fields),
2) generate the tables/classes from these records, with the new tables
ending up in set 2 (see the sketch after this list),
3) add records to the generated tables/classes and test whether adding,
updating, deleting and querying work as intended.
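
Roughly how I picture step 2: building a mapped subclass of the set-2
root from a set-1 record. Again only a sketch; generate_class is a
hypothetical helper that reuses TypeDef/GeneratedRoot from the sketch
above and ignores the FieldDef rows for brevity:

    from sqlalchemy import Column, Integer, ForeignKey

    def generate_class(type_def):
        # joined inheritance: the generated table gets its own primary key
        # that is also a foreign key to the root table
        attrs = {
            '__tablename__': type_def.name.lower(),
            'id': Column(Integer, ForeignKey('generated_root.id'),
                         primary_key=True),
            '__mapper_args__': {'polymorphic_identity': type_def.id},
        }
        # creating the class with the declarative base in its bases also
        # creates the Table and the mapping
        return type(str(type_def.name), (GeneratedRoot,), attrs)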

To be able to run several of these tests in one go, I need to empty the
set-1 tables between individual tests, and I need to completely remove
everything belonging to set 2 (mappings, class definitions, records,
tables) between individual tests.

I (naively) thought of a few ways this might be possible:

1) use two separate MetaData objects for the same database, bind them
to separate 'Base' classes, one per set, and replace the one
representing set 2 before each individual test (see the sketch after
this list),
2) find some way to remove everything concerning the set-2 tables from
the mappings, metadata, database, etc. between tests,
3) use two databases, one for each set of tables, and forgo the foreign
key relationship between them (or maybe copy set 1 to the second
database).
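
For option 1, a rough sketch of what I mean; whether the mappings of
previously generated classes can actually be discarded this way is
exactly what I am unsure about:

    from sqlalchemy import MetaData, create_engine
    from sqlalchemy.ext.declarative import declarative_base

    engine = create_engine('sqlite://')

    Base1 = declarative_base(metadata=MetaData())  # set 1, kept for the whole run
    Base2 = declarative_base(metadata=MetaData())  # set 2, rebuilt per test

    def teardown_set2():
        global Base2
        # drop the generated tables and start over with a fresh metadata/base;
        # the mappings of the old generated classes would still exist, which
        # is where I get stuck
        Base2.metadata.drop_all(engine)
        Base2 = declarative_base(metadata=MetaData())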

Please advise which of these approaches are possible or most
straightforward, or whether another approach would be more appropriate.

Cheers, Lars







On Feb 26, 10:47 pm, Michael Bayer <mike...@zzzcomputing.com> wrote:
> On Feb 26, 2012, at 12:47 PM, lars van gemerden wrote:
>
> > I was wrong, the method emptied the database, but I was checking the
> > tables in the metadata.
>
> > This time I am also removing the tables from the metadata, but if i
> > generate the same tables in two separate test methods (with a call to
> > tearDown and setUp in between), I still get an error about a backref
> > name on a relationship already existing.
>
> OK I think you're mixing concepts up here, a backref is an ORM concept.  The 
> Table and Metadata objects are part of Core and know absolutely nothing about 
> the ORM or mappings.    Removing a Table from a particular MetaData has 
> almost no effect as all the ORM mappings still point to it.  In reality the 
> MetaData.remove() method is mostly useless, except that a create_all() will 
> no longer hit that Table, foreign key references will no longer find it, and 
> you can replace it with a new Table object of the same name, but again 
> nothing to do with the ORM and nothing to do with the state of that removed 
> Table, which still points to that MetaData and will otherwise function 
> normally.
>
> If you want to remove mappings, you can call clear_mappers().  The use case 
> for removing individual mappers is not supported as there is no support for 
> doing all the reverse bookkeeping of removing relationships(), backrefs, and 
> inheritance structures, and there's really no need for such a feature.
>
> Like MetaData.remove(), there's almost no real world use case for 
> clear_mappers() except that of the SQLAlchemy unit tests themselves, or tests 
> of other ORM-integration layers like Elixir, which are testing the ORM itself 
> with various kinds of mappings against the same set of classes.
>
> Unit tests in an outside world application would normally be against a schema 
> that's an integral part of the application, and doesn't change with regards 
> to classes.   There's virtually no reason in normal applications against a 
> fixed schema to tear down mappings and table metadata between tests.    
> SQLAlchemy docs stress the Declarative pattern very much these days as we're 
> really trying to get it across that the composition of class, table metadata, 
> and mapping is best regarded as an atomic structure - it exists only as that 
> composite, or not at all.   Breaking it apart has little use unless you're 
> testing the mechanics of the mapping itself.
>
> Throughout all of this, we are *not* talking about the tables and schema that 
> are in the actual database.   It is typical that unit tests do drop all those 
> tables in between test suites, and recreate them for another test suite.    
> Though I tend to favor not actually dropping / recreating and instead running 
> the tests within a transaction that's rolled back at the end as it's much 
> more efficient, especially on backends like Oracle, Postgresql, MSSQL where 
> creates/drops are more expensive.   Dropping and recreating the tables in the 
> database though is independent of the structure represented by 
> Metadata/Table, though, that structure lives on and can be reused.    
> Metadata/Table describes only the *structure* of a particular schema.   They 
> are not linked to the actual *presence* of those tables within a target 
> schema.
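
(Side note, mainly to check that I read the transaction-per-test
suggestion correctly. A minimal sketch of how I would set that up,
assuming the session under test is bound to a single connection;
'RollbackTest' is just a placeholder name:)

    import unittest

    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    engine = create_engine('sqlite://')

    class RollbackTest(unittest.TestCase):
        def setUp(self):
            # run each test inside an external transaction on one connection
            self.connection = engine.connect()
            self.trans = self.connection.begin()
            self.session = sessionmaker(bind=self.connection)()

        def tearDown(self):
            # roll everything back instead of dropping/recreating the tables
            self.session.close()
            self.trans.rollback()
            self.connection.close()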
