[sqlalchemy] What is the rationale of having to manually set up a relationship between two tables?

2014-03-23 Thread Bao Niu
Suppose we have two tables in an existing database, "user" and "address". There is a one-to-many relationship between these two tables with a foreign key user.id == address.user_id. Now we *reflect* this schema directly from the database: from sqlalchemy.ext.declarative import declarative_base Ba…
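A minimal sketch of the setup being described, assuming the "user" and "address" tables already exist in the database; the engine URL and class names here are illustrative, not from the original post. Reflection recovers the tables and their foreign key, but the relationship() between the mapped classes still has to be declared by hand:

    from sqlalchemy import create_engine, MetaData, Table
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import relationship, Session

    engine = create_engine("sqlite:///existing.db")  # hypothetical database
    Base = declarative_base()

    class User(Base):
        # Reflect the existing "user" table instead of declaring its columns.
        __table__ = Table("user", Base.metadata,
                          autoload=True, autoload_with=engine)
        # The foreign key is picked up by reflection, but this relationship
        # is not generated automatically; it has to be set up manually.
        addresses = relationship("Address", backref="user")

    class Address(Base):
        __table__ = Table("address", Base.metadata,
                          autoload=True, autoload_with=engine)

    session = Session(engine)
    for user in session.query(User):
        print(user.id, [a.id for a in user.addresses])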

Re: [sqlalchemy] Bulk Inserts and Unique Constraints

2014-03-23 Thread James Meneghello
Oops, I should add - the reason I can't use an itertools counter to pre-assign IDs is that the table is potentially being written to by multiple scripts, which is why I have to commit the parts prior to the segments (since engine.execute can't return multiple insert_ids). On Monday, 24 March…
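To illustrate the limitation being referred to (using the 0.9-era engine.execute() API mentioned in this thread; the table and column names are made up): a single-row insert hands back the generated key, but an executemany-style insert of multiple rows does not, so the parent rows have to be committed first before the children can point at them.

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

    engine = create_engine("sqlite://")
    metadata = MetaData()
    parts = Table("parts", metadata,
                  Column("id", Integer, primary_key=True),
                  Column("name", String))
    metadata.create_all(engine)

    # Single-row insert: the generated primary key is available.
    result = engine.execute(parts.insert(), {"name": "one"})
    print(result.inserted_primary_key)

    # Multi-row (executemany) insert: no primary keys come back, which is
    # the "can't return multiple insert_ids" issue mentioned above.
    engine.execute(parts.insert(), [{"name": "two"}, {"name": "three"}])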

Re: [sqlalchemy] Bulk Inserts and Unique Constraints

2014-03-23 Thread James Meneghello
Thanks for the quick reply! This seems to work pretty well. I took out the batching (as it's already batched at a higher level) and modified it to suit the insertion of children as well (and reduced the unique constraint to a single field), and it appears to work. with db_session() as db: existing_…
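The pattern being described is, roughly, one SELECT over the unique key to find what already exists, then a bulk executemany INSERT of only the new rows. A self-contained sketch; db_session(), the Part model and its "name" column are hypothetical stand-ins for the real code:

    from contextlib import contextmanager
    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    engine = create_engine("sqlite://")
    Base = declarative_base()
    SessionFactory = sessionmaker(bind=engine)

    class Part(Base):
        __tablename__ = "parts"
        id = Column(Integer, primary_key=True)
        name = Column(String, unique=True)

    Base.metadata.create_all(engine)

    @contextmanager
    def db_session():
        # Commit on success, roll back on error.
        session = SessionFactory()
        try:
            yield session
            session.commit()
        except Exception:
            session.rollback()
            raise

    with db_session() as db:
        incoming = [{"name": "a"}, {"name": "b"}, {"name": "c"}]
        names = [row["name"] for row in incoming]
        # One SELECT for all candidate keys...
        existing = {name for (name,) in
                    db.query(Part.name).filter(Part.name.in_(names))}
        # ...then a bulk INSERT of only the rows not already present.
        new_rows = [row for row in incoming if row["name"] not in existing]
        if new_rows:
            db.execute(Part.__table__.insert(), new_rows)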

[sqlalchemy] Re: Bulk Inserts and Unique Constraints

2014-03-23 Thread Cosmia Luna
Anyway, I don't think you should issue so many SELECTs; that's probably where performance suffers most. An INSERT statement will return the inserted ids for you; maybe you want to read this: http://docs.sqlalchemy.org/en/latest/core/tutorial.html#executing Skip duplicates... well, if you turn off the c…
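For what that looks like in practice, a short sketch of getting the id back from the INSERT itself rather than SELECTing afterwards, along the lines of the core tutorial linked above. RETURNING needs a backend that supports it (e.g. PostgreSQL); the DSN and table here are illustrative:

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

    engine = create_engine("postgresql://user:pass@localhost/test")  # hypothetical DSN
    metadata = MetaData()
    items = Table("items", metadata,
                  Column("id", Integer, primary_key=True),
                  Column("name", String))
    metadata.create_all(engine)

    # Ask the INSERT to hand back the generated id directly.
    stmt = items.insert().returning(items.c.id)
    with engine.connect() as conn:
        result = conn.execute(stmt, {"name": "widget"})
        print(result.fetchone()[0])  # the new id, with no extra SELECT

On backends without RETURNING, result.inserted_primary_key on a single-row insert gives the same information.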

Re: [sqlalchemy] Bulk Inserts and Unique Constraints

2014-03-23 Thread Michael Bayer
On Mar 23, 2014, at 11:33 AM, James Meneghello wrote: > I'm having a few issues with unique constraints and bulk inserts. The > software I'm writing takes data from an external source (a lot of it, > anywhere from 1,000 rows per minute to 100-200k+), crunches it down into its > hierarchy and…

[sqlalchemy] Bulk Inserts and Unique Constraints

2014-03-23 Thread James Meneghello
I'm having a few issues with unique constraints and bulk inserts. The software I'm writing takes data from an external source (a lot of it, anywhere from 1,000 rows per minute to 100-200k+), crunches it down into its hierarchy and saves it to the DB, to be aggregated in the background. The func…