Re: [sqlalchemy] Low performance when reflecting tables via pyodbc+mssql
On Tuesday, February 12, 2013 9:13:48 PM UTC+9, betelgeuse wrote:

> I had a similar problem: an MS SQL database that another application had created, and I needed to select data from it. There were lots of tables, so I tried reflection, but it was slow, so I decided to use the SA declarative approach. Declaring all the tables again in Python was too much work, though, so I used sqlautocode to generate declarative table classes and use them in my models with some minor modifications. If the db structure does not change too often, this speeds things up. I've been doing it that way with Django.

I tried sqlautocode but got an "ImportError: cannot import name _deferred_relation" error. (I'm using SA 0.8.) Maybe something is broken, but I don't have much time to look into it :(

-- You received this message because you are subscribed to the Google Groups sqlalchemy group. To unsubscribe from this group and stop receiving emails from it, send an email to sqlalchemy+unsubscr...@googlegroups.com. To post to this group, send email to sqlalchemy@googlegroups.com. Visit this group at http://groups.google.com/group/sqlalchemy?hl=en. For more options, visit https://groups.google.com/groups/opt_out.
[sqlalchemy] Low performance when reflecting tables via pyodbc+mssql
For the following code:

    from sqlalchemy import create_engine, MetaData, Table

    dbengine = create_engine('mssql+pyodbc://MYDSN')
    dbmeta = MetaData()
    dbmeta.bind = dbengine

    def get_table(name):
        return Table(name, dbmeta, autoload=True, autoload_with=dbengine)

it takes 50 seconds or so per call to `get_table`. (In the original snippet I had written `DBTable`, which is undefined; it should be the imported `Table`.) Did I miss something? Where should I look? Thanks in advance.
Re: [sqlalchemy] Low performance when reflecting tables via pyodbc+mssql
On Tuesday, February 12, 2013 7:18:37 PM UTC+9, Simon King wrote:

> If you add echo='debug' to your create_engine call, SA will log all calls to the database and the rows returned, which might give you an idea of where all the time is being spent.

Thanks, Simon. I've looked through the debug log and found the reason. It turns out that the table has several foreign key constraints, and SA inspects all of the related tables, and then all the tables related to those, and so on. There were 23 tables involved, which explains the long execution time.

So is there anything I can do about this? I'm considering two possibilities:

1. Ignore the constraints to speed things up, or
2. Cache all the metadata to a disk file so there is no need to wait when restarting the program.

Either would be fine for me. Is it possible?
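[Editor's note] Both possibilities are workable. For option 1, newer SQLAlchemy releases (1.3+) accept a `resolve_fks=False` flag during `Table` reflection to skip loading related tables, though that is not available on 0.8. For option 2, reflected `MetaData` objects are picklable, so the expensive reflection only has to run once and the result can be cached to disk. A minimal sketch of the caching idea, using an in-memory SQLite database purely for illustration (with MSSQL you would pass your `mssql+pyodbc://MYDSN` URL instead, and `pickle.dump` the bytes to a real file):

```python
import pickle

from sqlalchemy import MetaData, create_engine, text

# Illustrative engine and table; substitute your real connection URL.
engine = create_engine('sqlite://')
with engine.begin() as conn:
    conn.execute(text('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)'))

meta = MetaData()
meta.reflect(bind=engine)   # reflect every table in a single pass

blob = pickle.dumps(meta)   # in practice: write this to a disk file once

# Later (e.g. after a program restart): restore without touching the DB.
meta2 = pickle.loads(blob)
print('users' in meta2.tables)
```

After unpickling, the restored `MetaData` is not bound to any engine, so you pass the engine explicitly when executing queries against its tables.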