First, have you set migrate=False for all tables, and have you bytecode-compiled 
the app? Both should speed things up.
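
For example, here is a minimal sketch of the migrate settings in a model file 
(the connection string and table name are placeholders, not from your app; DAL 
and Field are already in scope when a web2py model runs):

db = DAL('mysql://user:pass@localhost/legacy',  # placeholder URI
         migrate_enabled=False)                 # turn off migrations for all tables at once

# ...or leave migrations enabled globally and disable them per table:
db.define_table('person',                       # placeholder table
    Field('name'),
    migrate=False)                              # skip the migration check for this table

Bytecode compiling can be done from the admin interface for the app.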
 
Also, do you need all 70 tables on every request, or can the definitions be 
split up based on the particular controller/function being called? If so, you 
may speed things up with conditional models. Any files in /models/controller1 
will only be executed on requests for controller1, and any files in 
/models/controller1/function1 will only be executed on requests for 
controller1/function1. (We're working on more general functionality for 
conditional models that won't be based solely on the controller and 
function.)
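
To illustrate, a rough sketch of the layout (the "orders" controller, "index" 
function, and table names below are made up for the example):

#   models/db.py                     # runs on every request; keep only shared tables here
#   models/orders/orders_tables.py   # runs only for requests to the orders controller
#   models/orders/index/extra.py     # runs only for requests to orders/index

# models/orders/orders_tables.py -- assumes db was created in models/db.py,
# which runs before the conditional model files
db.define_table('order_line',        # hypothetical table
    Field('order_id', 'integer'),
    Field('sku'),
    migrate=False)

That way each request only pays the define_table() cost for the tables it 
actually needs.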
 
Anthony

On Monday, August 15, 2011 6:30:06 PM UTC-4, Kevin Ivarsen wrote:

> Hello, 
>
> I'm using web2py with a relatively large legacy database. We have 
> about 70 tables, and models/db.py is close to 1000 lines long. 
>
> The problem I'm facing is that all of these Field() constructors and 
> define_table() calls take about 150-200ms to execute, and this 
> overhead occurs each time any request is made to web2py. Compare this 
> to a minimal web2py app, which might have a (reasonable) 10-30ms of 
> overhead before response data starts flowing to the client. 
>
> I was hoping that a simple solution would be to declare the DAL 
> structure once in an importable module, which would be run once on 
> web2py startup rather than at each request. I could then deepcopy it 
> in my model file (which I think would be faster than all the calls to 
> define_table(), Field(), etc.), and then establish a connection to the 
> database from this copied DAL. 
>
> Unfortunately, there are a few problems with this: 
>   1. If a DAL instance is to be connected to the database, it must 
> happen in the constructor. It doesn't seem that you can do "db = 
> DAL(None)" and then establish a connection after the fact. Also, it 
> looks like some db-specific behavior is set up in the DAL constructor 
> based on the connection URL - this wouldn't happen in the case of 
> DAL(None). 
>
>   2. Table and Field instances have a reference to db, so it seems 
> that define_table() needs to be called *after* the DAL connection has 
> been established in order to set up these references. 
>
> I suppose it would be possible to deepcopy a DAL(None) instance that 
> has been established with Tables and Fields, and monkey-patch the db 
> references throughout the DAL structure, but chances seem good that 
> this will create subtle bugs down the road if the internal DAL 
> implementation ever changes. 
>
> Can anyone suggest a better way to speed up DAL creation for large 
> schemas? 
>
> Thanks, 
> Kevin 
>
