>
>
>    1. Your "adviewer/viewads" makes 10 calls to the database. Try to 
>    optimize this by writing fewer queries, creating a view, and/or only 
>    selecting the fields that you need. Also make sure you get the criteria 
>    right so that you (ideally) don't have any extra, unneeded rows.
>
> If it's feasible, also consider caching the queries for some amount of 
time (assuming the results don't change too frequently). 

>
>    2. When you absolutely have to load a few thousand rows (or more) in a 
>    query (you should avoid this whenever possible), try using 
>    "db.executesql(query)" to manually execute a hand-crafted SQL query. This 
>    will generally be faster than using the DAL directly.
>
Note: the difference in speed is due to the fact that the DAL won't be 
converting the result set to a Rows object -- so you won't have the 
convenience of dealing with DAL Rows and Row objects. If you do 
db.executesql(query, as_dict=True), it will convert the results to a list 
of dictionaries (which is still faster than converting to a Rows object). 

>
>    3. Another point about executesql: the obvious issue is reduced 
>    portability, but if you are only planning on using PostgreSQL, then you 
>    can hand-craft a SQL query and profile it against PostgreSQL for maximum 
>    performance. Once you've got it returning only the data you want, you 
>    can copy and paste that query into executesql.
>
> If you want to use db.executesql() but remain portable, you can still have 
the DAL generate the SQL for you by using the ._select() method:

db.executesql(db(query)._select(...))

Obviously in that case you don't get to hand-optimize the SQL, but you 
still get the speed advantage of not converting the results to a Rows 
object (which is only significant for large result sets).

Anthony
