This is cool: cache.ram.clear(key)

I hadn't thought about that. I figure you trigger this in the
function that creates new records for a given cached select... A good
trade-off between caching and an up-to-date select.

Thanks for this one, Ross.

Richard

On Mon, May 14, 2012 at 12:02 PM, Ross Peoples <ross.peop...@gmail.com> wrote:

>
>
> On Monday, May 14, 2012 11:50:06 AM UTC-4, Anthony wrote:
>>
>>
>>>    1. Your "adviewer/viewads" makes 10 calls to the database. Try to
>>>    optimize this by writing fewer queries, creating a view, and/or
>>>    selecting only the fields that you need. Also make sure you get the
>>>    criteria right so that you (ideally) don't have any extra, unneeded
>>>    rows.
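>>>
>>>    For instance, a minimal sketch of limiting the selected fields (the
>>>    "ad" table and its fields are placeholders, not from this thread,
>>>    assuming web2py's usual controller environment):
>>>
>>>        # fetch only the columns the page actually renders
>>>        rows = db(db.ad.active == True).select(db.ad.id, db.ad.title)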
>>>
>> If it's feasible, also consider caching the queries for some amount of
>> time (assuming the results don't change too frequently).
>>
>
> This is a good point. For things that may not change a whole lot, I
> usually set time_expires=3600 or even None. Give the key a well-defined
> name, and then when something changes, run cache.ram.clear(key) so that
> the next query will get the changes.
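>
> A minimal sketch of that pattern (the "ad" table, the 'active_ads' key,
> and the action names are placeholders, assuming web2py's usual
> controller environment):
>
>     # cache the select under a well-defined key for an hour
>     def viewads():
>         ads = cache.ram('active_ads',
>                         lambda: db(db.ad.active == True).select(),
>                         time_expires=3600)
>         return dict(ads=ads)
>
>     # in the action that creates new records, drop the cached copy
>     def create_ad():
>         form = SQLFORM(db.ad)
>         if form.process().accepted:
>             cache.ram.clear('active_ads')  # next select repopulates it
>         return dict(form=form)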
>
>>
>>>    1. When you absolutely have to load a few thousand rows (or more) in
>>>    a query (you should avoid this whenever possible), then try using
>>>    "db.executesql(query)" to manually execute a hand-crafted SQL query. This
>>>    will always be faster than using the DAL directly.
>>>
>> Note, the difference in speed is due to the fact that the DAL won't be
>> converting the result set to a Rows object -- so you won't have the
>> convenience of dealing with DAL Rows and Row objects. If you do
>> db.executesql(query, as_dict=True), it will convert to a list of
>> dictionaries (which is still faster than converting to a Rows object).
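>>
>> For example (the table and fields are illustrative, not from the
>> original post):
>>
>>     raw = db.executesql('SELECT id, title FROM ad;')    # list of tuples
>>     dicts = db.executesql('SELECT id, title FROM ad;', as_dict=True)
>>     # dicts -> [{'id': 1, 'title': ...}, ...]  -- no Rows object built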
>>
>>>
>>>    1. Another point about executesql: The obvious issue is reduced
>>>    portability, but if you are only planning on using PostgreSQL, then
>>>    you can hand-craft a SQL query and profile it against PostgreSQL for
>>>    maximum performance. Once you've got it returning only the data you
>>>    want, you can copy and paste that query into executesql.
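>>>
>>>    A sketch of that profiling step, with a placeholder query
>>>    (PostgreSQL's EXPLAIN ANALYZE should also work via executesql,
>>>    since it returns a result set like any other query):
>>>
>>>        # ask PostgreSQL for the plan and timings of the candidate query
>>>        plan = db.executesql("EXPLAIN ANALYZE SELECT id, title FROM ad WHERE active;")
>>>        for line in plan:
>>>            print(line[0])  # one line of the query plan per row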
>>>
>> If you want to use db.executesql() but remain portable, you can still
>> have the DAL generate the SQL for you by using the ._select() method:
>>
>> db.executesql(db(query)._select(...))
>>
>> Obviously in that case you don't get to hand-optimize the SQL, but you
>> still get the speed advantage of not converting the results to a Rows
>> object (which is only significant for large result sets).
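>>
>> A quick usage sketch of that (field and table names are placeholders):
>>
>>     query = db.ad.active == True
>>     sql = db(query)._select(db.ad.id, db.ad.title)  # just builds the SQL string
>>     results = db.executesql(sql, as_dict=True)      # raw execution, no Rows object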
>>
>
> While true, since he is going for performance in a high-traffic
> environment that requires low latency, such as a site that serves ads,
> he would definitely want to hand-craft the SQL for complex and large
> queries that slow things down. I wouldn't recommend doing it for every
> query, just the slow ones.
>
>
>>
>> Anthony
>>
>>
