On Wednesday, 9 July 2008, Sim Zacks wrote:
> The original query:
> SELECT m.message_idnr,k.messageblk FROM dbmail_messageblks k JOIN
> dbmail_physmessage p ON k.physmessage_id = p.id JOIN dbmail_messages
> m ON p.id = m.physmessage_id WHERE mailbox_idnr = 8 AND status IN
> (0,1 ) AND k.is_header = '0' GROUP BY m.message_idnr,k.messageblk
> HAVING ENCODE(k.messageblk::bytea,'escape') LIKE '%sim%'

EXPLAIN ... gives:
HashAggregate  (cost=266.97..266.98 rows=1 width=797)

> My Query
> SELECT m.message_idnr,k.messageblk FROM dbmail_messageblks k JOIN
> dbmail_physmessage p ON k.physmessage_id = p.id JOIN dbmail_messages
> m ON p.id = m.physmessage_id WHERE mailbox_idnr = 8 AND status IN
> (0,1 ) AND k.is_header = '0' and
> ENCODE(k.messageblk::bytea,'escape') LIKE '%sim%'

EXPLAIN ... gives:
Nested Loop  (cost=2.11..266.96 rows=1 width=797)

So the second form can be much faster, and in the worst case it only takes 
as long as the first form. A pure improvement.
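The reason the two forms return the same rows is that the LIKE predicate only 
references a grouped column, so filtering before grouping (WHERE) and after 
grouping (HAVING) are equivalent - but WHERE lets the planner prune rows 
early. A minimal sketch with SQLite and a toy schema (hypothetical table and 
data, not the real dbmail tables):

```python
import sqlite3

# Toy stand-in for dbmail_messageblks (assumed columns for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE messageblks (physmessage_id INT, messageblk TEXT, is_header INT);
    INSERT INTO messageblks VALUES (1, 'hello sim', 0), (2, 'other text', 0);
""")

# Original style: filter after grouping, via HAVING.
having_form = conn.execute("""
    SELECT physmessage_id, messageblk FROM messageblks
    WHERE is_header = 0
    GROUP BY physmessage_id, messageblk
    HAVING messageblk LIKE '%sim%'
""").fetchall()

# Rewritten style: filter before grouping, via WHERE.
where_form = conn.execute("""
    SELECT physmessage_id, messageblk FROM messageblks
    WHERE is_header = 0 AND messageblk LIKE '%sim%'
    GROUP BY physmessage_id, messageblk
""").fetchall()

# Same result set either way; only the plan differs.
assert having_form == where_form == [(1, 'hello sim')]
```

Note the equivalence only holds because the predicate contains no aggregate 
function; a condition on e.g. COUNT(*) must stay in HAVING.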

It could be that we can find other optimizations as well. Paul, is there a 
simple way to log all DB queries to a separate file? That way, we could 
log a typical day and then see which queries run most often and where we 
should start optimizing. I know you, Paul, have other things to do - so a 
query log we can share here would be nice, and we could let the community 
optimize it. I'm sure there are some DB gurus on this list :-)
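If the backend is PostgreSQL, the server itself can already do this: a 
sketch of the relevant postgresql.conf settings (parameter names are from 
the stock PostgreSQL documentation; the directory and filename pattern are 
just examples):

```
# postgresql.conf - log every statement to its own file
log_statement = 'all'
log_destination = 'stderr'
logging_collector = on                     # named redirect_stderr before 8.3
log_directory = 'pg_log'
log_filename = 'postgresql-%Y-%m-%d.log'
```

After a reload, every query lands in pg_log/ and the day's file could be 
shared for analysis. Setting log_min_duration_statement instead would limit 
the log to slow queries only.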

Regards, zmi
-- 
// Michael Monnerie, Ing.BSc    -----      http://it-management.at
// Tel: 0660 / 415 65 31                      .network.your.ideas.
// PGP Key:         "curl -s http://zmi.at/zmi.asc | gpg --import"
// Fingerprint: AC19 F9D5 36ED CD8A EF38  500E CE14 91F7 1C12 09B4
// Keyserver: www.keyserver.net                   Key-ID: 1C1209B4


_______________________________________________
DBmail mailing list
DBmail@dbmail.org
https://mailman.fastxs.nl/mailman/listinfo/dbmail
