Hi all,

I'm looking for tips and ideas on performance tuning some specific queries.
These are generally large tables on a highly active OLTP system
(100,000 - 200,000+ queries per day).

First off, any thoughts on tuning inserts into large tables? I have a large
table with an insert like this:

insert into public.bigtab1 (text_col1, text_col2, id) values ...

                QUERY PLAN                
------------------------------------------
 Result  (cost=0.00..0.01 rows=1 width=0)
(1 row)

The query cost is low, but this is one of the slowest statements according to pgfouine.
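For what it's worth, the plan cost won't show where the time goes on an INSERT; the overhead is usually per-statement (round trips, index maintenance, commit/fsync). Below is a minimal sketch of batching rows into one multi-row statement inside a single transaction; the values are made up, and COPY is usually faster still for true bulk loads:

begin;

-- one multi-row INSERT instead of one statement per row
insert into public.bigtab1 (text_col1, text_col2, id)
values ('a1', 'b1', 1),
       ('a2', 'b2', 2),
       ('a3', 'b3', 3);

commit;

It may also be worth checking for triggers and counting the indexes on bigtab1, since each index adds work to every insert.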

Next we have a select count(*) that is also one of the top offenders:

select count(*) from public.tab3  where user_id=31 
and state='A' 
and amount>0;

                                            QUERY PLAN
-----------------------------------------------------------------------------------------------------
 Aggregate  (cost=3836.53..3836.54 rows=1 width=0)
   ->  Index Scan using order_user_indx on tab3  (cost=0.00..3834.29 rows=897 width=0)
         Index Cond: (idx_user_id = 31406948::numeric)
         Filter: ((state = 'A'::bpchar) AND (amount > 0::numeric))
(4 rows)

We have an index on user_id, but not on state or amount. Should we add an
index on amount, or perhaps a composite index covering all three columns?
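If most of the 897 rows the index scan returns are thrown away by the filter on state and amount, a composite or partial index might help more than an index on amount alone. A sketch follows; the index names are made up, and the types are taken from the plan (bpchar for state, numeric for amount):

-- composite index matching all three conditions
create index tab3_user_state_amount_indx
    on public.tab3 (user_id, state, amount);

-- or, if state = 'A' rows are the common target, a partial index
create index tab3_user_active_indx
    on public.tab3 (user_id)
    where state = 'A' and amount > 0;

Running EXPLAIN ANALYZE (rather than plain EXPLAIN) would show how many rows the filter actually removes, which should settle which index is worth having.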

Thoughts?