VACUUM ANALYZE, rather
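
Something along these lines (a sketch only, using the test2 table from the mail below; plain VACUUM should also be able to truncate the now-empty pages at the end of the table, and ANALYZE refreshes the statistics behind the rows=11567770 estimate in your plan):

erp_test=> VACUUM ANALYZE test2;
VACUUM
erp_test=> EXPLAIN ANALYZE SELECT count(*) FROM test2;

After that the seq scan should only have to touch the few pages that still hold live rows, and the count should come back in milliseconds rather than seconds.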

On Thu, Aug 20, 2009 at 2:21 PM, bricklen <brick...@gmail.com> wrote:

> Did you vacuum your table after the delete?
>
>
> On Thu, Aug 20, 2009 at 2:36 AM, Kumar Anand 
> <kumar.an...@merceworld.com> wrote:
>
>> Dear all,
>>
>> I have been facing this problem for the last few days.
>>
>> Here is the test2 table, which has only one column, id:
>> erp_test=> \d test2
>>    Table "public.test2"
>>  Column |  Type   | Modifiers
>> --------+---------+-----------
>>  id     | integer |
>>
>> I insert 10,000,000 rows into this table:
>> erp_test=> INSERT INTO test2 VALUES (generate_series(1,10000000));
>> INSERT 0 10000000
>>
>> Then I delete all the rows:
>> erp_test=> delete from test2;
>> DELETE 10000000
>>
>> and I insert only 10 rows:
>> erp_test=> INSERT INTO test2 VALUES (generate_series(1,10));
>> INSERT 0 10
>>
>> Now counting the rows in this table takes a long time (about 2-3 seconds),
>> even though it contains only 10 rows:
>> erp_test=> SELECT count(*) from test2;
>> count
>> -------
>>   10
>> (1 row)
>>
>> This is the EXPLAIN ANALYZE output for the same query:
>>
>> erp_test=> EXPLAIN ANALYZE SELECT count(*) from test2;
>>                                                       QUERY PLAN
>> ------------------------------------------------------------------------------------------------------------------------
>>  Aggregate  (cost=198652.13..198652.14 rows=1 width=0) (actual time=2123.471..2123.472 rows=1 loops=1)
>>    ->  Seq Scan on test2  (cost=0.00..169732.70 rows=11567770 width=0) (actual time=2123.454..2123.454 rows=10 loops=1)
>>  Total runtime: 2123.609 ms
>> (3 rows)
>>
>>
>> Can anyone suggest how I can speed up this SELECT count(*) query?
>>
>> --
>> Thanks & Regards
>> Kumar Anand
>>
>
>
