I was hoping that someone could help me with optimizing this query.  
Basically I need to return totals from a large database (more than 
1 million records in the table).  There need to be two different totals, 
one by "a" and one by "b", where "a" and "b" could each have up to 100 
rows.  So that is totals for up to 200 distinct values.

The problem comes from the multiple fields in the WHERE clause.  
There could be up to 9 different WHERE parameters. 

Instead of creating multiple queries and a ton of indexes, I tried this 
approach. 

Basically I create a temp table to hold my totals, and use a trigger to 
keep the totals updated. 

This works fantastically, by the way!  I was just wondering whether 
there is a better way to speed up the trigger. 
With one statement in the trigger, the query runs in about 30 seconds 
with 1.3 million records. 
With two statements, the time increases to 50 seconds for 1.3 million 
records. 

I really need to have 4 updates in the trigger and was hoping to keep 
the total time under 1 minute.

Thanks for your help.

Query
--------------------------------------
replace into totals
select 0, 0, a, b, c
from table1;

Trigger
-----------------------------------------
CREATE TRIGGER sum_totals AFTER INSERT ON totals
BEGIN
  update totals set cnt = cnt + 1, a = a + NEW.a where id = NEW.b;
  update totals set cnt = cnt + 1, a = a + NEW.a where id = NEW.c;
END;

Totals table
---------------------------------------------
create temp table totals(id integer primary key, cnt integer, a float, b 
integer, c integer);
Begin Transaction;
insert into totals values (0, 0, 0.00, 0, 0);
insert into totals values (1, 0, 0.00, 0, 0);
insert into totals values (2, 0, 0.00, 0, 0);
etc... total of 500 rows
Commit;
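To make the scheme above concrete, here is a self-contained sketch using 
Python's standard sqlite3 module. The table and column names follow the 
post; the three sample rows in table1 and the three-row totals table are 
made up for illustration, and the trigger is created after the seed 
inserts so that seeding does not fire it.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Source table holding the raw records (schema assumed from the query).
cur.execute("create table table1 (a float, b integer, c integer)")
cur.executemany("insert into table1 values (?, ?, ?)",
                [(1.5, 1, 2), (2.5, 1, 2), (4.0, 2, 2)])

# Temp totals table, seeded with zeroed rows (3 here instead of 500).
cur.execute("create temp table totals(id integer primary key, "
            "cnt integer, a float, b integer, c integer)")
cur.executemany("insert into totals values (?, 0, 0.0, 0, 0)",
                [(0,), (1,), (2,)])

# The trigger from the post: every insert into totals bumps the
# count and running sum on the rows keyed by NEW.b and NEW.c.
cur.execute("""
CREATE TRIGGER sum_totals AFTER INSERT ON totals
BEGIN
  update totals set cnt = cnt + 1, a = a + NEW.a where id = NEW.b;
  update totals set cnt = cnt + 1, a = a + NEW.a where id = NEW.c;
END""")

# The query from the post: each REPLACE rewrites row 0 and fires
# the trigger once per record in table1.
cur.execute("replace into totals select 0, 0, a, b, c from table1")

totals_by_id = {row[0]: row for row in
                cur.execute("select id, cnt, a from totals")}
for row in sorted(totals_by_id.values()):
    print(row)
con.close()
```

Note that with this scheme row 0 ends up holding whatever raw record was 
replaced last, since every REPLACE targets id 0; the accumulated totals 
live in the other rows.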



_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
