I would like to ask the more experienced Postgres users a couple of questions about a database I manage that holds a lot of data. By a lot of data I mean something like 15,000,000 rows in a single table. I will try to describe the tables and what I need to do with them :)
There is a table that holds product data:

    create table product (
        product_id   varchar(8),
        product_name text
    );


and

the product actions table:

    create table product_actions (
        product_id varchar(8),
        flow       char(1),
        who        int,
        "where"    int,  -- "where" is a reserved word, so it must be double-quoted
        value      float
    );

I will have to run SQL queries of the form

    select value
    from product_actions
    where who = 'someone' and "where" = 'somewhere';

and sometimes also do some calculations on the results. I have already created some indexes on these tables, as well as a view that joins the two of them, but I would like to ask whether anyone here works with such a big database and how I can speed things up as much as possible. A separate product_actions table exists for each year from 1988 to 2003, so this really means a lot of data...
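To make the question concrete, here is roughly what I have in mind (the index and view names below are made up for illustration, not my exact definitions):

```sql
-- A multicolumn index matching the query's filter columns, so Postgres
-- can satisfy both conditions with a single index scan:
create index product_actions_who_where_idx
    on product_actions (who, "where");

-- A view joining product names onto the actions, roughly as described:
create view product_actions_named as
select p.product_name, a.flow, a.who, a."where", a.value
from product_actions a
join product p on p.product_id = a.product_id;

-- EXPLAIN shows whether the planner actually uses the index:
explain select value
from product_actions
where who = 42 and "where" = 7;
```

Is a multicolumn index like this the right approach here, or would separate single-column indexes be better for this kind of query?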

Thanks in Advance
