Hi,

> I use a MySQL database and my AccountingTable grows by more than 40 million
> records per month. Does anyone here have a purge policy? I have a CGI page
> that extracts access records for my users and it is very slow because the
> database is getting too big. Does anyone have a recommendation for getting
> an access-extract page working well with this much data?

firstly, use InnoDB rather than MyISAM (InnoDB has been in MySQL for ages
now... every default install should have InnoDB support, and no tool should
want to slap MyISAM tables into the DB.. it should be InnoDB by default)
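
as a minimal sketch of the conversion (assuming the table really is called
AccountingTable, as in the post above) it is a single statement - though on
40+ million rows a month it rebuilds the whole table and locks it while it
does, so schedule it for a quiet window:

    -- convert the storage engine in place
    ALTER TABLE AccountingTable ENGINE=InnoDB;

    -- confirm the engine afterwards
    SHOW TABLE STATUS LIKE 'AccountingTable';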

secondly, edit the my.cnf to fully utilise your host... there are plenty of
docs for each InnoDB option... but, as with MyISAM, there are also quite a
few tools that will give you a fairly good start down the path eg
http://mysqltuner.com/
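
by way of example, the InnoDB settings that usually matter most look roughly
like the snippet below - the values are rough starting points I'm assuming
for a dedicated DB host, not tuned numbers, so size them to your own RAM:

    [mysqld]
    # biggest single win: keep the working set in memory
    # (often 50-70% of RAM on a dedicated DB host)
    innodb_buffer_pool_size = 2G

    # larger redo logs smooth out heavy accounting insert loads
    innodb_log_file_size = 256M

    # flush the log once per second instead of per commit; trades up to
    # ~1s of durability on a crash for much faster inserts
    innodb_flush_log_at_trx_commit = 2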

thirdly, look at what your tool is doing (in this case RADIATOR) with the DB
to find out if there are any local query bottlenecks eg use the EXPLAIN
command to find out what the queries are doing and where it cannot find quick
answers. then look at adding the required INDEXes to the tables
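
as an illustration (the query and the Username/AcctStartTime columns are my
assumption - run EXPLAIN on the actual slow queries behind your CGI extract):

    -- ask MySQL how it plans to execute the query
    EXPLAIN SELECT * FROM AccountingTable
     WHERE Username = 'fred'
       AND AcctStartTime > '2012-01-01';

    -- "type: ALL" in the output means a full scan of every row; an index
    -- on the WHERE columns turns that into a quick lookup
    ALTER TABLE AccountingTable
      ADD INDEX idx_user_start (Username, AcctStartTime);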

finally, move from MySQL to PostgreSQL - psql doesn't have so many nasty
locking events on each row/column, because its MVCC design lets readers and
writers work on the same table without blocking each other - MySQL will hit
limits whenever an update/insert is occurring (from experience, the default
install speed of psql is similar to that of MySQL after you've spent some
time optimising the MySQL environment! - and THEN you can tweak psql even
further)
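
to make the locking point concrete, here's a sketch of two concurrent
PostgreSQL sessions (table and column names are hypothetical):

    -- session 1: a long-running report over the accounting data
    BEGIN;
    SELECT count(*) FROM accountingtable;  -- reads a consistent snapshot

    -- session 2, started while session 1 is still running: under MVCC the
    -- insert is not blocked by the reader, so accounting writes keep
    -- flowing while reports run
    INSERT INTO accountingtable (username, acctstarttime)
    VALUES ('fred', now());

    -- session 1 can now finish and release its snapshot
    COMMIT;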

alan
_______________________________________________
radiator mailing list
radiator@open.com.au
http://www.open.com.au/mailman/listinfo/radiator
