On Mon, Nov 28, 2011 at 9:19 PM, Phil Dobbin <phildob...@gmail.com> wrote:
> On 29/11/11 02:08, "Jason Pruim" <pru...@gmail.com> wrote:
>
>>> PostgreSQL?
>>>
>>> ;-)...
>>
>> In all seriousness... would it help or change it in any way? :)
>>
>> I am free to use what I want (I believe) on this project...
>
> It's well worth looking into. Postgres can handle far bigger databases much
> more quickly than MySQL, but the downside is that it's a very steep learning
> curve coming from MySQL.
>
> It's relatively easy to install, and there are of course drivers for PHP,
> but in my experience it'll take up a lot of your time to learn it to the
> point of being confident with it.
>
> Good luck,
>
> Cheers,
>
> Phil...
> --
> Nothing to see here... move along, move along
Jason,

Assuming the data is indexed properly, have you looked into the MySQL settings to make sure you have the ones suited to large or XL datasets? There are a number of settings for buffers and sort spaces that can tune the database for performance:

http://www.mysqlperformanceblog.com/2006/06/09/why-mysql-could-be-slow-with-large-tables/

Also, what kind of hardware are you using? Does the DB server have oodles (yeah, that's the techie term) of RAM?

Have you looked into creating views for each state? If the DB has a fairly static dataset (mostly inserts, not much updating), then you can create those views so that you're doing a single select against a pre-filtered dataset. More tuning can be done by sharding the data across different disk drives to improve I/O.

A great book on MySQL performance is High Performance MySQL:

http://www.amazon.com/dp/0596101716?tag=xaprb-20

which is chock-full of great options and info about gaining performance.

--
Bastien

Cat, the other other white meat
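P.S. If it helps, here's a rough sketch of what the index and buffer checks might look like. The table and column names (listings, state) are just placeholders for whatever you actually have:

    -- See whether the state filter uses an index or a full table scan
    EXPLAIN SELECT * FROM listings WHERE state = 'MI';

    -- Add an index on the column you filter by, if one isn't there yet
    CREATE INDEX idx_listings_state ON listings (state);

    -- Look at the buffer/sort settings the article above talks about
    SHOW VARIABLES LIKE 'key_buffer_size';
    SHOW VARIABLES LIKE 'sort_buffer_size';
    SHOW VARIABLES LIKE 'innodb_buffer_pool_size';

Permanent changes to those variables belong in my.cnf rather than SET GLOBAL, so they survive a server restart.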
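And a sketch of the per-state view idea, using the same placeholder names. One caveat: MySQL views aren't materialized, so the index on the state column still does the real work; the view mainly saves you from repeating the WHERE clause in every query:

    -- One view per state, queried like a table
    CREATE VIEW listings_mi AS
        SELECT * FROM listings WHERE state = 'MI';

    CREATE VIEW listings_oh AS
        SELECT * FROM listings WHERE state = 'OH';

    -- The per-state pages then select from the matching view
    SELECT * FROM listings_mi ORDER BY last_name;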