On Mon, 2004-01-05 at 09:50, Jaro wrote:
> This project will require handling at a minimum 40,000 records in one
> table alone, with realistic growth potential for up to 100,000 records.
> There will be several other tables that won't be quite so large, but
> will also be in the tens of thousands. A typical makeup of the table
> will be 1-2 integer fields, 6-10 varchar fields (varying in length
> from 8 to 50 chars), and perhaps 1-2 text fields. In my past projects
> I've used MySQL almost exclusively and it's worked fine. However, I'm
> not sure if it can handle datasets this large. Has anyone had
> experience with using MySQL with large datasets? How does it perform?
> My next question is, if MySQL is not robust enough for it, would
> PostgreSQL be robust enough? I'm trying to avoid going to Oracle,
> primarily for cost reasons.

Don't blow your money on Oracle. I chuckled a bit at what you describe
as a large dataset. I follow the postgres performance mailing list and
they routinely have 100GB+ databases. That's hundreds of /millions/ of
rows. I've got one myself with 426,443 rows growing at around 20,000 a
day and postgres is doing just fine. The key really is having an
optimized schema and setting up proper indexes.
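To make the indexing point concrete, here's a small sketch. It uses an
in-memory SQLite database purely so it runs anywhere; the table and
index names are made up, modeled on the shape Jaro describes, and the
same idea applies directly to Postgres or MySQL:

```python
import sqlite3

# In-memory SQLite stands in for Postgres/MySQL here just to
# illustrate the principle; the schema is hypothetical, modeled
# on the table shape described above (integers, varchars, text).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE records (
        id          INTEGER PRIMARY KEY,
        category_id INTEGER,
        name        VARCHAR(50),
        notes       TEXT
    )
""")

# Index the columns your WHERE clauses actually hit; without an
# index, every lookup is a full scan of the table.
cur.execute("CREATE INDEX records_category_idx ON records (category_id)")

cur.executemany(
    "INSERT INTO records (id, category_id, name, notes) VALUES (?, ?, ?, ?)",
    ((i, i % 100, f"name-{i}", "some text") for i in range(100_000)),
)
conn.commit()

# Ask the planner how it will run the query; the plan should
# mention records_category_idx rather than a full table scan.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM records WHERE category_id = 42"
).fetchall()
print(plan)
```

The habit of checking the query plan (EXPLAIN in Postgres and MySQL)
is what tells you whether your indexes are actually being used.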

I can't speak directly to MySQL's performance, since I prefer Postgres,
but I have loaded it with test sets of more than 100,000 rows before
and haven't had any problems.
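If you want to convince yourself, generating a six-figure test set
takes a few lines. This sketch (names hypothetical) again uses SQLite
so it is self-contained, but the point carries over to either database:
with an index (here the primary key), a single-row lookup stays fast
regardless of how many rows you load.

```python
import sqlite3
import time

# A throwaway 150,000-row test set; table and column names are made up.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val VARCHAR(20))")
cur.executemany(
    "INSERT INTO t (id, val) VALUES (?, ?)",
    ((i, f"val-{i}") for i in range(150_000)),
)
conn.commit()

# A primary-key lookup against 150,000 rows.
start = time.perf_counter()
row = cur.execute("SELECT val FROM t WHERE id = 123456").fetchone()
elapsed = time.perf_counter() - start
print(row, f"{elapsed * 1000:.3f} ms")
```

In Postgres you could generate the rows server-side instead and then
time queries with \timing in psql.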

Corey



____________________
BYU Unix Users Group 
http://uug.byu.edu/ 
___________________________________________________________________
List Info: http://uug.byu.edu/cgi-bin/mailman/listinfo/uug-list