On Mon, 2004-01-05 at 10:50, Jaro wrote:
> Hello,
>
> I've recently re-joined the list after a brief hiatus. I'm starting a new
> project that needs to integrate a database with PHP to create a web
> application. However, in the past my datasets were at most in the
> thousands of records, not tens of thousands. This project will require
> handling at minimum 40,000 records in one table alone, with realistic
> growth potential for up to 100,000 records. There will be several other
> tables that won't be quite so large, but will also run into the tens of
> thousands. A typical table will have 1-2 integer fields, 6-10 varchar
> fields (varying in length from 8 to 50 chars), and perhaps 1-2 text
> fields. In my past projects I've used MySQL almost exclusively and it's
> worked fine. However, I'm not sure whether it can handle datasets this
> large. Has anyone had experience using MySQL with large datasets? How
> does it perform? My next question: if MySQL is not robust enough for
> this, would PostgreSQL be? I'm trying to avoid going to Oracle, primarily
> for cost reasons.
>
> I'd appreciate your advice,
> jaro
Maybe Sasha will chime in on this, but in brief: until you start talking
about millions of rows I'm not impressed. MySQL eats 100,000 records for
breakfast. It will probably be faster than Oracle, too.

The reasons to /not/ use MySQL always relate to advanced features like
stored procedures, and MySQL 4 has support for transactions, so that's no
longer an issue either. As for robustness, I'm sure you'll find that MySQL
isn't your weak point. Your hard drive or your power will probably fail
before MySQL does. If you're really worried about it, and don't need
transactions, sticking with MySQL 3 might make you feel more comfortable.

--
Andrew
____________________
BYU Unix Users Group
http://uug.byu.edu/
___________________________________________________________________
List Info: http://uug.byu.edu/cgi-bin/mailman/listinfo/uug-list
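For what it's worth, here's a rough sketch of what a table with the shape
Jaro described (1-2 integers, 6-10 varchars of 8-50 chars, 1-2 text fields)
might look like in MySQL. All table and column names here are made up for
illustration; the point is that at 100,000 rows, query speed depends far
more on indexing the columns you filter and sort on than on the row count
itself:

```sql
-- Hypothetical schema matching the shape described in the original post.
CREATE TABLE catalog_items (
    id           INT UNSIGNED NOT NULL AUTO_INCREMENT,
    category_id  INT UNSIGNED NOT NULL,
    sku          VARCHAR(8)   NOT NULL,
    name         VARCHAR(50)  NOT NULL,
    vendor       VARCHAR(40)  NOT NULL,
    city         VARCHAR(30),
    status       VARCHAR(10)  NOT NULL DEFAULT 'active',
    description  TEXT,
    PRIMARY KEY (id),
    -- Index the columns that appear in WHERE clauses and joins.
    INDEX idx_category (category_id),
    INDEX idx_sku (sku)
);
```

You can check whether a given query actually uses an index by prefixing it
with EXPLAIN, e.g. `EXPLAIN SELECT * FROM catalog_items WHERE sku = 'AB123456';`
If the indexes are right, 100,000 rows is nothing.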
