Hello,

I've recently re-joined the list after a brief hiatus. I'm starting a new
project that integrates a database with PHP to build a web application.
In the past, though, my datasets were at most in the thousands of records,
not tens of thousands. This project will require handling at least 40,000
records in one table alone, with realistic growth to 100,000 records.
Several other tables won't be quite so large, but will still be in the
tens of thousands. A typical table will have 1-2 integer fields, 6-10
varchar fields (varying in length from 8 to 50 chars), and perhaps 1-2
text fields. In my past projects I've used MySQL almost exclusively and
it's worked fine, but I'm not sure whether it can handle datasets this
large. Has anyone had experience using MySQL with large datasets? How
does it perform? And if MySQL isn't robust enough for this, would
PostgreSQL be? I'm trying to avoid Oracle, primarily for cost reasons.
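
For concreteness, here's a rough sketch of what the big table would look
like (all table and column names below are placeholders, not the real
schema):

```sql
-- Placeholder schema for the largest table:
-- ~40,000 rows now, realistically up to ~100,000.
CREATE TABLE main_records (
    id          INT NOT NULL PRIMARY KEY,   -- 1-2 integer fields
    category_id INT,
    code        VARCHAR(8),                 -- 6-10 varchar fields,
    short_name  VARCHAR(20),                -- 8 to 50 chars each
    full_name   VARCHAR(50),
    city        VARCHAR(30),
    state       VARCHAR(10),
    email       VARCHAR(50),
    notes       TEXT                        -- 1-2 text fields
);
```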

I'd appreciate your advice,
jaro


____________________
BYU Unix Users Group 
http://uug.byu.edu/ 
___________________________________________________________________
List Info: http://uug.byu.edu/cgi-bin/mailman/listinfo/uug-list