"Pine Marten" <[EMAIL PROTECTED]> wrote 

> it is just a small dataset.  But how small is small?  

A potentially big topic.

But here are my purely arbitrary guidelines:

Less than 1,000 records: use a text file (or maybe XML).
1,000 - 100,000 records: consider an in-memory database
or something like gdbm (or Gadfly in this case).
100,000 - 1,000,000 records: small-scale or industrial SQL.
Access or SQLite would be good examples.
Over 1,000,000 records: go industrial.
Postgres, MySQL, Firebird/Interbase, Oracle, DB2, SQL Server, etc.
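To make the tiers concrete, here's a rough sketch in Python of what each
option looks like in practice -- the file names are made up, and I'm using
the standard library's csv, dbm and sqlite3 modules as stand-ins for
"text file", "gdbm" and "small-scale SQL" respectively:

```python
import csv
import dbm
import sqlite3

# Tier 1: a plain text (CSV) file -- fine for small record counts.
with open("records.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])
    writer.writerow([1, "alice"])

# Tier 2: a dbm-style key/value store for mid-sized data.
# Keys and values are stored as bytes.
with dbm.open("records_kv", "c") as db:
    db["1"] = "alice"

# Tier 3: SQLite for larger or more relational data.
con = sqlite3.connect("records.sqlite")
con.execute("CREATE TABLE IF NOT EXISTS records (id INTEGER, name TEXT)")
con.execute("INSERT INTO records VALUES (?, ?)", (1, "alice"))
con.commit()
```

The point is mostly that the code barely changes in size as you move up a
tier -- what changes is how well each option copes as the data grows.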

There are lots of other issues to consider too, like the number of 
tables, indexes, the need for stored procedures, etc. 
But for simple data storage those rough guidelines should 
suffice.

Caveat: Hardware improvements are changing the rules 
all the time. If it's only a simple one-off lookup, a text file 
could be used for 1,000,000 records on a fast machine 
nowadays, and if your server has, say, 16G of RAM, then an 
in-memory database may scale up much further. My figures 
assume long-term multiple access during the application and 
average PC hardware.
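For what it's worth, an in-memory database is trivial to try out with
sqlite3's special ":memory:" connection string -- the table names and
data below are just illustrative:

```python
import sqlite3

# An in-memory database lives entirely in RAM and disappears when the
# connection closes -- fast, and limited only by available memory.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE lookup (key TEXT, value TEXT)")
con.executemany("INSERT INTO lookup VALUES (?, ?)",
                [("a", "1"), ("b", "2")])
row = con.execute("SELECT value FROM lookup WHERE key = ?",
                  ("a",)).fetchone()
print(row[0])  # prints 1
con.close()
```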

And all IMHO of course :-)

-- 
Alan Gauld
Author of the Learn to Program web site
http://www.freenetpages.co.uk/hp/alan.gauld


_______________________________________________
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor