Brook,

I'm not that experienced with large table sets, but I imagine that 1 big
table would be easier to use.  20M / 25K = roughly 800 campaigns sent.
If you create a new table for each sender's campaign (up to 800 tables
if each sender sends only 1 set, fewer if some send 2+), you end up with
a large number of different tables to manage.

You still have the same insert load with 800 tables as with 1 table,
and since inserts carry considerably more overhead than a large select,
that part of the problem is a wash.  Now think of the programming time
to build the logic that keeps 800 tables straight (naming conventions,
etc.).  Also, what happens if user 432 wants to send a 2nd campaign:
does 432 get a 2nd table created, or do they enlarge their first table?
If they enlarge it, you now have 799 tables and table 432 is 2x larger
than the others.  As that redundancy accumulates, the table sizes grow
just like they would with 1 large one, defeating your purpose.
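To make the comparison concrete, here's a minimal sketch of the
one-big-table approach.  The table and column names
(campaign_recipients, sender_id, etc.) are made up, and SQLite stands
in for SQL Server 7, but the idea carries over: one composite index
gives each sender's rows the isolation that 800 separate tables were
trying to buy.

```python
import sqlite3

# Hypothetical schema; SQLite as a stand-in for SQL Server 7.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE campaign_recipients (
        sender_id    INTEGER NOT NULL,
        campaign_id  INTEGER NOT NULL,
        recipient_id INTEGER NOT NULL
    )
""")
# One composite index keeps per-sender (and per-campaign) lookups
# fast, which is what the 800 separate tables were for.
conn.execute("""
    CREATE INDEX ix_sender_campaign
        ON campaign_recipients (sender_id, campaign_id)
""")

# User 432 sends a 2nd campaign: just more rows, no new table.
rows = [(432, 1, r) for r in range(3)] + [(432, 2, r) for r in range(3)]
conn.executemany(
    "INSERT INTO campaign_recipients VALUES (?, ?, ?)", rows
)

# Reporting on one user's subset is an index range scan, not a
# scan of all 20M rows.
count = conn.execute(
    "SELECT COUNT(*) FROM campaign_recipients WHERE sender_id = ?",
    (432,),
).fetchone()[0]
print(count)  # 6
```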

MS SQL 7 shouldn't have any problem running 20M records.  Heck, I have
heard of Access running with 500M records, though I think this is well
into the extreme for Access.

Another advantage to the 1 large table: if you are only sending out the
emails periodically (i.e. not every 5 minutes), then you can rebuild the
index on the table between runs and keep the same speed optimizations
you'd lose with a smaller table that isn't indexed at all (because you
forgot to do it and didn't think it was necessary on a small table).
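A sketch of that periodic rebuild, again with SQLite standing in (on
SQL Server 7 the rough equivalent would be DBCC DBREINDEX; the table
and index names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE campaign_recipients"
    " (sender_id INTEGER, recipient_id INTEGER)"
)
conn.execute("CREATE INDEX ix_sender ON campaign_recipients (sender_id)")

# Simulate a send run: a burst of bulk inserts.
conn.executemany(
    "INSERT INTO campaign_recipients VALUES (?, ?)",
    [(i % 10, i) for i in range(1000)],
)

# Between runs, rebuild the index so it stays compact after the bulk
# inserts.  (SQL Server 7 spells this DBCC DBREINDEX; SQLite, REINDEX.)
conn.execute("REINDEX ix_sender")

total = conn.execute(
    "SELECT COUNT(*) FROM campaign_recipients"
).fetchone()[0]
print(total)  # 1000
```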

I'm not a guru syntax coder, but I suspect you will have a harder time
pushing 25K emails at a time through your mail server than through the
database.  Likewise, RAM is cheap and CPUs are cheap; your programming
time and future maintenance are not.  No one will miss a quarter of a
second in query time.  If you have to, use a set of stored procedures
to do all of your database lookups.
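On SQL 7 those would be written with CREATE PROCEDURE in T-SQL; as a
language-neutral sketch, the idea is just to funnel every report
through one named, parameterized lookup instead of ad-hoc query
strings (all names below are hypothetical, with SQLite standing in):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE campaign_recipients (
        sender_id    INTEGER NOT NULL,
        campaign_id  INTEGER NOT NULL,
        recipient_id INTEGER NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO campaign_recipients VALUES (?, ?, ?)",
    [(432, 1, 10), (432, 1, 11), (432, 2, 20)],
)

def recipients_for_campaign(conn, sender_id, campaign_id):
    # Stand-in for a stored procedure: one named, parameterized
    # lookup that every report calls, instead of ad-hoc SQL strings.
    cur = conn.execute(
        "SELECT recipient_id FROM campaign_recipients"
        " WHERE sender_id = ? AND campaign_id = ?"
        " ORDER BY recipient_id",
        (sender_id, campaign_id),
    )
    return [row[0] for row in cur]

print(recipients_for_campaign(conn, 432, 1))  # [10, 11]
```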

Ryan

-----
Date: Fri, 26 Jul 2002 22:16:28 -0700
From: brook <[EMAIL PROTECTED]>
Subject: Creating tables on the fly or using one big table
Message-ID: <[EMAIL PROTECTED]>

I am designing an email application that will be used by multiple (100's -
1000's) of users. Each user will have their own account. When they send
mail, the recipients are recorded to a table. This table is basically just
a reference to the full recipient record and the email campaign id. If each
user is sending email campaigns of 25,000+ recipients, what is the best way
to record and later generate reports on this data?

Would it make sense to generate a campaign recipient table on the fly for
each user of the system upon registration or first use? This way the total
records per table would be a bit more manageable for report generation
later. Is this sound practice? Or should I just have one huge table shared
by all the system's users? I could see this table growing to 20,000,000
records within time. Would select statements and report generation on
subsets of data from this monster table be slow? (SQL7) Are there benefits
to leaving all of the data in one big table as opposed to generating this
table on the fly for each user, therefore keeping the total size down? The
table which holds the email campaign recipients will need to be used in a
number of joins to include additional email delivery stats.

What do you think?

Brook


FAQ: http://www.thenetprofits.co.uk/coldfusion/faq
Archives: http://www.mail-archive.com/cf-talk@houseoffusion.com/
Unsubscribe: http://www.houseoffusion.com/index.cfm?sidebar=lists