Jeremy,

Thanks, this is what I was originally thinking of, but how am I getting rid
of the data in log_view_today?  Or are you saying that log_view_today is a
MERGE table for only the current day, and that its definition is rewritten
every night?  Then I would go about dropping everything in whatever fashion
I need.

When I recreate the MERGE table for just the current day, don't I have to
drop the MERGE table first, or does it just get recreated automatically?  I
am not sure why you say ALTER TABLE is "atomic"; if there is a log writer
attached to that table, won't I have to wait for a lock?  What do you mean
by atomic?  I understand the term "atomic transaction," I am just not sure
how it applies in this example.

Thanks for the idea.  I was already in this neck of the woods, but a MERGE
table just for today I was not sure about.  My problems in the past came
from the log writer locking the table, hence the need to truncate it.  But
while it was truncating, the table hung, hence my interest in InnoDB.

Thomas.

-----Original Message-----
From: Jeremy Cole [mailto:[EMAIL PROTECTED]
Sent: Tuesday, January 18, 2005 2:28 PM
To: Thomas Lekai
Cc: mysql@lists.mysql.com
Subject: Re: Logging Data: Should I use MyIsam or InnoDB?


Hi Thomas,

> I have a curious issue here, maybe someone can help.
> 
> I have a single process that inserts data into tables that contain
> purely logging information.  This table is then searched by our Care
> department to troubleshoot issues.  I am looking for the best way to
> store this data, and the structure on the backend.
> 
> There are 50 million inserts into table LOG a day.  The primary index
> on the table is seconds from 1971.  I only need to keep 60 days worth
> of data, and the table is only used for read purposes.  This is my
> design criteria, but my problem is how to delete old data without
> crashing the log writer that is attached to the table.

OK, how about this:

Use MyISAM and MERGE tables.  Keep one table per day.  E.g.:

log_2005_01_15
log_2005_01_16
log_2005_01_17
log_2005_01_18

etc.

Use MERGE tables to give you the 60 day (and perhaps e.g. 14 day, 30 
day, 7 day, etc.) read views that you need, like so:

CREATE TABLE log_view_7day (
   ...
) TYPE=MERGE UNION=(
   log_2005_01_12,
   log_2005_01_13,
   log_2005_01_14,
   log_2005_01_15,
   log_2005_01_16,
   log_2005_01_17,
   log_2005_01_18
);

Create another MERGE table for "today" using INSERT_METHOD:

CREATE TABLE log_view_today (
   ...
) TYPE=MERGE INSERT_METHOD=FIRST UNION=(
   log_2005_01_18
);

You can then do all of your inserts from the log writer into the "today" 
table, and do your reads against the various MERGEs.
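
For concreteness, a daily table might look like the following.  The column
list here is hypothetical (only the seconds-since-epoch key comes from your
description), and note that every daily table must have an identical
structure for the MERGEs to work:

CREATE TABLE log_2005_01_18 (
   log_time INT UNSIGNED NOT NULL,
   message  VARCHAR(255) NOT NULL,
   KEY (log_time)
) TYPE=MyISAM;

The log writer then inserts only through the "today" view:

INSERT INTO log_view_today (log_time, message)
VALUES (UNIX_TIMESTAMP(), 'example entry');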

Every day at exactly midnight, you would use ALTER TABLE (which is atomic) 
to redefine the UNION lists of the various MERGE tables.
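
For example, the midnight rotation could look like this (the new day's
table name and the 7-day list are illustrative):

CREATE TABLE log_2005_01_19 LIKE log_2005_01_18;

ALTER TABLE log_view_today UNION=(log_2005_01_19);

ALTER TABLE log_view_7day UNION=(
   log_2005_01_13,
   ...
   log_2005_01_19
);

Because each ALTER TABLE swaps the UNION list in place, the log writer
never sees a dropped or truncated table; that is what I mean by atomic
here.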

When you want to delete your "old" data, it's simply a matter of doing an 
ALTER TABLE to remove the old daily table from the MERGEs, and then using 
DROP TABLE to drop that log_yyyy_mm_dd table.
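
Something like this (log_view_60day and the old table names are
illustrative):

ALTER TABLE log_view_60day UNION=(
   log_2004_11_20,
   ...
   log_2005_01_18
);

DROP TABLE log_2004_11_19;

Since the old table is no longer referenced by any MERGE, the DROP does
not touch anything the log writer or the readers are using.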

Also note that you could compress the old data and leave it live (but 
read only) in case you ever need it.  myisampack can help you with that.
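
A sketch of packing an old day (the paths and the database name "logdb"
are placeholders; the table must no longer appear in any writable MERGE):

mysql -e "FLUSH TABLES log_2005_01_15" logdb
myisampack /var/lib/mysql/logdb/log_2005_01_15
myisamchk -rq /var/lib/mysql/logdb/log_2005_01_15
mysql -e "FLUSH TABLES log_2005_01_15" logdb

The myisamchk -rq rebuilds the indexes after packing, and the second FLUSH
makes mysqld re-open the now-compressed table.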

I hope that helps!

Regards,

Jeremy

-- 
Jeremy Cole
Technical Yahoo - MySQL (Database) Geek

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]