How about purging rows older than a month? Do you need to keep them?
Or archive them to another database?
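
If you go the archive route, a nightly pass might look something like
this (names are guesses -- I'm assuming an `items` table with a
`created` timestamp and an identically structured `items_archive`
table; adjust to your schema):

    INSERT INTO items_archive
      SELECT * FROM items
      WHERE created < NOW() - INTERVAL 1 MONTH;

    DELETE FROM items
      WHERE created < NOW() - INTERVAL 1 MONTH;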

Actually, here's a better idea. Keep your master DB, which is huge and
holds everything. Then on a separate DB run a table for each feedid
with the last 100 items for that id.
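
For example, the cache table for feedid 25 could look roughly like this
(the table name and column types are just placeholders, guessed from
your SELECT; note `desc` is a reserved word in MySQL and needs
backticks):

    CREATE TABLE feed_25 (
      id     INT NOT NULL,
      title  VARCHAR(255),
      `desc` TEXT,
      PRIMARY KEY (id)
    );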

Have a cron job that runs continually, updating those tables with current data.
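
The refresh for one feed could be a couple of statements like these,
run from cron every few minutes (again, table and column names are
assumptions; REPLACE is MySQL-specific, and the derived-table trick in
the DELETE needs MySQL 4.1+):

    REPLACE INTO feed_25
      SELECT id, title, `desc` FROM items
      WHERE feedid = 25
      ORDER BY id DESC LIMIT 100;

    DELETE FROM feed_25
      WHERE id < (SELECT MIN(id) FROM
        (SELECT id FROM feed_25 ORDER BY id DESC LIMIT 100) AS t);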

get it?


On Tue, 15 Feb 2005 23:02:38 +0100, Jacob Friis Larsen
<[EMAIL PROTECTED]> wrote:
> > >>>We have a table that grows by 200MB each day.
> > >>>Should we put data in different tables or is one big table just as fast?
> > >
> > > The table contains data from RSS and Atom feeds.
> > > Most users only need to see the newest items.
> > > A select could look like this: "SELECT title, desc FROM items WHERE
> > > feedid = 25 ORDER BY id DESC LIMIT 10"
> >
> > I would, however, be seriously concerned about diskspace if a table is
> > adding 200 MB a day with no archiving/compression/purges.
> 
> What if we use COMPRESS() for the text in old rows?
> 
> Jacob
> 
> --
> MySQL General Mailing List
> For list archives: http://lists.mysql.com/mysql
> To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]
> 
> 


-- 
Ryan McCullough
mailto:[EMAIL PROTECTED]
