The table has more than 20,163,845 rows, and my application inserts data into it continuously.
I estimate the table grows by about 2.5 GB per day.

Thanks

Chris Tate-Davies wrote:
How many rows is that???


On Tue, 2011-11-15 at 16:05 +0530, Adarsh Sharma wrote:
Dear all,

I have a doubt regarding fetching data from large tables.
I need to fetch selected columns from a 90 GB table that has a 5 GB index on it.

CREATE TABLE `content_table` (
  `c_id` bigint(20) NOT NULL DEFAULT '0',
  `link_level` tinyint(4) DEFAULT NULL,
  `u_id` bigint(20) NOT NULL,
  `heading` varchar(150) DEFAULT NULL,
  `category` varchar(150) DEFAULT NULL,
  `c_url` varchar(500) NOT NULL,
  `keywords` varchar(500) DEFAULT NULL,
  `dt_stamp` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `content` mediumtext,
  PRIMARY KEY (`c_id`),
  KEY `idx_url` (`c_url`),
  KEY `idx_head` (`heading`),
  KEY `idx_dtstamp` (`dt_stamp`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1

Now I need to select the distinct categories from content_table, which is 90 GB in size.

A simple SELECT command can take days to complete, and I do not think creating an index on that column is a good idea.
Please let me know any ideas to do that.
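One hedged sketch, in case an index build is acceptable despite the cost: an index on `category` would let MySQL answer DISTINCT from the (comparatively small) index rather than scanning the full 90 GB table. If an index really is off the table, an alternative is a small summary table maintained incrementally using the existing `idx_dtstamp` index, so only new rows are ever scanned. The names `idx_category` and `category_summary` below are made up for illustration:

-- Option 1 (assumes the one-time index build is tolerable; on InnoDB
-- the ALTER may lock or slow writes while it runs):
ALTER TABLE `content_table` ADD INDEX `idx_category` (`category`);
SELECT DISTINCT `category` FROM `content_table`;

-- Option 2 (no new index on the big table): keep a tiny summary table
-- and refresh it periodically, scanning only rows newer than the last run.
CREATE TABLE `category_summary` (
  `category` varchar(150) NOT NULL,
  PRIMARY KEY (`category`)
) ENGINE=InnoDB;

-- '2011-11-14 00:00:00' is a placeholder for the timestamp of the
-- previous refresh; this range scan can use idx_dtstamp.
INSERT IGNORE INTO `category_summary` (`category`)
SELECT DISTINCT `category`
FROM `content_table`
WHERE `dt_stamp` > '2011-11-14 00:00:00'
  AND `category` IS NOT NULL;

-- Listing categories is then instant:
SELECT `category` FROM `category_summary`;

With option 2 the first refresh still has to scan the whole table once, but every later refresh only touches the rows inserted since the previous one.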

Thanks


