Thanks Leo,

However, remember that the query returns quickly when it is index-only, so the
conditions in the WHERE clause are not the source of the delay. I want to know
why it takes 5 minutes to scan 2 rows from the .MYD


Bryan

----- Original Message ----- From: "leo" <[EMAIL PROTECTED]>
To: "Bryan Heitman" <[EMAIL PROTECTED]>
Sent: Thursday, December 23, 2004 2:00 AM
Subject: Re: scanning 2 rows slow index fast 26GB MyISAM



Hi Bryan,

You might change the `date` column to type INT, fill it with UNIX_TIMESTAMP() values, and alter the key `myKey` to (`AccountID`,`wordid`,`position`,`Date`) if you often use statements like "where accountid = xx and wordid = xx and position = 'xx' and date > now() - interval 10 day".
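A minimal sketch of that suggestion (an illustration, not from the thread: the table name `matrix`, the new column name `date_ts`, and the exact DDL are assumptions):

```sql
-- Hypothetical sketch: store the date as a UNIX timestamp integer and
-- put it last in the composite key, so the three equality columns
-- narrow the range before the date comparison is applied.
ALTER TABLE matrix
    ADD COLUMN date_ts INT UNSIGNED NOT NULL DEFAULT 0;

UPDATE matrix SET date_ts = UNIX_TIMESTAMP(`date`);

ALTER TABLE matrix
    DROP KEY myKey,
    ADD KEY myKey (`AccountID`, `wordid`, `position`, `date_ts`);
```

The query's date condition would then become `date_ts > UNIX_TIMESTAMP() - 10*86400`, comparing plain integers instead of timestamps.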


======= 2004-12-22 22:17:00, you wrote: =======

I am experiencing extreme slowness with a query in which 2 rows are
returned; it hangs in the "Sending data" status.

Performing an index-only query such as SELECT COUNT(*) is extremely quick, so
I know the only extra step is retrieving the data from the .MYD.


I am looking for thoughts on why this is slow and what can be done to speed
it up. I find it unusual that it would take this long to simply grab 2 rows
from the .MYD. vmstat reports high reads, and strace confirms pread() calls on
the .MYD file.


The only abnormality is the table size: the .MYD is 26 GB and the .MYI is
about 30 GB.


Test system details, tests were performed with no load.
System: Red Hat Linux, kernel 2.4.28
MySQL: tested on versions 4.0.22 and the latest 4.1 tree
IDE (WD 2500 JB, 8 MB buffer) disk 1 used for the MYD (written contiguously at
the beginning of the disk)
IDE (WD 2500 JB, 8 MB buffer) disk 2 used for the MYI
Dual Xeon 2.4 GHz
1 GB DDR266 RAM

Here are query tests & detail below:

mysql> select count(*) from matrix where accountid = 11 and wordid = 72 and
position = 'Body' and date > now() - interval 10 day;
+----------+
| count(*) |
+----------+
|        2 |
+----------+
1 row in set (0.06 sec)



mysql> select locationid from matrix where accountid = 11 and wordid = 72 and
position = 'Body' and date > now() - interval 10 day;
+------------+
| locationid |
+------------+
|      47932 |
|      29571 |
+------------+
2 rows in set (5 min 35.93 sec)

mysql> explain select locationid from matrix where accountid = 11 and wordid
= 71 and position = 'Body' and date > now() - interval 10 day;
+--------+------+---------------+-------+---------+-------------------+-------+-------------+
| table  | type | possible_keys | key   | key_len | ref               | rows  | Extra       |
+--------+------+---------------+-------+---------+-------------------+-------+-------------+
| matrix | ref  | myKey         | myKey | 9       | const,const,const | 56909 | Using where |
+--------+------+---------------+-------+---------+-------------------+-------+-------------+
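One reading of this EXPLAIN (an interpretation, not stated in the thread): key_len 9 with ref const,const,const suggests only the three equality columns are used for the index lookup, so roughly 56,909 data rows may have to be fetched from the 26 GB .MYD just to apply the date filter, which would account for the random reads even though only 2 rows survive. An index that also covered `Date` and `locationid` could let this query run index-only ("Using index" in EXPLAIN); a hypothetical sketch, with the index name invented for illustration:

```sql
-- Hypothetical covering index: all columns the query touches live in
-- the .MYI, so no .MYD row fetches are needed for this query.
ALTER TABLE matrix
    ADD KEY covering_key
        (`AccountID`, `wordid`, `position`, `Date`, `locationid`);
```

The trade-off is a larger .MYI (already about 30 GB here) and slower writes, so this is only a sketch of one option.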


CREATE TABLE `location` (
 `LocationID` int(11) unsigned NOT NULL auto_increment,
 `ImapUID` int(11) unsigned NOT NULL default '0',
 `AccountID` int(11) unsigned NOT NULL default '0',
 `Date` timestamp(19) NOT NULL,
 `FromWho` tinyblob,
 `Subject` tinyblob,
 `SentTo` tinyblob,
 `mailbox` varchar(255) default NULL,
 `body` longblob,
 PRIMARY KEY  (`LocationID`),
 KEY `myKey` (`LocationID`,`AccountID`,`Date`)
) TYPE=MyISAM MAX_ROWS=1000000000 AVG_ROW_LENGTH=300
  DATA DIRECTORY='/home/SQL/' INDEX DIRECTORY='/home/SQL/'




--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]



= = = = = = = = = = = = = = = = = = = =

Wishing you more blessings than you can handle!
        With regards!


        leo [EMAIL PROTECTED]           2004-12-23










