Jacek Becla wrote:
Hi,

Is there a way to dynamically split a big table
into n smaller tables by doing a single scan of
the table being split? Here are more
details:

 * Suppose I have a million-row MyISAM table X with
   a relatively small number of columns. It has
   a column "chunkId" with values between 1 and 100.

 * I need to split this table into 100 in-memory
   tables, essentially I need to do:
   INSERT INTO X001 SELECT * FROM X WHERE chunkId=1;
   INSERT INTO X002 SELECT * FROM X WHERE chunkId=2;
   INSERT INTO X003 SELECT * FROM X WHERE chunkId=3;
   and so on.
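(For reference, a single-scan equivalent would have to route each row to its target table as the row is fetched. A stored-procedure cursor pass is one way to sketch that — this is only an illustration, with "id" and "payload" as placeholder columns standing in for X's real schema:)

```sql
-- Sketch only: one cursor pass over X, routing each row to X001..X100
-- by chunkId. Column names are placeholders; adapt to the real schema.
DELIMITER //
CREATE PROCEDURE split_x()
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE v_id INT;
  DECLARE v_chunk INT;
  DECLARE v_payload VARCHAR(255);
  DECLARE cur CURSOR FOR SELECT id, chunkId, payload FROM X;
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
  OPEN cur;
  read_loop: LOOP
    FETCH cur INTO v_id, v_chunk, v_payload;
    IF done THEN LEAVE read_loop; END IF;
    -- Build the target table name (X001 .. X100) and insert this row.
    SET @sql = CONCAT('INSERT INTO X', LPAD(v_chunk, 3, '0'),
                      ' VALUES (?, ?, ?)');
    PREPARE ins FROM @sql;
    SET @a = v_id; SET @b = v_chunk; SET @c = v_payload;
    EXECUTE ins USING @a, @b, @c;
    DEALLOCATE PREPARE ins;
  END LOOP;
  CLOSE cur;
END//
DELIMITER ;
```

This reads X exactly once, but trades the repeated scans for per-row prepared-statement overhead, so it is not obviously a win — which is part of the question.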

Based on my tests, each of these individual INSERTs
costs ~50% of a full table scan of X, even with
a clustered index on chunkId. The cost is totally
dominated by the "SELECT * FROM X WHERE chunkId=<n>".

Since we need to do such a split many times, this
performance is not acceptable. It feels like it should
be possible to do the split in a single scan.
Any suggestions?

thanks,
Jacek


Jacek,

Have you looked into MySQL partitioning? If you're using version 5.1, it might really help: just partition the big table on chunkId.

http://dev.mysql.com/doc/refman/5.1/en/partitioning.html
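For instance, with 100 consecutive chunkId values, HASH partitioning puts each chunk in its own partition, and partition pruning then makes a chunkId-restricted query touch only that one partition. (The column list below is a placeholder for X's real schema:)

```sql
-- Illustrative DDL only; replace the column list with X's actual columns.
CREATE TABLE X (
  chunkId INT NOT NULL,
  payload VARCHAR(255)
) ENGINE=MyISAM
PARTITION BY HASH (chunkId)
PARTITIONS 100;

-- With 100 partitions, this reads only the partition holding chunkId = 3:
SELECT * FROM X WHERE chunkId = 3;
```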

kabel

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/mysql?unsub=arch...@jab.org
