Hi Kabel
Yes, I did; it won't do the job for us. I didn't explain the whole
use case: we are dealing with a 50-billion-row table which we want
to split into 1-million-row tables, and then dynamically break each
of these into smaller pieces in order to speed up n^2 near-neighbor
joins. If we
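For reference, the single-scan split asked about below amounts to routing each row to a per-chunk bucket in one pass over the big table. A minimal sketch in Python (in-memory lists stand in for the per-chunk tables; the chunkId column comes from the question, everything else is illustrative):

```python
from collections import defaultdict

def split_by_chunk(rows, chunk_key="chunkId"):
    """Partition rows into buckets keyed by chunk_key in a single scan."""
    buckets = defaultdict(list)
    for row in rows:  # one pass over the source table
        buckets[row[chunk_key]].append(row)
    return buckets

# Tiny illustrative data set; a real run would stream rows from MySQL
# and issue an INSERT into table X_<chunkId> instead of appending to a list.
rows = [
    {"id": 1, "chunkId": 1},
    {"id": 2, "chunkId": 2},
    {"id": 3, "chunkId": 1},
]
buckets = split_by_chunk(rows)
```

The point of the single pass is that the cost stays O(rows) regardless of how many target tables n there are, instead of n separate scans with one `INSERT ... SELECT ... WHERE chunkId = k` per chunk.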
Hi,
Is there a way to dynamically split a big table
into n smaller tables by doing a single scan of
the table that is being split? Here are more
details:
* Suppose I have a million-row MyISAM table X with
a relatively small number of columns. It has
a column chunkId with values between 1
Why not write a simple script that scans every record and inserts each one
into the proper table one at a time?
In PHP, for example:

$query  = "SELECT * \n";
$query .= "FROM `X` \n";
$result = mysql_query($query);
while (($row = mysql_fetch_array($result, MYSQL_ASSOC))) {
    $Values = "";
    foreach ($row as $val) {
        $Values .= "'" . mysql_real_escape_string($val) . "', ";
    }
    $Values = rtrim($Values, ", ");
    mysql_query("INSERT INTO `X_" . (int)$row['chunkId'] . "` VALUES (" . $Values . ")");
}
Jacek Becla wrote:
Hi,
Is there a way to dynamically split a big table
into n smaller tables by doing a single scan of
the table that is being split? Here are more
details:
* Suppose I have a million-row MyISAM table X with
a relatively small number of columns. It has
a column chunkId
Splitting a table is not really difficult; you could use:
Create table user_info select fieldname1, fieldname2, ... from
large_table_name;
But you'll probably want a link between the new tables.
You could add an ID column first by doing:
alter table large_table_name add ID int unsigned
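Put together, the vertical split with a shared key can be sketched with Python's sqlite3 module (MySQL's syntax differs slightly, e.g. AUTO_INCREMENT; all table and column names here are illustrative, not from the thread):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# A wide table standing in for the 125-field original.
cur.execute("CREATE TABLE large_table (ID INTEGER PRIMARY KEY, name TEXT, product TEXT)")
cur.executemany(
    "INSERT INTO large_table (name, product) VALUES (?, ?)",
    [("alice", "widget"), ("bob", "gadget")],
)

# Vertical split: each new table carries the shared ID so rows can be rejoined.
cur.execute("CREATE TABLE user_info    AS SELECT ID, name    FROM large_table")
cur.execute("CREATE TABLE product_info AS SELECT ID, product FROM large_table")

# Rejoin on the shared key when the full row is needed.
rows = cur.execute(
    """SELECT u.name, p.product
       FROM user_info u JOIN product_info p ON u.ID = p.ID
       ORDER BY u.ID"""
).fetchall()
```

Without the shared ID added before the split, the new tables would have no reliable way to be matched back up row by row.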
Hi All,
MySQL newbie here; this may be a silly question, but I couldn't figure out
how to word it for a Google search.
I want to take a table that someone created and break it up so that no one
table is 125 fields large. I want to separate the table into user
info, product info and product