Normally if I don't need the entire DB (dev work) and can work w/ the
last 100 records, I will end up doing something like:
create table mytable_temp select * from original_table limit 100;
insert into mytable_temp select * from original_table limit 100,100;
etc, etc, etc.
Then either use that in my db config or mysqldump it out as a test
fixture. Can probably be bashed up into a one liner using mysql
client if need be.
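One possible shape for that one-liner: mysqldump appends its --where argument to the SELECT it generates, so --where="1 LIMIT 100" dumps only the first 100 rows of a table, and the result can be piped straight into a dev database. A minimal sketch, shown as a dry run so the command can be inspected first; the names (production_db, original_table, dev_db, user) are placeholders, and dev_db is assumed to already exist:

```shell
# Build the one-liner as a string and echo it (dry run).
# mysqldump turns --where="1 LIMIT 100" into "WHERE 1 LIMIT 100",
# so only the first 100 rows of original_table are dumped.
sample_cmd='mysqldump -u user -p --where="1 LIMIT 100" production_db original_table | mysql -u user -p dev_db'
echo "$sample_cmd"
```

Dropping the echo and the outer quotes runs the copy directly. Without an ORDER BY the "first 100" rows are whatever the server returns first; if a deterministic sample matters, something like --where="1 ORDER BY id LIMIT 100" should work, since the whole string is appended after WHERE.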
- Jon
On Sep 12, 2007, at 4:32 PM, tedd wrote:
Hi:
I'm developing php scripts for a client that will eventually use
his database. However, I don't want to touch his database during
development and instead want to use a copy.
While I thought it was going to be easy to make a copy (i.e., just
dump the database and reload it via phpMyAdmin) the database turns
out to be too large.
So, I resorted to saving individual tables with data and then
reloading them one at a time. However, even some of those tables
are too large. At 1.3 Meg (uncompressed; 132k gz compressed),
phpMyAdmin 2.6.0-pl3, after a considerable delay, reports
"Service Unavailable" and fails.
The total database size is around 13 Meg, and phpMyAdmin reports
that it can handle uploads up to 2 Meg, but it craters at far less.
So, what are my options? Any quick one-line solutions? Nothing
I've read addresses the problem I'm facing.
Thanks in advance.
Cheers,
tedd
--
-------
http://sperling.com http://ancientstones.com http://earthstones.com
_______________________________________________
New York PHP Community MySQL SIG
http://lists.nyphp.org/mailman/listinfo/mysql
NYPHPCon 2006 Presentations Online
http://www.nyphpcon.com
Show Your Participation in New York PHP
http://www.nyphp.org/show_participation.php